Introductory Notes

This research is based upon the most recent data available as of 2022–2023. Unless otherwise stated, dollar figures from earlier years are adjusted for inflation to make them consistent in purchasing power with modern dollars.

In keeping with Just Facts’ Standards of Credibility, all charts in this research show the full range of available data, and all facts are cited based upon availability and relevance, not to slant results by singling out specific years that are different from others. Likewise, data associated with the effects of education in different geographical areas represent random, diverse places in which such data is available.

Many of the facts in this research reveal associations between education and other variables. These relationships may be caused in part (or whole) by factors that are related to education but not necessarily caused by education. For example, individuals with high intelligence and discipline tend to excel in education and obtain more of it, but they also tend to earn more money regardless of their education. Hence, the higher earnings of people with more education can be caused by factors beyond their education.[1] [2] [3] [4] [5]

Likewise, student achievement is often affected by family and cultural influences. Thus, the test scores of students at certain schools may be caused by factors other than the schools.[6] [7]

In attempting to isolate the effect of a single factor on a certain outcome, researchers often use statistical techniques to “control” for the effects of other variables. However, these techniques cannot objectively rule out the possibility that other factors are at play. This is called “omitted variable bias.”[8] [9] [10] [11] [12] [13] Moreover, the most common method used to control for multiple variables is prone to other pitfalls that can lead to false conclusions about causes and effects.[14] [15] [16] [17]

In the social sciences, the surest way to determine the effect of one factor upon another is by examining random, experimental data. An example of this is the outcomes of students who won and did not win a random lottery for admission to a certain educational program. Studies of such data can control for the impact of all confounding variables and allow for sound conclusions about cause and effect. However, these analyses sometimes have defects and should be interpreted with caution.[18] [19] [20] [21] [22] [23] [24] [25]

Collective Spending

* In 2022, federal, state, and local governments in the U.S. spent $1.2 trillion on education.[26] This amounts to $8,993 for every household in the U.S.,[27] 4.6% of the U.S. gross domestic product,[28] and 14% of government current expenditures.[29] [30] These figures don’t include:

  • land purchases for schools and other facilities.[31]
  • some of the costs of durable items like buildings and computers.[32]
  • the unfunded liabilities of post-employment non-pension benefits (like health insurance) for government employees.[33] [34] [35] [36] [37] [38] [39]
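As a quick arithmetic check, the per-household and GDP-share figures above can be reproduced from the $1.2 trillion total. This is a back-of-envelope sketch using only the document's own numbers, not official data:

```python
# Back-of-envelope check of the per-household and GDP-share figures,
# using only the numbers stated in the text above.
total_spending = 1.2e12   # 2022 government education spending, per the text
per_household = 8_993     # dollars per U.S. household, per the text
gdp_share = 0.046         # 4.6% of GDP, per the text

implied_households = total_spending / per_household
implied_gdp = total_spending / gdp_share
print(f"Implied households: {implied_households / 1e6:.0f} million")  # ≈ 133 million
print(f"Implied GDP: ${implied_gdp / 1e12:.1f} trillion")             # ≈ $26.1 trillion
```

The implied figures (roughly 133 million households and a $26 trillion GDP) are consistent with the actual 2022 U.S. totals, which suggests the three figures in the text were derived from one another.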

* Government education spending in 2022 consisted of:

  • $834 billion on elementary and secondary education.
  • $226 billion on higher education.
  • $121 billion on libraries and other education.[40]

* Relative to other types of government spending in 2022, education spending was:

  • 47% lower than spending for healthcare.
  • 18% higher than spending for national defense and veterans’ benefits.
  • 2.4 times higher than spending for public order and safety, including law enforcement, courts, prisons, fire protection, and immigration enforcement.[41]

* During 2022, private consumers and nonprofit organizations in the U.S. spent about $399 billion on formal education. This amounts to 1.6% of the U.S. gross domestic product and $3,039 for every household in the U.S.[42] [43] [44] [45] [46]

* Relative to other spending by private consumers and nonprofit organizations in 2022, education spending was:

  • 56% lower than spending on motor vehicles and parts.
  • 36% lower than spending on clothing and footwear.
  • 4% lower than spending on alcoholic beverages.[47]

Collective Outcomes

Earnings

* In 2021, U.S. residents aged 25 to 64 reported average cash earnings of $54,252.[48] Cash earnings do not include non-cash compensation, such as employee fringe benefits.[49]

* In 2021, 79% of U.S. residents aged 25 to 64 reported at least some cash earnings, and 21% did not report any cash earnings.[50]

* Among U.S. residents aged 25 to 64 who reported cash earnings in 2021, the average was $68,900. Among the same group, the median was $51,000.[51]



Practical Skills

* Per the U.S. Department of Education:

As a part of their everyday lives, adults in the United States interact with a variety of printed and other written materials to perform a multitude of tasks. A comprehensive list of such tasks would be virtually endless. It would include such activities as balancing a checkbook, following directions on a prescription medicine bottle, filling out a job application, consulting a bus schedule, correctly interpreting a chart in the newspaper, and using written instructions to operate a voting machine.
A common thread across all literacy tasks is that each has a purpose—whether that purpose is to pay the telephone bill or to understand a piece of poetry. All U.S. adults must successfully perform literacy tasks in order to adequately function—that is, to meet personal and employment goals as well as contribute to the community.[52]

* Per a book about productivity published by the International Labour Office:

There are two main goals of pre-employment education: to create productivity awareness and to prepare youth for productive work by teaching the necessary knowledge and skills. Unfortunately, too much attention is paid to developing formal knowledge and too little to practical skills.
Some prestigious educational institutions place too much emphasis on purely academic matters instead of teaching people how to manage factories and shop-floor production. Too much emphasis is still placed on management sciences and research instead of on preparing creative entrepreneurs capable of innovating, and of organizing and managing work.
A change of emphasis from a knowledge-based or academic system of education (both secondary and higher) to one based on problem-solving and the completion of concrete tasks would result in an improvement in the productivity culture.[53] [54]

2017 U.S. Skills Assessment

* In 2017, the U.S. Department of Education assessed the reading, math, and computer skills of U.S. residents aged 16–65 years. The assessment was nationally representative of people who live in households and excluded the homeless and people in group quarters like prisons and psychiatric institutions.[55] [56] [57]

* The test takers were allowed to use calculators and take as much time as they needed.[58]

* The actual questions on the test are not available to the public, but here are some sample questions:[59]

  • 70% of U.S. adults were able to answer a question similar to this one requiring the ability to compare data in a table to a chart:
The factory manager checked this graph that had been prepared using the data in the table for 2011. He noticed that two bars were incorrect. Click on the two incorrect bars on the graph.
Program for the International Assessment of Adult Competencies Numeracy Item Level 2

[60] [61] [62]

  • 37% of U.S. adults were able to answer a question similar to this one requiring basic logic, addition, and division:
How much would you pay during the sale if you purchase the two pairs of shoes shown?
Program for the International Assessment of Adult Competencies Numeracy Item Level 3

[63] [64] [65]

  • 10% of U.S. adults were able to answer a question similar to this one requiring knowledge of the fact that “mean” is another word for “average” and the ability to calculate a simple average:
What was the mean for the total expenditures over the three months?
Program for the International Assessment of Adult Competencies Numeracy Item Level 4

[66] [67] [68]
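The skill tested by the last item, computing a mean, takes one line of arithmetic. The monthly figures below are hypothetical, since the actual assessment table is not public:

```python
# Hypothetical monthly expenditures (the real assessment table is not public).
expenditures = [1_250, 980, 1_470]

# The "mean" is the sum of the values divided by how many there are.
mean = sum(expenditures) / len(expenditures)
print(round(mean, 2))  # 1233.33
```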


2003 U.S. Skills Assessment

* In 2003, the U.S. Department of Education assessed the English and math skills of U.S. residents aged 16 and older. The full assessment was nationally representative except for 5% of the population who were completely illiterate in English and Spanish or unable to answer very simple questions.[69]

* Below are some examples of the questions posed in the full assessment, along with the portions of people who answered them correctly:

  • 82% correctly answered this question requiring the ability to search and interpret text:
Refer to the chart to answer the following question. For the year 2000, what is the projected percentage of Black people who will be considered middle class?
National Assessment of Adult Literacy Question N120601

[70]

  • 60% correctly answered this question requiring the ability to search text, interpret it, and calculate using addition:
Refer to the medicine label to answer the following question. The patient forgot to take this medicine before lunch at 12:00 noon. What is the earliest time he can take it in the afternoon?
National Assessment of Adult Literacy Question C080101

[71]

  • 46% correctly answered this question requiring the ability to search text, interpret it, and calculate using multiplication:
Refer to the article below to answer the following question. Suppose that a family’s budget for one year is $29,500 and that there is one child in the family. Using the percentage given in the article, calculate how much money would go toward raising the child for that year.
National Assessment of Adult Literacy Question N130901

[72]

  • 18% correctly answered this question requiring the ability to search text, interpret it, and calculate using multiplication and division:
Refer to the advertisement for the Carpet Store on page three of the newspaper to answer the following question. Suppose that you want to carpet your living room which is 9 feet by 12 feet, and you purchase DuPont Stainmaster carpet at the sale price. Using the calculator, compute the total cost, excluding tax and labor, of exactly enough carpet to cover your living room floor.
National Assessment of Adult Literacy Question N091001

[73]

  • 11% correctly answered this question requiring the ability to examine a data table, draw inferences from it, and accurately express them:
Refer to the table on the next page to answer the following questions. Using the information in the table, write a brief paragraph summarizing the extent to which parents and teachers agreed or disagreed on the statements about issues pertaining to parental involvement at their school.
National Assessment of Adult Literacy Question N100701

[74]


Whole-Person Development

* Horace Mann, the “father” of the U.S. public education system,[75] [76] claimed in 1841:

The Common [i.e., public] School is the institution which can receive and train up children in the elements of all good knowledge, and of virtue, before they are subjected to the alienating competitions of life. This institution is the greatest discovery ever made by man;—we repeat it, the Common School is the greatest discovery ever made by man.
Let the Common School be expanded to its capabilities, let it be worked with the efficiency of which it is susceptible, and nine tenths of the crimes in the penal code would become obsolete; the long catalogue of human ills would be abridged; men would walk more safely by day; every pillow would be more inviolable by night; property, life, and character held by a stronger tenure; all rational hopes respecting the future brightened.[77] [78]

* Various federal agencies have reported that two-thirds to three-quarters of all 17- to 24-year-olds in the U.S. are unqualified for military service because of poor physical fitness, weak educational skills, illegal drug usage, medical conditions, or criminal records:

  • Based on data from the Pentagon, a 2009 study conducted by the U.S. Secretary of Education and a team of retired military officers found that 75% of young adults are unqualified for military service.[79]
  • A 2013 Department of Defense study found that this figure is 71%.[80]
  • In 2014, the commander of the U.S. Army Recruiting Command reported that the figure is 77.5%.[81]
  • A 2020 Pentagon study found that the figure is 77%.[82]

* A 2001 study of high school dropouts published in the American Economic Review found:

  • “It is common knowledge outside of academic journals that motivation, tenacity, trustworthiness, and perseverance are important traits for success in life.”
  • “It is thus surprising that academic discussions of skill and skill formation almost exclusively focus on measures of cognitive ability and ignore noncognitive skills.”
  • “Studies … demonstrate that job stability and dependability are traits most valued by employers as ascertained by supervisor ratings and questions of employers….”
  • “Our finding … demonstrates the folly of a psychometrically oriented educational evaluation policy that assumes cognitive skills to be all that matter.”[83]

K–12 Costs

Public Schools

* In 2022, federal, state, and local governments in the U.S. spent $834 billion on K–12 education. This amounts to $6,354 for every household in the U.S.[84]

* In the 2019–20 school year, governments in the U.S. spent an average of $17,013 for every student enrolled in K–12 public schools.[85] [86] This figure doesn't include state administration, unfunded pension liabilities, or non-pension post-employment benefits (like health insurance).
* A scientific, nationally representative survey commissioned in 2021 by the journal Education Next and the Kennedy School of Government at Harvard University found that U.S. adults on average estimate that their local public schools spend $8,719 per student.[101] [102]

* The average class size in public schools is 20.2 students.[103]

* In the 2019–20 school year, the average spending per public school classroom was about $343,663.[104] This excludes state administration, unfunded pension liabilities, and non-pension post-employment benefits.[105]

* A scientific, nationally representative survey commissioned in 2019 by Just Facts found that 53% of voters believe the average spending per public school classroom is less than $150,000 per year.[106] [107]

* Excluding state administration, unfunded pension liabilities, and non-pension post-employment benefits, the average inflation-adjusted spending per public school student has risen by 27% since 2000, 105% since 1980, 4.0 times since 1960, and 24 times since 1920:[108]

Inflation-Adjusted Public School Spending Per Student

[109]
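The mixed "percent increase" and "times" phrasing above translates into simple multipliers. Assuming the 2019–20 average of $17,013 per student from earlier in this section, the implied earlier spending levels are a quick back-calculation (a sketch; the source's actual historical series is in the cited chart):

```python
# Back-calculating implied earlier inflation-adjusted spending levels
# from the growth figures above, assuming the 2019–20 average of $17,013.
current = 17_013
implied_2000 = current / 1.27   # "risen by 27% since 2000" means a 1.27x multiplier
implied_1980 = current / 2.05   # "risen by 105%" means a 2.05x multiplier
implied_1960 = current / 4.0    # current level is 4.0 times the 1960 level
implied_1920 = current / 24     # current level is 24 times the 1920 level
print(round(implied_2000), round(implied_1980), round(implied_1960), round(implied_1920))
# 13396 8299 4253 709
```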

* Since at least the early 1970s:

  • state governments have paid a growing share of the education expenses of low-income school districts in order to equalize their funding with higher-income districts.[110] [111]
  • school districts with higher portions of minority students have spent about the same average amount per student as school districts with smaller portions of minority students.[112] [113] [114] [115]

* Since at least 1994, school districts with higher portions of poor students have spent about the same average amount per student as school districts with smaller portions of poor students.[116]

* Adjusted for the cost of living in different states, the average spending per public school student in the 2019–20 school year ranged from $11,481 in Arizona to $30,682 in the District of Columbia. (This excludes state administration, unfunded pension liabilities, and non-pension post-employment benefits.[117])

Average Spending Per Public School Student, Adjusted for States’ Costs of Living

[118]


Private Schools

* In the 2019–20 school year, private consumers, nonprofit organizations, and governments spent an average of about $9,709 for every student enrolled in private K–12 schools.[119] [120] [121] [122] [123] [124]

* The average class size in private schools is 15.3 students.[125]

* In the 2019–20 school year, the average spending per private school classroom was about $148,548.[126]

* In the 2011–12 school year (latest available data), the average full tuition for students in private K–12 schools was $13,310. Full tuition or “sticker price” is “the highest annual tuition charged for a full-time student.” The actual amounts paid by individuals are lower if they receive discounts for reasons such as having low income, siblings in the school, or a parent who is a teacher. For different types of private schools, the average full tuition varied as follows:

School Type        Tuition
Catholic           $8,539
Other religious    $10,769
Nonsectarian       $26,657

[127] [128]


Homeschooling

* A nationwide study of 11,739 homeschooled students during the 2007–08 school year found that parents spent a median of $400 to $599 per student on “textbooks, lesson materials, tutoring, enrichment services, testing, counseling, evaluation,” and other incidentals.[129] [130] Regarding these findings:

  • The study was based on a survey with a response rate of approximately 19%.[131] Thus, the results are not definitive.[132] [133] [134]
  • Adjusted for inflation into 2023 dollars, the median annual cost to educate a homeschooled student ranged from $576 to $862.[135]
  • These figures do not account for the cost of parental time investment or the value of being able to live in areas without regard for the quality of the local schools.[136]

Sources of Funding

* From 1920 to 2020, the portion of K–12 public school funding provided by:

  • local governments decreased from 83% to 45%.
  • state governments increased from 17% to 48%.
  • the federal government increased from 0.3% to 8%.
Sources of K–12 School Funding

[137]

* In the 2019–20 school year, public school revenues came from the following sources:

Source                         Portion of Revenues
Federal Government             8%
State Governments              48%
Local                          45%
  Property Taxes               37%
  Other Government Revenues    7%
  Private Revenues             1%

[138]


Spending by Function

* In the 2018–19 school year, 52% of public education spending was used for student instruction.[139] (This excludes state administration, unfunded pension liabilities, and non-pension post-employment benefits.[140]) The remainder was spent on:

Function                                                              Portion of Total
Property purchases and building construction                          10%
Operations and maintenance                                            8%
Administration                                                        7%
Student guidance, health, attendance, and speech pathology services   5%
Instructional staff services, such as curriculum development,
  training, and computer centers                                      4%
Student transportation                                                4%
Food services                                                         3%
Interest on school debt                                               3%
Other                                                                 4%

[141]

* In the 2018–19 school year, 69% of public school spending went to government employee salaries and benefits.[142] (This excludes state administration, unfunded pension liabilities, and non-pension post-employment benefits.[143])

* In 2021, 49% of all compensation for state and local government employees was paid to people who work in education.[144] This includes salaries and benefits.[145] [146] [147] [148] [149] [150]


Teacher Compensation

* In the 2021–22 school year, the average immediate costs to taxpayers of compensating each full-time public school teacher in the U.S. were:

  • $66,397 in salary, or 66% of the total.
  • $34,090 in benefits (such as health insurance, paid leave, and pensions), or 34% of the total.
  • $100,487 in total compensation.[151]

* Immediate costs of compensating teachers don’t include unfunded pension liabilities and non-pension post-employment benefits like health insurance.[152] [153] [154] [155]

* Adjusted for the costs of living in different states, the average immediate costs of compensating each full-time public school teacher in the 2021–22 school year ranged from $76,463 in Florida to $127,463 in New York.[156]

* Full-time public school teachers work an average of 1,490 hours per year, including time spent for lesson preparation, test construction and grading, providing extra help to students, coaching, and other activities.[157] [158] [159] [160]

* Full-time private industry workers work an average of 2,045 hours per year, or about 37% more than public school teachers. This includes time spent working beyond assigned schedules at the workplace and at home.[161]

* Accounting for the disparity between the work hours of public school teachers and private industry workers, the annualized immediate cost of compensating each full-time public school teacher in the 2021–22 school year was a nationwide average of $137,917.[162] [163]
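The annualization above is straightforward proportional scaling; a minimal check using the figures in this section:

```python
# Reproducing the annualization arithmetic: scale average teacher
# compensation up to private-industry work hours (figures from this section).
total_comp = 100_487     # average immediate compensation per teacher, 2021–22
teacher_hours = 1_490    # average annual hours, public school teachers
industry_hours = 2_045   # average annual hours, private industry workers

print(f"Hours gap: {industry_hours / teacher_hours - 1:.0%}")              # 37%
print(f"Annualized: ${total_comp * industry_hours / teacher_hours:,.0f}")  # $137,917
```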

* Adjusted for the costs of living in different states, the average annualized immediate costs of compensating each full-time public school teacher in the 2021–22 school year ranged from $104,944 in Florida to $174,940 in New York:

Average Annualized Immediate Compensation Per Public School Teacher, Adjusted for States’ Costs of Living

[164] [165]


* In the 2020–21 school year (latest data):

  • 16.8% of teachers had a job outside the school system during the school year, and among them, the average salary was $6,090.
  • 16.1% of teachers had a non-school job during the summer, and among them, the average salary was $3,550.
  • non-school jobs accounted for 2.4% of public school teachers’ total salaries.[166]

* In the 2020–21 school year (latest data), the average base salary for full-time public school teachers was 33% higher than for full-time private school teachers.[167]

* In March 2020, the average immediate cost per contract hour of compensating public school teachers and private school teachers varied as follows:

Compensation Component    Public School    Private School    Public School Premium
Wages and salaries        $45.29           $38.33            18%
Benefits                  $23.56           $13.22            78%
Total compensation        $68.85           $51.55            34%

[168]
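The premium column can be re-derived from the two cost columns (public ÷ private − 1), a useful sanity check on the figures:

```python
# Re-deriving the "Public School Premium" percentages from the
# hourly cost figures given in the text above.
rows = {
    "Wages and salaries": (45.29, 38.33),
    "Benefits":           (23.56, 13.22),
    "Total compensation": (68.85, 51.55),
}
for name, (public, private) in rows.items():
    print(f"{name}: {public / private - 1:.0%}")
# Wages and salaries: 18%
# Benefits: 78%
# Total compensation: 34%
```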

* The following caveats apply to the data above:

  • Immediate costs don’t include unfunded pension liabilities and non-pension post-employment benefits like health insurance.[169] [170] [171] [172] These costs are common in the government sector and rare in the private sector.[173] [174]
  • Contract hours do not include the added time that teachers work beyond their contractual schedules for lesson preparation and other nonclassroom activities.[175] In 2010, full-time private school teachers worked an average of 11% more hours than full-time public school teachers.[176]

K–12 Outcomes

General

* In the U.S., all 50 states provide children with at least 13 years of taxpayer-financed education from kindergarten through 12th grade.[177]

* The average public school year is 179 days, and the average school day is 6.7 hours, not including transportation and extracurricular activities.[178]

* In 2019, approximately 88% of K–12 students were enrolled in public schools, 10% were enrolled in private schools, and 2% were homeschooled.[179] [180]

* Among public school students who began high school in 2016, 87% graduated within four years. This was true for:

  • 93% of Asian students.
  • 90% of white students.
  • 83% of Hispanic students.
  • 81% of black students.
  • 75% of American Indian/Native Alaskan students.[181]

* In 2021, U.S. residents aged 25 to 64:

  • with some high school education who did not graduate high school reported an average of $20,648 in cash earnings.
  • with a high school degree and no further education reported an average of $33,989 in cash earnings.[182] [183]



College Readiness

* In 2022, 37% of high school students who graduated that year took the ACT college readiness exam.[184] Among these graduates, 22% met ACT’s college readiness benchmarks in all four subjects (English, reading, math, and science). For each subject, the rates of college readiness were as follows:

  • English – 53%
  • Reading – 41%
  • Science – 32%
  • Mathematics – 31%[185]

* Among high school students who graduated in 2022 and took the ACT college readiness exam, the following racial/ethnic groups met ACT’s college readiness benchmarks in all four subjects:

  • Asian – 51%
  • White – 29%
  • Hispanic – 11%
  • Pacific Islander – 10%
  • American Indian – 6%
  • African American – 5%[186]

* From 2010 to 2021, the average GPA of high school students who took the ACT college readiness exam rose by 5%, while their average ACT test score fell by 3%. Per ACT, “This suggests the presence of grade inflation” because “scores on a standardized measure of achievement” declined while GPAs increased.[187]


International Comparisons

* In 2019, the U.S. ranked 5th among 36 developed nations in average spending per full-time K–12 student. The average spending per U.S. pupil was 38% above the average of these nations.[188] [189]

Math

* In math tests administered by the Trends in International Mathematics and Science Study to 4th grade students during 2015, U.S. students ranked 14th among 48 nations. The average score of U.S. students was 8% above the average of all tested nations.[190]

* In math tests administered by the Program for International Student Assessment to 15-year-old students during 2018, U.S. students ranked 31st among 37 developed nations. The average score of U.S. students was 2% below the average of all tested nations.[191] [192] [193]

* U.S. students outperformed the following nations on the 4th-grade math exam but underperformed them on the 15-year-old math exam: Australia, Canada, Czech Republic, Finland, France, Germany, Hungary, Italy, Lithuania, Netherlands, New Zealand, Poland, Slovenia, Sweden, and Slovak Republic. U.S. students did not move ahead of any other nation between the 4th grade and 15 years old.[194] [195]

Reading

* In reading literacy tests administered by the Progress in International Reading Literacy Study to 4th grade students during 2016, U.S. students ranked 15th among 50 nations. The average score of U.S. students was 8% above the average of all tested nations.[196]

* In reading literacy tests administered by the Program for International Student Assessment to 15-year-old students during 2018, U.S. students ranked 9th among 36 developed nations. The average score of U.S. students was 4% above the average of all tested nations.[197] [198] [199]

* U.S. students outperformed Canada and New Zealand on the 4th-grade reading exam but underperformed them on the 15-year-old reading exam. U.S. students moved ahead of Hungary, Latvia, Norway, and the United Kingdom between the 4th grade and 15 years old.[200] [201]

Spending

* In 2013, Randi Weingarten, president of the American Federation of Teachers labor union, stated:

When people talk about other countries out-educating the United States, it needs to be remembered that those other nations are out-investing us in education as well.[202]

* In 2013, the U.S. ranked 5th among 33 developed nations in average spending per full-time K–12 student. The average spending per U.S. student was 28% above the average of these nations, and U.S. 15-year-olds ranked 19th in reading and 30th in math.[203] [204] [205]

* Among the same nations, U.S. 15-year-olds did not match or outperform any nation in both reading and math that outspent the U.S. The following nations matched or outperformed the U.S. in both reading and math while spending less than the U.S.:

Nation            U.S. Spending Premium    Math Advantage Over U.S.    Reading Advantage Over U.S.
Belgium           2%                       8%                          0%
United Kingdom    3%                       5%                          0%
Denmark           6%                       9%                          1%
Sweden            9%                       5%                          1%
Netherlands       12%                      9%                          1%
Germany           15%                      8%                          2%
France            22%                      5%                          0%
Finland           24%                      9%                          6%
Japan             24%                      13%                         4%
Australia         25%                      5%                          1%
Ireland           27%                      7%                          5%
New Zealand       32%                      5%                          2%
Slovenia          33%                      9%                          2%
Portugal          35%                      5%                          0%
South Korea       42%                      12%                         4%
Spain             53%                      3%                          0%
Estonia           75%                      11%                         4%
Poland            78%                      7%                          2%

[206] [207] [208]


Historical Perspective

* In 1885, the Jersey City, NJ school district spent an average of $13.24 over the course of the year for each of the 14,926 students in average daily attendance.[209] Adjusted for inflation into 2021 dollars, this is an average of $399 per student per year, or 1/43 the national average spending per student in 2020.[210] [211]
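The "1/43" comparison follows directly from the two per-student figures given:

```python
# Ratio of modern per-student spending to the inflation-adjusted
# 1885 Jersey City figure, both as stated in the text.
spending_1885_adj = 399    # 1885 spending per student, in 2021 dollars
spending_2020 = 17_013     # national average per student, 2019–20 school year
print(round(spending_2020 / spending_1885_adj))  # 43
```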

* Below are the arithmetic and algebra questions from the 1885 high school entrance exam in Jersey City, NJ. In order to enter high school, students had to score at least 75%. A copy of the full test and the names and scores of all passing students are shown in this footnote.[212]

Arithmetic

  1. If a 60 days note of $840 is discounted at a bank at 4½% what are the proceeds?
  2. Find the sum of √16.7281 and √.72¼.
  3. The interest of $50 from March 1st to July 1st is $2.50. What is the rate?
  4. What is the cost of 19 cwt. 83 lb. of sugar at $98.50 a ton? What is discount? A number?
  5. Divide the difference between 37 hundredths and 95 thousandths by 25 hundred thousandths and express the result in words.
  6. The mason work on a building can be finished by 16 men in 24 days, working 10 hours a day. How long will it take 22 men working 8 hours a day?
  7. A merchant sold a quantity of goods for $18,775. He deducts 5% for cash and then finds that he has made 10%. What did he pay for the goods?
  8. A requires 10 days and B 15 days to do a certain piece of work. How long will it take A and B working together to do the work?
  9. By selling goods at 12½% profit, a man clears $800. What was the cost of the goods, and for what were they sold?
  10. A merchant offered some goods for $1170.90 cash, or $1206 payable in 30 days. Which was the better offer for the customer, money being worth 10%?
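Several of these 1885 problems reduce to short calculations. As an illustration, the mason-work question is an inverse proportion (total man-hours held constant), and the A-and-B question is a work-rate sum:

```python
# Mason-work question: 16 men x 24 days x 10 hours/day of work,
# redone by 22 men working 8 hours/day.
man_hours = 16 * 24 * 10        # 3,840 total man-hours of work
days = man_hours / (22 * 8)
print(round(days, 2))           # 21.82 days

# A-and-B question: A finishes in 10 days, B in 15 days;
# their combined rate is 1/10 + 1/15 of the job per day.
together = 1 / (1/10 + 1/15)
print(round(together, 2))       # 6.0 days
```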

Algebra

  1. Define Algebra, an algebraic expression, a polynomial. Make a literal trinomial.
  2. Write a homogeneous quadrinomial of the third degree. Express the cube root of 10ax in two ways.
  3. Find the sum and difference of 3x−4xy+7cd−4xy+16, and 10ay−3x−8xy+7cd−13.
  4. Express the following in its simplest form by removing the parentheses and combining: 1−(1−a)+(1−a+a²)−(1−a+a²−a³).
  5. Find the product of 3+4x+5x²−6x³, and 4−5x−6x².
  6. Expand each of the following expressions and give the theorem for each: [a+4]², [a²−10]², [a+4] [a−4].
  7. Divide 6a⁴+4a³x−9a²x²−3ax³+2x⁴ by 2a²+2ax−x².
  8. Find the prime factors of x⁴−b⁴ and x³−1.
  9. Find the greatest common denominator of 6a²+11ax+3x² and 6a²+7ax−3x².
  10. Divide [x²−2xy+y²]/ab by [x−y]/bc and give the answer in its lowest terms.
  11. Change [2x²+5]/[x+3] to a mixed quantity.

Higher Education Costs

Spending Per Student

* In the 2020–21 school year, public 4-year colleges spent an average of $52,896 per full-time-equivalent student. For other types of colleges, spending per student varied as follows:

Control of Institution[213]    4-Year Colleges    2-Year Colleges
Public                         $52,896            $21,729
Private Nonprofit              $69,145            $27,464
Private For-Profit             $17,661            $14,828

[214]

* From 2000 to 2021, the average inflation-adjusted spending by private nonprofit 4-year colleges per full-time-equivalent student rose by 29%. For other types of colleges, spending per student varied as follows:

Inflation-Adjusted Average Spending Per College Student

[215]


Spending by Function

* In the 2020–21 school year, private for-profit colleges spent an average of 28% of their budgets on student instruction. The breakdown of spending by function for all types of colleges was as follows:

Function                      Public    Private Nonprofit    Private For-Profit
Instruction[216]              27%       29%                  28%
Research[217]                 10%       11%                  0.2%
Public service[218]           4%        1%                   –
Academic support[219]         8%        9%                   65%
Student services[220]         5%        8%                   –
Institutional support[221]    9%        13%                  –
Hospitals[222]                15%       16%                  –
Auxiliary enterprises[223]    7%        7%                   2%
Other[224] [225]              13%       6%                   6%

(Private for-profit colleges report academic support, student services, and institutional support as a single combined figure, shown in the academic support row; dashes mark categories they do not report separately.)

[226] [227]


Taxpayer Funding

* During 2021, federal, state, and local governments spent $232 billion on higher education,[228] not including additional government funding of university research, university hospitals, and student loans.[229] [230] [231] This $232 billion amounts to:

  • $1,796 for every household in the United States.[232]
  • 47% of spending by public and private colleges on all functions but research and hospitals.
  • 90% of spending by public and private colleges on all functions that directly contribute to the education of students and the general public.[233] This:
    • includes instruction (like teaching salaries and classrooms), public services (like informational conferences), and academic support (like libraries and information technology).[234] [235]
    • doesn’t include student services (like recreation and cultural events), institutional support (like administration and advertising), auxiliary enterprises (like dorms and food), and other miscellaneous expenses.[236] [237]
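
The per-household figure above can be checked with simple division; the household count used here is an assumption of this sketch (roughly 129 million U.S. households in 2021), not a number stated in this section.

```python
# Back-of-envelope check: total government higher-education spending
# divided by an assumed count of U.S. households.
total_spending = 232e9      # 2021 government spending on higher education
households = 129.2e6        # assumed number of U.S. households (hypothetical)

per_household = total_spending / households
print(round(per_household))
```

Under that assumption, the result lands at about $1,796 per household, matching the cited figure.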

* From 1959 to 2021, inflation-adjusted government spending on higher education rose from $4,137 per student per year to $13,434. This doesn’t include additional government funding for university research, university hospitals, and student loans:

Inflation-Adjusted Government Spending Per Higher Education Student

[238] [239] [240] [241]


Tuition, Fees, Room & Board

* Colleges and universities publish “rates” or “sticker prices” for their tuition, fees, room, and board. Individual students pay less than these sticker prices if they receive discounts, scholarships, or financial aid.[242]

* In the 2020–21 school year, the average sticker price for:

  • tuition and fees at public 2-year colleges was:
    • $3,501 for in-state students.
    • $8,256 for out-of-state students.
  • tuition, fees, room, and board at public 4-year colleges was:
    • $21,337 for in-state students.
    • $39,054 for out-of-state students.
  • tuition, fees, room, and board at private 4-year colleges was $43,313.[243]

* From 1964 to 1980, the average annual inflation-adjusted sticker price for tuition, fees, room, and board for all full-time undergraduate students fell by 11%. From 1980 to 2022, it rose by 167%:

Inflation-Adjusted College Tuition, Fees, Room, and Board

[244]

* Colleges that are subsidized by taxpayers and donors generally spend more money per student than their sticker prices. In the 2020–21 school year, the average amount spent by colleges for each full-time-equivalent student at:

  • 2-year public colleges was about:
    • 6.2 times greater than their average sticker price for in-state students.[245]
    • 2.6 times greater than their average sticker price for out-of-state students.[246]
  • 4-year public colleges was about:
    • 2.5 times greater than their average sticker price for in-state students.[247]
    • 1.4 times greater than their average sticker price for out-of-state students.[248]
  • 4-year private non-profit colleges was about 1.5 times greater than their average sticker price.[249]
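
The public-college ratios above can be reproduced from the per-student spending and sticker-price figures cited earlier in this section (the numbers are copied from the text, not independently sourced):

```python
# Spending per full-time-equivalent student (2020-21), from this section.
spending = {"2-year public": 21_729, "4-year public": 52_896}

# Average sticker prices (2020-21), also from this section.
sticker = {
    ("2-year public", "in-state"): 3_501,
    ("2-year public", "out-of-state"): 8_256,
    ("4-year public", "in-state"): 21_337,
    ("4-year public", "out-of-state"): 39_054,
}

# Ratio of spending per student to each sticker price.
for (college, residency), price in sticker.items():
    ratio = spending[college] / price
    print(f"{college}, {residency}: {ratio:.1f}x")  # 6.2x, 2.6x, 2.5x, 1.4x
```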

Fraud

* For the 2012 tax year, 12.2 million tax filers (claiming 13.4 million students) received $19 billion in higher education tax credits.[250] Tax credits decrease the taxes that people must pay on a dollar-for-dollar basis, and some are refundable, which means that households with credits that exceed their income taxes receive the difference as cash payouts from the government. Per the IRS Inspector General, “the risk of fraud for these types of claims is significant.”[251] [252] [253] [254] [255]
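
How a refundable credit can become a cash payout can be sketched with hypothetical numbers (none of these amounts come from the IRS report):

```python
# A refundable credit offsets tax owed dollar-for-dollar; any excess
# beyond the tax owed is paid out to the filer in cash.
def apply_refundable_credit(tax_owed: float, credit: float) -> tuple:
    remaining_tax = max(tax_owed - credit, 0)
    cash_payout = max(credit - tax_owed, 0)
    return remaining_tax, cash_payout

print(apply_refundable_credit(2_000, 1_500))  # (500, 0): credit only reduces tax
print(apply_refundable_credit(500, 1_500))    # (0, 1000): excess refunded as cash
```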

* In 2015, the IRS Inspector General published an investigation of higher education tax credits for the 2012 tax year. The investigation found that 3.6 million tax filers (claiming 3.8 million students) received $5.6 billion in credits “that appear to be erroneous based on IRS records.” Some examples include:

  • 1.6 million filers (claiming 1.7 million students) who received $2.5 billion in credits, even though the educational institutions listed on their tax forms were not eligible for the credits.
  • filers claiming 419,827 students who received at least five years of credits, even though the law limits these credits to four years per student.
  • 2,148 tax filers who received $3.9 million in credits for people who were incarcerated for the entire year.[256]

Student Loans

Overview

* The federal government offers student loans that can be used to attend college, vocational schools, or trade schools.[257]

* There are different types of federal student loans, each with its own conditions and interest rates. Most of these loans require borrowers to repay the money within 10 years of finishing college.[258]

* For people with good credit histories, the market rates on private student loans are sometimes lower than the rates on federal student loans.[259]
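
The fixed monthly payment implied by a standard 10-year repayment schedule follows the ordinary loan-amortization formula; the balance and interest rate below are hypothetical illustrations, not figures from this section.

```python
# Fixed monthly payment on an amortizing loan:
# payment = principal * r / (1 - (1 + r)^-n), with monthly rate r and n payments.
def monthly_payment(principal: float, annual_rate: float, years: int = 10) -> float:
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical example: $30,000 borrowed at 5% over the standard 10 years.
print(f"${monthly_payment(30_000, 0.05):,.2f} per month")
```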


Current Status

* As of the first quarter of 2023:

  • Americans owed $1.6 trillion in student loans, more than any other type of consumer debt except mortgages.[260]
  • 95% of outstanding student loan balances were federal loans.[261]
  • 99% of student loans were not being repaid,[262] mainly because the federal government suspended payments in the wake of the Covid-19 pandemic.[263] [264] [265] [266] [267]

* In the context of student loans:

  • “default” means that no payments have been made for more than 360 days.
  • “deferment” means that payments have been postponed for reasons such as “returning to school, military service, or economic hardship.”
  • “forbearance” means that payments have been temporarily suspended or reduced because of financial hardship, including when the federal government suspended student loan repayments in the wake of the Covid-19 pandemic.[268] [269] [270]

* In the first quarter of 2023, 1% of all federal student loans were actively being repaid. The other 99% fell into the following categories:

  • 73% in forbearance
  • 9% in default
  • 8% in deferment
  • 7% still in school
  • 1% in a grace period
  • 0.5% other[271]

Risks

* Per the U.S. Treasury, the federal government creates loan programs so that people who are “unable to afford credit at the market rate” or have a “high risk” of defaulting can borrow money at “an interest rate lower than the market rate.”[272]

* Per the U.S. Congressional Budget Office, “When the government extends credit, the associated market risk of those obligations is effectively passed along to citizens….”[273]

* Per a 2014 report by the U.S. Treasury Borrowing Advisory Committee:

A key concern is that students are taking on student loans because historically an education has been correlated with economic mobility; however, today an average of 40% of students at four-year institutions (and 68% of students in for-profit institutions) do not graduate within six years, which means they most likely do not benefit from the income upside from a higher degree yet have the burden of student debt.[274]

* Per Deborah J. Lucas, director of the MIT Center for Finance and Policy and former chief economist of the Congressional Budget Office:[275]

Government credit programs may have adverse consequences that must be weighed against their expected benefits. One concern is that credit subsidies will distort the allocation of capital in the economy and crowd out productive investments by households and firms.
A related concern is that credit subsidies tend to affect the price of goods and services so as to reduce the benefits to the intended beneficiaries. Consider the mortgage guarantees offered to first-time home buyers by the FHA [Federal Housing Administration]. The program increases the demand for housing, which in turn puts upward pressure on home prices. Such price increases benefit current homeowners at the expense of first-time home buyers, possibly offsetting the value of the mortgage subsidy. As another example, some observers point to the easy and low-cost access to federal student loans as fueling the steep rise in the cost of higher education in the last decade.
Easier access to credit markets is not always advantageous to program participants. Unsophisticated borrowers, such as some college students and first-time homebuyers, may not be fully aware of the costs and risks associated with accumulating high debt loans. Consumer protection and disclosure laws usually do not extend to the government, and there is the possibility that it will inadvertently offer poorly designed products that can harm consumers. …
A well-understood consequence of government credit provision is that it tends to create incentives for greater risk taking, particularly when a borrower becomes financially distressed. The reason is that a debtor with a guaranteed debt benefits from the upside if a gamble pays off, whereas the government shares in the losses if the gamble fails.[276]

History

* In 1965, the 89th U.S. Congress and Democratic President Lyndon B. Johnson created a program to finance student loans for higher education. These loans were issued by private lenders and guaranteed against default by the federal government.[277] [278] [279]

* In 1993, the 103rd U.S. Congress and Democratic President Bill Clinton created a program to finance student loans directly from the U.S. Treasury. The law required that increasing portions of all new federal student loans be made through this program.[280] The bill passed Congress with 85% of Democrats voting for it and 100% of Republicans voting against it.[281]

* In 2010, the 111th Congress and Democratic President Barack Obama passed a law requiring that all new federal student loans be financed directly from the U.S. Treasury.[282] [283] [284] The bill passed Congress with 88% of Democrats voting for it and 99% of Republicans voting against it.[285]

* As of September 30th, 2022, 93% of all student loans were owed to or guaranteed by the federal government.[286]

* As of the first quarter of 2023, Americans had $1.6 trillion of outstanding student loan debt:

Inflation-Adjusted Total Student Loan Debt

[287] [288] [289] [290]

* In 2012, the 90+ day delinquency rate for student loans exceeded that of credit cards for the first time since reliable data on this measure became available in 2003.[291] It remained the most common type of delinquent debt until early 2020 when the federal government passed a law that suspended student loan payments in the wake of the Covid-19 pandemic:[292] [293] [294] [295] [296]

Balance of Consumer Loans 90+ Days Delinquent

[297] [298] [299] [300]

* In 2020, Congress passed and President Trump signed a “Covid-19 relief” law that suspended student loan payments and interest through September 2020.[301] After this, President Trump and President Biden repeatedly extended this policy without clear legal authority to do so.[302] The cost to taxpayers of these actions was roughly $102 billion.[303]

* Before student loan payments were suspended in 2020, 57% of all federal student loan balances were actively being repaid or less than 360 days delinquent. As of the second quarter of 2022, this figure was 1%:

Portion of Federal Student Loans Actively Being Repaid

[304] [305] [306]

* Before the federal government suspended student loan payments,[307] the 43% of loans that were not actively being repaid fell into the following categories:

  • 13% in default
  • 10% in forbearance
  • 9% still in school
  • 9% in deferment
  • 2% in a grace period

Forgiveness/Transference

* When people don’t pay back student loans because politicians forgive them, this debt is transferred to people who did not borrow the money.[310] [311]

* Since 1976, federal law has prohibited people from reneging on federal student loans by filing for bankruptcy (except in rare cases).[312] [313] [314] [315]

* Federal laws authorize more than 50 federal student loan forgiveness and repayment programs. Such programs reduce or eliminate student loan debt for various reasons, such as having income below certain thresholds or being a government employee.[316]

* In 2015, President Obama instructed his administration to “develop recommendations for regulatory and legislative changes for all student loan borrowers, including possible changes to the treatment of loans in bankruptcy proceedings….”[317]

* In 2015, the Obama administration issued regulations that limited student loan payments to 10% of borrowers’ monthly incomes and forgave:

  • undergraduate loans after 20 years of payments.
  • graduate program loans after 25 years.
  • government employee loans after 10 years.[318] [319]

* In 2021–22, the Biden administration issued regulations that:

  • made loan cancellation automatic for disabled individuals, regardless of income.[320]
  • will more than double the government employee loan forgiveness program.[321] [322]

* By law, the U.S. Department of Education can forgive federal student loans for borrowers who attended a school that “violated state law” through “misleading activities or other misconduct [that] directly relate to the loan or to the educational services for which the loan was provided.”[323] The Obama administration in 2015 and Biden administration in 2021–22 announced regulations to “streamline” these applications and expand the scope of loan forgiveness to include:

  • students whose schools closed down while they were in attendance.
  • people “who believe they were victims of fraud, regardless of whether their school closed.”
  • refunds of any student loan payments already made.
  • full loan forgiveness for people who previously received partial loan forgiveness.
  • automatic forgiveness for certain students who have not applied for it.[324] [325] [326] [327] [328]

* With regard to this law:

  • In 2015, the Obama administration announced that it was forgiving the federal student loans of people who attended schools owned by Corinthian Colleges, Inc., a for-profit company that filed for bankruptcy under allegations of fraud.[329] [330]
  • In 2017, the Trump administration announced a plan to protect “taxpayers from being forced to shoulder massive costs that may be unjustified” by calling for “tiers of relief … based on damages incurred.”[331] [332] After court losses, the Trump administration withdrew its plan and implemented the original regulations in 2019.[333] [334] [335] [336]
  • In 2021–22, the Biden administration:
    • cancelled about $13 billion in student loans.[337] [338] [339] [340]
    • cancelled loans averaging $40,000 per student of DeVry University because the school misrepresented its job placement rate.
    • continued issuing new loans to students of DeVry.[341] [342]

* In 2022, President Biden announced that he was “forgiving” $20,000 of student loan debt for the vast bulk of Pell Grant recipients and $10,000 for others who owe student loans.[343] With regard to this action, the Biden administration:

  • made this policy applicable to individuals with incomes up to $125,000 per year and households with incomes up to $250,000.[344]
  • proposed a “new income-driven repayment plan that will substantially reduce future monthly payments for lower- and middle-income borrowers.”[345]
  • claimed this action was legal under a 2003 law called the Higher Education Relief Opportunities for Students Act, or HEROES Act.[346] [347]

* The Penn Wharton Budget Model estimated that Biden’s student loan cancellations and payment reductions would have cost $605 billion to more than $1 trillion.[348] If these expenses were equally divided among all households in the U.S., they would have cost each household about $4,700 to $7,700.[349]

* With regard to the legality of Biden’s action:

  • the sponsor of the HEROES Act, Republican John Kline of Minnesota,[350] introduced it by stating that it:
    • was “simple in its purpose” and “specific in its intent.”
    • will “assist students who are being called up to active duty or active service” and those who are impacted by “a war, military contingency operation or a national emergency.”
    • would not affect the “integrity” of student loan programs.[351]
  • the House debated the bill for 40 minutes and agreed that it would not cost any money.[352] [353]
  • the House passed the bill by a vote of 421–1, and the Senate passed it “without amendment by unanimous consent.”[354]
  • the text of the law states that it won’t impair “the integrity of the student financial assistance programs.”[355]
  • a federal district court judge struck down Biden’s action in November 2022, ruling that:
    • “the HEROES Act does not mention loan forgiveness.”
    • “it is unclear if Covid-19 is still a ‘national emergency’ under the Act,” especially since “the Covid-19 pandemic was declared a national emergency almost three years ago,” and President Biden stated that the pandemic was “over” just a few weeks before he announced that he was cancelling student loans.
    • Biden’s action constitutes “one of the largest exercises of legislative power without congressional authority in the history of the United States.”
    • “in this country, we are not ruled by an all-powerful executive,” as this would be a system that constitutes “the very definition of tyranny.”[356]
  • the Supreme Court ruled in June 2023 that President Biden did not have the authority to forgive student loan debt through the HEROES Act.[357]

* Shortly after the Supreme Court’s ruling, the Biden administration:

  • finalized a new plan to reduce repayments and forgive some student loans.[358]
  • formally announced a “regulatory process to provide debt relief.”[359] [360]
  • began notifying more than 804,000 borrowers that they were no longer responsible for $39 billion of student loan debt.[361]

* In total, the Biden administration cancelled $117 billion in student loans from 2021 to 2023, including:

  • $45 billion for 653,800 public workers.
  • $39 billion for borrowers who experienced “historical failures” in loan program administration.
  • $22 billion for about 1.3 million borrowers from schools that “violated state law” through “misleading activities or other misconduct.”
  • $10.5 billion for 491,000 disabled borrowers.[362] [363] [364]

Accreditation

* For a student to receive a federal loan to attend a specific college, the college must be accredited, meaning it has been officially certified as an institution that delivers quality education.[365]

* The process of accreditation takes place at least once every 10 years and is generally conducted by private non-profit agencies. These agencies are sanctioned by the Department of Education, which is under the authority of the U.S. president.[366] [367]

* Accrediting agencies have the power to sanction colleges by denying, suspending, or revoking their accreditation. These agencies can also take interim actions, such as placing colleges on probation and requiring them to submit financial reports.[368]

* In January 2015, the U.S. Government Accountability Office published the results of an investigation of accrediting agencies and the Department of Education from October 2009 through March 2014. The study found that:

  • the accreditors responsible for accrediting for-profit colleges “were no more likely to issue terminations or probations to schools with weaker student outcomes compared to schools with stronger student outcomes….” This includes outcomes such as graduation rates, dropout rates, and student loan default rates.
  • “for 36 of the 93 schools receiving federal student aid funds that were placed on probation by their accreditors in fiscal year 2012, we found no indication of follow-up activities by [the Department of] Education between the beginning of fiscal year 2012 and December 2013.”
  • a Department of Education official “noted that her team would never respond to accreditor probations because they occur too frequently to track and would disrupt other work.”[369]

* Per the study’s conclusion:

These findings raise questions about whether existing accreditor standards are sufficient to ensure the quality of schools, whether [the Department of] Education is effectively determining if these standards ensure educational quality, and whether federal student aid funds are appropriately safeguarded.[370]

* Five months after the results of this investigation were published, the Obama administration issued a press release stating:

Over the past six years, the Education Department has taken unprecedented steps to hold career colleges accountable for giving students what they deserve: a high-quality, affordable education that prepares them for their careers.[371]

Federal Accounting

* When the federal government lends money for student loans, the government doesn’t report these amounts as outlays in the federal budget. Instead, the budget reflects only what the government projects it will lose or gain on these loans.[372] [373]

* Under federal budget rules, the federal government typically projects that it will make money on student loans. Thus, the more money the government loans, the better the budget appears to be.[374]

* Federal budget rules do not account for the market risk of issuing student loans. Market risk stems from the possibility that the economy will perform worse than the government projects, which would increase default rates and have other negative effects on returns from these loans.[375] [376]
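
A toy present-value calculation (all numbers hypothetical) shows how the discount rate drives this: the same stream of expected repayments looks profitable when discounted at a low Treasury borrowing rate, and unprofitable when discounted at a higher rate that prices in market risk.

```python
# Present value of ten expected annual repayments of $1,200 on a
# $10,000 loan, under two discount rates (both rates are assumptions).
principal = 10_000
payment = 1_200
years = 10

def present_value(rate: float) -> float:
    return sum(payment / (1 + rate) ** t for t in range(1, years + 1))

pv_treasury = present_value(0.03)  # assumed Treasury borrowing rate
pv_market = present_value(0.07)    # assumed market-risk-adjusted rate

print(f"At 3%: {pv_treasury - principal:+,.0f}")  # positive: scored as a profit
print(f"At 7%: {pv_market - principal:+,.0f}")    # negative: scored as a loss
```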

* Per estimates made by the Congressional Budget Office:

  • in 2012, the federal government:
    • projected it would reap an average profit of 9% on the student loans it made from 2010 to 2020.
    • would have projected an average loss of 12% if it accounted for the market risk of those loans.[377]
  • in 2022, the federal government:
    • projected it will reap an average profit of 2% on the student loans that it makes in 2023.
    • would have projected an average loss of 9% if it accounted for the market risk of these loans.[378]

* In 2022, the U.S. Government Accountability Office reported:

Although the Department of Education originally estimated federal direct [student] loans made in the last 25 years would generate billions in income for the federal government, its current estimates show these loans will cost the government billions. Education originally estimated these loans to generate $114 billion in income for the government. Although actual costs cannot be known until the end of the loan terms, as of fiscal year 2021 these loans are estimated to cost the federal government $197 billion.[379]

Higher Education Outcomes

General

* Institutes of higher learning are also known as colleges, universities, and post-secondary schools.[380] Such institutions award:

  • associate degrees for completing a program that typically requires 2–4 full-time school years.
  • baccalaureate (or bachelor’s) degrees for completing a program that typically requires 4–5 full-time school years.
  • master’s degrees, which typically require 1–2 full-time years of graduate school after obtaining a bachelor’s degree.[381]
  • doctoral academic (or Ph.D.) degrees, which typically require 5–10 years of full-time graduate school. The coursework for such degrees is largely geared toward people who intend to conduct research or become a professor, although it typically provides little instruction in how to teach.[382] [383] [384] [385]
  • doctoral professional degrees, which require at least two years of full-time college work before entering the program and then at least six full-time years in the program. The coursework for such degrees is largely geared toward people who intend to practice in fields such as medicine, dentistry, law, and theology.[386] [387] [388]

* As of the fall of 2021, roughly 19.0 million students were attending U.S. colleges. Among these students:

  • 59% were female, and 41% were male.
  • 73% were at 4-year colleges, and 25% were at 2-year colleges.
  • 61% were attending full time, and 39% were attending part time.[389]

* From 1960 to 2021, the portion of recent high school graduates (aged 16–24) enrolled in college:

  • increased from 45% to 62%.
  • increased from 54% to 55% for males.
  • increased from 38% to 70% for females.

Portion of High School Graduates Aged 16–24 Enrolled in College

[390]

* Among recent high school graduates of different racial/ethnic groups, the rates of college enrollment in 2021 were:

  • 85% for Asians.
  • 62% for whites.
  • 59% for African Americans.
  • 59% for Hispanics.[391]

Graduation Rates

* Among full-time, new college students who entered a 2-year college in 2017, 34% graduated from it within 150% of the normal time required to do so (typically three years). This was true for:

  • 62% of students at for-profit colleges.
  • 52% of students at nonprofit colleges.
  • 29% of students at public colleges.
  • 35% of female students.
  • 32% of male students.
  • 42% of Asian students.
  • 36% of white students.
  • 32% of Hispanic students.
  • 30% of American Indian students.
  • 28% of mixed-race students.
  • 25% of black students.[392]

* Among full-time, new college students who entered a 4-year college in 2014, 47% graduated from the same institution within four years.[393]

* Among full-time, new college students who entered a 4-year college in 2013, 63% graduated from it within six years. This was true for:

  • 68% of students at nonprofit institutions.
  • 62% of students at public institutions.
  • 26% of students at for-profit institutions.
  • 66% of female students.
  • 60% of male students.
  • 76% of Asian students.
  • 67% of white students.
  • 60% of mixed-race students.
  • 58% of Hispanic students.
  • 44% of black students.
  • 41% of American Indian students.[394]

Earnings

* In 2021, people aged 25–64:

  • with some college but no degree reported an average of $41,233 in cash earnings.
  • with an associate’s degree and no further education reported an average of $44,453 in cash earnings.
  • with a bachelor’s degree and no further education reported an average of $72,333 in cash earnings.
  • with a master’s degree and no further education reported an average of $88,300 in cash earnings.
  • with a doctoral degree reported an average of $133,188 in cash earnings.
  • with a professional degree reported an average of $147,307 in cash earnings.[395] [396]



Effort & Grades

* From 1961 to 2003, the average time spent by full-time college students on educational activities like attending class and studying dropped from roughly 40 hours per week to 27 hours per week.[397]

* During the 2005–06 and 2006–07 school years, full-time students at 4-year colleges spent an average of about:

  • 27–28 hours per week, or 16–17% of their time, on educational activities.
  • 43 hours per week, or 26% of their time, on leisure activities and sports.[398]

* In 1960, roughly 15% of college course grades were A’s. By 1988, approximately 31% of grades were A’s. By 2013, about 45% of grades were A’s.[399] [400]


Practical Skills

* The Collegiate Learning Assessment (CLA) is a test designed to measure the “core outcomes” of higher education, including “critical thinking, analytical reasoning, problem solving, and writing.”[401] This assessment evaluates how well college students perform “real-world tasks that are holistic and drawn from life situations.”[402] [403]

* In 2014, Professor Richard Arum of New York University and Assistant Professor Josipa Roksa of the University of Virginia published a study using the CLA to measure the “critical thinking, complex reasoning, and writing skills” of 1,666 full-time students who entered 4-year colleges in the fall of 2005 and graduated in the spring of 2009. The authors found that:

  • if the test “were rescaled to a one-hundred-point scale, approximately one-third of students would not improve more than one point over four years of college.”
  • “after four years of college, an average-scoring student in the fall of his or her freshman year would score at a level only eighteen percentile points higher in the spring of his or her senior year. Stated differently, freshmen who entered higher education at the 50th percentile would reach a level equivalent to the 68th percentile of the incoming freshman class by the end of their senior year.”
  • “students attending high-selectivity institutions improve on the CLA substantially more than those attending low-selectivity institutions, even when models are adjusted for students’ background and academic characteristics. … While students in more selective institutions gain more on the CLA, their gains are still modest….”[404] [405]
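
The 18-percentile-point gain can be restated as an effect size if one assumes CLA scores are normally distributed; the normality assumption belongs to this sketch, not to the study itself.

```python
from statistics import NormalDist

# The z-score at which the cumulative normal distribution reaches 0.68:
# a student moving from the 50th to the 68th percentile of the incoming
# freshman distribution gains this many standard deviations.
effect_size = NormalDist().inv_cdf(0.68)
print(f"{effect_size:.2f} standard deviations")  # ≈ 0.47
```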

* Using test questions from the National Center for Education Statistics’ adult test of practical literacy, the American Institutes for Research assessed the literacy skills of 1,827 graduating college students in 2003. These students were randomly selected from across the U.S., and each was graded as Proficient, Intermediate, Basic, or Below Basic on three different types of literacy:[406]

1) Prose Literacy, which is the ability to “search, comprehend, and use information from continuous texts,” such as “editorials, news stories, brochures, and instructional materials.” Students who were proficient in this included:

  • 38% of males and 37% of females at 4-year colleges.
  • 24% of males and 22% of females at 2-year colleges.
  • 42% of whites, 29% of Hispanics, 23% of Asians/Pacific Islanders, and 16% of blacks at 4-year colleges.
  • 27% of whites, 22% of Hispanics, 11% of blacks, and 7% of Asians/Pacific Islanders at 2-year colleges.[407]

2) Document Literacy, which is the ability to “search, comprehend, and use information from noncontinuous texts,” such as “job applications, payroll forms, transportation schedules, maps, tables, and drug or food labels.” Students who were proficient in this included:

  • 43% of males and 38% of females at 4-year colleges.
  • 24% of males and 24% of females at 2-year colleges.
  • 45% of whites, 35% of Hispanics, 20% of Asians/Pacific Islanders, and 17% of blacks at 4-year colleges.
  • 28% of whites, 18% of Asians/Pacific Islanders, 15% of Hispanics, and 10% of blacks at 2-year colleges.[408]

3) Quantitative Literacy, which is the ability to “identify and perform computations … using numbers embedded in printed materials,” such as “balancing a checkbook, figuring out a tip, completing an order form, or determining the amount of interest on a loan from an advertisement.” Students who were proficient in this included:

  • 39% of males and 30% of females at 4-year colleges.
  • 20% of males and 16% of females at 2-year colleges.
  • 40% of whites, 20% of Asians/Pacific Islanders, 19% of Hispanics, and 5% of blacks at 4-year colleges.
  • 24% of whites, 14% of Hispanics, 7% of blacks, and 3% of Asians/Pacific Islanders at 2-year colleges.[409]

* The study also found:

  • “The literacy of students in 4-year public institutions was comparable to the literacy of students in 4-year private institutions.”
  • “Prose literacy was higher for students in selective 4-year colleges, though differences between selective and nonselective 4-year colleges for document and quantitative literacy could not be determined because of the sample size.”
  • “College students come from a variety of economic backgrounds, with some students supporting themselves and others relying on their families to pay for tuition and other necessities. Despite variations in income, most differences in the literacy of students across income groups were not significant.”[410]

College Student Literacy Scores and Family Income

[411]


* A 2013 Gallup poll of 623 business leaders found that over two-thirds do not think U.S. college graduates have the necessary “skills and competencies” for their particular business.[412]

* In 2020, the Association of American Colleges and Universities commissioned a poll of employers who hire people with bachelor’s degrees to assess their views of recent college graduates. The poll included 496 employers, had a margin of sampling error of plus or minus 5 percentage points, and found the following results:

  • About 49% of employers are “very satisfied” with graduates’ “ability to apply the skills and knowledge learned in college to complex problems in the workplace.”
  • A majority of employers find 14 skills “very important,” and the portions of employers who think recent graduates are “very well prepared” in these skills are:
    • 49% for digital literacy.
    • 48% for teamwork.
    • 46% for creative thinking.
    • 44% for writing.
    • 44% for quantitative reasoning.
    • 43% for intercultural skills.
    • 42% for decision making.
    • 41% for data analysis.
    • 41% for ethical judgment.
    • 41% for verbal communication.
    • 39% for complex problem-solving.
    • 39% for critical thinking.
    • 39% for practical application of knowledge.
    • 39% for integrating ideas across settings.[413] [414]

Comparative Earnings

* In 2021, U.S. residents aged 25 to 64 reported average cash earnings of $54,252.[415] Cash earnings do not include non-cash compensation, such as employee fringe benefits.[416] For varying levels of education, average reported cash earnings were as follows:

Average Cash Earnings of People 25–64

[417] [418]

* In 2021, 79% of U.S. residents aged 25–64 reported at least some cash earnings and 21% did not report any cash earnings. For varying levels of education, the rates were as follows:

Portion of People Aged 25–64 with Cash Earnings

[419] [420]

* Among U.S. residents aged 25–64 who reported cash earnings in 2021, average cash earnings were $68,900. Among these same people, median cash earnings were $51,000.[421] For varying levels of education, median cash earnings were as follows:

Median Cash Earnings of People Aged 25–64 With Earnings

[422] [423]

Preschool Spending

Overview

* During 2022, private consumers and nonprofit organizations in the U.S. spent $22.0 billion on day care and preschools/nursery schools.[424] [425] [426] [427] [428]

* During 2015, the federal government funded 47 programs that provided or subsidized education and/or childcare for children under the age of five.[429]

* The largest federal education/childcare program for preschoolers is called “Head Start.”[430] [431] During 2022, Head Start served 788,341 children and 12,736 pregnant women at some point during the year.[432]

* In 2021, the federal government spent an average of $12,809 for each person enrolled in Head Start. This does not include additional funds from state governments.[433]


Fraud

* Federal law requires that at least 90% of Head Start enrollees have incomes below 130% of the federal poverty line. To determine if the law was being enforced, the U.S. Government Accountability Office (GAO) conducted 15 undercover tests of Head Start centers in six states from 2008 to 2010. The investigation found the following:

  • “In 8 instances staff at these centers fraudulently misrepresented information, including disregarding part of the families’ income to register over-income children into under-income slots.”
  • “At no point during our registrations was information submitted by GAO’s fictitious parents verified, leaving the program at risk that dishonest persons could falsify earnings statements and other documents in order to qualify.”
  • One Head Start staffer “explained that families often lie about being separated or divorced in order to reduce their income and that Head Start is not strict about checking whether that is true.”
  • The “lack of documentation made it virtually impossible to determine whether only under-income children were enrolled in spots reserved for under-income children.”[434]

* From 2017 to 2019, GAO reviewed the Head Start program to determine if eligibility and enrollment problems persisted. Of the 15 centers covertly tested:

  • seven correctly identified ineligible families.
  • three accepted applicants without verifying eligibility documentation.
  • three fabricated income information on applications.
  • two omitted documents that made the applicant ineligible.[435]

Preschool Outcomes

General

* During 2021, 26% of all 3- to 4-year-olds in the U.S. were enrolled in government-controlled education programs. In 1965, this figure was 1%.[436]

* During 2021, 19% of all 3- to 4-year-olds in the U.S. were enrolled in private education programs. In 1965, this figure was 4%.[437]

* In 2013, President Obama called on Congress to fund certain initiatives that would allow every child in the U.S. from birth to age five to have access to government-controlled early learning programs. Specifically, he called for funding to:

  • provide “new, full-day” Early Head Start programs for children from birth to age three.
  • allow all four-year-olds from families with incomes at or below 200% of the poverty line to be enrolled in government preschools.[438]

* In May 2015, U.S. Senator Patty Murray (D-WA) introduced a bill that would enact much of President Obama’s early learning agenda. At the end of President Obama’s term in January 2017:

  • the bill had 24 cosponsors, including 23 Democrats and a self-described “democratic socialist” who caucuses with the Democrats.
  • the Senate, which had a Republican majority, had not taken any action on this bill.[439] [440] [441] [442] [443] [444]

* In 2021, President Biden proposed a social spending plan that called for:

  • subsidizing child care costs for most families with young children, including “nearly all families of four making up to $300,000 per year.”
  • taxpayer-funded preschool for all three- and four-year-olds in both government-run and private schools.[445] [446]

* In September 2021, U.S. Representative John Yarmuth (D-KY) introduced a bill that would have enacted much of President Biden’s childcare and preschool agenda.[447] The bill passed the House of Representatives with 220 of 221 Democrats voting for it and 212 of 213 Republicans voting against it.[448] The Democrat-controlled Senate, however, stripped Biden’s childcare and preschool agenda from the bill.[449] [450] [451]


Head Start

* The largest federal education/childcare program for preschoolers is Head Start, which “provides comprehensive educational, social, health, and nutritional services to low-income preschool children and their families.”[452] [453] [454] [455] [456]

* Head Start operates mostly during the school year and has full-day and part-day programs. When Head Start programs are in session, the average participant attends about 24 to 28 hours per week.[457] [458]

* From 2002 through 2008, the U.S. Department of Health & Human Services conducted a nationally representative study of 3- and 4-year-old children whose parents had applied for enrollment in Head Start and were found to be eligible. The study included 4,667 children from high-poverty communities. The design and results were as follows:

  • The children were randomly assigned to groups that were either enrolled in Head Start or not enrolled in Head Start due to a lack of available slots.
  • Among the children not enrolled in Head Start, about 60% were placed by their parents in other types of preschool programs.
  • The researchers measured 41 outcomes relating to the children’s educational performance, physical health, emotional development, and parental interactions up through third grade.
  • The researchers found that “there were initial positive impacts from having access to Head Start, but by the end of 3rd grade there were very few impacts,” and among these, some were positive and some were negative with no “clear pattern” in either direction.[459]

* In 2022, the Inspector General of the U.S. Department of Health & Human Services published a five-year review of the Head Start program that found 27% of grant recipients failed to promptly report incidents of child neglect, including:

  • lack of supervision (533 incidents)
  • child abuse (454 incidents)
  • children released to an unauthorized person (42 incidents)[460]

Government Pre–K Program in Tennessee

* In 2022, Vanderbilt University conducted an experimental study of two groups of 4-year-old children whose parents applied for enrollment in a government preschool program in Tennessee. The study included 2,990 children from low-income families. The design and results were as follows:

  • The children were randomly assigned to groups that were either enrolled or not enrolled in the preschool program due to a lack of available slots.
  • Among the children not enrolled, about 39% were placed by their parents in other preschools.
  • The researchers measured nine outcomes relating to the children’s educational performance, discipline, and attendance.
  • The researchers found “significant positive immediate effects” at the end of the preschool program which “disappeared by the end of kindergarten and turned negative by the end of third grade.”
  • By the sixth grade, the preschool program participants exhibited:
    • a greater need for special education.
    • a higher rate of being held back a grade.
    • more discipline issues.
    • less frequent attendance.[461] [462]

High/Scope Perry Program

* From 1962 to 1967, a Ph.D. public school administrator named David Weikart led a study of 123 preschool-aged children in a town near Detroit named Ypsilanti, Michigan. This famous study is known as the “High/Scope Perry Preschool” study, because HighScope is the name of the research firm that Weikart later founded, and the study was conducted on children who lived near the Perry Elementary School in Ypsilanti.[463] [464] [465] [466]

* The study’s design was as follows:

  • To be included in the study, children had to be 3–4 years old, African American, impoverished, and have an IQ ranging from 70 to 85 (as compared to the national average of 111 at the time).[467] [468]
  • The children were randomly assigned to groups that were either enrolled in the preschool program or not enrolled.[469] [470]
  • The preschool curriculum was “centered around play that is based on problem-solving and guided by open-ended questions” like “What happened? How did you make that? Can you show me? Can you help another child?”[471]
  • Most of the children who attended the program did so for two years, but some attended for only one year.[472] [473]
  • The children in the program attended preschool for 2.5 hours per weekday from mid-October through May. A teacher also visited each student once per week at his or her home for 1.5 hours. Per child, this is a total of 14 hours per week, 462 hours per year, or 924 total hours for those who attended two full years.[474]
  • The child/teacher ratio ranged from 5:1 to 6:1.[475]
  • The preschool program cost about $27,000 per student in inflation-adjusted 2023 dollars.[476] [477] Adjusted for the cost growth of public schooling since 1965, the program cost about $71,429 per student.[478]
  • When the study participants were ages 4–10, 12, 14, 17–19, 27, and 40, researchers measured “numerous factors” relating to their careers, finances, criminal history, education, intellect, and personality.[479] [480] [481]
  • The sample groups that were evaluated consisted of roughly 25 males and 25 females who were in the program and 25 males and 25 females who were not.[482] [483]
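
The per-child time figures in the design list above can be reproduced with simple arithmetic (a quick check; the 462-hour annual total implies a 33-week program year, which fits a mid-October-through-May schedule):

```python
# Perry program: 2.5 classroom hours per weekday, 5 weekdays per week
classroom_per_week = 2.5 * 5           # 12.5 hours
home_visit_per_week = 1.5              # one 1.5-hour weekly home visit
hours_per_week = classroom_per_week + home_visit_per_week
assert hours_per_week == 14.0

# 462 hours per year implies a 33-week program year
weeks_per_year = 462 / hours_per_week
assert weeks_per_year == 33.0

# Two full years of attendance
total_two_years = hours_per_week * weeks_per_year * 2
assert total_two_years == 924.0
```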

* The authors of a 2008 paper in the Journal of the American Statistical Association examined the outcomes of the four Perry sample groups and found the following statistically significant outcomes at different ages:

  • At age 5, the average IQs of males and females in the program were respectively 11 and 13 points higher than those not in the program.
  • At age 18, females in the program had an 84% graduation rate, as opposed to 35% for those not in the program.
  • At age 19, 5% of the females in the program had been arrested, as opposed to 42% of those not in the program.
  • At age 19, 40% of the females in the program were unemployed, as opposed to 71% of those not in the program.
  • At age 27, females in the program had been arrested an average of 0.32 times, as opposed to 2.3 times for those not in the program.
  • At age 27, 40% of the females in the program were married, as opposed to 8% of those not in the program.[484]

* The authors of the study also found:

  • “In contrast to females, males appear to not derive lasting benefits” from the Perry program.
  • Studies of two other preschool programs with children from similar backgrounds have replicated the early IQ and female graduation rate outcomes of the Perry program.
  • Previous studies that found other benefits from the Perry program have “serious statistical” problems, because the “samples are very small,” and the researchers failed to account for a common issue with studies that measure numerous outcomes: seemingly significant results “emerge simply by chance, even if there are no” actual effects.[485] [486]
  • Studies of another preschool program with children from similar backgrounds who spent 10 times as many hours in preschool have not replicated the large reductions in criminality that statistically flawed studies of the Perry program have found.[487] [488] [489]

* Using “novel statistical approaches” to account for “small sample sizes” and a “corrupted randomization” process in the original study, researchers at the University of Chicago found several other statistically significant outcomes between the four Perry sample groups at different ages. For example:

  • At age 27, 80% of the females in the program were employed, as opposed to 55% of those not in the program.
  • At age 40, females in the program had been arrested an average of 2.2 times, as opposed to 4.8 times for those not in the program.
  • At age 19, 70% of the males in the program were employed, as opposed to 50% of those not in the program.
  • At age 27, males in the program earned an average of $2,310 per month, as opposed to $1,430 for those not in the program.
  • At age 40, males in the program had been arrested an average of 8.2 times, as opposed to 12.4 times for those not in the program.[490] [491]

* Given the sample sizes of the four Perry groups (roughly 25 each), the approximate margin of error with 95% confidence for any outcome is ± 20 percentage points.[492] [493] Per an academic textbook on statistical analysis by University of Pennsylvania professor Paul D. Allison:

There’s very little information in a small sample, so estimates of correlations are very unreliable. … Almost anyone would consider a sample less than 60 to be small, and virtually everyone would agree that a sample of 1,000 or more is large.[494]
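
The ±20-point figure follows from the standard formula for the margin of error of a proportion at 95% confidence, evaluated at the worst case p = 0.5 (a sketch of the arithmetic, not the original researchers’ computation):

```python
import math

n = 25    # approximate size of each Perry sample group
z = 1.96  # critical value for 95% confidence
p = 0.5   # worst-case proportion maximizes the margin of error

margin = z * math.sqrt(p * (1 - p) / n)
print(round(margin * 100, 1))  # ≈ 19.6 percentage points, i.e. roughly ±20
```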

* Policymakers and activists have pointed to the Perry program as a reason to enact universal government preschool.[495] [496] [497] [498] [499] [500] Per an academic book on applied statistics by Harvard Ph.D. and social psychologist Rebecca M. Warner:

  • “Researchers in the behavioral and social sciences almost always want to make inferences beyond their samples,” but this is “always risky.”
  • It is “questionable to generalize” the results of a study to populations who are “drastically different” from the subjects of a study.[501] [502]

* The subjects of the Perry study (black, impoverished, IQ of 70–85) represented 2% of the U.S. population and 16% of the African American population at the time the study was conducted.[503] [504]


Abecedarian Project

* From 1972 to 1977, researchers at the University of North Carolina led a study of 111 preschool-aged children in the area of Chapel Hill, NC. This study is known as the “Abecedarian Project,” because that was the name of the main curriculum used in the program.[505] [506] [507]

* The study’s design was as follows:

  • Children included in the study “were believed to be at risk of retarded intellectual and social development.” Most were African Americans whose mothers had about 10 years of education and an IQ of 85. Roughly 75% of the children were from single-parent households, and 55% of the households were receiving cash welfare.[508]
  • The children were randomly assigned to groups that were either enrolled in the preschool program or not enrolled.[509] [510]
  • The preschool curriculum was focused on “developing cognitive, language, and social skills.”[511] [512]
  • The children in the program attended from shortly after birth (at an average age of 4.4 months) until they began kindergarten.[513]
  • The children in the program attended preschool for 8–10 hours per weekday and 50 weeks per year. Per child, this is 40–50 hours per week, 2,000–2,500 hours per year, and a total of 8,000–10,000 hours for those who attended for four years. This is roughly 10 times more hours than the Perry program.[514] [515]
  • The child/teacher ratio ranged from 3:1 to 6:1.[516]
  • Based on the child/teacher ratio, the total classroom time, and the cost growth of public schooling since the 1970s, the Abecedarian program would cost about $301,000 per student to implement today.[517]
  • When the study participants were ages 2–8, 12, 15, 18, 19, and 21, researchers measured numerous factors relating to their careers, criminal history, education, intellect, and personality.[518]
  • The sample groups who were evaluated consisted of roughly 25 males and 25 females who were in the program and 25 males and 25 females who were not.[519] [520]
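
The per-child totals in the design list above, and the roughly 10-to-1 ratio to the Perry program’s 924 hours, can be checked directly from the stated figures:

```python
# Abecedarian: 8–10 hours per weekday, 5 weekdays, 50 weeks per year
low_per_year = 8 * 5 * 50       # 2,000 hours
high_per_year = 10 * 5 * 50     # 2,500 hours
assert (low_per_year, high_per_year) == (2000, 2500)

# Four years of attendance
assert (low_per_year * 4, high_per_year * 4) == (8000, 10000)

# Perry's two-year total was 924 hours, so the ratio is roughly 10x
print(round(8000 / 924, 1), round(10000 / 924, 1))  # ≈ 8.7 to 10.8
```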

* The authors of a 2008 paper in the Journal of the American Statistical Association examined the outcomes of the four Abecedarian sample groups and found the following statistically significant outcomes at different ages:

  • At age 12, the average IQ of females in the program was 8 points higher than those not in the program.
  • At age 21, 40% of the females in the program were in college, as opposed to 11% of those not in the program.
  • At age 21, 4% of the females in the program were marijuana users, as opposed to 36% of those not in the program.[521]

* The authors of the study also found:

  • Previous studies that found other benefits from the Abecedarian program have “serious statistical” problems, because the “samples are very small,” and the researchers failed to account for a common issue with studies that measure numerous outcomes: seemingly significant results “emerge simply by chance, even if there are no” actual effects.[522] [523]
  • The Abecedarian subjects did not show significant reductions in criminality like previous studies of the Perry program had found, even though the Abecedarian children spent 10 times as many hours in preschool.[524] [525] [526] [527]

* Policymakers and activists have cited the Abecedarian Project as a reason to enact universal government preschool.[528] [529]

School Choice

Overview

* Laws in all 50 U.S. states generally compel people to:

  • pay taxes that fund government-run K–12 schools.[530] [531] [532]
  • send their children to specific public schools based on physical boundaries around their homes unless they:
    • pay additional money for private school.
    • spend additional money and/or time for homeschooling.[533] [534]

* School choice initiatives allow parents to select the schools their children attend, with part or all of the costs paid by their taxes or other government revenues. This can include:

  • public schools outside a child’s neighborhood or school district.
  • charter and magnet schools.[535] [536] [537] [538]
  • private schools.
  • tutors and homeschools.[539]

* In the U.S., government revenues regularly fund the education of students who attend private colleges and universities but rarely students who attend private K–12 schools.[540] [541] [542]

* In other economically advanced nations—like Austria, Canada, Spain, France, Hungary, Australia, New Zealand, and the Netherlands—government revenues commonly fund the education of students who attend private K–12 schools and sometimes those who are homeschooled.[543]

* In different nations, governments exercise varying amounts of centralized control over public and private schools. Public schools in some countries have more autonomy than private schools in others.[544]

* Per the academic serial work Handbook of Research on School Choice:

Much of the debate over school choice is based on the premise that there is a public monopoly over the provision of schooling and that schools are inefficient, in part, because of the absence of competition. If families could be treated as consumers and had the right to freely choose which kind of education they would prefer for their children, choice advocates assert that both government and non-government schools would improve….[545] [546] [547] [548] [549]

Costs

* In the 2019–20 school year, governments in the U.S. spent an average of $17,013 for every student enrolled in K–12 public schools.[550] [551] This excludes state administration spending, unfunded pension liabilities, and non-pension post-employment benefits.[552]

* In the 2019–20 school year, the average spending per student enrolled in private K–12 schools was about $9,709.[553] [554] [555] [556] [557] [558]

* Per the academic textbook Antitrust Law:

Monopoly pricing confronts the consumer with false alternatives: the product that he chooses because it seems cheaper actually requires more of society’s scarce resources to produce. Under monopoly, consumer demands are satisfied at a higher cost than necessary.[559] [560] [561] [562]

* Per the U.S. Supreme Court’s unanimous decision in Abood v. Detroit Board of Education:

A public employer, unlike his private counterpart, is not guided by the profit motive and constrained by the normal operation of the market.
Although a public employer, like a private one, will wish to keep costs down, he lacks an important discipline against agreeing to increases in labor costs that in a market system would require price increases.[563]

* Governments are subject to certain types of competition, because people and businesses sometimes migrate to locations where governments provide better value for their tax dollars, and because voters sometimes remove politicians for reasons such as increasing taxes and government spending.[564] [565]


Effects on Students

NOTE: In order to curb the methodological trickery that besets public policy debates, Just Facts has developed Standards of Credibility that call for the presentation of “data in its rawest comprehensible form.” However, the results of all experimental studies on the academic outcomes of students who experience school choice are more processed than Just Facts would prefer. Thus, instead of ignoring them or attempting to analyze all of the raw data, Just Facts has briefly summarized all of these studies and documented their results in the footnotes below.

* At least 23 experimental (or quasi-experimental) studies have been conducted on the academic outcomes of students who experience school choice.[566] [567] Among them:

* In a 2014 interview, Bill O’Reilly asked Barack Obama, “Why do you oppose school vouchers when it would give poor people a chance to go to better schools?” Obama replied:

Actually—every study that’s been done on school vouchers, Bill, says that it has very limited impact if any.
I’ve taken a look at it. As a general proposition, vouchers has not significantly improved the performance of kids that are in these poorest communities.[587]

* A 2010 experimental study of a school voucher initiative in the District of Columbia published by the Obama administration’s Department of Education found the following statistically significant results:

  • Students who applied for a voucher and did not win a lottery to receive one had a graduation rate of 70%.
  • Students who applied for a voucher and won a lottery to receive one had a graduation rate of 82%.
  • Students who applied for a voucher, won a lottery to receive one, and then used it had a graduation rate of 91%.[588] [589]

* Per a 2004 report by the Civil Rights Project at Harvard University, the Urban Institute, Advocates for Children of New York, and the Civil Society Institute:

In an increasingly competitive global economy, the consequences of dropping out of high school are devastating to individuals, communities and our national economy. At an absolute minimum, adults need a high school diploma if they are to have any reasonable opportunities to earn a living wage. A community where many parents are dropouts is unlikely to have stable families or social structures.[590] [591] [592]

* The 2012 Democratic Party Platform states:

Too many students, particularly students of color and disadvantaged students, drop out of our schools, and Democrats know we must address the dropout crisis with the urgency it deserves.[593]

* In 2013, the Journal of Policy Analysis and Management published an experimental study of the same District of Columbia voucher initiative by the same lead author. The study found the following statistically significant results:

  • “The impact of using a [voucher] scholarship was an increase of 21 percentage points in the likelihood of graduating. The positive impact of the program on this important student outcome was highly statistically significant.”
  • “Our analysis indicated a marginally statistically significant positive overall impact of the program on reading achievement after at least four years.”
  • “We did find evidence to suggest that scholarship use boosted student reading scores by the equivalent of about one month of additional learning per year.”[594]

* In 2011, the Quarterly Journal of Economics published an experimental study of a public school choice initiative in the 20th largest school district in the nation (Charlotte-Mecklenburg, North Carolina). The study compared the adult crime outcomes of male students who won and did not win a lottery for their parents’ first choice of school. The author found the following statistically significant results:

  • “Across various schools and for both middle and high school students, I find consistent evidence that winning the lottery reduces adult crime.”
  • “The effect is concentrated among African American males and youth who are at highest risk for criminal involvement.”
  • “Across several different outcome measures and scalings of crime by severity, high-risk youth who win the lottery commit about 50% less crime.”
  • “They are also more likely to remain enrolled and ‘on track’ in school, and they show modest improvements on school-based behavioral outcomes such as absences and suspensions.”[595] [596]

* Per a 2006 book about school choice written by Harvard professors William G. Howell and Paul E. Peterson:

No publicly funded voucher program offers all students within a political jurisdiction the opportunity to attend the private school of their choice. All are limited in size and scope, providing vouchers only to students who come from low-income families, who attend “failing” public schools, or who lack a public school in their community.
Most publicly funded voucher programs today are so small that they do little to enrich the existing educational market.
Most privately funded voucher programs operating today promise financial support for only three to four years.
In the short term, vouchers may yield some educational benefits to the low-income families that use them. But sweeping, systemic change will not materialize as long as small numbers of vouchers, worth small amounts of money, are offered to families for short periods of time. The claims of vouchers’ strongest advocates as well as those of the most ardent opponents, both of whom forecast all kinds of transformations, will be put to the test only if and when the politics of voucher programs stabilizes, support grows, and increasing numbers of educational entrepreneurs open new private schools.[597]

Effects on Government Schools

* The primary measure of school resources is spending per student.[598] [599] [600]

* School choice initiatives that allow students to attend private schools typically increase the funding per student in public schools, because public schools do not have to educate students who leave and because private schools typically spend less per student than public schools.[601] [602]

* Certain school costs are fixed in the short term (like buildings), and thus, the cost savings of educating fewer students occurs in steps instead of linearly. This means that private school choice programs can temporarily decrease the funding per student in public schools.[603]
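
A stylized example of the funding arithmetic described above (the dollar amounts and the assumption that a district loses only the voucher amount per departing student are hypothetical, in the spirit of the averages cited earlier; real funding formulas vary by state):

```python
# Hypothetical district: 1,000 students, $17,000 in revenue per student
students = 1000
revenue = students * 17_000

# Hypothetical voucher worth less than per-student public spending
voucher = 10_000

# 50 students leave with vouchers; revenue falls only by the voucher cost
leavers = 50
remaining_revenue = revenue - leavers * voucher
per_student_after = remaining_revenue / (students - leavers)
print(round(per_student_after))  # 17368: funding per remaining student rises
```

Fixed costs (like buildings) mean the district’s savings arrive in steps, so the short-term picture can differ from this per-student average.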

* In 2022, the journal Education Next published a study of a Florida school choice initiative that awards scholarships for low-income students to use towards private school tuition and transportation. It measured how “increased competition” from the program’s expansion since 2002 affected educational and behavioral outcomes of students who remained in public schools. The study found that increased program enrollment and competition:

  • “are associated with positive behavioral outcomes among non-scholarship students,” including fewer suspensions and lower absenteeism.
  • produced larger improvements in reading and math test scores than schools with less market competition, especially among poorer students.
  • benefited 90% of students and created no educational losses for the other 10%.[604]

* In 2013, the Journal of School Choice: International Research and Reform published a systematic review of 21 “high-quality” studies about the academic outcomes of U.S. students who remain in public schools after other students leave through choice programs. This review was designed to measure the effects of competition on public schools whose enrollments are threatened by private school choice programs. The author found:

  • “All but one of these 21 studies found neutral/positive or positive results” on public school students.
  • None of the studies found negative results on public school students.
  • The quasi-experimental studies, which are studies that are best able to determine causal effects, “unanimously find positive impacts on student academic achievement.”
  • “The only study to find no effects across all subjects … was restricted to a relatively small number of participants in the year this study was conducted. Furthermore, a ‘hold-harmless’ provision ensured that public schools were insulated from the financial loss from any students that transferred into private schools with a voucher. The absence of a positive competition effect is thus unsurprising, given these design features.”[605]

* In 2004, the journal Education Next published an experimental study of a Florida school choice initiative that offered private and public school vouchers to students enrolled in chronically failing public schools. The study compared the academic gains of public school students whose schools were eligible for vouchers and public school students whose schools were not eligible for vouchers. The study found the following statistically significant results:

  • On the Florida Comprehensive Assessment Test, “gains in test scores were 15 points higher among those schools whose students were eligible for vouchers than the gains among the rest of Florida’s public schools. Schools whose students were on the verge of becoming eligible also made greater gains.”
  • “The same pattern—of greater gains among schools facing competition or the threat thereof—was witnessed on the national Stanford-9 exam, confirming that the gains reflect genuine improvements in learning rather than teaching to the test or cheating.”
  • After one year, “the gains among [chronically failing] schools whose students were eligible for vouchers were enough to erase almost one-fifth of the [achievement] gap between their average score in the 2001–02 school year and the average score of all other Florida public schools.”[606]

Politics

* According to donations reported to the Federal Election Commission, the following education groups were among the top 100 organizations that gave the most money to federal candidates, parties, political action committees, and related organizations during the 1990–2022 election cycles:

Group | Rank in the Top 100 | Total Contributions | Portion to Democrats & Liberal Groups
American Federation of Teachers (AFT) | 16 | $127,218,032 | 100%
National Education Association (NEA) | 20 | $108,586,479 | 96%

[607] [608] [609]

* The NEA and AFT are labor unions.[610] For facts about the accuracy of union donations reported to the Federal Election Commission, visit Just Facts’ research on labor unions.

* In 2009, the president of the NEA sent an open letter to Democrats in the U.S. House and Senate stating that “opposition to [private school] vouchers is a top priority for NEA.”[611]

* The 2020 Democratic Party Platform opposes private school vouchers and supports a ban on federal funding of for-profit charter schools.[612]

* The Republican Party did not adopt a platform in 2020.[613] The 2016 Republican Party Platform supports “options for learning, including home-schooling, career and technical education, private or parochial schools, magnet schools, charter schools, online learning, and early-college high schools,” as well as “education savings accounts (ESAs), vouchers, and tuition tax credits.”[614]

* The President of the United States appoints justices to the U.S. Supreme Court. These appointments must be approved by a majority of the Senate.[615]

* Once seated, federal judges serve for life unless they voluntarily resign or are removed through impeachment, which requires a majority vote of the House of Representatives and two-thirds of the Senate.[616]

* Senate rules previously allowed for a “filibuster,” in which a vote to approve a judge or Supreme Court justice could be blocked unless a super-majority of the senators (typically 60 out of 100) agreed to let it take place.[617] [618] [619] These rules were repealed:

  • in 2013 when the Democratic majority voted to eliminate filibusters of all presidential nominees except Supreme Court justices.[620]
  • in 2017 when the Republican majority voted to eliminate filibusters of Supreme Court justices.[621]

* In 2002, the U.S. Supreme Court ruled (5 to 4) that a school choice initiative in Cleveland was constitutional (details below). Five of the seven justices appointed by Republicans ruled that it was constitutional, and both of the justices appointed by Democrats ruled that it was not.[623]


Positions & Actions

* A nationally representative poll of U.S. adults commissioned in 2015 by Education Next and the Kennedy School of Government at Harvard University found that the following portions of Americans:

  • are opposed to giving “all families with children in public schools a wider choice, by allowing them to enroll their children in private schools instead, with government helping to pay the tuition”:
    • 57% of teachers
    • 43% of whites
    • 36% of the general public
    • 30% of parents
    • 18% of African Americans
    • 13% of Hispanics
  • have ever enrolled their own children in private K–12 schools:
    • 22% of teachers
    • 18% of whites
    • 14% of the general public
    • 14% of parents
    • 14% of African Americans
    • 8% of Hispanics[624] [625]

* An analysis of U.S. Census data from the year 2000 by the Thomas B. Fordham Institute (a proponent of school choice) found that the following portions of parents were sending at least one of their own children to a private K–12 school:

  • 12.2% of all households with children
  • 17.5% of urban households with children
  • 21.5% of urban public school teacher households with children[626] [627]

* The following opponents of private school choice personally attended and/or sent their own children to private K–12 schools:

* The American Civil Liberties Union (ACLU) opposes taxpayer-funded private school choice programs. One of the ACLU’s arguments for this stance is that:

School voucher schemes would force all taxpayers to support religious beliefs and practices with which they may strongly disagree.[661]

* The ACLU supports taxpayer-funded abortions. With regard to whether all taxpayers should be forced to support practices with which they may strongly disagree, the ACLU asks the following rhetorical question:

What about those who are morally or religiously opposed to abortion?

And answers:

Our tax dollars fund many programs that individual people oppose.[662]

Affluence & Connections

* Per the academic serial work Handbook of Research on School Choice:

It may be misleading … to distinguish traditional public schools as “unchosen.” Some parents choose to live near excellent public schools and thereby choose their children’s schools by residential location.[663] [664]

* Per the academic reference book 21st Century Geography, “economically depressed populations with limited access to resources … have restricted choices on where they can live….”[665]

* In 2013, homes in top-ranked school districts cost an average of $50 more per square foot than homes in average-ranked school districts.[666]

* In 2009, Barack Obama’s Secretary of Education, Arne Duncan, was asked, “Where does your daughter go to school, and how important was the school district in your decision about where to live?” Duncan replied:

She goes to Arlington public schools. That was why we chose where we live, it was the determining factor. That was the most important thing to me. … I didn’t want to try to save the country’s children and our educational system and jeopardize my own children’s education.[667]

* In 2009, families living in Arlington, Virginia, reported an inflation-adjusted median cash income of $193,467, the highest among all counties in the United States.[668] [669]

* When Arne Duncan was the chief executive of the Chicago public school system, his office contacted school principals to help the children of politically connected parents get into better public schools. Per a 2010 Chicago Tribune article:

Whispers have long swirled that some children get spots in the city’s premier schools based on whom their parents know. But a list maintained over several years in Duncan’s office and obtained by the Tribune lends further evidence to those charges.
The log is a compilation of politicians and influential business people who interceded on behalf of children during Duncan’s tenure.
After getting a request … [Duncan’s staffers] would look up the child’s academic record. If the student met their standard, they would call the principal of the desired school.
[A Duncan staffer] said the calls from his office were not directives to the principals—no one was ever told they had to accept a student. Often, students did not get any of their top choices but were placed in larger, less competitive, but still desirable schools….
The initials “AD” are listed 10 times as the sole person requesting help for a student, and as a co-requester about 40 times. [A Duncan staffer] said “AD” stood for Arne Duncan, though Duncan’s involvement is unclear.[670]

Court Rulings

Zelman v. Simmons-Harris

* In the 2002 case of Zelman v. Simmons-Harris, the U.S. Supreme Court ruled (5–4) that a school choice initiative in Cleveland was constitutional. This program provided tuition aid for students:

to attend participating public or private schools of their parent’s choosing and tutorial aid for students who choose to remain enrolled in public school. Both religious and nonreligious schools in the district may participate, as may public schools in adjacent school districts. Tuition aid is distributed to parents according to financial need, and where the aid is spent depends solely upon where parents choose to enroll their children.[671]

* The Zelman case hinged upon:

  • the “Establishment of Religion” clause in the First Amendment to the Constitution, which prohibits Congress from making any law “respecting an establishment of religion, or prohibiting the free exercise thereof.”
  • the Fourteenth Amendment to the Constitution, which, among other things, made the First Amendment applicable to state and local governments.[672] [673] [674]

* Per the majority ruling in Zelman:

The Ohio program is entirely neutral with respect to religion. It provides benefits directly to a wide spectrum of individuals, defined only by financial need and residence in a particular school district. It permits such individuals to exercise genuine choice among options public and private, secular and religious. The program is therefore a program of true private choice. In keeping with an unbroken line of decisions rejecting challenges to similar programs, we hold that the program does not offend the Establishment Clause.[675]

* Per a dissent by Justice David Souter:

In the city of Cleveland the overwhelming proportion of large appropriations for voucher money must be spent on religious schools if it is to be spent at all, and will be spent in amounts that cover almost all of tuition. The money will thus pay for eligible students’ instruction not only in secular subjects but in religion as well, in schools that can fairly be characterized as founded to teach religious doctrine and to imbue teaching in all subjects with a religious dimension.[676]

* Per a concurrence by Justice Sandra Day O’Connor, the Cleveland school choice program:

pales in comparison to the amount of funds that federal, state, and local governments already provide religious institutions. … Although data for all states is not available, data from Minnesota, for example, suggest that a substantial share of Pell Grant and other federal funds for college tuition reach religious schools.[677]

* Per a dissent by Justice John Paul Stevens:

… I am convinced that the Court’s decision is profoundly misguided. Admittedly, in reaching that conclusion I have been influenced by my understanding of the impact of religious strife on the decisions of our forbearers to migrate to this continent, and on the decisions of neighbors in the Balkans, Northern Ireland, and the Middle East to mistrust one another. Whenever we remove a brick from the wall that was designed to separate religion and government, we increase the risk of religious strife and weaken the foundation of our democracy.[678]

* Per a concurrence by Justice Clarence Thomas, the Cleveland program:

does not force any individual to submit to religious indoctrination or education. It simply gives [poor] parents a greater choice as to where and in what manner to educate their children. This is a choice that those with greater means have routinely exercised.[679]

* State supreme courts have ruled differently regarding whether various school choice programs are prohibited by their respective constitutions.[680] For example, the states of Florida and Indiana both enacted school choice programs that allowed certain children to attend private schools, but:

  • in 2006, the Florida Supreme Court ruled (5–2) that the program violated the state’s constitution, which calls for a “uniform, efficient, safe, secure, and high quality system of free public schools….”[681]
  • in 2013, the Indiana Supreme Court ruled (5–0) that the program did not violate the state’s constitution, which calls for a “general and uniform system of Common Schools.”[682]

Espinoza v. Montana Department of Revenue

* In the 2020 case of Espinoza v. Montana Department of Revenue, the U.S. Supreme Court ruled (5–4) that a publicly subsidized scholarship program could not exclude religious schools.[683]

* The Espinoza case hinged upon:

  • the “Establishment” and “Free Exercise” clauses in the First Amendment to the Constitution, which prohibit Congress from making any law “respecting an establishment of religion, or prohibiting the free exercise thereof.”
  • the proposed use of the scholarship for education at a religious school.[684] [685]

* Per the majority ruling in Espinoza:

Montana’s no-aid provision bars religious schools from public benefits solely because of the religious character of the schools. The provision also bars parents who wish to send their children to a religious school from those same benefits, again solely because of the religious character of the school. This is apparent from the plain text.[686]

* Per a dissent by Justice Sonia Sotomayor:

Contra the Court’s current approach, our free exercise [of religion] precedents had long granted the government “some room to recognize the unique status of religious entities and to single them out on that basis for exclusion from otherwise generally applicable laws.”[687]

* Per a concurrence by Justice Clarence Thomas:

[T]he modern view, which presumes that States must remain both completely separate from and virtually silent on matters of religion to comply with the Establishment Clause, is fundamentally incorrect. Properly understood, the Establishment Clause does not prohibit States from favoring religion. They can legislate as they wish, subject only to the limitations in the State and Federal Constitutions.[688]

* Per a dissent by Justice Stephen Breyer:

There is no dispute that religious schools seek generally to inspire religious faith and values in their students. How else could petitioners claim that barring them from using state aid to attend these schools violates their free exercise rights? Thus, the question in this case … boils down to what the schools would do with state support. And the upshot is that here … we confront a State’s decision not to fund the inculcation of religious truths.[689]

* Per a concurrence by Justice Samuel Alito:

I join the opinion of the Court in full. The basis of the decision below was a Montana constitutional provision that, according to the Montana Supreme Court, forbids parents from participating in a publicly funded scholarship program simply because they send their children to religious schools. Regardless of the motivation for this provision or its predecessor, its application here violates the Free Exercise Clause.[690]

* As of 2023, the Friedman Foundation for Educational Choice has identified 81 school choice programs in 33 states, the District of Columbia, and Puerto Rico.[691]

Common Core

Overview

* Per the official Common Core website:

The Common Core is a set of high-quality [K–12] academic standards in mathematics and English language arts/literacy (ELA). These learning goals outline what a student should know and be able to do at the end of each grade.[692]

* The Common Core standards were developed and are maintained by the Common Core State Standards Initiative (CCSSI), which is a joint project of the Council of Chief State School Officers and the National Governors Association’s Center for Best Practices.[693] [694] [695]

* The Council of Chief State School Officers is a nonprofit organization controlled by the chief public education officials of each state, the District of Columbia, the U.S. military, and each U.S. territory.[696]

* The National Governors Association is an organization funded by the states and controlled by the governors of 55 U.S. states, territories, and commonwealths.[697] [698] The Center for Best Practices is a nonprofit organization that is “an integral part of the National Governors Association” but is “funded through federal grants and contracts, fee-for-service programs, private and corporate foundation contributions, and NGA’s Corporate Fellows program.”[699] [700]

* The Bill and Melinda Gates Foundation provided most of the money to create Common Core. This included money to develop the standards and build support for Common Core through donations to politicians, unions, civic leaders, and business organizations. Bill Gates, the wealthiest person in the world at the time, personally authorized this funding.[701] [702]

* In July 2009, CCSSI announced the names of 29 people that it had chosen to write the Common Core standards. The press release stated that:

  • these individuals were organized into two “working groups” consisting of 15 people for math and 14 people for ELA.
  • a “feedback group” would offer “expert input on draft documents,” but “final decisions” on the standards would be made by the working groups.
  • all “deliberations” of the working groups would “be confidential throughout the process.”[703]

* In September 2009, CCSSI announced that six governors and six chief state school officers had appointed a 29-person “validation committee” of education experts to review and certify the Common Core standards.[704]

* As a condition of being on the validation committee, each member had to agree to keep all “deliberations, discussions, and work” of the committee “strictly confidential” in perpetuity.[705] [706]

* In June 2010, the validation committee issued a report certifying the standards. The report listed 24 people who had signed the standards and:

  • stated that CCSSI had “convened a 25-member Validation Committee” (as opposed to the 29 members it actually convened[707]).
  • did not explicitly state that four of the committee members listed among the authors of the report refused to certify the standards.[708] [709]

* At least two of the committee members who declined to certify the standards have publicly criticized them and the process by which they were created.[710] [711]

* In November 2009, the Obama administration issued regulations governing how states could compete for $4.35 billion in federal education funds under its “Race to the Top” program. These regulations required states to demonstrate their “commitment to adopting a common set” of K–12 education standards. The regulations also stipulated that states would earn “high points” if they adopted the same standards as the “majority of the States in the country.”[712] [713] [714] [715]

* By September 2011, 44 states and the District of Columbia had adopted the Common Core standards.[716] [717]

* Among the 46 states that had adopted all or part of the Common Core standards by early 2014, six had done so by legislative action, and 40 by decisions made by state boards of education or chief education officials.[718] [719]

* As of March 2019:

  • 27 states that previously adopted the Common Core standards have replaced or rewritten them.
  • 18 states and the District of Columbia retain the full Common Core standards.
  • 1 state retains only the English language arts standards.
  • 4 states never adopted the standards.[720]

* New Hampshire, which retains the Common Core standards, passed a law in 2017 that prevents the state government “from requiring the implementation of the Common Core standards in any school or school district in this state.”[721]

* In 2020, the governor of Florida removed and replaced the Common Core standards.[722]

* CCSSI asserts that the Common Core standards:

  • “represent what American students need to know and do to be successful in college and careers.”[723]
  • are “for the benefit of all students.”[724]
  • “are research- and evidence-based.”[725]

* Two of the Common Core validation committee members who refused to validate the standards assert that they:

  • “barely prepare students for attending a community college, let alone a 4-year university.”
  • employ approaches to math and geometry that have yielded “bad outcomes” for students.
  • are not justified by “suitable research.”[726] [727] [728] [729]

* Organizations other than CCSSI are developing common standards for science, world languages, and arts.[730]


Centralization & Decentralization

* In the field of education, “centralization” refers to the transfer of decision-making authority from individuals, teachers, schools, and local governments to state or national governments. “Decentralization” is the opposite of centralization.[731] [732]

* Common Core is a form of educational centralization because it specifies “a single body of knowledge and skills that students … will be expected to possess.”[733]

* The alleged benefits of centralizing education include but are not limited to:

  • more equity between schools with regard to standards, curriculum, testing, graduation requirements, funding, and teacher qualifications.[734] [735]
  • reduced costs through economies of scale, so that certain tasks are not repeated needlessly.[736] [737]
  • increased effectiveness in nations where students have similar cultural, ethnic, and linguistic backgrounds.[738]
  • greater likelihood of equipping students with broader skills that transcend local and regional variations.[739]
  • the ability to rapidly spread educational improvements across schools.[740] [741]

* The alleged benefits of decentralizing education include but are not limited to:

  • more flexibility for educators to teach and motivate students based upon their personal aptitudes, backgrounds, interests, and goals.[742] [743] [744] [745] [746]
  • reduced bureaucracy leading to fewer costs, “bureaucratic stagnation, centralized inefficiencies, and corruption.”[747] [748] [749] [750]
  • increased opportunity for communities to be involved in education and greater ability for them to effect change.[751] [752]
  • greater likelihood of equipping students with skills that are needed in the areas where they live.[753] [754]
  • less proliferation of counterproductive or ineffectual policies favored by central authorities.[755]

* Some of the factors that make it difficult to determine the effects of centralization and decentralization include:

  • the complexity of measuring centralization.[756] [757] [758] [759] [760]
  • dynamics that may cause centralization to be helpful in some settings and harmful in others.[761] [762]
  • numerous confounding variables that affect students and school systems.[763] [764]
  • a dearth of experimental studies on this issue.[765] [766] [767]

* CCSSI asserts that a “root cause” of U.S. academic stagnation has been “an uneven patchwork of academic standards that vary from state to state and do not agree on what students should know and be able to do at each grade level.”[768]

* In October 2015, Just Facts asked CCSSI to provide “specific studies” that prove academic stagnation has been caused by differing state education standards.[769] CCSSI responded but did not provide such research.[770]

* From 1920 to 2020, the portion of K–12 public school funding provided by:

  • local governments decreased from 83% to 45%.
  • state governments increased from 16% to 47%.
  • the federal government increased from 0.3% to 8%.[771]

* As the federal and state governments have funded a growing share of K–12 school expenses, the U.S. education system has become increasingly centralized. This has transferred decision-making power from community schools to higher levels of government through:

  • a decrease in public school districts from about 117,000 in 1940 to 13,318 in 2021.[772] [773]
  • district control over school personnel and curricula.[774]
  • state-mandated “uniform standards across grade levels, schools, and districts.”[775]
  • state requirements on teacher certification, unionization, and binding arbitration.[776] [777] [778] [779] [780] [781] [782]
  • federal laws and regulations that require and incentivize states to adopt various standards, assessments, and policies.[783] [784] [785] [786]

* Per a 1980 academic book on the U.S. education system:

[T]he American assumption is that communities constitute the unit most capable of running the schools. While the state may mandate that districts’ boundaries be redrawn, the notion that a particular state might be capable of running all schools within its boundaries is unthinkable in the American context.[787]

* Per a 1997 academic book on education decentralization:

Site-based management [SBM] is a business derivative of decentralization and participatory decision-making. The intent of site-based management is to improve student performance by making those closest to the delivery of services—teachers and principals—more autonomous, resulting in their being more responsive to parents and students concerns.
While many schools in the United States claim to implement SBM, very little decision-making is truly decentralized. In most cases SBM is only a subset of the various types of decisions that are made at the district level. … The illusion of autonomy based on SBM is often constrictive because the district office retains the final authority or limits the range of decision-making….[788]

Math

* The complete Common Core math standards are available here.[789]

* R. James Milgram, Emeritus Professor at Stanford University’s Department of Mathematics, was the only mathematician who served on Common Core’s validation committee.[790] [791] [792] He refused to certify the standards and has been critical of them.[793]

* Other mathematicians have supported and opposed the standards.[794] [795] [796]


* CCSSI asserts that the math standards “call for speed and accuracy in calculation.”[797]

* The math standards require first graders to “think of whole numbers between 10 and 100 in terms of tens and ones” and solve problems such as:

  • “8 + 6” with “strategies” like this: “8 + 2 + 4 = 10 + 4 = 14”
  • “13 – 4” by “decomposing” numbers like this: “13 – 3 – 1 = 10 – 1 = 9”
  • “6 + 7” by “creating equivalent but easier or known sums” like this: “6 + 6 + 1 = 12 + 1 = 13”[798]
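The first two strategies above both bridge through ten: decompose one addend so that part of it completes a ten, then add what remains. As a sketch (not from the source; the function name is illustrative, and the sum is assumed to exceed 10), the steps can be traced in Python:

```python
# Illustrative sketch of the "make ten" addition strategy: split the
# smaller addend so one piece brings the larger addend up to 10.
# Assumes both addends are one-digit numbers whose sum exceeds 10.

def make_ten_steps(a, b):
    """Return the intermediate steps for a sum that bridges ten,
    e.g. 8 + 6 -> 8 + 2 + 4 -> 10 + 4 -> 14."""
    small, large = sorted((a, b))
    to_ten = 10 - large            # amount needed to complete a ten
    remainder = small - to_ten     # what is left of the smaller addend
    return [f"{large} + {to_ten} + {remainder}",
            f"10 + {remainder}",
            str(large + small)]

print(make_ten_steps(8, 6))
```

The third strategy in the list (“creating equivalent but easier or known sums,” such as doubles plus one) is a different decomposition and is not modeled here.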

* The math techniques above are illustrated in the following videos produced by a local NBC television station. The station made these videos so that parents can help students who “find the math lessons confusing.” The lessons are taught by a local public school math teacher:[799]

Addition Using Base 10 for 1st Grade & Older

Subtraction Using Place Value Chart (2nd Grade)

* CCSSI asserts that the Common Core standards are “research- and evidence-based.”[800]

* In October 2015, Just Facts asked CCSSI to provide “specific studies” that prove the math strategies above are effective.[801] CCSSI responded but did not provide such research.[802] [803] [804]


* The Common Core math standards compel students to explain “why a particular mathematical statement is true or where a mathematical rule comes from.”[805]

* The following sample question and answer are from a teaching guide for 3rd grade Common Core math from the North Carolina Department of Public Instruction:

[Question]: “What do you notice about the numbers highlighted in pink in the multiplication table? Explain a pattern using properties of operations.”
[Image: Common Core math verbalization problem (multiplication table with highlighted products)]
[Answer]: “When (commutative property) one changes the order of the factors they will still gets the same product, example 6 x 5 = 30 and 5 x 6 = 30.”[806]

* Per W. Stephen Wilson, Ph.D. mathematician, professor of mathematics at Johns Hopkins University, and Common Core supporter:[807] [808]

There will always be people who believe that you do not understand mathematics if you cannot write a coherent essay about how you solved a problem, thus driving future STEM [science, technology, engineering and math] students away from mathematics at an early age. A fairness doctrine would require English language arts (ELA) students to write essays about the standard [math] algorithms, thus also driving students away from ELA at an early age. The ability to communicate is NOT essential to understanding mathematics.[809]

* Per CCSSI:

There is a world of difference between a student who can summon a mnemonic device [i.e., a reminder of a rule] to expand a product such as (a + b)(x + y) and a student who can explain where the mnemonic comes from. The student who can explain the rule understands the mathematics, and may have a better chance to succeed at a less familiar task such as expanding (a + b + c)(x + y). Mathematical understanding and procedural skill are equally important….[810]
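The rule behind the mnemonic CCSSI refers to is the distributive property: every term of the first factor multiplies every term of the second, which is why the same procedure also handles the three-term case. A minimal Python sketch (not from the source; the helper name is illustrative):

```python
# Illustrative sketch of where the expansion mnemonic comes from:
# the distributive property pairs each term of the left factor with
# each term of the right factor.

from itertools import product

def expand_product(left, right):
    """Expand (t1 + t2 + ...)(u1 + u2 + ...) into its list of terms."""
    return [f"{l}{r}" for l, r in product(left, right)]

print(expand_product(["a", "b"], ["x", "y"]))        # the (a + b)(x + y) case
print(expand_product(["a", "b", "c"], ["x", "y"]))   # the (a + b + c)(x + y) case
```

The two-term case yields the four familiar products; the three-term case falls out of the same rule with six products, which is the generalization CCSSI describes.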

* In October 2015, Just Facts asked CCSSI to provide “specific studies” proving that forcing students to verbalize “why a particular mathematical statement is true” improves their math education.[811] CCSSI responded but did not provide such research.[812] [813] [814]


* The Common Core math standards require students to solve math problems by using “concrete models,” “drawings,” and “objects.”[815] The following video shows an example of this:

Homework Helper: Division with a Remainder (4th Grade & Up)

* Per a 1989 meta-study of student learning styles published in Educational Leadership and republished in 2002 in the California Journal of Science Education:

  • “Learning style is a biologically and developmentally imposed set of personal characteristics that make the same teaching method effective for some and ineffective for others.”
  • Students have differing sensory learning preferences, such as sight, sound, and touch.
  • Students learn better and achieve higher test scores when they are taught with instructional resources that correspond to their sensory preferences.[816] [817]

* CCSSI asserts that the Common Core standards are “for the benefit of all students.”[818]

* In October 2015, Just Facts asked CCSSI to provide “specific studies” that prove drawing pictures and using objects improve students’ math abilities.[819] CCSSI responded but did not provide such research.[820] [821] [822]


English Language Arts

* The complete Common Core standards for “English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects” are available here.[823]

* Sandra Stotsky was the only expert on K–12 English language arts (ELA) standards who served on Common Core’s validation committee.[824] The validation committee report states that she is an:

Endowed Chair in Teacher Quality at the University of Arkansas’s Department of Education Reform and Chair of the Sadlier Mathematics Advisory Board
Stotsky has abundant experience in developing and reviewing ELA standards. As senior associate commissioner of the Massachusetts Department of Education, she helped revise pre-K–12 standards. She also served on the 2009 steering committee for NAEP reading and on the 2006 National Math Advisory Panel.[825]

* Stotsky refused to certify the standards and has been critical of them.[826] [827]

* Per CCSSI, the ELA standards:

  • “set grade-specific standards but do not define the intervention methods or materials necessary to support students who are well below or well above grade-level expectations.”[828]
  • “focus on what is most essential,” but “they do not describe all that can or should be taught. A great deal is left to the discretion of teachers and curriculum developers.”[829]
  • “intentionally do not include a required reading list. Instead, they include numerous sample texts to help teachers prepare for the school year and allow parents and students to know what to expect during the year.”[830]
  • require “much greater attention to a specific category of informational text—literary nonfiction—than has been traditional.”[831] [832]
  • require students to “evaluate the argument and specific claims in a text, assessing whether the reasoning is valid and the evidence is relevant and sufficient….”[833]
  • require students to “evaluate a speaker’s point of view, reasoning, and use of evidence and rhetoric, identifying any fallacious reasoning or exaggerated or distorted evidence.”[834]

* The ELA standards assert that “a particular standard was included in the document only when the best available evidence indicated that its mastery was essential for college and career readiness in a twenty-first-century, globally competitive society.”[835]


Impact on Curriculum & Teaching

* CCSSI states that the Common Core standards are “not a curriculum.”[836]

* Per a 2003 academic book about middle school education standards:

No issue currently impacts the middle level school more than curriculum reform based on state and national standards. … In fact, the aligning of curriculum and instruction to specific state content standards has become a universal teaching skill now taught in colleges of education and practiced in literally all school districts. Does this mean that the content of middle level curriculum is being controlled by the content of state standards, and, to some degree, the content of the state tests that are based on these standards? Certainly, without a doubt.[837]

* In 2009, Bill Gates, the primary financial backer of Common Core,[838] wrote that “identifying common standards is not enough. We’ll know we’ve succeeded when the curriculum and the tests are aligned to these standards.”[839]

* In 2010, the Common Core validation committee wrote that “alignment of curricula and assessments to the Common Core State Standards … will be essential to the staying power and lasting impact of the standards.”[840]

* In 2014, Bill Gates wrote:

These are standards, just like the ones schools have always had; they are not a curriculum. They are a blueprint of what students need to know, but they have nothing to say about how teachers teach that information. It’s still up to local educators to select the curriculum.[841]

* CCSSI asserts that the Common Core standards “do not dictate how teachers should teach.”[842]

* The Common Core ELA standards state that they “do not mandate such things as a particular writing process or the full range of metacognitive strategies that students may need to monitor and direct their thinking and learning.”[844] [845] The Common Core math standards do not contain a similar statement.[846]

* The Common Core math standards dictate the specific teaching processes and learning strategies shown in the examples above.

* Mathematician and Common Core supporter Hung-Hsi Wu has written that the Common Core math standards “say explicitly what needs to be taught” about the “process of reasoning” for solving equations.[847] In a commentary for American Educator, Wu detailed how Common Core requires the use of certain processes for adding fractions and mandates that these processes be taught over three years from grades 3 through 5. The first part of the 3rd grade teaching process is as follows:

Briefly, in grade 3, students learn to think of a fraction as a point on the number line that is “so many copies” of its corresponding unit fraction. For example, 5/6 is 5 copies of the unit fraction 1/6 (and 1/6 is 1 copy). When we represent a fraction as a point on the number line, we place a unit fraction such as 1/6 on the division point to the right of 0 when the unit segment from 0 to 1 is divided into 6 equal segments. It is natural to identify such a point with the segment between the point itself and 0. Thus, as shown below, 1/6 is identified with the red segment between 0 and 1/6, 5/6 is identified with the segment between 0 and 5/6, etc. Then, the statement that “5/6 is 5 copies of 1/6” acquires an obvious visual meaning: the segment from 0 to 5/6 is 5 copies of the segment from 0 to 1/6.[848]
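
Wu's "copies of a unit fraction" construction can be sketched in a few lines of Python (an illustration of the quoted passage, not part of the standards themselves):

```python
from fractions import Fraction

# The unit fraction 1/6: one part of the segment from 0 to 1
# divided into 6 equal pieces.
unit = Fraction(1, 6)

# "5/6 is 5 copies of the unit fraction 1/6": adding five copies
# of 1/6 lands on the same point of the number line as 5/6.
five_sixths = sum([unit] * 5)
assert five_sixths == Fraction(5, 6)

# The division points of the unit segment, as positions on the number line:
# 0, 1/6, 2/6, ..., 6/6.
points = [k * unit for k in range(7)]
```

The `fractions` module keeps the arithmetic exact, so "5 copies of 1/6" and "the point 5/6" are literally the same value rather than approximately equal floats.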

* According to a math framework adopted by the Los Angeles County Office of Education and sponsored by Bill Gates:[849] [850]

Universal access to the content standards requires that educators apply a strong equity lens as they plan their instruction.[851]

* This framework asserts that “White supremacy culture shows up in math classrooms when students are required to ‘show their work’” and instructs teachers to practice “how to answer mathematical problems without using words or numbers.”[852]

* For more facts about the impact of Common Core on teaching processes, see the forthcoming section on standardized tests.


Standardized Tests

* Tests (standardized and otherwise) can be used to:

  • “diagnose students’ strengths and weaknesses.”
  • “serve as the basis for teacher reflections on their instructional effectiveness.”
  • help “teachers to identify students who need additional instruction, special services, or more advanced work.”[853]
  • motivate students to learn and cognitively assist them in this process.[854] [855] [856]
  • help colleges and employers evaluate potential students and job candidates.[857] [858]
  • help parents, taxpayers, and policymakers evaluate the effectiveness of educators and education policies.[859]

* Per a 1980 academic book on the U.S. education system:

If no standardization exists, schools can postulate anything as satisfying graduation requirements. The development of standardized testing constitutes a response to this problem in the United States, but many graduates are led to believe that they have received a certain kind of education when, in reality their achievement is low.[860]

* When parents and governments don’t have access to valid information about student outcomes, school employees have leeway to minimize their workloads and favor their own interests over those of the students. Per a 2005 paper in the journal Education Economics, standardized exams can help remedy this problem “by supplying information about the performance of individual students relative to the national (or regional) student population.”[861] [862]

* Standardized tests can provide valid information about student outcomes if they accurately measure the desired effects of education. In education literature, this is called test validity. Per the Encyclopedia of Educational Psychology:

Validity is the extent to which a test measures what it was designed to measure. This means that tests are designed for specific purposes, and each test must have its own validity for the purpose for which it was designed. … That is, a test may consistently measure the wrong thing. Establishing test validity is thought to be a more complex process than establishing test reliability because establishing validity depends on the judgments to be made based on test results and how the results will be used. It is necessary to collect information as evidence that a test provides a true measure of such abstractions. To validate that tests provide true measures, certain information or evidence must be collected depending on the type of validity to be determined.[863]

* Per the Encyclopedia of Measurement and Statistics:

  • Standardized “test scores should never be used for purposes that are not validated.”
  • “The process of validation is a responsibility of the test developer and the sponsor of the testing program.”
  • “A technical report or test manual” should show “the argument and evidence supporting each intended test score interpretation or use.”[864]

* In 2010, the Obama administration awarded $330 million to two state-led consortia to develop standardized tests aligned with Common Core:[865]

  1. The Partnership for Assessment of Readiness for College and Careers (PARCC)[866]
  2. The SMARTER (Summative Multi-State Assessment Resources for Teachers and Educational Researchers) Balanced Assessment Consortium (SBAC)[867]

* In 2011, the Obama administration announced that it would exempt states from various requirements of federal education law if the states adhered to four conditions. The first of these was to adopt “college- and career-ready standards” and administer standardized tests aligned with these standards.[868] [869] [870] CCSSI refers to Common Core as “college- and career-readiness standards.”[871]

* Among the 46 states that adopted the Common Core standards, at least 26 became members of the PARCC consortium at some point, and at least 31 became members of the SBAC consortium at some point.[872] [873]

* In the 2014–15 school year, the first time the PARCC and SBAC tests were administered, 11 states and the District of Columbia used the PARCC exam, and 18 states used the SBAC exam.[874] [875] [876]

* Since the 2014–15 school year, several states that previously used the PARCC and SBAC exams have announced that they will not use them in the future.[877] [878] [879] [880] [881]

* SBAC maintains a list of states that are current members of the consortium.[882]

* PARCC used to maintain a list of states that were members of the consortium but no longer does so.[883] Per Education Week:

In 2015, its leaders decided to go in a new direction, allowing states to license content like specific test questions, rather than having a rigid membership model in which member states gave the whole test.[884] [885]

* Per a 2001 book on educational assessments published by the National Academies Press:

[P]olicy makers see large-scale assessments of student achievement as one of their most powerful levers for influencing what happens in local schools and classrooms. Increasingly, assessments are viewed as a way not only to measure performance, but also to change it, by encouraging teachers and students to modify their practices.[886]

* David Coleman was a lead writer for the Common Core ELA standards, a cofounder of an organization that “played a leading role in developing” the standards, and one of the key people who lobbied Bill Gates to fund Common Core.[887] [888] [889] In 2011, Coleman stated that the Common Core standards:

are worthy of nothing if the assessments built on them are not worthy of teaching to, period. … [T]he great rule that I think is a statement of reality, though not a pretty one, which is teachers will teach towards the test. There is no force strong enough on this earth to prevent that. … Tests exert an enormous effect on instructional practice, direct and indirect, and it’s hence our obligation to make tests that are worthy of that kind of attention. It is in my judgment the single most important work we have to do over the next two years to ensure that that is so, period.[890]

* In 2012, Coleman became president of the College Board, the organization that produces the SAT college entrance exam and Advanced Placement tests.[891] [892] [893] [894]

* In 2013, Coleman announced that the College Board was going to “redesign the SAT” to “prepare students for the rigors of college and career.”[895]

* In 2014, the College Board published a “conversation guide” for the redesigned SAT that posed the question, “Is the SAT aligned to the Common Core?” The guide answered:

The redesigned SAT measures the skills and knowledge that evidence shows are essential for college and career success. It is not aligned to any single set of standards.[896]

* In March 2016, the redesigned SAT replaced the former version.[897]

Homeschooling

* Homeschooling is the oldest form of education, and it was common practice until public schools became prevalent in the mid-1800s.[898] [899] [900] [901]

* In 2019, approximately 1.46 million U.S. children (2.8% of K–12 students) were homeschooled. These figures are not clear-cut, because some of these students also took classes and played sports at public and private schools and colleges.[902] [903] [904]

* Depending upon their level of education, parents homeschooled their children at the following rates in 2019:

  • High school diploma or less – 2.2%
  • Vocational/technical/associate’s or some college – 2.9%
  • Bachelor’s degree/some graduate school – 3.3%
  • Graduate/professional degree – 3.1%[905]

* Homeschooling is legal throughout the U.S., with state regulations that vary widely. In Washington State, parents must be certified as teachers in order to homeschool.[906] [907] [908] [909]

* Homeschooling is permitted in most nations.[910] Germany has generally prohibited homeschooling since 1938, when the Nazi government enacted a law that effectively banned it.[911] [912] [913] Other nations that ban or strictly limit homeschooling include Bulgaria, Greece, and the Netherlands.[914]

* A 2007 survey of parents who homeschool their children found that they did so for the following reasons:

  • Concern about the school environment, such as safety, drugs, or negative peer pressure – 88%
  • Desire to provide religious or moral instruction to their children – 83%
  • Dissatisfaction with academic instruction at other schools – 73%
  • Desire to take a nontraditional approach to education – 65%
  • Increased family time, financial considerations, flexibility to travel, or lack of proximity to an appropriate school – 32%
  • Having a child with special needs “other than a physical or mental health problem that the parent feels the school cannot or will not meet” – 21%
  • Having a child with a physical or mental health problem – 11%[915]

* In 2010, the journal Academic Leadership published a nationwide study of 11,739 homeschooled students during the 2007–08 school year. It found that parents spent a median of $400 to $599 per student on “textbooks, lesson materials, tutoring, enrichment services, testing, counseling, evaluation,” and other incidentals.[916] [917] Regarding these findings:

  • The study was based on a survey with a response rate of approximately 19%.[918] Thus, the results are not definitive.[919] [920] [921]
  • The families who participated in the survey were more likely than the general population to have bachelor’s degrees, be married, and not be racial minorities.[922] [923]
  • Adjusted for inflation into 2023 dollars, the median annual cost to educate a homeschooled student ranged from $576 to $862.[924]
  • These figures do not account for the cost of parental time investment or the value of being able to live in areas without regard for the quality of the local schools.[925]
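
The inflation adjustment in these figures amounts to scaling 2007–08 dollars by a price-level ratio; the source's numbers imply a multiplier of roughly 1.44 ($576/$400 ≈ $862/$599 ≈ 1.44). A minimal sketch, using that implied factor as an assumption rather than an official CPI value:

```python
def to_2023_dollars(amount, factor=1.44):
    """Scale a 2007-08 dollar amount into 2023 dollars.

    `factor` is the ratio of price levels between the two periods.
    The default of 1.44 is simply the multiplier implied by the source's
    own figures ($400 -> $576, $599 -> $862); it is an assumption here,
    not an official CPI ratio.
    """
    return amount * factor

low = to_2023_dollars(400)   # ~576
high = to_2023_dollars(599)  # ~862.56 (the source truncates to $862)
```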

* The same study in Academic Leadership examined the academic performance of 22,584 homeschooled students who took standardized tests administered by three major testing services. This was the broadest sample of homeschooled student test scores ever studied. The researcher found that the average performance of these students ranked in the top 20% of all U.S. students in each of the five academic disciplines examined:

  • Reading – 87th percentile
  • Language – 81st percentile
  • Math – 80th percentile
  • Science – 82nd percentile
  • Social Studies – 80th percentile[926]

* Regarding the findings above, the paper documents that “the above-average nature of these achievement test scores is also consistent” with nine other similar studies. Per the study’s author, Brian D. Ray (Ph.D. in science education):[927]

Comparisons between home-educated students and institutional school students nationwide should, however, be interpreted with thoughtfulness and care. … [This study] is not an experiment and readers should be careful about assigning causation to anything.
 
One could say … “This study simply shows that those parents choosing to make a commitment to home schooling are able to provide a very successful academic environment.” On the other hand, it may be that something about the typical nature and practice of home-based education causes higher academic achievement, on average, than does institutional state-run schooling….[928] [929]

* Per the Encyclopedia of Education Economics & Finance (2014):

More rigorous empirical work is needed regarding the “black box” of homeschooling before definitive conclusions are drawn.
 
At issue are several limitations for the study of homeschooler outcomes. First, there has been no empirical study thus far based on data obtained from a random sample of all homeschoolers. This means that the findings cannot be generalized from the study samples to the entire homeschooling population.[930]

Digital Learning

* Digital learning involves the use of computerized technologies to increase the effectiveness of education or reduce its costs.[931] [932]

* Forms of digital learning include (but are not limited to):

  • Online courses, which allow students to take courses that are not offered at their local public, private, or home schools. These courses also give students flexibility to pursue careers, independent study programs, athletics, and other endeavors.[933] [934] [935] [936]
  • Fully online schools, which “provide a student’s entire education online.” These schools offer accessibility to “hospitalized, homebound, pregnant, incarcerated, or other students in similar uncommon circumstances.”[937]
  • Adaptive learning software and platforms, which teach students through interactive courseware that analyzes each student’s learning style, academic needs, and intellectual abilities. The software then uses this information to deliver content designed to optimize each student’s learning potential. Per the SAGE Encyclopedia of Educational Technology:
Adaptive learning software and platforms, due to their ability to change the content and representations according to a student’s needs, resemble the situation when a personal instructor is available for each individual student.[938] [939]
  • Blended or hybrid learning, which combines traditional face-to-face teaching with digital technologies. Blended learning typically does not yield the cost savings of other digital learning approaches, because it does not reduce the need for school staff, school buildings, or student transportation.[940] [941] [942] [943]

* In the 2019–20 school year, prior to the Covid-19 pandemic,[944] 17 states had publicly funded online K–12 schools that allowed students to take supplemental courses. Students took more than 1 million courses through these schools.[945] [946]

* With regard to fully online K–12 schools—primarily operated without physical buildings—that attract students from across districts:

  • Prior to the Covid-19 pandemic in the 2018–19 school year, 32 states and the District of Columbia permitted such schools, which educated about 375,000 students.[947] [948]
  • Amid the Covid-19 pandemic, in the 2020–21 school year, 35 states and the District of Columbia permitted such schools, which educated about 656,000 students. This figure does not include students who temporarily studied remotely due to government shutdowns of public schools during the pandemic.[949] [950] [951]

* With regard to college students:

  • In the 2015–16 school year, 43% of undergraduate students took at least one online class, and 11% took their entire degree program online.[952]
  • In the 2011–12 school year, 36% of graduate students took at least one online class, and 20% took exclusively online classes.[953]

* In 2013, the journal Teachers College Record published an analysis of 45 experimental (and quasi-experimental) studies that measured 50 effects of online and blended learning versus traditional face-to-face classrooms. These studies included K–12 students, college students, and people receiving job-related training. The authors found that:

  • “Among the 50 individual contrasts between online and face-to-face instruction, 11 were significantly positive, favoring the online or blended learning condition. Three significant negative effects favored traditional face-to-face instruction.”
  • In total, the studies showed that students who learned online fared about the same as students in traditional classrooms, and students in blended learning environments performed better than students in traditional classrooms. According to common (yet subjective) statistical conventions, the overall positive effect of blended learning was “small” to “medium.”
  • “Studies using blended learning also tended to involve additional learning time, instructional resources, and course elements that encourage interactions among learners.” These variables and others may have “contributed to the particularly positive outcomes for blended learning.”
  • This analysis of studies does “not reflect the latest technology innovations” since 2009, because “the cycle time for study design, execution, analysis, and publication cannot keep up with the fast-changing world of Internet technology.”[954] [955]
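
The "small" to "medium" language refers to Cohen's conventional effect-size benchmarks (a standardized mean difference d of roughly 0.2, 0.5, and 0.8 for small, medium, and large effects). A minimal sketch with hypothetical test scores, not data from the studies themselves:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

def label(d):
    """Cohen's conventional (and admittedly subjective) benchmarks."""
    d = abs(d)
    if d < 0.2:
        return "negligible"
    if d < 0.5:
        return "small"
    if d < 0.8:
        return "medium"
    return "large"

# Hypothetical example: a blended-learning group averaging 78 vs. a
# face-to-face group averaging 72, both with sd 15 and n = 100.
d = cohens_d(78, 15, 100, 72, 15, 100)  # 6/15 = 0.4
```

By these conventions a d of 0.4 sits between the "small" and "medium" cutoffs, which is the sense in which an overall effect can be described as "small to medium."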

Footnotes

[1] Book: The SAGE Encyclopedia of Educational Technology. Edited by J. Michael Spector. Sage Publications, 2015. Article: “Adaptive Learning Software and Platforms.” By Dr. Kinshuk. Pages 7–10.

Page 9: “Various cognitive abilities of students are crucial for learning. Examples of these abilities include working memory capacity, inductive reasoning ability, information processing speed, associative learning skills, metacognitive skills, observation ability, analysis ability, and abstraction ability.”

[2] Paper: “The Importance of Noncognitive Skills: Lessons from the GED Testing Program.” By James J. Heckman and Yona Rubinstein. American Economic Review, May, 2001. Pages 145–149. <www.jstor.org>

Pages 145–146:

Studies by Samuel Bowles and Herbert Gintis (1976), Rick Edwards (1976), and Roger Klein and others (1991) demonstrate that job stability and dependability are traits most valued by employers as ascertained by supervisor ratings and questions of employers although they present no direct evidence on wages and educational attainment. Perseverance, dependability, and consistency are the most important predictors of grades in school (Bowles and Gintis, 1976).

[3] Encyclopedia of Education Economics and Finance. Edited by Dominic J. Brewer and Lawrence O. Picus. Sage Publications, 2014.

Page 498:

Omitted variable bias (OVB) occurs when an important independent variable is excluded from an estimation model, such as a linear regression, and its exclusion causes the estimated effects of the included independent variables to be biased. Bias will occur when the excluded variable is correlated with one or more of the included variables. An example of this occurs when investigating the returns to education. This typically involves regressing the log of wages on the number of years of completed schooling as well as on other demographic characteristics such as an individual’s race and gender. One important variable determining wages, however, is a person’s ability. In many such regressions, a measure of ability is not included in the regression (or the measure included only imperfectly controls for ability). Since ability is also likely to be correlated with the amount of schooling an individual receives, the estimated return to years of completed schooling will likely suffer from OVB.
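
The wage-regression example in this passage can be reproduced with a small simulation (a hypothetical sketch with synthetic data and made-up coefficients, not an estimate from any real dataset). When ability, which drives both schooling and wages, is omitted, the estimated return to schooling is biased upward:

```python
import random

random.seed(42)
n = 50_000

# Synthetic population: ability raises both schooling and log wages.
# True return to a year of schooling is 0.10.
ability = [random.gauss(0, 1) for _ in range(n)]
schooling = [12 + 2 * a + random.gauss(0, 1) for a in ability]
log_wage = [1.0 + 0.10 * s + 0.5 * a + random.gauss(0, 0.3)
            for s, a in zip(schooling, ability)]

def demean(xs):
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

s, a, y = demean(schooling), demean(ability), demean(log_wage)

# Short regression: log wage on schooling only (ability omitted).
b_short = dot(s, y) / dot(s, s)

# Long regression: schooling and ability together, via the normal
# equations for two demeaned regressors.
Sss, Saa, Ssa = dot(s, s), dot(a, a), dot(s, a)
Ssy, Say = dot(s, y), dot(a, y)
b_long = (Ssy * Saa - Say * Ssa) / (Sss * Saa - Ssa**2)

# b_short is biased upward (toward ~0.30 here), while b_long
# recovers the true coefficient of ~0.10.
```

The size of the bias follows the textbook formula: the omitted coefficient (0.5) times the regression of ability on schooling (2/5), giving 0.10 + 0.5 × 0.4 ≈ 0.30 for the short regression.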

[4] Report: “Improving Health and Social Cohesion through Education.” Organization for Economic Cooperation and Development, Center for Educational Research and Innovation, 2010. <www.oecd.org>

Pages 31–33:

(a) Reverse causality

One source of endogeneity stems from the possibility that there is reverse causality, whereby poor health or low CSE [civic and social engagement] reduces educational attainment. Poor health in youth might interfere with educational attainment by interfering with student learning because of increased absences and inability to concentrate. It may also lead to poor adult health, thus creating a correlation between education and adult health. Similarly, low CSE such as lack of trust and political interest might also reduce educational attainment. For example, a family with low CSE might reduce their involvement with schools, which might lead to poorer student outcomes.7

The bias due to reverse causality can be re-cast as an omitted variable problem after considering timing issues. Since health and CSE tend to persist over time, past health or CSE can be an important determinant of current health or CSE. Thus, past health or CSE is an omitted variable in equation (1) which is captured by the error term. The extent to which omitting past health or CSE will lead to an omitted variable bias depends on the extent to which past health or CSE is also correlated with the included variable Education. Because the current stock of education depends on past decisions about investments in education, reverse causality generates a correlation between past health or CSE and the individual’s current stock of education.8 If the estimated coefficient picks up the effect of past health or CSE … will be biased towards overestimating the causal effect of education.

(b) Hidden third variables

The second source of endogeneity comes from the possibility that there might be one or more hard-to-observe hidden third variables which are the true causes of both educational attainment and health and CSE.9 In the context of the education–earnings link, the most commonly mentioned hidden third variable is ability.10 The long-standing concern in this line of research has been that people with greater cognitive ability are more likely to invest in more education, but even without more education their higher cognitive ability would lead to higher earnings (Card, 2001). More recently, non-cognitive abilities such as the abilities to think ahead, to persist in tasks, or to adapt to their environments have been suggested as important determinants of both education and earnings outcomes (Heckman and Rubinstein, 2001).

In the context of the education–health link, Fuchs (1993) describes time preference and self-efficacy as his favorite candidates for hidden third variables. People with a low rate of time preference are more willing to forego current utility and invest more in both education and health capital that pays off in the future (Farrell and Fuchs, 1982, Fuchs, 1982). A classic example is the Stanford Marshmallow Experiment in which 4 year-olds were given the choice between eating the marshmallow now or waiting for the experimenter’s return and getting a second marshmallow. When these children were tested again at age 18, Shoda and others (1990) found a strong correlation between delayed gratification at age 4 and mathematical and English competence. Similarly, people with greater self-efficacy, i.e. those who believe in their ability to exercise control over outcomes, will be more likely to invest in schooling and health. Most studies of the schooling–health link use data sets that do not contain direct or proxy measures of time preference and self-efficacy. Consequently, these variables are typically omitted when estimating equation (1). The resulting omitted variable bias again implies that … will be biased towards overestimating the causal effect of education on health.

In the context of the education–CSE link, Milligan and others (2004) suggest that the same parents who encourage their children to participate in civic activities might also instill in their children a stronger taste for education.11 It also seems reasonable to suggest time preference and self-efficacy as candidates for hidden third variables behind the education–CSE link. As suggested by the term “social capital”, education capital, health capital and CSE share some common features. In particular, a belief in self-efficacy is a potentially important determinant of civic participation and other aspects of investments in CSE. As in the education–health link, this type of omitted variable bias implies that … will be biased towards overestimating the causal effect of education on CSE.

A few recent studies have explored the issue of biases due to omitting measures of cognitive or non-cognitive skills in the context of the education–health link. Sander (1998) suggests that some of the negative correlation between attending college and smoking in the US can be attributed to differences in cognitive ability. Auld and Sidhu (2005) using the U.S. Armed Forces Qualification Test (AFQT) scores suggest that cognitive ability accounts for roughly one-quarter of the association between education and self-reported health limitations. Kenkel and others (2006) also use the AFQT score as a measure of cognitive skills and in addition include the Rotter index of the locus of control as a proxy for non-cognitive skills. They find that cognitive ability has strong associations with smoking, but weaker associations with being overweight. Their results for the Rotter index of locus of control12 suggest that men who believe that what happens to them is outside their control are more likely to currently smoke and are less likely to be former smokers. Locus of control is more weakly associated with women’s smoking and is not associated with the probability of being overweight or obese for either men or women. Hence, the empirical evidence from the United States suggests that cognitive and non-cognitive ability might be important omitted variables in many previous studies of the education–health link.

Page 36: “Although studies identifying the causal effect of education on health and CSE should strive to control for hidden third variables such as time preference, in most cases data limitations will severely limit the usefulness of this strategy.”

[5] Book: Higher Education: Handbook of Theory and Research (Volume 28). Edited by Michael B. Paulsen. Springer, 2013.

Chapter 6: “Instrumental Variables: Conceptual Issues and an Application Considering High School Course Taking.” By Rob M. Bielby and others. Pages 263–312.

Page 273:

Some student characteristics may be difficult or impossible to obtain information about in observational datasets, but this does not change the fact that they are confounding factors (Cellini, 2008). Examples of potential unobservable factors in course taking effects research include a student’s enjoyment of the learning process and a student’s desire to undertake and persevere through challenges. It is likely that these unobservable factors contribute to student selection into high school courses and a student’s subsequent choice to attain a bachelor’s degree.

[6] Paper: “What Roles Do Parent Involvement, Family Background, and Culture Play in Student Motivation?” By Alexandra Usher and Nancy Kober. Center on Education Policy, 2012. <eric.ed.gov>

Page 1:

Research has long documented a strong relationship between family background factors, such as income and parents’ educational levels, and student achievement. Studies have also shown that parents can play an important role in supporting their children’s academic achievement. But to what extent do family background and parent involvement affect student motivation, a critical underpinning of academic achievement and success in school?

This paper examines findings from research about the impact of various family background and cultural factors on student motivation, as well as the role of parental beliefs, attitudes, and actions in fostering children’s motivation. The paper does not attempt to be a comprehensive review of the broad literature on family background and achievement, but rather is a sampling of some current findings from the field that appear to impinge on motivation.

[7] Book: Knowing What Students Know: The Science and Design of Educational Assessment. Edited by James W. Pellegrino, Naomi Chudowsky, and Robert Glaser. National Academies Press, 2001. <www.nap.edu>

Page 39: “[A] teacher whose students have higher test scores is not necessarily better than one whose students have lower scores. The quality of inputs—such as the entry characteristics of students or educational resources available—must also be considered.”

Page 40:

As with evaluating teachers, care must be taken not to extend the results of assessments at a particular school to reach conclusions not supported by the evidence. For example, a school whose students have higher test scores is not necessarily better than one whose students have lower test scores. As in judging teacher performance, the quality of inputs—such as the entry characteristics of students or educational resources available—must also be considered.

[8] Book: Introductory Econometrics: Using Monte Carlo Simulation with Microsoft Excel. By Humberto Barreto and Frank M. Howland. Cambridge University Press, 2006.

Page 491:

Omitted variable bias is a crucial topic because almost every study in econometrics is an observational study as opposed to a controlled experiment. Very often, economists would like to be able to interpret the comparisons they make as if they were the outcomes of controlled experiments. In a properly conducted controlled experiment, the only systematic difference between groups results from the treatment under investigation; all other variation stems from chance. In an observational study, because the participants self-select into groups, it is always possible that varying average outcomes between groups result from systematic difference between groups other than the treatment. We can attempt to control for these systematic differences by explicitly incorporating variables in a regression. Unfortunately, if not all of those differences have been controlled for in the analysis, we are vulnerable to the devastating effects of omitted variable bias.

[9] Book: Multiple Regression: A Primer. By Paul D. Allison. Pine Forge Press, 1998.

Chapter 1: “What Is Multiple Regression?” <us.sagepub.com>

Page 20:

Multiple regression shares an additional problem with all methods of statistical control, a problem that is the major focus of those who claim that multiple regression will never be a good substitute for the randomized experiment. To statistically control for a variable, you have to be able to measure that variable so that you can explicitly build it into the data analysis, either by putting it in the regression equation or by using it to form homogeneous subgroups. Unfortunately, there’s no way that we can measure all the variables that might conceivably affect the dependent variable. No matter how many variables we include in a regression equation, someone can always come along and say, “Yes, but you neglected to control for variable X and I feel certain that your results would have been different if you had done so.”

[10] Book: Theory-Based Data Analysis for the Social Sciences (2nd edition). By Carol S. Aneshensel. SAGE Publications, 2013.


Page 90:

The numerous variables that are omitted from any model are routinely assumed to be uncorrelated with the error term, a requirement for obtaining unbiased parameter estimates from regression models. However, the possibility that unmeasured variables are correlated with variables that are in the model obviously cannot be eliminated on empirical grounds. Thus, omitted variable bias cannot be ruled out entirely as a counterargument for the empirical association between the focal independent and dependent variables in observational studies.

[11] Book: Applied Statistics for Economists. By Margaret Lewis. Routledge, 2012.

Page 413: “In economics, our primary concern is to identify and then include all relevant independent variables as indicated by economic theory. Omitting such variables will cause the regression model to be underspecified, [and] the partial regression coefficients that are affected by the omitted variable(s) will not equal the true population parameters.”

[12] Encyclopedia of Education Economics and Finance. Edited by Dominic J. Brewer and Lawrence O. Picus. Sage Publications, 2014.

Page 498:

Omitted variable bias (OVB) occurs when an important independent variable is excluded from an estimation model, such as a linear regression, and its exclusion causes the estimated effects of the included independent variables to be biased. Bias will occur when the excluded variable is correlated with one or more of the included variables. An example of this occurs when investigating the returns to education. This typically involves regressing the log of wages on the number of years of completed schooling as well as on other demographic characteristics such as an individual’s race and gender. One important variable determining wages, however, is a person’s ability. In many such regressions, a measure of ability is not included in the regression (or the measure included only imperfectly controls for ability). Since ability is also likely to be correlated with the amount of schooling an individual receives, the estimated return to years of completed schooling will likely suffer from OVB.
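The returns-to-schooling example in this passage can be illustrated numerically (all parameters hypothetical): if unobserved ability raises both schooling and wages, the estimated return to a year of schooling exceeds the true one by the ability coefficient times the slope of ability on schooling, which is the standard OVB formula.

```python
# Hypothetical illustration of ability bias in a wage regression.
# log wages depend on schooling and on unobserved ability; ability is
# correlated with schooling, so the short regression overstates the return.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
ability = rng.normal(size=n)
schooling = 12 + 2 * ability + rng.normal(size=n)   # abler people school longer
log_wage = 1.0 + 0.08 * schooling + 0.10 * ability + 0.2 * rng.normal(size=n)

# Short regression: slope of log wage on schooling alone
short = np.cov(schooling, log_wage)[0, 1] / np.var(schooling, ddof=1)

# OVB formula: short = true + (ability coefficient) * slope of ability on schooling
delta = np.cov(schooling, ability)[0, 1] / np.var(schooling, ddof=1)
predicted = 0.08 + 0.10 * delta

print(f"estimated return to schooling: {short:.3f}")   # above the true 0.08
print(f"bias formula predicts:         {predicted:.3f}")
```

Here the short regression converges to roughly 0.12 rather than the true 0.08, matching the bias formula; a real ability measure would be needed to close the gap.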

[13] Book: Higher Education: Handbook of Theory and Research (Volume 28). Edited by Michael B. Paulsen. Springer, 2013.

Chapter 6: “Instrumental Variables: Conceptual Issues and an Application Considering High School Course Taking.” By Rob M. Bielby and others. Pages 263–312.

Page 273:

An additional issue with the aforementioned studies is that none employ strategies to eliminate the influence of unobservable factors on course taking and attainment. Some student characteristics may be difficult or impossible to obtain information about in observational datasets, but this does not change the fact that they are confounding factors (Cellini, 2008). Examples of potential unobservable factors in course taking effects research include a student’s enjoyment of the learning process and a student’s desire to undertake and persevere through challenges. It is likely that these unobservable factors contribute to student selection into high school courses and a student’s subsequent choice to attain a bachelor’s degree. However, none of the studies we examined that employ a standard regression approach accounted for a student’s intrinsic love of learning or ability to endure through difficulties; the failure to account for these unobserved factors may bias the estimates that result from these studies.

[14] Book: Multiple Regression: A Primer. By Paul D. Allison. Pine Forge Press, 1998.

Chapter 1: “What Is Multiple Regression?” <us.sagepub.com>

Page 1: “Multiple regression is a statistical method for studying the relationship between a single dependent variable and one or more independent variables. It is unquestionably the most widely used statistical technique in the social sciences. It is also widely used in the biological and physical sciences.”

Chapter 3: “What Can Go Wrong With Multiple Regression?” <us.sagepub.com>

Page 49:

Any tool as widely used as multiple regression is bound to be frequently misused. Nowadays, statistical packages are so user-friendly that anyone can perform a multiple regression with a few mouse clicks. As a result, many researchers apply multiple regression to their data with little understanding of the underlying assumptions or the possible pitfalls. Although the review process for scientific journals is supposed to weed out papers with incorrect or misleading statistical methods, it often happens that the referees themselves have insufficient statistical expertise or are simply too rushed to catch the more subtle errors. The upshot is that you need to cast a critical eye on the results of any multiple regression, especially those you run yourself.

Fortunately, the questions that you need to ask are neither extremely technical nor large in number. They do require careful thought, however, which explains why even experts occasionally make mistakes or overlook the obvious. Virtually all the questions have to do with situations where multiple regression is used to make causal inferences.

NOTE: Pages 49–65 detail eight possible pitfalls of regression analyses.

Page 65: “The preceding eight problems are the ones I believe most often lead to serious errors in judging the results of a multiple regression. By no means do they exhaust the possible pitfalls that may arise. Before concluding this chapter, I’ll briefly mention a few others.”

Page 67: “Non-experimental data rarely tell you anything about the direction of a causal relationship. You must decide the direction based on your prior knowledge of the phenomenon you’re studying.”

[15] Book: Regression With Social Data: Modeling Continuous and Limited Response Variables. By Alfred DeMaris. John Wiley & Sons, 2004.

Page 9:

Regression modeling of nonexperimental data for the purpose of making causal inferences is ubiquitous in the social sciences. Sample regression coefficients are typically thought of as estimates of the causal impacts of explanatory variables on the outcome. Even though researchers may not acknowledge this explicitly, their use of such language as impact or effect to describe a coefficient value often suggest a causal interpretation. This practice is fraught with controversy….

Page 12:

Friedman … is especially critical of drawing causal inferences from observational data, since all that can be “discovered,” regardless of the statistical candlepower used, is association. Causation has to be assumed into the structure from the beginning. Or, as Friedman … says: “If you want to pull a causal rabbit out of the hat, you have to put the rabbit into the hat.” In my view, this point is well taken; but it does not preclude using regression for causal inference. What it means, instead, is that prior knowledge of the causal status of one’s regressors is a prerequisite for endowing regression coefficients with a causal interpretation, as acknowledged by Pearl 1998.

Page 13: “In sum, causal modeling via regression, using nonexperimental data, can be a useful enterprise provided we bear in mind that several strong assumptions are required to sustain it. First, regardless of the sophistication of our methods, statistical techniques only allow us to examine associations among variables.”

[16] Article: “Statistical Malpractice.” By Bruce G. Charlton. Journal of the Royal College of Physicians of London, March 1996. <www.researchgate.net>

Page 112: “Science is concerned with causes but statistics is concerned with correlations.”

Page 113:

The root of most instances of statistical malpractice is the breaking of mathematical neutrality and the introduction of causal assumptions into analysis without justifying them on scientific grounds. This amounts to performing science by sleight-of-hand: the quickness of the statistics deceives the mind. The process is often accidental, the product of misunderstanding rather than of malice—as commonly happens when statistical adjustments or standardization of populations are performed to remove the effects of confounding variables.6 These are maneuvers by which data sets are recalculated (for example, by stratified or multivariate analysis) in an attempt to eliminate the consequences of uncontrolled “interfering” variables which distort the causal relationship under study. …

There are however, no statistical rules by which confounders can be identified, and the process of adjustment involves making quantitative causal assumptions based upon secondary analysis of the database in question. …

Adjustment is therefore, implicitly, a way of modeling the magnitude of a causal process in order to subtract its effects from the data. However, modeling is not mathematically neutral and involves inputting assumptions—an activity which requires to be justified for each case. …

… Statistical malpractice occurs, however, exactly because it has not dispensed with causation, but has merely concealed it under a cloak of mathematical neutrality.

[17] Paper: “Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide.” By Martin Schlotter, Guido Schwerdt, and Ludger Woessmann. Education Economics, January 2011. <www.tandfonline.com>

Page 110:

Using standard statistical methods, it is reasonably straightforward to establish whether there is an association between two things—for example, between the introduction of a certain education reform (the treatment) and the learning outcome of students (the outcome). However, whether such a statistical correlation can be interpreted as the causal effect of the reform on outcomes is another matter. The problem is that there may well be other reasons why this association comes about.

Page 111:

Whenever other reasons exist that give rise to some correlation between the two things of interest—the treatment and the outcome—the overall correlation cannot be interpreted as the causal effect of the treatment on the outcome. Broadly speaking, this is what economists call the ‘endogeneity problem’. The term stems from the idea that treatment cannot be viewed as exogenous to the model of interest, as it should be, but that it is rather endogenously determined within the model—depending on the outcome or being jointly determined with the outcome by a third factor. Because of the problem of endogeneity, estimates of the association between treatment and outcome based on correlations will be biased estimates of the causal effect of treatment on outcome.2

Standard approaches try to deal with this problem by observing the other sources of possible correlation and take out the difference in outcomes that can be attributed to these other observed differences. This is the approach of multivariate models that estimate the effects of multiple variables on the outcome at the same time, such as the classical ordinary least-squares (OLS) or multilevel modeling (or hierarchical linear models, HLM) techniques. They allow estimating the association between treatment and outcome conditional on the effects of the other observed factors.

2 Other possible sources of endogeneity include self-selection (objects with different characteristics can choose whether to be treated or not) and simultaneity (treatment and outcome are choice variables that are jointly determined). In econometric terms, measurement error in the treatment variable can also be interpreted as an endogeneity problem, because it gives rise to a particular form of association between treatment and outcome (one that generally biases the estimates toward finding no effect, even if there was one).

Page 131:

But obtaining convincing evidence on the effects on specific education policies and practices is not an easy task. As a precondition, relevant data on possible outcomes has to be gathered. What is more, showing a mere correlation between a specific policy or practice and potential outcomes is no proof that the policy or practice caused the outcome. For policy purposes, mere correlations are irrelevant, and only causation is important. What policy-makers care about is what would really happen if they implemented a specific policy or practice—would it really change any outcome that society cares about? In order to implement evidence-based policy, policy-makers require answers to such causal questions.

[18] Book: Multiple Regression: A Primer. By Paul D. Allison. Pine Forge Press, 1998.

Chapter 1: “What Is Multiple Regression?” <us.sagepub.com>

Page 20:

Multiple regression shares an additional problem with all methods of statistical control, a problem that is the major focus of those who claim that multiple regression will never be a good substitute for the randomized experiment. To statistically control for a variable, you have to be able to measure that variable so that you can explicitly build it into the data analysis, either by putting it in the regression equation or by using it to form homogeneous subgroups. Unfortunately, there’s no way that we can measure all the variables that might conceivably affect the dependent variable. No matter how many variables we include in a regression equation, someone can always come along and say, “Yes, but you neglected to control for variable X and I feel certain that your results would have been different if you had done so.”

That’s not the case with randomization in an experimental setting. Randomization controls for all characteristics of the experimental subjects, regardless of whether those characteristics can be measured. Thus, with randomization there’s no need to worry about whether those in the treatment group are smarter, more popular, more achievement oriented, or more alienated than those in the control group (assuming, of course, that there are enough subjects in the experiment to allow randomization to do its job effectively).

[19] Book: The Education Gap: Vouchers and Urban Schools (Revised edition). By William G. Howell and Paul E. Peterson with Patrick J. Wolf and David E. Campbell. Brookings Institution Press, 2006 (first published in 2002). <www.brookings.edu>

Page 39:

In a perfectly controlled experiment in the natural sciences, the researcher is able to control for all factors while manipulating the variable of interest. …

Experiments with humans are much more difficult to manage. Researchers cannot give out pills or placebos and then ask subjects not to change any other aspect of their lives. To conduct an experiment in the social sciences that nonetheless approximates the natural-science ideal, scientists have come up with the idea of random assignment—drawing names out of a hat (or, today, by computer) and putting subjects into a treatment or control group. When individuals are assigned randomly to one of two categories, one can assume that the two groups do not differ from one another systematically, except in the one respect under investigation.

Page 40:

It is the very simplicity of random assignment that makes such studies so eloquent and their findings so compelling. Simply by comparing what happens to members of the treatment and control groups, analysts can assess whether an intervention makes any difference, positive or negative. Of course, complications inevitably arise. People in the treatment group refuse treatment. People in the control group discover alternative ways of getting the treatment. People fail to report back, or move away, or provide inaccurate information. Still, statisticians have found a variety of ways to correct for such eventualities; such adjustments are discussed in greater detail below.

[20] Paper: “Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide.” By Martin Schlotter, Guido Schwerdt, and Ludger Woessmann. Education Economics, January 2011. <www.tandfonline.com>

Page 132:

In medical research, experimental evaluation techniques are a well-accepted standard device to learn what works and what does not. No one would treat large numbers of people with a certain medication unless it has been shown to work. Experimental and quasi-experimental studies are the best way to reach such an assessment. It is hoped that a similar comprehension is reached in education so that future education policies and practices will be able to better serve the students.

[21] Paper: “Private School Vouchers and Student Achievement: An Evaluation of the Milwaukee Parental Choice Program.” By Cecilia Elena Rouse. Quarterly Journal of Economics, May 1998. Pages 553–602. <eml.berkeley.edu>

Page 554:

Ideally, the issue of the relative effectiveness of private versus public schooling could be addressed by a social experiment in which children in a well-defined universe were randomly assigned to a private school (the “treatment group”), while others were assigned to attend public schools (the “control group”). After some period of time, one could compare outcomes, such as test scores, high school graduation rates, or labor market success between the treatment and control groups. Since, on average, the only differences between the groups would be their initial assignment—which was randomly determined—any differences in outcomes could be attributed to the type of school attended.
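The design Rouse describes can be sketched in a small simulation (all numbers hypothetical): even when an unobserved trait strongly influences outcomes, random assignment balances it across groups, so a simple difference in means recovers the treatment effect.

```python
# Sketch of the randomized design described above (hypothetical numbers):
# an unobserved "ability" trait affects test scores, but random assignment
# balances it across the treatment and control groups, so the difference
# in mean scores isolates the effect of the school type.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
ability = rng.normal(size=n)                  # unobserved confounder
treated = rng.integers(0, 2, size=n) == 1     # random assignment by coin flip
true_effect = 5.0
scores = 50 + 10 * ability + true_effect * treated + rng.normal(size=n)

estimate = scores[treated].mean() - scores[~treated].mean()
print(f"difference in means: {estimate:.2f}")  # near 5.0 despite omitted ability
```

No regression controls are needed: because assignment is independent of ability by construction, the naive comparison is unbiased, which is the contrast with the observational settings in the preceding footnotes.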

[22] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1483: “The random assignment process makes estimation of causal effects straightforward.”

Page 1484: “Note that no assumptions regarding the distributions or independence of potential outcomes are needed. This is because the randomized design itself is the basis for inference (Fisher 1935), and preexisting clusters cannot be positively correlated with the treatment assignments in any systematic way.”

[23] Paper: “Another Look at the New York City School Voucher Experiment.” By Alan B. Krueger and Pei Zhu. American Behavioral Scientist, January 2004. Pages 658–698. <abs.sagepub.com>

Page 660: “Because of random assignment, however, estimates are unbiased even without conditioning on baseline information….”

Pages 693–694:

Researchers are often unsure as to whether they should or should not control for baseline characteristics when a treatment is randomly assigned. We would advise that key results be presented both ways, with and without baseline characteristics (and with and without varying samples). …

Controlling for baseline characteristics can be justified if their inclusion increases the precision of the key estimates. As a practical matter, however, controlling for baseline characteristics tends to reduce the sample size, which could well offset the decline in residual variance and create a nonrepresentative sample.

Simplicity and transparency are valuable in their own right and can help prevent mistakes. These benefits may be well worth the loss of some precision. A complicated design increases the likelihood of error down the road, for example, in the derivation of weights or in the delineation of strata within which the treatment is randomly assigned. An underappreciated virtue of presenting results without baseline covariates is that the results are transparent and simple, and therefore less prone to human error.

[24] Book: Regression With Social Data: Modeling Continuous and Limited Response Variables. By Alfred DeMaris. John Wiley & Sons, 2004.

Page 10:

Nonetheless, according to the potential response model, the average causal effect can be estimated in an unbiased fashion if there is random assignment to the cause. Unfortunately, this pretty much rules out making causal inferences from nonexperimental data. … Still, hard-core adherence to the potential response framework would deny the causal status of most of the interesting variables in the social sciences because they are not capable of being assigned randomly. Holland and Rubin, for example have made up a motto that expresses this quite succinctly: “No causation without manipulation” (Holland, 1986, p. 959). In other words, only “treatments” that can be assigned randomly to any case at will are considered candidates for exhibiting causal effects. … I agree with others … who take exception to this restrictive conception of causality, despite the intuitive appeal of counterfactual reasoning.

Page 13:

Sobel’s (1988, p. 346) advice is in the same vein: “[s]ociologists might follow the example of epidemiologists. Here, when an association is found in an observational study that might plausibly suggest causation, the findings are treated as preliminary and tentative. The next step, when possible, is to conduct the randomized study that will more definitively answer the causal question of interest.”

[25] Paper: “A Modified General Location Model for Noncompliance with Missing Data: Revisiting the New York City School Choice Scholarship Program Using Principal Stratification.” By Hui Jin and others. Journal of Educational and Behavioral Statistics, April 2010. Pages 154–173. <jeb.sagepub.com>

Pages 154–155: “Although quite a few school choice voucher programs have been conducted across the United States, the New York City School Choice Scholarship Program is arguably the largest and best-implemented private school choice randomized experiment to date. However, even this program suffers from two common complications in social science experiments: missing data and noncompliance.”

[26] Dataset: “Table 3.16. Government Current Expenditures by Function [Billions of Dollars].” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised November 17, 2023. <apps.bea.gov>

“Government1 … Education … 2022 [=] 1,179.9”

[27] Calculated with the dataset: “HH-1. Households by Type: 1940 to Present.” U.S. Census Bureau, Current Population Survey, November 2022. <www.census.gov>

“Total households (in thousands) … All [=] 131,202”

CALCULATION: $1,179,900,000,000 education spending / 131,202,000 households = $8,993

[28] Calculated with the dataset: “Table 1.1.5. Gross Domestic Product [Billions of Dollars].” United States Department of Commerce, Bureau of Economic Analysis. Last revised September 29, 2022. <apps.bea.gov>

“Gross domestic product … 2022 [=] 25,744.1”

CALCULATION: $1,179.9 billion education spending / $25,744.1 billion GDP = 4.6%

[29] Calculated with the dataset: “Table 3.16. Government Current Expenditures by Function [Billions of Dollars].” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised November 17, 2023. <apps.bea.gov>

“2022 … Government1 [=] 8,691.7 … Education [=] 1,179.9”

CALCULATION: $1,179.9 billion / $8,691.7 billion = 14%
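The three calculations in footnotes [27]–[29] can be reproduced directly from the BEA and Census figures quoted above:

```python
# Re-deriving the cited figures from the numbers in footnotes [26]-[29].
education = 1_179.9e9      # 2022 government current expenditures on education
households = 131_202_000   # total U.S. households, 2022 (Census CPS)
gdp = 25_744.1e9           # 2022 gross domestic product
total_gov = 8_691.7e9      # 2022 total government current expenditures

per_household = education / households
share_of_gdp = education / gdp
share_of_gov = education / total_gov

print(f"per household: ${per_household:,.0f}")  # $8,993
print(f"share of GDP:  {share_of_gdp:.1%}")     # 4.6%
print(f"share of govt: {share_of_gov:.0%}")     # 14%
```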

[30] As documented below, government “total expenditures” is a more inclusive measure of spending than “current expenditures,” but the U.S. Bureau of Economic Analysis—which provides the only comprehensive and timely estimates of government spending at all levels—does not publish total expenditures broken down by function (for example, education, healthcare, etc.). Instead, it only publishes current expenditures by function.† ‡

“Current expenditures” include “all spending by government on current-period activities,” such as:

  • “consumption expenditures,” or “what government spends on its work force and for goods and services, such as fuel for military jets and rent for government buildings and other structures.”
  • “current transfer payments,” which consist of:
    • “social benefits,” or “payments from social insurance funds, such as social security and Medicare, and payments providing other income support, such as Medicaid and food stamp benefits.”
    • “grants-in-aid to state and local governments.”
    • “transfers to the rest of the world,” or “federal aid to foreign countries and payments to international organizations such as the United Nations.”
  • “interest payments,” or the costs “of borrowing by governments to finance their capital and operational costs.”
  • “subsidies,” or grants to businesses, other government entities, and homeowners.§ †

“Total expenditures” include all current expenditures plus:

  • “gross investment,” or “what government spends on structures, equipment, and software, such as new highways, schools, and computers.” This also includes research expenditures.
  • “other capital-type expenditures that affect future-period activities,” such as payments to foreigners.
  • “net purchases of nonproduced assets,” such as land.§ † Φ

NOTES:

  • † Report: “A Primer on BEA’s Government Accounts.” By Bruce E. Baker and Pamela A. Kelly. U.S. Bureau of Economic Analysis, March 2008. <apps.bea.gov>. Page 29: “The federal estimates in the NIPAs [National Income and Product Accounts] contain much of the same information as the Budget of the United States Government, although the information is classified differently. The state and local estimates in the NIPAs are the only comprehensive estimates of state and local government activity available on a timely basis.” Page 34: “Current transfer payments. These consist of social benefits and other current transfer payments to the rest of the world. Social benefits are payments from social insurance funds, such as social security and Medicare, and payments providing other income support, such as Medicaid and food stamp benefits. Other current transfers to the rest of the world consists of federal aid to foreign countries and payments to international organizations such as the United Nations. Federal ‘other current transfer payments’ also includes grants-in-aid to state and local governments. … Interest payments. These represent the cost of borrowing by governments to finance their capital and operational costs. … Subsidies. These are payments to businesses, including homeowners and government enterprises at another level of government.”
  • ‡ Email from the U.S. Bureau of Economic Analysis to Just Facts, March 18, 2015. “BEA does not produce an estimate of government total expenditures by function as defined by the national income and product accounts (NIPAs).”
  • § Webpage: “FAQ: BEA Seems to Have Several Different Measures of Government Spending. What Are They for and What Do They Measure?” U.S. Bureau of Economic Analysis (BEA). Last modified April 28, 2020. <www.bea.gov>. “Consumption expenditures include what government spends on its work force and for goods and services, such as fuel for military jets and rent for government buildings and other structures. Gross investment includes what government spends on structures, equipment, and software, such as new highways, schools, and computers. … Current expenditures measures all spending by government on current-period activities, and consists not only of government consumption expenditures, but also current transfer payments, interest payments, and subsidies (and removes wage accruals less disbursements#). … Total government expenditures: In addition to the transactions that are included in current expenditures, this measure includes gross investment (as defined earlier), and other capital-type expenditures that affect future-period activities, such as capital transfer payments and net purchases of nonproduced assets (for example, land).”£
  • # Email from the U.S. Bureau of Economic Analysis to Just Facts, March 18, 2015. “Wage accruals less disbursements is no longer an adjustment that is needed in the accounts as BEA’s income estimates for wages were moved to an accrual basis during the 2013 comprehensive revision.”
  • £ Webpage: “Glossary: Capital Transfers to the Rest of the World (Net).” U.S. Bureau of Economic Analysis. Last modified April 13, 2018. <www.bea.gov>. “Cash or in-kind transfers to foreigners that are linked to the acquisition or disposition of a fixed asset.”
  • Φ Email from the U.S. Bureau of Economic Analysis to Just Facts, June 19, 2015. “As of July 2013, research expenditures are included in the NIPAs as investment.”

[31] See the footnote above, which documents that the U.S. Bureau of Economic Analysis does not publish total expenditures for education or any other specific function of government. Per the U.S. Bureau of Economic Analysis, land purchases are included in total expenditures but not in current expenditures:

“Total government expenditures: In addition to the transactions that are included in current expenditures, this measure includes … net purchases of nonproduced assets (for example, land).” [Webpage: “FAQ: BEA Seems to Have Several Different Measures of Government Spending. What Are They for and What Do They Measure?” U.S. Bureau of Economic Analysis. Last modified April 28, 2020. <www.bea.gov>]

[32] See the second footnote above, which documents that the U.S. Bureau of Economic Analysis does not publish data for education total expenditures. Per the U.S. Bureau of Economic Analysis, purchases of durable items such as buildings and computers are included in total expenditures but not in current expenditures:

Gross investment includes what government spends on structures, equipment, and software, such as new highways, schools, and computers. …

Total government expenditures: In addition to the transactions that are included in current expenditures, this measure includes gross investment….†

Note that although current expenditures do not include gross investment, they do include “consumption of fixed capital,” which measures the depreciation of durable items as they are used.‡ § This accounts for most (but not all) of the costs of these items. From 1929 through 2014, consumption of fixed capital was roughly 70% of gross government investment.#

NOTES:

  • † Webpage: “FAQ: BEA Seems to Have Several Different Measures of Government Spending. What Are They for and What Do They Measure?” U.S. Bureau of Economic Analysis. Last modified April 28, 2020. <www.bea.gov>
  • ‡ Report: “A Primer on BEA’s Government Accounts.” By Bruce E. Baker and Pamela A. Kelly. U.S. Bureau of Economic Analysis, March 2008. <apps.bea.gov>

Page 33: “Consumption expenditures [include] … consumption of fixed capital….”

Page 38: “In estimating the national income and product accounts, it is necessary to compute consumption of fixed capital (CFC) or depreciation. … In the government accounts, CFC is used as a proxy for the services derived from government capital investment, both past and present.”

  • § Calculated with data from “Table 3.1. Government Current Receipts and Expenditures.” U.S. Bureau of Economic Analysis. Last revised February 27, 2015. <apps.bea.gov>. NOTE: An Excel file containing the data and calculations is available upon request.

[33] The next six footnotes document that:

  • Substantial amounts of healthcare benefits promised to government employees are unfunded.
  • Accrual accounting (as opposed to cash accounting) of these benefits would measure these unfunded liabilities.
  • The U.S. Bureau of Economic Analysis (the source of the education spending figures cited above) uses cash accounting (as opposed to accrual accounting) to measure government spending on retiree healthcare benefits.

[34] Report: “State and Local Government Retiree Health Benefits: Liabilities Are Largely Unfunded, but Some Governments Are Taking Action.” U.S. Government Accountability Office, November 2009. <www.gao.gov>

Page 2 (of PDF):

Accounting standards require governments to account for the costs of other post-employment benefits (OPEB)—the largest of which is typically retiree health benefits—when an employee earns the benefit. As such, governments are reporting their OPEB liabilities—the amount of the obligation to employees who have earned OPEB. As state and local governments have historically not funded retiree health benefits when the benefits are earned, much of their OPEB liability may be unfunded. Amid fiscal pressures facing governments, this has raised concerns about the actions the governments can take to address their OPEB liabilities. …

The total unfunded OPEB liability reported in state and the largest local governments’ CAFRs [comprehensive annual financial reports] exceeds $530 billion. However, as variations between studies’ totals show, totaling unfunded OPEB liabilities across governments is challenging for a number of reasons, including the way that governments disclose such data. The unfunded OPEB liabilities for states and local governments GAO [Government Accountability Office] reviewed varied widely in size. Most of these governments do not have any assets set aside to fund them. The total for unfunded OPEB liabilities is higher than $530 billion because GAO reviewed OPEB data in CAFRs for the 50 states and 39 large local governments but not data for all local governments or additional data reported in separate financial reports. Also, the CAFRs we reviewed report data that predate the market downturn. Finally, OPEB valuations are based on assumptions about the health care cost inflation rate and discount rates for assets, which also affect the size of the unfunded liability.

Some state and local governments have taken actions to address liabilities associated with retiree health benefits by setting aside assets to prefund the liabilities before employees retire and reducing these liabilities by changing the structure of retiree health benefits. Approximately 35 percent of the 89 governments for which GAO reviewed CAFRs reported having set aside some assets for OPEB liabilities, but the percentage of the OPEB liability funded varied.

[35] Article: “Defined Benefit Pensions and Household Income and Wealth.” By Marshall B. Reinsdorf and David G. Lenze. Survey of Current Business, U.S. Bureau of Economic Analysis, August 2009. Pages 50–62. <apps.bea.gov>

Pages 50–51:

U.S. households usually participate in two kinds of retirement income programs: social security, and a plan sponsored by their employer. The employer plan may be organized as either a defined contribution plan, such as a 401(k) plan, or a defined benefit plan. Defined contribution plans provide resources during retirement based on the amount of money that has been accumulated in an account, while defined benefit plans determine the level of benefits by a formula that typically depends on length of service and average or final pay. …

… A defined benefit plan has an actuarial liability for future benefits equal to the expected present value of the benefits to which the plan participants are entitled under the benefit formula. The value of participants’ benefit entitlement often does not coincide with the value of the assets that the plan has on hand; indeed, a plan that has a pay-as-you-go funding scheme might have only enough assets to ensure that it can make the current period’s benefit payments.2

A complete measure of the wealth of defined benefit plan participants is the expected present value of the benefits to which they are entitled, not the assets of the plan. This follows from the fact that if the assets of a defined benefit plan are insufficient to pay promised benefits, the plan sponsor must cover the shortfall. …

… [U]nder the accrual approach, the measure of compensation income for the participants in the plan is no longer the employer’s actual contributions to the plan. Instead, it is the present value of the benefits to which employees become entitled as a result of their service to the employer.

Measuring household income from defined benefit plans by actual contributions from employers plus actual investment income on plan assets can be considered a cash accounting approach to measuring these plans’ transactions…. We use the term “accrual accounting” to mean any approach that adopts the principle that a plan’s benefit obligations ought to be recorded as they are incurred.

2. Federal law requires that private pension plans operate as funded plans, not as pay-as-you-go plans.

[36] Report: “Preview of the 2013 Comprehensive Revision of the National Income and Product Accounts: Changes in Definitions and Presentations.” By Shelly Smith and others. U.S. Bureau of Economic Analysis, March 2013. <apps.bea.gov>

Page 21: “Accrual accounting is the preferred method for compiling national accounts because it matches incomes earned from production with the corresponding productive activity and records both in the same period.”

[37] Statement: “Employers’ Accounting for Postretirement Benefits Other Than Pensions.” Financial Accounting Standards Board, December 1990. <fasb.org>

The Board believes that measurement of the obligation and accrual of the cost based on best estimates are superior to implying, by a failure to accrue, that no obligation exists prior to the payment of benefits. The Board believes that failure to recognize an obligation prior to its payment impairs the usefulness and integrity of the employer’s financial statements. …

This Statement relies on a basic premise of generally accepted accounting principles that accrual accounting provides more relevant and useful information than does cash basis accounting. …

[L]ike accounting for other deferred compensation agreements, accounting for postretirement benefits should reflect the explicit or implicit contract between the employer and its employees.

[38] Email from the U.S. Bureau of Economic Analysis to Just Facts, March 19, 2015.

“Retiree health care benefits (which are separate from pensions) are treated on a cash basis and are effectively included in the compensation of current workers.”

[39] Webpage: “What Is Included in Federal Government Employee Compensation?” U.S. Bureau of Economic Analysis. Last modified July 26, 2018. <www.bea.gov>

The contributions for employee health insurance consist of the federal share of premium payments to private health insurance plans for current employees and retirees.1

1 The payments to amortize the unfunded health care liabilities of the Postal Service Retiree Health Benefits Fund are treated as capital transfers to persons and are therefore not included in compensation.

[40] Dataset: “Table 3.16. Government Current Expenditures by Function [Billions of Dollars].” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised November 17, 2023. <apps.bea.gov>

“2022 … Education [=] 1,179.9 … Elementary and secondary [=] 833.7 … Higher [=] 225.7 … Libraries and other [=] 120.5”

[41] Calculated with data from:

a) Dataset: “Table 3.16. Government Current Expenditures by Function [Billions of Dollars].” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised November 17, 2023. <apps.bea.gov>

b) Report: “Fiscal Year 2024 Historical Tables: Budget Of The U.S. Government.” White House Office of Management and Budget, March 2023. <www.whitehouse.gov>

“Table 3.1—Outlays by Superfunction and Function: 1940–2028.” <www.whitehouse.gov>

NOTE: An Excel file containing the data and calculations is available here.

[42] The next three footnotes document that:

  • Private-sector economic output is equal to personal consumption expenditures (PCE) + gross private domestic investment (GPDI) + net exports of goods and services.
  • PCE is the “primary measure of consumer spending on goods and services” by private individuals and nonprofit organizations.
  • GPDI is a measure of private spending on “structures, equipment, and intellectual property products.”

Since education is not a service that is typically imported or exported, a valid approximation of private spending on education can be arrived at by summing PCE and GPDI. The fourth footnote below details the data used in this calculation.
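
The approximation described above amounts to a single sum. A minimal sketch, with the caveat that the function name and the placeholder dollar figures are illustrative assumptions, not BEA data:

```python
def private_education_spending(pce_education, gpdi_education):
    """Approximate private spending on education as PCE on education
    plus GPDI in education, omitting net exports because education is
    rarely imported or exported as a service."""
    return pce_education + gpdi_education

# Placeholder figures in billions of dollars (illustrative only, not BEA data):
print(private_education_spending(300.0, 25.0))  # 325.0
```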

[43] Report: “Fiscal Year 2013 Analytical Perspectives, Budget of the U.S. Government.” White House Office of Management and Budget, February 12, 2012. <www.gpo.gov>

Page 471:

The main purpose of the NIPAs [national income and product accounts published by the U.S. Bureau of Economic Analysis] is to measure the Nation’s total production of goods and services, known as gross domestic product (GDP), and the incomes generated in its production. GDP excludes intermediate production to avoid double counting. Government consumption expenditures along with government gross investment—State and local as well as Federal—are included in GDP as part of final output, together with personal consumption expenditures, gross private domestic investment, and net exports of goods and services (exports minus imports).

[44] Report: “Concepts and Methods of the U.S. National Income and Product Accounts, Chapter 5: Personal Consumption Expenditures.” U.S. Bureau of Economic Analysis. Updated December 2023. <www.bea.gov>

Page 5-1:

Personal consumption expenditures (PCE) is the primary measure of consumer spending on goods and services in the U.S. economy.1 It accounts for about two-thirds of domestic final spending, and thus it is the primary engine that drives future economic growth. PCE shows how much of the income earned by households is being spent on current consumption as opposed to how much is being saved for future consumption.

PCE also provides a comprehensive measure of types of goods and services that are purchased by households. Thus, for example, it shows the portion of spending that is accounted for by discretionary items, such as motor vehicles, or the adjustments that consumers make to changes in prices, such as a sharp run-up in gasoline prices.2

Page 5-2:

PCE measures the goods and services purchased by “persons”—that is, by households and by nonprofit institutions serving households (NPISHs)—who are resident in the United States. Persons resident in the United States are those who are physically located in the United States and who have resided, or expect to reside, in this country for 1 year or more. PCE also includes purchases by U.S. government civilian and military personnel stationed abroad, regardless of the duration of their assignments, and by U.S. residents who are traveling or working abroad for 1 year or less.3

Page 5-70:

Nonprofit Institutions Serving Households

In the NIPAs [National Income and Product Accounts], nonprofit institutions serving households (NPISHs), which have tax-exempt status, are treated as part of the personal sector of the economy. Because NPISHs produce services that are not generally sold at market prices, the value of these services is measured as the costs incurred in producing them.

In PCE, the value of a household purchase of a service that is provided by a NPISH consists of the price paid by the household or on behalf of the household for that service plus the value added by the NPISH that is not included in the price. For example, the value of the educational services provided to a student by a university consists of the tuition fee paid by the household to the university and of the additional services that are funded by sources other than tuition fees (such as by the returns to an endowment fund).

[45] Report: “Measuring the Economy: A Primer on GDP and the National Income and Product Accounts.” U.S. Bureau of Economic Analysis, December 2015. <www.bea.gov>

Page 8: “Gross private domestic investment consists of purchases of fixed assets (structures, equipment, and intellectual property products) by private businesses that contribute to production and have a useful life of more than one year, of purchases of homes by households, and of private business investment in inventories.”

[46] Calculated with data from:

a) Dataset: “Table 2.4.5U. Personal Consumption Expenditures by Type of Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

b) Dataset: “Table 1.1.5. Gross Domestic Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

c) Dataset: “HH-1. Households by Type: 1940 to Present (Numbers in Thousands).” U.S. Census Bureau, Current Population Survey, November 2022. <www.census.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[47] Calculated with the dataset: “Table 2.4.5U. Personal Consumption Expenditures by Type of Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[48] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[49] Report: “Income in the United States: 2021.” By Jessica Semega and Melissa Kollar. U.S. Census Bureau, September 2022. <www.census.gov>

Page 13:

Data on income collected in the CPS ASEC [Current Population Survey Annual Social and Economic Supplements] by the U.S. Census Bureau cover money income received (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc. Money income also excludes tax credits such as the Earned Income Tax Credit, the Child Tax Credit, and special COVID-19-related stimulus payments. Money income does not reflect that some families receive noncash benefits such as Supplemental Nutrition Assistance/food stamps, health benefits, and subsidized housing. In addition, money income does not reflect the fact that noncash benefits often take the form of the use of business transportation and facilities, full or partial payments by business for retirement programs, or medical and educational expenses. …

Data users should consider these elements when comparing income levels. Moreover, readers should be aware that for many different reasons there is a tendency in household surveys for respondents to underreport their income. Based on an analysis of independently derived income estimates, the Census Bureau determined that respondents report income earned from wages or salaries more accurately than other sources of income, and that the reported wage and salary income is nearly equal to independent estimates of aggregate income.

[50] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[51] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[52] Report: “Key Concepts and Features of the 2003 National Assessment of Adult Literacy.” By Sheida White and Sally Dillow. U.S. Department of Education, National Center for Education Statistics, December 2005. <nces.ed.gov>

Page 3:

NAAL [National Assessment of Adult Literacy] Measures How Well U.S. Adults Perform Tasks with Printed Materials

As a part of their everyday lives, adults in the United States interact with a variety of printed and other written materials to perform a multitude of tasks. A comprehensive list of such tasks would be virtually endless. It would include such activities as balancing a checkbook, following directions on a prescription medicine bottle, filling out a job application, consulting a bus schedule, correctly interpreting a chart in the newspaper, and using written instructions to operate a voting machine. …

Literacy is not a single skill or quality that one either possesses or lacks. Rather, it encompasses various types of skills that different individuals possess to varying degrees. There are different levels and types of literacy, which reflect the ability to perform a wide variety of tasks using written materials that differ in nature and complexity. A common thread across all literacy tasks is that each has a purpose—whether that purpose is to pay the telephone bill or to understand a piece of poetry. All U.S. adults must successfully perform literacy tasks in order to adequately function—that is, to meet personal and employment goals as well as contribute to the community.

[53] Book: Productivity Management: A Practical Handbook. By Joseph Prokopenko. International Labour Office, 1987.

Page 242:

Pre-Employment Education

There are two main goals of pre-employment education: to create productivity awareness and to prepare youth for productive work by teaching the necessary knowledge and skills. Unfortunately, too much attention is paid to developing formal knowledge and too little to practical skills.

For example British industrialists have long been complaining that business and management education in the United Kingdom is oriented towards teaching how to trade and how to invest, rather than how to add new value.

Some prestigious educational institutions place too much emphasis on purely academic matters instead of teaching people how to manage factories and shop-floor production. Too much emphasis is still placed on management sciences and research instead of on preparing creative entrepreneurs capable of innovating, and of organising and managing work. Under such a system, it is quite normal that the most gifted go on to academic studies, and the less gifted are forced to work in industry.

A change of emphasis from a knowledge-based or academic system of education (both secondary and higher) to one based on problem-solving and the completion of concrete tasks would result in an improvement in the productivity culture.

[54] Book: Youth Unemployment in the North: Young People on the Labour Market – Actions to Combat Unemployment. Nordic Council of Ministers, 1987.

Page 190:

General secondary education is oriented towards higher education. As a rule, secondary schools do not equip school leavers with any practical skills that would enable them to get a job. Young people receive a good general education from secondary school but their business and financial skills are mostly insufficient. Moreover, they are not prepared for entering into a competitive labour market.

[55] Webpage: “What PIAAC Is.” U.S. Department of Education, National Center for Education Statistics. Accessed February 8, 2022 at <nces.ed.gov>

The Program for the International Assessment of Adult Competencies (PIAAC) is an international study for measuring, analyzing, and comparing adults’ basic skills of literacy, numeracy, and digital problem solving. The assessment focuses on the basic cognitive and workplace skills needed for individuals to participate in society and for economies to prosper. Data from PIAAC is meant to help countries better understand their education and training systems and the distribution of these basic skills across the adult working-age population.

Developed by the Organization for Economic Cooperation and Development (OECD), PIAAC is intended to be administered at least once a decade. PIAAC was first conducted in 2011 and the same survey instruments were administered twice more through 2017. In total, 39 countries participated in PIAAC in Cycle I (2011–17). Cycle II of PIAAC, with revised survey instruments, will begin in 2022, with 33 countries scheduled to participate in the first round of administration.

[56] Report: “Highlights of PIAAC 2017 U.S. Results.” U.S. Department of Education, National Center for Education Statistics. Accessed February 8, 2022 at <nces.ed.gov>

The Program for the International Assessment of Adult Competencies (PIAAC) is a cyclical, large-scale study of adult cognitive skills and life experiences developed by the Organization for Economic Cooperation and Development (OECD) and, in the United States, conducted by the National Center for Education Statistics (NCES). …

In 2017, the third round of PIAAC data collection in the United States took place. The results of the first round of U.S. PIAAC data collection in 2012 and the second round of data collection in 2014 (officially known as the National Supplement to the Main Study) are combined, by design, into one data point for 2012/14.

[57] Webpage: “Frequently Asked Questions.” U.S. Department of Education, National Center for Education Statistics. Accessed February 8, 2022 at <nces.ed.gov>

Countries that participate in PIAAC [Program for the International Assessment of Adult Competencies] must draw a sample of individuals ages 16–65 that represents the entire population of adults living in households in the country. … In the United States, a nationally representative household sample was drawn from the most current Census Bureau population estimates. …

The results from the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS) and the Adult Literacy and Lifeskills Survey (ALL) have been rescaled to match the statistical models used in creating PIAAC scores for literacy and, in the case of ALL, numeracy. Rescaling was possible because PIAAC repeated a sufficient number of the same test questions used in NALS, IALS, and ALL. Rescaled plausible values for literacy for IALS and ALL and numeracy for ALL, along with some trend background questionnaire items, allow for trend analysis with the U.S. PIAAC results along an international trend line. Separately, the rescaled plausible values for literacy for NALS, along with some trend background questionnaire items, allow for trend analysis with the U.S. PIAAC results along a national trend line.

The results from the National Assessment for Adult Literacy (NAAL) were not rescaled to match the model used in creating PIAAC scores. For various reasons, including that there is no overlap between NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments (or practically, not comparable) in terms of literacy items.

[58] Book: The Survey of Adult Skills: Reader’s Companion (2nd edition). Organization for Economic Cooperation and Development, June 2016. <read.oecd-ilibrary.org>

Chapter 3: “The Methodology of the Survey of Adult Skills (PIAAC) and the Quality of Data.” Pages 45–61. <read.oecd-ilibrary.org>

Page 55: “Respondents were permitted to use technical aids such as an electronic calculator, a ruler (which were provided by the interviewers) and to take notes or undertake calculations using a pen and pad during the assessment. Respondents were not allowed to seek assistance from others in completing the cognitive assessment.”

Page 56: “The direct-assessment component of the survey was not designed as a timed test; respondents could take as much or as little time as needed to complete it. However, interviewers were trained to encourage respondents to move to another section of the assessment if they were having difficulties.”

[59] E-mail from U.S. Department of Education to Just Facts on February 11, 2022:

Unfortunately, we do not have a straightforward examples of items with the percentage of respondents answering correctly. …

The actual items used to assess literacy, numeracy, and problem solving in technology-rich environments (PS-TRE) in PIAAC [Program for the International Assessment of Adult Competencies] are not released to public. Most of the assessment items are being preserved for possible use for trend purposes in future assessments. …

One could potentially use the U.S. data files to calculate the percentage who received the item and who answered it correctly, as there are variables for scored responses to each item. However … PIAAC (2012/2014 and 2017) is available in two modes; paper-and-pencil-based assessment (PBA) and computer-based assessment (CBA), meaning that some items would have both a paper and computer-based version. In addition, the computer-based PIAAC assessment used an adaptive design, meaning that respondents were directed to a set of easier or more difficult items based on their answers to the … core questions (as well as their performance in the CBA Module as they advance through the assessment). This adds complexity to interpreting the percent who answered correctly, as the sample of adults receiving the item is not random or representative and is influenced by respondents’ skill levels.

[60] Report: “Sample PIAAC Tasks in Literacy, Numeracy, and Problem Solving in Technology-Rich Environments.” Prepared by American Institutes for Research for U.S. Department of Education, National Center for Education Statistics. Accessed March 23, 2022 at <piaacgateway.com>

Page 27: “Numeracy Item: Level 2 … Beauchamp Manufacturing”

[61] Webpage: “What PIAAC Measures.” U.S. Department of Education, National Center for Education Statistics. Accessed May 24, 2023 at <nces.ed.gov>

Numeracy Proficiency Levels

In PIAAC [Program for the International Assessment of Adult Competencies], results are reported as averages on a 500-point scale or as proficiency levels. Proficiency refers to competence that involves “mastery” of a set of abilities along a continuum that ranges from simple to complex information-processing tasks.

This continuum has been divided into five levels of proficiency. Each level is defined by a particular score-point range associated with competence at specific information-processing tasks. Adults with literacy scores within the score-point range for a particular proficiency level are likely to successfully complete the tasks at that proficiency level as well as any lower proficiency levels. Adults with scores at a particular proficiency level might be able to complete a task at a higher proficiency level but the probability is small and diminishes greatly the higher the level. …

Level 2

226–275 points

Tasks at this level require the respondent to identify and act on mathematical information and ideas embedded in a range of common contexts where the mathematical content is fairly explicit or visual with relatively few distractors. Tasks tend to require the application of two or more steps or processes involving calculations with whole numbers and common decimals, percentages, and fractions; simple measurement and spatial representation; estimation; or interpretation of relatively simple data and statistics in texts, tables, and graphs. …

In general, this means that tasks located at a particular proficiency level can be successfully completed by the “average” person at that level approximately two-thirds of the time. However, individuals scoring at the bottom of the level would successfully complete tasks at that level only about half the time while individuals scoring at the top of the level would successfully complete tasks at that level about 80 percent of the time.

[62] Calculated with data from the webpage: “PIAAC [Program for the International Assessment of Adult Competencies] Data Explorer.” U.S. Department of Education, National Center for Education Statistics. Accessed March 23, 2022 at <nces.ed.gov>

Percentages for U.S. Adults, 16–74 (Household and Prison) PIAAC numeracy: overall scale, by PIAAC Numeracy proficiency levels [BMNUM] and jurisdiction: PIAAC 2012–2017 and PIAAC 2017 … PIAAC 2017 … U.S. Household (16–65 years old) … Percentage … Below Level 1 [=] 9% … Level 1 [=] 20% … Level 2 [=] 33% … Level 3 [=] 27% … Level 4 [=] 9% … Level 5 [=] 1%

CALCULATION: 33% + 27% + 9% + 1% = 70% at Level 2 or higher
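
The arithmetic above can be checked directly from the distribution quoted in this footnote (percentages for U.S. adults ages 16–65, PIAAC 2017 numeracy):

```python
# PIAAC 2017 U.S. numeracy distribution (percent of adults ages 16-65),
# as quoted from the NCES PIAAC Data Explorer above.
distribution = {
    "Below Level 1": 9,
    "Level 1": 20,
    "Level 2": 33,
    "Level 3": 27,
    "Level 4": 9,
    "Level 5": 1,
}

# Share at Level 2 or higher: sum everything except the bottom two groups.
at_level_2_or_higher = sum(
    pct for level, pct in distribution.items()
    if level not in ("Below Level 1", "Level 1")
)
print(at_level_2_or_higher)  # 70
```

Note that the quoted percentages sum to 99, not 100, because each is rounded to the nearest whole percent.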

[63] Report: “Sample PIAAC Tasks in Literacy, Numeracy, and Problem Solving in Technology-Rich Environments.” Prepared by American Institutes for Research for U.S. Department of Education, National Center for Education Statistics. Accessed March 23, 2022 at <piaacgateway.com>

Page 32: “Numeracy Item: Level 3 … Running Shoes”

[64] Webpage: “What PIAAC Measures.” U.S. Department of Education, National Center for Education Statistics. Accessed May 24, 2023 at <nces.ed.gov>

Numeracy Proficiency Levels

In PIAAC [Program for the International Assessment of Adult Competencies], results are reported as averages on a 500-point scale or as proficiency levels. Proficiency refers to competence that involves “mastery” of a set of abilities along a continuum that ranges from simple to complex information-processing tasks.

This continuum has been divided into five levels of proficiency. Each level is defined by a particular score-point range associated with competence at specific information-processing tasks. Adults with literacy scores within the score-point range for a particular proficiency level are likely to successfully complete the tasks at that proficiency level as well as any lower proficiency levels. Adults with scores at a particular proficiency level might be able to complete a task at a higher proficiency level but the probability is small and diminishes greatly the higher the level. …

Level 3

276–325 points

Tasks at this level require the respondent to understand mathematical information that may be less explicit, embedded in contexts that are not always familiar, and represented in more complex ways. Tasks require several steps and may involve the choice of problem-solving strategies and relevant processes. Tasks tend to require the application of number sense and spatial sense; recognizing and working with mathematical relationships, patterns, and proportions expressed in verbal or numerical form; or interpretation and basic analysis of data and statistics in texts, tables, and graphs. …

In general, this means that tasks located at a particular proficiency level can be successfully completed by the “average” person at that level approximately two-thirds of the time. However, individuals scoring at the bottom of the level would successfully complete tasks at that level only about half the time while individuals scoring at the top of the level would successfully complete tasks at that level about 80 percent of the time.

[65] Report: “Highlights of PIAAC 2017 U.S. Results.” U.S. Department of Education, National Center for Education Statistics. Accessed February 8, 2022 at <nces.ed.gov>

“Figure 1-B. Percentage Distribution of U.S. Adults Age 16 to 65 at Selected Levels of Proficiency on PIAAC Literacy, Numeracy, and Digital Problem Solving: 2012/14 and 2017 … Numeracy … 2017 … Level 3 or above [=] 37%”

[66] Report: “Sample PIAAC Tasks in Literacy, Numeracy, and Problem Solving in Technology-Rich Environments.” Prepared by American Institutes for Research for U.S. Department of Education, National Center for Education Statistics. Accessed March 23, 2022 at <piaacgateway.com>

Page 37: “Numeracy Item: Level 4”

[67] Webpage: “What PIAAC Measures.” U.S. Department of Education, National Center for Education Statistics. Accessed March 23, 2022 at <nces.ed.gov>

Numeracy Proficiency Levels

In PIAAC [Program for the International Assessment of Adult Competencies], results are reported as averages on a 500-point scale or as proficiency levels. Proficiency refers to competence that involves “mastery” of a set of abilities along a continuum that ranges from simple to complex information-processing tasks.

This continuum has been divided into five levels of proficiency. Each level is defined by a particular score-point range associated with competence at specific information-processing tasks. Adults with literacy scores within the score-point range for a particular proficiency level are likely to successfully complete the tasks at that proficiency level as well as any lower proficiency levels. Adults with scores at a particular proficiency level might be able to complete a task at a higher proficiency level but the probability is small and diminishes greatly the higher the level. …

Level 4

326–375 points

Tasks at this level require the respondent to understand a broad range of mathematical information that may be complex, abstract, or embedded in unfamiliar contexts. These tasks involve undertaking multiple steps and choosing relevant problem-solving strategies and processes. Tasks tend to require analysis and more complex reasoning about quantities and data; statistics and chance; spatial relationships; or change, proportions, and formulas. Tasks at this level may also require understanding arguments or communicating well-reasoned explanations for answers or choices. …

In general, this means that tasks located at a particular proficiency level can be successfully completed by the “average” person at that level approximately two-thirds of the time. However, individuals scoring at the bottom of the level would successfully complete tasks at that level only about half the time while individuals scoring at the top of the level would successfully complete tasks at that level about 80 percent of the time.

[68] Calculated with data from the webpage: “PIAAC [Program for the International Assessment of Adult Competencies] Data Explorer.” U.S. Department of Education, National Center for Education Statistics. Accessed March 23, 2022 at <nces.ed.gov>

“Percentages for U.S. Adults, 16–74 (Household and Prison) PIAAC numeracy: overall scale, by PIAAC Numeracy proficiency levels [BMNUM] and jurisdiction: PIAAC 2012–2017 and PIAAC 2017 … PIAAC 2017 … U.S. Household (16–65 years old) … Percentage … Below Level 1 [=] 9% … Level 1 [=] 20% … Level 2 [=] 33% … Level 3 [=] 27% … Level 4 [=] 9% … Level 5 [=] 1%”

CALCULATION: 9% + 1% = 10% at Level 4 or higher
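The calculation above can be verified with a short script (the dictionary below simply restates the percentages quoted from the PIAAC Data Explorer):

```python
# Percentages of U.S. adults (16-65, household) at each PIAAC numeracy
# proficiency level, as quoted from the NCES PIAAC Data Explorer (2017).
levels = {"Below Level 1": 9, "Level 1": 20, "Level 2": 33,
          "Level 3": 27, "Level 4": 9, "Level 5": 1}

# Share at Level 4 or higher:
level_4_or_higher = levels["Level 4"] + levels["Level 5"]
print(level_4_or_higher)  # 10 (percent)
```

Note that the six levels sum to 99% rather than 100% due to rounding in the published figures.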

[69] Report: “Key Concepts and Features of the 2003 National Assessment of Adult Literacy.” By Sheida White and Sally Dillow. U.S. Department of Education, Institute of Education Sciences, December 2005. <nces.ed.gov>

Page 1: “Sponsored by the National Center for Education Statistics (NCES) in the U.S. Department of Education’s Institute of Education Sciences, the 2003 National Assessment of Adult Literacy (NAAL) is a nationally representative assessment of literacy among adults (age 16 and older) residing in households and prisons in the United States.”

Page 3:

The National Assessment of Adult Literacy (NAAL) measures the ability of a nationally representative sample of adults to perform literacy tasks similar to those that they encounter in their daily lives. Statistical procedures ensure that NAAL participants represent the entire population of U.S. adults who are age 16 and older and live in households or prisons. In 2003, the 19,714 adults who participated in NAAL represented a U.S. adult population of about 222 million. …

Like other adults, NAAL participants bring to literacy tasks a full range of backgrounds, experiences, and skill levels. Like real-life tasks, NAAL tasks vary with respect to the difficulty of the materials used as well as the complexity of the actions to be performed. However, in order to be fair to all participants, none of the tasks require specialized background knowledge, and all of them were reviewed for bias against particular groups. …

NAAL tasks reflect a definition of literacy that emphasizes the use of written materials to function adequately in one’s environment and to develop as an individual. Of course, the actual literacy tasks that individuals must perform in their daily lives vary to some extent depending on the nature of their work and personal goals. However, virtually all literacy tasks require certain underlying skills, such as the ability to read and understand common words. NAAL measures adults’ performance on a range of tasks mimicking actual tasks encountered by adults in the United States. Adults with very low levels of performance on NAAL tasks may be unable to function adequately in 21st century America.

Page 4:

NAAL Examines Three Literacy Areas—Prose, Document, and Quantitative

NAAL reports a separate score for each of three literacy areas:

Prose literacy refers to the knowledge and skills needed to perform prose tasks—that is, to search, comprehend, and use continuous texts. Prose examples include editorials, news stories, brochures, and instructional materials.

Document literacy refers to the knowledge and skills needed to perform document tasks—that is, to search, comprehend, and use noncontinuous texts in various formats. Document examples include job applications, payroll forms, transportation schedules, maps, tables, and drug or food labels.

Quantitative literacy refers to the knowledge and skills required to perform quantitative tasks—that is, to identify and perform computations, either alone or sequentially, using numbers embedded in printed materials. Examples include balancing a checkbook, computing a tip, completing an order form, or determining the amount of interest on a loan from an advertisement.

Pages 13–14:

In addition to the four performance levels that were developed using the bookmark method, the Committee on Performance Levels for Adult Literacy also recommended that NCES report on a fifth category—Nonliterate in English. This category includes two groups of adults:

• Two percent of the adults who were selected to participate in the 2003 NAAL could not be tested—in other words, could not participate in NAAL at all—because they knew neither English nor Spanish (the other language spoken by interviewers in most areas). The Nonliterate in English category includes these adults because their inability to communicate in English indicates a lack of English literacy skills.

• Three percent of the adults who were tested in 2003 did not take the main part of the assessment, which was too difficult for them, but did take an alternative assessment specifically designed for the least-literate adults. Questions on the alternative assessment were asked in either English or Spanish, but all written materials were in English only. While some adults in this group displayed minimal English literacy skills (for example, the ability to identify a letter or a common word in a simple text), others lacked such skills entirely. (For example, an adult who was able to attempt the alternative assessment by following oral Spanish instructions might still prove unable to do even the minimal amount of English reading needed to provide any correct answers.) The Nonliterate in English category includes these adults because their English literacy skills are minimal at best.

In 2003, the two groups of adults classified as Nonliterate in English—the 2 percent who could not be tested because of a language barrier (i.e., inability to communicate in English or Spanish) and the 3 percent who took the alternative assessment—accounted for 11 million adults, or 5 percent of the population. These adults range from having no English literacy skills to being able to “recognize some letters, numbers, or common sight words in everyday contexts” (Hauser and others 2005).

Page 32: “Because NAAL is designed to assess literacy in English, all the written instructions and responses are in English.”

[70] Webpage: “National Assessment of Adult Literacy, Sample Questions Search: 1985, 1992 & 2003.” U.S. Department of Education, National Center for Education Statistics. Accessed July 19, 2015 at <nces.ed.gov>

“Item Number: N120601 … Scale: Document Literacy … Task Demand: Text Search … Percent who answered correctly: 82.0%”

[71] Webpage: “National Assessment of Adult Literacy, Sample Questions Search: 1985, 1992 & 2003.” U.S. Department of Education, National Center for Education Statistics. Accessed July 19, 2015 at <nces.ed.gov>

“Item Number: C080101 … Scale: Quantitative Literacy … Task Demand: Computation, Text Search … Percent who answered correctly: 59.6%”

[72] Webpage: “National Assessment of Adult Literacy, Sample Questions Search: 1985, 1992 & 2003.” U.S. Department of Education, National Center for Education Statistics. Accessed July 19, 2015 at <nces.ed.gov>

“Item Number: N130901 … Scale: Quantitative Literacy … Task Demand: Computation, Text Search … Percent who answered correctly: 45.8%”

[73] Webpage: “National Assessment of Adult Literacy, Sample Questions Search: 1985, 1992 & 2003.” U.S. Department of Education, National Center for Education Statistics. Accessed July 19, 2015 at <nces.ed.gov>

“Item Number: N091001 … Scale: Quantitative Literacy … Task Demand: Computation, Text Search … Percent who answered correctly: 17.6%”

[74] Webpage: “National Assessment of Adult Literacy, Sample Questions Search: 1985, 1992 & 2003.” U.S. Department of Education, National Center for Education Statistics. Accessed July 19, 2015 at <nces.ed.gov>

“Item Number: N100701 … Scale: Document Literacy … Task Demand: Application, Inferential, Text Search … Percent who answered correctly: 10.6%”

[75] Webpage: “Horace Mann.” PBS. Accessed July 9, 2015 at <www.pbs.org>

Horace Mann, often called the Father of the Common School, began his career as a lawyer and legislator. … He spearheaded the Common School Movement, ensuring that every child could receive a basic education funded by local taxes. His influence soon spread beyond Massachusetts as more states took up the idea of universal schooling. …

… These developments were all part of Mann’s driving determination to create a system of effective, secular, universal education in the United States.

[76] Article: “Mann, Horace.” Encyclopædia Britannica Ultimate Reference Suite 2004.

U.S. educator, the first great American advocate of public education, who believed that, in a democratic society, education should be free and universal, nonsectarian, democratic in method, and reliant on well-trained, professional teachers. …

… He started a biweekly Common School Journal for teachers and lectured widely to interested groups of citizens. His annual reports to the board ranged far and wide through the field of pedagogy, stating the case for the public school and discussing its problems. Essentially his message centred on six fundamental propositions:

(1) that a republic cannot long remain ignorant and free, hence the necessity of universal popular education;

(2) that such education must be paid for, controlled, and sustained by an interested public;

(3) that such education is best provided in schools embracing children of all religious, social, and ethnic backgrounds;

(4) that such education, while profoundly moral in character, must be free of sectarian religious influence;

(5) that such education must be permeated throughout by the spirit, methods, and discipline of a free society, which preclude harsh pedagogy in the classroom; and

(6) that such education can be provided only by well-trained, professional teachers.

Mann encountered strong resistance to these ideas—from clergymen who deplored nonsectarian schools, from educators who condemned his pedagogy as subversive of classroom authority, and from politicians who opposed the board as an improper infringement of local educational authority—but his views prevailed.

[77] The Common School Journal for the Year 1841 (Volume III). Edited by Horace Mann (Secretary of the Massachusetts Board of Education). Marsh, Capen, Lyon, and Webb, 1841.

Page 15:

Conclusion.

The tendency of the preceding remarks must be obvious, and therefore our application of them may be brief.

In the first place, if there must be institutions, associations, combinations amongst men, whose tendency is to alienation and discord; to whet the angry feelings of individuals against each other; to transmit the contentions of the old to the young, and to make the enmities of the dead survive to the living;—if these things must continue to be, in a land calling itself Christian;—let there be one institution, at least, which shall be sacred from the ravages of the spirit of party,—one spot, in the wide land, unblasted by the fiery breath of animosity. Amid unions for aggression, let there be one rallying point for a peaceful and harmonious cooperation and fellowship, where all the good may join, in the most beneficent of labors. The young do not come into life, barbed and fanged against each other. A blow is never the salutation which two infants give, on meeting for the first time. By a proper training, the kindly feelings may be kept uppermost. Those powers may be cultivated, which have the double blessing of bestowing happiness on the possessor and on the race. The Common School is the institution which can receive and train up children in the elements of all good knowledge, and of virtue, before they are subjected to the alienating competitions of life. This institution is the greatest discovery ever made by man;—we repeat it, the Common School is the greatest discovery ever made by man. In two grand, characteristic attributes, it is supereminent over all others:—first, in its universality;—for it is capacious enough to receive and cherish in its parental bosom every child that comes into the world; and second, in the timeliness of the aid it proffers;—its early, seasonable supplies of counsel and guidance making security antedate danger. Other social organizations are curative and remedial; this is a preventive and an antidote; they come to heal diseases and wounds; this to make the physical and moral frame invulnerable to them. 
Let the Common School be expanded to its capabilities, let it be worked with the efficiency of which it is susceptible, and nine tenths of the crimes in the penal code would become obsolete; the long catalogue of human ills would be abridged; men would walk more safely by day; every pillow would be more inviolable by night; property, life, and character held by a stronger tenure; all rational hopes respecting the future brightened.

[78] Book: Life of Horace Mann (Volume 1). By Mary Tyler Peabody Mann (Horace Mann’s wife). Walker, Fuller, and Company, 1865.

Pages 141–142:

Dec. 20. Have been engaged mainly this week with a long article for the first number of the third volume of the “Common-school Journal.” …

In this introduction, Mr. Mann shows how forcibly his mind had been led, by the “wild roar of party politics” of that year, to look into the secret springs of public action; and how futile is the attempt to “define truth by law, and to perpetuate it by power and wealth, instead of knowledge.” He closes it in these words, which apply equally to our own times: …

… The common school is the institution which can receive and train up children in the elements of all good knowledge and of virtue before they are subjected to the alienating competitions of life. This institution is the greatest discovery ever made by man: we repeat it, the common school is the greatest discovery ever made by man. … Let the common school be expanded to its capabilities, let it be worked with the efficiency of which it is susceptible, and nine-tenths of the crimes in the penal code would become obsolete; the long catalogue of human ills would be abridged; men would walk more safely by day; every pillow would be more inviolable by night; property, life, and character held by a stronger tenure; all rational hopes respecting the future brightened.

[79] Article: “Most U.S. Youths Unfit to Serve, Data Show.” By William H. McMichael. Army Times, November 3, 2009. <www.armytimes.com>

In a study being released Thursday in Washington, Education Secretary Arne Duncan and a group of retired military officers led by former Army Gen. Wesley Clark will sound the alarm bells and call young Americans’ relative lack of overall fitness for military duty a national security threat. The group, Mission: Readiness, will release a report that draws on Pentagon data showing that 75 percent of the nation’s 17- to 24-year-olds are ineligible for service for a variety of reasons.

Put another way, only 4.7 million of the 31.2 million 17- to 24-year-olds in a 2007 survey are eligible to enlist, according to a periodic survey commissioned by the Pentagon. This group includes those who have scored in the top four categories on the Armed Forces Qualification Test, or AFQT; eligible college graduates; and qualified college students.

According to the Pentagon, the ineligible population breaks down this way:

* Medical/physical problems, 35 percent.

* Illegal drug use, 18 percent.

* Mental Category V (the lowest 10 percent of the population), 9 percent.

* Too many dependents under age 18, 6 percent.

* Criminal record, 5 percent.

[80] Calculated with data from the report: “2013 Qualified Military Available (QMA) Results Summary.” Department of Defense, Joint Advertising Market Research & Studies, 2013. <www.justfacts.com>

Pages 1–2:

The Department of Defense (DoD) 2013 Qualified Military Available (QMA) Study examined the number of youth eligible and available for military service. This number is an important indicator used by the Department of Defense (DoD) to plan recruiting policy and programs. The basic ingredient of this metric is the size of the population aged 17–24, reduced by the number who are disqualified or unavailable for military service. The 2013 QMA Study revised and updated previous QMA estimations by using more recent data to estimate the prevalence of disqualifying conditions and by accounting for the correlations of disqualifying conditions that account for overlap among multiple disqualifiers.

The 2013 QMA project used the most recent data from the following sources:

• National Health and Nutrition Examination Survey (NHANES)

• National Survey on Drug Use and Health (NSDUH)

• Joint Advertising Market Research & Studies Youth Poll Survey (JAMRS-YP)

• MEPCOM Production Applicant AFQT score database (MEPCOM)

• Woods & Poole Population Estimates

• 1997 Profile of American Youth (PAY97)1

Methodology & Results Summary

After reviewing the particular guidelines established by DoD Instructions 1304.26, “Qualification Standards for Enlistment, Appointment, and Induction,” and 6130.03, “Medical Standards for Appointment, Enlistment, or Induction in the Military Services,” which govern military entrance eligibility criteria, disqualifying conditions were grouped into seven broad disqualification categories: medical/physical, overweight, mental health, drugs, conduct, dependents and aptitude. Disqualification estimates were derived for each of the seven disqualification categories based on data obtained from the sources cited above. Next, a Multivariate Probit Model (MVP) was used to estimate the overlap between disqualifying conditions. This is an accepted method for analyzing binary outcomes that are “seemingly unrelated.” After probabilities of disqualifying on each of the conditions and the overlap were calculated, the probabilities were applied to the Woods & Poole population counts to obtain ZIP Code level estimates.

According to the analysis, only 28.6% of the youth population is estimated to be qualified to enlist in the Military without a waiver. Additionally, the 2013 QMA Study estimated that only 17% of youth would qualify without a waiver and be available, not enrolled in college, for enlisted Active Duty Military service. In practice, the Services typically deny enlistment to youth who score in the bottom 30th percentile (i.e., category IV and V) on the Armed Forces Qualification Test (AFQT). Incorporating this criterion, only 13% of youth would qualify without a waiver, be available for full-time enlisted Military service, and score above the 30th percentile on the AFQT test. Disqualification rates for each of the 7 overarching disqualification categories are as follows:

1. Medical/Physical = 30% disqualified

2. Overweight = 31% disqualified

3. Mental Health = 15% disqualified

4. Drugs = 30% disqualified

5. Conduct = 10% disqualified

6. Dependents = 12% disqualified

7. Aptitude = 9% disqualified

* Note. These percentages represent the proportion of youth 17–24 who are estimated to have an issue in each category that would disqualify an applicant. Percentages sum to greater than 100% as many youth are predicted to demonstrate more than one issue.

Furthermore, results from the 2013 QMA Study demonstrated that the majority of youth who would be disqualified for military service would be disqualified for more than one reason. In all, 39% of all youth are predicted to be disqualified from enlisting in the Military for more than one issue (not including college enrollment as a condition). The five most common categories of multiple disqualifications are as follows:

1. Medical/physical & overweight

2. Medical/physical & drugs

3. Drugs & overweight

4. Medical/physical & mental health

5. Medical/physical, drugs, & mental health

1 The AFQT test was normed based on results from this study which established youth population scores for the AFQT test.

CALCULATION: 100% – 28.6% qualified = 71.4% unqualified
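The same figure can be confirmed directly from the study's qualified-to-enlist estimate:

```python
# Share of youth qualified to enlist without a waiver, per the 2013 QMA Study.
qualified = 28.6  # percent

# The unqualified share is simply the remainder of 100%.
unqualified = round(100 - qualified, 1)
print(unqualified)  # 71.4 (percent)
```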

[81] Article: “Army: 77% of Young Americans Now Unfit to Serve.” By Kevin Haraldson. News Radio 1200 WOAI (San Antonio, TX), January 5, 2014. <bit.ly>

The commander of the U.S. Army Recruiting Command tells 1200 WOAI news that more than three quarters of all of the 17 to 24 year old men and women in America are currently not eligible for enlistment in the Army, mainly because they are overweight.

“The latest figures we have is 77.5% are disqualified for one reason or another,” Maj. Gen. Allen Batschelet said in an interview. “That means just 22.5% would be qualified.”

He said prospective recruits disqualify themselves for three main reasons. One is what the Army refers to as “morally disqualified,” meaning they have used or are using illegal drugs or have a criminal record. Number two are “cognitive disqualifications,” meaning they are not educated enough to pass the Army entrance exam. But the third, and the most widespread, are physical disqualifications, which are mainly due to being overweight.

[82] Article: “Even More Young Americans Are Unfit to Serve, a New Study Finds. Here’s Why.” By Thomas Novelly. Military.com, September 28, 2022. <www.military.com>

A new study from the Pentagon shows that 77% of young Americans would not qualify for military service without a waiver due to being overweight, using drugs or having mental and physical health problems. …

“When considering youth disqualified for one reason alone, the most prevalent disqualification rates are overweight (11%), drug and alcohol abuse (8%), and medical/physical health (7%),” the study, which examined Americans between the ages of 17 and 24, read. The study was conducted by the Pentagon’s office of personnel and readiness.

Mental health accounted for 4% of disqualifications, while aptitude, conduct or being a dependent accounted for 1% each. Most youth, 44%, were disqualified for multiple reasons.

[83] Paper: “The Importance of Noncognitive Skills: Lessons from the GED Testing Program.” By James J. Heckman and Yona Rubinstein. American Economic Review, May, 2001. Pages 145–149. <www.jstor.org>

Page 145:

It is common knowledge outside of academic journals that motivation, tenacity, trustworthiness, and perseverance are important traits for success in life. … Numerous instances can be cited of high-IQ people who failed to achieve success in life because they lacked self discipline and low-IQ people who succeeded by virtue of persistence, reliability, and self-discipline. The value of trustworthiness has recently been demonstrated when market systems were extended to Eastern European societies with traditions of corruption and deceit.

It is thus surprising that academic discussions of skill and skill formation almost exclusively focus on measures of cognitive ability and ignore noncognitive skills. … Most assessments of school reforms stress the gain from reforms as measured by the ability of students to perform on a standardized achievement test. …

Studies by Samuel Bowles and Herbert Gintis (1976), Rick Edwards (1976), and Roger Klein and others (1991) demonstrate that job stability and dependability are traits most valued by employers as ascertained by supervisor ratings and questions of employers….

Page 146:

The GED [General Educational Development] is a mixed signal. Dropouts who take the GED are smarter (have higher cognitive skills) than other high-school dropouts and yet at the same time have lower levels of noncognitive skills. Both types of skill are valued in the market and affect schooling choices. Our finding challenges the conventional signaling literature, which assumes a single skill. It also demonstrates the folly of a psychometrically oriented educational evaluation policy that assumes cognitive skills to be all that matter. Inadvertently, a test has been created that separates out bright but nonpersistent and undisciplined dropouts from other dropouts. It is, then, no surprise that GED recipients are the ones who drop out of school, fail to complete college (Stephen Cameron and James Heckman, 1993) and who fail to persist in the military (Janice Laurence, 2000). GED’s are “wiseguys,” who lack the abilities to think ahead, to persist in tasks, or to adapt to their environments. The performance of the GED recipients compared to both high-school dropouts of the same ability and high-school graduates demonstrates the importance of noncognitive skills in economic life.

[84] Calculated with data from:

a) Dataset: “Table 3.16. Government Current Expenditures by Function [Billions of Dollars].” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised November 17, 2023. <apps.bea.gov>

“Government1 … Education … Elementary and secondary … 2022 [=] 833.7”

b) Dataset: “HH-1. Households by Type: 1940 to Present.” U.S. Census Bureau, Current Population Survey, November 2022. <www.census.gov>

“Year … 2022 … Total households [thousands] [=] 131,202”

CALCULATION: $833,700,000,000 / 131,202,000 = $6,354
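The per-household figure follows from dividing total spending by the household count (both values restated from the BEA and Census datasets cited above):

```python
# 2022 government current expenditures on elementary and secondary
# education (BEA Table 3.16), in dollars.
spending = 833_700_000_000

# 2022 total U.S. households (Census Table HH-1), converted from thousands.
households = 131_202_000

# Spending per household, rounded to the nearest dollar.
per_household = round(spending / households)
print(per_household)  # 6354
```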

[85] Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

“Expenditure per pupil in fall enrollment … Total expenditure3 … 2019–20 … Unadjusted dollars1 [=] 15,5186 … Constant 2020–21 dollars2 [=] 17,0136”

1 Unadjusted (or “current”) dollars have not been adjusted to compensate for inflation. …

3 Excludes “Other current expenditures,” such as community services, private school programs, adult education, and other programs not allocable to expenditures per student at public schools. …

6 Excludes prekindergarten expenditures and prekindergarten enrollment for California.

[86] Report: “Documentation to the NCES Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-6: “Elementary A general level of instruction classified by state and local practice as elementary, composed of any span of grades not above grade 8; preschool or kindergarten included only if it is an integral part of an elementary school or a regularly established school system.”

Page C-14: “Secondary The general level of instruction classified by state and local practice as secondary and composed of any span of grades beginning with the next grade following the elementary grades and ending with or below grade 12.”

[87] Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

“NOTE: … Beginning in 1980–81, state administration expenditures are excluded from both ‘total’ and ‘current’ expenditures.”

[88] The next seven footnotes document that:

  • National Center for Education Statistics data on education spending doesn’t account for unfunded pension or healthcare benefits.
  • “Defined benefit” pension programs guarantee employees specified levels of benefits, regardless of how much money the employer has previously set aside to pay those benefits.
  • Most government employees receive defined benefit pensions.
  • Many government pension plans are underfunded.

[89] Email from the U.S. Department of Education, National Center for Education Statistics to Just Facts, March 31, 2015.

“The expenditures reported [in Table 213 and elsewhere by the National Center for Education Statistics] do not include or account for unfunded pension benefits or unfunded healthcare benefits.”

[90] Report: “Documentation for the NCES Common Core of Data National Public Education Financial Survey (NPEFS), School Year 2010–11 (Fiscal Year 2011), Preliminary File Version 1a.” U.S. Department of Education, National Center for Education Statistics, December 2013. <nces.ed.gov>

Pages 5–6:

NPEFS [National Public Education Finance Survey] collects employee benefits for the functions of instruction, support services, and operation of noninstructional services. NPEFS respondents are currently reporting employee benefits, which are defined as the “Amounts paid by the school district on behalf of employees (amounts not included in gross salary but in addition to that amount). Such payments are fringe benefits payments and although not directly paid to employees, nevertheless are part of the cost of personal services.”13 The definition of employee benefits is derived from the NCES school finance accounting handbook, Financial Accounting for Local and State School Systems: 2009 Edition (Allison, Honegger, and Johnson 2009). NPEFS does not collect actuarially determined annual required contributions;14 accrued annual requirement contribution liability;15 or the actuarial value of pension plan assets.16

13 The NPEFS instruction manual provides that employee benefits “include amounts paid by, or on behalf of, an LEA [Local Education Agency] for fringe benefits such as group insurance (including health benefits for current and retired employees), social security contributions, retirement contributions, tuition reimbursements, unemployment compensation, worker’s compensation, and other benefits such as unused sick leave” (NCES, 2012).

14 Actuarially determined annual required contributions are the annual required contribution (ARC) that incorporates both the cost of benefits in the current year and the amortization of the plan’s unfunded actuarial accrued liability.

15 The accrued annual requirement contribution liability is the difference between actuarially determined contributions and actual payments made to the pension fund.

16 Actuarial value of pension plan assets is the value of cash, investments, and other property belonging to a pension plan as used by an actuary for the purpose of an actuarial valuation.

[91] Article: “Defined Benefit Pensions and Household Income and Wealth.” By Marshall B. Reinsdorf and David G. Lenze. U.S. Bureau of Economic Analysis, Survey of Current Business, August 2009. Pages 50–62. <apps.bea.gov>

Pages 50–51:

U.S. households usually participate in two kinds of retirement income programs: social security, and a plan sponsored by their employer. The employer plan may be organized as either a defined contribution plan, such as a 401(k) plan, or a defined benefit plan. Defined contribution plans provide resources during retirement based on the amount of money that has been accumulated in an account, while defined benefit plans determine the level of benefits by a formula that typically depends on length of service and average or final pay. …

… A defined benefit plan has an actuarial liability for future benefits equal to the expected present value of the benefits to which the plan participants are entitled under the benefit formula. The value of participants’ benefit entitlement often does not coincide with the value of the assets that the plan has on hand; indeed, a plan that has a pay-as-you-go funding scheme might have only enough assets to ensure that it can make the current period’s benefit payments.2

A complete measure of the wealth of defined benefit plan participants is the expected present value of the benefits to which they are entitled, not the assets of the plan. This follows from the fact that if the assets of a defined benefit plan are insufficient to pay promised benefits, the plan sponsor must cover the shortfall. …

… [U]nder the accrual approach, the measure of compensation income for the participants in the plan is no longer the employer’s actual contributions to the plan. Instead, it is the present value of the benefits to which employees become entitled as a result of their service to the employer.

Measuring household income from defined benefit plans by actual contributions from employers plus actual investment income on plan assets can be considered a cash accounting approach to measuring these plans’ transactions…. We use the term “accrual accounting” to mean any approach that adopts the principle that a plan’s benefit obligations ought to be recorded as they are incurred.

2 Federal law requires that private pension plans operate as funded plans, not as pay-as-you-go plans.

[92] Report: “Preview of the 2013 Comprehensive Revision of the National Income and Product Accounts: Changes in Definitions and Presentations.” By Shelly Smith and others. U.S. Bureau of Economic Analysis, March 2013. <apps.bea.gov>

Page 22:

For defined benefit plans, the cash accounting approach is inadequate because the value of the benefit entitlements that participants accrue during a year often fails to coincide with the plans’ cash receipts.33

An employer who offers a defined benefit pension plan promises that an employee will receive a specified amount of future benefits that usually increases with each year of service.

[93] Textbook: Fiscal Administration. By John Mikesell. Wadsworth, Cengage Learning, 2014.

Page 170:

The vast majority of public employee pension programs are defined benefit programs.30

30 Exceptions to the rule that government employees are in defined benefit programs: faculty at many state universities are in the TIAA/CREF [Teachers Insurance and Annuity Association/College Retirement Equities Fund] defined contribution program and federal employees in the Federal Employee Retirement System Thrift Savings Plan. In 1996, Michigan established a defined contribution plan for all new employees. In 1991, West Virginia school employees were put in such a plan.

[94] Paper: “Bringing Actuarial Measures of Defined Benefit Pensions into the U.S. National Accounts.” By Marshall Reinsdorf (International Monetary Fund), David Lenze (U.S. Bureau of Economic Analysis), and Dylan Rassier (U.S. Bureau of Economic Analysis). International Monetary Fund, International Association for Research in Income and Wealth, 33rd General Conference, August 24–30, 2014. <www.justfacts.com>

Pages 11–12:

Although private DB [defined benefit] plans are on the decline, for state and local government employees DB pension plans continue to be the predominant form of retirement plan. … In 2012, there were 227 state-administered and 3,771 locally-administered DB pension plans according to the Survey of Public Pension Plans conducted by the U.S. Census Bureau. The number of active state and local plan members was 14.4 million (91 percent of the 15.9 million full-time equivalent employees), and the number of beneficiaries receiving periodic benefit payments was 9.0 million.

[95] Webpage: “What Changes Were Made to Pensions During the 2013 Comprehensive Revision, and How Have the Changes Affected Private, Federal, and State and Local Compensation?” U.S. Bureau of Economic Analysis, July 31, 2013. Last modified 7/26/18. <www.bea.gov>

“A large number of state and local pension plans are underfunded, which means that the value of the plans’ assets is less than their accrued pension liabilities for current workers and retirees.”

[96] The next 4 footnotes document that:

  • National Center for Education Statistics data on education spending does not account for unfunded healthcare and other post-employment benefits.
  • Retiree health benefits are common in the government sector and rare in the private sector.
  • Substantial amounts of healthcare benefits promised to government employees are unfunded.

[97] Email from the U.S. Department of Education, National Center for Education Statistics to Just Facts, March 31, 2015.

“The expenditures reported [in Table 213 and elsewhere by the National Center for Education Statistics] do not include or account for unfunded pension benefits or unfunded healthcare benefits.”

[98] Report: “Documentation for the NCES Common Core of Data National Public Education Financial Survey (NPEFS), School Year 2010–11 (Fiscal Year 2011), Preliminary File Version 1a.” U.S. Department of Education, National Center for Education Statistics, December 2013. <nces.ed.gov>

Pages 5–6:

NPEFS [National Public Education Finance Survey] collects employee benefits for the functions of instruction, support services, and operation of noninstructional services. NPEFS respondents are currently reporting employee benefits, which are defined as the “Amounts paid by the school district on behalf of employees (amounts not included in gross salary but in addition to that amount). Such payments are fringe benefits payments and although not directly paid to employees, nevertheless are part of the cost of personal services.”13 The definition of employee benefits is derived from the NCES school finance accounting handbook, Financial Accounting for Local and State School Systems: 2009 Edition (Allison, Honegger, and Johnson 2009). NPEFS does not collect actuarially determined annual required contributions;14 accrued annual requirement contribution liability;15 or the actuarial value of pension plan assets.16

13 The NPEFS instruction manual provides that employee benefits “include amounts paid by, or on behalf of, an LEA [Local Education Agency] for fringe benefits such as group insurance (including health benefits for current and retired employees), social security contributions, retirement contributions, tuition reimbursements, unemployment compensation, worker’s compensation, and other benefits such as unused sick leave” (NCES, 2012).

14 Actuarially determined annual required contributions are the annual required contribution (ARC) that incorporates both the cost of benefits in the current year and the amortization of the plan’s unfunded actuarial accrued liability.

15 The accrued annual requirement contribution liability is the difference between actuarially determined contributions and actual payments made to the pension fund.

16 Actuarial value of pension plan assets is the value of cash, investments, and other property belonging to a pension plan as used by an actuary for the purpose of an actuarial valuation.

[99] Report: “Employment-Based Retiree Health Benefits: Trends in Access and Coverage, 1997–2010.” By Paul Fronstin and Nevin Adams. Employee Benefit Research Institute, October 1, 2012. <papers.ssrn.com>

Page 1: “Very few private-sector employers currently offer retiree health benefits, and the number offering them has been declining. In 2010, 17.7 percent of workers were employed at establishments that offered health coverage to early retirees, down from 28.9 percent in 1997.”

Page 4:

One of the most important factors (if not the single most important) contributing to the decline in the availability of retiree health benefits was a 1990 accounting rule change.1

The Financial Accounting Standards Board (FASB) issued Financial Accounting Statement No. 106 (FAS 106), “Employers’ Accounting for Postretirement Benefits Other Than Pensions” in December 1990, and it triggered many of the changes that private-sector employers have made to retiree health benefits. FAS 106 required companies to record retiree-health-benefit liabilities on their financial statements in accordance with generally accepted accounting principles, beginning with fiscal years after Dec. 15, 1992. Specifically, FAS 106 required private-sector employers to accrue and expense certain payments for future claims as well as actual paid claims. The immediate income-statement inclusion and balance-sheet-footnote recognition of these liabilities dramatically affected companies’ reported profits and losses. With this new view of the cost and the increasing expense of providing retiree health benefits, many private-sector employers overhauled their retiree health programs in ways that controlled, reduced, or eliminated these costs.2

Page 8:

The AHRQ [Agency for Healthcare Research and Quality] data show a similar trend among state-government employers. Among state employers, the percentage offering retiree health benefits increased between 1997 and 2003. In 2003, 94.9 percent were providing health coverage to early retirees and 88.6 percent were providing health coverage to Medicare-eligible retirees…. However, recently, the percentage of state-government employers offering retiree health benefits has fallen. By 2010, 70 percent were offering health coverage to early retirees and 63.2 percent were offering it to Medicare-eligible retirees.

Similarly, there has been a recent decline in the percentage of local-government employers offering retiree health benefits. Between 2006 and 2010, the percentage of local governments with 10,000 or more workers that offered health coverage to early retirees fell from 95.1 percent to 77.6 percent, and the percentage offering it to Medicare-eligible retirees fell from 86.2 percent to 67.3 percent…. Some of this decline may be due to recent GASB [Governmental Accounting Standards Board] rules mentioned above.

Only a few local governments reported that they have either recently or soon plan to eliminate health benefits for retirees. Instead, local governments have shifted (or plan to shift) the costs to retirees. In 2011, 2 percent of local governments reported that they eliminated coverage in the past two years or planned to eliminate coverage in the next two years for early retirees…. Five percent reported doing so, or planning to do so, for Medicare-eligible retirees. In contrast, 21 percent reported that they eliminated the employer subsidy in the past two years or planned to do so in the following two years for early-retiree coverage, and 32 percent reported taking such an action for Medicare-eligible retirees.

[100] Report: “State and Local Government Retiree Health Benefits: Liabilities Are Largely Unfunded, but Some Governments Are Taking Action.” U.S. Government Accountability Office, November 2009. <www.gao.gov>

Page 2 (of PDF):

Accounting standards require governments to account for the costs of other postemployment benefits (OPEB)—the largest of which is typically retiree health benefits—when an employee earns the benefit. As such, governments are reporting their OPEB liabilities—the amount of the obligation to employees who have earned OPEB. As state and local governments have historically not funded retiree health benefits when the benefits are earned, much of their OPEB liability may be unfunded. Amid fiscal pressures facing governments, this has raised concerns about the actions the governments can take to address their OPEB liabilities. …

The total unfunded OPEB liability reported in state and the largest local governments’ CAFRs [comprehensive annual financial reports] exceeds $530 billion. However, as variations between studies’ totals show, totaling unfunded OPEB liabilities across governments is challenging for a number of reasons, including the way that governments disclose such data. The unfunded OPEB liabilities for states and local governments GAO [U.S. Government Accountability Office] reviewed varied widely in size. Most of these governments do not have any assets set aside to fund them. The total for unfunded OPEB liabilities is higher than $530 billion because GAO reviewed OPEB data in CAFRs for the 50 states and 39 large local governments but not data for all local governments or additional data reported in separate financial reports. Also, the CAFRs we reviewed report data that predate the market downturn. Finally, OPEB valuations are based on assumptions about the health care cost inflation rate and discount rates for assets, which also affect the size of the unfunded liability.

Some state and local governments have taken actions to address liabilities associated with retiree health benefits by setting aside assets to prefund the liabilities before employees retire and reducing these liabilities by changing the structure of retiree health benefits. Approximately 35 percent of the 89 governments for which GAO reviewed CAFRs reported having set aside some assets for OPEB liabilities, but the percentage of the OPEB liability funded varied.

[101] Article: “Hunger for Stability Quells Appetite for Change.” By Michael B. Henderson and others. Education Next. Last updated August 31, 2021. <www.educationnext.org>

The survey was conducted from May 28 to June 21, 2021, by the polling firm Ipsos Public Affairs via its KnowledgePanel®. The KnowledgePanel® is a nationally representative panel of American adults (obtained via address-based sampling techniques) who agree to participate in a limited number of online surveys. Ipsos provides internet access and/or an appropriate device to KnowledgePanel® members who lack the necessary technology to participate. For individual surveys, Ipsos samples respondents from the KnowledgePanel®. Respondents could elect to complete this survey in English or Spanish.

The total sample for the survey (3,156 respondents) consists of two overlapping samples. The first is a nationally representative, stratified general-population sample of adults in the United States (1,410 respondents). The second consists of American parents, stepparents, or foster parents of at least one child living in the respondent’s household who is in a grade from kindergarten through 12th (2,022 respondents). The parent sample includes oversamples of parents with at least one child in a charter school (232 respondents), parents with at least one child in a private school (325 respondents), Black parents (288 respondents), and Hispanic parents (472 respondents). The completion rate for this survey is 54%.

For parents, after initially screening for qualification, we created a roster of the children in kindergarten through 12th grade who live in their household by asking for the grade, gender, race, ethnicity, school type (traditional public school, charter school, private school, or home school), and age for each child. We also allowed parents to label each child in the roster with a name or initials if they chose to do so. In all, the parent sample provided information on 3,443 K–12 students. We asked a series of questions about the schooling experiences for each of these children. After completing these questions about each child individually, parents proceeded to the remainder of the survey.

In this report, we analyze responses to questions about individual children at the child level. We analyze all other questions at the respondent level. For both student-level and parent-level analyses, we use survey weights designed for representativeness of the national population of parents of school-age children. For analysis of the general-population sample, we use survey weights designed for representativeness of the national population of adults.

NOTE: For facts about what constitutes a scientific survey and the factors that impact their accuracy, visit Just Facts’ research on Deconstructing Polls & Surveys.

[102] “Education Next—Program on Education Policy and Governance—Survey 2021.” Commissioned by Education Next and the Program on Education Policy and Governance at the Harvard Kennedy School of Government. Conducted by Ipsos Public Affairs during May–June 2021. <www.educationnext.org>

Page 5:

12. Based on your best guess, what is the average amount of money spent each year for a child in public schools in your local school district?

All – $8,719

Parents – $7,809

Teachers – $8,144

Hispanics – $10,820

Black, NonHispanic – $8,558

White, NonHispanic – $8,072

Income < $75,000 – $7,556

Income >= $75,000 – $9,715

College Graduate – $9,096

Not College Graduate – $8,532

Republicans – $8,927

Democrats – $8,490

[103] Calculated with the dataset: “Average Class Size in Public K–12 Schools, by School Level, Class Type, and Selected School Characteristics: 2020–21.” U.S. Department of Education, National Center for Education Statistics. Accessed May 25, 2023 at <nces.ed.gov>

NOTES:

  • “[T]eachers for students with disabilities and other special teachers … are generally excluded from class size calculations.” [Dataset: “Table 208.20. Public and Private Elementary and Secondary Teachers, Enrollment, Pupil/Teacher Ratios, and New Teacher Hires: Selected Years, Fall 1955 Through Fall 2030.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>]
  • An Excel file containing the data and calculations is available upon request.

[104] Calculated with data from:

a) Dataset: “Average Class Size in Public Schools, by School Level, Class Type, and Selected School Characteristics: 2020–21.” U.S. Department of Education, National Center for Education Statistics. Accessed May 25, 2023 at <nces.ed.gov>

b) Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

NOTES:

  • “[T]eachers for students with disabilities and other special teachers … are generally excluded from class size calculations.” [Dataset: “Table 208.20. Public and Private Elementary and Secondary Teachers, Enrollment, Pupil/Teacher Ratios, and New Teacher Hires: Selected Years, Fall 1955 Through Fall 2030.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>]
  • An Excel file containing the data and calculations is available upon request.
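The calculations in footnotes [103] and [104] combine average class size with expenditures per pupil to estimate spending per classroom. A minimal sketch of that arithmetic, using illustrative placeholder figures rather than the NCES values:

```python
# Sketch of the per-classroom spending calculation implied by
# footnotes [103]-[104]: average class size x expenditure per pupil.
# The example figures below are illustrative placeholders, not NCES data.

def spending_per_classroom(avg_class_size: float, per_pupil_spending: float) -> float:
    """Estimate annual spending per classroom of students."""
    return avg_class_size * per_pupil_spending

# Example: a 20-student class at $14,000 spent per pupil per year
estimate = spending_per_classroom(20, 14_000)
print(f"${estimate:,.0f} per classroom per year")  # $280,000 per classroom per year
```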

[105] Click here for documentation that the following items are excluded from spending data published by the National Center for Education Statistics:

  • State administration spending
  • Unfunded pension benefits
  • Post-employment non-pension benefits like health insurance

[106] Article: “Scientific Survey Shows Voters Widely Accept Misinformation Spread By the Media.” By James D. Agresti. Just Facts, January 2, 2020. <www.justfacts.com>

The findings are from a nationally representative annual survey commissioned by Just Facts, a non-profit research and educational institute. The survey was conducted by Triton Polling & Research, an academic research firm that used sound methodologies to assess U.S. residents who regularly vote. …

The survey was conducted by Triton Polling & Research, an academic research firm that serves scholars, corporations, and political campaigns. The responses were obtained through live telephone surveys of 700 likely voters across the U.S. during December 2–11, 2019. This sample size is large enough to accurately represent the U.S. population. Likely voters are people who say they vote “every time there is an opportunity” or in “most” elections.

The margin of sampling error for the total pool of respondents is ±4% with at least 95% confidence. The margins of error for the subsets are 6% for Democrat voters, 6% for Trump voters, 5% for males, 5% for females, 12% for 18 to 34 year olds, 5% for 35 to 64 year olds, and 6% for 65+ year olds.

The survey results presented in this article are slightly weighted to match the ages and genders of likely voters. The political parties and geographic locations of the survey respondents almost precisely match the population of likely voters. Thus, there is no need for weighting based upon these variables.

NOTE: For facts about what constitutes a scientific survey and the factors that impact their accuracy, visit Just Facts’ research on Deconstructing Polls & Surveys.
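The ±4% margin of sampling error quoted above is consistent with the textbook formula for a proportion from a simple random sample at 95% confidence, using the conservative worst case p = 0.5. A sketch (the pollster’s exact weighting-adjusted method may differ slightly):

```python
import math

# Margin of sampling error for a proportion, at 95% confidence
# (z = 1.96), assuming simple random sampling and worst-case p = 0.5.

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(700) * 100, 1))  # ~3.7, i.e. roughly +/-4% for the full sample
print(round(margin_of_error(150) * 100, 1))  # smaller subsets yield wider margins
```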

[107] Dataset: “Just Facts’ 2019 U.S. Nationwide Survey.” Just Facts, January 2020. <www.justfacts.com>

Page 1:

Q3. On average across the United States, how much do you think public schools spend per year to educate each classroom of students?

Less than $150,000 per classroom per year … Percent [=] 52.7

More than $150,000 per classroom per year … Percent [=] 36.2

Unsure … Percent [=] 11.1

[108] Calculated with data from: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[109] Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

NOTE: An Excel file containing the data is available upon request.

[110] Report: “Racial Disparities in Education Finance: Going Beyond Equal Revenues.” By Kim Rueben and Sheila Murray. Urban Institute, November 2008. <www.taxpolicycenter.org>

Page 1:

In the past, because public schools were funded largely by local property taxes, property-rich and -poor school districts differed greatly in expenditures per pupil. Since the early 1970s, however, state legislatures have, on their own initiative or at the behest of state courts, implemented school finance equalization programs to reduce the disparity in within-state education spending. …

Since the 1990s, many of the challenges to state finance systems have focused on ensuring that all students have equitable access to adequate educational opportunities as required by state education clauses (Minorini and Sugarman 1999). The argument is that some districts do not provide students with an adequate education and that it is the state’s responsibility to see that districts receive the funding to enable them to do so. The remedy might require some districts to spend more (perhaps significantly more) than other districts, depending on their student population. For example, in districts with many students from low-income families and families where English is not the first language, an “adequate education” may cost more money, and the state is required to ensure that these needs are met.

[111] Report: “How Schools Work & How to Work With Schools: A Primer For Those Who Want To Serve Children and Youth In Schools.” National Association of State Boards of Education, 2014. <www.cdc.gov>

Page 13:

At the local level, most funds for K–12 public schools are raised through local taxes on private property. Although a local property tax is a fairly stable source of funding, disparities in local wealth often directly affect the funds available to schools, reflected in the disparities in per student spending within and between school districts. Even if voters choose to tax themselves at a relatively high rate, low community property values can mean inadequate resources for schools.

Many states have taken the initiative or have been forced by legal challenges to address these inequities in education funding, which compromise the guarantee found in state constitutions that all students have equal access to an adequate public education. States have adopted ballot measures, such as California’s Proposition 98 and 111, to ensure funding equity, or have raised funds from lotteries and other mechanisms, or redistributed locally raised taxes through legislative means to help ensure equity in funding.

[112] Report: “Do Districts Enrolling High Percentages of Minority Students Spend Less?” By Thomas Parrish. U.S. Department of Education, National Center for Education Statistics, December 1996. <nces.ed.gov>

Figure 1 shows expenditures for four categories of school districts by the percentage of minority students enrolled. Each of these four categories of school districts represents about 25 percent of the nation’s public school children. Figure 1 shows that on average, during the 1989–90 school year, spending was fairly equal across school districts with less than 50 percent minority enrollment. However, districts in which 50 percent or more of the students enrolled were racial minorities spent more than those districts with less than 50 percent minority enrollment. For example, the average expenditure differential between districts with the highest and the lowest percentage of minority students was $431 per student ($5,474 versus $5,043).

Figure 1. Education Expenditures in the United States in Relation to Percentage of Minority Enrollment (1989–90) …

School Districts by Percentage of Minority Enrollment; … Average Expenditures per Student

Less than 5% [=] $5,043

5%–<20% [=] $5,169

20%–<50% [=] $5,071

50% or more [=] $5,474 …

In terms of “buying power” in school year 1989–90, districts with the highest percentages of minority students spent $286 less on public education per year than did districts with the lowest percentages of minority students ($4,103 vs. $4,389 per student) (figure 2). This change in direction occurs because school districts enrolling high percentages of minority students are more likely to be located in high-cost urban centers and to serve substantial numbers of students with special needs, thereby reducing the “buying power” of the dollars received.

Figure 2. Education “Buying Power” in the United States in Relation to Percentage of Minority Enrollment (1989–90)

Less than 5% [=] $4,389

5%–<20% [=] $4,350

20%–<50% [=] $4,190

50% or more [=] $4,103
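The “buying power” figures above come from deflating each district’s nominal spending by a geographic cost index, so high-cost urban districts show lower real resources than their nominal dollars suggest. A sketch of that adjustment, with hypothetical district data and cost indices chosen purely for illustration:

```python
# The "buying power" adjustment in footnote [112] deflates nominal
# per-student spending by a local cost index (1.0 = national average).
# The districts and indices below are hypothetical, for illustration only.

districts = {
    # name: (nominal per-student spending, local cost index)
    "urban_high_minority": (5474, 1.33),
    "rural_low_minority": (5043, 1.15),
}

for name, (nominal, cost_index) in districts.items():
    real = nominal / cost_index
    print(f"{name}: nominal ${nominal:,} -> buying power ${real:,.0f}")
```

Note how a district can rank higher in nominal spending yet lower in buying power once the cost index is applied.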

[113] Book: Generational Change: Closing the Test Score Gap. Edited by Paul E. Peterson. Rowman & Littlefield, 2006.

Chapter 2: “How Families and Schools Shape the Achievement Gap.” By Derek Neal (University of Chicago and National Bureau of Economic Research). Pages 26–46.

Pages 32, 44:

Under the assumption that spending per student does not vary by race within a school district, the combination of school district data on per-pupil expenditure and school-level data on the racial composition of students provides information on average per pupil spending by public schools on black and white students. Given several different definitions of average expenditure, average spending per black student in public schools ranged from roughly $100 to $500 more than the corresponding figure for white students in 2001.15 These data provide suggestive but not definitive evidence concerning racial differences in resources provided to public schools. …

15. The data come from two Common Core of Data files: the Local (School District) Education Financial Survey and the Public Elementary/Secondary School Data. I calculated averages based on just educational expenditures as well as total expenditures. I also examined the sensitivity of results to the inclusion of allocated data.

[114] Report: “Racial Disparities in Education Finance: Going Beyond Equal Revenues.” By Kim Rueben and Sheila Murray. Urban Institute, November 2008. <www.taxpolicycenter.org>

Page 2:

In the past, because public schools were funded largely by local property taxes, property-rich and -poor school districts differed greatly in expenditures per pupil. Since the early 1970s, however, state legislatures have, on their own initiative or at the behest of state courts, implemented school finance equalization programs to reduce the disparity in within-state education spending. …

Since the 1990s, many of the challenges to state finance systems have focused on ensuring that all students have equitable access to adequate educational opportunities as required by state education clauses (Minorini and Sugarman 1999). The argument is that some districts do not provide students with an adequate education and that it is the state’s responsibility to see that districts receive the funding to enable them to do so. The remedy might require some districts to spend more (perhaps significantly more) than other districts, depending on their student population. For example, in districts with many students from low-income families and families where English is not the first language, an “adequate education” may cost more money, and the state is required to ensure that these needs are met.

Page 5:

To examine spending patterns across different populations of students, we compared average per pupil spending across districts weighted by the number of students in each racial or ethnic group. In general, differences in spending per pupil in districts serving nonwhite and white students are very small. In 1972, the ratio of nonwhite to white spending was .98; this trend had reversed by 1982, as spending per pupil for nonwhite students was slightly higher than for white students in most states and in the United States as a whole and has been for the past 20 years (figure 2). Table 2 presents spending per pupil figures for 2002 weighted by the number of students in each subgroup.

Page 7:

The results presented thus far need to be considered with a few caveats. These ratios do not reflect that the costs of educating students of different groups differ and that minority students are often found in urban districts that have higher cost structures. … In addition, although spending differences have lessened between districts, it is unclear whether inequities are lessened at the school level.

[115] Report: “The Myth of Racial Disparities in Public School Funding.” By Jason Richwine. Heritage Foundation, April 20, 2011. <thf_media.s3.amazonaws.com>

Pages 2–3:

One of the more rigorous reports on funding disparities was published by the Urban Institute.11 The authors of the study combined district-level spending data with the racial and ethnic composition of schools within districts. … This paper employs a similar methodology, using 2006–2007 datasets from the U.S. Department of Education to examine school funding at both the national and regional levels.

Pages 3–4:

Because the cost of living varies across the U.S., school expenditures are not always directly comparable. In areas with a lower cost of living, the same amount of money can buy more resources than in high-cost areas. To account for this difference, the NCES [National Center for Education Statistics] calculates a Comparable Wage Index (CWI) for each school district based on the average non-teacher wage in the district’s labor market.14

Cost adjustments should be regarded cautiously. Living expenses can still vary within markets, sometimes considerably. The District of Columbia, for example, is a high-expense city overall, but its poorest (and mostly black and Hispanic) sections have a lower cost of living than the white sections. While the raw data are likely to overstate the minority school funding advantage, the adjusted data probably understate it.

Page 4:

Public Education Spending by Race and Ethnic Group (National)

Group       Per-Pupil Spending   % of White Per-Pupil Spending   % of White Per-Pupil Spending
                                 (Unadjusted)                    (Adjusted for Cost of Living)
White       $10,816              100%                            100%
Black       $11,387              105%                            101%
Hispanic    $10,951              101%                            96%
Asian       $11,535              107%                            97%
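The Comparable Wage Index adjustment described above can be illustrated with a minimal sketch. The spending figure and index values below are hypothetical, not actual NCES data; the NCES publishes the real CWI per school district:

```python
# Hypothetical sketch of a Comparable Wage Index (CWI) style adjustment.
# Nominal per-pupil spending is deflated by the district's wage index so
# that dollars are comparable across high- and low-cost labor markets.

def cwi_adjust(nominal_spending: float, district_cwi: float) -> float:
    """Return spending expressed in dollars of a market where CWI = 1.0."""
    return nominal_spending / district_cwi

# A district spending $12,000 per pupil in a market where wages run 20%
# above the baseline has the purchasing power of $10,000 at the baseline.
print(round(cwi_adjust(12_000, 1.2), 2))  # -> 10000.0
```

As the Richwine excerpt cautions, a market-level index like this cannot capture cost-of-living variation within a labor market.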

[116] Paper: “How Progressive is School Funding in the United States?” By Matthew M. Chingos. Education Next, June 19, 2017. <www.educationnext.org>

[W]e calculate the average per-pupil funding levels of districts attended by poor students (those from families below the federal poverty level), compared to the funding of districts attended by non-poor students. Specifically, we calculate two weighted averages of the funding of all regular school districts in each state: one using the number of poor students in each district as weights, and the other using the number of non-poor students as weights. We adjust funding levels in each district using average wage levels in its local labor market.3

In this report, I use the same methodology to produce a national overview of the distribution of school funding in the U.S., both in the most recent year for which data are available (2013–14) and over time since 1994–95. The national distribution of funding reflects both the distribution within states, as well as how students are distributed across states. For example, the average state could have progressive funding, but at the national level funding could still be regressive if poor students are substantially more likely to live in states with low funding levels.

Nationwide, per-student K–12 education funding from all sources (local, state, and federal) is similar, on average, at the districts attended by poor students ($12,961) and non-poor students ($12,640), a difference of 2.5 percent in favor of poor students.

Figure 1 shows that this difference has not changed much since 1994–95, when funding levels were lower (less than $10,000 in 2014 dollars) but the difference between poor and non-poor students was similar in percentage terms (2.4 percent). Average progressivity nationwide did rise in the years prior to the Great Recession, with the poor/non-poor difference peaking at 4.3 percent in 2007–08.
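The weighted-average method quoted above can be sketched in a few lines. The district figures below are hypothetical, chosen only to show the mechanics:

```python
# Sketch of the poor/non-poor weighted funding comparison described above.
# Each district contributes its per-pupil funding, weighted once by its
# count of poor students and once by its count of non-poor students.

def weighted_avg_funding(districts, weight_key):
    total_weight = sum(d[weight_key] for d in districts)
    return sum(d["funding"] * d[weight_key] for d in districts) / total_weight

districts = [  # hypothetical data
    {"funding": 13_000, "poor": 800, "non_poor": 200},
    {"funding": 12_000, "poor": 300, "non_poor": 900},
]

poor_avg = weighted_avg_funding(districts, "poor")          # ~12,727
non_poor_avg = weighted_avg_funding(districts, "non_poor")  # ~12,182
# In this toy example funding is "progressive": districts attended by
# poor students average slightly more per pupil.
```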

[117] Click here for documentation that the following items are excluded from spending data published by the National Center for Education Statistics:

  • State administration spending
  • Unfunded pension benefits
  • Post-employment non-pension benefits like health insurance

[118] Calculated with data from:

a) Dataset: “Table 236.75. Total and Current Expenditures Per Pupil in Fall Enrollment in Public Elementary and Secondary Schools, by Function and State or Jurisdiction: 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>

b) Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

c) Report: “Real Personal Consumption Expenditures and Personal Income by State, 2020.” U.S. Bureau of Economic Analysis, December 14, 2021. <www.bea.gov>

Page 9 (of PDF): “Table 2. Regional Price Parities and Implicit Regional Price Deflators, by State, 2020” <www.bea.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • The precise figures for each state are provided below.

Average Spending Per Public School Student

State            Spending
Arizona          $11,481
Idaho            $11,652
Utah             $11,843
Oklahoma         $12,637
Florida          $12,706
North Carolina   $12,929
Nevada           $13,047
Tennessee        $13,058
Mississippi      $13,371
Alabama          $13,593
Indiana          $14,373
Texas            $14,534
Arkansas         $14,746
South Dakota     $14,823
Georgia          $15,182
Louisiana        $15,246
Colorado         $15,403
Virginia         $15,409
Missouri         $15,499
Kentucky         $15,553
California       $15,945
New Mexico       $16,204
South Carolina   $16,225
Michigan         $16,485
Delaware         $16,812
West Virginia    $16,821
Hawaii           $17,030
Montana          $17,134
Iowa             $17,291
Wisconsin        $17,462
Kansas           $17,559
Nebraska         $17,599
Oregon           $18,324
Washington       $18,490
Maryland         $18,522
Ohio             $18,543
Minnesota        $18,721
Rhode Island     $19,861
Maine            $19,891
New Hampshire    $19,941
North Dakota     $19,956
Alaska           $20,884
Massachusetts    $21,193
Illinois         $21,451
Pennsylvania     $21,550
Wyoming          $21,922
New Jersey       $22,392
Connecticut      $23,915
Vermont          $25,344
New York         $27,080
D.C.             $30,682

[119] Calculated with data from:

a) Dataset: “Table 2.4.5U. Personal Consumption Expenditures by Type of Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

b) Dataset: “Table 1.1.5. Gross Domestic Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

c) Dataset: “CPI—All Urban Consumers (Current Series).” U.S. Department of Labor, Bureau of Labor Statistics. Accessed January 27, 2023 at <www.bls.gov>

“Series Id: CUUR0000SA0; Series Title: All Items in U.S. City Average, All Urban Consumers, Not Seasonally Adjusted; Area: U.S. City Average; Item: All Items; Base Period: 1982–84=100”

d) Dataset: “Table 236.20. Total Expenditures for Public Elementary and Secondary Education and Other Related Programs, by Function and Subfunction: Selected Years, 1990–91 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

e) Dataset: “Table 105.20. Enrollment in Elementary, Secondary, and Degree-Granting Postsecondary Institutions, by Level and Control of Institution, Enrollment Level, and Attendance Status and Sex of Student: Selected Years, Fall 1990 Through Fall 2030.” U.S. Department of Education, National Center for Education Statistics, March 2022. <nces.ed.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • The next five footnotes provide important context for understanding the data and calculations used to determine this fact. In short, they document the following:
    • The result of this calculation is corroborated by a 1995 working paper published by the U.S. Department of Education.
    • The combination of “elementary” and “secondary” schools roughly encompasses grades K–12.
    • Private-sector spending on education is equal to the sum of these three measures reported by the federal government’s Bureau of Economic Analysis:
      1) personal consumption expenditures (PCE)
      2) gross private domestic investment (GPDI)
      3) net exports of goods and services
    • PCE is the “primary measure of consumer spending on goods and services” by private individuals and nonprofit organizations.
    • GPDI is a measure of private spending on “structures, equipment, and intellectual property products.”
    • Since private school education is not a service that is typically imported or exported, a valid approximation of spending on private K–12 schools can be obtained by summing PCE, GPDI, and government spending on private K–12 schools.

[120] In 1995, the U.S. Department of Education (DOE) published a working paper which “estimated that the total expenditures for private schools in 1991–92 (including operating expenses and capital) were between $18.0 and $19.4 billion.”† Using the same methodology as in the footnote above, Just Facts calculated that total expenditures for private schools in the same year were $16.2 billion, or 10–16% lower than DOE’s estimates.‡ This difference fits with the DOE working paper’s statement that “we would be surprised if improved data changed our overall estimate of total expenditures on private education by more than perhaps 10 or 15%.” Some of the shortcomings of the data used in the DOE working paper are as follows:

  • “The main area of concern in the data for Catholic elementary and secondary schools is the response rate: each had a response rate far below 100%. (The response rate for the elementary survey was just above 50%, and for the secondary survey it was about 57%.)”
  • “In addition, our use of region as a proxy for geographic variation may be somewhat crude.”
  • “The principal caveat that needs to be attached to our estimates is that we are uncertain about the specific expenditures school officials included in their responses to the survey items we relied on in our analysis.”
  • “Nor do we know whether most schools responded to the survey items on the basis of a formal school budget or on the basis of less formal materials.”

NOTES:

  • † Working paper: “Estimates of Expenditures for Private K–12 Schools.” By Michael Garet, Tsze H. Chan, and Joel D. Sherman. U.S. Department of Education, National Center for Education Statistics, May 1995. <nces.ed.gov>
  • ‡ An Excel file containing the data and calculations is available here.

[121] Report: “Documentation to the NCES Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-6: “Elementary A general level of instruction classified by state and local practice as elementary, composed of any span of grades not above grade 8; preschool or kindergarten included only if it is an integral part of an elementary school or a regularly established school system.”

Page C-14: “Secondary The general level of instruction classified by state and local practice as secondary and composed of any span of grades beginning with the next grade following the elementary grades and ending with or below grade 12.”

[122] Report: “Fiscal Year 2013 Analytical Perspectives, Budget of the U.S. Government.” White House Office of Management and Budget, February 12, 2012. <www.gpo.gov>

Page 471:

The main purpose of the NIPAs [national income and product accounts published by the U.S. Bureau of Economic Analysis] is to measure the Nation’s total production of goods and services, known as gross domestic product (GDP), and the incomes generated in its production. GDP excludes intermediate production to avoid double counting. Government consumption expenditures along with government gross investment—State and local as well as Federal—are included in GDP as part of final output, together with personal consumption expenditures, gross private domestic investment, and net exports of goods and services (exports minus imports).

[123] Report: “Concepts and Methods of the U.S. National Income and Product Accounts, Chapter 5: Personal Consumption Expenditures.” U.S. Bureau of Economic Analysis. Updated December 2022. <www.bea.gov>

Page 5-1:

Personal consumption expenditures (PCE) is the primary measure of consumer spending on goods and services in the U.S. economy.1 It accounts for about two-thirds of domestic final spending, and thus it is the primary engine that drives future economic growth. PCE shows how much of the income earned by households is being spent on current consumption as opposed to how much is being saved for future consumption.

PCE also provides a comprehensive measure of types of goods and services that are purchased by households. Thus, for example, it shows the portion of spending that is accounted for by discretionary items, such as motor vehicles, or the adjustments that consumers make to changes in prices, such as a sharp run-up in gasoline prices.2

Page 5-2:

PCE measures the goods and services purchased by “persons”—that is, by households and by nonprofit institutions serving households (NPISHs)—who are resident in the United States. Persons resident in the United States are those who are physically located in the United States and who have resided, or expect to reside, in this country for 1 year or more. PCE also includes purchases by U.S. government civilian and military personnel stationed abroad, regardless of the duration of their assignments, and by U.S. residents who are traveling or working abroad for 1 year or less.3

Page 5-69:

Nonprofit Institutions Serving Households

In the NIPAs [National Income and Product Accounts], nonprofit institutions serving households (NPISHs), which have tax-exempt status, are treated as part of the personal sector of the economy. Because NPISHs produce services that are not generally sold at market prices, the value of these services is measured as the costs incurred in producing them.

In PCE, the value of a household purchase of a service that is provided by a NPISH consists of the price paid by the household or on behalf of the household for that service plus the value added by the NPISH that is not included in the price. For example, the value of the educational services provided to a student by a university consists of the tuition fee paid by the household to the university and of the additional services that are funded by sources other than tuition fees (such as by the returns to an endowment fund).

[124] Report: “Measuring the Economy: A Primer on GDP and the National Income and Product Accounts.” U.S. Bureau of Economic Analysis, December 2015. <www.bea.gov>

Page 8: “Gross private domestic investment consists of purchases of fixed assets (structures, equipment, and intellectual property products) by private businesses that contribute to production and have a useful life of more than one year, of purchases of homes by households, and of private business investment in inventories.”

[125] Calculated with the dataset: “Average Class Size in Private K–12 Schools, by School Level, Class Type, and Affiliation 2020–21.” U.S. Department of Education, National Center for Education Statistics. Accessed May 26, 2023 at <nces.ed.gov>

NOTES:

  • “[T]eachers for students with disabilities and other special teachers … are generally excluded from class size calculations.” [Dataset: “Table 208.20. Public and Private Elementary and Secondary Teachers, Enrollment, Pupil/Teacher Ratios, and New Teacher Hires: Selected Years, Fall 1955 Through Fall 2030.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>]
  • An Excel file containing the data and calculations is available upon request.

[126] CALCULATION: $9,709 spending per student × 15.3 students per classroom = $148,548 spending per classroom

[127] Calculated with data from:

a) Dataset: “Table 205.50. Private Elementary and Secondary Enrollment, Number of Schools, and Average Tuition, by School Level, Orientation, and Tuition: Selected Years, 1999–2000 Through 2011–12.” U.S. Department of Education, National Center for Education Statistics, August 2021. <nces.ed.gov>

“Each school reports the highest annual tuition charged for a full-time student; this amount does not take into account discounts that individual students may receive. This amount is weighted by the number of students enrolled in each school and averaged.”

b) Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

NOTES:

  • As of August 5, 2023, this is the latest available dataset on private school tuitions.
  • An Excel file containing the data and calculations is available upon request.

[128] Report: “Documentation to the NCES [National Center for Education Statistics] Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-6: “Elementary A general level of instruction classified by state and local practice as elementary, composed of any span of grades not above grade 8; preschool or kindergarten included only if it is an integral part of an elementary school or a regularly established school system.”

Page C-14: “Secondary The general level of instruction classified by state and local practice as secondary and composed of any span of grades beginning with the next grade following the elementary grades and ending with or below grade 12.”

[129] Paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 4: “Students were included in the study if a parent affirmed that his or her student was ‘… taught at home within the past twelve months by his/her parent for at least 51% of the time in the grade level now being tested.’”

Page 7: “The target population was all families in the United States who were educating their school-age children at home and having standardized achievement tests administered to their children. … A total of 11,739 students provided useable questionnaires with corresponding achievement tests.”

[130] Email from Brian D. Ray to Just Facts, May 12, 2015.

“The median amount spent per this one year on the student’s education for textbooks, lesson materials, tutoring, enrichment services, testing, counseling, evaluation, and so forth is $400 to $599. Here is the frequency list regarding the answers. …”

[131] Calculated with data from the paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 7:

It was very challenging to calculate the response rate. One of the main problems was that, well into the study, it was discovered that many of the large-group test administrators were not communicating to their constituent homeschool families that they had been invited to participate in the study. Based on the best evidence available, the response rate was a minimum of 19% for the four main testing services with whom the study was originally planned, who worked fairly hard to get a good response from the homeschooled families, and whose students accounted for 71.5% (n = 8,397) of the participants in the study. That is, of the students who were tested and whose parents were invited to participate in the study, both test scores and survey responses were received for this group. It is possible that the response rate was higher, perhaps as much as 25% for these four testing services. For the other testing services and sources of data, the response rate was notably lower, at an estimated 11.0%. These testing services and other sources of data used a less-concentrated approach to soliciting participation and following-up with reminders to secure participation.

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • For facts about what constitutes a scientific survey and the factors that impact their accuracy, visit Just Facts’ research on Deconstructing Polls & Surveys.

[132] Textbook: Mind on Statistics (4th edition). By Jessica M. Utts and Robert F. Heckard. Brooks/Cole Cengage Learning, 2012.

Pages 164–165:

Surveys that simply use those who respond voluntarily are sure to be biased in favor of those with strong opinions or with time on their hands. …

According to a poll taken among scientists and reported in the prestigious journal Science … scientists don’t have much faith in either the public or the media. … It isn’t until the end of the article that we learn who responded: “The study reported a 34% response rate among scientists, and the typical respondent was a white, male physical scientist over the age of 50 doing basic research.” … With only about a third of those contacted responding, it is inappropriate to generalize these findings and conclude that most scientists have so little faith in the public and the media.

[133] Book: Sampling: Design and Analysis (2nd edition). By Sharon L. Lohr. Brooks/Cole Cengage Learning, 2010.

Pages 5–6:

The following examples indicate some ways in which selection bias can occur. …

… Nonresponse distorts the results of many surveys, even surveys that are carefully designed to minimize other sources of selection bias. Often, nonrespondents differ critically from the respondents, but the extent of that difference is unknown unless you can later obtain information about the nonrespondents. Many surveys reported in newspapers or research journals have dismal response rates—in some, the response rate is as low as 10%. It is difficult to see how results can be generalized to the population when 90% of the targeted sample cannot be reached or refuses to participate.

[134] Paper: “Response Rates to Mail Surveys Published in Medical Journals.” By David A. Asch and others. Journal of Clinical Epidemiology, 1997. Pages 1129–1136. <www.jclinepi.com>

Page 1129:

The purpose of this study was to characterize response rates for mail surveys published in medical journals…. The mean response rate among mail surveys published in medical journals is approximately 60%. However, response rates vary according to subject studied and techniques used. Published surveys of physicians have a mean response rate of only 54%, and those of non-physicians have a mean response rate of 68%. … Although several mail survey techniques are associated with higher response rates, response rates to published mail surveys tend to be moderate. However, a survey’s response rate is at best an indirect indication of the extent of non-respondent bias. Investigators, journal editors, and readers should devote more attention to assessments of bias, and less to specific response rate thresholds.

The level of art and interpretation in calculating response rates reflects the indirect and therefore limited use of the response rate in evaluating survey results. So long as one has sufficient cases for statistical analyses, non-response to surveys is a problem only because of the possibility that respondents differ in a meaningful way from non-respondents, thus biasing the results22, 23. Although there are more opportunities for non-response bias when response rates are low than high, there is no necessary relationship between response rates and bias. Surveys with very low response rates may provide a representative sample of the population of interest, and surveys with high response rates may not.

Nevertheless, because it is so easy to measure response rates, and so difficult to identify bias, response rates are a conventional proxy for assessments of bias. In general, investigators do not seem to help editors and readers in this regard. As we report, most published surveys make no mention of attempts to ascertain non-respondent bias. Similarly, some editors and readers may discredit the results of a survey with a low response rate even if specific tests limit the extent or possibility of this bias.

[135] Webpage: “CPI Inflation Calculator.” United States Department of Labor, Bureau of Labor Statistics. Accessed July 11, 2023. <www.bls.gov>

$400 in August 2007 has the same buying power as $575.56 in January 2023

$599 in August 2007 has the same buying power as $861.90 in January 2023

About the CPI Inflation Calculator

The CPI inflation calculator uses the Consumer Price Index for All Urban Consumers (CPI-U) U.S. city average series for all items, not seasonally adjusted. This data represents changes in the prices of all goods and services purchased for consumption by urban households.
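The calculator’s arithmetic is a simple ratio of CPI-U index values: a dollar amount is multiplied by the target month’s index and divided by the base month’s index. A minimal sketch follows; the index values shown reproduce the calculator’s output above but should be verified against the BLS series (CUUR0000SA0) before reuse:

```python
# Sketch of the CPI-U ratio calculation behind the BLS inflation calculator.
# Index values (NSA, 1982-84=100) are the readings that reproduce the
# figures quoted above; verify against BLS series CUUR0000SA0.

CPI_AUG_2007 = 207.917
CPI_JAN_2023 = 299.170

def adjust_for_inflation(dollars: float) -> float:
    """Convert August 2007 dollars to January 2023 dollars."""
    return round(dollars * CPI_JAN_2023 / CPI_AUG_2007, 2)

print(f"{adjust_for_inflation(400):.2f}")  # -> 575.56
print(f"{adjust_for_inflation(599):.2f}")  # -> 861.90
```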

[136] Article: “What Have We Learned About Homeschooling?” By Eric J. Isenberg. Peabody Journal of Education, December 5, 2007. Pages 387–409. <www.tandfonline.com>

Page 398:

Parents make school choice decisions based on preferences, the quality of local schools, and constraints of income and available leisure time. Separating the causal effect of each variable on school choice requires holding the others constant. For instance, if two families with identical preferences, income, and leisure time choose different schools, the difference can be ascribed to the local education market. Families who live in the same area with the same time and income constraints but who choose different schools must have different preferences.

Page 404:

Using aggregate data or child-level data, there is some evidence that poorer academic quality of public schools and decreased choice of private schools both contribute to an increase in homeschooling. Isenberg (2003) used test score data to measure academic school quality in Wisconsin. The results indicate that in small towns, a decrease in math test scores in a school district increases the likelihood of homeschooling. The magnitude of this effect is significant. A decrease in math scores from 1 standard deviation above the mean to 1 standard deviation below the mean increases homeschooling by 29%, from 1.9 percentage points to 2.4 percentage points, all else equal. A decrease from 2 standard deviations above to 2 standard deviations below increases homeschooling by 65%, from 1.6 percentage points to 2.7 percentage points.

Page 405:

If parents are dissatisfied with the public schools for academic, religious, or other reasons, they must choose between homeschooling and private schooling. Private school has tuition costs; homeschooling has opportunity costs of time. Isenberg (2006) showed the ways in which mothers are motivated by the amount of disposable time they have, the opportunity cost of time, and income constraints. The results are summarized in Table 3.

If a mother has preschool children as well as a school-age child, she is predisposed to stay home, decrease her work hours, or even stay out of the labor force entirely and therefore more likely to homeschool. Of course, small children require a great deal of time to care for, but this pull on a mother’s time is dominated by the incentive to withdraw from the labor force, freeing daytime hours and eliminating commute time, thereby increasing the likelihood of homeschooling. All else equal, having a preschool child younger than 3 years old increases the probability of homeschooling a school-age sibling by 1.2 percentage points; a toddler age 3 to 6 increases the probability of homeschooling by 0.5 percentage points.

Having school-age siblings also increases the likelihood that a child is homeschooled. Each additional sibling beyond the first sibling increases the probability that a particular child is homeschooled. All else equal, a child with two other school-age siblings is 1.2 percentage points more likely to be homeschooled than a child with one school-age sibling, and a child with three or more siblings in school is an additional 1.7 percentage points more likely to be homeschooled than a child with two siblings. There appear to be economies of scale in homeschooling.

The presence of other adults in the household also has a significant effect on the likelihood of homeschooling. This may be because these extra adults take over household tasks, giving the mother more disposable time. Other adults in the household, including but not limited to a husband, increase the likelihood of homeschooling by 0.5 percentage points per extra adult.

[137] Dataset: “Table 235.10. Revenues for Public Elementary and Secondary Schools, by Source of Funds: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

NOTE: An Excel file containing the data is available upon request.

[138] Dataset: “Table 235.10. Revenues for Public Elementary and Secondary Schools, by Source of Funds: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

NOTE: An Excel file containing the data is available upon request.

[139] Calculated with the dataset: “Table 236.20. Total Expenditures for Public Elementary and Secondary Education and Other Related Programs, by Function and Subfunction: Selected Years, 1990–91 Through 2018–19.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>

“Excludes expenditures for state education agencies.”

NOTE: An Excel file containing the data and calculations is available upon request.

[140] Click here for documentation that the following items are excluded from spending data published by the National Center for Education Statistics:

  • State administration spending
  • Unfunded pension benefits
  • Post-employment non-pension benefits like health insurance

[141] Calculated with the dataset: “Table 236.20. Total Expenditures for Public Elementary and Secondary Education and Other Related Programs, by Function and Subfunction: Selected Years, 1990–91 Through 2018–19.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>

“Excludes expenditures for state education agencies.”

NOTE: An Excel file containing the data and calculations is available upon request.

[142] Calculated with the dataset: “Table 236.20. Total Expenditures for Public Elementary and Secondary Education and Other Related Programs, by Function and Subfunction: Selected Years, 1990–91 Through 2018–19.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>

“Excludes expenditures for state education agencies.”

NOTE: An Excel file containing the data and calculations is available upon request.

[143] Click here for documentation that the following items are excluded from spending data published by the National Center for Education Statistics:

• State administration spending

• Unfunded pension benefits

• Post-employment non-pension benefits like health insurance

[144] Calculated with the dataset: “Table 6.2D. Compensation of Employees by Industry [Millions of Dollars].” United States Department of Commerce, Bureau of Economic Analysis. Last revised September 30, 2022. <apps.bea.gov>

“2021 … Government … State and local [=] 1,638,488 … Education [=] 803,998”

CALCULATION: 803,998 / 1,638,488 = 49%

[145] As documented in the next five footnotes, this data on government employee compensation:

  • Accounts for defined benefit pension benefits “as employees earn them, rather than when employers actually make cash payments to pension plans.”
  • Does not include the unfunded liabilities of retirement non-pension benefits (like health insurance). It does, however, include such spending for current retirees.

[146] Webpage: “What Changes Were Made to Pensions During the 2013 Comprehensive Revision, and How Have the Changes Affected Private, Federal, and State and Local Compensation?” U.S. Bureau of Economic Analysis, July 31, 2013. <www.bea.gov>

BEA [U.S. Bureau of Economic Analysis] changed its method for recording the transactions of defined benefit pension plans from a cash accounting basis to an accrual accounting basis as part of the comprehensive revision of the national income and product accounts (NIPAs) released on July 31, 2013. This improvement reflects the most recent international guidelines for the compilation of national accounts—the System of National Accounts 2008 (2008 SNA), which recommends an accrual-based treatment of defined benefit pension plans.

Defined benefit plans provide benefits during retirement based on a formula that typically depends on an employee’s length of service and average pay among other factors. The promised benefit entitlements tend to grow in a relatively smooth manner, whereas employers’ cash contributions may be volatile or sporadic. Accrual accounting is preferred over cash accounting for compiling national accounts because it aligns production with the incomes earned from that production and records both in the same period; cash accounting, on the other hand, reflects incomes when paid, regardless of when they were earned. Thus, the accrual accounting method better reflects the relatively smooth manner in which benefits are earned by employees each period as a result of the work they perform.

The new treatment applies to all defined benefit pension plans—private, federal government, and state and local government—and this change resulted in revisions to BEA’s estimates of private, federal, and state and local compensation.

[147] Article: “Changes to How the U.S. Economy is Measured Roll Out July 31.” U.S. Bureau of Economic Analysis, July 23, 2013. <www.bea.gov>

“On July 31, we will switch from a cash accounting method to an accrual accounting method to measure the transactions of defined benefit pension plans. That means we will count the benefits as employees earn them, rather than when employers actually make cash payments to pension plans.”

[148] Report: “Preview of the 2013 Comprehensive Revision of the National Income and Product Accounts: Changes in Definitions and Presentations.” By Shelly Smith and others. U.S. Bureau of Economic Analysis, March 2013. <apps.bea.gov>

Page 25: “With this comprehensive revision, estimates of wages and salaries that are a component of personal income will be presented on an accrual basis back to 1929.”

[149] Email from the U.S. Bureau of Economic Analysis to Just Facts, March 19, 2015.

“Retiree health care benefits (which are separate from pensions) are treated on a cash basis and are effectively included in the compensation of current workers.”

[150] Webpage: “What Is Included in Federal Government Employee Compensation?” U.S. Bureau of Economic Analysis. Last modified July 26, 2018. <www.bea.gov>

The contributions for employee health insurance consist of the federal share of premium payments to private health insurance plans for current employees and retirees.1

1 The payments to amortize the unfunded health care liabilities of the Postal Service Retiree Health Benefits Fund are treated as capital transfers to persons and are therefore not included in compensation.

[151] Calculated with data from:

a) Dataset: “Table 211.60. Estimated Average Annual Salary of Teachers in Public Elementary and Secondary Schools, by State: Selected School Years, 1969–70 Through 2021–22.” U.S. Department of Education, National Center for Education Statistics, August 2022. <nces.ed.gov>

b) Dataset: “Employer Costs for Employee Compensation: State and Local Government Datasheet.” U.S. Department of Labor, Bureau of Labor Statistics, December 15, 2022. <www.bls.gov>

“State and Local Government, Employer Costs Per Hour Worked for Employee Compensation and Costs as a Percentage of Total Compensation, 2021–2022 (Teachers)”

NOTE: An Excel file containing the data and calculations is available upon request.

[152] The source of the benefits data for teacher compensation is the U.S. Bureau of Labor Statistics’ Employer Costs for Employee Compensation [ECEC] survey. The next three footnotes show that this survey does not capture the costs of retirement health benefits or the unfunded liabilities of pensions.

[153] Paper: “Compensation for State and Local Government Workers.” By Maury Gittleman (U.S. Department of Labor) and Brooks Pierce (U.S. Department of Labor). Journal of Economic Perspectives, Winter 2012. Pages 217–242. <pubs.aeaweb.org>

Appendix (<www.aeaweb.org>): “Note that the ECEC [Employer Costs for Employee Compensation] by design excludes retiree health plan costs.”

NOTE: On 12/27/2014, Just Facts wrote to the authors of this paper to confirm the statement above, and the lead author affirmed this is true. He was unable to find an explicit statement from a Bureau of Labor Statistics publication stating that retiree health plan costs are not included in the ECEC [Employer Costs for Employee Compensation], but he wrote, “If one looks at what is explicitly included, however, it is apparent that they are not included in ECEC costs.”

[154] NOTE: The information in the source below implicitly confirms the source above, because it explains that the ECEC measures the costs of employing current employees divided by their working hours. Thus, it does not capture healthcare costs for previous employees (for example, retirees).

Article: “Analyzing Employers’ Costs for Wages, Salaries, and Benefits.” By Felicia Nathan (economist in the Division of Employment Cost Trends, Bureau of Labor Statistics). Bureau of Labor Statistics Monthly Labor Review, October 1987. <www.bls.gov>

Pages 6–9:

How Compensation Costs Are Calculated

At least two approaches can be taken in measuring an employer’s costs for employee compensation. One approach focuses on past expenditures—that is, the actual money an employer spent on compensation during a specified time, usually a past year. The other approach focuses on current costs—annual costs based on the current price of benefits under current plan provisions. The Bureau’s previous measure of compensation cost levels, the Employer Expenditures for Employee Compensation survey, used the past expenditures approach.5 Because the ECI [Employment Cost Index] measures change from one time to another, it uses the current cost approach.

To estimate the total compensation cost per hour worked, the ECI (1) identifies the benefits provided, (2) determines, from current cost information (current price and current plan provisions), the cost per hour worked for each benefit, then (3) sums the costs for the benefits with the straight-time wage or salary rate. The following examples illustrate how current costs are determined for specific benefit plans, and how they differ from costs based on past expenditures. …

Example 2. A health insurance plan is provided all employees. The monthly premium for each employee is $120 for the first 6 months of a given year, and increases to $140 for the last 6 months. Each employee works 2,000 hours per year. …

In this example, the current cost at any time during the first half of the year is the annual premium divided by the annual hours worked….

Compensation cost levels, however, should reflect the current industry and occupational mix each year they are published. Thus, to estimate current cost levels for the aggregate series, it is necessary to have employment data that refer to the current mix. Such data are obtained by apportioning industry employment from the Bureau’s Current Employment Statistics program, using occupational employment by industry from the ECI sample. Industry employment estimates from the Current Employment Statistics program are published monthly, and are adjusted each year to a universe of all nonfarm establishments from March of the previous year.

5 The Employer Expenditures for Employee Compensation (EEEC) [note the difference from ECEC] survey was discontinued in 1977. While differing from the ECI in that it measured expenditures rather than current costs, the EEEC survey had other characteristics similar to those of the ECI. It covered virtually the same benefits and reported the costs on a work-hour basis. The scope of the EEEC survey was also similar to that of the ECI in that it covered the private nonfarm work force.

[155] Email from the U.S. Bureau of Labor Statistics to Just Facts, May 12, 2015.

For the purposes of NCS [the National Compensation Survey, the source of the Employer Cost for Employee Compensation data†], defined benefit costs are: actual dollar amount an establishment placed in the pension fund from cash, stock, corporate bonds and other financial instrument; Pension Benefit Guaranty Corporation (PBGC) premiums; and administration fees paid to third party administrators (ex. legal, actuary, broker’s). Those costs that are considered out-of-scope for NCS are: actuarial costs (i.e. estimate of current and future obligations); pension benefits paid to retirees; service costs (actuarially determined estimate of employer pension obligations based on benefits earned by employees during the period); and other costs (ex. interest, amortization of prior costs).

NOTE: † Report: “Work Schedules in the National Compensation Survey.” By Richard Schumann. U.S. Bureau of Labor Statistics, July 28, 2008. <www.bls.gov>

Page 1: “The National Compensation Survey (NCS) produces data on occupational earnings, compensation cost trends—the Employment Cost Index (ECI) and the Employer Cost for Employee Compensation (ECEC) series—and benefits.”

[156] Calculated with data from:

a) Dataset: “Table 211.60. Estimated Average Annual Salary of Teachers in Public Elementary and Secondary Schools, by State: Selected School Years, 1969–70 Through 2021–22.” U.S. Department of Education, National Center for Education Statistics, August 2022. <nces.ed.gov>

b) Dataset: “Employer Costs for Employee Compensation: State and Local Government Datasheet.” U.S. Department of Labor, Bureau of Labor Statistics, December 15, 2022. <www.bls.gov>

“State and Local Government, Employer Costs Per Hour Worked for Employee Compensation and Costs as a Percentage of Total Compensation, 2021–2022 (Teachers)”

c) Report: “Real Personal Consumption Expenditures by State and Real Personal Income by State and Metropolitan Area, 2021.” U.S. Bureau of Economic Analysis, December 15, 2022. <www.bea.gov>

Page 9 (of PDF): “Table 2. Regional Price Parities and Implicit Regional Price Deflators, by State, 2021” <www.bea.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • The figures for each state are provided below.

State … Price Parity-Adjusted Compensation
United States … $98,138
Florida … $74,238
Mississippi … $79,353
South Dakota … $81,835
West Virginia … $82,378
Arizona … $84,037
Colorado … $84,334
South Carolina … $85,075
Missouri … $85,656
North Carolina … $85,664
Louisiana … $86,113
Indiana … $86,191
Montana … $86,309
Virginia … $87,344
Arkansas … $87,832
Tennessee … $88,003
Idaho … $88,474
Hawaii … $88,809
Texas … $88,862
Kansas … $89,334
New Mexico … $89,598
Maine … $89,837
Nevada … $89,869
Oklahoma … $90,049
North Dakota … $90,319
Kentucky … $90,324
New Hampshire … $90,403
Utah … $91,310
Nebraska … $91,630
Alabama … $91,637
Vermont … $93,450
Georgia … $95,474
Wisconsin … $96,611
Iowa … $97,125
Wyoming … $97,606
Delaware … $98,594
Oregon … $100,913
Ohio … $101,333
Michigan … $101,425
Minnesota … $103,082
Alaska … $103,966
Illinois … $105,065
Maryland … $105,490
New Jersey … $106,448
D.C. … $109,069
Washington … $110,115
Pennsylvania … $111,646
Rhode Island … $112,511
California … $116,276
Connecticut … $117,302
Massachusetts … $120,041
New York … $121,173

[157] The next two footnotes contain studies of teacher work hours from the U.S. Bureau of Labor Statistics. Unlike less rigorous studies of working hours, these studies are based on comprehensive, detailed records. The first of these studies employed field economists to measure actual working hours, as opposed to relying solely upon assigned work schedules. The second study is based on teacher journals of work hours, as opposed to subjective estimates about how long they think they work.

The first study found that full-time public school teachers work an average of 1,405 hours per year and full-time private school teachers work an average of 1,560 hours per year. The second study found that full-time teachers (public and private) work an average of 39.2 hours per week during the weeks in which they work. The U.S. Department of Labor estimates that full-time teachers work an average of 37 or 38 weeks per year. At 39.2 hours per week, this amounts to 1,450 to 1,490 hours per year.

In keeping with Just Facts’ Standards of Credibility, Just Facts is citing the highest of these numbers in order to give “preference to figures that are contrary to our viewpoints.” To triple-check these two studies, Just Facts conducted a detailed time study of a full-time public school teacher. This is shown in the third footnote below.
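NOTE: The annual-hours arithmetic in the preceding paragraphs can be reproduced with a short Python snippet. This is illustrative only; the daily-hours figure is from the ATUS study, and the 37- and 38-week figures are the Department of Labor’s estimates, both cited below.

```python
# Annual work-hours estimate for full-time teachers: 5.6 hours/day
# averaged over all 7 days of the week, times a 37- or 38-week work year.
hours_per_week = 5.6 * 7            # 39.2 hours per week
annual_low = hours_per_week * 37    # lower bound of the work year
annual_high = hours_per_week * 38   # upper bound of the work year
print(round(annual_low), round(annual_high))  # 1450 1490
```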

[158] Report: “National Compensation Survey: Occupational Earnings in the United States, 2010.” U.S. Bureau of Labor Statistics, May 2011. <www.bls.gov>

Page 8:

Survey data were collected over a 13-month period for the 87 larger areas; for the 140 smaller areas, data were collected over a 4-month period. For each establishment in the survey, the data reflect the establishment’s most recent information at the time of collection. The data for the National bulletin were compiled from locality data collected between December 2009 and January 2011. The average reference period is July 2010.

Page 9:

For hourly workers, scheduled hours worked per day and per week, exclusive of overtime, are recorded. For salaried workers, field economists record the typical number of hours actually worked because those exempt from overtime provisions often work beyond the assigned work schedule.

The number of weeks worked annually is determined as well. Because salaried workers who are exempt from overtime provisions often work beyond the assigned work schedule, the typical number of hours they actually worked is collected.

Page 58 (of PDF): “Table 4. Full-time1 private industry workers: Mean and median hourly, weekly, and annual earnings and mean weekly and annual hours … Primary, secondary, and special education school teachers … Annual5 … Mean hours [=] 1,560”

Page 93 (of PDF): “Table 5. Full-time1 State and local government workers: Mean and median hourly, weekly, and annual earnings and mean weekly and annual hours … Primary, secondary, and special education school teachers … Annual … Mean hours [=] 1,405”

[159] Report: “Teachers’ Work Patterns: When, Where, and How Much Do U.S. Teachers Work?” By Rachel Krantz-Kent. U.S. Bureau of Labor Statistics Monthly Labor Review, March 2008. Pages 52–59. <www.bls.gov>

Page 1:

In the ATUS [American Time Use Survey], interviewers collect data in a time diary format, in which survey participants provide information about activities that they engaged in “yesterday.” Because of the way in which the data are collected, it is possible to identify and quantify the work that teachers do at home, at a workplace, and at other locations and to examine the data by day of the week and time of day. Data are available for nearly every day of 2003–06, which is the reference period for this analysis.

In the presentation that follows, “teachers” refers to persons whose main job is teaching preschool-to-high school students. Persons in the “other professionals” occupations also are classified by their main job. With the exception of chart 1, all estimates presented are restricted to persons who were employed during the week prior to their interview and who did some work during that period. Thus, a teacher who was on summer or semester break during the week of the survey is not included in this analysis. Unless otherwise specified, data pertain to persons who work full time; that is, they usually work 35 hours or more per week. Estimates of work hours refer to persons’ main job only.

Page 59: “Full-time teachers worked nearly 3 more hours per day than part-time teachers. On average for all days of the week, full-time teachers worked 5.6 hours per day and part-time teachers worked 2.8 hours per day.”

NOTE: This survey found that during the weeks in which full-time teachers work, they work an average of 5.6 hours per day (including weekends). This amounts to 39.2 hours per week. Per the U.S. Department of Labor, full-time teachers work an average of 37 or 38 weeks per year.† At 39.2 hours per week, this amounts to 1,450 to 1,490 hours per year.

† BLS Handbook of Methods. U.S. Bureau of Labor Statistics. Chapter 8: “National Compensation Measures.” Last modified July 10, 2013. <www.bls.gov>. Page 16: “Primary, secondary, and special education teachers typically have a work schedule of 37 or 38 weeks per year.”

[160] To triple-check the two studies above, in April 2015 Just Facts conducted a detailed time study of a full-time public school teacher who works an average of 3.3 hours per workday beyond contractually required work hours. This teacher:

  • arrives at school 20 minutes before the required contract time to set up and plan.
  • stays an extra hour per day after the required contract time to help students.
  • spends an average of two hours per workday grading tests and preparing lessons.
  • coaches two sports teams.

This teacher works 1,516 hours per year not including coaching, 1,759 hours including one sport, and 1,913 hours including two sports. These figures are higher than, but consistent with, the studies above, given the exceptional commitment of this particular teacher. Beyond working 3.3 extra unpaid hours per workday, this teacher earns approximately $16,000 per year in supplemental contracts, as opposed to the national average of $1,170.†

NOTES:

  • † Calculated with data from: “Table 211.10. Average Salaries for Full-Time Teachers in Public and Private Elementary and Secondary Schools, by Selected Characteristics: 2011–12 and 2015–16.” U.S. Department of Education, National Center for Education Statistics, November 2017. <nces.ed.gov>
  • An Excel file containing the data and calculations is available upon request.

[161] Calculated with data from the report: “National Compensation Survey: Occupational Earnings in the United States, 2010.” U.S. Bureau of Labor Statistics, May 2011. <www.bls.gov>

Page 8:

Survey data were collected over a 13-month period for the 87 larger areas; for the 140 smaller areas, data were collected over a 4-month period. For each establishment in the survey, the data reflect the establishment’s most recent information at the time of collection. The data for the National bulletin were compiled from locality data collected between December 2009 and January 2011. The average reference period is July 2010.

Page 9:

For hourly workers, scheduled hours worked per day and per week, exclusive of overtime, are recorded. For salaried workers, field economists record the typical number of hours actually worked because those exempt from overtime provisions often work beyond the assigned work schedule.

The number of weeks worked annually is determined as well. Because salaried workers who are exempt from overtime provisions often work beyond the assigned work schedule, the typical number of hours they actually worked is collected.

Page 49 (of PDF): “Table 4. Full-time1 private industry workers: Mean and median hourly, weekly, and annual earnings and mean weekly and annual hours … All workers … Annual … Mean hours [=] 2,045”

CALCULATION: (2,045 hours – 1,490 hours) / 1,490 hours = 37%
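NOTE: This calculation can be verified with a short Python snippet (illustrative only; the hour figures are those cited in this footnote):

```python
# Percentage by which the average full-time private-industry work year
# (2,045 hours) exceeds the upper-bound estimate for teachers (1,490 hours).
private_hours = 2045
teacher_hours = 1490
print(f"{(private_hours - teacher_hours) / teacher_hours:.0%}")  # 37%
```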

[162] Calculated with data from:

a) Dataset: “Table 211.60. Estimated Average Annual Salary of Teachers in Public Elementary and Secondary Schools, by State: Selected School Years, 1969–70 Through 2021–22.” U.S. Department of Education, National Center for Education Statistics, August 2022. <nces.ed.gov>

b) Dataset: “Employer Costs for Employee Compensation: State and Local Government Datasheet.” U.S. Department of Labor, Bureau of Labor Statistics, December 15, 2022. <www.bls.gov>

“State and Local Government, Employer Costs Per Hour Worked for Employee Compensation and Costs as a Percentage of Total Compensation, 2021–2022 (Teachers)”

c) Report: “Real Personal Consumption Expenditures by State and Real Personal Income by State and Metropolitan Area, 2021.” U.S. Bureau of Economic Analysis, December 15, 2022. <www.bea.gov>

Page 9 (of PDF): “Table 2. Regional Price Parities and Implicit Regional Price Deflators, by State, 2021” <www.bea.gov>

d) Report: “National Compensation Survey: Occupational Earnings in the United States, 2010.” U.S. Bureau of Labor Statistics, May 2011. <www.bls.gov>

Pages 8, 9, 49, and 93 (of PDF).

NOTE: An Excel file containing the data and calculations is available upon request.

[163] See these four footnotes for documentation that the following items are excluded from employee compensation data published by the Bureau of Labor Statistics:

  • Unfunded pension liabilities
  • Post-employment expenses of worker compensation, such as retirement health benefits

[164] Calculated with data from:

a) Dataset: “Table 211.60. Estimated Average Annual Salary of Teachers in Public Elementary and Secondary Schools, by State: Selected School Years, 1969–70 Through 2021–22.” U.S. Department of Education, National Center for Education Statistics, August 2022. <nces.ed.gov>

b) Dataset: “Employer Costs for Employee Compensation: State and Local Government Datasheet.” U.S. Department of Labor, Bureau of Labor Statistics, December 15, 2022. <www.bls.gov>

“State and Local Government, Employer Costs Per Hour Worked for Employee Compensation and Costs as a Percentage of Total Compensation, 2021–2022 (Teachers)”

c) Report: “Real Personal Consumption Expenditures by State and Real Personal Income by State and Metropolitan Area, 2021.” U.S. Bureau of Economic Analysis, December 15, 2022. <www.bea.gov>

Page 9 (of PDF): “Table 2. Regional Price Parities and Implicit Regional Price Deflators, by State, 2021” <www.bea.gov>

d) Report: “National Compensation Survey: Occupational Earnings in the United States, 2010.” U.S. Bureau of Labor Statistics, May 2011. <www.bls.gov>

Pages 8, 9, 49, and 93 (of PDF).

NOTE: An Excel file containing the data and calculations is available upon request.

[165] See these four footnotes for documentation that the following items are excluded from employee compensation data published by the Bureau of Labor Statistics:

  • Unfunded pension liabilities
  • Post-employment expenses of worker compensation, such as retirement health benefits

[166] Calculated with data from: “Table 211.10. Average Total Income, Base Salary, and Other Sources of School and Nonschool Income for Full-Time Teachers in Public and Private Elementary and Secondary Schools, by Selected Characteristics: School Year 2020–21.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

Total school-year and summer earned income from school and nonschool sources1 [=] $65,400 …

Job outside the school system during the school year … Percent of teachers [=] 16.8 … Average income [=] $6,090 …

Employed in a nonschool job during the summer … Percent of teachers [=] 16.1 … Average income [=] $3,550 …

CALCULATION: ((.168 × $6,090) + (.161 × $3,550)) / $65,400 = 2.4%
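NOTE: This calculation can be verified with a short Python snippet (illustrative only; the percentages and dollar figures are those quoted in this footnote):

```python
# Share of average total teacher income earned from nonschool jobs,
# weighting each income source by the share of teachers who hold such jobs.
nonschool = 0.168 * 6090 + 0.161 * 3550   # school-year job + summer job
total_income = 65400
print(f"{nonschool / total_income:.1%}")  # 2.4%
```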

[167] Calculated with the dataset: “Table 211.10. Average Total Income, Base Salary, and Other Sources of School and Nonschool Income for Full-Time Teachers in Public and Private Elementary and Secondary Schools, by Selected Characteristics: 2020–21.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

“Base salary … Public schools total [=] $61,600 … Private schools total [=] $46,400”

CALCULATION: ($61,600 – $46,400) / $46,400 = 33%
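NOTE: This calculation can be verified with a short Python snippet (illustrative only; the salary figures are those quoted in this footnote):

```python
# Public-school teachers' average base salary relative to
# private-school teachers' average base salary.
public_salary = 61600
private_salary = 46400
print(f"{(public_salary - private_salary) / private_salary:.0%}")  # 33%
```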

[168] Calculated with data from:

a) Dataset: “Employer Costs Per Hour Worked for Employee Compensation and Costs as a Percent of Total Compensation: Private Industry Teachers, March 2020.” U.S. Bureau of Labor Statistics, National Compensation Survey. Sent to Just Facts on August 26, 2020.

b) Report: “Employer Costs for Employee Compensation, Historical Listing, March 2004–March 2020.” U.S. Bureau of Labor Statistics, June 9, 2020. <www.bls.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[169] The source of the benefits data for teacher compensation is the U.S. Bureau of Labor Statistics’ Employer Costs for Employee Compensation [ECEC] survey. The next three footnotes show that this survey does not capture the costs of retirement health benefits or the unfunded liabilities of pensions.

[170] Paper: “Compensation for State and Local Government Workers.” By Maury Gittleman (U.S. Department of Labor) and Brooks Pierce (U.S. Department of Labor). Journal of Economic Perspectives, Winter 2012. Pages 217–242. <pubs.aeaweb.org>

Appendix (<www.aeaweb.org>): “Note that the ECEC [Employer Costs for Employee Compensation] by design excludes retiree health plan costs.”

NOTE: On 12/27/2014, Just Facts wrote to the authors of this paper to confirm the statement above, and the lead author affirmed this is true. He was unable to find an explicit statement from a Bureau of Labor Statistics publication stating that retiree health plan costs are not included in the ECEC, but he wrote, “If one looks at what is explicitly included, however, it is apparent that they are not included in ECEC costs.”

[171] NOTE: The information in the source below implicitly confirms the source above, because it explains that the ECEC measures the costs of employing current employees divided by their working hours. Thus, it does not capture healthcare costs for previous employees (for example, retirees).

Article: “Analyzing Employers’ Costs for Wages, Salaries, and Benefits.” By Felicia Nathan (economist in the Division of Employment Cost Trends, Bureau of Labor Statistics). U.S. Bureau of Labor Statistics Monthly Labor Review, October 1987. <www.bls.gov>

Pages 6–8:

How Compensation Costs Are Calculated

At least two approaches can be taken in measuring an employer’s costs for employee compensation. One approach focuses on past expenditures—that is, the actual money an employer spent on compensation during a specified time, usually a past year. The other approach focuses on current costs—annual costs based on the current price of benefits under current plan provisions. The Bureau’s previous measure of compensation cost levels, the Employer Expenditures for Employee Compensation survey, used the past expenditures approach.5 Because the ECI [Employment Cost Index] measures change from one time to another, it uses the current cost approach.

To estimate the total compensation cost per hour worked, the ECI (1) identifies the benefits provided, (2) determines, from current cost information (current price and current plan provisions), the cost per hour worked for each benefit, then (3) sums the costs for the benefits with the straight-time wage or salary rate. The following examples illustrate how current costs are determined for specific benefit plans, and how they differ from costs based on past expenditures. …

Example 2. A health insurance plan is provided all employees. The monthly premium for each employee is $120 for the first 6 months of a given year, and increases to $140 for the last 6 months. Each employee works 2,000 hours per year. …

In this example, the current cost at any time during the first half of the year is the annual premium divided by the annual hours worked….

Compensation cost levels, however, should reflect the current industry and occupational mix each year they are published. Thus, to estimate current cost levels for the aggregate series, it is necessary to have employment data that refer to the current mix. Such data are obtained by apportioning industry employment from the Bureau’s Current Employment Statistics program, using occupational employment by industry from the ECI sample. Industry employment estimates from the Current Employment Statistics program are published monthly, and are adjusted each year to a universe of all nonfarm establishments from March of the previous year.

5 The Employer Expenditures for Employee Compensation (EEEC) [note the difference from ECEC] survey was discontinued in 1977. While differing from the ECI in that it measured expenditures rather than current costs, the EEEC survey had other characteristics similar to those of the ECI. It covered virtually the same benefits and reported the costs on a work-hour basis. The scope of the EEEC survey was also similar to that of the ECI in that it covered the private nonfarm work force.

[172] Email from the U.S. Bureau of Labor Statistics to Just Facts, May 12, 2015.

For the purposes of NCS [the National Compensation Survey, the source of the Employer Cost for Employee Compensation data†], defined benefit costs are: actual dollar amount an establishment placed in the pension fund from cash, stock, corporate bonds and other financial instrument; Pension Benefit Guaranty Corporation (PBGC) premiums; and administration fees paid to third party administrators (ex. legal, actuary, broker’s). Those costs that are considered out-of-scope for NCS are: actuarial costs (i.e. estimate of current and future obligations); pension benefits paid to retirees; service costs (actuarially determined estimate of employer pension obligations based on benefits earned by employees during the period); and other costs (ex. interest, amortization of prior costs).

NOTE: † Report: “Work Schedules in the National Compensation Survey.” By Richard Schumann. U.S. Bureau of Labor Statistics, July 28, 2008. <www.bls.gov>

Page 1: “The National Compensation Survey (NCS) produces data on occupational earnings, compensation cost trends—the Employment Cost Index (ECI) and the Employer Cost for Employee Compensation (ECEC) series—and benefits.”

[173] See these five footnotes for documentation that:

  • “Defined benefit” pension programs guarantee employees specified levels of benefits, regardless of how much money the employer has previously set aside to pay those benefits.
  • Most government employees receive defined benefit pensions and most private sector employees do not.
  • Many government pension plans are underfunded.

[174] See these two footnotes for documentation that retiree health benefits are common in the government sector and rare in the private sector.

[175] Report: “Work Schedules in the National Compensation Survey.” By Richard Schumann. Bureau of Labor Statistics, July 28, 2008. <www.bls.gov>

Page 1:

Work schedules in the United States are generally viewed as consisting of an 8-hour day and a 40-hour week. But the National Compensation Survey (NCS) covers many occupations that have different types of work schedules: fire fighters, for example, who often work 24 straight hours followed by 48 hours off; truck drivers, many of whom spend days at a time on the road; waiters and waitresses, whose schedules may vary every week; and school teachers, who tend to work many hours at home. Fitting all of these different schedules into a common form for data publication can be challenging.

The National Compensation Survey (NCS) produces data on occupational earnings, compensation cost trends—the Employment Cost Index (ECI) and the Employer Cost for Employee Compensation (ECEC) series—and benefits. The wage and benefit data collected from NCS respondents come in several time frames: hourly, weekly, biweekly, monthly, or annually. Converting the raw data into a common format requires accurate work schedules. This article explains how the NCS calculates these work schedules and the role that they play in the calculation of the published data series.

Definition of the Work Schedule

The NCS work schedule is defined as, “The number of daily hours, weekly hours, and annual weeks that employees in an occupation are scheduled and do work.” The work schedule is the standard schedule for the occupation; short-term fluctuations and one-time events are not considered unless the change becomes permanent. For example, paid or unpaid time off due to a snowstorm would not result in the adjustment of the work schedule because this would not represent a permanent change. Paid lunch periods are included in the work schedule, as is incidental time off, such as coffee breaks, or wash-up time. Vacation, holidays, sick leave, and other kinds of leave hours are included in the work schedule, but they are subtracted when calculating the number of hours worked in a year.

Page 2:

Benefit Costs. The ECI and ECEC publish data for a wide variety of benefits. The costs for these benefits may take different forms, such as monthly premiums, percent of gross earnings, or days of paid leave. These costs must be converted to a common cost form to allow for the calculation of individual benefit and total benefit costs across occupations, industries, and other publication categories in the survey. The NCS uses a cost-per-hour-worked concept as the common cost form. To convert all costs to a per-hour-worked basis, the cost of each benefit is converted to an annual cost and then divided by the number of annual hours worked.

Page 4:

Additional requirements of the job. Professional and managerial employees often work beyond the established work schedule of the employer due to the requirements of their jobs. Because such workers are exempt from the overtime provisions of the Fair Labor Standards Act, employers are not required to compensate them for the additional hours. If the hours worked are not compensated for, then they usually are not recorded. Collection of the actual hours normally worked would be the preferred way of determining the work schedule, but records of hours worked by exempt employees are usually not available. In most cases, the NCS collects the employer’s best estimate of the hours normally worked by exempt employees. If the respondent is unwilling or unable to estimate the hours, then the normal work hours of other employees in the establishment are used.

The actual hours worked by elementary and secondary school teachers (who are exempt) are often not available. Time spent in lesson preparation, test construction and grading, providing additional help to students, and other nonclassroom activities are not available and therefore not recorded. The NCS uses contract hours for teachers in determining the work schedule.12 Contracts usually specify the length of the school day, the number of teaching and required nonteaching days, and the amount of time, if any, teachers are required to be in the school before and after school hours. These hours are used to construct the work schedule. For example, it is common for teacher contracts to specify that teachers will work 185 days per year. In these cases, the daily work schedule would be the length of the school day plus any time teachers are required to be in school before or after the school day, and the weekly work schedule would be the daily schedule multiplied by 5 days (Monday through Friday). The number of weeks would be 37 (185 days ÷ 5 days per week). The time not worked during summer, Christmas break, and spring break would be excluded from the work schedule and would not be considered vacation or holiday. Jobs in schools are not considered to be seasonal.
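The contract-hours construction described above can be sketched with hypothetical contract terms. The 185-day figure comes from the text; the daily hours are assumed purely for illustration:

```python
# Hypothetical teacher contract terms
school_day_hours = 6.5       # length of the school day
required_extra_hours = 0.5   # required time in school before/after the school day
contract_days = 185          # days per year specified in the contract

daily_schedule = school_day_hours + required_extra_hours  # 7.0 hours per day
weekly_schedule = daily_schedule * 5                      # Monday through Friday
annual_weeks = contract_days / 5                          # 37.0 weeks per year

print(daily_schedule, weekly_schedule, annual_weeks)  # 7.0 35.0 37.0
```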

[176] Report: “National Compensation Survey: Occupational Earnings in the United States, 2010.” U.S. Bureau of Labor Statistics, May 2011. <www.bls.gov>

Page 8:

Survey data were collected over a 13-month period for the 87 larger areas; for the 140 smaller areas, data were collected over a 4-month period. For each establishment in the survey, the data reflect the establishment’s most recent information at the time of collection. The data for the National bulletin were compiled from locality data collected between December 2009 and January 2011. The average reference period is July 2010.

Page 9:

For hourly workers, scheduled hours worked per day and per week, exclusive of overtime, are recorded. For salaried workers, field economists record the typical number of hours actually worked because those exempt from overtime provisions often work beyond the assigned work schedule.

The number of weeks worked annually is determined as well. Because salaried workers who are exempt from overtime provisions often work beyond the assigned work schedule, the typical number of hours they actually worked is collected.

Page 58 (of PDF): “Table 4. Full-time1 private industry workers: Mean and median hourly, weekly, and annual earnings and mean weekly and annual hours … Primary, secondary, and special education school teachers … Annual5 … Mean hours [=] 1,560”

Page 93 (of PDF): “Table 5. Full-time1 State and local government workers: Mean and median hourly, weekly, and annual earnings and mean weekly and annual hours … Primary, secondary, and special education school teachers … Annual … Mean hours [=] 1,405”

CALCULATION: (1,560 – 1,405) / 1,405 = 11%
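The calculation above can be reproduced directly from the mean annual hours in Tables 4 and 5:

```python
# Mean annual hours worked by full-time primary, secondary, and
# special education school teachers (from the tables cited above)
private_hours = 1_560  # private industry (Table 4)
public_hours = 1_405   # state and local government (Table 5)

gap_pct = round((private_hours - public_hours) / public_hours * 100)
print(f"{gap_pct}%")  # 11%
```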

NOTE: For more details on teacher work hours, click here.

[177] Book: Encyclopedia of Human Services and Diversity. Edited by Linwood H. Cousins. Sage Publications, 2014. Article: “Educational Support Services.” By Stephen T. Schroth (Knox College).

Page 447: “All 50 states and the District of Columbia provide public education for children from kindergarten through grade 12. Additionally, many states also fund preschool programs that permit some children as young as 3 years of age to attend classes.”

[178] Dataset: “Table 203.90. Average Daily Attendance (ADA) as a Percentage of Total Enrollment, School Day Length, and School Year Length in Public Schools, by School Level and State: 2007–08 and 2011–12.” U.S. Department of Education, National Center for Education Statistics, May 2013. <nces.ed.gov>

“2011–12 … United States … Average hours in school day [=] 6.7 … Average days in school year [=] 179 … Average hours in school year [=] 1,203”

[179] Calculated with data from:

a) Dataset: “Table 105.20. Enrollment in Elementary, Secondary, and Degree-Granting Postsecondary Institutions, by Level and Control of Institution, Enrollment Level, and Attendance Status and Sex of Student: Selected Years, Fall 1990 Through Fall 2029.” U.S. Department of Education, National Center for Education Statistics, March 2021. <nces.ed.gov>

“Elementary and secondary schools1 … [In thousands] … Projected … 2019 … Public [=] 50,634 … Private2 [=] 5,716 … 1 Includes enrollments in local public school systems and in most private schools (religiously affiliated and nonsectarian). Excludes homeschooled children who were not also enrolled in public and private schools. 2 Private elementary enrollment includes preprimary students in schools offering kindergarten or higher grades.”

b) Dataset: “Table 206.10. Number and Percentage of Homeschooled Students Ages 5 Through 17 with a Grade Equivalent of Kindergarten Through 12th Grade, by Selected Child, Parent, and Household Characteristics: Selected Years, 1999 Through 2019.” U.S. Department of Education, National Center for Education Statistics, December 2021. <nces.ed.gov>

“2019 … Number home-schooled (in thousands) … Total [=] 1,457 … Note: Data in 2019 excludes students who were enrolled in school for more than 24 hours a week…. Data for all years also exclude students who were homeschooled only due to a temporary illness.”

CALCULATIONS:

  • 50,634 public + 5,716 private + 1,457 homeschooled = 57,807 total
  • 50,634 public / 57,807 total = 88%
  • 5,716 private / 57,807 total = 10%
  • 1,457 homeschooled / 57,807 total = 2%

NOTE: The word “approximately” is used because the counts from public and private schools include some preprimary students.
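The shares above can be reproduced from the two datasets; the figures below are shown to one decimal place:

```python
# 2019 enrollment, in thousands (from the datasets cited above)
public, private, homeschooled = 50_634, 5_716, 1_457

total = public + private + homeschooled  # 57,807
for name, count in [("public", public), ("private", private),
                    ("homeschooled", homeschooled)]:
    print(f"{name}: {count / total * 100:.1f}%")
# public: 87.6%
# private: 9.9%
# homeschooled: 2.5%
```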

[180] Report: “Documentation to the NCES Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-6: “Elementary: A general level of instruction classified by state and local practice as elementary, composed of any span of grades not above grade 8; preschool or kindergarten included only if it is an integral part of an elementary school or a regularly established school system.”

Page C-14: “Secondary: The general level of instruction classified by state and local practice as secondary and composed of any span of grades beginning with the next grade following the elementary grades and ending with or below grade 12.”

[181] Dataset: “Table 219.46. Public High School 4-Year Adjusted Cohort Graduation Rate (ACGR), by Selected Student Characteristics and State: 2010–11 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, December 2020. <nces.ed.gov>

United States … Total, ACGR for all students … 2019–20 [=] 87 … ACGR for students with selected characteristics,1 2019–20 … White [=] 90 … Black [=] 81 … Hispanic [=] 83 … Asian/Pacific Islander5 … Total [=] 93 … American Indian/Alaska Native [=] 75 … The adjusted cohort graduation rate (ACGR) is the percentage of public high school freshmen who graduate with a regular diploma or a state-defined alternate high school diploma for students with the most significant cognitive disabilities within 4 years of starting 9th grade. Students who are entering 9th grade for the first time form a cohort for the graduating class. This cohort is “adjusted” by adding any students who subsequently transfer into the cohort and subtracting any students who subsequently transfer out, emigrate to another country, or die. Before 2017–18, the definition of ACGR included regular high school diplomas only. Values preceded by the “>=” or “<” symbol have been “blurred” (rounded) to protect student privacy. Race categories exclude persons of Hispanic ethnicity. In 2019–20, some states may have changed their requirements for a regular high school diploma to account for the impact of the coronavirus pandemic. These changes are at the discretion of each state but may have resulted in less comparability in the ACGRs between 2019–20 and prior school years.

[182] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[183] Report: “Income in the United States: 2021.” By Jessica Semega and Melissa Kollar. U.S. Census Bureau, September 2022. <www.census.gov>

Page 13:

Data on income collected in the CPS ASEC [Current Population Survey Annual Social and Economic Supplements] by the U.S. Census Bureau cover money income received (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc. Money income also excludes tax credits such as the Earned Income Tax Credit, the Child Tax Credit, and special COVID-19-related stimulus payments. Money income does not reflect that some families receive noncash benefits such as Supplemental Nutrition Assistance/food stamps, health benefits, and subsidized housing. In addition, money income does not reflect the fact that noncash benefits often take the form of the use of business transportation and facilities, full or partial payments by business for retirement programs, or medical and educational expenses. …

Data users should consider these elements when comparing income levels. Moreover, readers should be aware that for many different reasons there is a tendency in household surveys for respondents to underreport their income. Based on an analysis of independently derived income estimates, the Census Bureau determined that respondents report income earned from wages or salaries more accurately than other sources of income, and that the reported wage and salary income is nearly equal to independent estimates of aggregate income.

[184] Calculated with data from:

a) “ACT Profile Report – National: Graduating Class 2022.” ACT, September 15, 2022. <www.act.org>

Page 7: “Table 1.1. Five Year Trends—Percent of Students Who Met College Readiness Benchmarks … Number of Students Tested: National … 2022 [=] 1,349,644”

b) Dataset: “Table 219.10 High School Graduates, by Sex and Control of School: Public High School Averaged Freshman Graduation Rate (AFGR); and Total Graduates as a Ratio of 17-Year-Old Population: Selected Years, 1869–70 Through 2030–31.” U.S. Department of Education, National Center for Education Statistics, November 2021. <nces.ed.gov>

“School Year … 2021–2212 … Total1 [=] 3,669,720 … 12 Number of high school graduates is projected by the National Center for Education Statistics (NCES) unless otherwise noted.”

CALCULATION: 1,349,644 / 3,669,720 = 37%

[185] “ACT Profile Report – National: Graduating Class 2022.” ACT, September 15, 2022. <www.act.org>

Page 7: “Table 1.1. Five Year Trends—Percent of Students Who Met College Readiness Benchmarks … Percent Who Met Benchmarks … 2022 … English [=] 53% … Mathematics [=] 31% … Reading [=] 41% … Science [=] 32% … Met All Four [=] 22%”

[186] “ACT Profile Report – National: Graduating Class 2022.” ACT, September 15, 2022. <www.act.org>

Page 20: “Table 3.3. Percent of Students Who Met ACT College Readiness Benchmark Scores by Race/Ethnicity … All Four % … All Students [=] 22 … Black/African American [=] 5 … American Indian/Alaska Native [=] 6 … White [=] 29 … Hispanic/Latino [=] 11 … Asian [=] 51 … Native Hawaiian/Other Pacific Islander [=] 10”

[187] Report: “Grade Inflation Continues to Grow in the Past Decade.” By Edgar Sanchez and Raeal Moore. ACT, May 2022. <www.act.org>

Overall grade inflation. This section presents results of the unadjusted HSGPA [high school grade point average] per year from 2010 to 2021. These analyses do not adjust for other explanatory factors such as race/ethnicity, education, or family income. These averages were then plotted to demonstrate the change in average HSGPA across years. As Figure 2 illustrates, the average HSGPA increased from 3.22 in 2010 to 3.39 in 2021, an increase of 0.17 grade points. For comparison purposes, the average ACT Composite score is also presented for each graduating class. From 2010 to 2021, the average ACT Composite score decreased by almost a point from 21.0 in 2010 to 20.3 in 2021. This suggests the presence of grade inflation because scores on a standardized measure of achievement did not increase while HSGPA did.

CALCULATIONS:

  • (3.39 – 3.22) / 3.22 = 5%
  • (20.3 – 21.0) / 21.0 = –3%
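The two percent changes above follow directly from the figures in the ACT report:

```python
# Average HSGPA and ACT Composite score, 2010 vs. 2021 graduating classes
hsgpa_2010, hsgpa_2021 = 3.22, 3.39
act_2010, act_2021 = 21.0, 20.3

gpa_change = (hsgpa_2021 - hsgpa_2010) / hsgpa_2010 * 100
act_change = (act_2021 - act_2010) / act_2010 * 100
print(f"HSGPA: {gpa_change:+.0f}%  ACT: {act_change:+.0f}%")  # HSGPA: +5%  ACT: -3%
```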

[188] Calculated with the dataset: “Table 605.10. Gross Domestic Product Per Capita and Public and Private Education Expenditures Per Full-Time-Equivalent (FTE) Student, by Level of Education and Country: Selected Years, 2005 Through 2019.” U.S. Department of Education, National Center for Education Statistics, January 2023. <nces.ed.gov>

“Data adjusted to U.S. dollars by the OECD [Organization for Economic Cooperation and Development] using the purchasing power parity (PPP) index. Constant dollars are based on national gross domestic product deflators and PPP indexes….”

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • As of August 5, 2023, this is the latest available data.

[189] Book: Beyond Economic Growth: An Introduction to Sustainable Development (2nd edition). By Tatyana P. Soubbotina. World Bank, 2004. <documents.worldbank.org>

Pages 132–133:

Developed countries (industrial countries, industrially advanced countries). High-income countries, in which most people have a high standard of living. Sometimes also defined as countries with a large stock of physical capital, in which most people undertake highly specialized activities. … Depending on who defines them, developed countries may also include middle-income countries with transition economies, because these countries are highly industrialized. Developed countries contain about 15 percent of the world’s population. They are also sometimes referred to as “the North.”

Page 141:

Organisation for Economic Cooperation and Development (OECD). An organization that coordinates policy mostly among developed countries. OECD member countries exchange economic data and create unified policies to maximize their countries’ economic growth and help nonmember countries develop more rapidly. The OECD arose from the Organisation for European Economic Cooperation (OEEC), which was created in 1948 to administer the Marshall Plan in Europe. In 1960, when the Marshall Plan was completed, Canada, Spain, and the United States joined OEEC members to form the OECD.

[190] Calculated with the dataset: “Table 602.20. Average Fourth-Grade Scores and Annual Instructional Time in Mathematics and Science, by Country or Other Education System: 2015.” U.S. Department of Education, National Center for Education Statistics, December 2016. <nces.ed.gov>

NOTES:

  • As of August 5, 2023, this is the latest available data.
  • An Excel file containing the data and calculations is available upon request.

[191] Calculated with the dataset: “Table 602.40. Average Reading Literacy, Mathematics Literacy, and Science Literacy Scores of 15-Year-Old Students, by Sex and Country or Other Education System: Selected Years, 2009 Through 2018.” U.S. Department of Education, National Center for Education Statistics, December 2019. <nces.ed.gov>

NOTES:

  • As of August 5, 2023, this is the latest available data.
  • An Excel file containing the data and calculations is available upon request.

[192] Webpage: “Our Global Reach.” Organization for Economic Cooperation and Development. Accessed June 9, 2023 at <www.oecd.org>

“Australia, Austria, Belgium, Canada, Chile, Colombia, Costa Rica, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea [South], Latvia, Lithuania, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States”

[193] Book: Beyond Economic Growth: An Introduction to Sustainable Development (2nd edition). By Tatyana P. Soubbotina. World Bank, 2004. <documents.worldbank.org>

Pages 132–133:

Developed countries (industrial countries, industrially advanced countries). High-income countries, in which most people have a high standard of living. Sometimes also defined as countries with a large stock of physical capital, in which most people undertake highly specialized activities. According to the World Bank classification, these include all high-income economies except Hong Kong (China), Israel, Kuwait, Singapore, and the United Arab Emirates. Depending on who defines them, developed countries may also include middle-income countries with transition economies, because these countries are highly industrialized. Developed countries contain about 15 percent of the world’s population. They are also sometimes referred to as “the North.”

Page 141:

Organisation for Economic Cooperation and Development (OECD). An organization that coordinates policy mostly among developed countries. OECD member countries exchange economic data and create unified policies to maximize their countries’ economic growth and help nonmember countries develop more rapidly. The OECD arose from the Organisation for European Economic Cooperation (OEEC), which was created in 1948 to administer the Marshall Plan in Europe. In 1960, when the Marshall Plan was completed, Canada, Spain, and the United States joined OEEC members to form the OECD.

[194] Dataset: “Table 602.20. Average Fourth-Grade Scores and Annual Instructional Time in Mathematics and Science, by Country or Other Education System: 2015.” U.S. Department of Education, National Center for Education Statistics, December 2016. <nces.ed.gov>

NOTE: As of August 5, 2023, this is the latest available data.

[195] Dataset: “Table 602.40. Average Reading Literacy, Mathematics Literacy, and Science Literacy Scores of 15-Year-Old Students, by Sex and Country or Other Education System: Selected Years, 2009 Through 2018.” U.S. Department of Education, National Center for Education Statistics, December 2019. <nces.ed.gov>

NOTE: As of August 5, 2023, this is the latest available data.

[196] Calculated with the dataset: “Table 602.10. Average Reading Literacy Scale Scores of Fourth-Graders and Percentage Distribution, by International Benchmark Level and Country or Other Education System: Selected Years, 2001 Through 2016.” U.S. Department of Education, National Center for Education Statistics, November 2017. <nces.ed.gov>

NOTES:

  • As of August 5, 2023, this is the latest available data.
  • An Excel file containing the data and calculations is available upon request.

[197] Calculated with the dataset: “Table 602.40. Average Reading Literacy, Mathematics Literacy, and Science Literacy Scores of 15-Year-Old Students, by Sex and Country or Other Education System: Selected Years, 2009 Through 2018.” U.S. Department of Education, National Center for Education Statistics, December 2019. <nces.ed.gov>

NOTES:

  • As of August 5, 2023, this is the latest available data.
  • Data for Spain was unavailable for 2018.
  • An Excel file containing the data and calculations is available upon request.

[198] Webpage: “Our Global Reach.” Organization for Economic Cooperation and Development. Accessed June 9, 2023 at <www.oecd.org>

“Australia, Austria, Belgium, Canada, Chile, Colombia, Costa Rica, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea [South], Latvia, Lithuania, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States”

[199] Book: Beyond Economic Growth: An Introduction to Sustainable Development (2nd edition). By Tatyana P. Soubbotina. World Bank, 2004. <documents.worldbank.org>

Pages 132–133:

Developed Countries (Industrial Countries, Industrially Advanced Countries). High-income countries, in which most people have a high standard of living. Sometimes also defined as countries with a large stock of physical capital, in which most people undertake highly specialized activities. According to the World Bank classification, these include all high-income economies except Hong Kong (China), Israel, Kuwait, Singapore, and the United Arab Emirates. Depending on who defines them, developed countries may also include middle-income countries with transition economies, because these countries are highly industrialized. Developed countries contain about 15 percent of the world’s population. They are also sometimes referred to as “the North.”

Page 141:

Organisation for Economic Cooperation and Development (OECD). An organization that coordinates policy mostly among developed countries. OECD member countries exchange economic data and create unified policies to maximize their countries’ economic growth and help nonmember countries develop more rapidly. The OECD arose from the Organisation for European Economic Cooperation (OEEC), which was created in 1948 to administer the Marshall Plan in Europe. In 1960, when the Marshall Plan was completed, Canada, Spain, and the United States joined OEEC members to form the OECD.

[200] Dataset: “Table 602.10. Average Reading Literacy Scale Scores of Fourth-Graders and Percentage Distribution, by International Benchmark Level and Country or Other Education System: Selected Years, 2001 Through 2016.” U.S. Department of Education, National Center for Education Statistics, November 2017. <nces.ed.gov>

NOTE: As of August 5, 2023, this is the latest available data.

[201] Dataset: “Table 602.40. Average Reading Literacy, Mathematics Literacy, and Science Literacy Scores of 15-Year-Old Students, by Sex and Country or Other Education System: Selected Years, 2009 Through 2018.” U.S. Department of Education, National Center for Education Statistics, December 2019. <nces.ed.gov>

NOTES:

  • As of August 5, 2023, this is the latest available data.
  • Data for Spain was unavailable for 2018.

[202] Article: “U.S. Education Spending Tops Global List, Study Shows.” Associated Press, June 25, 2013. <www.cbsnews.com>

“‘When people talk about other countries out-educating the United States, it needs to be remembered that those other nations are out-investing us in education as well,’ said Randi Weingarten, president of the American Federation of Teachers, a labor union.”

[203] Calculated with data from:

a) Dataset: “Table 605.10. Gross Domestic Product Per Capita and Public and Private Education Expenditures Per Full-Time-Equivalent (FTE) Student, by Level of Education and Country: Selected Years, 2005 Through 2013.” U.S. Department of Education, National Center for Education Statistics, January 2017. <nces.ed.gov>

“Data adjusted to U.S. dollars using the purchasing power parity (PPP) index.”

b) Dataset: “Table 602.40. Average Reading Literacy, Mathematics Literacy, and Science Literacy Scores of 15-Year-Old Students, by Sex and Country or Other Education System: 2009, 2012, and 2015.” U.S. Department of Education, National Center for Education Statistics, January 2017. <nces.ed.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Spending data for Canada and Greece were unavailable.

[204] Webpage: “List of OECD Member Countries—Ratification of the Convention on the OECD.” Organization for Economic Cooperation and Development. Accessed December 16, 2016 at <bit.ly>

“Australia, Austria, Belgium, Canada, Chile, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea [South], Latvia, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States”

[205] Book: Beyond Economic Growth: An Introduction to Sustainable Development (2nd edition). By Tatyana P. Soubbotina. World Bank, 2004. <documents.worldbank.org>

Pages 132–133:

Developed Countries (Industrial Countries, Industrially Advanced Countries). High-income countries, in which most people have a high standard of living. Sometimes also defined as countries with a large stock of physical capital, in which most people undertake highly specialized activities. According to the World Bank classification, these include all high-income economies except Hong Kong (China), Israel, Kuwait, Singapore, and the United Arab Emirates. Depending on who defines them, developed countries may also include middle-income countries with transition economies, because these countries are highly industrialized. Developed countries contain about 15 percent of the world’s population. They are also sometimes referred to as “the North.”

Page 141:

Organisation for Economic Cooperation and Development (OECD). An organization that coordinates policy mostly among developed countries. OECD member countries exchange economic data and create unified policies to maximize their countries’ economic growth and help nonmember countries develop more rapidly. The OECD arose from the Organisation for European Economic Cooperation (OEEC), which was created in 1948 to administer the Marshall Plan in Europe. In 1960, when the Marshall Plan was completed, Canada, Spain, and the United States joined OEEC members to form the OECD.

[206] Calculated with data from:

a) Dataset: “Table 605.10. Gross Domestic Product Per Capita and Public and Private Education Expenditures Per Full-Time-Equivalent (FTE) Student, by Level of Education and Country: Selected Years, 2005 Through 2013.” U.S. Department of Education, National Center for Education Statistics, January 2017. <nces.ed.gov>

“Data adjusted to U.S. dollars using the purchasing power parity (PPP) index.”

b) Dataset: “Table 602.40. Average Reading Literacy, Mathematics Literacy, and Science Literacy Scores of 15-Year-Old Students, by Sex and Country or Other Education System: 2009, 2012, and 2015.” U.S. Department of Education, National Center for Education Statistics, January 2017. <nces.ed.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Spending data for Canada and Greece were unavailable.

[207] Webpage: “List of OECD Member Countries—Ratification of the Convention on the OECD.” Organization for Economic Cooperation and Development. Accessed December 16, 2016 at <bit.ly>

“Australia, Austria, Belgium, Canada, Chile, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea [South], Latvia, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States”

[208] Book: Beyond Economic Growth: An Introduction to Sustainable Development (2nd edition). By Tatyana P. Soubbotina. World Bank, 2004. <documents.worldbank.org>

Pages 132–133:

Developed Countries (Industrial Countries, Industrially Advanced Countries). High-income countries, in which most people have a high standard of living. Sometimes also defined as countries with a large stock of physical capital, in which most people undertake highly specialized activities. According to the World Bank classification, these include all high-income economies except Hong Kong (China), Israel, Kuwait, Singapore, and the United Arab Emirates. Depending on who defines them, developed countries may also include middle-income countries with transition economies, because these countries are highly industrialized. Developed countries contain about 15 percent of the world’s population. They are also sometimes referred to as “the North.”

Page 141:

Organisation for Economic Cooperation and Development (OECD). An organization that coordinates policy mostly among developed countries. OECD member countries exchange economic data and create unified policies to maximize their countries’ economic growth and help nonmember countries develop more rapidly. The OECD arose from the Organisation for European Economic Cooperation (OEEC), which was created in 1948 to administer the Marshall Plan in Europe. In 1960, when the Marshall Plan was completed, Canada, Spain, and the United States joined OEEC members to form the OECD.

[209] “Nineteenth Annual Report of the Board of Education of Jersey City, N.J. for the Year Ending November 30, 1885.” Jersey City Board of Education, Department of Public Instruction. Sunday Tattler Print, 1886. <www.google.com>

Page 8:

Cost Per Pupil, Based on Average Attendance, Average Register, Total Enrollment, For the Year …

For all the schools

Average Attendance [=] 14,926

Average Register [=] 16,186

Total Enrollment [=] 24,446

Cost per Pupil, Based on Average Attendance [=] $13.24

Cost per Pupil, Based on Average Register [=] $12.21

Cost per Pupil, Based on Average Total Enrollment [=] $8.09

[210] Webpage: “Consumer Price Index (Estimate) 1800–.” Federal Reserve Bank of Minneapolis. Accessed June 9, 2023 at <www.minneapolisfed.org>

The data below use 1967 as the index (1967=100). With the caveat that data before 1913 should be considered estimates, you can use the Annual Average Index numbers below (center column) to make manual calculations of inflation over time. To find out how much a price in Year 1 would be in Year 2 dollars:

Year 2 Price = Year 1 Price x (Year 2 CPI/Year 1 CPI)

Annual Average Index … 1885 [=] 27 … 2021 [=] 814.3

CALCULATION: $13.24 × 814.3 / 27 = $399
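The inflation adjustment uses the formula given above (Year 2 Price = Year 1 Price × Year 2 CPI / Year 1 CPI) with the 1885 and 2021 index values:

```python
def to_year2_dollars(year1_price, cpi_year1, cpi_year2):
    """Year 2 Price = Year 1 Price x (Year 2 CPI / Year 1 CPI)."""
    return year1_price * cpi_year2 / cpi_year1

# 1885 Jersey City cost per pupil (average attendance basis) in 2021 dollars
cpi_1885, cpi_2021 = 27.0, 814.3
print(round(to_year2_dollars(13.24, cpi_1885, cpi_2021)))  # 399
```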

[211] Calculated with data from the footnote above and the dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

“Expenditure per pupil in fall enrollment … Total expenditure3 … 2019–20 … Unadjusted dollars [=] 15,518 … Constant 2020–21 dollars [=] 17,013”

CALCULATION: $17,013 / $399 = 43

[212] “Nineteenth Annual Report of the Board of Education of Jersey City, N.J. for the Year Ending November 30, 1885.” By the Jersey City Board of Education, Department of Public Instruction. Sunday Tattler Print, 1886. <www.google.com>

Pages 19–23:

The following is the list of questions used at the last examination for entrance to the High School, and the names and ranks of the successful candidates….

[1885 High School Entrance Exam, pages 1–9: images reproduced in the source]

Page 28:

The rules for the Government of the High School provide that all examinations for admission shall be in writing, and shall be conducted by the Principal and Assistant Teachers of the High School under the supervision of the Committee on High School and the School Superintendent. The Committee shall fix a standard for all examinations, which shall not be less than 75 percent of the maximum credits attainable.

[213] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 9, 2023 at <surveys.nces.ed.gov>

Private for-profit institution A private institution in which the individual(s) or agency in control receives compensation other than wages, rent, or other expenses for the assumption of risk.

Private not-for-profit institution A private institution in which the individual(s) or agency in control receives no compensation, other than wages, rent, or other expenses for the assumption of risk. These include both independent not-for-profit schools and those affiliated with a religious organization.

Public institution An educational institution whose programs and activities are operated by publicly elected or appointed school officials and which is supported primarily by public funds.

[214] Table constructed with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

b) Dataset: “Table 334.30. Total Expenditures of Private Nonprofit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

c) Dataset: “Table 334.50. Total Expenditures of Private For-Profit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

[215] Calculated with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

b) Dataset: “Table 334.30. Total Expenditures of Private Nonprofit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

c) Dataset: “Table 334.50. Total Expenditures of Private For-Profit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[216] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Instruction A functional expense category that includes expenses of the colleges, schools, departments, and other instructional divisions of the institution and expenses for departmental research and public service that are not separately budgeted. Includes general academic instruction, occupational and vocational instruction, community education, preparatory and adult basic education, and regular, special, and extension sessions. Also includes expenses for both credit and noncredit activities. Excludes expenses for academic administration where the primary function is administration (for example, academic deans). Information technology expenses related to instructional activities if the institution separately budgets and expenses information technology resources are included (otherwise these expenses are included in academic support). Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

[217] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Research A functional expense category that includes expenses for activities specifically organized to produce research outcomes and commissioned by an agency either external to the institution or separately budgeted by an organizational unit within the institution. The category includes institutes and research centers, and individual and project research. This function does not include nonresearch sponsored programs (for example, training programs). Also included are information technology expenses related to research activities if the institution separately budgets and expenses information technology resources (otherwise these expenses are included in academic support.) Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

[218] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Public service A functional expense category that includes expenses for activities established primarily to provide noninstructional services beneficial to individuals and groups external to the institution. Examples are conferences, institutes, general advisory service, reference bureaus, and similar services provided to particular sectors of the community. This function includes expenses for community services, cooperative extension services, and public broadcasting services. Also includes information technology expenses related to the public service activities if the institution separately budgets and expenses information technology resources (otherwise these expenses are included in academic support). Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

[219] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Academic support A functional expense category that includes expenses of activities and services that support the institution’s primary missions of instruction, research, and public service. It includes the retention, preservation, and display of educational materials (for example, libraries, museums, and galleries); organized activities that provide support services to the academic functions of the institution (such as a demonstration school associated with a college of education or veterinary and dental clinics if their primary purpose is to support the instructional program); media such as audiovisual services; academic administration (including academic deans but not department chairpersons); and formally organized and separately budgeted academic personnel development and course and curriculum development expenses. Also included are information technology expenses related to academic support activities; if an institution does not separately budget and expense information technology resources, the costs associated with the three primary programs will be applied to this function and the remainder to institutional support. Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

[220] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Student services A functional expense category that includes expenses for admissions, registrar activities, and activities whose primary purpose is to contribute to students emotional and physical well-being and to their intellectual, cultural, and social development outside the context of the formal instructional program. Examples include student activities, cultural events, student newspapers, intramural athletics, student organizations, supplemental instruction outside the normal administration, and student records. Intercollegiate athletics and student health services may also be included except when operated as self-supporting auxiliary enterprises. Also may include information technology expenses related to student service activities if the institution separately budgets and expenses information technology resources (otherwise these expenses are included in institutional support.) Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

[221] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Institutional support A functional expense category that includes expenses for the day-to-day operational support of the institution. Includes expenses for general administrative services, central executive-level activities concerned with management and long range planning, legal and fiscal operations, space management, employee personnel and records, logistical services such as purchasing and printing, and public relations and development. Also includes information technology expenses related to institutional support activities. If an institution does not separately budget and expense information technology resources, the IT costs associated with student services and operation and maintenance of plant will also be applied to this function.

[222] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Hospital services Expenses associated with a hospital operated by the postsecondary institution (but not as a component unit) and reported as a part of the institution. This classification includes nursing expenses, other professional services, general services, administrative services, and fiscal services. Also included are information technology expenses, actual or allocated costs for operation and maintenance of plant, interest and depreciation related to hospital capital assets.

Hospitals (revenues) Revenues generated by a hospital operated by the postsecondary institution. Includes gifts, grants, appropriations, research revenues, endowment income, and revenues of health clinics that are part of the hospital unless such clinics are part of the student health services program. Sales and service revenues are included net of patient contractual allowances. Revenues associated with the medical school are included elsewhere. Also includes all amounts appropriated by governments (federal, state, local) for the operation of hospitals.

[223] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Auxiliary enterprises expenses Expenses for essentially self-supporting operations of the institution that exist to furnish a service to students, faculty, or staff, and that charge a fee that is directly related to, although not necessarily equal to, the cost of the service. Examples are residence halls, food services, student health services, intercollegiate athletics (only if essentially self-supporting), college unions, college stores, faculty and staff parking, and faculty housing. Institutions include actual or allocated costs for operation and maintenance of plant, interest and depreciation.

Auxiliary enterprises revenues Revenues generated by or collected from the auxiliary enterprise operations of the institution that exist to furnish a service to students, faculty, or staff, and that charge a fee that is directly related to, although not necessarily equal to, the cost of the service. Auxiliary enterprises are managed as essentially self-supporting activities. Examples are residence halls, food services, student health services, intercollegiate athletics, college unions, college stores, and movie theaters.

[224] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Net grant aid to students (expenses) The portion of scholarships and fellowships granted by an institution that exceeds the amount applied to institutional charges such as tuition and fees or room and board. The amount reported as expense excludes allowances.

[225] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Independent operations Expenses associated with operations that are independent of or unrelated to the primary missions of the institution (i.e., instruction, research, public service) although they may contribute indirectly to the enhancement of these programs. This category is generally limited to expenses of a major federally funded research and development center. Also includes information technology expenses, actual or allocated costs for operation and maintenance of plant, interest and depreciation related to the independent operations. Expenses of operations owned and managed as investments of the institution’s endowment funds are excluded.

Independent operations (revenues) Revenues associated with operations independent of or unrelated to the primary missions of the institution (i.e., instruction, research, public service) although they may contribute indirectly to the enhancement of these programs. Generally includes only those revenues associated with major federally funded research and development centers. Net profit (or loss) from operations owned and managed as investments of the institution’s endowment funds is excluded.

[226] Table constructed with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

b) Dataset: “Table 334.30. Total Expenditures of Private Nonprofit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

c) Dataset: “Table 334.50. Total Expenditures of Private For-Profit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

[227] Form: “2020–21 Survey Materials: Finance for Degree-Granting Public Institutions Using GASB [Governmental Accounting Standards Board] Reporting Standards.” National Center for Education Statistics, December 9, 2021. <nces.ed.gov>

Page 33 (of PDF):

Part C-1—Expenses and Other Deductions: Functional Classification

This part is intended to collect expenses by function. …

Include all operating expenses and nonoperating expenses and deductions. … Included are the costs incurred for salaries and wages, goods, and other services used in the conduct of the institution’s operations. Not included is the acquisition cost of capital assets, such as equipment and library books, to the extent the assets are capitalized under the institution’s capitalization policy. …

Operation and maintenance of plant is no longer reported as a separate functional expense category. Instead these expenses are to be distributed among the other functional expense categories.

Page 39 (of PDF):

Capital Assets

Tangible or intangible assets that are capitalized under an institution’s capitalization policy; some of these assets are subject to depreciation and some are not. These assets consist of land and land improvements, buildings, building improvements, machinery, equipment, infrastructure, and all other assets that are used in operations and that have initial useful lives extending beyond one year. Capital assets also include collections of works of art and historical treasure and library collections; however under certain conditions such collections may not be capitalized. They also include property acquired under capital leases and intangible assets such as patents, copyrights, trademarks, goodwill, and software. Excluded are assets that are part of endowment funds or other capital fund investments in real estate.

[228] Dataset: “Table 3.16. Government Current Expenditures by Function.” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised October 18, 2022. <apps.bea.gov>

Line 32: “Education … Higher … 2021 [=] $232.1 [billions of dollars]”

[229] Email from the U.S. Bureau of Economic Analysis to Just Facts, June 19, 2015.

As of July 2013, research expenditures are included in the NIPAs [National Income and Product Accounts] as investment. Research conducted by public universities and funded by the federal government or private organizations (or non-profits) are represented as sales of a service for the state and local sector and investment by the federal or private sector. Research funded by a state or local government is recognized as state and local investment, most of it classified as own-account investment.

[230] Email from the U.S. Bureau of Economic Analysis to Just Facts, July 5, 2019.

“Line 32 for Higher education does not include spending for university hospitals. That is captured in the health function on line 28.”

[231] Email from the U.S. Bureau of Economic Analysis to Just Facts, June 15, 2015.

“Federal government outlays on loans, including student loans, are typically excluded from BEA’s [Bureau of Economic Analysis] estimates of federal expenditures. Unlike similar estimates—notably, estimates of spending on student loans in the federal budget—BEA excludes both the loan amounts and the subsidy costs of those loans from our estimates.”

[232] Calculated with data from:

a) Dataset: “Table 3.16. Government Current Expenditures by Function [Billions of Dollars].” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised October 18, 2022. <apps.bea.gov>

“Government … Education … Higher … 2021 [=] 232.1”

b) Dataset: “HH-1. Households by Type: 1940 to Present.” U.S. Census Bureau, Current Population Survey, November 2021. <www.census.gov>

“Year …. 2021r …. Total households [thousands] [=] 129,224”

CALCULATION: $232,100,000,000 / 129,224,000 = $1,796
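The per-household calculation above is a single division; a minimal sketch, using the figures quoted from the BEA and Census datasets:

```python
# Per-household share of government current expenditures on higher education,
# reproducing the calculation above:
total_spending = 232_100_000_000   # 2021 higher-education expenditures (BEA Table 3.16)
households = 129_224_000           # 2021 total U.S. households (Census HH-1)

per_household = total_spending / households
print(round(per_household))  # 1796
```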

[233] Calculated with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

b) Dataset: “Table 334.30. Total Expenditures of Private Nonprofit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

c) Dataset: “Table 334.50. Total Expenditures of Private For-Profit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

d) Dataset: “Table 3.16. Government Current Expenditures by Function.” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised October 18, 2022. <apps.bea.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Functions that contribute directly to the education of students and the general public include instruction, public service, and academic support. See the next footnote for definitions of these functions.
  • For private for-profit colleges, the National Center for Education Statistics does not segregate spending on: (a) public service from research and (b) academic support from institutional support and student services. To estimate spending on these individual functions for private for-profit colleges, Just Facts averaged the respective ratios on these functions from public and private non-profit colleges and applied these ratios to private for-profit colleges.
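The ratio-averaging method described in the last note above can be sketched as follows. This is a hedged illustration of the described technique, not Just Facts’ actual spreadsheet: the function name and all dollar figures and shares are hypothetical placeholders, not NCES data.

```python
# Sketch of the estimation method described above: NCES does not separate
# (for example) public service from research for private for-profit colleges,
# so the split is estimated by averaging the corresponding shares observed
# at public and at private nonprofit colleges and applying that average
# share to the for-profit combined total.

def estimate_split(combined_total: float, share_public_sector: float,
                   share_nonprofit_sector: float) -> tuple[float, float]:
    """Split a combined for-profit total using the average of the shares
    observed in the public and private nonprofit sectors."""
    avg_share = (share_public_sector + share_nonprofit_sector) / 2
    part_a = combined_total * avg_share
    return part_a, combined_total - part_a

# Suppose public service were 50% of the combined research-and-public-service
# category at public colleges and 25% at nonprofits (illustrative numbers only),
# and the for-profit combined total were $1,000:
public_service, research = estimate_split(1_000.0, 0.50, 0.25)
print(public_service, research)  # 375.0 625.0
```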

[234] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Instruction A functional expense category that includes expenses of the colleges, schools, departments, and other instructional divisions of the institution and expenses for departmental research and public service that are not separately budgeted. Includes general academic instruction, occupational and vocational instruction, community education, preparatory and adult basic education, and regular, special, and extension sessions. Also includes expenses for both credit and non-credit activities. Excludes expenses for academic administration where the primary function is administration (for example, academic deans). Information technology expenses related to instructional activities if the institution separately budgets and expenses information technology resources are included (otherwise these expenses are included in academic support). Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

Public service A functional expense category that includes expenses for activities established primarily to provide noninstructional services beneficial to individuals and groups external to the institution. Examples are conferences, institutes, general advisory service, reference bureaus, and similar services provided to particular sectors of the community. This function includes expenses for community services, cooperative extension services, and public broadcasting services. Also includes information technology expenses related to the public service activities if the institution separately budgets and expenses information technology resources (otherwise these expenses are included in academic support). Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

Academic support A functional expense category that includes expenses of activities and services that support the institution’s primary missions of instruction, research, and public service. It includes the retention, preservation, and display of educational materials (for example, libraries, museums, and galleries); organized activities that provide support services to the academic functions of the institution (such as a demonstration school associated with a college of education or veterinary and dental clinics if their primary purpose is to support the instructional program); media such as audiovisual services; academic administration (including academic deans but not department chairpersons); and formally organized and separately budgeted academic personnel development and course and curriculum development expenses. Also included are information technology expenses related to academic support activities; if an institution does not separately budget and expense information technology resources, the costs associated with the three primary programs will be applied to this function and the remainder to institutional support. Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

[235] Form: “2020–21 Survey Materials: Finance for Degree-Granting Public Institutions Using GASB [Governmental Accounting Standards Board] Reporting Standards.” National Center for Education Statistics, December 9, 2021. <nces.ed.gov>

Page 33 (of PDF):

Part C-1—Expenses and Other Deductions: Functional Classification

This part is intended to collect expenses by function. …

Include all operating expenses and nonoperating expenses and deductions. … Included are the costs incurred for salaries and wages, goods, and other services used in the conduct of the institution’s operations. Not included is the acquisition cost of capital assets, such as equipment and library books, to the extent the assets are capitalized under the institution’s capitalization policy. …

Operation and maintenance of plant is no longer reported as a separate functional expense category. Instead these expenses are to be distributed among the other functional expense categories. …

01 Instruction – Expenses of the colleges, schools, departments, and other instructional divisions of the institution and expenses for departmental research and public service that are not separately budgeted should be included in this classification. Include expenses for both credit and noncredit activities. Exclude expenses for academic administration where the primary function is administration (for example, academic deans); such expenses should be reported on line 05. The instruction category includes academic instruction, occupational and vocational instruction, community education, preparatory and adult basic education, and remedial and tutorial instruction conducted by the teaching faculty for the institution’s students. …

03 Public service – Report expenses for all activities budgeted specifically for public service and for activities established primarily to provide noninstructional services beneficial to groups external to the institution. Examples are seminars and projects provided to particular sectors of the community. Include expenditures for community services and cooperative extension services.

05 Academic support – This category includes expenses for the support services that are an integral part of the institution’s primary missions of instruction, research, and public service. Include expenses for museums, libraries, galleries, audio/visual services, ancillary support, academic administration, personnel development, and course and curriculum development. Include expenses for veterinary and dental clinics if their primary purpose is to support the institutional program.

Page 39 (of PDF):

Capital Assets

Tangible or intangible assets that are capitalized under an institution’s capitalization policy; some of these assets are subject to depreciation and some are not. These assets consist of land and land improvements, buildings, building improvements, machinery, equipment, infrastructure, and all other assets that are used in operations and that have initial useful lives extending beyond one year. Capital assets also include collections of works of art and historical treasure and library collections; however under certain conditions such collections may not be capitalized. They also include property acquired under capital leases and intangible assets such as patents, copyrights, trademarks, goodwill, and software. Excluded are assets that are part of endowment funds or other capital fund investments in real estate.

[236] “Glossary: Integrated Postsecondary Education Data System.” U.S. Department of Education, National Center for Education Statistics. Accessed June 14, 2023 at <surveys.nces.ed.gov>

Student services A functional expense category that includes expenses for admissions, registrar activities, and activities whose primary purpose is to contribute to students emotional and physical well-being and to their intellectual, cultural, and social development outside the context of the formal instructional program. Examples include student activities, cultural events, student newspapers, intramural athletics, student organizations, supplemental instruction outside the normal administration, and student records. Intercollegiate athletics and student health services may also be included except when operated as self-supporting auxiliary enterprises. Also may include information technology expenses related to student service activities if the institution separately budgets and expenses information technology resources (otherwise these expenses are included in institutional support.) Institutions include actual or allocated costs for operation and maintenance of plant, interest, and depreciation.

Institutional support A functional expense category that includes expenses for the day-to-day operational support of the institution. Includes expenses for general administrative services, central executive-level activities concerned with management and long range planning, legal and fiscal operations, space management, employee personnel and records, logistical services such as purchasing and printing, and public relations and development. Also includes information technology expenses related to institutional support activities. If an institution does not separately budget and expense information technology resources, the IT costs associated with student services and operation and maintenance of plant will also be applied to this function.

Auxiliary enterprises expenses Expenses for essentially self-supporting operations of the institution that exist to furnish a service to students, faculty, or staff, and that charge a fee that is directly related to, although not necessarily equal to, the cost of the service. Examples are residence halls, food services, student health services, intercollegiate athletics (only if essentially self-supporting), college unions, college stores, faculty and staff parking, and faculty housing. Institutions include actual or allocated costs for operation and maintenance of plant, interest and depreciation.

Auxiliary enterprises revenues Revenues generated by or collected from the auxiliary enterprise operations of the institution that exist to furnish a service to students, faculty, or staff, and that charge a fee that is directly related to, although not necessarily equal to, the cost of the service. Auxiliary enterprises are managed as essentially self-supporting activities. Examples are residence halls, food services, student health services, intercollegiate athletics, college unions, college stores, and movie theaters.

[237] Form: “2020–21 Survey Materials: Finance for Degree-Granting Public Institutions Using GASB [Governmental Accounting Standards Board] Reporting Standards.” National Center for Education Statistics, December 9, 2021. <nces.ed.gov>

Page 33 (of PDF):

Part C-1—Expenses and Other Deductions: Functional Classification

This part is intended to collect expenses by function. …

Include all operating expenses and nonoperating expenses and deductions. … Included are the costs incurred for salaries and wages, goods, and other services used in the conduct of the institution’s operations. Not included is the acquisition cost of capital assets, such as equipment and library books, to the extent the assets are capitalized under the institution’s capitalization policy. …

Operation and maintenance of plant is no longer reported as a separate functional expense category. Instead these expenses are to be distributed among the other functional expense categories. …

01 Instruction – Expenses of the colleges, schools, departments, and other instructional divisions of the institution and expenses for departmental research and public service that are not separately budgeted should be included in this classification. Include expenses for both credit and noncredit activities. Exclude expenses for academic administration where the primary function is administration (for example, academic deans); such expenses should be reported on line 05. The instruction category includes academic instruction, occupational and vocational instruction, community education, preparatory and adult basic education, and remedial and tutorial instruction conducted by the teaching faculty for the institution’s students. …

03 Public service – Report expenses for all activities budgeted specifically for public service and for activities established primarily to provide noninstructional services beneficial to groups external to the institution. Examples are seminars and projects provided to particular sectors of the community. Include expenditures for community services and cooperative extension services.

05 Academic support – This category includes expenses for the support services that are an integral part of the institution’s primary missions of instruction, research, and public service. Include expenses for museums, libraries, galleries, audio/visual services, ancillary support, academic administration, personnel development, and course and curriculum development. Include expenses for veterinary and dental clinics if their primary purpose is to support the institutional program.

Page 39 (of PDF):

Capital Assets

Tangible or intangible assets that are capitalized under an institution’s capitalization policy; some of these assets are subject to depreciation and some are not. These assets consist of land and land improvements, buildings, building improvements, machinery, equipment, infrastructure, and all other assets that are used in operations and that have initial useful lives extending beyond one year. Capital assets also include collections of works of art and historical treasure and library collections; however under certain conditions such collections may not be capitalized. They also include property acquired under capital leases and intangible assets such as patents, copyrights, trademarks, goodwill, and software. Excluded are assets that are part of endowment funds or other capital fund investments in real estate.

[238] Calculated with data from:

a) Dataset: “Table 3.16. Government Current Expenditures by Function [Billions of Dollars].” U.S. Department of Commerce, Bureau of Economic Analysis. Last revised October 18, 2022. <apps.bea.gov>

b) Dataset: “Table 303.10. Total Fall Enrollment in Degree-Granting Postsecondary Institutions, by Attendance Status, Sex of Student, and Control of Institution: Selected Years, 1947 Through 2031.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

c) Dataset: “CPI—All Urban Consumers (Current Series).” U.S. Department of Labor, Bureau of Labor Statistics. Accessed January 27, 2023 at <www.bls.gov>

“Series Id: CUUR0000SA0; Series Title: All Items in U.S. City Average, All Urban Consumers, Not Seasonally Adjusted; Area: U.S. City Average; Item: All Items; Base Period: 1982–84=100”

NOTE: An Excel file containing the data and calculations is available upon request.

[239] Email from the U.S. Bureau of Economic Analysis to Just Facts, June 19, 2015.

As of July 2013, research expenditures are included in the NIPAs [National Income and Product Accounts] as investment. Research conducted by public universities and funded by the federal government or private organizations (or non-profits) are represented as sales of a service for the state and local sector and investment by the federal or private sector. Research funded by a state or local government is recognized as state and local investment, most of it classified as own-account investment.

[240] Email from the U.S. Bureau of Economic Analysis to Just Facts, July 5, 2019.

“Line 32 for Higher education does not include spending for university hospitals. That is captured in the health function on line 28.”

[241] Email from the U.S. Bureau of Economic Analysis to Just Facts, June 15, 2015.

“Federal government outlays on loans, including student loans, are typically excluded from BEA’s [Bureau of Economic Analysis] estimates of federal expenditures. Unlike similar estimates—notably, estimates of spending on student loans in the federal budget—BEA excludes both the loan amounts and the subsidy costs of those loans from our estimates.”

[242] Email from the U.S. Department of Education, National Center for Education Statistics to Just Facts, June 10, 2015.

“In reference to table 330.10,† ‘tuition and fees and room and board rates charged’ refer to the published rates that do not include discounts or student financial aid in its calculation.”

NOTE: † Dataset: “Table 330.10. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Level and Control of Institution: Selected Years, 1963–64 Through 2012–13.” U.S. Department of Education, National Center for Education Statistics, March 2014. <nces.ed.gov>

[243] Calculated with the dataset: “Table 330.20. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Control and Level of Institution and State or Jurisdiction: 2019–20 and 2020–21 [in Current Dollars].” U.S. Department of Education, National Center for Education Statistics, February 2022. <nces.ed.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[244] Calculated with data from:

a) Dataset: “Table 330.10. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Level and Control of Institution: Selected Years, 1963–64 Through 2021–22.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

b) Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

c) Dataset: “Table 106.75. Education Price Indexes: Selected School Years, 1919–20 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, April 2022. <nces.ed.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[245] Calculated with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

“Total … Expenditure per full-time-equivalent student in constant 2021–22 dollars13 … 2-year … 2020–21 [=] $21,729”

b) Dataset: “Table 330.20. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Control and Level of Institution and State or Jurisdiction: 2019–20 and 2020–21 [in Current Dollars].” U.S. Department of Education, National Center for Education Statistics, February 2022. <nces.ed.gov>

“United States … Public 2-year, tuition and required fees … In-state, 2020–211 [=] $3,501”

CALCULATION: $21,729 / $3,501 = 6.2

NOTE: One of the two datasets above is adjusted for inflation, while the other is not. Because the inflation adjustment spans only a single year, it does not materially change the result.

[246] Calculated with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

“Total … Expenditure per full-time-equivalent student in constant 2020–21 dollars13 … 2-year … 2020–21 [=] $21,729”

b) Dataset: “Table 330.20. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Control and Level of Institution and State or Jurisdiction: 2019–20 and 2020–21 [in Current Dollars].” U.S. Department of Education, National Center for Education Statistics, February 2022. <nces.ed.gov>

“United States … Public 2-year, tuition and required fees … Out-of-state, 2020–21 [=] $8,256”

CALCULATION: $21,729 / $8,256 = 2.6

NOTE: One of the two datasets above is adjusted for inflation, while the other is not. Because the inflation adjustment spans only a single year, it does not materially change the result.

[247] Calculated with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

“Total … Expenditure per full-time-equivalent student in constant 2020–21 dollars13 … 4-year … 2020–21 [=] $52,896”

b) Dataset: “Table 330.20. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Control and Level of Institution and State or Jurisdiction: 2019–20 and 2020–21 [in Current Dollars].” U.S. Department of Education, National Center for Education Statistics, February 2022. <nces.ed.gov>

“United States … Public 4-year … In-state, 2020–21 … Total [=] $21,337”

CALCULATION: $52,896 / $21,337 = 2.5

NOTE: One of the two datasets above is adjusted for inflation, while the other is not. Because the inflation adjustment spans only a single year, it does not materially change the result.

[248] Calculated with data from:

a) Dataset: “Table 334.10. Total Expenditures of Public Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: 2009–10 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

“Total … Expenditure per full-time-equivalent student in constant 2020–21 dollars13 … 4-year … 2020–21 [=] $52,896”

b) Dataset: “Table 330.20. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Control and Level of Institution and State or Jurisdiction: 2019–20 and 2020–21 [in Current Dollars].” U.S. Department of Education, National Center for Education Statistics, February 2022. <nces.ed.gov>

“United States … Public 4-year … In-state, 2020–21 … Room [=] $6,774 … Board [=] $5,189 … Out-of-state tuition and required fees, 2020–21 [=] $27,091”

CALCULATION: $52,896 / ($6,774 + $5,189 + $27,091) = 1.4

NOTE: One of the two datasets above is adjusted for inflation, while the other is not. Because the inflation adjustment spans only a single year, it does not materially change the result.

[249] Calculated with data from:

a) Dataset: “Table 334.30. Total Expenditures of Private Nonprofit Degree-Granting Postsecondary Institutions, by Purpose and Level of Institution: Selected Years, 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

“Total … Expenditure per full-time-equivalent student in constant 2020–21 dollars12 … 4-year … 2020–21 [=] $69,145”

b) Dataset: “Table 330.20. Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Postsecondary Institutions, by Control and Level of Institution and State or Jurisdiction: 2019–20 and 2020–21 [in Current Dollars].” U.S. Department of Education, National Center for Education Statistics, February 2022. <nces.ed.gov>

“United States … Private 4-year … 2020–21 … Total [=] $46,313”

CALCULATION: $69,145 / $46,313 = 1.5

NOTE: One of the two datasets above is adjusted for inflation, while the other is not. Because the inflation adjustment spans only a single year, it does not materially change the result.
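
The ratio computations in notes [245]–[249] can be reproduced with a short script. The dollar figures are those quoted from the NCES tables above; the labels are merely illustrative:

```python
# Expenditure per full-time-equivalent student divided by the published
# price charged to students, per notes [245]-[249]. All figures are
# quoted from NCES Tables 334.10, 334.30, and 330.20 (2020-21).
cases = {
    "public 2-year vs. in-state tuition & fees":      (21_729, 3_501),
    "public 2-year vs. out-of-state tuition & fees":  (21_729, 8_256),
    "public 4-year vs. in-state total charges":       (52_896, 21_337),
    "public 4-year vs. out-of-state + room & board":  (52_896, 6_774 + 5_189 + 27_091),
    "private nonprofit 4-year vs. total charges":     (69_145, 46_313),
}
for label, (spending, price) in cases.items():
    print(f"{label}: {spending / price:.1f}")
```

The printed ratios match those given in the calculations above (6.2, 2.6, 2.5, 1.4, and 1.5).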

[250] Report: “Billions of Dollars in Potentially Erroneous Education Credits Continue to Be Claimed for Ineligible Students and Institutions.” Treasury Inspector General for Tax Administration, March 27, 2015. <www.tigta.gov>

Page 1:

Education tax credits help taxpayers offset the costs of higher education and have become an increasingly important component of Federal higher education policy. The amount of education credits individuals claim each year has increased from more than $3 billion for Tax Year 1998 to almost $19 billion for Tax Year 2012. …

The Taxpayer Relief Act of 19973 created two permanent education tax credits, the Hope Credit and the Lifetime Learning Credit. The American Recovery and Reinvestment Act of 20094 temporarily replaced the Hope Credit with a refundable tax credit5 known as the American Opportunity Tax Credit (AOTC). The AOTC was initially set to expire at the end of Calendar Year 2010 but has since been extended through Calendar Year 2017.

Page 20:

To accomplish our objective, we … identified 12,214,137 taxpayers2 on the IRS Individual Return Transaction File3 who claimed education credits for 13,351,478 students on Tax Year4 2012 returns. We verified the accuracy and reliability of the data obtained by comparing 30 tax returns to return information found on the Integrated Data Retrieval System.5 The data were determined to be sufficiently reliable for the purposes of the audit.

[251] Report: “The Alternative Minimum Tax for Individuals: A Growing Burden.” By Kurt Schuler. U.S. Congress, Joint Economic Committee, May 2001. <www.jec.senate.gov>

Page 3: “A tax credit is a provision that allows a reduction in tax liability by a specific dollar amount, regardless of income. For example, a tax credit of $500 allows both taxpayers with income of $40,000 and those with income of $80,000 to reduce their taxes by $500, if they qualify for the credit.”

[252] Report: “Overview of the Federal Tax System.” By Molly F. Sherlock and Donald J. Marples. Congressional Research Service, November 21, 2014. <www.fas.org>

Page 7: “If a tax credit is refundable, and the credit amount exceeds tax liability, a taxpayer receives a payment from the government.”

[253] Report: “Options for Reducing the Deficit: 2015 to 2024.” Congressional Budget Office, November 20, 2014. <www.cbo.gov>

Page 38: “Low- and moderate-income people are eligible for certain refundable tax credits under the individual income tax if they meet specified criteria. If the amount of a refundable tax credit exceeds a taxpayer’s tax liability before that credit is applied, the government pays the excess to that person.”
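
The refundable/nonrefundable distinction described in notes [251]–[253] amounts to a simple computation. A minimal sketch (the function name and dollar figures are illustrative, not from the sources):

```python
def apply_credit(liability: float, credit: float, refundable: bool) -> tuple[float, float]:
    """Return (tax_owed, payment_from_government) after applying a credit."""
    tax_owed = max(liability - credit, 0.0)
    # Only a refundable credit pays out the excess over tax liability.
    refund = max(credit - liability, 0.0) if refundable else 0.0
    return tax_owed, refund

# A $500 credit against a $300 liability:
print(apply_credit(300.0, 500.0, refundable=True))   # (0.0, 200.0)
print(apply_credit(300.0, 500.0, refundable=False))  # (0.0, 0.0)
```

Either way the tax owed drops to zero, but only the refundable credit produces a $200 payment from the government.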

[254] Report: “Existing Compliance Processes Will Not Reduce the Billions of Dollars in Improper Earned Income Tax Credit and Additional Child Tax Credit Payments.” Treasury Inspector General for Tax Administration, September 29, 2014. <www.oversight.gov>

Page 15:

The Internal Revenue Code requires the IRS to process tax returns and pay any related tax refunds within 45 calendar days of receipt of the tax return or the tax return due date, whichever is later. Because of this requirement, the IRS cannot conduct extensive eligibility checks similar to those that occur with other Federal programs that typically certify eligibility prior to the issuance of payments or benefits.

[255] Report: “Individuals Who Are Not Authorized to Work in the United States Were Paid $4.2 Billion in Refundable Credits.” Treasury Inspector General for Tax Administration, July 7, 2011. <www.justfacts.com>

Page 2:

Two of the largest refundable tax credits are the EITC [Earned Income Tax Credit] and the ACTC [Additional Child Tax Credit]. …

The ACTC is the refundable portion of the Child Tax Credit (CTC). The CTC can reduce an individual’s taxes owed by as much as $1,000 for each qualifying child. The ACTC is provided in addition to the CTC to individuals whose taxes owed were less than the amount of CTC they were entitled to claim. The ACTC is always the refundable portion of the CTC, which means an individual claiming the ACTC receives a refund even if no income tax was withheld or paid. As with all refundable credits, the risk of fraud for these types of claims is significant.

[256] Report: “Billions of Dollars in Potentially Erroneous Education Credits Continue to Be Claimed for Ineligible Students and Institutions.” Treasury Inspector General for Tax Administration, March 27, 2015. <www.tigta.gov>

Page 3:

Prior TIGTA [Treasury Inspector General for Tax Administration] Report Raised Concerns with IRS Efforts to Identify and Prevent Erroneous Education Credit Claims

In September 2011,6 we reported that the IRS does not have effective processes to identify taxpayers who claim erroneous education credits; 2.1 million taxpayers received a total of $3.2 billion in education credits ($1.6 billion in refundable credits and $1.6 billion in nonrefundable credits) that appeared to be erroneous. …

This review was performed with information obtained from the Wage and Investment Division office in Atlanta, Georgia, and the Small Business/Self-Employed Division office in Lanham, Maryland, during the period November 2013 through November 2014. We conducted this performance audit in accordance with generally accepted government auditing standards. …

Page 5:

The IRS does not have effective processes to identify erroneous claims for education credits. Although the IRS has taken steps to address some of our recommendations to improve the identification and prevention of erroneous education credit claims, many of the deficiencies we previously identified still exist. Based on our analysis of education credits claimed and received on Tax Year 2012 tax returns, we estimate more than 3.6 million taxpayers (claiming more than 3.8 million students) received more than $5.6 billion ($2.5 billion in refundable credits and $3.1 billion in nonrefundable credits) in potentially erroneous education credits.

Pages 7–8:

Our analysis of Tax Year 2012 tax returns identified the following education credit claims that appear to be erroneous based on IRS records:

• More than 2 million taxpayers (claiming more than 2.2 million students) who received education credits totaling more than $3.2 billion with no Form 1098-T filed with the IRS by a postsecondary educational institution for the student claimed.16

• More than 1.6 million taxpayers (claiming nearly 1.7 million students) who received education credits totaling approximately $2.5 billion for which Department of Education data show the educational institution listed on the Form 8863 was not certified to receive Federal student aid funding, i.e. not an eligible educational institution. To qualify for an education credit, students must attend a postsecondary educational institution that is certified by the Department of Education to receive Federal student aid funding.

• 427,345 taxpayers (claiming 431,622 students) who received AOTCs [American Opportunity Tax Credit] totaling approximately $662 million for students who, based on Forms 1098-T, did not attend school at least half-time as required.17 Students must attend an eligible institution at least half-time to qualify for the AOTC.

Further analysis of the more than 3.6 million taxpayers we identified showed that 765,943 (21 percent) claimed both a student for which the IRS received no Form 1098-T and listed an ineligible institution on their Form 8863. Figure 6 shows the results of our analysis of taxpayers who received education credits for students with no Form 1098-T or who attended an ineligible institution.

Pages 12–13:

Erroneous American Opportunity Tax Credits Are Being Allowed for Students Claimed for More Than Four Years

Analysis of tax return filings between Tax Years 2007 and 2013 identified 1.1 million students for which the AOTC was claimed for more than four years. Our review of a statistically valid sample of 139 of the 1.1 million students identified that 130 (94 percent) students were erroneously claimed for the AOTC.23 Based on the results of our sample, we estimate that more than 1 million (94 percent) of the more than 1.1 million students we identified were used to receive potentially erroneous AOTCs totaling nearly $1.7 billion.24 Specifically, each of these students were claimed in excess of the four-year limit between Tax Years 2007 and 2013. For Tax Year 2012 alone, we estimate that 419,827 students who had already been claimed in four previous tax years were used to receive potentially erroneous AOTCs totaling more than $650 million.

Page 15:

Potentially Erroneous Education Credits Are Being Received for Students Who Are Incarcerated or of Unlikely Ages to Be Eligible

The IRS has yet to establish effective processes to identify taxpayers who claim potentially erroneous education credits for students who are of an unlikely age to pursue postsecondary education or who are incarcerated. Our review identified:

• 39,763 taxpayers who received more than $61.5 million in potentially erroneous education credits as of December 31, 2013, for 43,800 students who were under age 14 or over age 65. For each of these students, the IRS did not receive a Form 1098-T for the student being claimed.

• 2,148 taxpayers who received potentially erroneous education credits totaling approximately $3.9 million for students who were incarcerated for all of Calendar Year 2012. For each of these students, the IRS did not receive a Form 1098-T for the student being claimed.

Education credit requirements do not require a student to be a specific age to qualify for an education credit, nor do they specify that a student must not be incarcerated. However, both conditions call into question the validity of a taxpayer’s education credit claim. For example, students under the age of 14 or over the age of 65 are not likely to be attending a postsecondary educational institution. In addition, individuals who are incarcerated for a full calendar year are unlikely to meet the requirement that they be a taxpayer’s dependent or incur qualifying educational expenses at an eligible educational institution.

Page 20:

Appendix I Detailed Objective, Scope, and Methodology

Our overall objective was to assess the IRS’s efforts to improve the detection and prevention of questionable education credit claims. We conducted follow-up testing to evaluate the effectiveness of the IRS’s actions to address recommendations made in five prior audit reports.1 To accomplish our objective, we: …

B. Identified 12,214,137 taxpayers2 on the IRS Individual Return Transaction File3 who claimed education credits for 13,351,478 students on Tax Year4 2012 returns. We verified the accuracy and reliability of the data obtained by comparing 30 tax returns to return information found on the Integrated Data Retrieval System.5 The data were determined to be sufficiently reliable for the purposes of the audit.

C. Identified 1,167,119 students for which the AOTC was claimed for more than four tax years between Tax Year 2007 and Tax Year 2013.

Page 25: “Figure 1: Computation of the Average Refundable and Nonrefundable Education Credit Received Per Tax Return … Average Credit Per Tax Return [=] $1,548.90”

[257] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 6:

History of the Student Lending Program

• Student loans are used to finance post-secondary education, which is typically targeted for undergraduate and postgraduate education but also can include eligible vocational or trade schools.

• The U.S. government began offering Federal financing for Institutions of Higher Education (IHE) in 1965 with Title IV of the Higher Education Act (HEA).

[258] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 36:

Types of U.S. Federal Student Loans

Direct, Subsidized Loans. Loan is directly administered by the Federal government and offered only to undergraduate students based on financial need. Interest does not accumulate while the borrower remains in school. The interest rate (2014–15) is 4.66% and the maximum loan balance is $23,000.

Direct, Unsubsidized Loans. Loan is directly administered by the Federal government and offered to both undergraduate and graduate students regardless of need. Interest accumulates while the borrower remains in school. The 2014–15 interest rate for undergraduates is 4.66% and for graduate students is 6.21%. For undergraduate students, the maximum loan balance is $31,000 for dependent students (i.e. supported by parents) and the maximum combined balance of subsidized and unsubsidized Federal loans is $57,500 for independent students. Graduate and professional students have a hard cap of $138,500 balance.

Direct PLUS [Parent Loans for Undergraduate Students] Loans. Loan is directly administered by the Federal government and offered to graduate students and the parents of undergraduate students up to the cost of tuition and living expenses, at an interest rate (2014–15) of 7.21%.

Perkins Loans. Loan is administered by the IHE [Institution of Higher Education] /university. Interest does not accumulate while the borrower remains in school. The 2014–15 rate is 5.0%. The aggregate limit is $27,500 for undergraduate and $60,000 for graduate students (inclusive of the $27,500 as an undergraduate).

Page 37:

Terms of U.S. Federal Student Loans

• Pay rate – Interest rates are fixed over the life of the loan, but are based upon the UST10y rate and a fixed spread – 205bp for undergraduate loans, 360bp for graduate loans and 460bp for PLUS loans. The rates are capped at 8.25% (undergraduate), 9.50% (graduate) and 10.50% (PLUS).

• Maturity – The maturity of student loans is typically 10y but can extend to 25y.

• Repayment – For the most part, Federal student loans are similar to auto loans, with a fixed monthly payment of principal and interest over a ten year term.
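
The pricing and repayment terms described above can be sketched in code. This is a simplified illustration, not an official implementation: the spreads, caps, and ten-year term are from the report, while the Treasury-yield input is a hypothetical value chosen so the undergraduate result matches the 4.66% rate quoted above:

```python
def federal_loan_rate(ust10y: float, loan_type: str) -> float:
    """Fixed rate = 10-year Treasury yield + spread, subject to a cap.
    Spreads (205bp/360bp/460bp) and caps are those quoted in the report."""
    spread = {"undergraduate": 2.05, "graduate": 3.60, "plus": 4.60}[loan_type]
    cap = {"undergraduate": 8.25, "graduate": 9.50, "plus": 10.50}[loan_type]
    return min(ust10y + spread, cap)

def monthly_payment(principal: float, annual_rate_pct: float, years: int = 10) -> float:
    """Standard fixed-payment amortization over the loan term."""
    r = annual_rate_pct / 100 / 12      # monthly rate
    n = years * 12                      # number of payments
    return principal * r / (1 - (1 + r) ** -n)

# A hypothetical 2.61% Treasury yield reproduces the 2014-15
# undergraduate rate of 4.66%; the $10,000 balance is illustrative.
rate = federal_loan_rate(2.61, "undergraduate")
print(round(rate, 2), round(monthly_payment(10_000.0, rate), 2))
```

When the yield plus spread exceeds the cap, the cap binds, e.g. a 10% yield yields the 10.50% PLUS-loan maximum.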

[259] Article: “The New Math of Student Loans.” By AnnaMaria Andriotis. Wall Street Journal, June 12, 2015. <www.wsj.com>

[For] undergraduate students with creditworthy parents and graduate students with high credit scores—student loans from private lenders … could help them save thousands of dollars over the life of a loan. …

[Federal student loans don’t] reward parents who have high credit scores and are financially comfortable, because every applicant who gets approved for a federal loan—no matter what their credit score is—ends up with the same interest rate.

By contrast, private loan rates are determined based largely on the applicant or parent’s credit as well as documentation of their income. …

… Beginning July 1, interest rates will be 4.29% for [federal] Stafford loans for undergraduates, down from 4.66%. The rate on the federal PLUS [Parent Loans for Undergraduate Students] loan will be 6.84%, down from 7.21%. …

SunTrust Banks, for example, currently is charging fixed interest rates between 4% and 10.5% on its loans, which range from seven to 15 years. …

Citizens Bank, a unit of Citizens Financial Group, charges interest rates of as little as 5.75% for undergraduate fixed-rate loans.

[260] “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 3: “Total Debt Balance and Its Composition” <www.newyorkfed.org>

“23:Q1 Debt Balance, Trillions of $ … Mortgage [=] 12.04 … Student Loan [=] 1.60 … Auto Loan [=] 1.56 … Credit Card [=] 0.99 … Other [=] 0.51 … HE [Home Equity] Revolving [=] 0.34 … Total [=] 17.05”

Page 42:

Loan types. In our analysis we distinguish between the following types of accounts: mortgage accounts, home equity revolving accounts, auto loans and leases, bank card accounts, student loans and other loan accounts. Mortgage accounts include all mortgage installment loans, including first mortgages and home equity installment loans (HEL), both of which are closed-end loans. Home Equity Revolving accounts (aka Home Equity Line of Credit or HELOC), unlike home equity installment loans, are home equity loans with a revolving line of credit where the borrower can choose when and how often to borrow up to an updated credit limit. Auto Loans are loans taken out to purchase a car, including leases, provided by automobile dealers and automobile financing companies. Bankcard accounts (or credit card accounts) are revolving accounts for banks, bankcard companies, national credit card companies, credit unions and savings & loan associations. Student Loans include loans to finance educational expenses provided by banks, credit unions and other financial institutions as well as federal and state governments. The Other category includes Consumer Finance (sales financing, personal loans) and Retail (clothing, grocery, department stores, home furnishings, gas etc) loans.

Our analysis excludes authorized user trades, disputed trades, lost/stolen trades, medical trades, child/family support trades, commercial trades and, as discussed above, inactive trades (accounts not reported on within the last 3 months).

[261] Calculated with data from:

a) Dataset: “Federal Student Loan Portfolio by Loan Status.” National Student Loan Data System. Accessed June 21, 2023 at <studentaid.gov>

Tab: “Federally Managed”

b) “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 2 (of PDF): “Outstanding student loan debt stood at $1.604 trillion in 2023Q1.”

NOTE: An Excel file containing the data and calculations is available upon request.

[262] Calculated with data from:

a) Dataset: “Federal Student Loan Portfolio by Loan Status.” National Student Loan Data System. Accessed June 21, 2023 at <studentaid.gov>

Tab: “Federally Managed”

b) “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 2 (of PDF): “Outstanding student loan debt stood at $1.604 trillion in 2023Q1.”

NOTE: An Excel file containing the data and calculations is available upon request.

[263] Public Law 116-136: “Coronavirus Aid, Relief, and Economic Security Act.” 116th U.S. Congress. Signed into Law by Donald J. Trump on March 27, 2020. <www.congress.gov>

Title III, Part IV, Subtitle B, Section 3513:

Temporary Relief for Federal Student Loan Borrowers.

(a) In General.—The Secretary shall suspend all payments due for loans made under part D and part B (that are held by the Department of Education) of title IV of the Higher Education Act of 1965 … through September 30, 2020.

(b) No Accrual of Interest.—Notwithstanding any other provision of the Higher Education Act of 1965 … interest shall not accrue on a loan described under subsection (a) for which payment was suspended for the period of the suspension.

(c) Consideration of Payments.—Notwithstanding any other provision of the Higher Education Act of 1965 … the Secretary shall deem each month for which a loan payment was suspended under this section as if the borrower of the loan had made a payment for the purpose of any loan forgiveness program or loan rehabilitation program authorized under part D or B of title IV of the Higher Education Act of 1965 … for which the borrower would have otherwise qualified.

(d) Reporting to Consumer Reporting Agencies.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall ensure that, for the purpose of reporting information about the loan to a consumer reporting agency, any payment that has been suspended is treated as if it were a regularly scheduled payment made by a borrower.

(e) Suspending Involuntary Collection.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall suspend all involuntary collection related to the loan, including—

(1) a wage garnishment authorized under section 488A of the Higher Education Act of 1965 … or section 3720D of title 31, United States Code;

(2) a reduction of tax refund by amount of debt authorized under section 3720A of title 31, United States Code, or section 6402(d) of the Internal Revenue Code of 1986;

(3) a reduction of any other Federal benefit payment by administrative offset authorized under section 3716 of title 31, United States Code (including a benefit payment due to an individual under the Social Security Act or any other provision described in subsection (c)(3)(A)(i) of such section); and

(4) any other involuntary collection activity by the Secretary.

[264] “Quarterly Report on Household Debt and Credit, 2021:Q4.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, February 2022. <www.newyorkfed.org>

Page 2 (of PDF):

Aggregate delinquency rates were flat in the fourth quarter of 2021 but remain very low, after declining sharply through the beginning of the pandemic. The fourth quarter saw a continued decline in late delinquency offset by a small increase in the share of earlier delinquency. The low delinquency rates have reflected forbearances (provided by both the CARES [Coronavirus Aid, Relief, and Economic Security] Act and voluntarily offered by lenders), which protect borrowers’ credit records from the reporting of skipped or deferred payments but are now winding down. …

Delinquency rates by product continued to decline, and new transitions into delinquency mostly declined across the board. The share of student loans that are reported as delinquent remains very low, as the majority of outstanding federal student loans are covered by CARES Act administrative forbearances, which have been extended through May 2022.

[265] Press release: “Biden-Harris Administration Announces Final Student Loan Pause Extension Through December 31 and Targeted Debt Cancellation to Smooth Transition to Repayment.” U.S. Department of Education, August 24, 2022. <www.ed.gov>

“Today, the U.S. Department of Education (Department) announced a final extension of the pause on student loan repayment, interest, and collections through December 31, 2022. Borrowers should plan to resume payments in January 2023.”

[266] Press release: “Biden-Harris Administration Continues Fight for Student Debt Relief for Millions of Borrowers, Extends Student Loan Repayment Pause.” U.S. Department of Education, November 22, 2022. <www.ed.gov>

Today, the U.S. Department of Education announced an extension of the pause on student loan repayment, interest, and collections. The extension will alleviate uncertainty for borrowers as the Biden-Harris Administration asks the Supreme Court to review the lower-court orders that are preventing the Department from providing debt relief for tens of millions of Americans. Payments will resume 60 days after the Department is permitted to implement the program or the litigation is resolved, which will give the Supreme Court an opportunity to resolve the case during its current Term. If the program has not been implemented and the litigation has not been resolved by June 30, 2023—payments will resume 60 days after that.

[267] “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 2 (of PDF): “Less than 1% of aggregate student debt was 90+ days delinquent or in default in 2023Q1, a small decline from the previous quarter. Delinquency rates fell substantially in the previous quarter due to the implementation of the Fresh Start program, which made previously defaulted loan balances current.”

[268] Dataset: “Federally Managed Portfolio by Loan Status.” U.S. Department of Education, National Student Loan Data System. Accessed June 21, 2023 at <studentaid.gov>

Tab: “Loan Status Definitions”

Please Note: Recipient counts are based at the loan level. As a result, recipients may be counted multiple times across varying loan statuses. …

In-School – Includes loans that have never entered repayment as a result of the borrower’s enrollment in school.

Grace – Includes loans that have entered a six-month grace period after the borrower is no longer enrolled in school at least half-time. Borrowers are not expected to make payments during grace.

Repayment – Includes loans that are in an active repayment status.

Deferment – Includes loans in which payments have been postponed as a result of certain circumstances such as returning to school, military service, or economic hardship.

Forbearance – Includes loans in which payments have been temporarily suspended or reduced as a result of certain types of financial hardships.

Cumulative in Default – Includes loans that are more than 360 days delinquent.

Other – Includes loans that are in non-defaulted bankruptcy and in a disability status.

[269] “Quarterly Report on Household Debt and Credit, 2021:Q3.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, November 2021. <www.newyorkfed.org>

Page 2 (of PDF):

Aggregate delinquency rates have remained low and declining since the beginning of the pandemic, reflecting an uptake in forbearances (provided by both the CARES [Coronavirus Aid, Relief, and Economic Security] Act and voluntarily offered by lenders), which protect borrowers’ credit records from the reporting of skipped or deferred payments. …

Delinquency rates by product continued to decline, and new transitions into delinquency mostly declined across the board, continuing to reflect ongoing participation in various borrower assistance programs. The share of student loans that are reported as delinquent remains very low, as the majority of outstanding federal student loans are covered by CARES Act administrative forbearances. Auto loans and credit cards also showed continued declines in their delinquency transition rates, reflecting the persistent impact of government stimulus programs and bank-offered forbearance options for troubled borrowers.

[270] “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 2 (of PDF): “Less than 1% of aggregate student debt was 90+ days delinquent or in default in 2023Q1, a small decline from the previous quarter. Delinquency rates fell substantially in the previous quarter due to the implementation of the Fresh Start program, which made previously defaulted loan balances current.”

[271] Calculated with data from:

a) Dataset: “Federal Student Loan Portfolio by Loan Status.” National Student Loan Data System. Accessed June 21, 2023 at <studentaid.gov>

Tab: “Federally Managed”

b) “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 2 (of PDF): “Outstanding student loan debt stood at $1.604 trillion in 2023Q1.”

NOTE: An Excel file containing the data and calculations is available upon request.
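The calculation in note [271] combines the Department of Education's portfolio breakdown with the New York Fed's $1.604 trillion total. A minimal sketch of that kind of calculation, assuming a hypothetical status balance (the total is from the NY Fed report quoted above; the $160.4 B example figure is a placeholder, not a number from the source):

```python
# Sketch of the kind of calculation behind note [271]: express the balance
# in a given federal loan status as a share of total outstanding student
# debt. The total ($1.604 trillion) is from the NY Fed 2023:Q1 report; the
# example status balance is a hypothetical placeholder.

TOTAL_OUTSTANDING_B = 1604.0  # $1.604 trillion, expressed in billions

def share_of_total(status_balance_b, total_b=TOTAL_OUTSTANDING_B):
    """Return the percentage of total outstanding student debt that the
    given status balance (in billions of dollars) represents."""
    return 100.0 * status_balance_b / total_b

# e.g. a hypothetical $160.4 B in one loan status would be 10% of the total:
print(round(share_of_total(160.4), 1))  # → 10.0
```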

[272] “Fiscal Year 2014 Financial Report of the United States Government.” U.S. Department of the Treasury, February 26, 2015. <fiscal.treasury.gov>

Page 71:

For those unable to afford credit at the market rate, federal credit programs provide subsidies in the form of direct loans offered at an interest rate lower than the market rate. For those to whom non-federal financial institutions are reluctant to grant credit because of the high risk involved, federal credit programs guarantee the payment of these non-federal loans and absorb the cost of defaults.

Page 72:

The majority of the loan programs are provided by Education, HUD [Department of Housing and Urban Development], USDA [Department of Agriculture], Treasury, Small Business Administration (SBA), VA [Veterans Affairs], Export-Import Bank and United States Agency for International Development (USAID). For significant detailed information regarding the direct and guaranteed loan programs listed in the tables above, please refer to the financial statements of the agencies.

[273] Report: “Fair-Value Accounting for Federal Credit Programs.” U.S. Congressional Budget Office, March 2012. <www.cbo.gov>

Page 2:

Market risk is the component of financial risk that remains even after investors have diversified their portfolios as much as possible; it arises from shifts in macroeconomic conditions, such as productivity and employment, and from changes in expectations about future macroeconomic conditions. Loans and loan guarantees expose the government to market risk because future repayments of loans tend to be lower when the economy as a whole is performing poorly and resources are more highly valued.

Page 7: “When the government extends credit, the associated market risk of those obligations is effectively passed along to citizens who, as investors, would view that risk as costly.”

[274] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 4:

A key concern is that students are taking on student loans because historically an education has been correlated with economic mobility; however, today an average of 40% of students at four-year institutions (and 68% of students in for-profit institutions) do not graduate within six years,1 which means they most likely do not benefit from the income upside from a higher degree yet have the burden of student debt. …

1 National Center for Education Statistics. Based on graduation rates of Bachelor’s Degree-Seeking Students at 4-Year Postsecondary Institutions (cohort entry year: 2006).

Page 21:

Failure-to-graduate remains the most deadly of traps for higher education. As shown in the following Graphs 8, 9 and 10, the marginal benefit of higher education is clear in terms of lifetime earnings and better employment stability. Failure-to-graduate combined with leverage is a poor mix: the debt burden remains but very little of the economic benefits accrue. Whatever the reasons that the student failed-to-graduate, he or she is left with all of the downside and limited upside.

[275] Curriculum Vitae: “Deborah J. Lucas.” MIT Sloan School of Management, February 2014. <dlucas.scripts.mit.edu>

Sloan Distinguished Professor of Finance, Sloan School of Management, 2011–present

Assistant Director, Financial Analysis Division, Congressional Budget Office 2010–2011

Associate Director of Financial Studies, Congressional Budget Office 2009–2010

Professor of Finance, Sloan School of Management, 2009–2011 (on leave)

Donald C. Clark HSBC Professor of Consumer Finance, Department of Finance, Kellogg School of Management, Northwestern University, 1996–2009.

Chief Economist, Congressional Budget Office, 2000–2001.

Member, Social Security Technical Advisory Panel, 1999–2000, and 2006–2007.

Chairman, Department of Finance, Kellogg School of Management, 1996–1998.

John L. and Helen Kellogg Distinguished Associate Professor, Department of Finance, Kellogg School of Management, Northwestern University. 1992–1996.

NBER Research Associate, 1998–present.

NBER Faculty Research Fellow, 1992–1998.

Senior Staff Economist, Council of Economic Advisers, Washington, D.C., 1992–1993.

Assistant Professor, Department of Finance, J.L. Kellogg School of Management, Northwestern University, 1985–1992.

Visiting Assistant Professor, Department of Finance, Sloan School of Management, Massachusetts Institute of Technology, 1990–1991.

[276] Book: Public Economics in the United States: How the Federal Government Analyzes and Influences the Economy. Edited by Steven Payson. ABC-CLIO, 2014.

Chapter 15: “Federal Credit Programs.” By Deborah Lucas (Director, MIT Center for Finance and Policy). Pages 375–398.

Page 394:

The government delivers subsidies in a variety of forms, for example, using cash grants, or in-kind assistance such as free vaccinations. Government credit that is offered at a below-market price to beneficiaries similarly provides a subsidy. …

Government credit programs may have adverse consequences that must be weighed against their expected benefits. One concern is that credit subsidies will distort the allocation of capital in the economy and crowd out productive investments by households and firms. For example, subsidizing mortgage guarantees increases the demand for housing, causing more savings to be invested in residential construction. That leaves fewer resources available for other investment activities, and puts upward pressure on the interest rates facing all borrowers.

Easier access to credit markets is not always advantageous to program participants. Unsophisticated borrowers, such as some college students and first-time homebuyers, may not be fully aware of the costs and risks associated with accumulating high debt loads. Consumer protection and disclosure laws usually do not extend to the government, and there is the possibility that it will inadvertently offer poorly designed products that can harm consumers. …

A well-understood consequence of government credit provision is that it tends to create incentives for greater risk taking, particularly when a borrower becomes financially distressed. The reason is that a debtor with a guaranteed debt benefits from the upside if a gamble pays off, whereas the government shares in the losses if the gamble fails. (The effect is less pronounced for loans obtained privately because financial institutions charge interest rates that increase with risk, which discourages excessive risk taking.)

[277] “Remarks at Southwest Texas State College Upon Signing the Higher Education Act of 1965.” By Lyndon B. Johnson, November 8, 1965. <www.presidency.ucsb.edu>

In a very few moments, I will put my signature on the Higher Education Act of 1965. The President’s signature upon this legislation passed by this Congress will swing open a new door for the young people of America. For them, and for this entire land of ours, it is the most important door that will ever open—the door to education. …

This bill is only one of more than two dozen education measures enacted by the first session of the 89th Congress. And history will forever record that this session—the first session of the 89th Congress—did more for the wonderful cause of education in America than all the previous 176 regular sessions of Congress did, put together.

[278] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 6:

History of the Student Lending Program

• Student loans are used to finance post-secondary education, which is typically targeted for undergraduate and postgraduate education but also can include eligible vocational or trade schools.

• The U.S. government began offering Federal financing for Institutions of Higher Education (IHE) in 1965 with Title IV of the Higher Education Act (HEA).

[279] “Fiscal Year 2014 Financial Report of the United States Government.” U.S. Department of the Treasury, February 26, 2015. <fiscal.treasury.gov>

Page 71: “For those to whom non-federal financial institutions are reluctant to grant credit because of the high risk involved, federal credit programs guarantee the payment of these non-federal loans and absorb the cost of defaults.”

Page 72: “[T]he Federal Family Education Loan (FFEL) Program … was established in fiscal year 1965, and is a guaranteed loan program.”

[280] Report: “Federal Family Education Loan Program’s Financial Statements for Fiscal Years 1993 and 1992.” U.S. General Accounting Office, June 1994. <www.gao.gov>

Page 65:

On August 10, 1993, President Clinton signed the Omnibus Budget Reconciliation Act of 1993 (P.L. 103-66). A portion of that Act entitled “The Student Loan Reform Act of 1993” requires the phase-in of federal direct student lending. Direct student lending, as a percentage of new student loan volume will be phased in over five years as follows:

Academic Year   Percent
1994–95         5%
1995–96         40%
1996–97         at least 50%
1997–98         at least 50%
1998–99         at least 60%

The Student Loan Reform Act of 1993 ensures adequate financing for the current guaranty agencies during the transition and provides for alternative mechanisms to assure loan guarantees in the event that any of the guaranty agencies do not continue to operate. The implementation plans for the new direct loan program provide for Education’s cost of transitioning outstanding guaranteed loans, therefore no provision for such cost has been included in the principal statements.

[281] Calculated with data from:

a) Vote 406: “Omnibus Budget Reconciliation Act of 1993.” U.S. House of Representatives, August 5, 1993. <clerk.house.gov>

b) Vote 247: “Omnibus Budget Reconciliation Act of 1993.” U.S. Senate, August 6, 1993. <www.senate.gov>

Party         Voted “Yes”        Voted “No”         Voted “Present” or Did Not Vote †
              Number   Portion   Number   Portion   Number   Portion
Republican    0        0%        219      100%      0        0%
Democrat      267      85%       47       15%       0        0%
Independent   1        100%      0        0%        0        0%

NOTE: † Voting “Present” is effectively the same as not voting.

[282] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 3: “[T]he Student Aid and Fiscal Responsibility Act (SAFRA) of 2010 ceased the origination of federal student loans by private lenders and as of July 1, 2010, all federal student loans are made directly by the Department of Education and funded by the U.S. Treasury Department.”

[283] “Fiscal Year 2012 Financial Report of the United States Government.” U.S. Department of the Treasury, January 17, 2013. <fiscal.treasury.gov>

Page 69: “The Student Aid and Fiscal Responsibility Act (SAFRA), which was enacted as part of the Health Care Education and Reconciliation Act of 2010 (Public Law 111-152), eliminated the authority to guarantee new FFEL [Federal Family Education Loans] after June 30, 2010.”

[284] Public Law 111-152: “Health Care and Education Reconciliation Act of 2010.” 111th U.S. Congress. Signed into law by Barack Obama on March 30, 2010. <www.govinfo.gov>

PART II—Student Loan Reform

Sec. 2201. Termination of Federal Family Education Loan appropriations.

Sec. 2202. Termination of Federal loan insurance program.

Sec. 2203. Termination of applicable interest rates.

Sec. 2204. Termination of Federal payments to reduce student interest costs.

Sec. 2205. Termination of FFEL [Federal Family Education Loans] PLUS [Parent Loans for Undergraduate Students] Loans.

Sec. 2206. Federal Consolidation Loans.

Sec. 2207. Termination of Unsubsidized Stafford Loans for middle-income borrowers.

Sec. 2208. Termination of special allowances.

Sec. 2209. Origination of Direct Loans at institutions outside the United States.

Sec. 2210. Conforming amendments.

Sec. 2211. Terms and conditions of loans.

Sec. 2212. Contracts; mandatory funds.

Sec. 2213. Income-based repayment.

[285] Calculated with data from:

a) Vote 194: “Health Care and Education Reconciliation Act of 2010.” U.S. House of Representatives, March 25, 2010. <clerk.house.gov>

b) Vote 105: “Health Care and Education Reconciliation Act of 2010.” U.S. Senate, March 25, 2010. <www.senate.gov>

Party         Voted “Yes”        Voted “No”         Voted “Present” or Did Not Vote †
              Number   Portion   Number   Portion   Number   Portion
Republican    0        0%        215      99%       3        1%
Democrat      275      88%       35       11%       1        0%
Independent   1        100%      0        0%        0        0%

NOTE: † Voting “Present” is effectively the same as not voting.

[286] “Private Student Loan Report.” Enterval Analytics, January 11, 2023. <www.enterval.com>

Page 3:

The nineteenth edition of this semi-annual report includes continuous contributions from the Enterval Private Student Loan Consortium, a data cooperative of the five largest student loan lenders and holders: Citizens Bank, N.A., Discover Bank, Navient, PNC Bank, N.A., and Sallie Mae Bank. In addition to the original five Consortium members, the Q3 2022 report includes seven other contributors: College Ave Student Loans, Navy Federal Credit Union and five members from the Education Finance Council, recognized on page 28.

In total, these 12 data contributors represented 59.12% of the private student loans outstanding (including consolidation, refinance and parent loans) in the U.S. Overall, at the end of September 30th, 2022 (the latest date federal loan portfolio data was available at report creation), private student loans are estimated to be 7.22% ($127.24 B) of the total student loans outstanding. The remaining 92.78% ($1,634.50 B) of the $1.76 T in total student loans are federal loans made through or guaranteed by the U.S. Department of Education.

[287] Calculated with data from:

a) “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 3: “Total Debt Balance and Its Composition” <www.newyorkfed.org>

b) Dataset: “CPI—All Urban Consumers (Current Series).” U.S. Department of Labor, Bureau of Labor Statistics. Accessed January 27, 2023 at <www.bls.gov>

“Series Id: CUUR0000SA0; Series Title: All Items in U.S. City Average, All Urban Consumers, Not Seasonally Adjusted; Area: U.S. City Average; Item: All Items; Base Period: 1982–84=100”

NOTE: An Excel file containing the data and calculations is available upon request.
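The calculation in note [287] pairs the NY Fed debt balances with the BLS CPI-U series (CUUR0000SA0) to express earlier-year dollars in consistent purchasing power, as described in the introductory notes. A minimal sketch of that adjustment, assuming illustrative placeholder index values (not figures from the source):

```python
# Sketch of the inflation adjustment implied by note [287]: scale a dollar
# figure from one year into another year's dollars using the ratio of CPI-U
# index values (BLS series CUUR0000SA0, base period 1982–84=100). The index
# values used below are hypothetical placeholders for illustration.

def adjust_for_inflation(amount, cpi_from, cpi_to):
    """Convert `amount` (dollars of the year with index `cpi_from`) into
    dollars of the year with index `cpi_to`."""
    return amount * (cpi_to / cpi_from)

# e.g. $1,000 in a year with a CPI-U index of 218.056, restated for a year
# with an index of 296.797 (both values hypothetical):
print(round(adjust_for_inflation(1000, 218.056, 296.797), 2))
```

The same ratio works in either direction: dividing instead of multiplying (or swapping the two index arguments) deflates a later-year figure back to earlier-year dollars.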

[288] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 3: “[T]he Student Aid and Fiscal Responsibility Act (SAFRA) of 2010 ceased the origination of federal student loans by private lenders and as of July 1, 2010, all federal student loans are made directly by the Department of Education and funded by the U.S. Treasury Department.”

[289] “WHO Director-General’s Opening Remarks at the Media Briefing on Covid-19.” World Health Organization, March 11, 2020. <bit.ly>

[Dr. Tedros Adhanom Ghebreyesus:] …

WHO [World Health Organization] has been assessing this outbreak around the clock and we are deeply concerned both by the alarming levels of spread and severity, and by the alarming levels of inaction.

We have therefore made the assessment that COVID-19 can be characterized as a pandemic.

[290] Press release: “COVID-19 and Other Global Health Issues.” World Health Organization, May 5, 2023. <www.justfacts.com>

[Dr. Tedros Adhanom Ghebreyesus:] …

Yesterday, the Emergency Committee met for the 15th time and recommended to me that I declare an end to the public health emergency of international concern. I have accepted that advice. It’s therefore with great hope that I declare COVID-19 over as a global health emergency.

[291] Article: “Student Loan Delinquencies Surge.” By Emily Dai. Federal Reserve Bank of St. Louis, Inside the Vault, Spring 2013. Pages 1–3. <fraser.stlouisfed.org>

Page 1:

In the third quarter of 2012, the share of delinquent student loan balances exceeded the share of delinquent credit card balances, according to the Federal Reserve Bank of New York’s Consumer Credit Panel and to Equifax.2 This is the first such occurrence since 2003, when reliable data became available.3 In the fourth quarter of 2012, the share of delinquent student loan balances continued to rise.

2 “Delinquent” here refers to balances past due for 90 days or more.

3 The data were first captured by Equifax in 2003 and first reported in 2010 in the Federal Reserve Bank of New York’s Household Debt and Credit Report.

[292] “Quarterly Report on Household Debt and Credit, 2021:Q3.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, November 2021. <www.newyorkfed.org>

Page 2 (of PDF):

Aggregate delinquency rates have remained low and declining since the beginning of the pandemic, reflecting an uptake in forbearances (provided by both the CARES [Coronavirus Aid, Relief, and Economic Security] Act and voluntarily offered by lenders), which protect borrowers’ credit records from the reporting of skipped or deferred payments. …

Delinquency rates by product continued to decline, and new transitions into delinquency mostly declined across the board, continuing to reflect ongoing participation in various borrower assistance programs. The share of student loans that are reported as delinquent remains very low, as the majority of outstanding federal student loans are covered by CARES Act administrative forbearances. Auto loans and credit cards also showed continued declines in their delinquency transition rates, reflecting the persistent impact of government stimulus programs and bank-offered forbearance options for troubled borrowers.

[293] Public Law 116-136: “Coronavirus Aid, Relief, and Economic Security Act.” 116th U.S. Congress. Signed into Law by Donald J. Trump on March 27, 2020. <www.congress.gov>

Title III, Part IV, Subtitle B, Section 3513:

Temporary Relief for Federal Student Loan Borrowers.

(a) In General.—The Secretary shall suspend all payments due for loans made under part D and part B (that are held by the Department of Education) of title IV of the Higher Education Act of 1965 … through September 30, 2020.

(b) No Accrual of Interest.—Notwithstanding any other provision of the Higher Education Act of 1965 … interest shall not accrue on a loan described under subsection (a) for which payment was suspended for the period of the suspension.

(c) Consideration of Payments.—Notwithstanding any other provision of the Higher Education Act of 1965 … the Secretary shall deem each month for which a loan payment was suspended under this section as if the borrower of the loan had made a payment for the purpose of any loan forgiveness program or loan rehabilitation program authorized under part D or B of title IV of the Higher Education Act of 1965 … for which the borrower would have otherwise qualified.

(d) Reporting to Consumer Reporting Agencies.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall ensure that, for the purpose of reporting information about the loan to a consumer reporting agency, any payment that has been suspended is treated as if it were a regularly scheduled payment made by a borrower.

(e) Suspending Involuntary Collection.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall suspend all involuntary collection related to the loan, including—

(1) a wage garnishment authorized under section 488A of the Higher Education Act of 1965 … or section 3720D of title 31, United States Code;

(2) a reduction of tax refund by amount of debt authorized under section 3720A of title 31, United States Code, or section 6402(d) of the Internal Revenue Code of 1986;

(3) a reduction of any other Federal benefit payment by administrative offset authorized under section 3716 of title 31, United States Code (including a benefit payment due to an individual under the Social Security Act or any other provision described in subsection (c)(3)(A)(i) of such section); and

(4) any other involuntary collection activity by the Secretary.

[294] Press release: “Biden-Harris Administration Announces Final Student Loan Pause Extension Through December 31 and Targeted Debt Cancellation to Smooth Transition to Repayment.” U.S. Department of Education, August 24, 2022. <www.ed.gov>

“Today, the U.S. Department of Education (Department) announced a final extension of the pause on student loan repayment, interest, and collections through December 31, 2022. Borrowers should plan to resume payments in January 2023.”

[295] Press release: “Biden-Harris Administration Continues Fight for Student Debt Relief for Millions of Borrowers, Extends Student Loan Repayment Pause.” U.S. Department of Education, November 22, 2022. <www.ed.gov>

Today, the U.S. Department of Education announced an extension of the pause on student loan repayment, interest, and collections. The extension will alleviate uncertainty for borrowers as the Biden-Harris Administration asks the Supreme Court to review the lower-court orders that are preventing the Department from providing debt relief for tens of millions of Americans. Payments will resume 60 days after the Department is permitted to implement the program or the litigation is resolved, which will give the Supreme Court an opportunity to resolve the case during its current Term. If the program has not been implemented and the litigation has not been resolved by June 30, 2023—payments will resume 60 days after that.

[296] “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 2 (of PDF): “Less than 1% of aggregate student debt was 90+ days delinquent or in default in 2023Q1, a small decline from the previous quarter. Delinquency rates fell substantially in the previous quarter due to the implementation of the Fresh Start program, which made previously defaulted loan balances current.”

[297] “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 12: “Percent of Balance 90+ Days Delinquent by Loan Type” <www.newyorkfed.org>

Page 42:

Loan types. In our analysis we distinguish between the following types of accounts: mortgage accounts, home equity revolving accounts, auto loans and leases, bank card accounts, student loans and other loan accounts. Mortgage accounts include all mortgage installment loans, including first mortgages and home equity installment loans (HEL), both of which are closed-end loans. Home Equity Revolving accounts (aka Home Equity Line of Credit or HELOC), unlike home equity installment loans, are home equity loans with a revolving line of credit where the borrower can choose when and how often to borrow up to an updated credit limit. Auto Loans are loans taken out to purchase a car, including leases, provided by automobile dealers and automobile financing companies. Bankcard accounts (or credit card accounts) are revolving accounts for banks, bankcard companies, national credit card companies, credit unions and savings & loan associations. Student Loans include loans to finance educational expenses provided by banks, credit unions and other financial institutions as well as federal and state governments. The Other category includes Consumer Finance (sales financing, personal loans) and Retail (clothing, grocery, department stores, home furnishings, gas etc) loans.

Our analysis excludes authorized user trades, disputed trades, lost/stolen trades, medical trades, child/family support trades, commercial trades and, as discussed above, inactive trades (accounts not reported on within the last 3 months).

[298] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 3: “[T]he Student Aid and Fiscal Responsibility Act (SAFRA) of 2010 ceased the origination of federal student loans by private lenders and as of July 1, 2010, all federal student loans are made directly by the Department of Education and funded by the U.S. Treasury Department.”

[299] “WHO Director-General’s Opening Remarks at the Media Briefing on Covid-19.” World Health Organization, March 11, 2020. <bit.ly>

[Dr. Tedros Adhanom Ghebreyesus:] …

WHO [World Health Organization] has been assessing this outbreak around the clock and we are deeply concerned both by the alarming levels of spread and severity, and by the alarming levels of inaction.

We have therefore made the assessment that COVID-19 can be characterized as a pandemic.

[300] Press release: “COVID-19 and Other Global Health Issues.” World Health Organization, May 5, 2023. <www.justfacts.com>

[Dr. Tedros Adhanom Ghebreyesus:] …

Yesterday, the Emergency Committee met for the 15th time and recommended to me that I declare an end to the public health emergency of international concern. I have accepted that advice. It’s therefore with great hope that I declare COVID-19 over as a global health emergency.

[301] Public Law 116-136: “Coronavirus Aid, Relief, and Economic Security Act.” 116th U.S. Congress. Signed into Law by Donald J. Trump on March 27, 2020. <www.congress.gov>

Title III, Part IV, Subtitle B, Section 3513:

Temporary Relief for Federal Student Loan Borrowers.

(a) In General.—The Secretary shall suspend all payments due for loans made under part D and part B (that are held by the Department of Education) of title IV of the Higher Education Act of 1965 … through September 30, 2020.

(b) No Accrual of Interest.—Notwithstanding any other provision of the Higher Education Act of 1965 … interest shall not accrue on a loan described under subsection (a) for which payment was suspended for the period of the suspension.

(c) Consideration of Payments.—Notwithstanding any other provision of the Higher Education Act of 1965 … the Secretary shall deem each month for which a loan payment was suspended under this section as if the borrower of the loan had made a payment for the purpose of any loan forgiveness program or loan rehabilitation program authorized under part D or B of title IV of the Higher Education Act of 1965 … for which the borrower would have otherwise qualified.

(d) Reporting to Consumer Reporting Agencies.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall ensure that, for the purpose of reporting information about the loan to a consumer reporting agency, any payment that has been suspended is treated as if it were a regularly scheduled payment made by a borrower.

(e) Suspending Involuntary Collection.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall suspend all involuntary collection related to the loan, including—

(1) a wage garnishment authorized under section 488A of the Higher Education Act of 1965 … or section 3720D of title 31, United States Code;

(2) a reduction of tax refund by amount of debt authorized under section 3720A of title 31, United States Code, or section 6402(d) of the Internal Revenue Code of 1986;

(3) a reduction of any other Federal benefit payment by administrative offset authorized under section 3716 of title 31, United States Code (including a benefit payment due to an individual under the Social Security Act or any other provision described in subsection (c)(3)(A)(i) of such section); and

(4) any other involuntary collection activity by the Secretary.

[302] Report: “The Biden Administration Extends the Pause on Federal Student Loan Payments: Legal Considerations for Congress.” By Kevin M. Lewis and Edward C. Liu. Congressional Research Service, January 27, 2021. <crsreports.congress.gov>

Page 1:

The Higher Education Relief Opportunities for Students (HEROES) Act of 2003 authorizes the Secretary of Education (Secretary) to “waive or modify any statutory or regulatory provision applicable to” the Title IV loan programs “as the Secretary deems necessary” to ensure that individuals adversely affected by a Presidentially declared national emergency “are not placed in a worse position financially.” (The HEROES Act discussed in this Sidebar is not the same as the identically named COVID-19 relief bill that the House of Representatives passed in the 116th Congress.)

Pages 2–3:

Student Loan Relief During the Trump Administration

During the Trump Administration, both Congress and the Executive afforded temporary relief to certain Title IV borrowers to mitigate the COVID-19 emergency’s economic impact. Following President Trump’s declaration of a national emergency with respect to the pandemic under the National Emergencies Act (NEA), the Secretary announced in March 2020 that “[a]ll borrowers with federally held student loans” would “automatically have their interest rates set to 0% for a period of at least 60 days.” The Secretary also announced that “each of these borrowers” would “have the option to suspend their payments for at least two months.” The Secretary’s March 2020 announcement did not specify the statutory authority for this relief.

The following week, the Secretary announced that ED would also “halt collection actions and wage garnishments to provide additional assistance to borrowers.” Again, the Secretary’s announcement did not specify which statute she invoked to provide this assistance.

A few days later, Congress enacted the Coronavirus Aid, Relief, and Economic Security (CARES) Act, which codified aspects of the relief the Secretary previously granted. Section 3513 of the CARES Act required the Secretary to “suspend all payments due for” certain Title IV loans held by ED “through September 30, 2020.” Among other things, Section 3513 also (1) required the Secretary to “suspend all involuntary collection” activities on such loans during the payment suspension period, and (2) provided that interest would not accrue on such loans during that period.

The 116th Congress did not pass legislation extending Section 3513’s sunset date. Instead, in August 2020, the Secretary purported to extend the student loan relief through December 31, 2020. Although the Secretary’s August 2020 announcement did not specify the statutory authority for that extension, ED later published a Federal Register notice asserting that the Secretary based the March and August relief measures on the HEROES Act. …

The Secretary invoked the HEROES Act again in December 2020 to extend this student loan relief through January 31, 2021.

Student Loan Relief During the Biden Administration

On Inauguration Day, President Biden announced that “the Acting Secretary of Education will extend the pause on federal student loan payments and collections and keep the interest rate at 0%.” Although the President’s announcement did not specify how long this extension would last, ED’s website suggests the extension will remain in effect “at least through Sept. 30, 2021.” Additional details about the extension are currently unavailable.

Legal Issues and Considerations for Congress

The relief measures discussed above raise unresolved questions regarding the Secretary’s authority to waive or modify statutes and regulations in response to a national emergency. As far as CRS’s research reveals, no court has interpreted or applied the HEROES Act or reviewed ED’s actions taken pursuant to the Act. Thus, it appears no court has considered where the outer boundaries of the Secretary’s HEROES Act authorities lie. Moreover, before the COVID-19 pandemic, Secretaries generally invoked the HEROES Act relatively narrowly to grant relief to limited subsets of borrowers, such as deployed military service members or victims of certain natural disasters. As a result, judicial and administrative precedent cast little light on whether the HEROES Act authorizes the COVID-19 relief measures discussed here.

[303] Report: “Student Loans: Education Has Increased Federal Cost Estimates of Direct Loans by Billions Due to Programmatic and Other Changes.” U.S. Government Accountability Office, July 28, 2022. <www.gao.gov>

Overview:

The largest estimated cost increases—$102 billion in total—stemmed from emergency relief provided to most federal student loan borrowers under the CARES Act and related administrative actions in response to the COVID-19 pandemic. This relief included suspending (1) all payments due, (2) interest accrual, and (3) involuntary collections for loans in default. The suspensions, which are programmatic changes dating back to March 13, 2020, are currently set to expire on August 31, 2022. Reestimates based on updated data and assumptions about borrowers in Income-Driven Repayment plans also substantially increased estimated costs.

[304] Calculated with data from:

a) Dataset: “Federal Student Loan Portfolio by Loan Status.” National Student Loan Data System. Accessed June 21, 2023 at <studentaid.gov>

Tab: “Federally Managed”

b) “Quarterly Report on Household Debt and Credit, 2023:Q1.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, May 2023. <www.newyorkfed.org>

Page 2 (of PDF): “Outstanding student loan debt stood at $1.604 trillion in 2023Q1.”

NOTE: An Excel file containing the data and calculations is available upon request.

[305] “WHO Director-General’s Opening Remarks at the Media Briefing on Covid-19.” World Health Organization, March 11, 2020. <bit.ly>

[Dr. Tedros Adhanom Ghebreyesus:] …

WHO [World Health Organization] has been assessing this outbreak around the clock and we are deeply concerned both by the alarming levels of spread and severity, and by the alarming levels of inaction.

We have therefore made the assessment that COVID-19 can be characterized as a pandemic.

[306] Press release: “COVID-19 and Other Global Health Issues.” World Health Organization, May 5, 2023. <www.justfacts.com>

[Dr. Tedros Adhanom Ghebreyesus:] …

Yesterday, the Emergency Committee met for the 15th time and recommended to me that I declare an end to the public health emergency of international concern. I have accepted that advice. It’s therefore with great hope that I declare COVID-19 over as a global health emergency.

[307] Public Law 116-136: “Coronavirus Aid, Relief, and Economic Security Act.” 116th U.S. Congress. Signed into Law by Donald J. Trump on March 27, 2020. <www.congress.gov>

Title III, Part IV, Subtitle B, Section 3513:

Temporary Relief for Federal Student Loan Borrowers.

(a) In General.—The Secretary shall suspend all payments due for loans made under part D and part B (that are held by the Department of Education) of title IV of the Higher Education Act of 1965 … through September 30, 2020.

(b) No Accrual of Interest.—Notwithstanding any other provision of the Higher Education Act of 1965 … interest shall not accrue on a loan described under subsection (a) for which payment was suspended for the period of the suspension.

(c) Consideration of Payments.—Notwithstanding any other provision of the Higher Education Act of 1965 … the Secretary shall deem each month for which a loan payment was suspended under this section as if the borrower of the loan had made a payment for the purpose of any loan forgiveness program or loan rehabilitation program authorized under part D or B of title IV of the Higher Education Act of 1965 … for which the borrower would have otherwise qualified.

(d) Reporting to Consumer Reporting Agencies.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall ensure that, for the purpose of reporting information about the loan to a consumer reporting agency, any payment that has been suspended is treated as if it were a regularly scheduled payment made by a borrower.

(e) Suspending Involuntary Collection.—During the period in which the Secretary suspends payments on a loan under subsection (a), the Secretary shall suspend all involuntary collection related to the loan, including—

(1) a wage garnishment authorized under section 488A of the Higher Education Act of 1965 … or section 3720D of title 31, United States Code;

(2) a reduction of tax refund by amount of debt authorized under section 3720A of title 31, United States Code, or section 6402(d) of the Internal Revenue Code of 1986;

(3) a reduction of any other Federal benefit payment by administrative offset authorized under section 3716 of title 31, United States Code (including a benefit payment due to an individual under the Social Security Act or any other provision described in subsection (c)(3)(A)(i) of such section); and

(4) any other involuntary collection activity by the Secretary.

[308] Calculated with data from:

a) Dataset: “Federal Student Loan Portfolio by Loan Status.” National Student Loan Data System. Accessed April 28, 2022 at <studentaid.gov>

Tab: “Federally Managed”

b) “Quarterly Report on Household Debt and Credit, 2021:Q4.” Federal Reserve Bank of New York, Research and Statistics Group, Microeconomic Studies, February 2022. <www.newyorkfed.org>

Page 3 (of PDF): “Outstanding student loan debt stood at $1.58 trillion in the fourth quarter, an $8 billion decline from 2021Q3.”

NOTE: An Excel file containing the data and calculations is available upon request.

[309] Dataset: “Federally Managed Portfolio by Loan Status.” U.S. Department of Education, National Student Loan Data System. Accessed April 28, 2022 at <studentaid.gov>

Tab: “Loan Status Definitions”

Please Note: Recipient counts are based at the loan level. As a result, recipients may be counted multiple times across varying loan statuses. …

In-School – Includes loans that have never entered repayment as a result of the borrower’s enrollment in school.

Grace – Includes loans that have entered a six-month grace period after the borrower is no longer enrolled in school at least half-time. Borrowers are not expected to make payments during grace.

Repayment – Includes loans that are in an active repayment status.

Deferment – Includes loans in which payments have been postponed as a result of certain circumstances such as returning to school, military service, or economic hardship.

Forbearance – Includes loans in which payments have been temporarily suspended or reduced as a result of certain types of financial hardships.

Cumulative in Default – Includes loans that are more than 360 days delinquent.

Other – Includes loans that are in non-defaulted bankruptcy and in a disability status.

[310] Report: “Federal Student Loans: Actions Needed to Improve Oversight of Schools’ Default Rates.” U.S. Government Accountability Office, April 2018. <www.gao.gov>

Page 13: “When borrowers do not make payments on their federal student loans, and the loans are in default, the federal government and taxpayers are left with the costs.”

[311] Report: “Fair-Value Accounting for Federal Credit Programs.” U.S. Congressional Budget Office, March 2012. <www.cbo.gov>

Page 7: “When the government extends credit, the associated market risk of those obligations is effectively passed along to citizens who, as investors, would view that risk as costly.”

[312] Webpage: “Student Loan Bankruptcy Exception.” FinAid. Accessed September 27, 2018 at <bit.ly>

The US Bankruptcy Code at 11 USC 523(a)(8) provides an exception to bankruptcy discharge for education loans. This page provides a history of the legislative language in this section of the US Bankruptcy Code.

Student loans were dischargeable in bankruptcy prior to 1976. With the introduction of the US Bankruptcy Code (11 USC 101 et seq) in 1978, the ability to discharge education loans was limited. Subsequent changes in the law have further narrowed the dischargeability of education debt. …

The following timeline illustrates the date of major changes in the treatment of student loans under the US Bankruptcy Code and related changes to other legislation.

[313] Report: “Trends in the Student Loan Market.” U.S. Treasury Department, Treasury Borrowing Advisory Committee, November 4, 2014. <home.treasury.gov>

Page 9:

Unlike Other Credit, Can’t Extinguish Student Loans in Bankruptcy

Default Consequences:

• Tax Refund Offsets: IRS can offset the borrower’s income tax refund until the defaulted loan is paid in full. A number of states also have laws that authorize state guaranty agencies to take state income tax refunds.

• Federal Benefits Offsets: The government can offset certain Social Security benefits to collect government student loans. Just as with other types of student loan collection, there is no time limit on Social Security offsets, according to a 2005 Supreme Court Case.

• Wage Garnishments: The government can also garnish wages as a way to recover money owed on a defaulted student loan. The United States Department of Education or a Student Loan Guarantor can garnish 15% of disposable pay per pay period without a court order.

• Effect on Credit History: Adversely affects credit for many years. If borrower defaults, loan will be listed as a current debt that is in default. The default will also be listed in the historical section of borrower’s credit report, specifying the length of the default.

• License Revocations: A number of states allow professional and vocational boards to refuse to certify, certify with restrictions, suspend or revoke a member’s professional or vocational license and, in some cases, impose a fine, when a member defaults on student loans.
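The administrative wage garnishment rule quoted above (15% of disposable pay per pay period, no court order needed) reduces to simple arithmetic. A minimal sketch, assuming a pre-computed disposable-pay figure; the statutory definition of disposable pay (earnings after legally required deductions) is not modeled here.

```python
# Illustrative only: upper bound on garnishment for a defaulted federal
# student loan, per the 15%-of-disposable-pay rule cited in the source.

GARNISHMENT_RATE = 0.15  # 15% per pay period

def max_garnishment(disposable_pay: float) -> float:
    """Maximum amount garnishable from one pay period's disposable pay."""
    return round(disposable_pay * GARNISHMENT_RATE, 2)

# e.g. $2,000 of disposable pay in a pay period -> up to $300 garnished.
```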

[314] Article: “Benefits Can Be Used to Pay Student Loans.” By Melissa McNamara. Associated Press, December 7, 2005. <www.cbsnews.com>

“The Supreme Court ruled unanimously Wednesday that the government can seize a person’s Social Security benefits to pay old student loans.”

[315] Article: “White House Floats Bankruptcy Process for Some Student Debt.” By Josh Mitchell. Wall Street Journal, March 10, 2015. <www.wsj.com>

“Fewer than 1,000 people try to get rid of their student loans every year using bankruptcy in a process that is both expensive and uncertain: It involves filing a lawsuit in federal court, and lawyers typically charge several thousand dollars upfront for that work. A Wall Street Journal analysis found 713 such lawsuits were filed last year.”

[316] Report: “Federal Student Loan Forgiveness and Loan Repayment Programs.” By Alexandra Hegji, David P. Smole, and Elayne J. Heisler. Congressional Research Service. Updated November 20, 2018. <crsreports.congress.gov>

Page 2 (of PDF):

The number and availability of loan forgiveness and loan repayment programs have expanded considerably since the establishment of the first major federal loan forgiveness program by the National Defense Education Act of 1958. Currently, over 50 such programs are authorized at the federal level, approximately 30 of which were operational as of October 1, 2017.

Pages 3–4:

Distinction Among Loan Forgiveness and Loan Repayment Programs

In employment-focused loan forgiveness and loan repayment programs, a borrower typically must work or serve in a certain function, profession, or geographic location for a specified period of time to qualify for benefits. Under repayment plan-based loan forgiveness, a borrower typically must repay according to an income-driven repayment plan for a specified period of time to qualify for benefits. At the end of the specified term, some or all of the individual’s qualifying student loan debt is forgiven or repaid on his or her behalf. The individual is thus relieved of responsibility for repaying that portion of his or her student loan debt. One of the most important distinctions among these types of programs is whether the availability of benefits is incorporated into the loan terms and conditions and is thus considered an entitlement to qualified borrowers or whether benefits are made available to qualified borrowers at the discretion of the entity administering the program and whether the benefits are subject to the availability of funds. For the purposes of this report, the former types of programs are referred to as loan forgiveness while the latter are referred to as loan repayment.

In general, loan forgiveness benefits are broadly available to borrowers of qualified loans. The availability of these benefits is expressed to borrowers in their loan documents, such as the master promissory note and the borrower’s rights and responsibilities statement.9 A borrower who satisfies the loan forgiveness program’s eligibility criteria, as set forth in the loan terms and conditions, is entitled to the loan forgiveness benefits. Benefits that are entitlements to qualified borrowers are generally funded through mandatory appropriations and accounted for as part of federal student loan subsidy costs, which are discussed in detail later in the section titled “Cost of Loan Forgiveness and Loan Repayment Programs.” There are two broad categories of loan forgiveness benefits: loan forgiveness for public service employment and loan forgiveness following income-driven repayment.

Loan repayment programs also provide debt relief to borrowers for service in a specific function, profession, or location. However, in contrast to employment-focused loan forgiveness programs, the entity that administers a loan repayment program typically either directly repays some or all of the qualified borrower’s student loan debt on his or her behalf or provides funding to a separate entity for purposes of implementing a loan repayment program and making such payments. Loan repayment benefits are generally offered through programs that are separate or distinct from the program through which a federal student loan is made. In many instances, these programs are designed to address broad employment needs or shortages (for example, within a specific occupation or geographic location), while other such programs are intended to help individual federal agencies recruit and retain qualified employees, often serving as an additional form of compensation to targeted employees, who may be harder to recruit or retain. Both types of loan repayment benefits are generally available to a limited number of qualified borrowers. Typically, loan repayment benefits are discretionary and their availability is subject to the appropriation of funds.

Pages 11–14:

Availability of Loan Forgiveness for Public Service Employment

As described above, loan forgiveness for public service employment provides debt relief to qualified borrowers employed in certain occupations, for specific employers, or in public service. These benefits are considered entitlements and are written into the terms and conditions of widely available federal student loans (for example, Direct Subsidized Loans and Direct Unsubsidized Loans and Perkins Loans). They are potentially available to an open-ended number of qualified borrowers.

Table 1 provides a summary of the various loan forgiveness for public service employment programs offered. …

Table 1 illustrates that although loan forgiveness benefits are entitlements that are potentially available to a wide array of borrowers, to qualify for benefits borrowers must still meet specific eligibility criteria, including completing a specific type of service or entering into a particular occupation or profession.

All three programs are widely available to individuals serving as teachers, while Federal Perkins Loan Cancellation is available to individuals who also serve in other specific public service occupations, such as law enforcement personnel and public defenders, and Direct Loan Public Service Loan Forgiveness is available to an even broader array of individuals who are employed full-time in public service, which includes employment in federal, state, local, or tribal government agencies, and certain nonprofit organizations. Additionally, borrowers of loans made under these programs must serve for a minimum period of time. For these loan forgiveness programs, service commitments generally last between 1 year (for partial benefits) and 10 years.

Availability of Loan Forgiveness Following Income Driven Repayment

Loan forgiveness following income-driven repayment provides debt relief to borrowers who repay their federal student loans as a proportion of their income for an extended period of time but who have not repaid their entire student loan debt. These benefits are considered entitlements and are written into the terms and conditions of widely available federal student loans (for example, Direct Subsidized Loans and Direct Unsubsidized Loans). These repayment plans are potentially available to an open-ended number of qualified borrowers; however, they are distinct from those programs that target public service employment.

Table 2 provides a summary of the various loan forgiveness programs that provide debt relief to individuals following income-driven repayment and provides details on the operational status of each program. The table is organized according to the date on which borrowers first became eligible to repay under each plan. Unlike the loan forgiveness programs presented in Table 1, these programs are not grouped by the potential scope of availability to borrowers and financial resources used to provide benefits, because numerous factors and borrower characteristics may affect program participation, which makes it difficult to estimate the potential scope of each program. …

Table 2 illustrates that the various programs that provide loan forgiveness following income-driven repayment are widely available to a potentially open-ended number of borrowers who meet income-driven qualifications. Unlike loan forgiveness or repayment programs that seek to encourage borrowers to enter into certain service or occupational commitments, no such employer-specific or occupational or service requirements exist for these programs. Rather, under each of the above programs, borrowers generally must make monthly payments towards their qualifying federal student loans for a specified period of time (between 20 and 25 years). The amount of monthly payments is determined based on factors including the amount of the student loan debt, family size, and adjusted gross income; monthly payments are capped at a percentage of a borrower’s discretionary income (between 10% and 20%) or other income-driven criteria. At the end of each program’s repayment period, the outstanding balance of a borrower’s loans is then forgiven and they are no longer responsible for payments on their loans.
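The income-driven mechanics described above (monthly payments capped at 10%–20% of discretionary income, remaining balance forgiven after 20–25 years of qualifying payments) can be sketched as follows. This is a hedged illustration, not any plan's official formula: real plans define "discretionary income" relative to federal poverty guidelines and family size, which this sketch takes as a given input.

```python
# Illustrative sketch of income-driven repayment mechanics; the inputs
# (discretionary income, cap rate, term) are hypothetical parameters
# standing in for plan-specific rules.

def idr_monthly_payment(discretionary_income: float, cap_rate: float) -> float:
    """Monthly payment capped at a share (10%-20%) of discretionary income."""
    return discretionary_income * cap_rate / 12

def balance_forgiven(balance: float, months_paid: int, term_years: int) -> float:
    """Remaining balance forgiven once the plan's 20-25 year term is reached."""
    return balance if months_paid >= term_years * 12 else 0.0

# e.g. a 10% cap on $30,000 of discretionary income -> $250/month, and a
# $12,000 balance left after 240 qualifying payments under a 20-year plan
# would be forgiven in full.
```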

[317] Press release: “Fact Sheet: A Student Aid Bill of Rights: Taking Action to Ensure Strong Consumer Protections for Student Loan Borrowers.” White House, Office of the Press Secretary, March 10, 2015. <www.whitehouse.gov>

In addition, new requirements may be appropriate for private and federally guaranteed student loans so that all of the more than 40 million Americans with student loans have additional basic rights and protections. The President is directing his Cabinet and White House advisers, working with the Consumer Financial Protection Bureau, to study whether consumer protections recently applied to mortgages and credit cards, such as notice and grace periods after loans are transferred among lenders and a requirement that lenders confirm balances to allow borrowers to pay off the loan, should also be afforded to student loan borrowers and improve the quality of servicing for all types of student loans. The agencies will develop recommendations for regulatory and legislative changes for all student loan borrowers, including possible changes to the treatment of loans in bankruptcy proceedings and when they were borrowed under fraudulent circumstances.

[318] Final rule: “Student Assistance General Provisions, Federal Family Education Loan Program, and William D. Ford Federal Direct Loan Program.” Federal Register, October 30, 2015. <www.govinfo.gov>

Page 67204:

Agency: Office of Postsecondary Education, Department of Education. …

The Secretary amends the regulations governing the William D. Ford Federal Direct Loan (Direct Loan) Program to create a new income-contingent repayment plan in accordance with the President’s initiative to allow more Direct Loan borrowers to cap their loan payments at 10 percent of their monthly incomes. …

In addition, the final regulations will add a new income-contingent repayment plan, called the Revised Pay As You Earn repayment plan (REPAYE plan), to § 685.209. The REPAYE plan is modeled on the existing Pay As You Earn repayment plan, and will be available to all Direct Loan student borrowers regardless of when the borrower took out the loans. Finally, the regulations will allow lump sum payments made through student loan repayment programs administered by the DOD [Department of Defense] to count as qualifying payments for purposes of the Public Service Loan Forgiveness Program.

Summary of the Major Provisions of This Regulatory Action: …

• For a borrower who only has loans received to pay for undergraduate study, provide that the remaining balance of the borrower’s loans that have been repaid under the REPAYE plan is forgiven after 20 years of qualifying payments.

• For a borrower who has at least one loan received to pay for graduate study, provide that the remaining balance of the borrower’s loans that have been repaid under the REPAYE plan is forgiven after 25 years of qualifying payments.
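The two REPAYE provisions above reduce to a single rule: the forgiveness term depends on whether any loan paid for graduate study. A minimal sketch, with hypothetical undergraduate/graduate labels for a borrower's loans:

```python
# Illustrative only: REPAYE forgiveness-term rule from the final rule
# summarized above; the loan-level labels are hypothetical.

def repaye_term_years(loan_levels: list) -> int:
    """20-year term if all loans paid for undergraduate study;
    25 years if at least one loan paid for graduate study."""
    return 25 if "graduate" in loan_levels else 20
```

For example, a borrower with only undergraduate loans reaches forgiveness after 20 years of qualifying payments, while one undergraduate loan plus one graduate loan yields a 25-year term.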

[319] Report: “Public Service Loan Forgiveness: Education Needs to Provide Better Information for the Loan Servicer and Borrowers.” U.S. Government Accountability Office, September 2018. <www.gao.gov>

Page 2 (of PDF): “The PSLF [Public Service Loan Forgiveness] program, established by statute in 2007, forgives borrowers’ federal student loans after they make at least 10 years of qualifying payments while working for certain public service employers and meeting other requirements.”

[320] Press release: “Over 323,000 Federal Student Loan Borrowers to Receive $5.8 Billion in Automatic Total and Permanent Disability Discharges.” U.S. Department of Education, August 19, 2021. <www.ed.gov>

Over 323,000 borrowers who have a total and permanent disability (TPD) will receive more than $5.8 billion in automatic student loan discharges due to a new regulation announced today by the U.S. Department of Education. The change will apply to borrowers who are identified through an existing data match with the Social Security Administration (SSA). … First, the Department will indefinitely extend the policy announced in March to stop asking these borrowers to provide information on their earnings—a process that results in the reinstatement of loans if and when borrowers do not respond—beyond the end of the national emergency. Second, the Department will then pursue the elimination of the three-year monitoring period required under current regulations during the negotiated rulemaking that will begin in October. …

This new regulation allows the Department to provide automatic TPD discharges for borrowers who are identified through administrative data matching by removing the requirement for these borrowers to fill out an application before receiving relief.

[321] Press release: “Public Service Loan Forgiveness (PSLF) Program Overhaul.” U.S. Department of Education, October 6, 2021. <www.ed.gov>

We will offer a time-limited waiver so that student borrowers can count payments from all federal loan programs or repayment plans toward forgiveness. This includes loan types and payment plans that were not previously eligible. We will pursue opportunities to automate PSLF eligibility, give borrowers a way to get errors corrected, and make it easier for members of the military to get credit toward forgiveness while they serve. We will pair these changes with an expanded communications campaign to make sure affected borrowers learn about these opportunities and encourage them to apply. …

The Department estimates that the limited waiver alone will help over 550,000 borrowers who had previously consolidated their loans see their progress toward PSLF grow automatically, with the average borrower receiving 23 additional payments. This includes approximately 22,000 borrowers who will be immediately eligible to have their federal student loans discharged without further action on their part, totaling $1.74 billion in forgiveness. Another 27,000 borrowers could potentially qualify for $2.82 billion in forgiveness if they certify additional periods of employment. For reference, just over 16,000 borrowers have ever received forgiveness under PSLF prior to this action.

[322] Press release: “Department of Education Announces Actions to Fix Longstanding Failures in the Student Loan Programs.” U.S. Department of Education, April 19, 2022. <www.ed.gov>

Today’s actions complement steps the Administration has already taken within its first year to cancel more than $17 billion in debt for 725,000 borrowers in addition to extending the student loan payment pause, saving 41 million borrowers billions of dollars in payments each month. The Department has now approved approximately:

• $6.8 billion for more than 113,000 public servants through improvements to PSLF [Public Service Loan Forgiveness];

• $7.8 billion for more than 400,000 borrowers who have a total and permanent disability;

• $1.2 billion for borrowers who attended ITT Technical Institutes before it closed; and

• Nearly $2 billion to 105,000 borrowers who were defrauded by their school.

[323] Webpage: “Who Qualifies for Borrower Defense?” U.S. Department of Education, Federal Student Aid Office. Accessed March 25, 2022 at <studentaid.gov>

Under the law, you may be eligible for borrower defense to repayment discharge of the federal student loans that you took out to attend a school if that school misled you, or engaged in other misconduct in violation of certain state laws. Specifically, you may assert borrower defense by demonstrating that the school, through an act or omission, violated state law directly related to your federal student loan or to the educational services for which the loan was provided. You may be eligible for borrower defense regardless of whether your school closed or you are otherwise eligible for loan discharge under other laws. You will only be eligible for this type of federal student loan discharge if your school’s misleading activities or other misconduct directly relate to the loan or to the educational services for which the loan was provided. You will not be eligible for this type of loan discharge based on claims that are not directly related to your loan or the educational services provided by the school. For example, personal injury claims or claims based on allegations of harassment are not bases for a borrower defense application.

[324] Press release: “Fact Sheet: Protecting Students from Abusive Career Colleges.” U.S. Department of Education, June 8, 2015. <bit.ly>

Today, the Education Department is announcing new steps in this work, particularly to address the concerns of students who attended schools owned by Corinthian Colleges Inc.

How Debt Relief Will Work for Corinthian Students

The Department has worked to rapidly develop a streamlined process for getting debt relief to Corinthian students. The Department’s aim is to make the process of forgiving loans fair, clear and efficient—and to ensure that students who are eligible to participate know about this opportunity.

Some Corinthian schools closed down, while others were sold but remain open under different ownership. The announcements today are for:

• Corinthian students whose schools have closed down.

• Corinthian students who believe they were victims of fraud, regardless of whether their school closed. …

Helping Corinthian Students Whose Schools Have Closed

In general, when a college closes, students are eligible to discharge their federal student loans if they were attending when the school closed or withdrew from the school within 120 days of the closing date. Given the unique circumstances for former Corinthian students, the Department is expanding eligibility for students to apply for a closed school loan discharge, extending the window of time back to June 20, 2014, to capture students who attended the now-closed campuses after Corinthian entered into an agreement with the Department to terminate Corinthian’s ownership of its schools. …

Helping Students Who Believe They Were Victims of Fraud, Regardless of Whether Their School Closed

Provisions in the law called “defense to repayment” or “borrower’s defense” allow borrowers to seek loan forgiveness if they believe they were defrauded by their college under state law. This provision has rarely been used in the past. Now, the Department is taking unprecedented action to create a streamlined process that is fair to students who may have been victims of fraud and that holds colleges accountable to taxpayers. …

For example, after analyzing the Department’s findings in its investigation of Heald College and relevant California law, the Department has determined that evidence of misrepresentation exists for students enrolled in a large majority of programs offered at Heald College campuses between 2010 and 2015. Specifically, the Department has determined that students who relied on misrepresentations found in published job placement rates for many Heald programs qualify to have their federal direct student loans discharged. Students can have their loans forgiven and receive refunds for amounts paid based on a simple attestation. More information about this process—including the attestation form—is available on studentaid.gov/Corinthian. Additional details will be posted on the website in the coming weeks. …

• Building a better system for debt relief for the future: The Department will develop new regulations to clarify and streamline loan forgiveness under the defense to repayment provision, while maintaining or enhancing current consumer protection standards and strengthening those provisions that hold colleges accountable for actions that result in loan discharges. That process will begin later this year and will not slow down the loan discharge process for current applicants.

[325] Press release: “Department of Education Announces Action to Streamline Borrower Defense Relief Process.” U.S. Department of Education, March 18, 2021. <www.ed.gov>

Today, the U.S. Department of Education (Department) announced it will streamline debt relief determinations for borrowers with claims approved to date that their institution engaged in certain misconduct. The Department will be rescinding the formula for calculating partial relief and adopting a streamlined approach for granting full relief under the regulations to borrower defense claims approved to date. The Department anticipates this change will ultimately help approximately 72,000 borrowers receive $1 billion in loan cancellation. …

Current provisions in federal law called “borrower defense to repayment” or “borrower defense” allow federal borrowers to seek cancellation of their William D. Ford Direct Loan (Direct Loan) Program loans if their institution engaged in certain misconduct. Beginning today, the Department will ensure that borrowers with approved borrower defense claims to date will have a streamlined path to receiving full loan discharges. This includes borrowers with previously approved claims that received less than a full loan discharge.

Full relief under the regulations will include:

• 100 percent discharge of borrowers’ related federal student loans.

• Reimbursement of any amounts paid on the loans, where appropriate under the regulations.

• Requests to credit bureaus to remove any related negative credit reporting. And,

• Reinstatement of federal student aid eligibility, if applicable. …

The Department will begin applying this new approach today and affected borrowers will receive notices from the Department over the next several weeks with discharges following after that.

[326] Press release: “ED Announces Changes to Relief for Approved Borrower Defense Applications.” U.S. Department of Education, Federal Student Aid Office, March 18, 2021. <studentaid.gov>

The U.S. Department of Education (ED) is changing how we determine relief for borrowers who were misled or whose schools engaged in other misconduct in violation of certain laws. …

Current borrower defense provisions in federal law allow federal borrowers to seek loan forgiveness of their William D. Ford Direct Loan (Direct Loan) Program loans if their institution engaged in certain misconduct. ED’s new approach will grant full loan relief for borrowers when evidence shows that their institution engaged in certain types of misconduct that impacted a borrower’s decision to apply to or remain enrolled in that institution.

ED will be rescinding the formula for calculating partial loan relief. Full loan relief available under the regulations will be applied to borrower defense claims approved to date; the change applies to claims for which borrowers only received partial loan relief and for applications approved to date that have yet to receive a relief determination. ED anticipates this change will ultimately help approximately 72,000 borrowers receive $1 billion in loan cancellation.

[327] Press release: “ED Approves Group Borrower Defense Discharge for ITT Technical Institute Borrowers and for Certain Kaplan Career Institute Borrowers.” U.S. Department of Education, Federal Student Aid Office, August 16, 2022. <studentaid.gov>

The U.S. Department of Education (ED) announced that it will discharge all remaining federal student loans borrowers received to attend ITT Technical Institute (ITT) from Jan. 1, 2005, through its closure in September 2016. This will provide relief to all federal student loan borrowers who attended ITT during that period, including borrowers who have not yet applied for borrower defense to repayment (borrower defense) discharge. Approximately 208,000 borrowers will receive discharges of their federal student loans, resulting in nearly $3.9 billion in relief. Eligible borrowers will have their ITT loans discharged without needing to take any additional action.

[328] Press release: “ED Approves Group Borrower Defense Discharge for All Corinthian Colleges Students.” U.S. Department of Education, Federal Student Aid Office, June 2, 2022. <studentaid.gov>

The U.S. Department of Education (ED) approved borrower defense to repayment (borrower defense) claims for all students with federal student loans who attended a school owned or operated by Corinthian Colleges Inc. (Corinthian) from its founding in 1995 to its closure in April 2015. This group borrower defense discharge will provide relief to all federal student loan borrowers who attended Corinthian during that period, including borrowers who have not yet applied for borrower defense discharge. Approximately 560,000 borrowers will receive 100% discharges of their federal student loans, resulting in approximately $5.8 billion in discharges. Eligible borrowers will have their Corinthian loans discharged without needing to take any additional action.

[329] Article: “Government to Forgive Student Loans at Corinthian Colleges.” By Tamar Lewin. New York Times, June 8, 2015. <www.nytimes.com>

“Secretary of Education Arne Duncan announced Monday that the Education Department would forgive the federal loans of tens of thousands of students who attended Corinthian Colleges, a for-profit college company that closed and filed for bankruptcy last month, amid widespread charges of fraud.”

[330] Article: “For-Profit Colleges File for Bankruptcy.” By Tamar Lewin. New York Times, May 4, 2015. <www.nytimes.com>

Corinthian was once one of the nation’s largest for-profit college companies, enrolling more than 100,000 students at its 100 Everest, Heald and WyoTech campuses. But for the last few years, the company has faced charges of predatory recruiting and false placement and graduation rates. It went into its death spiral last year when the Department of Education suspended its access to the federal student aid it depended on, and then brokered the sale of most of its campuses.

[331] Press release: “Improved Borrower Defense Discharge Process Will Aid Defrauded Borrowers, Protect Taxpayers.” U.S. Department of Education, December 20, 2017. <bit.ly>

After careful review to ensure a fair and efficient process, the U.S. Department of Education (the Department) today unveiled an improved discharge process for borrower defense to repayment (BDR) claims.

“We have been working to get this right for students since day one,” said Secretary Betsy DeVos. “No fraud is acceptable, and students deserve relief if the school they attended acted dishonestly. This improved process will allow claims to be adjudicated quickly and harmed students to be treated fairly. It also protects taxpayers from being forced to shoulder massive costs that may be unjustified.”

For pending claims, no changes were made to the existing approval criteria. Claims that previously would have been approved will still be approved today. However, rather than taking an “all or nothing” approach to discharge, the improved process will provide tiers of relief to compensate former Corinthian students based on damages incurred.

[332] Webpage: “Guidance Concerning Some Provisions of the 2016 Borrower Defense to Repayment Regulations.” Federal Student Aid, U.S. Department of Education, March 15, 2019. <bit.ly>

On November 1, 2016, the Department of Education published final regulations concerning borrower defense to repayment and other related matters in the Federal Register (81 Fed. Reg. 75,926). The original effective date (July 1, 2017) of these regulations was delayed by the Department, but by order of the U.S. District Court for the District of Columbia in the case Bauer et al. v. DeVos … the 2016 Final Regulations took effect.

[333] Ruling: Meaghan Bauer v. Elisabeth DeVos. U.S. District Court for the District of Columbia, September 12, 2018. U.S. District Judge Randolph D. Moss. <www.mass.gov>

Memorandum:

Meaghan Bauer, Stephano Del Rose, and a coalition of nineteen states and the District of Columbia bring suit against the Department of Education under the Administrative Procedure Act…. Plaintiffs challenge three agency actions delaying the implementation of the “Borrower Defense Regulations,” a package of regulatory changes to federal student loan programs designed to “protect student loan borrowers from misleading, deceitful, and predatory practices.”

Conclusion:

The Court will GRANT the state plaintiffs’ and student borrower plaintiffs’ motions for summary judgment … and will DENY the Department’s cross-motion for summary judgment…. It is further ORDERED that the parties shall appear for a status conference on September 14, 2018 at 10:30 a.m. in Courtroom 21 to address remedies.

[334] Ruling: California Association of Private Postsecondary Schools v. Elisabeth DeVos. U.S. District Court for the District of Columbia, October 16, 2018. U.S. District Judge Randolph D. Moss. <www.mass.gov>

Memorandum:

In sum, the Court concludes that CAPPS [California Association of Private Postsecondary Schools] has failed to carry its burden of demonstrating that it is entitled to the “extraordinary remedy” of a preliminary injunction. … With respect to many of CAPPS’s challenges, the Court is not convinced that the association has shown a “substantial likelihood” that it has standing to sue. … With respect to some challenges, that conclusion is clear, while it is less certain as to others. But, as to each of the challenges that CAPPS raises in the pending motion, it falls well short of the “high standard for irreparable injury” that the Court of Appeals “has set.” …

Conclusion:

For the foregoing reasons, Plaintiff’s renewed motion for preliminary injunction … is hereby DENIED.

[335] Final rule: “Student Assistance General Provisions, Federal Perkins Loan Program, Federal Family Education Loan Program, William D. Ford Federal Direct Loan Program, and Teacher Education Assistance for College and Higher Education Grant Program.” Federal Register, March 19, 2019. <www.govinfo.gov>

Page 9964:

Agency: Office of Postsecondary Education, Department of Education. …

Consistent with the decisions of the U.S. District Court for the District of Columbia, this document memorializes that selected provisions of these final regulations took effect. Due to more recently-effective amendments, the Department must also correct affected amendatory instructions to ensure their incorporation into the CFR [Code of Federal Regulations]. …

The original “effective date” for these provisions was July 1, 2017 … To the extent the provisions explicitly use this date as a benchmark (for example, § 685.206(c)(“For loans first disbursed prior to July 1, 2017, the borrower may assert a borrower defense under this paragraph”)), the Department will use July 1, 2017 as the relevant date.

[336] Webpage: “Guidance Concerning Some Provisions of the 2016 Borrower Defense to Repayment Regulations.” Federal Student Aid, U.S. Department of Education, March 15, 2019. <bit.ly>

On November 1, 2016, the Department of Education published final regulations concerning borrower defense to repayment and other related matters in the Federal Register (81 Fed. Reg. 75,926). The original effective date (July 1, 2017) of these regulations was delayed by the Department, but by order of the U.S. District Court for the District of Columbia in the case Bauer and others v. DeVos … the 2016 Final Regulations took effect.

[337] Press release: “Education Department Approves $3.9 Billion Group Discharge for 208,000 Borrowers Who Attended ITT Technical Institute.” U.S. Department of Education, August 16, 2022. <www.ed.gov>

The nearly $32 billion in student loan relief approved to date includes:

• $13 billion for 1 million borrowers whose institutions took advantage of them through discharges related to borrower defense and school closures.

• $9.5 billion for 175,000 borrowers through the Public Service Loan Forgiveness Program.

• $9 billion in total and permanent disability discharges for more than 425,000 borrowers.

The Department is also working on new regulations that will permanently improve a variety of the existing student loan forgiveness programs, significantly reduce monthly payments, and provide greater protections for students and taxpayers against unaffordable debts.

[338] Press release: “Education Department Approves $415 Million in Borrower Defense Claims Including for Former DeVry University Student.” U.S. Department of Education, February 16, 2022. <www.ed.gov>

“Nearly 16,000 borrowers will receive $415 million in borrower defense to repayment discharges following the approval of four new findings and the continued review of claims. … Today’s actions bring the total amount of approved relief under borrower defense to repayment to approximately $2 billion for more than 107,000 borrowers.”

[339] Press release: “Department of Education Announces Approval of New Categories of Borrower Defense Claims Totaling $500 Million in Loan Relief to 18,000 Borrowers.” U.S. Department of Education, June 16, 2021. <www.ed.gov>

The U.S. Department of Education (Department) announced today the approval of 18,000 borrower defense to repayment (borrower defense) claims for individuals who attended ITT Technical Institute (ITT). These borrowers will receive 100 percent loan discharges, resulting in approximately $500 million in relief. This brings total loan cancellation under borrower defense by the Biden–Harris Administration to $1.5 billion for approximately 90,000 borrowers.

[340] Press release: “Department of Education Announces Actions to Fix Longstanding Failures in the Student Loan Programs.” U.S. Department of Education, April 19, 2022. <www.ed.gov>

The Department has now approved approximately: …

• $1.2 billion for borrowers who attended ITT Technical Institutes before it closed; and

• Nearly $2 billion to 105,000 borrowers who were defrauded by their school.

[341] Calculated with data from the press release: “Education Department Approves $415 Million in Borrower Defense Claims Including for Former DeVry University Student.” U.S. Department of Education, February 16, 2022. <www.ed.gov>

Nearly 16,000 borrowers will receive $415 million in borrower defense to repayment discharges following the approval of four new findings and the continued review of claims. This includes approximately 1,800 former DeVry University (DeVry) students who will receive approximately $71.7 million in full borrower defense discharges after the U.S. Department of Education (Department) determined that the institution made widespread substantial misrepresentations about its job placement rates. These are the first approved borrower defense claims associated with a currently operating institution, and the Department will seek to recoup the cost of the discharges from DeVry. The Department anticipates that the number of approved claims related to DeVry will increase as it continues reviewing pending applications. …

After a review of voluminous amounts of evidence, the Department found that from 2008 to 2015 DeVry repeatedly misled prospective students across the country with claims that 90 percent of DeVry graduates who actively seek employment obtained jobs in their field of study within six months of graduation. This claim was the foundation of a national advertising campaign called “We Major in Careers” to brand DeVry as a “Career Placement University,” where it used the 90 percent placement statistic as the way to convince prospective students to enroll.

In fact, the institution’s actual job placement rate was around 58 percent. The Department found that more than half of the jobs included in the claimed 90 percent placement rate were held by students who obtained them well before graduating from DeVry and often before they even enrolled. These jobs were not attributable to a DeVry education and their inclusion was contrary to the plain language of the 90 percent claim. Moreover, DeVry excluded from its calculation large numbers of graduates who were in fact actively looking for work simply because they did not conduct a search in the manner that the University’s Career Services department preferred. …

To date, the Department has identified approximately 1,800 borrowers who will be eligible for approximately $71.7 million in discharges because they relied upon DeVry’s misrepresentation in deciding to enroll. The number of approvals is anticipated to grow as the Department reviews outstanding claims from former DeVry students. All borrowers with approved claims will receive full relief.

CALCULATION: $71.7 million / 1,800 borrowers = $39,833
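The same division can be reproduced in a one-line check, using the figures from the quoted release:

```python
# Average DeVry borrower-defense discharge per borrower.
total_discharges = 71.7e6   # $71.7 million in approved discharges
borrowers = 1_800

average = total_discharges / borrowers
print(f"${average:,.0f} per borrower")  # matches the $39,833 calculation above
```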

[342] Webpage: “Get to Know the Student Loan Process.” Accessed August 24, 2022 at <www.devry.edu>

Classes Start August 29, 2022 …

Students apply for Federal Student Loans by completing the Free Application for Federal Student Aid (FAFSA®). After completing the FAFSA, there will be two additional steps that must be completed to secure your loans. Your Student Support Advisor will help guide you through these steps. …

Federal Direct Loans are low-interest loans that offer in-school deferment for students enrolled at least half-time. Loan amounts are based on dependency status and the number of credit hours enrolled toward your DeVry degree. Eligible students borrow directly from the U.S. Department of Education. …

Federal Direct Unsubsidized Loans are non-need based, low-interest loans available to eligible students enrolled at least half-time. Loan amounts are based on a number of factors such as the student’s cost of attendance and federal guidelines. …

Federal PLUS Loans are credit-based loans for eligible students who are enrolled at least half-time. For undergraduate students, the PLUS borrower must be one of the student’s parents. For graduate students, the PLUS borrower is the student. PLUS loans are non-need based and loan amounts are based on the student’s unmet cost of attendance.

[343] Twitter post: “The Biden Administration’s Student Loan Debt Plan.” By President Biden, August 24, 2022. <twitter.com>

“Forgiving Debt … $20,000 if you went to college on Pell Grants … $10,000 if you didn’t receive Pell grants”

[344] Webpage: “The Biden-Harris Administration’s Student Debt Relief Plan Explained.” U.S. Department of Education, Federal Student Aid. Accessed August 28, 2022 at <studentaid.gov>

To smooth the transition back to repayment and help borrowers at highest risk of delinquencies or default once payments resume, the U.S. Department of Education will provide up to $20,000 in debt cancellation to Pell Grant recipients with loans held by the Department of Education and up to $10,000 in debt cancellation to non-Pell Grant recipients. Borrowers are eligible for this relief if their individual income is less than $125,000 or $250,000 for households.
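The eligibility rule quoted above reduces to two parameters (Pell status and an income cap) and can be sketched as a short function. The function name and the simplified single income test are illustrative assumptions, not the Department's implementation:

```python
def relief_amount(pell_recipient: bool, income: float, household: bool = False) -> int:
    """Maximum cancellation under the quoted 2022 plan parameters.

    income: individual income, or household income if household=True.
    Returns 0 if the cap ($125,000 individual / $250,000 household) is not met;
    otherwise up to $20,000 for Pell Grant recipients, $10,000 for others.
    """
    cap = 250_000 if household else 125_000
    if income >= cap:           # plan requires income *less than* the cap
        return 0
    return 20_000 if pell_recipient else 10_000

print(relief_amount(pell_recipient=True, income=60_000))    # Pell, under cap
print(relief_amount(pell_recipient=False, income=130_000))  # over individual cap
```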

[345] Webpage: “The Biden-Harris Administration’s Student Debt Relief Plan Explained.” U.S. Department of Education, Federal Student Aid. Accessed August 28, 2022 at <studentaid.gov>

Income-based repayment plans have long existed within the U.S. Department of Education. However, the Biden-Harris Administration is proposing a rule to create a new income-driven repayment plan that will substantially reduce future monthly payments for lower- and middle-income borrowers.

The rule would:

• Require borrowers to pay no more than 5% of their discretionary income monthly on undergraduate loans. This is down from the 10% available under the most recent income-driven repayment plan.

• Raise the amount of income that is considered non-discretionary income and therefore is protected from repayment, guaranteeing that no borrower earning under 225% of the federal poverty level—about the annual equivalent of a $15 minimum wage for a single borrower—will have to make a monthly payment.

• Forgive loan balances after 10 years of payments, instead of 20 years, for borrowers with loan balances of $12,000 or less.

• Cover the borrower’s unpaid monthly interest, so that unlike other existing income-driven repayment plans, no borrower’s loan balance will grow as long as they make their monthly payments—even when that monthly payment is $0 because their income is low.
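The proposed payment rule in the bullets above (5% of discretionary income, with income below 225% of the federal poverty level fully protected) can be sketched as a formula. The poverty-line figure passed in the example and the simple annual-to-monthly conversion are illustrative assumptions:

```python
def monthly_payment(annual_income: float, poverty_line: float) -> float:
    """Proposed IDR payment on undergraduate loans: 5% of discretionary income.

    Discretionary income is the portion of income above 225% of the federal
    poverty level; borrowers at or below that threshold owe $0 per month.
    """
    protected = 2.25 * poverty_line
    discretionary = max(0.0, annual_income - protected)
    return 0.05 * discretionary / 12

# A single borrower earning below 225% of an assumed poverty level pays nothing.
print(monthly_payment(30_000, poverty_line=14_580))
```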

[346] Report: “The Biden Administration Extends the Pause on Federal Student Loan Payments: Legal Considerations for Congress.” By Kevin M. Lewis and Edward C. Liu. Congressional Research Service, January 27, 2021. <crsreports.congress.gov>

Page 1:

The Higher Education Relief Opportunities for Students (HEROES) Act of 2003 authorizes the Secretary of Education (Secretary) to “waive or modify any statutory or regulatory provision applicable to” the Title IV loan programs “as the Secretary deems necessary” to ensure that individuals adversely affected by a Presidentially declared national emergency “are not placed in a worse position financially.” (The HEROES Act discussed in this Sidebar is not the same as the identically named COVID-19 relief bill that the House of Representatives passed in the 116th Congress.)

Pages 2–3:

Student Loan Relief During the Trump Administration

During the Trump Administration, both Congress and the Executive afforded temporary relief to certain Title IV borrowers to mitigate the COVID-19 emergency’s economic impact. Following President Trump’s declaration of a national emergency with respect to the pandemic under the National Emergencies Act (NEA), the Secretary announced in March 2020 that “[a]ll borrowers with federally held student loans” would “automatically have their interest rates set to 0% for a period of at least 60 days.” The Secretary also announced that “each of these borrowers” would “have the option to suspend their payments for at least two months.” The Secretary’s March 2020 announcement did not specify the statutory authority for this relief.

The following week, the Secretary announced that ED would also “halt collection actions and wage garnishments to provide additional assistance to borrowers.” Again, the Secretary’s announcement did not specify which statute she invoked to provide this assistance.

A few days later, Congress enacted the Coronavirus Aid, Relief, and Economic Security (CARES) Act, which codified aspects of the relief the Secretary previously granted. Section 3513 of the CARES Act required the Secretary to “suspend all payments due for” certain Title IV loans held by ED “through September 30, 2020.” Among other things, Section 3513 also (1) required the Secretary to “suspend all involuntary collection” activities on such loans during the payment suspension period, and (2) provided that interest would not accrue on such loans during that period.

The 116th Congress did not pass legislation extending Section 3513’s sunset date. Instead, in August 2020, the Secretary purported to extend the student loan relief through December 31, 2020. Although the Secretary’s August 2020 announcement did not specify the statutory authority for that extension, ED later published a Federal Register notice asserting that the Secretary based the March and August relief measures on the HEROES Act. …

The Secretary invoked the HEROES Act again in December 2020 to extend this student loan relief through January 31, 2021.

Student Loan Relief During the Biden Administration

On Inauguration Day, President Biden announced that “the Acting Secretary of Education will extend the pause on federal student loan payments and collections and keep the interest rate at 0%.” Although the President’s announcement did not specify how long this extension would last, ED’s website suggests the extension will remain in effect “at least through Sept. 30, 2021.” Additional details about the extension are currently unavailable.

Legal Issues and Considerations for Congress

The relief measures discussed above raise unresolved questions regarding the Secretary’s authority to waive or modify statutes and regulations in response to a national emergency. As far as CRS’s research reveals, no court has interpreted or applied the HEROES Act or reviewed ED’s actions taken pursuant to the Act. Thus, it appears no court has considered where the outer boundaries of the Secretary’s HEROES Act authorities lie. Moreover, before the COVID-19 pandemic, Secretaries generally invoked the HEROES Act relatively narrowly to grant relief to limited subsets of borrowers, such as deployed military service members or victims of certain natural disasters. As a result, judicial and administrative precedent cast little light on whether the HEROES Act authorizes the COVID-19 relief measures discussed here.

[347] Report: “The Biden Administration Extends the Pause on Federal Student Loan Payments: Legal Considerations for Congress.” By Kevin M. Lewis and Edward C. Liu. Congressional Research Service, January 27, 2021. <crsreports.congress.gov>

Page 1:

The Higher Education Relief Opportunities for Students (HEROES) Act of 2003 authorizes the Secretary of Education (Secretary) to “waive or modify any statutory or regulatory provision applicable to” the Title IV loan programs “as the Secretary deems necessary” to ensure that individuals adversely affected by a Presidentially declared national emergency “are not placed in a worse position financially.” (The HEROES Act discussed in this Sidebar is not the same as the identically named COVID-19 relief bill that the House of Representatives passed in the 116th Congress.)

Pages 2–3:

Student Loan Relief During the Trump Administration

During the Trump Administration, both Congress and the Executive afforded temporary relief to certain Title IV borrowers to mitigate the COVID-19 emergency’s economic impact. Following President Trump’s declaration of a national emergency with respect to the pandemic under the National Emergencies Act (NEA), the Secretary announced in March 2020 that “[a]ll borrowers with federally held student loans” would “automatically have their interest rates set to 0% for a period of at least 60 days.” The Secretary also announced that “each of these borrowers” would “have the option to suspend their payments for at least two months.” The Secretary’s March 2020 announcement did not specify the statutory authority for this relief.

The following week, the Secretary announced that ED would also “halt collection actions and wage garnishments to provide additional assistance to borrowers.” Again, the Secretary’s announcement did not specify which statute she invoked to provide this assistance.

A few days later, Congress enacted the Coronavirus Aid, Relief, and Economic Security (CARES) Act, which codified aspects of the relief the Secretary previously granted. Section 3513 of the CARES Act required the Secretary to “suspend all payments due for” certain Title IV loans held by ED “through September 30, 2020.” Among other things, Section 3513 also (1) required the Secretary to “suspend all involuntary collection” activities on such loans during the payment suspension period, and (2) provided that interest would not accrue on such loans during that period.

The 116th Congress did not pass legislation extending Section 3513’s sunset date. Instead, in August 2020, the Secretary purported to extend the student loan relief through December 31, 2020. Although the Secretary’s August 2020 announcement did not specify the statutory authority for that extension, ED later published a Federal Register notice asserting that the Secretary based the March and August relief measures on the HEROES Act. …

The Secretary invoked the HEROES Act again in December 2020 to extend this student loan relief through January 31, 2021.

Student Loan Relief During the Biden Administration

On Inauguration Day, President Biden announced that “the Acting Secretary of Education will extend the pause on federal student loan payments and collections and keep the interest rate at 0%.” Although the President’s announcement did not specify how long this extension would last, ED’s website suggests the extension will remain in effect “at least through Sept. 30, 2021.” Additional details about the extension are currently unavailable.

Legal Issues and Considerations for Congress

The relief measures discussed above raise unresolved questions regarding the Secretary’s authority to waive or modify statutes and regulations in response to a national emergency. As far as CRS’s research reveals, no court has interpreted or applied the HEROES Act or reviewed ED’s actions taken pursuant to the Act. Thus, it appears no court has considered where the outer boundaries of the Secretary’s HEROES Act authorities lie. Moreover, before the COVID-19 pandemic, Secretaries generally invoked the HEROES Act relatively narrowly to grant relief to limited subsets of borrowers, such as deployed military service members or victims of certain natural disasters. As a result, judicial and administrative precedent cast little light on whether the HEROES Act authorizes the COVID-19 relief measures discussed here.

[348] Report: “The Biden Student Loan Forgiveness Plan: Budgetary Costs and Distributional Impact.” Penn Wharton Budget Model, August 26, 2022. <budgetmodel.wharton.upenn.edu>

Summary: President Biden’s new student loan forgiveness plan includes three major components. We estimate that debt cancellation alone will cost up to $519 billion, with about 75% of the benefit accruing to households making $88,000 or less. Loan forbearance will cost another $16 billion. The new income-driven repayment (IDR) program would cost another $70 billion, increasing the total plan cost to $605 billion under strict “static” assumptions. However, depending on future IDR program details to be released and potential behavioral (i.e., “non-static”) changes, total plan costs could exceed $1 trillion.

[349] Calculated with the dataset: “Average Number of People per Household, by Race and Hispanic Origin, Marital Status, Age, and Education of Householder: 2021.” U.S. Census Bureau, November 2021. <www.census.gov>

“Total households [=] 129,931,000”

CALCULATIONS:

  • $605,000,000,000 cost / 129,931,000 households = $4,656 cost/household
  • $1,000,000,000,000 cost / 129,931,000 households = $7,696 cost/household
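The arithmetic above can be reproduced with a short script. The component estimates ($519 billion cancellation, $16 billion forbearance, $70 billion IDR, and the $1 trillion non-static ceiling) come from the Penn Wharton summary in footnote 348, and the household count comes from the Census Bureau dataset in footnote 349:

```python
# Reproduce the per-household cost figures cited above.
cancellation = 519_000_000_000   # debt cancellation (Penn Wharton estimate)
forbearance  =  16_000_000_000   # loan forbearance
idr          =  70_000_000_000   # new income-driven repayment program
static_total = cancellation + forbearance + idr   # $605 billion static total
households   = 129_931_000       # total U.S. households (Census Bureau, 2021)

print(f"Static total: ${static_total:,}")
print(f"Per household (static): ${static_total / households:,.0f}")
print(f"Per household ($1T non-static): ${1_000_000_000_000 / households:,.0f}")
```

Running this confirms the $605 billion total and the $4,656 and $7,696 per-household figures shown in the calculations.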

[350] Webpage: “Actions on House Resolution 1412: Higher Education Relief Opportunities for Students Act of 2003.” U.S. House of Representatives, 108th Congress (2003–2004). Accessed August 27, 2022 at <www.congress.gov>

“Sponsor: Rep. Kline, John [R-MN-2] (Introduced 03/25/2003)”

[351] “Introduction of the Higher Education Relief Opportunities for Students Act of 2003—H.R. 1412.” Congressional Record, March 25, 2003. <www.congress.gov>

John Kline of Minnesota in the House of Representatives

Tuesday, March 25, 2003

Mr. Kline. Mr. Speaker, I am pleased to introduce, along with several of my colleagues, the Higher Education Relief Opportunities for Students Act, HEROES, of 2003. This is a bill that expresses the support and commitment of the U.S. House of Representatives for the troops who protect and defend the United States. Specifically this bill provides authority to the Secretary of Education to assist students whose lives are being disrupted by being called to serve in the Armed Forces.

This bill is simple in its purpose. It extends the specific waiver authority within title IV of the Higher Education Act for the Secretary of Education, and allows him to maintain his commitment to our men and women in uniform by providing assistance and flexibility as they transfer in and out of postsecondary education during a time of national emergency. This waiver authority addresses the need to assist students who are being called up to active duty or active service.

This bill is specific in its intent—to ensure that as a result of a war, military contingency operation or a national emergency: Affected borrowers of Federal student assistance are not in a worse financial position; administrative requirements on affected individuals are minimized without affecting the integrity of the programs; current year income of affected individuals may be used to determine need for purposes of financial assistance; and the Secretary is provided the authority to address issues not yet foreseen.

The bill also urges all postsecondary institutions to continue their support and commitment to their students by providing a full refund of tuition, fees, and other charges to students who are members of the Armed Forces or are serving on active duty, including the Reserves and National Guard. Many times, America’s military are also students. They are called away from their families, class work and studies to serve our nation’s national defense. These heroes deserve the flexibility and accommodations that institutions of higher education can provide as they deploy and return to the classroom.

As families send loved ones abroad to defend our Nation, the Higher Education Relief Opportunities for Students Act will allow the Secretary of Education to reduce some of the effects of that upheaval here at home.

I am proud and delighted that a number of my colleagues have signed on as original cosponsors of the Higher Education Relief Opportunities for Students Act. It is an indication of the Congress’s commitment to our military and to our students and families, as well as to those who make higher education available. I look forward to swift passage of this legislation.

[352] Webpage: “Actions on House Resolution 1412: Higher Education Relief Opportunities for Students Act of 2003.” U.S. House of Representatives, 108th Congress (2003–2004). Accessed August 27, 2022 at <www.congress.gov>

“04/01/2003 – 12:17pm…The House proceeded with forty minutes of debate on H.R. 1412.”

[353] “Debate on the Higher Education Relief Opportunities for Students Act of 2003—H.R. 1412.” Congressional Record, April 1, 2003. <www.congress.gov>

Just Facts’ summary:

Kline opened the debate by having a clerk read the text of the bill. Kline then stated that the HEROES Act would relieve financial burdens on soldiers “while they defend our nation” and again emphasized that it will do this “without affecting the integrity” of student loan programs.

In response, Democrat Tim Ryan of Ohio noted that under the bill, the “interest” on these loans “will still be accruing; so this is a great first step, but I think we can do much better.” He then recommended that Congress pass an additional bill that will pay for this interest. “I have a bill that is the Active Reservists and National Guard Student Loan Relief Act which would do this,” said Ryan, “and I think we should look into it.”

After a few others spoke, Ryan explained that “under the current legislation that we are dealing with,” a reservist in his district named Krista Rosado who was called to active duty for up to two years “will accrue over $1,400 in additional interest on her loan. So when she does get back from service, she will owe this money.” He then stated:

I think the natural next step for us to take is to say to Krista, thank you for your service, thank you for your sacrifice, and we will take care of the interest on your loan while you were over serving your country.

Republican John Boehner of Ohio then replied to Ryan that “we have worked on his important addition to this bill, but under the 1973 Budget Act we are required to find offsets.” This means that other spending must be reduced so that the bill doesn’t increase the federal deficit.

Boehner then added that Ryan’s bill would cost about $10 million and “we will continue to work with you to try to find these offsets under the Budget Act so that we can, in fact, bring this bill to the floor.” Ultimately, Ryan’s bill never passed.

That exchange, along with many other statements during the debate, proves that the HEROES Act allowed only for the deferral or forbearance of student loans, not the cancellation of interest, much less the cancellation of the loans themselves. Such actions cost money and would thus impair the integrity of the program.

The debate also reveals that the bill applies to active-duty soldiers and to people specifically harmed by a national disaster, not to virtually everyone who owes student loans.

A few of the many statements in the debate that confirm the above include the following:

  • “their loan payments are deferred until they return”
  • “forbear a loan as our servicemen and servicewomen are activated”
  • “specific needs of each student whose education is interrupted when they are called to service”
  • “disrupted by a national disaster”
  • “in the case of a national emergency,” the Secretary of Education “can, in fact, defer these payments”

[354] Webpage: “Actions on House Resolution 1412: Higher Education Relief Opportunities for Students Act of 2003.” U.S. House of Representatives, 108th Congress (2003–2004). Accessed August 27, 2022 at <www.congress.gov>

04/01/2003 – 3:57pm … House … On motion to suspend the rules and pass the bill Agreed to by the Yeas and Nays: (2/3 required): 421–1 (Roll no. 96) ….

07/31/2003 … Senate … Passed Senate without amendment by Unanimous Consent.

[355] Webpage: “House Resolution 1412: Higher Education Relief Opportunities for Students Act of 2003.” U.S. House of Representatives, 108th Congress (2003–2004). Accessed August 27, 2022 at <www.congress.gov>

SEC. 2. Waiver Authority for Response to Military Contingencies and National Emergencies.

(a) Waivers and Modifications.—

(1) In general.—Notwithstanding any other provision of law, unless enacted with specific reference to this section, the Secretary of Education (referred to in this Act as the “Secretary” ) may waive or modify any statutory or regulatory provision applicable to the student financial assistance programs under title IV of the Act as the Secretary deems necessary in connection with a war or other military operation or national emergency to provide the waivers or modifications authorized by paragraph (2).

(2) Actions authorized.—The Secretary is authorized to waive or modify any provision described in paragraph (1) as may be necessary to ensure that—

(A) recipients of student financial assistance under title IV of the Act who are affected individuals are not placed in a worse position financially in relation to that financial assistance because of their status as affected individuals;

(B) administrative requirements placed on affected individuals who are recipients of student financial assistance are minimized, to the extent possible without impairing the integrity of the student financial assistance programs, to ease the burden on such students and avoid inadvertent, technical violations or defaults;

[356] Ruling: Myra Brown v. U.S. Department of Education. United States District Court for the Northern District of Texas, Fort Worth Division, November 10, 2022. By Mark T. Pittman. <storage.courtlistener.com>

Page 1:

The Constitution vests “all legislative powers” in Congress. This power, however, can be delegated to the executive branch. But if the executive branch seeks to use that delegated power to create a law of vast economic and political significance, it must have clear congressional authorization. If not, the executive branch unconstitutionally exercises “legislative powers” vested in Congress. In this case, the HEROES Act—a law to provide loan assistance to military personnel defending our nation—does not provide the executive branch clear congressional authorization to create a $400 billion student loan forgiveness program. The Program is thus an unconstitutional exercise of Congress’s legislative power and must be vacated. ….

Pages 21–22:

First, the HEROES Act does not mention loan forgiveness. If Congress provided clear congressional authorization for $400 billion in student loan forgiveness via the HEROES Act, it would have mentioned loan forgiveness. The Act allows the Secretary [of Education] only to “waive or modify” provisions of title IV. The Secretary then uses that provision to rewrite title IV portions to provide for loan forgiveness. …

Second, the portions of the HEROES Act Defendants rely on fail to provide clear congressional authorization for the Program. Defendants rely on the COVID-19 pandemic as their justification for the Program. They contend that the HEROES Act allows the Secretary the authority to address the financial hardship of the COVID-19 pandemic. Indeed, the COVID-19 pandemic falls within the HEROES Act’s definition of an emergency. … But it is unclear whether the Program is “necessary in connection with [that] national emergency.” … The COVID-19 pandemic was declared a national emergency almost three years ago and declared weeks before the Program by the President as “over.”20 Thus, it is unclear if COVID-19 is still a “national emergency” under the Act.

Page 25:

Conclusion

This case involves the question of whether Congress—through the HEROES Act—gave the Secretary authority to implement a Program that provides debt forgiveness to millions of student-loan borrowers, totaling over $400 billion. Whether the Program constitutes good public policy is not the role of this Court to determine.21 Still, no one can plausibly deny that it is either one of the largest delegations of legislative power to the executive branch, or one of the largest exercises of legislative power without congressional authority in the history of the United States.

In this country, we are not ruled by an all-powerful executive with a pen and a phone. Instead, we are ruled by a Constitution that provides for three distinct and independent branches of government. As President James Madison warned, “[t]he accumulation of all powers, legislative, executive, and judiciary, in the same hands, whether of one, a few, or many, and whether hereditary, self-appointed, or elective, may justly be pronounced the very definition of tyranny.” The Federalist No. 47.

The Court is not blind to the current political division in our country. But it is fundamental to the survival of our Republic that the separation of powers as outlined in our Constitution be preserved. And having interpreted the HEROES Act, the Court holds that it does not provide “clear congressional authorization” for the Program proposed by the Secretary.

[357] Ruling: Biden v. Nebraska. U.S. Supreme Court, June 30, 2023. Decided 6–3. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh, Barrett. Dissenting: Kagan, Sotomayor, Jackson. <caselaw.findlaw.com>

Title IV of the Higher Education Act of 1965 (Education Act) governs federal financial aid mechanisms, including student loans. … The Act authorizes the Secretary of Education to cancel or reduce loans in certain limited circumstances. The Secretary may cancel a set amount of loans held by some public servants…. He may also forgive the loans of borrowers who have died or become “permanently and totally disabled, … borrowers who are bankrupt, … and borrowers whose schools falsely certify them, close down, or fail to pay lenders. …

The issue presented in this case is whether the Secretary has authority under the Higher Education Relief Opportunities for Students Act of 2003 (HEROES Act) to depart from the existing provisions of the Education Act and establish a student loan forgiveness program that will cancel about $430 billion in debt principal and affect nearly all borrowers. Under the HEROES Act, the Secretary “may waive or modify any statutory or regulatory provision applicable to the student financial assistance programs under title IV of the [Education Act] as the Secretary deems necessary in connection with a war or other military operation or national emergency.” … As relevant here, the Secretary may issue such waivers or modifications only “as may be necessary to ensure” that “recipients of student financial assistance under title IV of the [Education Act affected by a national emergency] are not placed in a worse position financially in relation to that financial assistance because of [the national emergency].” …

In 2022, as the COVID-19 pandemic came to its end, the Secretary invoked the HEROES Act to issue “waivers and modifications” reducing or eliminating the federal student debt of most borrowers. Borrowers with eligible federal student loans who had an income below $125,000 in either 2020 or 2021 qualified for a loan balance discharge of up to $10,000. Those who previously received Pell Grants—a specific type of federal student loan based on financial need—qualified for a discharge of up to $20,000. …

This case implicates many of the factors present in past cases raising similar separation of powers concerns. The Secretary has never previously claimed powers of this magnitude under the HEROES Act; “[n]o regulation premised on” the HEROES Act “has even begun to approach the size or scope” of the Secretary’s program. … The “ ‘economic and political significance’ ” of the Secretary’s action is staggering. … And the Secretary’s assertion of administrative authority has “conveniently enabled [him] to enact a program” that Congress has chosen not to enact itself. … The Secretary argues that the principles explained in West Virginia and its predecessors should not apply to cases involving government benefits. But major questions cases “have arisen from all corners of the administrative state,” … and this is not the first such case to arise in the context of government benefits. …

All this leads the Court to conclude that “[t]he basic and consequential tradeoffs” inherent in a mass debt cancellation program “are ones that Congress would likely have intended for itself.” … In such circumstances, the Court has required the Secretary to “point to ‘clear congressional authorization’ ” to justify the challenged program. … And as explained, the HEROES Act provides no authorization for the Secretary’s plan when examined using the ordinary tools of statutory interpretation—let alone “clear congressional authorization” for such a program. …

Reversed and remanded.

[358] Press release: “Fact Sheet: President Biden Announces New Actions to Provide Debt Relief and Support for Student Loan Borrowers.” U.S. Department of Education, June 30, 2023. <www.ed.gov>

No President has fought harder for student debt relief than President Biden, and he’s not done yet. President Biden will not let Republican elected officials succeed in denying hardworking Americans the relief they need. In light of the Supreme Court’s ruling this morning, President Biden and his Administration have already taken two steps this afternoon aimed at providing debt relief for as many borrowers as possible, as fast as possible, and supporting student loan borrowers:

• The Secretary of Education initiated a rulemaking process aimed at opening an alternative path to debt relief for as many working and middle-class borrowers as possible, using the Secretary’s authority under the Higher Education Act.

• The Department of Education (Department) finalized the most affordable repayment plan ever created, ensuring that borrowers will be able to take advantage of this plan this summer—before loan payments are due. …

The Biden-Harris Administration today also finalized the most affordable repayment plan ever created, called the Saving on a Valuable Education (SAVE) plan. This income-driven repayment plan will cut borrowers’ monthly payments in half, help the typical borrower save more than $1,000 per year on payments, allow many borrowers to make $0 monthly payments, and ensure borrowers don’t see their balances grow from unpaid interest. Specifically, the plan will:

• For undergraduate loans, cut in half the amount that borrowers have to pay each month from 10% to 5% of discretionary income.

• Raise the amount of income that is considered non-discretionary income and therefore is protected from repayment, guaranteeing that no borrower earning under 225% of the federal poverty level—about the annual equivalent of a $15 minimum wage for a single borrower—will have to make a monthly payment under this plan.

• Forgive loan balances after 10 years of payments, instead of 20 years, for borrowers with original loan balances of $12,000 or less. The Department estimates that this reform will allow nearly all community college borrowers to be debt-free within 10 years.

• Not charge borrowers with unpaid monthly interest, so that unlike other existing income-driven repayment plans, no borrower’s loan balance will grow as long as they make their monthly payments—even when that monthly payment is $0 because their income is low.

• All student borrowers in repayment will be eligible to enroll in the SAVE plan. They will be able to enroll later this summer, before any monthly payments are due. Borrowers who sign up or are already signed up for the current Revised Pay as You Earn (REPAYE) plan will be automatically enrolled in SAVE once the new plan is implemented.
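The SAVE payment terms listed above reduce to a simple formula: for undergraduate loans, the monthly payment is 5% of the income a borrower earns above 225% of the federal poverty guideline, divided over twelve months, with a floor of $0. The sketch below illustrates that formula; the poverty guideline figure is an assumed placeholder for a single-person household, since actual guidelines vary by year and household size:

```python
# Simplified illustration of the SAVE payment formula described above.
# The 5% rate and 225% protection threshold come from the press release;
# the poverty guideline below is an assumed single-person placeholder.
ASSUMED_POVERTY_GUIDELINE = 14_580

def save_monthly_payment(annual_income, rate=0.05):
    """Monthly payment: rate * (income above 225% of the guideline) / 12."""
    protected = 2.25 * ASSUMED_POVERTY_GUIDELINE
    discretionary = max(0.0, annual_income - protected)
    return rate * discretionary / 12

# A borrower earning below the protected threshold owes $0 per month.
print(save_monthly_payment(30_000))              # 0.0
print(round(save_monthly_payment(50_000), 2))    # 71.65
```

Under these assumptions, a borrower earning $30,000 falls below the protected threshold (2.25 × $14,580 = $32,805) and pays nothing, while a borrower earning $50,000 pays about $72 per month.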

[359] Press release: “Secretary Cardona Statement on Supreme Court Ruling on Biden Administration’s One Time Student Debt Relief Plan.” U.S. Department of Education, June 30, 2023. <www.ed.gov>

Today, the Supreme Court ruled against students and families across the country. It’s an outrage that lawsuits brought on by Republican elected officials have blocked critical student debt relief….

President Biden, Vice President Harris, and I will never stop fighting for borrowers, which is why we are using every tool available to provide them with needed relief. Earlier today, the Department of Education initiated a regulatory process to provide debt relief….

[360] Proposed rule: “Intent to Establish a Negotiated Rulemaking Committee.” Federal Register, July 6, 2023. <www.govinfo.gov>

Page 43069: “We announce our intention to establish a negotiated rulemaking committee to prepare proposed regulations for the Federal Student Aid programs authorized under title IV of the Higher Education Act of 1965, as amended (HEA).”

[361] Press release: “Biden-Harris Administration to Provide 804,000 Borrowers with $39 Billion in Automatic Loan Forgiveness as a Result of Fixes to Income Driven Repayment Plans.” U.S. Department of Education, July 14, 2023. <www.ed.gov>

“The Department of Education (Department) today will begin notifying more than 804,000 borrowers that they have a total of $39 billion in Federal student loans that will be automatically discharged in the coming weeks.”

[362] Press release: “Biden-Harris Administration to Provide 804,000 Borrowers with $39 Billion in Automatic Loan Forgiveness as a Result of Fixes to Income Driven Repayment Plans.” U.S. Department of Education, July 14, 2023. <www.ed.gov>

The Department of Education (Department) today will begin notifying more than 804,000 borrowers that they have a total of $39 billion in Federal student loans that will be automatically discharged in the coming weeks. In total, the Biden-Harris Administration has approved more than $116.6 billion in student loan forgiveness for more than 3.4 million borrowers.

The forthcoming discharges are a result of fixes implemented by the Biden-Harris Administration to ensure all borrowers have an accurate count of the number of monthly payments that qualify toward forgiveness under income-driven repayment (IDR) plans. These fixes are part of the Department’s commitment to address historical failures in the administration of the Federal student loan program in which qualifying payments made under IDR plans that should have moved borrowers closer to forgiveness were not accounted for. Borrowers are eligible for forgiveness if they have accumulated the equivalent of either 20 or 25 years of qualifying months. …

Today’s action builds on the Biden-Harris Administration’s unparalleled record of student debt relief to date, including:

• $45 billion for 653,800 public servants through improvements to PSLF;

• $10.5 billion for 491,000 borrowers who have a total and permanent disability; and

• $22 billion for nearly 1.3 million borrowers who were cheated by their schools, saw their schools precipitously close, or are covered by related court settlements.

[363] Webpage: “Who Qualifies for Borrower Defense?” U.S. Department of Education, Federal Student Aid Office. Accessed March 25, 2022 at <studentaid.gov>

Under the law, you may be eligible for borrower defense to repayment discharge of the federal student loans that you took out to attend a school if that school misled you, or engaged in other misconduct in violation of certain state laws. Specifically, you may assert borrower defense by demonstrating that the school, through an act or omission, violated state law directly related to your federal student loan or to the educational services for which the loan was provided. You may be eligible for borrower defense regardless of whether your school closed or you are otherwise eligible for loan discharge under other laws. You will only be eligible for this type of federal student loan discharge if your school’s misleading activities or other misconduct directly relate to the loan or to the educational services for which the loan was provided. You will not be eligible for this type of loan discharge based on claims that are not directly related to your loan or the educational services provided by the school. For example, personal injury claims or claims based on allegations of harassment are not bases for a borrower defense application.

[364] Press release: “U.S. Department of Education Announces $42 Billion in Approved Public Service Loan Forgiveness for More Than 615,000 Borrowers Since October 2021.” U.S. Department of Education, May 8, 2023. <www.ed.gov>

To mark Public Service Recognition Week, the U.S. Department of Education (Department) today announced that, as of the beginning of May 2023, it has approved a total of $42 billion in Public Service Loan Forgiveness (PSLF) for more than 615,000 borrowers since October 2021. This is a result of the temporary PSLF changes made by the Biden-Harris Administration that made it easier for borrowers to reach forgiveness. At the end of the previous Administration, only about 7,000 borrowers had been approved for the PSLF program.

[365] Report: “Higher Education: Education Should Strengthen Oversight of Schools and Accreditors.” U.S. Government Accountability Office, January 22, 2015. <www.gao.gov>

Page 2 (of PDF): “To access federal student aid—which totaled more than $136 billion in fiscal year 2013—schools must be accredited to ensure they offer a quality education.”

Page 4: “Accreditors … play a critical role in protecting the federal investment in higher education as part of the ‘triad’ that oversees schools participating in federal student aid programs authorized under Title IV of the Higher Education Act.8,9”

Page 5:

Accrediting Agencies: Apply and enforce standards that help ensure that the education offered by a postsecondary school is of sufficient quality to achieve the objectives for which it is offered. …

The purpose of accreditation … is to help ensure that member schools meet quality standards established by accrediting agencies. While accreditation first arose in the U.S. as a means of ensuring academic quality by nongovernmental peer evaluation, today the process also serves as one of the bases for determining a school’s eligibility to participate in federal student aid programs. …

Accreditation … is a peer review process that serves several purposes in addition to being a gatekeeper for federal funds….

Pages 6–7:

In general, two different types of accreditors—regional and national—offer accreditation to schools that allows the schools to access federal student aid funds.11,12 Regional accreditors accredit mostly nonprofit and public schools, while national accreditors generally accredit for-profit schools. At the time of our review, regional accreditors had 3,134 member schools in total, while national accreditors had 3,719.13 Seven regional accreditors accredit schools within a particular region and have historically accredited public and private nonprofit schools that award degrees. In addition, eight national accreditors operate nationwide and have historically accredited vocational or technical schools that do not award degrees. Differences between regional and national accreditors still exist, as seen in figure 2, but some for-profit schools have obtained regional accreditation in recent years and many for-profit schools currently award two- and four-year degrees.

[366] Report: “Higher Education: Education Should Strengthen Oversight of Schools and Accreditors.” U.S. Government Accountability Office, January 22, 2015. <www.gao.gov>

Page 4: “Accreditors—generally nongovernmental, nonprofit organizations—play a critical role in protecting the federal investment in higher education….”

Page 5:

[U.S. Department of] Education: Recognize accreditors determined to be reliable authorities as to the quality of education offered by schools; certify schools as eligible to participate in federal student aid programs; and ensure that participating schools comply with the laws, regulations, and policies governing federal student aid. …

Accreditation agencies and processes predate the Higher Education Act [of 1965], and accreditation is a peer review process that serves several purposes in addition to being a gatekeeper for federal funds, including facilitating the transferability of courses and credits across member schools. According to representatives of schools and accrediting agencies, accreditation also encourages schools to maintain a focus on self-improvement.

While Education is required to determine whether accrediting agencies have standards in certain areas before recognizing them, the accrediting agencies are responsible for evaluating member schools to determine if they meet the accreditors’ standards. This accreditation process generally occurs at least every 10 years, depending on the accreditor and the school. The process is typically conducted by volunteer peer evaluators, generally from other member schools, selected by the accreditor, with final accreditation decisions made by a board that includes representatives from member schools and the public. While specific steps vary by accrediting agency, schools generally go through a similar accreditation process….

[367] Webpage: “The Executive Branch.” White House. Accessed March 2, 2022 at <www.whitehouse.gov>

Under Article II of the Constitution, the President is responsible for the execution and enforcement of the laws created by Congress. Fifteen executive departments—each led by an appointed member of the President’s Cabinet—carry out the day-to-day administration of the federal government. They are joined in this by other executive agencies such as the CIA and Environmental Protection Agency, the heads of which are not part of the Cabinet, but who are under the full authority of the President. The President also appoints the heads of more than 50 independent federal commissions, such as the Federal Reserve Board or the Securities and Exchange Commission, as well as federal judges, ambassadors, and other federal offices. The Executive Office of the President (EOP) consists of the immediate staff to the President, along with entities such as the Office of Management and Budget and the Office of the United States Trade Representative. …

The Cabinet is an advisory body made up of the heads of the 15 executive departments. Appointed by the President and confirmed by the Senate, the members of the Cabinet are often the President’s closest confidants. In addition to running major federal agencies, they play an important role in the Presidential line of succession—after the Vice President, Speaker of the House, and Senate President pro tempore, the line of succession continues with the Cabinet offices in the order in which the departments were created. All the members of the Cabinet take the title Secretary, excepting the head of the Justice Department, who is styled Attorney General. …

Department of Education

The mission of the Department of Education is to promote student learning and preparation for college, careers, and citizenship in a global economy by fostering educational excellence and ensuring equal access to educational opportunity.

The Department administers federal financial aid for higher education, oversees educational programs and civil rights laws that promote equity in student learning opportunities, collects data and sponsors research on America’s schools to guide improvements in education quality, and works to complement the efforts of state and local governments, parents, and students.

The U.S. Secretary of Education oversees the Department’s 4,200 employees and $68.6 billion budget.

[368] Report: “Higher Education: Education Should Strengthen Oversight of Schools and Accreditors.” U.S. Government Accountability Office, January 22, 2015. <www.gao.gov>

Page 9:

The Higher Education Act requires accreditors to report certain sanctions, including terminations and probations, to Education within 30 days, and to provide Education a summary of the reasons leading them to terminate a school’s accreditation.19 Regional accreditors recently agreed on common sanction definitions, while national accrediting agencies do not have agreed-upon sanction definitions….

19 Accreditors must also report such sanctions, and provide summaries to the appropriate state licensing or authorizing agency. … Specifically, accreditors must notify Education and the appropriate state licensing or authorizing agency of any final decision to place a school on probation; deny, withdraw, suspend, revoke, or terminate a school’s accreditation; or take other adverse action, as defined by the accrediting agency. … Accreditors must provide written notice to the public of such sanctions within 24 hours of its notice to the school.

[369] Report: “Higher Education: Education Should Strengthen Oversight of Schools and Accreditors.” U.S. Government Accountability Office, January 22, 2015. <www.gao.gov>

Page 6: “In general, two different types of accreditors—regional and national—offer accreditation to schools that allows the schools to access federal student aid funds.11,12 Regional accreditors accredit mostly nonprofit and public schools, while national accreditors generally accredit for-profit schools.”

Page 8:

Areas in Which Accreditors Are Required to Have Standards

1. Success with respect to student achievement (Standards may be established by the school and differ according to its mission.)

2. Curricula

3. Faculty

4. Facilities, equipment, and supplies

5. Fiscal and administrative capacity

6. Student support services

7. Recruiting and admissions practices

8. Measures of program length and objectives

9. Student complaints

10. Compliance with federal student aid program responsibilities

Pages 14–15:

In addition, the proportion of member schools that accreditors sanctioned varied. For example, two accreditors each sanctioned fewer than 2 percent of their member schools during our timeframe, compared to 41 percent for another accreditor. A representative from one accrediting agency explained that a key challenge for accreditors is grappling with competing expectations of accreditation. The representative noted that there is a general view by policy makers and those who influence policy that accreditors do not terminate accreditation enough. However, if an accreditor does terminate a particular school’s accreditation, she said there may be significant negative reaction from the public in the affected region, and a view that the accreditor is being too punitive.29

Page 18:

Reasons for Accreditor Sanctions:

Academic quality: issues with student achievement in relation to the mission and curricula, or other student outcomes.

Administrative capability: issues such as those related to facilities, supplies, and administrative capability.

Financial capability: issues with financial capability and compliance with federal student aid responsibilities.

Integrity: fraud or misrepresentation.

Governance: issues with division of responsibility, such as between the Board and a college president.

Institutional Effectiveness: issues related to long-term plans for assessing learning and academic achievement.

Page 22:

We found that, on average, accreditors were no more likely to issue terminations or probations to schools with weaker student outcomes compared to schools with stronger student outcomes from October 2009 through March 2014…. This held true for one combined indicator incorporating all of the student outcome characteristics we reviewed, as well as for most of the individual characteristics we examined. … Regional accreditors, however, were more likely to issue terminations or probations to schools with weaker outcomes on the combined indicator. (See appendix I for additional details on this analysis and appendix III for additional information on accreditor sanctions associated with student outcomes.)

Pages 23–25:

Selected Student Outcome Characteristics:

Three-Year Cohort Default Rate: the percent of borrowers in default 3 years after entering repayment status. Education views this characteristic as an indicator of academic quality at schools, since students who received a lower quality education may be less likely to have adequate income to repay their loans.

Forbearance Rate: the percent of borrowers in forbearance (and therefore not repaying their loans on a temporary basis) during the official cohort default period. Education views this characteristic as an indicator of academic quality at schools, since students who received a lower quality education may be less likely to have adequate income to repay their loans.

Graduation Rate: the percent of first-time full-time degree/certificate-seeking undergraduate students who complete a program within 150 percent of the program length. A low graduation rate may indicate a lack of academic quality.

Dropout Rate: the percent of students who left school during a particular year, but did not graduate. A high dropout rate may indicate a lack of academic quality.

Retention Rate: the percent of first-time degree/certificate-seeking students who enrolled in one fall and either successfully completed their program or re-enrolled in the next fall. A low retention rate may indicate a lack of academic quality.

Increases in Federal Student Aid: annual growth in federal student aid volume, which may indicate in extreme cases that growth may be too rapid to maintain academic and administrative services needed to adequately support students.

Number of Program Review Findings: the number of findings at schools selected by Education for in-depth review due to the presence of certain risk factors, and the number of issues found in those reviews. …

Although accreditors are required by law to have standards in academic and financial areas, among others, they are not required to use the student outcome characteristics that we selected to assess school academic quality, or to sanction members with weaker outcomes. Some accreditors do examine school student-level outcomes as benchmarks to determine whether their member schools are providing quality education, but would not necessarily sanction or revoke the accreditation of a school for not meeting these benchmarks.39

Table 3: Likelihood of Termination or Probation for Schools with Weaker vs. Stronger Individual Student Outcome Characteristics, by Type of Accreditor, October 2009 through March 2014

Was there a significant difference in accreditors’ responses to weaker and stronger student outcomesa at schools?

                       Default Rate   Graduation Rate   Dropout Rate   Retention Rate   Forbearance Rate
Overall                Yes            No                No             No               No
Regional accreditors   Yes            Yes               Yes            Yes              No
National accreditors   No             No                No             No               No

Source: GAO [U.S. Government Accountability Office] analysis of school-level student outcome characteristics collected by Education and data from the accreditation database. …

Notes: We used statistical techniques that allowed us to examine accreditors’ likelihood of sanctioning schools with weaker student outcome characteristics, compared to schools with stronger outcomes, for each individual outcome. Schools with weaker student outcomes were considered to be those in the bottom vs. the top for each characteristic (those in the 1st vs. 99th percentile and 5th vs. 95th percentile). “Yes” indicates that the difference between the 1st and 99th percentiles and/or 5th and 95th percentiles was statistically significant at the 95 percent confidence level. All comparisons were significant for the 1st and 99th percentiles as well as for the 5th and 95th percentiles, with the exception of default rate for regional accreditors, which was only significant when comparing the 5th and 95th percentiles.

a Default rate indicates the percent of borrowers who entered repayment in fiscal 2009 or 2010 and were in default as of the end of the second following fiscal year; graduation rates reported to IPEDS in 2011 and 2012 are for first-time full-time degree/certificate-seeking undergraduate students that completed their degree within 150 percent of the expected time; dropout rate indicates the total number of withdrawals reported by each school during a particular year divided by the total number of graduates plus withdrawals reported to the National Student Loan Data System for that year for award years 2008–2009 through 2012–2013; retention rate indicates the percent of first-time degree/certificate-seeking students who enrolled in the previous fall and either successfully completed their program or re-enrolled in the next fall as reported to IPEDS in the fall of 2010 and 2011; and forbearance rate indicates the percent of borrowers who entered repayment status in fiscal year 2009 and 2010 and were in forbearance as of the end of the following fiscal year.

Because the graduation rate collected by Education is limited to first-time full-time degree/certificate-seeking undergraduate students, we also estimated accreditors’ likelihood of sanctioning schools with higher dropout rates.40 Similar to the results of our graduation rate analysis, we found that national accreditors were not more likely to issue terminations or probations to schools with higher dropout rates than those with lower dropout rates. In contrast, regional accreditors were more likely to issue terminations or probations to schools with higher dropout rates (see table 3 above).
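The table notes above describe comparing accreditors' likelihood of sanctioning schools at the weak versus strong ends of each outcome characteristic, with significance judged at the 95 percent confidence level. A minimal, stdlib-only sketch of that kind of comparison is below; the data are synthetic and the two-proportion z-test is a simplification of whatever model GAO actually fit.

```python
# Hedged sketch of the comparison described in the table notes: is the
# sanction rate for schools with weak outcomes (e.g., bottom 5 percent on a
# characteristic) significantly different from the rate for schools with
# strong outcomes (top 5 percent)? Counts below are synthetic, not GAO's.
import math

def two_proportion_z(sanctioned_a, n_a, sanctioned_b, n_b):
    """Two-proportion z-test; returns the z statistic."""
    p_a, p_b = sanctioned_a / n_a, sanctioned_b / n_b
    pooled = (sanctioned_a + sanctioned_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Synthetic example: weak-outcome group vs. strong-outcome group.
z = two_proportion_z(sanctioned_a=18, n_a=100, sanctioned_b=5, n_b=100)
significant = abs(z) > 1.96   # 95 percent confidence level
```

A "Yes" in the table corresponds to a statistically significant difference of this kind; a "No" means the sanction rates at the two extremes could not be distinguished.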

Pages 33–34:

For 36 of the 93 schools receiving federal student aid funds that were placed on probation by their accreditors in fiscal year 2012, we found no indication of follow-up activities by Education between the beginning of fiscal year 2012 and December 2013.56,57 Not all accreditor sanctions require follow-up by Education, such as a sanction issued for failure to obtain student feedback. However, oversight actions by Education may be warranted if accreditor sanctions indicate potential federal student aid violations or other weaknesses affecting a school’s ability to appropriately administer federal student aid programs. As discussed above, our review of 10 schools with fiscal year 2012 accreditor sanctions found three cases in which analysts had no record of accreditor sanctions that could indicate a need for heightened federal student aid oversight. Because Education did not capture its decisions or the rationale for them in these cases, it is not possible to know if analysts did not review the cases at all, or if they reviewed them and determined that no action should be taken.

Pages 35–36:

Unclear guidance from Education may also make it difficult for Education staff who oversee schools to respond consistently to these sanction notifications and contribute to lapses in oversight of schools, since the guidance does not lay out the recommended approach to specific types of accreditor sanctions.60 Moreover, although several officials who oversee schools told us they believed official guidance required them to restrict access to federal student aid funds for schools with show cause orders, the guidance does not specifically refer to show cause orders. In addition, the fact that Education may not have reviewed accreditor information about up to one-third of the 93 schools that were receiving federal student aid funds and that were placed on probation in fiscal year 2012, as discussed above, may also reflect the lack of clear guidance by sanction type. …

Moreover, in part because Education’s guidance does not lay out the recommended approach to specific types of accreditor sanctions, officials who oversee schools do not consistently view accreditor sanction notifications as a valuable oversight tool. For example, one official noted that her team would never respond to accreditor probations because they occur too frequently to track and would disrupt other work.62 However, our review found that just under 100 schools of the more than 6,000 participating in federal student aid programs were placed on probation by their accreditor in fiscal year 2012. Another official said reviewing accreditor sanctions was not very useful in overseeing schools, as accreditors would take additional action to prompt a response by Education if a school’s situation became more serious. However, other officials who oversee schools stated that they found show cause order notifications helpful.63 Consequently, Education’s response to sanctions is inconsistent. Since accreditors may take other, informal steps prior to issuing a sanction, as discussed earlier in the report, accreditor sanctions can in fact be a serious indication of problems at a school. More specifically, all accreditor sanctions—including probations—can be an important source of information on schools. Consistent with federal internal control standards that call for ongoing, continual monitoring, reviewing accreditor sanctions in a timely manner can help analysts who oversee schools detect school compliance issues as they occur and prevent more serious problems from developing in the future.64

[370] Report: “Higher Education: Education Should Strengthen Oversight of Schools and Accreditors.” U.S. Government Accountability Office, January 22, 2015. <www.gao.gov>

Page 40:

However, our analysis found that accreditors were no more likely to sanction schools with weaker student outcomes than schools with stronger student outcomes. These findings raise questions about whether existing accreditor standards are sufficient to ensure the quality of schools, whether Education is effectively determining if these standards ensure educational quality, and whether federal student aid funds are appropriately safeguarded.

[371] Press release: “Fact Sheet: Protecting Students from Abusive Career Colleges.” U.S. Department of Education, June 8, 2015. <bit.ly>

Over the past six years, the Education Department has taken unprecedented steps to hold career colleges accountable for giving students what they deserve: a high-quality, affordable education that prepares them for their careers. The Department established tougher regulations targeting misleading claims by colleges and incentives that drove sales people to enroll students through dubious promises. The Department has cracked down on bad actors through investigations and enforcement actions. The Department also issued “gainful employment” regulations, which will help ensure that students at career colleges don’t end up with debt they cannot repay. The Department will continue to hold institutions accountable in order to improve the value of their programs, protect students from abusive colleges, and safeguard the interests of taxpayers.

[372] Report: “The Budget and Economic Outlook: Fiscal Years 2013 to 2023.” U.S. Congressional Budget Office, February 2013. <www.cbo.gov>

Page 25:

However, several factors—collectively labeled other means of financing and not directly included in budget totals—also affect the government’s need to borrow from the public. Among them are reductions (or increases) in the government’s cash balance and in the cash flows associated with federal credit programs (such as those related to student loans and mortgage guarantees) because only the subsidy costs of those programs (calculated on a present-value basis) are reflected in the budget deficit.

CBO [U.S. Congressional Budget Office] projects that Treasury borrowing will be $104 billion more than the projected budget deficit in fiscal year 2013, mainly to finance student loans. Each year from 2014 to 2023, borrowing by the Treasury is expected to exceed the amount of the deficit, mostly because of the need to provide financing for student loans and other credit programs. CBO projects that the government will need to borrow $76 billion more per year, on average, during that period than the budget deficits would suggest.
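The mechanism CBO describes — borrowing that exceeds the deficit because only a loan's subsidy cost appears in budget totals — can be sketched with illustrative numbers (these are assumptions, not CBO's projections):

```python
# Hedged sketch of the "other means of financing" gap described above: the
# budget deficit records only a loan cohort's subsidy cost, but the Treasury
# must borrow the full amount disbursed. Figures are illustrative.

disbursed = 100.0        # new direct loans disbursed this year ($ billions)
subsidy_rate = -0.09     # FCRA subsidy rate (negative = recorded as savings)

budget_effect = disbursed * subsidy_rate   # only this hits the deficit
cash_needed = disbursed                    # principal actually lent out

# Borrowing exceeds the deficit by the full principal plus the recorded
# savings: the loans themselves are financed outside the budget totals.
borrowing_above_deficit = cash_needed - budget_effect
```

With these assumed figures, a cohort that reduces the recorded deficit by $9 billion still requires $100 billion of cash, so Treasury borrowing runs $109 billion above what the deficit alone would suggest.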

[373] Report: “Analytical Perspectives: Budget of the U.S. Government, Fiscal Year 2012.” White House Office of Management and Budget, June 18, 2010. <www.gpo.gov>

Page 139:

To illustrate the budgetary and non-budgetary components of a credit program, consider a portfolio of new direct loans made to a cohort of college students. To encourage higher education, the Government offers loans at a lower cost than private lenders. Students agree to repay the loans according to the terms of their promissory notes. The loan terms may include lower interest rates or longer repayment periods than would be available from private lenders. Some of the students are likely to become delinquent or default on their loans, leading to Government losses to the extent the Government is unable to recover the full amount owed by the students. … In other words, the subsidy cost is the difference in present value between the amount disbursed by the Government and the estimated value of the loan assets the Government receives in return. Because the loan assets have value, the remainder of the transaction (beyond the amount recorded as a subsidy) is simply an exchange of financial assets of equal value and does not result in a cost to the Government.
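The subsidy-cost definition above — the difference in present value between the amount disbursed and the estimated value of the loan assets received in return — can be sketched as follows. All amounts and rates are illustrative assumptions, not figures from the budget:

```python
# Hedged sketch of the subsidy cost described above: the amount disbursed
# minus the present value of the cash the government expects to get back.
# Principal, repayments, and discount rate are illustrative assumptions.

def present_value(cash_flows, rate):
    """Discount a list of year-end cash flows to today at a single rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

disbursed = 10_000.0               # loan principal paid out today
expected_inflows = [2_900.0] * 4   # expected repayments, net of defaults
treasury_rate = 0.03               # Treasury-based discount rate

subsidy = disbursed - present_value(expected_inflows, treasury_rate)
# A negative result means the loan is recorded as net income to the
# government, since the discounted repayments exceed the amount disbursed.
```

Here the discounted repayments exceed the $10,000 disbursed, so the subsidy is negative: the remainder of the transaction, as the passage notes, is simply an exchange of financial assets of equal value.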

[374] Report: “Fair-Value Accounting for Federal Credit Programs.” U.S. Congressional Budget Office, March 2012. <www.cbo.gov>

Page 1:

According to the rules for budgetary accounting prescribed in the Federal Credit Reform Act of 1990 (FCRA, incorporated as title V of the Congressional Budget Act of 1974), the estimated lifetime cost of a new loan or loan guarantee is recorded in the budget in the year in which the loan is disbursed.2 That lifetime cost is generally described as the subsidy provided by the loan or loan guarantee. It is measured by discounting all of the expected future cash flows associated with the loan or loan guarantee—including the amounts disbursed, principal repaid, interest received, fees charged, and net losses that accrue from defaults—to a present value at the date the loan is disbursed. A present value is a single number that expresses a flow of current and future income, or payments, in terms of a lump sum received, or paid, today; the present value depends on the rate of interest, known as the discount rate, that is used to translate future cash flows into current dollars.3

Page 3: “CBO [U.S. Congressional Budget Office] has estimated that the average subsidy for direct student loans made between 2010 and 2020 would be a negative 9 percent under FCRA accounting…. (A negative subsidy indicates that, for budgetary purposes, the transactions are recorded as generating net income for the government.)”

[375] Report: “Fair-Value Accounting for Federal Credit Programs.” U.S. Congressional Budget Office, March 2012. <www.cbo.gov>

Pages 1–2:

FCRA [Federal Credit Reform Act]-based cost estimates, however, do not provide a comprehensive measure of what federal credit programs actually cost the government and, by extension, taxpayers. Under FCRA’s rules, the present value of expected future cash flows is calculated by discounting them using the rates on U.S. Treasury securities with similar terms to maturity. Because that procedure does not fully account for the cost of the risk the government takes on when issuing loans or loan guarantees, it makes the reported cost of federal direct loans and loan guarantees in the federal budget lower than the cost that private institutions would assign to similar credit assistance based on market prices. Specifically, private institutions would generally calculate the present value of expected future cash flows by discounting those flows using the rates of return on private loans (or securities) with similar risks and maturities. Because the rates of return on private loans exceed Treasury rates, the discounted value of expected loan repayments is smaller under this alternative approach, which implies a larger cost of issuing a loan. (Similar reasoning implies that the private cost of a loan guarantee would be higher than its cost as estimated under FCRA.)4

FCRA and market-based cost estimates alike take into account expected losses from defaults by borrowers. However, because FCRA estimates use Treasury interest rates instead of market-based rates for discounting, FCRA estimates do not incorporate the cost of the market risk associated with the loans. Market risk is the component of financial risk that remains even after investors have diversified their portfolios as much as possible; it arises from shifts in macroeconomic conditions, such as productivity and employment, and from changes in expectations about future macroeconomic conditions. Loans and loan guarantees expose the government to market risk because future repayments of loans tend to be lower when the economy as a whole is performing poorly and resources are more highly valued.

Some observers argue that using market-based rates for discounting loan repayments to the federal government would be inappropriate because the government can fund its loans by issuing Treasury debt and thus does not seem to pay a price for market risk. However, Treasury rates are lower than those market-based rates primarily because Treasury debt holders are protected against default risk. If payments from borrowers fall short of what is owed to the federal government, the shortfall must be made up eventually either by raising taxes or by cutting other spending. (Issuing additional Treasury debt can postpone but not avert the need to raise taxes or cut spending.) Therefore, a more comprehensive approach to measuring the cost of federal credit programs would recognize market risk as a cost to the government and would calculate present values using market-based discount rates. Under such an approach, the federal budget would reflect the market values of loans and loan guarantees.

Page 11:

Federal student loans expose the government to losses from defaults, and they involve significant administrative expenses for origination, servicing, and collection on defaults; at the same time, the government collects fees and interest from borrowers. As with other types of credit, student loans are exposed to market risk, meaning that default rates tend to be higher, and recoveries smaller, when the economy is weak and the losses are most costly.
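CBO's core point in this footnote — that discounting the same expected cash flows at Treasury rates rather than higher market-based rates makes a loan look cheaper, and can even flip a cost into apparent savings — can be sketched numerically. The rates and cash flows below are illustrative assumptions, not CBO estimates:

```python
# Hedged sketch of the FCRA vs. fair-value difference described above: the
# same expected cash flows show budgetary savings when discounted at a low
# Treasury rate, but a cost when discounted at a higher market-based rate.

def subsidy(disbursed, inflows, rate):
    """Subsidy cost: disbursement minus discounted expected repayments.
    Positive = cost to the government; negative = recorded savings."""
    pv = sum(cf / (1 + rate) ** t for t, cf in enumerate(inflows, start=1))
    return disbursed - pv

disbursed = 10_000.0
expected_inflows = [1_200.0] * 10   # expected repayments, net of defaults

fcra_cost = subsidy(disbursed, expected_inflows, 0.03)        # Treasury rate
fair_value_cost = subsidy(disbursed, expected_inflows, 0.06)  # market rate
# fcra_cost is negative (savings); fair_value_cost is positive (a cost).
```

Nothing about the loan changes between the two lines; only the discount rate does. The gap between the two results is the cost of market risk that FCRA accounting leaves out.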

[376] Report: “Measuring the Cost of Government Activities That Involve Financial Risk.” U.S. Congressional Budget Office, March 2021. <www.cbo.gov>

Page 3:

FCRA [Federal Credit Reform Act] estimates are more likely to produce the appearance of budgetary savings (in other words, show a negative cost) for activities that could be costly to government stakeholders. Fair-value estimates, by contrast, avoid the implication that the government can reduce the deficit just by making loans and guarantees at market rates (an implication that is often called a free lunch). In a competitive market, private investors charge interest and fees that fully offset the average cost of defaults and market risk. A FCRA estimate for a loan made at market prices would incorporate the interest and fees that a private investor would charge for market risk but not the cost of the market risk itself. As a result, under FCRA, a given loan or loan guarantee would appear less expensive when made by the federal government than if it was made by a private lender. The cost of market risk would be included in fair-value estimates, making estimates of the costs of loans and loan guarantees more comprehensive (and higher) using that measure.

Page 4:

Student Loans

Higher limits on the amount of money that students could borrow for postsecondary education would result in more loans to students. That larger volume of loans would have a greater cost under the fair-value approach than under the FCRA approach because the cash flows of the student loan programs are subject to market risk: Former students tend to have lower income when the economy is performing poorly (because well-paying jobs are scarce), and the rate of defaults on student loans tends to be higher as well. A fair-value measure incorporating that market risk represents what the government would need to pay private entities to take on the cash flows from the loans. By contrast, a FCRA measure represents an amount of current cash spending that would have the same long-run effect on the debt as the student loans if defaults occurred at average rates.

[377] Report: “Fair-Value Accounting for Federal Credit Programs.” U.S. Congressional Budget Office, March 2012. <www.cbo.gov>

Page 2:

What is termed the fair-value approach to budgeting for federal credit programs would measure those programs’ costs at market prices or at some approximation of market prices when directly comparable market prices are unavailable. A fair-value approach generally entails applying the discount rates on expected future cash flows that private financial institutions would apply.5 In the view of the Congressional Budget Office (CBO), adopting a fair-value approach would provide a more comprehensive way to measure the costs of federal credit programs and would permit more level comparisons between those costs and the costs of other forms of federal assistance.

Page 3:

In some cases, fair-value estimates of budgetary costs as a percentage of loan amounts are considerably higher than FCRA [Federal Credit Reform Act] estimates: CBO has estimated that the average subsidy for direct student loans made between 2010 and 2020 would be a negative 9 percent under FCRA accounting but a positive 12 percent on a fair-value basis. (A negative subsidy indicates that, for budgetary purposes, the transactions are recorded as generating net income for the government.) Subsequent changes in CBO’s interest rate projections would affect both estimates of the amounts of those subsidies, but the large gap between them would remain.

Page 6:

Because FCRA accounting requires the use of Treasury rates for discounting, it implicitly treats the market risk associated with federal credit programs as having no cost to the government. As a result, the subsidy provided by the government is understated under FCRA accounting. Moreover, the higher the market risk that is associated with a credit obligation, the greater is that understatement.

Page 7:

When the government extends credit, the associated market risk of those obligations is effectively passed along to citizens who, as investors, would view that risk as costly.

If the federal government is able to spread certain risks more widely than the private sector can, the government may be a relatively efficient provider of certain types of insurance. That is, a private provider of such insurance might charge higher fees if it is unable to transfer the risk to a wide group of investors. However, even if the federal government can spread risks widely, it cannot eliminate the component of risk that is associated with fluctuations in the aggregate economy—market risk—and which investors require compensation to bear.

The federal government’s ability to borrow at Treasury rates also does not reduce the cost to taxpayers of the market risk associated with federal credit programs. Treasury rates are relatively low because the securities are backed by the government’s ability to raise taxes. When the government finances a risky loan or loan guarantee by selling a Treasury security, it is effectively shifting risk to members of the public. If such a loan is repaid as expected, the interest and principal payments cover the government’s obligation to the holder of the Treasury security, but if the borrower defaults, the obligation to the security holder must be paid for either by raising taxes or by cutting other spending to be able to repay the Treasury debt. (Issuing additional Treasury debt can postpone but not avert the need to raise taxes or cut spending.) Thus, the risk is effectively borne by taxpayers (or by beneficiaries of government programs); like investors, taxpayers and government beneficiaries generally value resources more highly when the economy is performing poorly.

[378] Report: “Estimates of the Cost of Federal Credit Programs in 2023.” U.S. Congressional Budget Office, June 2022. <www.cbo.gov>

Page 1 (of PDF):

The federal government supports some private activities by offering credit assistance to individuals and businesses. That assistance is provided through direct loans and guarantees of loans made by private financial institutions. In this report, the Congressional Budget Office estimates the lifetime costs of new loans and loan guarantees that are projected to be issued in 2023. The report shows two kinds of estimates: those currently used in the federal budget, which are made by following the procedures prescribed by the Federal Credit Reform Act of 1990 (FCRA), and those referred to as fair-value estimates, which measure the market value of the government’s obligations. Most of the FCRA estimates were produced by other federal agencies; the FCRA estimates for the largest federal credit programs and all of the fair-value estimates were produced by CBO.

Using FCRA procedures, CBO estimates that new loans and loan guarantees issued in 2023 would result in savings of $41.1 billion. But using the fair-value approach, CBO estimates that those loans and guarantees would have a lifetime cost of $51.1 billion. About three-quarters of the difference between those amounts is attributable to three sources: …

• The Department of Education’s student loan programs, which are projected to save $1.4 billion on a FCRA basis but to cost $7.7 billion on a fair-value basis.

Page 4:

Table 1. Projected Costs of Federal Credit Programs in 2023 …

                                 FCRA Estimate   Fair-Value Estimate
Student Loans
  Subsidy Rate (Percent)a        –1.7            9.1
  Subsidy (Billions of dollars)  –$1.4           $7.7

…

Fair-value estimates differ from FCRA estimates in that they account for market risk—the component of financial risk that remains even with a well-diversified portfolio. Market risk arises from shifts in macroeconomic conditions, such as productivity and employment, and from changes in expectations about future macroeconomic conditions. …

The table excludes consolidation loans administered by the Department of Education. …

a. The subsidy rate is the cost of a program, calculated on either a FCRA or fair-value basis, divided by the amount disbursed. A positive subsidy rate indicates a cost to the government, and a negative rate indicates budgetary savings. …

The broad category of lending with the largest difference between the FCRA subsidy rate and the fair-value subsidy rate is student loans. Under FCRA procedures, those loans generate greater budgetary savings per dollar lent than most other federal credit assistance does; under the fair-value approach, most of those savings become costs.
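The subsidy-rate definition in note (a) above can be checked against the student-loan figures in Table 1 with a short sketch. This is a back-of-the-envelope check, not CBO's model; the roughly $85 billion disbursement figure is taken from the report's pages 8–9, and the small residuals reflect rounding in the published rates:

```python
# Check: subsidy (billions of dollars) = subsidy rate x amount disbursed.
# A positive result is a cost to the government; a negative result is savings.
disbursed = 85.0  # projected 2023 student-loan credit, billions (CBO, pp. 8-9)

def subsidy(rate_percent, disbursed_billions):
    """Subsidy in billions implied by a subsidy rate, per Table 1's note (a)."""
    return rate_percent / 100.0 * disbursed_billions

fcra = subsidy(-1.7, disbursed)       # -1.445, i.e. roughly -$1.4 billion (savings)
fair_value = subsidy(9.1, disbursed)  # 7.735, i.e. roughly $7.7 billion (cost)
print(f"FCRA: {fcra:.1f}  Fair value: {fair_value:.1f}")
```

Both implied subsidies land within rounding distance of the table's −$1.4 billion and $7.7 billion figures.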

Pages 8–9:

The Department of Education’s student loan programs provide several types of loans—subsidized Stafford loans (which are available to undergraduate students), unsubsidized Stafford loans (which are available to undergraduate and graduate students), and PLUS loans (which are available to parents and to graduate students). Those programs are projected to account for $85 billion of federal credit in 2023.

CBO uses a hybrid approach to separately estimate the fair-value subsidies for the portion of each student loan program with borrowers in fixed-payment repayment plans and income-driven repayment (IDR) plans. For borrowers in fixed-payment repayment plans, CBO uses the loss-multiple approach to estimate the subsidy rate on a fair-value basis. For borrowers in IDR plans, CBO’s fair-value estimates incorporate an adjustment to the projection of wages.

IDR plans tie required payments to borrowers’ incomes and provide loan forgiveness after a certain period. Those plans involve more market risk than fixed-payment repayment plans because the required payments depend on borrowers’ income and because borrowers may be eligible to have their unpaid balances forgiven. When the economy performs poorly, borrowers’ earnings are more likely to decrease, lowering the required payments. Those reduced payments will eventually lead to more loan forgiveness. (That additional risk is partly offset because borrowers in IDR plans are less likely than borrowers in fixed-payment repayment plans to default on their loans.) To develop an adjustment for IDR plans, CBO applied methods from academic studies that estimate the financial value of required payments that are a function of future wages.21 Those studies developed methods to adjust projections of future wages on the basis of their relationship with stock prices.

Projected Subsidies. Calculated on a FCRA basis, the average subsidy rate for the Department of Education’s student loan programs in 2023 is estimated to be −1.7 percent, and the lifetime budgetary savings are projected to be $1.4 billion. FCRA subsidy rates vary considerably among the individual programs, from −29.3 percent for the PLUS [Parent Loans for Undergraduate Students] loan program for parents to 13.7 percent for the subsidized Stafford loan program. …

Calculated on a fair-value basis, the average subsidy rate for the student loan programs in 2023 is estimated to be 9.1 percent, and the lifetime cost is projected to be $7.7 billion. The difference in budgetary impact between the FCRA and fair-value estimates is thus $9.2 billion. The fair-value subsidy rates differ substantially among the individual programs, from −15.0 percent for the PLUS loan program for parents to 24.5 percent for the subsidized Stafford loan program.

[379] Report: “Student Loans: Education Has Increased Federal Cost Estimates of Direct Loans by Billions Due to Programmatic and Other Changes.” U.S. Government Accountability Office, July 28, 2022. <www.gao.gov>

Overview:

Although the Department of Education originally estimated federal Direct Loans made in the last 25 years would generate billions in income for the federal government, its current estimates show these loans will cost the government billions. Education originally estimated these loans to generate $114 billion in income for the government. Although actual costs cannot be known until the end of the loan terms, as of fiscal year 2021 these loans are estimated to cost the federal government $197 billion. This swing of $311 billion was driven both by programmatic changes and by reestimates using revised assumptions (e.g., economic factors and loan performance) as additional data became available (see figure).

The largest estimated cost increases—$102 billion in total—stemmed from emergency relief provided to most federal student loan borrowers under the CARES Act and related administrative actions in response to the COVID-19 pandemic. This relief included suspending (1) all payments due, (2) interest accrual, and (3) involuntary collections for loans in default. The suspensions, which are programmatic changes dating back to March 13, 2020, are currently set to expire on August 31, 2022. Reestimates based on updated data and assumptions about borrowers in Income-Driven Repayment plans also substantially increased estimated costs.

[380] U.S. Code Title 38, Part III, Chapter 34, Subchapter I, Section 3452: “Veterans’ Benefits, Definitions.” Accessed August 10, 2015 at <www.law.cornell.edu>

(f) The term “institution of higher learning” means a college, university, or similar institution, including a technical or business school, offering postsecondary level academic instruction that leads to an associate or higher degree if the school is empowered by the appropriate State education authority under State law to grant an associate or higher degree. When there is no State law to authorize the granting of a degree, the school may be recognized as an institution of higher learning if it is accredited for degree programs by a recognized accrediting agency. Such term shall also include a hospital offering educational programs at the postsecondary level without regard to whether the hospital grants a postsecondary degree. Such term shall also include an educational institution which is not located in a State, which offers a course leading to a standard college degree, or the equivalent, and which is recognized as such by the secretary of education (or comparable official) of the country or other jurisdiction in which the institution is located.

[381] Webpage: “Academic Degree and Certificate Definitions.” Arkansas Department of Higher Education, Research and Planning Division. Accessed July 17, 2015 at <bit.ly>

Associate degree (two years or more): a degree granted upon completion of a program that requires at least two, but fewer than four, academic years of postsecondary education. It includes a level of general education necessary for growth as a lifelong learner and is comprised of 60–72 semester credit hours. There are four types of associate degrees …

Baccalaureate (bachelor’s) degree: a degree granted upon completion of a program that requires four to five years of full-time college work and carries the title of bachelor. …

Master’s degree: a degree which requires at least one, but no more than two, full-time equivalent years of study beyond the bachelor’s degree.

Doctoral degree: a degree awarded upon completion of an educational program at the graduate level which terminates in a doctor’s degree. …

First professional degree: a degree awarded upon completion of a program which meets all of these criteria: a) completion of academic requirements to begin practice in the profession; b) at least two years of college work before entering the program; and c) at least six academic years of college work to complete the degree program, including the prior required college work. First professional degrees are awarded in these fields:

• Chiropractic (DC)

• Dentistry (DDS or DMD)

• Law (LLB or JD)

• Medicine (MD)

• Optometry (OD)

• Osteopathic Medicine (DO)

• Pharmacy (Pharm.D.)

• Podiatry (Pod D or DP)

• Theology (M Div or MHL)

• Veterinary Medicine (DVM)

[382] Report: “Time to Degree of U.S. Research Doctorate Recipients.” By Thomas B. Hoffer and Vincent Welch. National Science Foundation, March 2006. <www.umces.edu>

Page 1:

This InfoBrief draws on data from the Survey of Earned Doctorates (SED) to document average time-to-degree differences among research doctorate recipients from U.S. universities. … [T]hree measures of time to degree are examined here:

• total elapsed time from completion of the baccalaureate to the doctorate (total time to degree)

• time in graduate school less reported periods of nonenrollment (registered time to degree)

• age at doctorate …

For the 2003 doctorate recipients, the median total time from baccalaureate to doctorate was 10.1 years, while the median registered time was 7.5 years and the median age at doctorate was 33.3 years.

Pages 2–3:

Table 3 shows time-to-degree differences for 2003 by more detailed science fields of study. Chemistry has the lowest [median] times to degree on all three measures. For the registered time-to-degree variable, mathematics (6.8 years), engineering (6.9 years), biological sciences (6.9 years), and physics and astronomy (7.0 years) were the next closest fields to chemistry (6.0 years). The longest registered time-to-degree total was found for anthropology (9.6 years).

[383] Webpage: “Path to Graduate and Professional Education.” Grand Valley State University. Accessed August 10, 2015 at <bit.ly>

A doctoral degree typically involves both coursework and a major research project. Usually 5 to 7 years of full-time study is needed to complete a Ph.D. or other research doctorate. The first 2 to 3 years usually involve classes, seminars, and directed reading to give you comprehensive knowledge of an academic field. This period of study is followed by a written or oral examination that tests your knowledge.

[384] Webpage: “The Difference Between a PhD and Professional Doctorate.” Capella University, March 27, 2018. <www.capella.edu>

Some people say that a PhD prepares you to teach, while a professional doctorate is more geared toward a professional career. But the answer to the question is more complex. …

The primary difference between PhD and professional doctorate programs is the type of research conducted in the independent research phase.

PhD students are expected to create, expand, and contribute to knowledge, research, and theory in their field of study. This kind of discovery is often called original research.

Professional doctorate students are expected to expand and apply existing knowledge and research to existing problems in their professional field. This is often referred to as applied research.

[385] Book: Academically Adrift: Limited Learning on College Campuses. By Richard Arum and Josipa Roksa. University of Chicago Press, 2011.

Pages 146–147:

During graduate training, future faculty members receive little if any formal instruction on teaching. Doctoral training focuses primarily, and at times exclusively, on research. Although recent decades have seen a proliferation of interest in improving the preparation of graduate students, a recent survey of doctoral students indicated that only 50 percent either had an opportunity to take a teaching assistant’s training course lasting at least one term, or reported that they had an opportunity to learn about teaching in their respective disciplines through workshops and seminars.38 Not surprisingly, one of the main concerns of students in doctoral programs is a lack of systematic opportunities to help them learn how to teach.39

Graduate students are not only entering classrooms without much preparation, but more problematically, they are learning in their graduate programs to deprioritize and perhaps even devalue teaching. This aspect of graduate training, which neither prepares students to teach nor always instills in them a respect for the importance of teaching, is problematic not only on principled grounds but also from a functional standpoint: “Many, if not most [PhDs], will not be tenure-track faculty members,” and only a few will have jobs at research universities.41

[386] Webpage: “Difference Between Academic and Professional Doctorate Degrees.” University of California Berkeley, Office of Planning and Analysis. Accessed December 16, 2016 at <bit.ly>

“Although the work for the professional doctor’s degree may extend the boundaries of knowledge in the field, it is directed primarily towards distinguished practical performance.”

[387] Webpage: “The Difference Between a PhD and Professional Doctorate.” Capella University, March 27, 2018. <www.capella.edu>

Some people say that a PhD prepares you to teach, while a professional doctorate is more geared toward a professional career. But the answer to the question is more complex. …

The primary difference between PhD and professional doctorate programs is the type of research conducted in the independent research phase.

PhD students are expected to create, expand, and contribute to knowledge, research, and theory in their field of study. This kind of discovery is often called original research.

Professional doctorate students are expected to expand and apply existing knowledge and research to existing problems in their professional field. This is often referred to as applied research.

[388] Webpage: “Academic Degree and Certificate Definitions.” Arkansas Department of Higher Education, Research and Planning Division. Accessed July 17, 2015 at <bit.ly>

Associate degree (two years or more): a degree granted upon completion of a program that requires at least two, but fewer than four, academic years of postsecondary education. It includes a level of general education necessary for growth as a lifelong learner and is comprised of 60–72 semester credit hours. There are four types of associate degrees: …

Baccalaureate (bachelor’s) degree: a degree granted upon completion of a program that requires four to five years of full-time college work and carries the title of bachelor. …

Master’s degree: a degree which requires at least one, but no more than two, full-time equivalent years of study beyond the bachelor’s degree.

Doctoral degree: a degree awarded upon completion of an educational program at the graduate level which terminates in a doctor’s degree. …

First professional degree: a degree awarded upon completion of a program which meets all of these criteria: a) completion of academic requirements to begin practice in the profession; b) at least two years of college work before entering the program; and c) at least six academic years of college work to complete the degree program, including the prior required college work. First professional degrees are awarded in these fields:

• Chiropractic (DC)

• Dentistry (DDS or DMD)

• Law (LLB or JD)

• Medicine (MD)

• Optometry (OD)

• Osteopathic Medicine (DO)

• Pharmacy (Pharm.D.)

• Podiatry (Pod D or DP)

• Theology (M Div or MHL)

• Veterinary Medicine (DVM)

[389] Dataset: “Table 1. Number and Percentage Distribution of Students Enrolled at Title IV Institutions, by Control of Institution, Student Level, Level of Institution, Enrollment Status, and Other Selected Characteristics: United States, Fall 2021.” U.S. Department of Education, National Center for Education Statistics, Spring 2022. <nces.ed.gov>

“All institutions … Number … All Students [=] 19,036,612 … Percent … 4-year [=] 73.4 … 2-year [=] 25.1 … Full-time [=] 60.9 … Part-time [=] 39.1 … Men [=] 41.5 … Women [=] 58.5”

[390] Dataset: “Table 302.10. Recent High School Completers and Their Enrollment in College, by Sex and Level of Institution: 1960 Through 2021.” U.S. Department Of Education, National Center for Education Statistics, August 2022. <nces.ed.gov>

[391] Dataset: “Table 302.20. Percentage of Recent High School Completers Enrolled in College, by Race/Ethnicity: 1960 Through 2021.” U.S. Department of Education, National Center for Education Statistics, August 2022. <nces.ed.gov>

[392] Dataset: “Table 326.20. Graduation Rate From First Institution Attended Within 150 Percent of Normal Time for First-Time, Full-Time Degree/Certificate-Seeking Students at 2-Year Postsecondary Institutions, by Race/Ethnicity, Sex, and Control of Institution: Selected Cohort Entry Years, 2000 Through 2017.” U.S. Department of Education, National Center for Education Statistics, October 2021. <nces.ed.gov>

[393] Dataset: “Table 326.10. Graduation Rate From First Institution Attended for First-Time, Full-Time Bachelor’s Degree- Seeking Students at 4-Year Postsecondary Institutions, by Race/Ethnicity, Time to Completion, Sex, Control of Institution, and Acceptance Rate: Selected Cohort Entry Years, 1996 Through 2014.” U.S. Department of Education, National Center for Education Statistics, October 2021. <nces.ed.gov>

[394] Dataset: “Table 326.10. Graduation Rate From First Institution Attended for First-Time, Full-Time Bachelor’s Degree- Seeking Students at 4-Year Postsecondary Institutions, by Race/Ethnicity, Time to Completion, Sex, Control of Institution, and Acceptance Rate: Selected Cohort Entry Years, 1996 Through 2014.” U.S. Department of Education, National Center for Education Statistics, October 2021. <nces.ed.gov>

[395] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[396] Report: “Income in the United States: 2021.” By Jessica Semega and Melissa Kollar. U.S. Census Bureau, September 2022. <www.census.gov>

Page 13:

Data on income collected in the CPS ASEC [Current Population Survey Annual Social and Economic Supplements] by the U.S. Census Bureau cover money income received (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc. Money income also excludes tax credits such as the Earned Income Tax Credit, the Child Tax Credit, and special COVID-19- related stimulus payments. Money income does not reflect that some families receive noncash benefits such as Supplemental Nutrition Assistance/food stamps, health benefits, and subsidized housing. In addition, money income does not reflect the fact that noncash benefits often take the form of the use of business transportation and facilities, full or partial payments by business for retirement programs, or medical and educational expenses. …

Data users should consider these elements when comparing income levels. Moreover, readers should be aware that for many different reasons there is a tendency in household surveys for respondents to underreport their income. Based on an analysis of independently derived income estimates, the Census Bureau determined that respondents report income earned from wages or salaries more accurately than other sources of income, and that the reported wage and salary income is nearly equal to independent estimates of aggregate income.

[397] Paper: “The Falling Time Cost of College: Evidence from Half a Century of Time Use Data.” By Philip Babcock and Mindy Marks. The Review of Economics and Statistics, May 2011. Pages 468–478. <www.mitpressjournals.org>

Page 468:

Using multiple data sets from different time periods, we document declines in academic time investment by full-time college students in the United States between 1961 and 2003. Full-time students allocated 40 hours per week toward class and studying in 1961, whereas by 2003, they were investing about 27 hours per week. Declines were extremely broad based and are not easily accounted for by framing effects, work or major choices, or compositional changes in students or schools. We conclude that there have been substantial changes over time in the quantity or manner of human capital production on college campuses.

[398] Calculated with data from the book: Academically Adrift: Limited Learning on College Campuses. By Richard Arum and Josipa Roksa. University of Chicago Press, 2011.

Pages 32–33:

Our research was made possible by a collaborative partnership with the Council for Aid to Education … and twenty-four four-year colleges and universities that granted us access to students who were scheduled to take the Collegiate Learning Assessment (CLA) in their first semester (Fall 2005) and at the end of their sophomore year (Spring 2007). … The research in this book is based on longitudinal data of 2,322 students enrolled across a diverse range of campuses. … The schools are dispersed nationally across all four regions of the country. We refer to this multifaceted data as the Determinants of College Learning (DCL) dataset. … On most measures, students in the DCL dataset appeared reasonably representative of traditional-age undergraduates in four-year institutions, and the colleges and universities they attended resembled four-year institutions nationwide. The DCL students’ racial, ethnic, and family backgrounds as well as their English-language backgrounds and high school grades also tracked well with national statistics.

Pages 110–111:

Students in our sample reported spending on average fifteen hours per week attending classes and labs. The rest of the time was divided between studying and myriad other activities. Studying is far from the focus of students’ “free time” (i.e., time outside of class): only twelve hours a week are spent studying. Combining the hours spent studying with the hours spent in classes and labs, students in our sample spent less than one-fifth (16 percent) of their reported time each week on academic pursuits. …

In addition to attending classes and studying, students are spending time working, volunteering, and participating in college clubs, fraternities, and sororities. If we presume that students are sleeping eight hours a night … that leaves 85 hours a week for other activities…. What is this additional time spent on? It seems to be spent mostly on socializing and recreation. A recent study of University of California undergraduates reported that while students spent thirteen hours a week studying, they also spent twelve hours socializing with friends, eleven hours using computers for fun, six hours watching television, six hours exercising, five hours on hobbies, and three hours on other forms of entertainment. Students were thus spending on average 43 hours per week outside the classroom on these activities—that is, over three times more hours than the time they spent studying.

CALCULATIONS:

  • 15 hours attending classes and labs + 12 to 13 hours studying = 27–28 hours
  • 27–28 hours on educational activities / 168 hours per week = 16–17%
  • 43 hours on leisure activities and sports / 168 hours per week = 26%
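The arithmetic in the calculations above can be replicated directly. This is an illustrative check of the percentages, using a 168-hour week and the two studying figures cited (12 hours in the DCL sample, 13 in the University of California study):

```python
# Replicating the time-use arithmetic from Academically Adrift.
HOURS_PER_WEEK = 24 * 7  # 168 hours in a week

class_hours = 15          # attending classes and labs
study_hours = (12, 13)    # studying: DCL sample vs. UC undergraduate study
leisure_hours = 43        # socializing, recreation, etc. (UC study)

academic = [class_hours + s for s in study_hours]                     # 27 and 28 hours
academic_share = [round(100 * a / HOURS_PER_WEEK) for a in academic]  # 16 and 17 percent
leisure_share = round(100 * leisure_hours / HOURS_PER_WEEK)           # 26 percent
print(academic, academic_share, leisure_share)
```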

[399] Paper: “Where A Is Ordinary: The Evolution of American College and University Grading, 1940–2009.” By Stuart Rojstaczer and Christopher Healy. Teachers College Record, July 2012. <www.tcrecord.org>

Page 1: “A’s represent 43% of all letter grades, an increase of 28 percentage points since 1960 and 12 percentage points since 1988.”

Page 3:

The characteristics of the 135 institutions for which we have contemporary data are summarized in Table 1. In addition, we have historical data on grading practices from the 1930s onward for 173 institutions (93 of which also have contemporary data). Time series were constructed beginning in 1960 by averaging data from all institutions on an annual basis. For the 1930s, 1940s, and 1950s, data are sparse, so we averaged over 1936 to 1945 (data from 37 schools) and 1946 to 1955 (data from 13 schools) to estimate average grades in 1940 and 1950, respectively. For the early part of the 1960s, there are 11–13 schools represented by our annual averages. By the early part of the 1970s, the data become more plentiful, and 29–30 schools are averaged. Data quantity increases dramatically by the early 2000s with 82–83 schools included in our data set. Because our time series do not include the same schools every year, we smooth our annual estimates with a moving centered three-year average.

Page 4: “Table 1. Characteristics of Schools With Contemporary Data Including Grading Averages … Totals … %A [=] 43.0 … %B [=] 33.8 … %C [=] 14.9 … %D [=] 4.1 … %F [=] 4.2”

Page 10:

Our sample has a student population of 1.5 million, far greater than any other previous detailed study on national grading patterns for four-year colleges and universities. It should be noted, however, that although we randomly found and sought data, in comparison with national student populations, our sample underrepresents private schools (which grade higher than national averages) and overrepresents Southern schools (which grade lower than national averages) … The average SAT score of our sampled student body weighted by student population (math plus verbal) is about 40 points higher than that seen nationally for 2008 in a survey of 2,343 four-year institutions….

The combined effect of undersampling private schools, oversampling Southern schools, and (probably) the slightly higher average SAT scores of our sampled students relative to national averages suggests that our weighted average of 42% A’s is a slightly conservative one.

[400] Report: “Grade Inflation at American Colleges and Universities.” By Stuart Rojstaczer. Last updated March 29, 2016. <www.gradeinflation.com>

We now have data on average grades from over 400 schools (with a combined enrollment of over four million undergraduates). …

By the mid-to-late 1990s, A was the most common grade at an average four-year college campus (and at a typical community college as well). By 2013, the average college student had about a 3.15 GPA … and forty-five percent of all A–F letter grades were A’s.…

The observed grade change nationwide in the consumer era is the equivalent of every class of 100 making two B students into B+ students every year and alternating between making one A– student into an A student and one B+ student into an A– student every year. It’s so incrementally slow a process that it’s easy to see why an individual instructor (or university administrator or leader) can delude himself into believing that it’s all due to better teaching or better students. But after 30 years of professors making these kinds of incremental changes, the amount of rise becomes so large that what’s happening becomes clear: mediocre students are getting higher and higher grades. It’s perhaps worth noting that if you strictly applied the above grading changes in a typical class of 100 at a four-year college today, you’d run out of B students to elevate to B+ students in about seven years.

Statements have been made by some that grade inflation is confined largely to selective and highly selective colleges and universities. The three charts above indicate that these statements are not correct. Significant grade inflation is present everywhere and contemporary rates of change in GPA are on average the same for public and private schools.

[401] Article: “Going Naked.” By Richard H. Hersh. American Association of Colleges and Universities, Peer Review, Spring 2007. <go.gale.com>

[T]he Collegiate Learning Assessment project (CLA) began as an approach to assessing core outcomes espoused by all of higher education—critical thinking, analytical reasoning, problem solving, and writing. (Fig. 1 provides a small sample of questions used in developing our scoring rubrics.) These outcomes cannot be taught sufficiently in any one course or major but rather are the collective and cumulative result of what takes place or does not take place over the four to six years of undergraduate education in and out of the classroom.

The CLA is an institutional measure of value-added rather than an assessment of an individual student or course. It has now been used by more than two-hundred institutions and over 80,000 students in cross-sectional and longitudinal studies to signal where an institution stands with regard to its own standards and to other similar institutions….

[402] Book: Beyond the Bubble Test: How Performance Assessments Support 21st Century Learning. Edited by Linda Darling-Hammond and Frank Adamson. Jossey-Bass (an imprint of John Wiley & Sons), 2014.

Pages 71–72:

The CLA [Collegiate Learning Assessment] was developed to measure undergraduates’ learning—in particular, their ability to think critically, reason analytically, solve problems, and communicate clearly. …

Both the CLA and its high school counterpart, the CWRA [College and Work Readiness Assessment], differ substantially from most other standardized tests, which are based on an empiricist philosophy and a psychometric/behavioral tradition. …

The conceptual underpinnings of the CLA and CWRA are embodied in what has been called a criterion sampling approach to measurement. This approach assumes that the whole is greater than the sum of its parts and that complex tasks require an integration of abilities that cannot be captured if divided into and measured as individual components. The criterion sampling notion is straightforward: if you want to know what a person knows and can do, sample tasks from the domain in which she is to act, observe performance, and infer competence and learning. For example, if you want to know whether a person not only knows the laws that govern driving a car but can also actually drive a car, do not just give her a multiple-choice test. Also administer a driving test with a sample of tasks from the general driving domain (starting the car, pulling into traffic, turning right and left in traffic, backing up, parking). On the basis of this sample of performance, it is possible to draw more general, valid inferences about driving performance.

The CLA/CWRA follows the criterion-sampling approach by defining a domain of real-world tasks that are holistic and drawn from life situations.

[403] Book: Academically Adrift: Limited Learning on College Campuses. By Richard Arum and Josipa Roksa. University of Chicago Press, 2011.

Pages 32–33: “[T]he Council for Aid to Education … brought together leading national psychometricians at the end of the twentieth century to develop a state-of-the-art assessment instrument to measure undergraduate learning … the Collegiate Learning Assessment….”

Pages 35–36:

The Council for Aid to Education has also published a detailed scoring rubric on the criteria that it defines as critical thinking, analytical reasoning, and problem solving—including how well the student assesses the quality and relevance of evidence, analyzes and synthesizes data and information, draws conclusions from his or her analysis, and considers alternative perspectives. In addition, the scoring rubric with respect to written communication requires that the presentation is clear and concise, the structure of the argument is well-developed and effective, the work is persuasive, the written mechanics are proper and correct, and reader interest is maintained.71

71 Council for Aid to Education, Collegiate Learning Assessment Common Scoring Rubric (New York: Council for Aid to Education, 2008).

[404] Book: Academically Adrift: Limited Learning on College Campuses. By Richard Arum and Josipa Roksa. University of Chicago Press, 2011.

Pages 32–33:

Our research was made possible by a collaborative partnership with the Council for Aid to Education … and twenty-four four-year colleges and universities that granted us access to students who were scheduled to take the Collegiate Learning Assessment (CLA) in their first semester (Fall 2005) and at the end of their sophomore year (Spring 2007). … The schools are dispersed nationally across all four regions of the country. We refer to this multifaceted data as the Determinants of College Learning (DCL) dataset. … On most measures, students in the DCL dataset appeared reasonably representative of traditional-age undergraduates in four-year institutions, and the colleges and universities they attended resembled four-year institutions nationwide. The DCL students’ racial, ethnic, and family backgrounds as well as their English-language backgrounds and high school grades also tracked well with national statistics.

Page 159:

The overall retention rate from freshman to sophomore year across the twenty-four institutions included in the DCL dataset was slightly under 50 percent, although this varied notably across institutions and groups of students. If bias is introduced into our analyses by processes of selective attrition, however, it is likely in a direction that leads us to overestimate the overall rate of academic growth that is occurring in these institutions—that is, the dearth of learning we have identified would likely be even more pronounced if we had been able to track down and continue assessing the students who dropped out of the study and/or the institutions they originally attended.

[405] Book: Aspiring Adults Adrift: Tentative Transitions of College Graduates. By Richard Arum and Josipa Roksa. University of Chicago Press, 2014.

Page 37:

Over the full four years of college, students gained an average of 0.47 standard deviations on the CLA [Collegiate Learning Assessment].41 Thus, after four years of college, an average-scoring student in the fall of his or her freshman year would score at a level only eighteen percentile points higher in the spring of his or her senior year. Stated differently, freshmen who entered higher education at the 50th percentile would reach a level equivalent to the 68th percentile of the incoming freshman class by the end of their senior year. Since standard deviations are not the most intuitive way of understanding learning gains, it is useful to consider that if the CLA were rescaled to a one-hundred-point scale, approximately one-third of students would not improve more than one point over four years of college. …

41 A recent replication using data from the Wabash National Study of Liberal Arts Education, relying on a different sample and a multiple-choice measure of critical thinking (the Collegiate Assessment of Academic Proficiency, or CAAP), produced a virtually identical estimate; students in the Wabash Study gained 0.44 standard deviations on the CAAP measure of critical thinking over four years of college. See Ernest T. Pascarella and others, “How Robust Are the Findings of Academically Adrift?” Change: The Magazine of Higher Learning, May/June 2011: 20–24.
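The percentile conversion in the passage above assumes normally distributed CLA scores: a 0.47 standard-deviation gain corresponds to the standard normal CDF evaluated at 0.47. A minimal check of that arithmetic (the distributional assumption is implied by the authors’ own conversion, not stated here independently):

```python
from statistics import NormalDist

gain_sd = 0.47  # average four-year CLA gain reported above
percentile = NormalDist().cdf(gain_sd) * 100
# A student starting at the 50th percentile ends near this percentile:
print(round(percentile))  # 68, i.e., eighteen points above the 50th
```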

Page 42:

The results indicate that students attending high-selectivity institutions improve on the CLA substantially more than those attending low-selectivity institutions, even when models are adjusted for students’ background and academic characteristics. This association between institutional selectivity and CLA performance is consistent with findings for persistence and graduation in other research. A range of factors, from greater expenditures to unique peer environments at high-selectivity schools, may help to account for these patterns.

Page 44:

These patterns indicate that the issues we have identified, namely weak academic engagement and limited learning, are widespread. They are not concentrated at a few institutions, or even at a specific type of institution. While students in more selective institutions gain more on the CLA, their gains are still modest, and while they spend more time studying alone, their average is still only slightly over ten hours per week.

Pages 137–139:

Analyses presented in this book build on the Determinants of College Learning (DCL) dataset…. The CAE initiated the Collegiate Learning Assessment (CLA) Longitudinal Project in the fall of 2005, administering a short survey and the CLA instrument to a sample of freshmen at four-year institutions. The same students were contacted for the sophomore-year follow-up in the spring of 2007 and the senior-year follow-up in the spring of 2009. …

The senior-year sample included 1,666 respondents with valid CLA scores. … Characteristics of the senior-year sample thus correspond reasonably well with the characteristics of students from a nationally representative sample. …

While the CLA as a whole is considered by some as state of the art, the performance task component of the test is the best developed and most sophisticated part of the assessment instrument; it is the component that the Organisation for Economic Cooperation and Development adopted for its cross-national assessment of higher education students’ generic skill strand in the Assessment of Higher Education Learning Outcomes (AHELO) project.

We use students’ scores on the performance task of the CLA as an indicator of their critical thinking, complex reasoning, and writing skills. In addition to being the most developed, this performance task was the most uniformly administered component across time and institutions.

[406] Report: “The Literacy of America’s College Students.” By Justin D. Baer, Andrea L. Cook, and Stéphane Baldi. American Institutes for Research, January 2006. <www.air.org>

Page 4:

The NSACS [National Survey of America’s College Students], sponsored by The Pew Charitable Trusts, collected data from a sample of 1,827 graduating students at 80 randomly selected 2-year and 4-year colleges and universities (68 public and 12 private) from across the United States. The NSACS specifically targeted college and university students nearing the end of their degree program, thus providing a broader and more comprehensive picture of students’ fundamental literacy abilities than ever before.

The NSACS used the same assessment instrument as the 2003 National Assessment of Adult Literacy (NAAL), a nationally representative survey of the English-language literacy abilities of U.S. adults 16 and older residing in households or prisons. The NAAL was developed and administered by the U.S. Department of Education’s National Center for Education Statistics (NCES). Literacy levels were categorized as Below Basic, Basic, Intermediate, or Proficient on the basis of the abilities of participants.

Because literacy is not a single skill used in the same manner for all types of printed and written information, the NSACS measured literacy along three dimensions: prose literacy, document literacy, and quantitative literacy. These three literacy domains were designed to capture an ordered set of information-processing skills and strategies that adults use to accomplish a wide range of literacy tasks and make it possible to profile the various types and levels of literacy among different subgroups in society.

[407] Report: “The Literacy of America’s College Students.” By Justin D. Baer, Andrea L. Cook, and Stéphane Baldi. American Institutes for Research, January 2006. <www.air.org>

Page 4: “Prose Literacy: The knowledge and skills needed to perform prose tasks, that is, to search, comprehend, and use information from continuous texts. Prose examples include editorials, news stories, brochures, and instructional materials.”

Page 21: “Table 2.2. Percentage of U.S. Adults in College and the Nation in Each Prose Literacy Level, by Selected Characteristics”

[408] Report: “The Literacy of America’s College Students.” By Justin D. Baer, Andrea L. Cook, and Stéphane Baldi. American Institutes for Research, January 2006. <www.air.org>

Page 4: “Document Literacy: The knowledge and skills needed to perform document tasks, that is, to search, comprehend, and use information from noncontinuous texts in various formats. Document examples include job applications, payroll forms, transportation schedules, maps, tables, and drug or food labels.”

Page 22: “Table 2.3. Percentage of U.S. Adults in College and the Nation in Each Document Literacy Level, by Selected Characteristics”

[409] Report: “The Literacy of America’s College Students.” By Justin D. Baer, Andrea L. Cook, and Stéphane Baldi. American Institutes for Research, January 2006. <www.air.org>

Page 4:

Quantitative Literacy: The knowledge and skills required to perform quantitative literacy tasks, that is, to identify and perform computations, either alone or sequentially, using numbers embedded in printed materials. Quantitative examples include balancing a checkbook, figuring out a tip, completing an order form, or determining the amount of interest on a loan from an advertisement.

Page 23: “Table 2.4. Percentage of U.S. Adults in College and the Nation in Each Quantitative Literacy Level, by Selected Characteristics”

[410] Report: “The Literacy of America’s College Students.” By Justin D. Baer, Andrea L. Cook, and Stéphane Baldi. American Institutes for Research, January 2006. <www.air.org>

Page 5: “The literacy of students in 4-year public institutions was comparable to the literacy of students in 4-year private institutions.”

Page 30: “Prose literacy was higher for students in selective 4-year colleges, though differences between selective and nonselective 4-year colleges for document and quantitative literacy could not be determined because of the sample size.”

Page 34:

College students come from a variety of economic backgrounds, with some students supporting themselves and others relying on their families to pay for tuition and other necessities.1 Despite variations in income, most differences in the literacy of students across income groups were not significant (Table 4.1).

1 Students were asked whether they were financially independent or whether they were financially dependent on their parents. Depending on their answer, they were asked to report either their parents’ household income or their personal income. The financial information was combined to create a single measure of personal or parents’ household income.

[411] Report: “The Literacy of America’s College Students.” By Justin D. Baer, Andrea L. Cook, and Stéphane Baldi. American Institutes for Research, January 2006. <www.air.org>

Page 35: “Table 4.1. Average Prose, Document, and Quantitative Literacy Scores for U.S. Adults in 2- and 4-Year Colleges, by Income.”

[412] Calculated with data from the article: “Many Business Leaders Doubt U.S. Colleges Prepare Students.” By Preety Sidhu and Valerie J. Calderon. Gallup, February 26, 2014. <news.gallup.com>

On a five-point scale, where 5 means strongly agree and 1 means strongly disagree, please indicate your level of agreement with each of the following statements.

Higher education institutions in this country are graduating students with the skills and competencies that MY business needs.

1 Strongly disagree [=] 17% … 2 [=] 17% … 3 [=] 34% … 4 [=] 22% … 5 Strongly agree [=] 11% …

The study reported includes findings from the Gallup/Lumina Business Leaders Poll on Higher Education, a quantitative survey conducted to understand the perceptions of business leaders about the quality and effectiveness of American higher education institutions in preparing graduates for the workforce. Gallup conducted 623 interviews with business leaders in executive and senior roles at their company. The sample was from Dun & Bradstreet. A simple stratified random sample design was used for sampling businesses. Businesses were grouped into five strata based on sales revenue ($50,000–$499,999/$500,000–$4.9 million/$5 million–$14.9 million/$15 million–$49.9 million/$50 million–$100 million+). Businesses with larger sales revenue were oversampled to ensure enough completes for analysis. Weights were calculated to take into account sampling rate and the non-response rate by sales revenue and census region.

Gallup conducted surveys in English only from Nov. 25–Dec. 16, 2013. Up to five calls were made to each business to reach an eligible respondent.

The questionnaire was developed in consultation with representatives from Lumina Foundation and Gallup. All interviewing was supervised and conducted by Gallup’s full-time interviewing staff. For results based on the total sample size of 623 business leaders, one can say with 95% confidence that the margin of error attributable to sampling and other random effects is ±6 percentage points.

CALCULATION: 17% strongly disagree + 17% disagree + 34% neutral = 68%

[413] Report: “How College Contributes to Workforce Success: Employer Views on What Matters Most.” By Ashley Finley. Association of American Colleges and Universities and Hanover Research, April 2, 2021. <dgmg81phhvh63.cloudfront.net>

Page 1:

This report presents findings from an online survey that was conducted in October 2020 by the Association of American Colleges and Universities in partnership with Hanover Research.

The total respondent sample of 496 included equal numbers of executives and hiring managers who are responsible for making hiring and promotion decisions in US companies of various types and sizes across a wide range of industries…. Only respondents representing organizations at which a minimum of 25 percent of entry-level positions are filled by employees who hold an associate’s or bachelor’s degree were eligible for participation.

Page 15:

While nearly nine in ten employers (87 percent) report that they are at least “somewhat satisfied” with the ability of recent college graduates to apply the skills and knowledge learned in college to complex problems in the workplace, just under half (49 percent) are “very satisfied.” Moreover … just six in ten employers believe that college graduates possess the knowledge and skills needed to succeed in entry-level positions, and just over half (55 percent) believe they possess the knowledge and skills required for advancement and promotion.

Page 16: “Figure 12: Employers do not believe most graduates possess the level of preparedness needed for workforce success.”

Page 22: “Figure 16: Percentages of employers by age and education who are “very satisfied” with college graduates’ ability to apply the skills and knowledge learned in college to complex problems in the workplace. … All Employers [=] 49% … Under Age 40 [=] 59% … Aged 50 and Above [=] 28%”

NOTES:

  • The margin of error was provided to Just Facts by the Association of American Colleges and Universities via e-mail on March 4, 2022.
  • For facts about what constitutes a scientific survey and the factors that affect its accuracy, visit Just Facts’ research on Deconstructing Polls & Surveys.

[414] Report: “How Should Colleges Assess And Improve Student Learning? Employers’ Views On The Accountability Challenge.” Peter D. Hart Research Associates for the Association of American Colleges and Universities, January 9, 2008. <files.eric.ed.gov>

Page 1: “From November 8 to December 12, 2007, Peter D. Hart Research Associates, Inc., interviewed 301 employers whose companies have at least 25 employees and report that 25% or more of their new hires hold at least a bachelor’s degree from a four-year college. … The margin of error for this survey is ±5.7 percentage points.”
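The quoted ±5.7-point figure is close to what the textbook 95 percent margin-of-error formula for a simple random sample gives at n = 301 with the worst-case proportion p = 0.5. This is a rough consistency check, not Hart Research’s actual computation, which is not published:

```python
from math import sqrt

n = 301                             # employers interviewed
p = 0.5                             # worst-case proportion maximizes the margin
moe = 1.96 * sqrt(p * (1 - p) / n)  # 95% confidence (z = 1.96)
print(f"±{moe * 100:.1f} percentage points")  # ±5.6, near the reported ±5.7
```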

Page 3:

Employers believe that college graduates are reasonably well prepared in a variety of areas, but in no area do employers give them exceptionally strong marks. When asked to evaluate recent college graduates’ preparedness in 12 areas, employers give them the highest marks for teamwork, ethical judgment, and intercultural skills, and the lowest scores for global knowledge, self-direction, and writing. …

In none of the 12 areas tested does a majority of employers give college graduates a high rating (or “8,” “9,” or “10”) for their level of preparedness. …

Employers Evaluate College Graduates’ Preparedness In Key Areas

[415] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[416] Report: “Income in the United States: 2021.” By Jessica Semega and Melissa Kollar. U.S. Census Bureau, September 2022. <www.census.gov>

Page 13:

Data on income collected in the CPS ASEC [Current Population Survey Annual Social and Economic Supplements] by the U.S. Census Bureau cover money income received (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc. Money income also excludes tax credits such as the Earned Income Tax Credit, the Child Tax Credit, and special COVID-19-related stimulus payments. Money income does not reflect that some families receive noncash benefits such as Supplemental Nutrition Assistance/food stamps, health benefits, and subsidized housing. In addition, money income does not reflect the fact that noncash benefits often take the form of the use of business transportation and facilities, full or partial payments by business for retirement programs, or medical and educational expenses. …

Data users should consider these elements when comparing income levels. Moreover, readers should be aware that for many different reasons there is a tendency in household surveys for respondents to underreport their income. Based on an analysis of independently derived income estimates, the Census Bureau determined that respondents report income earned from wages or salaries more accurately than other sources of income, and that the reported wage and salary income is nearly equal to independent estimates of aggregate income.

[417] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[418] Report: “Income in the United States: 2021.” By Jessica Semega and Melissa Kollar. U.S. Census Bureau, September 2022. <www.census.gov>

Page 13:

Data on income collected in the CPS ASEC [Current Population Survey Annual Social and Economic Supplements] by the U.S. Census Bureau cover money income received (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc. Money income also excludes tax credits such as the Earned Income Tax Credit, the Child Tax Credit, and special COVID-19-related stimulus payments. Money income does not reflect that some families receive noncash benefits such as Supplemental Nutrition Assistance/food stamps, health benefits, and subsidized housing. In addition, money income does not reflect the fact that noncash benefits often take the form of the use of business transportation and facilities, full or partial payments by business for retirement programs, or medical and educational expenses. …

Data users should consider these elements when comparing income levels. Moreover, readers should be aware that for many different reasons there is a tendency in household surveys for respondents to underreport their income. Based on an analysis of independently derived income estimates, the Census Bureau determined that respondents report income earned from wages or salaries more accurately than other sources of income, and that the reported wage and salary income is nearly equal to independent estimates of aggregate income.

[419] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[420] Report: “Income in the United States: 2021.” By Jessica Semega and Melissa Kollar. U.S. Census Bureau, September 2022. <www.census.gov>

Page 13:

Data on income collected in the CPS ASEC [Current Population Survey Annual Social and Economic Supplements] by the U.S. Census Bureau cover money income received (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc. Money income also excludes tax credits such as the Earned Income Tax Credit, the Child Tax Credit, and special COVID-19-related stimulus payments. Money income does not reflect that some families receive noncash benefits such as Supplemental Nutrition Assistance/food stamps, health benefits, and subsidized housing. In addition, money income does not reflect the fact that noncash benefits often take the form of the use of business transportation and facilities, full or partial payments by business for retirement programs, or medical and educational expenses. …

Data users should consider these elements when comparing income levels. Moreover, readers should be aware that for many different reasons there is a tendency in household surveys for respondents to underreport their income. Based on an analysis of independently derived income estimates, the Census Bureau determined that respondents report income earned from wages or salaries more accurately than other sources of income, and that the reported wage and salary income is nearly equal to independent estimates of aggregate income.

[421] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[422] Calculated with the dataset: “Person Income in 2021, Both Sexes, 25 to 64 Years, Total Work Experience.” U.S. Census Bureau. Accessed January 24, 2023 at <www2.census.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[423] Report: “Income in the United States: 2021.” By Jessica Semega and Melissa Kollar. U.S. Census Bureau, September 2022. <www.census.gov>

Page 13:

Data on income collected in the CPS ASEC [Current Population Survey Annual Social and Economic Supplements] by the U.S. Census Bureau cover money income received (exclusive of certain money receipts such as capital gains) before payments for personal income taxes, Social Security, union dues, Medicare deductions, etc. Money income also excludes tax credits such as the Earned Income Tax Credit, the Child Tax Credit, and special COVID-19-related stimulus payments. Money income does not reflect that some families receive noncash benefits such as Supplemental Nutrition Assistance/food stamps, health benefits, and subsidized housing. In addition, money income does not reflect the fact that noncash benefits often take the form of the use of business transportation and facilities, full or partial payments by business for retirement programs, or medical and educational expenses. …

Data users should consider these elements when comparing income levels. Moreover, readers should be aware that for many different reasons there is a tendency in household surveys for respondents to underreport their income. Based on an analysis of independently derived income estimates, the Census Bureau determined that respondents report income earned from wages or salaries more accurately than other sources of income, and that the reported wage and salary income is nearly equal to independent estimates of aggregate income.

[424] The next three footnotes document that:

  • Private-sector economic output is equal to personal consumption expenditures (PCE) + gross private domestic investment (GPDI) + net exports of goods and services.
  • PCE is the “primary measure of consumer spending on goods and services” by private individuals and nonprofit organizations.
  • GPDI is a measure of private spending on “structures, equipment, and intellectual property products.”

Since education is not a service that is typically imported or exported, a valid approximation of private spending on education can be arrived at by summing PCE and GPDI. The fourth footnote below details the data used in this calculation.
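The approximation described above is a simple sum of the two education components. A minimal sketch, using hypothetical placeholder figures rather than the BEA values in the dataset cited in the fourth footnote:

```python
def private_education_spending(pce_education, gpdi_education):
    """Approximate private spending on education as the education
    component of PCE plus the education component of GPDI, since
    education services are rarely imported or exported (net exports ~ 0)."""
    return pce_education + gpdi_education

# Hypothetical figures in billions of dollars, for illustration only:
print(private_education_spending(300.0, 20.0))  # 320.0
```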

[425] Report: “Fiscal Year 2013 Analytical Perspectives, Budget of the U.S. Government.” White House Office of Management and Budget, February 12, 2012. <www.gpo.gov>

Page 471:

The main purpose of the NIPAs [national income and product accounts published by the U.S. Bureau of Economic Analysis] is to measure the Nation’s total production of goods and services, known as gross domestic product (GDP), and the incomes generated in its production. GDP excludes intermediate production to avoid double counting. Government consumption expenditures along with government gross investment—State and local as well as Federal—are included in GDP as part of final output, together with personal consumption expenditures, gross private domestic investment, and net exports of goods and services (exports minus imports).

[426] Report: “Concepts and Methods of the U.S. National Income and Product Accounts, Chapter 5: Personal Consumption Expenditures.” U.S. Bureau of Economic Analysis. Updated December 2022. <www.bea.gov>

Page 5-1:

Personal consumption expenditures (PCE) is the primary measure of consumer spending on goods and services in the U.S. economy.1 It accounts for about two-thirds of domestic final spending, and thus it is the primary engine that drives future economic growth. PCE shows how much of the income earned by households is being spent on current consumption as opposed to how much is being saved for future consumption.

PCE also provides a comprehensive measure of types of goods and services that are purchased by households. Thus, for example, it shows the portion of spending that is accounted for by discretionary items, such as motor vehicles, or the adjustments that consumers make to changes in prices, such as a sharp run-up in gasoline prices.2

Page 5-2:

PCE measures the goods and services purchased by “persons”—that is, by households and by nonprofit institutions serving households (NPISHs)—who are resident in the United States. Persons resident in the United States are those who are physically located in the United States and who have resided, or expect to reside, in this country for 1 year or more. PCE also includes purchases by U.S. government civilian and military personnel stationed abroad, regardless of the duration of their assignments, and by U.S. residents who are traveling or working abroad for 1 year or less.3

Page 5-69:

Nonprofit Institutions Serving Households

In the NIPAs [National Income and Product Accounts], nonprofit institutions serving households (NPISHs), which have tax-exempt status, are treated as part of the personal sector of the economy. Because NPISHs produce services that are not generally sold at market prices, the value of these services is measured as the costs incurred in producing them.

In PCE, the value of a household purchase of a service that is provided by a NPISH consists of the price paid by the household or on behalf of the household for that service plus the value added by the NPISH that is not included in the price. For example, the value of the educational services provided to a student by a university consists of the tuition fee paid by the household to the university and of the additional services that are funded by sources other than tuition fees (such as by the returns to an endowment fund).

[427] Report: “Measuring the Economy: A Primer on GDP and the National Income and Product Accounts.” U.S. Bureau of Economic Analysis, December 2015. <www.bea.gov>

Page 8: “Gross private domestic investment consists of purchases of fixed assets (structures, equipment, and intellectual property products) by private businesses that contribute to production and have a useful life of more than one year, of purchases of homes by households, and of private business investment in inventories.”

[428] Calculated with the dataset: “Table 2.4.5U. Personal Consumption Expenditures by Type of Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[429] Calculated with data from the report: “Early Learning and Child Care: Agencies Have Helped Address Fragmentation and Overlap Through Improved Coordination.” U.S. Government Accountability Office, July 2017. <www.gao.gov>

Page 2 (of PDF):

Multiple federal programs may provide or support early learning or child care for children age 5 and under. Of these programs, nine† describe early learning or child care as an explicit purpose and are administered by the Departments of Health and Human Services (HHS), Education (Education), and the Interior (Interior). Fiscal year 2015 obligations for these nine† programs totaled approximately $15 billion, with the vast majority of these funds concentrated in Head Start and the Child Care and Development Fund. An additional 35‡ programs did not have an explicit early learning or child care purpose, but permitted funds to be used for these services. Additionally, three§ tax expenditures subsidized individuals’ private purchase of child or dependent care.

Pages 2–3:

To address our objectives, we used three criteria to identify relevant programs: they (1) funded or supported early learning or child care services, (2) were provided to children age 5 and under, and (3) delivered services in an educational or child care setting. We limited our review to programs for which federal funds were obligated in fiscal year 2015, the most recent available obligations data at the time we conducted our work. We did not conduct a separate legal review to identify and analyze relevant programs or verify the accuracy of the information agencies provided to us.

To address our first objective, we started with the list of 45 programs and 5 tax expenditures in our 2012 review.4 We sent questionnaires to nine agencies and one regional commission included in the 2012 review and received responses from them all. We conducted follow-up interviews with agency officials to confirm that these programs and tax expenditures continued to meet all three of our criteria in fiscal year 2015. We also reviewed supplementary information, such as information from annual reports and program notices in the Federal Register, from the Departments of Education (Education), Health and Human Services (HHS), the Interior (Interior), and all the other agencies included in our prior review. After we created a preliminary list of programs, we counted the number of federal early learning and child care programs by examining the key benefits and services they provide.5 Using a similar definition as in our prior review, we considered a program to have an explicit early learning or child care purpose if, according to our analysis, early learning or child care is specifically described as a program purpose in the Catalog of Federal Domestic Assistance (CFDA) or in agency documents. We categorized all other programs included in this review as not having an explicit early learning or child care purpose. In this review, we also included tax expenditures that could be used to subsidize families or employers for early learning or child care related expenses.6 After we identified programs and tax expenditures that met our criteria, we obtained information about fiscal year 2015 program obligations from the President’s budget for fiscal year 2017. We used the Department of the Treasury’s (Treasury) Tax Expenditure Estimates for fiscal year 2017 to obtain information on estimated losses in revenue in fiscal year 2015 for tax expenditures.

CALCULATION: †9 programs + ‡35 programs + §3 tax expenditures = 47 programs and tax expenditures
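For illustration, the tally above can be checked with a short Python snippet. The dagger symbols refer to categories marked elsewhere in this research; the variable names here are only descriptive placeholders, not official labels.

```python
# Tally of federal early learning and child care programs and tax
# expenditures, using the counts from the calculation line above.
programs_group_1 = 9    # † programs
programs_group_2 = 35   # ‡ programs
tax_expenditures = 3    # § tax expenditures

total = programs_group_1 + programs_group_2 + tax_expenditures
print(total)  # 47
```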

[430] Report: “Child Care: Information on Integrating Early Care and Education Funding.” U.S. Government Accountability Office, September 14, 2016. <www.gao.gov>

Page 1:

Every year millions of children under the age of 5 participate in federal and state early care and education programs. For fiscal years 2010 to 2015, Congress appropriated almost $48 billion to Head Start and over $31 billion to the Child Care and Development Fund (CCDF), the two largest sources of federal funding for early care and education. The Head Start program is administered by the Department of Health and Human Services (HHS) through about 1,800 grants to groups who deliver education, nutrition, health, and other social services to approximately 1 million children in poverty from birth to age 5 each year. Through Head Start, HHS funds two programs—Head Start, which provides early care and education to 3- and 4- year-olds, and Early Head Start, which serves pregnant women and children from birth up to age 3. CCDF funding is provided through a block grant to states and tribes to, among other things, help low-income, working families pay for child care (for children from birth to 12) so that parents can work, pursue an education, or attend job training. Additionally, states spend about $5.6 billion annually on state-funded prekindergarten (Pre-K) programs.1

[431] Report: “Budget of the U.S. Government Fiscal Year 2024, Appendix.” White House, Office of Management and Budget, March 2023. <www.govinfo.gov>

Pages 460–461: “Payments to States for the Child Care and Development Block Grant … Program and Financing (in millions of dollars) … Obligations by program activity … Line 0001 … Child Care Block Grant Payments to States … 2022 actual [=] 6,135”

Pages 461–463: “Children and Families Services Programs … Program and Financing (in millions of dollars) … Obligations by program activity … Line 0101 Head Start … 2022 actual [=] 11,033”

[432] Report: “Office of Head Start – Services Snapshot, National All Programs (2021–2022).” U.S. Department of Health & Human Services, Office of Head Start, November 2022. <files.eric.ed.gov>

Page 1:

This National Services Snapshot summarizes key data on demographics and services for children from birth to age five and pregnant women served by Head Start, Early Head Start, and Migrant and Seasonal Head Start programs. The data in this Snapshot is a subset of the annual Program Information Report (PIR) submission to the Office of Head Start. …

Total Cumulative Enrollment

Actual number of children and pregnant women served by the program throughout the entire program year, inclusive of enrollees who left during the program year and the enrollees who filled those empty places. Due to turnover, more children and families may receive Head Start services cumulatively throughout the program year (all of whom are reported in the PIR) than indicated by the funded enrollment numbers.

Total cumulative enrollment [=] 801,077

Children total cumulative enrollment [=] 788,341

Pregnant women total cumulative enrollment [=] 12,736

[433] Calculated with data from the report: “Head Start Program Facts, Fiscal Year 2021.” U.S. Department of Health & Human Services, Office of Head Start, September 20, 2022. <eclkc.ohs.acf.hhs.gov>

Page 1 (of PDF):

Established in 1965, Head Start promotes school readiness for children in low-income families by offering educational, nutritional, health, social, and other services. Since its inception, Head Start has served more than 38 million children birth to age 5 and their families. In 2021, Head Start received funding to serve about 839,116 children and pregnant people in centers, family homes, and in family child care homes. …

Throughout this fact sheet, unless otherwise specified, “Head Start” refers to the Head Start program as a whole. This includes Head Start preschool services to children primarily ages 3 to 5; Early Head Start services to infants, toddlers, and pregnant people; and services to families provided by American Indian and Alaska Native (AIAN) and Migrant and Seasonal Head Start (MSHS) programs.

“Funded enrollment” (also called “enrollment slots”) refers to the number of children and pregnant people supported by federal Head Start funds in a program at any one time during the program years. This number includes slots funded by state or other funds when used by grant recipients as required nonfederal match. States may provide additional funding to local Head Start programs, which is not included in federal Head Start reporting.

“Cumulative enrollment” refers to the actual number of children and pregnant people Head Start programs serve throughout the entire program year, inclusive of enrollees who left during the program year and the enrollees who filled those empty places. Due to turnover, more children and families may receive Head Start services throughout the program year than is reflected in funded enrollment. All of these enrollees are reported in the Program Information Report (PIR). …

Total [=] $10,747,915,429

CALCULATION: $10,747,915,429 / 839,116 = $12,809 federal funding per enrollee
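The per-enrollee figure above can be reproduced as follows, using the funding total and funded-enrollment count from the fact sheet quoted in this footnote:

```python
# FY2021 federal Head Start funding divided by funded enrollment.
total_funding = 10_747_915_429   # total federal Head Start funding, dollars
funded_enrollment = 839_116      # funded children and pregnant people

per_enrollee = round(total_funding / funded_enrollment)
print(per_enrollee)  # 12809
```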

[434] Report: “Head Start: Undercover Testing Finds Fraud and Abuse at Selected Head Start Centers.” U.S. Government Accountability Office, May 18, 2010. <www.gao.gov>

Page 2 (of PDF):

The Head Start program, overseen by the Department of Health and Human Services and administered by the Office of Head Start, provides child development services primarily to low-income families and their children. Federal law allows up to 10 percent of enrolled families to have incomes above 130 percent of the poverty line—GAO [U.S. Government Accountability Office] refers to them as “over-income.” Families with incomes below 130 percent of the poverty line, or who meet certain other criteria, are referred to as “under-income”. Nearly 1 million children a year participate in Head Start, and the American Recovery and Reinvestment Act provided an additional $2.1 billion in funding.

GAO received hotline tips alleging fraud and abuse by grantees. In response, GAO investigated the validity of the allegations, conducted undercover tests to determine if other centers were committing fraud, and documented instances where potentially eligible children were put on Head Start wait lists. The investigation of allegations is ongoing.

To perform this work, GAO interviewed grantees and a number of informants and reviewed documentation. GAO used fictitious identities and bogus documents for proactive testing of Head Start centers. GAO also interviewed families on wait lists. Results of undercover tests and family interviews cannot be projected to the entire Head Start program. In a corrective action briefing, agency officials agreed to address identified weaknesses. …

GAO received allegations of fraud and abuse involving two Head Start nonprofit grantees in the Midwest and Texas. Allegations include manipulating recorded income to make over-income applicants appear under-income, encouraging families to report that they were homeless when they were not, enrolling more than 10 percent of over-income children, and counting children as enrolled in more than one center at a time. GAO confirmed that one grantee operated several centers with more than 10 percent over-income students, and the other grantee manipulated enrollment data to over-report the number of children enrolled. GAO is still investigating the other allegations reported.

Realizing that these fraud schemes could be perpetrated at other Head Start programs, GAO attempted to register fictitious children as part of 15 undercover test scenarios at centers in six states and the District of Columbia. In 8 instances staff at these centers fraudulently misrepresented information, including disregarding part of the families’ income to register over-income children into under-income slots. The undercover tests revealed that 7 Head Start employees lied about applicants’ employment status or misrepresented their earnings. This leaves Head Start at risk that over-income children may be enrolled while legitimate under-income children are put on wait lists. At no point during our registrations was information submitted by GAO’s fictitious parents verified, leaving the program at risk that dishonest persons could falsify earnings statements and other documents in order to qualify. In 7 instances centers did not manipulate information.

Page 10:

Case: 12 … State: California … Undercover scenario: Income exceeded poverty guidelines … Case details:

• The income for the family of three (mother, father, and child) was $12,000 more than allowed for the family to be considered income-eligible.

• A Head Start associate denied this application because the family was over-income.

• The Head Start associate explained that families often lie about being separated or divorced in order to reduce their income and that Head Start is not strict about checking whether that is true. …

We also identified a key vulnerability during our investigation that could allow over-income children to be enrolled in other Head Start centers: income documentation for enrollees is not required to be maintained by grantees. According to HHS [Department of Health and Human Services] guidance, Head Start center employees must sign a statement attesting that the applicant child is eligible and identifying which income documents they examined, such as W-2s or pay stubs; however, they do not have to maintain copies of them. We discovered that the lack of documentation made it virtually impossible to determine whether only under-income children were enrolled in spots reserved for under-income children.

[435] Report: “Head Start: Action Needed to Enhance Program Oversight and Mitigate Significant Fraud and Improper Payment Risks.” U.S. Government Accountability Office, September 2019. <www.gao.gov>

Page 2 (of PDF):

Why GAO [U.S. Government Accountability Office] Did This Study

Congress appropriated over $10 billion for programs under the Head Start Act, to serve approximately 1 million children through about 1,600 Head Start grantees and their centers nationwide. This report discusses (1) what vulnerabilities GAO’s covert tests identified in selected Head Start grantees’ controls for program eligibility screening; (2) the extent to which OHS [Office of Head Start] provides timely monitoring of grantees; and (3) what control vulnerabilities exist in OHS’s methods for ensuring grantees provide services for all children and pregnant women they are funded to serve. GAO conducted 15 nongeneralizable covert tests at Head Start centers in metropolitan areas. GAO selected only centers that were underenrolled to be sure we did not displace any actual, eligible children.

Page 2: “You asked us to review the Head Start program to see whether the internal control vulnerabilities we identified in 2010 persist.”

Page 5:

We conducted this performance audit from October 2017 to July 2019 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. We conducted our related investigative work in accordance with investigative standards prescribed by the Council of the Inspectors General on Integrity and Efficiency.

Page 16:

Our covert tests and eligibility file reviews for selected Head Start grantees found control vulnerabilities and potential fraud and improper payment risks that OHS has not fully assessed. While our covert tests and eligibility file reviews are nongeneralizable, they nonetheless illustrate that Head Start center staff do not always properly verify eligibility and exemplify control vulnerabilities that present fraud and improper payment risks to the Head Start program. Leading practices for managing fraud risks state that agencies should assess fraud risks as part of a strategy to mitigate the likelihood and effect of fraud. During this review, OHS officials told us they did not believe the program was at risk of fraud or improper payments. However, OHS has not performed a comprehensive fraud risk assessment to support this determination.

Pages 17–18:

In seven of 15 covert tests, the Head Start centers correctly determined we were not eligible. In these seven tests, staff at the Head Start centers categorized our applications as over-income. In some cases, the staff recommended other child-care services or placed us on a waitlist as an over-income applicant, as permitted by program rules.

In three of 15 covert tests, we identified control vulnerabilities, as Head Start Center staff encouraged us to attend without following all eligibility-verification requirements.

• In one of these three cases, we did not provide any documentation to support claims of receiving public assistance and earned wages, as required by program regulations, but we were still accepted into the program.

• In the second of these three cases, we did not provide any documentation to support claims of receiving cash income from a third-party source, as required by program regulations, but Head Start staff encouraged us to attend nonetheless.

• In the third of these three cases, center staff emphasized we would need to indicate income below a specific amount (the federal poverty level) so that we would qualify. We later retrieved our eligibility documents from this center’s files and found that some documents in the file noted the grantee had reviewed our income information—though we had provided none—and other documents in the file noted the grantee was still waiting on our income documentation. We were eventually contacted by Head Start center personnel and told we were accepted into the program and asked to provide income documentation, though our income had not yet been verified.

While these three cases showed several vulnerabilities, such as instructions regarding income limit and approval without the documentation, we did not categorize these three cases as potential fraud because we did not have evidence of staff knowingly and willfully making false statements or encouraging our applicant to make a statement they knew to be false. Also, in each of these three cases, we were told we could bring the missing documentation when the child began attendance or at orientation.

In the remaining five of 15 covert tests, we found indicators of potential fraud, as described in greater detail below. We plan to refer these five cases of potential fraud to the HHS [U.S. Department of Health and Human Services] Office of Inspector General (OIG) for further action as appropriate.

• In three of these five potential fraud cases, documents we later retrieved from the Head Start centers’ files showed that our applications were fabricated to exclude income information we provided, which would have shown the family to be over-income. For example, in one case the Form 1040 Internal Revenue Service (IRS) tax form we submitted as proof of income was replaced with another fabricated 1040 tax form. The fabricated 1040 tax form showed a lowered qualifying income amount, and the applicant signature was forged.

• In two of the five potential fraud cases, Head Start center staff dismissed eligibility documentation we offered during the enrollment interview. For example, in one case we explained we had two different jobs and offered an IRS W-2 Wage and Tax Statement (W-2) for one job and an employment letter from a separate employer. The combined income for these jobs would have shown the family to be over-income. However, the Head Start center only accepted income documentation from one job and told us we did not need to provide documentation of income from the second job—actions which made our applicant erroneously appear to be below the federal poverty level.

[436] Calculated with data from:

a) Dataset: “Enrollment of 3-, 4-, and 5-Year-Old Children in Preprimary Programs, by Level of Program, Control of Program, and Attendance Status: Selected Years, 1965 Through 2012.” U.S. Department of Education, National Center for Education Statistics, May 2013. <nces.ed.gov>

b) Dataset: “Nursery and Primary School Enrollment of People 3 to 6 Years Old, by Control of School, Attendance Status, Age, Race, Hispanic Origin, Mother’s Labor Force Status and Education, and Family Income.” U.S. Census Bureau. <www2.census.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[437] Calculated with data from:

a) Dataset: “Enrollment of 3-, 4-, and 5-Year-Old Children in Preprimary Programs, by Level of Program, Control of Program, and Attendance Status: Selected Years, 1965 Through 2012.” U.S. Department of Education, National Center for Education Statistics, May 2013. <nces.ed.gov>

b) Dataset: “Nursery and Primary School Enrollment of People 3 to 6 Years Old, by Control of School, Attendance Status, Age, Race, Hispanic Origin, Mother’s Labor Force Status and Education, and Family Income.” U.S. Census Bureau. <www2.census.gov>

NOTE: An Excel file containing the data and calculations is available upon request.

[438] Webpage: “Fact Sheet: President Obama’s Plan for Early Education for All Americans.” White House, Office of the Press Secretary, February 13, 2013. <obamawhitehouse.archives.gov>

In his State of the Union address, President Obama called on Congress to expand access to high-quality preschool to every child in America. As part of that effort, the President will propose a series of new investments that will establish a continuum of high-quality early learning for a child—beginning at birth and continuing to age 5. …

High-quality early childhood education provides the foundation for all children’s success in school and helps to reduce achievement gaps. Despite the individual and economic benefits of early education, our nation has lagged in its commitment to ensuring the provision of high quality public preschool in our children’s earliest years. …

Preschool for All

The President’s proposal will improve quality and expand access to preschool, through a cost sharing partnership with all 50 states, to extend federal funds to expand high-quality public preschool to reach all low- and moderate-income four-year olds from families at or below 200% of poverty. … The proposal would include an incentive for states to broaden participation in their public preschool program for additional middle-class families, which states may choose to reach and serve in a variety of ways, such as a sliding-scale arrangement. …

The proposal also encourages states to expand the availability of full-day kindergarten. …

The President will also launch a new Early Head Start–Child Care Partnership program, to support states and communities that expand the availability of Early Head Start and child care providers that can meet the highest standards of quality for infants and toddlers, serving children from birth through age 3. Funds will be awarded through Early Head Start on a competitive basis to enhance and support early learning settings; provide new, full-day, comprehensive services that meet the needs of working families; and prepare children for the transition into preschool. …

The President is proposing to expand the Administration’s evidence-based home visiting initiative, through which states are implementing voluntary programs that provide nurses, social workers, and other professionals to meet with at-risk families in their homes and connect them to assistance that impacts a child’s health, development, and ability to learn.

[439] Webpage: “Summary of Senate Bill S.1380: Strong Start for America’s Children Act of 2015.” U.S. Congress. Accessed August 8, 2015 at <www.congress.gov>

This bill directs the Department of Education (ED) to allot matching grants to states and, through them, subgrants to local educational agencies, childhood education program providers, or consortia of those entities to implement high-quality prekindergarten programs for children from low-income families.

Grants are allotted to states based on each state’s proportion of children who are age four and who are from families with incomes at or below 200% of the poverty level.

“High-quality prekindergarten programs” are those that serve children three or four years of age and meet criteria concerning: class size; learning environments; teacher qualifications, salaries, and professional development; program monitoring; and accessibility to comprehensive health and support services.

States may apply to use up to 15% of their grant for subgrants to high-quality early childhood education and care programs for infants and toddlers whose family income is at or below 200% of the poverty level.

ED and the Department of Health and Human Services (HHS) shall develop a process to: (1) provide Head Start program services to children younger than age four in states or regions that already provide four-year-olds whose family income is at or below 200% of the poverty level with sustained access to high-quality prekindergarten programs, or (2) convert programs to serve infants and toddlers.

ED shall award competitive matching grants to states to increase their capacity to offer high-quality prekindergarten programs. States must provide assurances that they will use their grant to become eligible, within three years of receiving the grant, for this Act’s grants for high-quality prekindergarten programs. …

[440] Webpage: “Cosponsors of Senate Bill S.1380: Strong Start for America’s Children Act of 2015.” U.S. Congress. Accessed December 30, 2016 at <www.congress.gov>

Sponsor: Murray, Patty [D-WA] (Introduced 05/19/2015)

Cosponsors (24):

Casey, Robert P., Jr. [D-PA]

Hirono, Mazie K. [D-HI]

Franken, Al [D-MN]

Markey, Edward J. [D-MA]

Schatz, Brian [D-HI]

Udall, Tom [D-NM]

Kaine, Tim [D-VA]

Mikulski, Barbara A. [D-MD]

Murphy, Christopher S. [D-CT]

Durbin, Richard [D-IL]

Coons, Christopher A. [D-DE]

Heinrich, Martin [D-NM]

Whitehouse, Sheldon [D-RI]

Baldwin, Tammy [D-WI]

Cantwell, Maria [D-WA]

Gillibrand, Kirsten E. [D-NY]

Wyden, Ron [D-OR]

Booker, Cory A. [D-NJ]

Warren, Elizabeth [D-MA]

Sanders, Bernard [I-VT]

Klobuchar, Amy [D-MN]

Cardin, Benjamin L. [D-MD]

Tester, Jon [D-MT]

Reed, Jack [D-RI]

[441] Article: “Exceedingly Social, But Doesn’t Like Parties.” By Michael Powell. Washington Post, November 5, 2006. <www.washingtonpost.com>

Quoting Sanders: “I’m a democratic socialist.”

[442] Article: “Bernie Sanders: Obamacare Is a ‘Good Republican Program.’ ” By Bryan Koenig. CNN, September 24, 2013. <bit.ly>

“Sanders, an Independent who caucuses with Senate Democrats, reiterated his support of a universal single-payer Medicare for all, inspired by health care programs in Europe.”

[443] Webpage: “Party Division in the Senate, 1789–Present.” U.S. Senate Historical Office. Accessed August 8, 2015 at <www.senate.gov>

Note: Statistics listed below reflect party division immediately following the election, unless otherwise noted. The actual number of senators representing a particular party often changes during a congress, due to the death or resignation of a senator, or as a consequence of a member changing parties.

114th Congress (2015–2017)

Majority Party: Republican (54 seats)

Minority Party: Democrat (44 seats)

Other Parties: 2 Independents (both caucus with the Democrats)

Total Seats: 100

[444] Webpage: “Major Actions of Senate Bill S.1380: Strong Start for America’s Children Act of 2015.” U.S. Congress. Accessed December 30, 2016 at <www.congress.gov>

“05/19/2015: Introduced in Senate”

[445] Webpage: “Fact Sheet: The American Families Plan.” White House, Office of the Press Secretary, April 28, 2021. <www.whitehouse.gov>

Today, President Biden announced the American Families Plan, an investment in our kids, our families, and our economic future. …

Universal Pre-School for All Three- and Four-Year-Olds

President Biden is calling for a national partnership with states to offer free, high-quality, accessible, and inclusive preschool to all three- and four-year-olds, benefitting five million children and saving the average family $13,000, when fully implemented. This historic $200 billion investment in America’s future will first prioritize high-need areas and enable communities and families to choose the settings that work best for them. The President’s plan will also ensure that all publicly-funded preschool is high-quality, with low student-to-teacher ratios, high-quality and developmentally appropriate curriculum, and supportive classroom environments that are inclusive for all students. The President’s plan will leverage investments in tuition-free community college and teacher scholarships to support those who wish to earn a bachelor’s degree or another credential that supports their work as an educator, or to become an early childhood educator. And, educators will receive job-embedded coaching, professional development, and wages that reflect the importance of their work. All employees in participating pre-K programs and Head Start will earn at least $15 per hour, and those with comparable qualifications will receive compensation commensurate with that of kindergarten teachers. These investments will give American children a head start and pave the way for the best-educated generation in U.S. history. …

Building on the American Jobs Plan’s investments in school and child care infrastructure and workforce training, President Biden’s American Families Plan will ensure low and middle-income families pay no more than 7 percent of their income on high-quality child care for children under 5 years-old, saving the average family $14,800 per year on child care expenses, while also generating lifetime benefits for three million children, supporting hundreds of thousands of child care providers and workers, allowing roughly one million parents, primarily mothers, to enter the labor force, and significantly bolstering inclusive and equitable economic growth. Specifically, President Biden’s plan will invest $225 billion to:

Make child care affordable. Families will pay only a portion of their income based on a sliding scale. For the most hard-pressed working families, child care costs for their young children would be fully covered and families earning 1.5 times their state median income will pay no more than 7 percent of their income for all children under age five. The plan will also provide families with a range of inclusive and accessible options to choose from for their child, from child care centers to family child care providers to Early Head Start.

[446] Webpage: “The Build Back Better Framework.” White House. Accessed February 28, 2022 at <www.whitehouse.gov>

The Build Back Better Framework …

Offers universal and free preschool for all 3- and 4-year-olds, the largest expansion of universal and free education since states and communities across the country established public high school 100 years ago. … The Build Back Better framework will enable states to expand access to free preschool for more than 6 million children per year and increase the quality of preschool for many more children already enrolled. Importantly, parents will be able to send children to high-quality preschool in the setting of their choice—from public schools to child care providers to Head Start. …

The Build Back Better framework will ensure that middle-class families pay no more than 7 percent of their income on child care and will help states expand access to high-quality, affordable child care to about 20 million children per year—covering 9 out of 10 families across the country with young children. For two parents with one toddler earning $100,000 per year, the framework will produce more than $5,000 in child care savings per year. Nearly all families of four making up to $300,000 per year will be eligible.

[447] House Resolution 5376: “Build Back Better Act.” U.S. House of Representatives, 117th Congress (2021–2022). Accessed February 28, 2022 at <www.congress.gov>

09/27/2021 — Introduced in House …

Subtitle D—Child Care and Universal Pre-Kindergarten

Section 23001. Birth Through Five Child Care and Early Learning Entitlement. …

(D) Eligible child.—The term “eligible child” means an individual (without regard to the immigration status of the individual or of any parent of the individual)—

(i) who is less than 6 years of age;

(ii) who is not yet in kindergarten;

(iii) whose family income—

(I) does not exceed 100 percent of the State median income for a family of the same size for fiscal year 2022;

(II) does not exceed 115 percent of such State median income for fiscal year 2023;

(III) does not exceed 130 percent of such State median income for fiscal year 2024; and

(IV) for each of the fiscal years 2025 through 2027, is of any level;

(iv) whose family assets do not exceed $1,000,000 (as certified by a member of such family); and

(v) who—

(I) resides with a parent participating in an eligible activity;

(II) is included in a population of vulnerable children identified by the lead agency involved, which at a minimum shall include children experiencing homelessness, children in foster care, children in kinship care, and children who are receiving, or need to receive, child protective services; or

(III) resides with a parent who is more than 65 years of age. …
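As a reading aid, the income phase-in in clause (iii) above can be sketched as a small Python function. This models the income test only; the age, asset, and residency criteria quoted above are not included, and the sketch is not statutory text.

```python
# Income caps from Sec. 23001(D)(iii) of H.R. 5376 as quoted above,
# expressed as a multiple of state median income (SMI).
INCOME_CAP_BY_YEAR = {2022: 1.00, 2023: 1.15, 2024: 1.30}

def meets_income_test(family_income, state_median_income, fiscal_year):
    # Fiscal years 2025 through 2027 impose no income cap ("of any level").
    if 2025 <= fiscal_year <= 2027:
        return True
    return family_income <= INCOME_CAP_BY_YEAR[fiscal_year] * state_median_income

print(meets_income_test(120_000, 100_000, 2023))  # False (cap is 115% of SMI)
print(meets_income_test(120_000, 100_000, 2026))  # True
```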

(c) Appropriations.—

(1) In general.—In addition to amounts otherwise available, there is appropriated to the Department of Health and Human Services, out of any money in the Treasury not otherwise appropriated, for carrying out this section—

(A) $20,000,000,000 for fiscal year 2022, to remain available until September 30, 2025;

(B) $30,000,000,000 for fiscal year 2023, to remain available until September 30, 2026;

(C) $40,000,000,000 for fiscal year 2024, to remain available until September 30, 2027;

(D) such sums as may be necessary for each of fiscal years 2025 through 2027, to remain available for one fiscal year. …

(d) Establishment of Birth Through Five Child Care and Early Learning Entitlement Program.—

(1) In general.—The Secretary is authorized to administer a child care and early learning entitlement program under which families, in States, territories, and Indian Tribes with an approved application under subsection (f) or (g), shall be provided an opportunity to obtain high-quality child care services for eligible children, subject to the requirements of this section.

(2) Assistance for every eligible child.—Beginning on October 1, 2024, every family who applies for assistance under this section with respect to a child in a State with an approved application under subsection (g), or in a territory or Indian tribe with an approved application under subsection (f), and who is determined, by a lead agency (or other entity designated by a lead agency) following standards and procedures established by the Secretary by rule, to be an eligible child, shall be offered child care assistance in accordance with and subject to the requirements and limitations of this section. …

(E) Sliding fee scale for copayments.—

(i) In general.—Except as provided in clauses (ii)(I) and (iii), the State plan shall provide an assurance that the State will for the period covered by the plan use a sliding fee scale … to determine a copayment for a family receiving assistance under this section….

(ii) Sliding fee scale.—A full copayment … shall use a sliding fee scale that provides that, for a family with a family income—

(I) of not more than 75 percent of State median income for a family of the same size, the family shall not pay a copayment, toward the cost of the child care involved for all eligible children in the family;

(II) of more than 75 percent but not more than 100 percent of State median income for a family of the same size, the copayment shall be more than 0 but not more than 2 percent of that family income, toward such cost for all such children;

(III) of more than 100 percent but not more than 125 percent of State median income for a family of the same size, the copayment shall be more than 2 but not more than 4 percent of that family income, toward such cost for all such children;

(IV) of more than 125 percent but not more than 150 percent of State median income for a family of the same size, the copayment shall be more than 4 but not more than 7 percent of that family income, toward such cost for all such children; and

(V) of more than 150 percent of the State median income for a family of the same size, the copayment shall be 7 percent of that family income, toward such cost for all such children. …

Section 23002. Universal Preschool. …

(5) Eligible child.—The term “eligible child” means a child who is age 3 or 4, on the date established by the applicable local educational agency for kindergarten entry. …

(c) Payments for State Universal Preschool Services. …

(2) Payments to states.—

(A) Preschool services.—The Secretary shall pay to each State with an approved State plan … an amount for each year equal to—

(i) 100 percent of the State’s expenditures in the year for preschool services described in subsection (d), for each of fiscal years 2022, 2023, and 2024;

(ii) 90 percent of the State’s expenditures in the year for such preschool services, for fiscal year 2025;

(iii) 80 percent of the State’s expenditures in the year for such preschool services, for fiscal year 2026;

(iv) 70 percent of the State’s expenditures in the year for such preschool services, for fiscal year 2027; and

(v) 60 percent of the State’s expenditures in the year for such preschool services, for fiscal year 2028. …
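The declining federal share in section 23002(c)(2)(A) amounts to a simple lookup by fiscal year. A minimal sketch (the names `FEDERAL_SHARE` and `federal_payment` and the example dollar figure are illustrative, not from the bill):

```python
# Federal share of State preschool expenditures by fiscal year,
# per section 23002(c)(2)(A) as quoted above.
FEDERAL_SHARE = {
    2022: 1.00, 2023: 1.00, 2024: 1.00,
    2025: 0.90, 2026: 0.80, 2027: 0.70, 2028: 0.60,
}

def federal_payment(fiscal_year, state_expenditures):
    """Federal payment for a State's preschool expenditures in a year."""
    return FEDERAL_SHARE[fiscal_year] * state_expenditures

# Example: $100 million of State spending in FY2026 draws 80 percent federal.
print(federal_payment(2026, 100_000_000))  # 80000000.0
```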

(6) State plan.—In order to be eligible for payments under this section, the Governor of a State shall submit a State plan for universal, high-quality, free, inclusive, and mixed delivery preschool services to the Secretary for approval….

[448] Calculated with data from Vote 385: “Build Back Better Act.” U.S. House of Representatives, November 19, 2021. <clerk.house.gov>

Party         Voted “Yes”          Voted “No”           Voted “Present” or Did Not Vote †
              Number    Portion    Number    Portion    Number    Portion
Republican    0         0%         212       100%       1         0%
Democrat      220       100%       1         0%         0         0%
Independent   0         0%         0         0%         0         0%

NOTE: † Voting “Present” is effectively the same as not voting.

[449] House Resolution 5376: “Inflation Reduction Act of 2022.” U.S. House of Representatives, 117th Congress (2021–2022). Accessed June 30, 2023 at <www.congress.gov>

“08/07/2022 — Passed/agreed to in Senate: Passed Senate with an amendment by Yea-Nay Vote. 51–50.”

[450] Article: “Manchin, Key Dem, Says Build Back Better Bill Is ‘Dead.’ ” By Alan Fram. Associated Press, February 1, 2022. <www.pbs.org>

“What Build Back Better bill?” [West Virginia Senator Joe] Manchin said Tuesday, using the legislation’s name, when reporters asked about it. “There is no, I mean, I don’t know what you’re all talking about.” Asked if he’d had any talks about it, he added, “No, no, no no. It’s dead.”

Manchin has repeatedly said he remains open to talks aimed at crafting a smaller bill that could include its provisions aimed at reducing carbon emissions, creating free pre-Kindergarten programs and increasing federal health care subsidies. But he has said negotiations have yet to begin. …

In December, Manchin’s abrupt announcement of his opposition to the 10-year, roughly $2 trillion measure, which had already passed the House, snuffed out its prospects in the Senate. His party needs his vote to prevail in that chamber, where every Republican opposes the legislation but Vice President Kamala Harris can vote to break ties.

[451] Article: “Not the Year for Women and Parents: Child Care Provisions Were Cut From the Inflation Reduction Act. It’s Not the First Time.” By Christopher Hickey. CNN, August 12, 2022. <www.cnn.com>

The passage of the Inflation Reduction Act may have been a win for Democrats and President Biden on climate, the US economy and prescription drugs, but for women it falls short of its potential on key policies.

The Democrats’ ambitious plans at points included universal pre-kindergarten, lower child care costs, paid family and sick leave and the enhanced child tax credit, among other provisions, but those were ultimately eliminated during negotiations.

[452] Report: “Child Care: Information on Integrating Early Care and Education Funding.” U.S. Government Accountability Office, September 2016. <www.gao.gov>

Page 1:

Every year millions of children under the age of 5 participate in federal and state early care and education programs. For fiscal years 2010 to 2015, Congress appropriated almost $48 billion to Head Start and over $31 billion to the Child Care and Development Fund (CCDF), the two largest sources of federal funding for early care and education. The Head Start program is administered by the Department of Health and Human Services (HHS) through about 1,800 grants to groups who deliver education, nutrition, health, and other social services to approximately 1 million children in poverty from birth to age 5 each year. Through Head Start, HHS funds two programs—Head Start, which provides early care and education to 3- and 4-year-olds, and Early Head Start, which serves pregnant women and children from birth up to age 3. CCDF funding is provided through a block grant to states and tribes to, among other things, help low-income, working families pay for child care (for children from birth to 12) so that parents can work, pursue an education, or attend job training. Additionally, states spend about $5.6 billion annually on state-funded prekindergarten (Pre-K) programs.1

[453] Report: “Budget of the U.S. Government Fiscal Year 2024, Appendix.” White House, Office of Management and Budget, March 2023. <www.govinfo.gov>

Pages 460–461: “Payments to States for the Child Care and Development Block Grant … Program and Financing (in millions of dollars) … Obligations by program activity … Line 0001 … Child Care Block Grant Payments to States … 2022 actual [=] 6,135”

Pages 461–463: “Children and Families Services Programs … Program and Financing (in millions of dollars) … Obligations by program activity … Line 0101 Head Start … 2022 actual [=] 11,033”

[454] Report: “Documentation to the NCES [National Center for Education Statistics] Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-7: “Head Start Program A federally funded program that provides comprehensive educational, social, health, and nutritional services to low-income preschool children and their families, and children from ages 3 to school entry age (i.e., the age of compulsory school attendance).”

[455] Webpage: “About Us.” U.S. Department of Health & Human Services, Office of Head Start. Accessed June 27, 2015 at <bit.ly>

Head Start promotes the school readiness of young children from low-income families through agencies in their local community. …

Head Start and Early Head Start programs support the mental, social, and emotional development of children from birth to age 5. In addition to education services, programs provide children and their families with health, nutrition, social, and other services. Head Start services are responsive to each child and family’s ethnic, cultural, and linguistic heritage.

Head Start encourages the role of parents as their child’s first and most important teachers. Programs build relationships with families that support positive parent-child relationships, family well-being, and connections to peers and community. Head Start began as a program for preschoolers. Three- and 4-year-olds made up over 80 percent of the children served by Head Start last year.

Early Head Start serves pregnant women, infants, and toddlers. Early Head Start programs are available to the family until the child turns 3 years old and is ready to transition into Head Start or another pre-K program. Early Head Start helps families care for their infants and toddlers through early, continuous, intensive, and comprehensive services.

[456] “Head Start Impact Study, Final Report.” By Michael Puma and others. U.S. Department of Health and Human Services, Administration for Children and Families, January 2010. <www.acf.hhs.gov>

Page xiii:

Since its beginning in 1965 as a part of the War on Poverty, Head Start’s goal has been to boost the school readiness of low-income children. Based on a “whole child” model, the program provides comprehensive services that include preschool education; medical, dental, and mental health care; nutrition services; and efforts to help parents foster their child’s development. Head Start services are designed to be responsive to each child’s and family’s ethnic, cultural, and linguistic heritage.

Page 1-2:

The Head Start program, created in 1965 as part of the War on Poverty, is intended to boost the school readiness of low-income children. Head Start has grown from its early days of originally offering six-week summer sessions for 4-year-olds, to providing typically nine-month and sometimes year-long programs serving children from three to five years of age. The program is dedicated to promoting school readiness and providing comprehensive child development services to low-income children, their families, and communities, with an underlying premise that low-income children and families need extra support to prepare them for the transition to school.

[457] Report: “Head Start: Undercover Testing Finds Fraud and Abuse at Selected Head Start Centers.” U.S. Government Accountability Office, May 18, 2010. <www.gao.gov>

Page 3: “Head Start operates both full- and part-day programs—most only during the school year.”

[458] “Head Start Impact Study, Final Report.” By Michael Puma and others. U.S. Department of Health and Human Services, Administration for Children and Families, January 2010. <www.acf.hhs.gov>

Page xiii:

The Head Start Impact Study was conducted with a nationally representative sample of 84 grantee/delegate agencies and included nearly 5,000 newly entering, eligible 3- and 4-year-old children who were randomly assigned to either: (1) a Head Start group that had access to Head Start program services or (2) a control group that did not have access to Head Start, but could enroll in other early childhood programs or non-Head Start services selected by their parents. Data collection began in fall 2002 and continued through 2006, following children from program application through the spring of their 1st grade year.2

Page xx: “For those attending Head Start, the average number of hours spent per week was between 24 and 28 hours, with some variation by cohort and year.”

[459] “Third Grade Follow-Up to the Head Start Impact Study, Final Report.” By Michael Puma and others. U.S. Department of Health and Human Services, Administration for Children and Families, October 2012. <www.acf.hhs.gov>

Pages xiii–xix:

The Head Start Impact Study (HSIS) was conducted with a nationally representative sample of 84 grantee/delegate agencies and included nearly 5,000 newly entering, eligible 3- and 4-year-old children who were randomly assigned to either: (1) a Head Start group that had access to Head Start program services or (2) a control group that did not have access to Head Start, but could enroll in other early childhood programs or non-Head Start services selected by their parents. Data collection began in fall 2002 and continued through 2008, following children from program application through the spring of their 3rd grade year. …

This study is unique in its design and differs from prior evaluations of early childhood programs:

Randomized Control. The Congressional mandate for this study had a clearly stated goal of producing causal findings, i.e., the purpose was to determine if access to Head Start caused better developmental and parenting outcomes for participating children and families. To do this, the study randomly assigned Head Start applicants either to a Head Start group that was allowed to enroll, or to a “control” group that could not. This procedure ensured comparability between the two groups at program entry, so that later differences can be causally attributed to Head Start.

Representative Sample of Programs and Children. Most random assignment studies are conducted in small demonstration programs or in a small number of operating sites, usually those that volunteer to be included in the research. In contrast, the Head Start Impact Study is based on a nationally representative sample of Head Start programs and children, with a few exceptions for programs serving particular populations. This makes the study results generalizable to the vast majority of programs nationwide at the time the study was fielded in 2002, not just the selected study sample. Unlike most studies, it examines the average impact of programs that represent the full range of intensity and quality and adherence to the established Head Start program standards (i.e., the best, the worst, and those in the middle of a fully implemented program).

Examination of a Comprehensive Set of Outcomes Over Time. The study quantifies the overall impact of Head Start separately for 3- and 4-year-old children in four key program domains—cognitive development, social-emotional development, health status and services, and parenting practices—following them through early elementary school. These impacts are measured by examining the difference in outcomes between children assigned to the Head Start group and those assigned to the control group.

Other study features that must be considered in interpreting the study findings include:

Control Group Children Did Not All Stay at Home. Children who were placed in the control or comparison group were allowed to enroll in other non-parental care or non-Head Start child care or programs selected by their parents. They could remain at home in parent care, or enroll in a child care or preschool program. Consequently, the impact of Head Start was determined by a comparison to a mixture of alternative care settings rather than against a situation in which children were artificially prevented from obtaining child care or early education programs outside of their home. Approximately 60 percent of the control group children participated in child care or early education programs during the first year of the study, with 13.8 percent of the 4-year-olds in the control group and 17.8 percent of the 3-year-olds in the control group finding their way into Head Start during this year. Preventing families from seeking out alternative care or programs for their children is both infeasible and unethical. The design used here answers the policy question: how well does Head Start do when compared against the other types of services or care that low-income children could receive in fall 2002?

Impacts Represent the Effects of One Year of Head Start. For children in the 4-year-old cohort, the study provides the impact of Head Start for a single year, i.e., the year before they are eligible to enter kindergarten. The impacts for the 3-year-old cohort reflect the benefits of being provided an earlier year of Head Start (as compared to the control group, which received access to Head Start at age 4.) At the end of one year of Head Start participation, the 3-year-old cohort—but not the 4-year-old cohort—had another year to go before they started kindergarten. It was not feasible or desirable for this study to prevent 3-year-olds from participating in Head Start for two years. Thus, the study could not directly assess the receipt of one year versus two years of Head Start. Rather, it addresses the receipt of an earlier year—whether having Head Start available at age three is helpful to children brought to the program at that age, or whether those children would be just as well off, if the program did not enroll them until age four. This is not only important to individual families; it also answers an important policy question. To answer this question, the best approach is to preclude program entry at age three while allowing it at age four and contrast outcomes after that point with statistically equivalent children never excluded from the program. By design, the study did not attempt to control children’s experiences after their first Head Start year.

The Head Start Impact Study is a comprehensive, carefully designed study of a large-scale early childhood program that has existed for more than 40 years. It is designed to address the overall average impact of the Head Start program as it existed in 2002. The findings cannot be directly compared to more narrowly focused studies of other early childhood programs. The Advisory Committee on Head Start Research and Evaluation, which developed the blueprint for this study, recommended that “the research and findings should be used in combination with the rest of the Head Start research effort to improve the effectiveness of Head Start programs for children and families” (Advisory Committee on Head Start Research and Evaluation, 1999, p. 44). The Third Grade Follow-up to the Head Start Impact Study builds upon the existing randomized control design in the HSIS in order to determine the longer term impact of the Head Start program on the well-being of children and families through the end of 3rd grade.

Key Findings

Looking across the full study period, from the beginning of Head Start through 3rd grade, the evidence is clear that access to Head Start improved children’s preschool outcomes across developmental domains, but had few impacts on children in kindergarten through 3rd grade. Providing access to Head Start was found to have a positive impact on the types and quality of preschool programs that children attended, with the study finding statistically significant differences between the Head Start group and the control group on every measure of children’s preschool experiences in the first year of the study. In contrast, there was little evidence of systematic differences in children’s elementary school experiences through 3rd grade, between children provided access to Head Start and their counterparts in the control group.

In terms of children’s well-being, there is also clear evidence that access to Head Start had an impact on children’s language and literacy development while children were in Head Start. These effects, albeit modest in magnitude, were found for both age cohorts during their first year of admission to the Head Start program. However, these early effects rapidly dissipated in elementary school, with only a single impact remaining at the end of 3rd grade for children in each age cohort.

With regard to children’s social-emotional development, the results differed by age cohort and by the person describing the child’s behavior. For children in the 4-year-old cohort, there were no observed impacts through the end of kindergarten but favorable impacts reported by parents and unfavorable impacts reported by teachers emerged at the end of 1st and 3rd grades. One unfavorable impact on the children’s self-report emerged at the end of 3rd grade. In contrast to the 4-year-old cohort, for the 3-year-old cohort there were favorable impacts on parent-reported social emotional outcomes in the early years of the study that continued into early elementary school. However, there were no impacts on teacher-reported measures of social-emotional development for the 3-year-old cohort at any data collection point or on the children’s self-reports in 3rd grade.

In the health domain, early favorable impacts were noted for both age cohorts, but by the end of 3rd grade, there were no remaining impacts for either age cohort. Finally, with regard to parenting practices, the impacts were concentrated in the younger cohort. For the 4-year-old cohort, there was one favorable impact across the years while there were several favorable impacts on parenting approaches and parent-child activities and interactions (all reported by parents) across the years for the 3-year-old cohort.

In summary, there were initial positive impacts from having access to Head Start, but by the end of 3rd grade there were very few impacts found for either cohort in any of the four domains of cognitive, social-emotional, health and parenting practices. The few impacts that were found did not show a clear pattern of favorable or unfavorable impacts for children.

In addition to looking at Head Start’s average impact across the diverse set of children and families who participated in the program, the study also examined how impacts varied among different types of participants. There is evidence that for some outcomes, Head Start had a differential impact for some subgroups of children over others. At the end of 3rd grade for the 3-year-old cohort, the most striking sustained subgroup findings were found in the cognitive domain for children from high risk households as well as for children of parents who reported no depressive symptoms. Among the 4-year-olds, sustained benefits were experienced by children of parents who reported mild depressive symptoms, severe depressive symptoms, and Black children.

Overview of Study Methods

To reliably answer the research questions outlined by Congress, a nationally representative sample of Head Start programs and newly entering 3- and 4-year-old children was selected, and children were randomly assigned either to a Head Start group that had access to Head Start services in the initial year of the study or to a control group that could receive any other non-Head Start services available in the community, chosen by their parents. In fact, approximately 60 percent of control group parents enrolled their children in some other type of preschool program in the first year. In addition, all children in the 3-year-old cohort could receive Head Start services in the second year. Under this randomized design, a simple comparison of outcomes for the two groups yields an unbiased estimate of the impact of access to Head Start in the initial year on children’s school readiness. This research design ensured that the Head Start and control groups did not differ in any systematic or unmeasured way except through their access to Head Start services. It is important to note that, because the control group in the 3-year-old cohort was given access to Head Start in the second year, the findings for this age group reflect the added benefit of providing access to Head Start at age 3 vs. at age 4, not the total benefit of having access to Head Start for two years.

In addition to random assignment, this study is set apart from most program evaluations because it includes a nationally representative sample of programs, making results generalizable to the Head Start program as a whole, not just to the selected samples of programs and children. However, the study does not represent Head Start programs serving special populations, such as tribal Head Start programs, programs serving migrant and seasonal farm workers and their families, or Early Head Start. Further, the study does not represent the 15 percent of Head Start programs in which the pool of applicants for Head Start slots was too small to allow for an adequate control group.

Selected Head Start grantees and centers had to have a sufficient number of applicants for the 2002–2003 program year to allow for the creation of a control group without requiring Head Start slots to go unfilled. As a consequence, the study was conducted in communities that had more children eligible for Head Start than could be served with the existing number of funded slots.

At each of the selected Head Start centers, program staff provided information about the study to parents at the time enrollment applications were distributed. Parents were told that enrollment procedures would be different for the 2002–2003 Head Start year and that some decisions regarding enrollment would be made using a lottery-like process. Local agency staff implemented their typical process of reviewing enrollment applications and screening children for admission to Head Start based on criteria approved by their respective Policy Councils. No changes were made to these locally established ranking criteria.

Information was collected on all children determined to be eligible for enrollment in fall 2002, and an average sample of 27 children per center was selected from this pool: 16 who were assigned to the Head Start group and 11 who were assigned to the control group. Random assignment was done separately for two study samples—newly entering 3-year-olds (to be studied through two years of potential Head Start participation, kindergarten, 1st grade, and 3rd grade) and newly entering 4-year-olds (to be studied through one year of Head Start participation, kindergarten, 1st grade, and 3rd grade).

The total sample, spread over 23 different states, consisted of 84 randomly selected Head Start grantees/delegate agencies, 383 randomly selected Head Start centers, and a total of 4,667 newly entering children, including 2,559 in the 3-year-old group and 2,108 in the 4-year-old group.4
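The per-center assignment described above (an average of 27 eligible children per center, 16 to Head Start and 11 to control) can be sketched as a simple lottery. This is an illustrative reconstruction, not the study's actual procedure, which was implemented by the evaluators using locally screened applicant pools:

```python
import random

def assign_center(children, n_treatment=16, seed=None):
    """Randomly split one center's eligible applicants into a Head Start
    group and a control group, mirroring the study's average 16/11 split."""
    rng = random.Random(seed)
    pool = list(children)
    rng.shuffle(pool)           # lottery-like ordering of eligible applicants
    return pool[:n_treatment], pool[n_treatment:]

applicants = [f"child_{i}" for i in range(27)]  # one center's eligible pool
head_start, control = assign_center(applicants, seed=2002)
print(len(head_start), len(control))  # 16 11
```

Randomizing within each center, rather than across the whole sample, keeps local slots filled while still guaranteeing comparable groups at entry.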

Data collection began in the fall of 2002 and continued through the spring of 2008, following children from entry into Head Start through the end of 3rd grade. Comparable data were collected for both Head Start and control group children, including interviews with parents, direct child assessments, surveys of Head Start, other early childhood, and elementary school teachers, interviews with center directors and other care providers at the preschool level, direct observations of the quality of various preschool care settings, and teacher or care provider assessments of children. For the Third Grade Follow-up, principal surveys and teacher ratings by the principal were added to the data collection. Response rates were consistently quite high, approximately 80 percent for parents and children throughout the study. Teacher response rates were higher at the preschool level (about 80 percent) and gradually decreased in the elementary school years. Principal data were collected only during 3rd grade, and the response rate was about the same as for 3rd grade teachers.

Although every effort was made to ensure compliance with random assignment, some children accepted into Head Start did not participate in the program (about 15 percent for the 3-year-old cohort and 20 percent for the 4-year-old cohort), and some children assigned to the non-Head Start group nevertheless entered the program in the first year (about 17 percent for 3-year-olds and 14 percent for 4-year-olds), typically at centers that were not in the study sample. These families are referred to as “no shows” and “crossovers.” Statistical procedures for dealing with these events are discussed in the report. Thus, the findings in this report provide estimates of both the impact of access to Head Start using the sample of all randomly assigned children (referred to as Intention to Treat, or ITT) and the impact of actual Head Start participation (adjusting for the no shows and crossovers, referred to as Impacts on the Treated or IOT).
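The distinction drawn above between Intention-to-Treat (ITT) and Impact on the Treated (IOT) estimates can be illustrated with a common textbook adjustment for no-shows and crossovers, the Bloom/Wald instrumental-variables rescaling. The report's actual statistical procedures may differ, and the numbers below are invented for illustration:

```python
def itt_estimate(treated_outcomes, control_outcomes):
    """Intention-to-Treat: compare groups exactly as randomized,
    ignoring no-shows and crossovers."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(treated_outcomes) - mean(control_outcomes)

def iot_estimate(itt, p_participate_assigned, p_participate_control):
    """Impact on the Treated via the Bloom/Wald adjustment: scale the
    ITT by the gap in actual participation between the two arms. With
    ~15-20% no-shows and ~14-17% crossovers, the divisor is well below
    1, so the IOT exceeds the ITT in magnitude."""
    return itt / (p_participate_assigned - p_participate_control)

# Illustrative: an ITT of 2.0 points, 80% participation among those
# assigned to Head Start, 17% crossover participation among controls.
print(round(iot_estimate(2.0, 0.80, 0.17), 2))  # 3.17
```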

Page xx:

Not surprisingly, the study children attended schools with much higher levels of poverty than schools nationwide (as indicated by proportions of students eligible for free- and reduced-price lunch—66–67 percent) and were in schools with higher proportions of minority students (approximately 60 percent of students). With only a few exceptions, teacher and classroom characteristics did not differ significantly between children in the Head Start group and those in the control group.

Page xxi:

Impacts on Children’s Cognitive Development

The cognitive domain consisted of: (1) direct assessments of language and literacy skills, pre-writing skills (in Head Start years only), and math skills; (2) teacher reports of children’s school performance; and (3) parent reports of child literacy skills and grade promotion.

There is clear evidence that Head Start had a statistically significant impact on children’s language and literacy development while children were in Head Start. These effects, albeit modest in magnitude, were found for both age cohorts during their first year of admission to the Head Start program. However, these early effects dissipated in elementary school, with only a single impact remaining at the end of 3rd grade for children in each age cohort: a favorable impact for the 4-year-old cohort (ECLS-K Reading) and an unfavorable impact for the 3-year-old cohort (grade promotion).

Impacts aside, these children remain disadvantaged compared to their same-age peers; the scores of both the Head Start and the control group children remained lower than the norm for the population. At the end of 3rd grade, HSIS children (both Head Start and control group children) in the 4-year-old cohort, on average, scored about eight points (approximately one-half of a standard deviation) lower than a national sample of third graders on the ECLS-K Reading Assessment, and the promotion rate6 for the 3-year-old cohort was two to three percent lower than the predicted national promotion rate for children at the end of 3rd grade.
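The report's "approximately one-half of a standard deviation" for an eight-point reading gap is a standardized effect size. A minimal sketch of the arithmetic (the standard deviation of 16 is back-derived from the quoted figures, not stated in the report):

```python
def effect_size(mean_diff, sd):
    """Standardized effect size: a mean difference expressed in units
    of the outcome's standard deviation (Cohen's d with a known SD)."""
    return mean_diff / sd

# An ~8-point gap described as "approximately one-half of a standard
# deviation" implies an SD near 16 on the ECLS-K Reading scale.
print(effect_size(8, 16))  # 0.5
```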

For mathematics, impacts were found only on a single outcome measure (Woodcock Johnson III Applied Problems) and only for the 3-year-old cohort at the end of their Head Start year.

The findings from the cognitive domain are summarized by age cohort below.7 Exhibits 2a and 2b present all statistically significant cognitive impacts and their effect sizes8 from the Intent to Treat (ITT) analysis.

Page xxv:

Impacts on Children’s Social-Emotional Development

The social-emotional domain consisted of parent-reported measures during the Head Start years, reports by both parents and teachers in all elementary school years, with child self-reports added at the end of 3rd grade. Measures of children’s behavior, social skills and approaches to learning, parent-child relationships, teacher child relationships, school adjustment, peer relationships and school experiences were assessed.

With regard to children’s social-emotional development, the results differed by age cohort and by the source of the information on the child’s behavior. For children in the 4-year-old cohort, there were no observed impacts through the end of kindergarten; favorable impacts reported by parents and unfavorable impacts reported by teachers then emerged at the end of 1st and 3rd grades, along with an unfavorable impact on children’s self-reports at the end of 3rd grade.

In contrast, the early favorable social emotional impacts reported by parents for the 3-year-old cohort continued into early elementary school. There were favorable impacts at all data collection points through the end of 3rd grade on parent-reported measures of children’s social-emotional development. However, there were no impacts on teacher-reported measures of social-emotional development for the 3-year-old cohort at any data collection point or on the children’s self-reports in 3rd grade.

The findings from the social-emotional domain are summarized by age cohort below. Exhibits 3a and 3b provide all statistically significant social-emotional impacts and their effect sizes from the ITT analysis.

Page xxix:

Impact on Health Status and Access to Health Services

The health domain consisted of two categories: (1) children’s receipt of health care services and (2) their current health status. Early favorable impacts in the health domain were noted for both age cohorts but by the end of 3rd grade, there were no remaining impacts for either age cohort.

The findings from the health domain are summarized by age cohort below, while Exhibits 4a and 4b present all statistically significant health impacts and their effect sizes from the ITT analysis.

Page xxxi:

Impact on Parenting Practices

This domain consisted of six categories of outcomes: (1) disciplinary practices, (2) educational supports, (3) safety practices, (4) parenting styles, (5) parent participation in and communication with school and (6) parent and child time together. With regard to parenting practices, the impacts were concentrated in the younger cohort, which showed favorable parent-reported impacts across all years of the study. For the 4-year-old cohort, in contrast, there were few impacts.

The findings from the parenting practices domain are summarized by age cohort below, and Exhibits 5a and 5b provide the statistically significant parenting practices impacts and their effect sizes from the ITT analysis.

Pages 1–2: “In general, during the period of this study, to be eligible for Head Start, a child had to be living in a family whose income was below the Federal poverty line. Programs were permitted, however, to fill ten percent of their enrollment with children from families that are over this income level.”

Page 11: “To be randomly assigned, the child’s eligibility for admission to the program had to have been determined by the local Head Start agency. Thus all children in the study were determined to be eligible for Head Start, regardless of whether they were assigned to the Head Start or control group.”

Pages 25–27:

Child and Family Outcome Measures

Outcome measures were developed in four domains—child cognitive development, child social-emotional development, health, and parenting practices. The selection of these domains was guided by several factors. First, it was important to measure the school readiness skills that are the focus of the Head Start program. The Head Start performance measures and conceptual framework (U.S. Department of Health and Human Services, 2001) indicate that children enrolled in Head Start should demonstrate improved emergent literacy, numeracy, and language skills. The framework also stresses that children should demonstrate positive attitudes toward learning and improved social and emotional well-being, as well as improved physical health and development.

Second, domains were selected to reflect the program’s whole child model, i.e., school readiness is considered to be multi-faceted and comprising five dimensions of early learning: (1) physical well-being and motor development, (2) social and emotional development, (3) approaches toward learning, (4) language usage, and (5) cognition and general knowledge (Kagan, Moore, & Bredekamp, 1995). The whole-child model also was recommended by the Goal One Technical Planning Group of the National Education Goals Panel (Goal One Technical Planning Group, 1991, 1993).

Third, in 2002, the National Institute of Child Health and Human Development (NICHD), the Administration for Children and Families (ACF), and the Office of the Assistant Secretary for Planning and Evaluation (ASPE) within the U.S. Department of Health and Human Services (HHS) convened a panel of experts to discuss the state of measurement and assessment on early childhood education and school readiness in the cognitive and social emotional domains. Language, early literacy, and mathematics were the primary cognitive domains identified by the experts as important to early childhood development. The experts identified social-emotional competency and regulation of attention, behavior, and emotion as critical measures in the social-emotional domain.

Based on these factors and advice from the experts consulting with the Head Start Impact Study team and the Advisory Committee on Head Start Research and Evaluation, measures were selected to assess the cognitive, social-emotional, and health outcomes of children. Considering the major emphasis Head Start places on parent education and involvement, and its importance for promoting children’s development, a fourth domain, parenting practices, was also included. Exhibits 2.6 and 2.7 provide the measures used in pre-K through 3rd grade and the year in which they were administered. The 3rd grade measures are summarized in more detail within this chapter, organized by the four domains. A summary of the measures used in pre-K through 1st grade is provided in the Head Start Impact Study Final Report.

NOTE: Pages 27–30 list the 41 cognitive, social-emotional, health, and parenting measures evaluated in the study.

[460] Report: “ACF Should Improve Oversight of Head Start To Better Protect Children’s Safety.” By Suzanne Murrin. U.S. Department of Health & Human Services, Office of Inspector General, September 2022. <oig.hhs.gov>

Page 2 (of PDF):

The Office of Head Start (OHS)—part of the Department of Health and Human Services’ Administration for Children and Families (ACF)—oversees Head Start grant recipients to ensure their compliance with program standards, including standards addressing children’s safety. The Office of Inspector General (OIG) initiated this review to determine the extent to which recipients received adverse findings from ACF for violating program standards that prohibit child abuse, leaving a child unsupervised (lack of supervision), or releasing a child to an unauthorized person (unauthorized release); assess ACF’s oversight of how recipients identify, address, and prevent such incidents; and identify opportunities to better protect children. …

Approximately one in four Head Start grant recipients received an adverse finding from ACF for child abuse, lack of supervision, or unauthorized release between October 2015 and May 2020. These adverse findings encompassed 1,029 individual incidents.

Additionally, Head Start grant recipients did not promptly self-report all incidents of child abuse, lack of supervision, and unauthorized release as required.

Page 5:

We reviewed OHS data on recipients’ monitoring review results from October 1, 2015, through May 11, 2020 (i.e., from the beginning of fiscal year 2016—when the current Head Start performance standards went into effect—to the start of this study’s data collection). These data included identifying information for each recipient; the dates of each review; regulatory citations for each adverse finding; the status of each finding (e.g. active, corrected, etc.); and narrative fields detailing the reasons for the review results and corrective actions taken.

Page 8: “During the period of review, 27 percent of Head Start grant recipients received an adverse finding for child abuse, lack of supervision, or unauthorized release.”

Pages 9–10:

Nearly one in five recipients received an adverse finding for lack of supervision between October 2015 and May 2020. Our review of monitoring report narratives associated with these findings identified 533 individual incidents in which a child was left unsupervised. …

Between October 2015 and May 2020, 12 percent of recipients received an adverse finding for child abuse. Our review of monitoring report narratives associated with these findings identified 454 separate incidents (i.e., unique child-date combinations) of abuse. A single incident of child abuse sometimes included multiple forms of abuse (for example, both physical and verbal abuse). …

Few recipients had adverse findings for unauthorized release. We identified 42 incidents between October 2015 and May 2020 in which a child was released to an unauthorized person. Of these, 25 incidents involved children released from a bus to an unauthorized person and 17 involved children released from a Head Start facility to an unauthorized person.

[461] Paper: “Effects of a Statewide Pre-Kindergarten Program on Children’s Achievement and Behavior Through Sixth Grade.” By Kelley Durkin and others. Developmental Psychology, March 2022. Pages 470–484. <psycnet.apa.org>

Page 470:

This article presents the results through sixth grade of a longitudinal randomized control study of the effects of a scaled-up, state-supported pre-K program. The analytic sample includes 2,990 children from low-income families who applied to oversubscribed pre-K program sites across the state and were randomly assigned to offers of admission or a wait list control. Data through sixth grade from state education records showed that the children randomly assigned to attend pre-K had lower state achievement test scores in third through sixth grades than control children, with the strongest negative effects in sixth grade. A negative effect was also found for disciplinary infractions, attendance, and receipt of special education services, with null effects on retention.

Pages 472–473:

This paper reports results from a randomized longitudinal study of TN-VPK [Tennessee Voluntary Pre-K] that began with the 2009 and 2010 pre-K cohorts. As in the Head Start Impact Study, the Tennessee research team implemented randomization at oversubscribed program sites and followed the resulting sample afterward to investigate how well the pre-K effects were sustained. The results through sixth grade are reported here and those for earlier periods are summarized. …

While state pre-K programs vary, the Tennessee program is relatively typical. Pilot programs began in 1996 with full statewide implementation in 2005. TN-VPK is organized and overseen by the state department of education and serves more than 18,000 4-year-old children from low-income families statewide with local program sites in all but a few of the school districts in the state. The state requires a minimum instructional time of 5.5 hr per day, 5 days a week during the school year, classes of no more than 20 students staffed by a state-licensed teacher endorsed for early childhood education and paid at public school teacher rates, an educational assistant in each room, and a curriculum selected from a state-approved list….

When the TN-VPK program began, it met nine of the 10 standards advocated until recently revised by the National Institute of Early Education Research…. The current study began with the 2009 and 2010 pre-K cohorts. In 2015 … we reported a separate related study of a representative sample of TN-VPK programs in the state that found that quality as measured by the Early Childhood Environment Rating Scale … matched or exceeded that reported in evaluations of other state pre-K programs. More recently, Pion and Lipsey (2021) used a regression-discontinuity design with that statewide sample to investigate end-of-pre-K effects on a battery of commonly used cognitive measures. The TN-VPK results compared favorably with those found in similar designs for more than a dozen other statewide pre-K programs.

This study involves 79 oversubscribed TN-VPK [Tennessee Voluntary Pre-K] program sites with two cohorts of pre-K applicants randomized to offers of admission or a waitlist, one cohort entering pre-K in 2009–10, the other in 2010–11. This resulted in randomization of 111 site-level applicant lists (R-Lists). To be included in the RCT [randomized controlled trial] analytic sample, students had to: (1) be eligible for free or reduced price lunch, (2) be 4 years old by September 30 of their pre-K year, (3) be applicants to an oversubscribed TN-VPK program site that successfully randomized admission decisions, (4) not have applied for out-of-classroom special education services prior to pre-K enrollment, and (5) have a record in the state education database for at least one year of attendance in a Tennessee public school between pre-K and third grade. Of the students who met Criterion 1 through 4, there were 141 students who did not have subsequent state data in any year from kindergarten through third grade (Criterion 5). Omitting those students left a total of 2,990 eligible children in the sample used for analysis. …

We report all results using two definitions of treatment and control conditions: intent-to-treat (ITT) and treatment-on-treated (TOT). ITT differentiates students according to whether they were randomly assigned to receive offers of admission. TOT differentiates students according to whether they actually attended TN-VPK or not. … The children enrolled in TN-VPK attended an average of 143.8 days (SD = 31.6) during the school year. All participants were treated ethically, and the Vanderbilt University Institutional Review Board … approved this study. …

While we do not have information on the alternative care arrangements for students in the RCT analytic sample who did not attend TN-VPK, we do have that information via parent interviews for the 306 nonattending children in the ISS sample described earlier. Overall, 63% received home-based care by a parent, relative, or other person; 13% attended Head Start; 16% were in private center-based childcare; 5% had some combination of Head Start and private childcare; and childcare for 3% was not reported. Characteristics of the programs and students contributing to the ISS were very similar to those in the RCT analytic sample.
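The ITT/TOT distinction described above can be illustrated with a short sketch. This is a generic textbook formulation (a difference in means for ITT and the Wald/instrumental-variables rescaling for a treatment-on-treated estimate), not necessarily the paper’s exact estimator, and the attendance rates used here are hypothetical:

```python
# Generic illustration of ITT vs. TOT (not the paper's exact method).
# ITT compares outcomes by random assignment; a common TOT estimate
# (the Wald/IV estimator) rescales ITT by the difference in actual
# attendance rates between the assigned and control groups.

def itt_effect(mean_assigned, mean_control):
    # Intent-to-treat: contrast by random assignment
    return mean_assigned - mean_control

def tot_wald(itt, attend_rate_assigned, attend_rate_control):
    # Treatment-on-treated via the Wald rescaling
    return itt / (attend_rate_assigned - attend_rate_control)

# e.g., the third-grade TCAP reading means from Table 2
itt = itt_effect(746.1, 748.2)
# Hypothetical attendance rates for illustration only
tot = tot_wald(itt, 0.80, 0.10)
print(round(itt, 1), round(tot, 1))
```

Because assignment is random, the ITT contrast is unbiased for the effect of being offered admission; the Wald rescaling then attributes that effect to the difference in actual attendance.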

Page 476:

Table 2: Intent-to-Treat Effect Estimates for Third and Sixth Grade State Achievement Tests

Subject      Treatment Group Mean    Control Group Mean

Third Grade TCAP (observed values)
Reading      746.1                   748.2
Math         755.9                   760.2
Science      748.6                   752.2

Sixth Grade TNReady (observed values)
ELA          321.2                   325
Math         317.1                   323.6
Science      750.4                   755.6

TCAP [Tennessee Comprehensive Assessment Program]
ELA [English Language Arts]

Page 478:

Table 4: Intent-to-Treat Effect Estimates for Grade Level and Special Education Status at the End of Sixth Grade

Outcome      Treatment Group Mean    Control Group Mean
On Grade     .872                    .881
IEP          .117                    .084

IEP [Individualized Educational Program]

Page 480:

Table 5: Intent-to-Treat Effect Estimates for Cumulative Disciplinary Events Through Sixth Grade

Offense Type      Treatment Group Mean    Control Group Mean
School Rules      0.231                   0.185
Major Offenses    0.137                   0.109
All Offenses      0.273                   0.234

CALCULATION: 13% Head Start + 16% private center-based childcare + 5% some combination of Head Start and private = 34% other preschool
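The sum can be verified from the quoted parent-interview shares:

```python
# Care arrangements for the 306 nonattending ISS children
# (percentage shares quoted from the parent interviews above)
shares = {
    "home-based care": 63,
    "Head Start": 13,
    "private center-based childcare": 16,
    "Head Start + private combination": 5,
    "not reported": 3,
}

# "Other preschool" = any arrangement involving Head Start or a private center
other_preschool = (shares["Head Start"]
                   + shares["private center-based childcare"]
                   + shares["Head Start + private combination"])
print(other_preschool)        # percent in other preschool
print(sum(shares.values()))   # shares should account for all children
```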

[462] Supplement: “Effects of a Statewide Prekindergarten Program on Children’s Achievement and Behavior through Sixth Grade”

“Table S9: Intent-to-Treat (ITT) and Treatment-on-Treated (TOT) Effect Estimates for On Grade Level From Kindergarten Through Sixth Grade (RCT Analytic Sample) … ITT … On Grade Level (Observed Values) … Sixth Grade … Treatment Group Mean [=] .872 … Control Group Mean [=] .881”

“Table S11: Intent-to-Treat (ITT) and Treatment-on-Treated (TOT) Effect Estimates for Attendance from Kindergarten Through Sixth Grade (RCT Analytic Sample) … ITT … Attendance (Observed Values) … Sixth Grade … Treatment Group Mean [=] .971 … Control Group Mean [=] .975”

[463] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 2:

The High/Scope Perry Preschool program, conducted in the 1960s, was an early childhood intervention that provided preschool to low-IQ, disadvantaged African-American children living in Ypsilanti, Michigan, a town near Detroit. … The beneficial long-term effects reported for the Perry program constitute a cornerstone of the argument for early intervention efforts throughout the world.

Page 3: “The sample size is small: 123 children allocated over five entry cohorts.”

[464] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 1: “The economic case for expanding preschool education for disadvantaged children is largely based on evidence from the High/Scope Perry Preschool Program, an early intervention in the lives of disadvantaged children in the early 1960s.”

[465] Webpage: “About Us.” HighScope Educational Research Foundation. Accessed July 31, 2015 at <bit.ly>

“HighScope was established in 1970 by the late David P. Weikart, PhD (1931–2003), who started the organization to continue research and program activities—including the Perry Preschool Project—he originally initiated as an administrator with the Ypsilanti Public Schools.”

[466] “Web Appendix for The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. Elsevier, November 23, 2009. <jenni.uchicago.edu>

Page 11: “Table C.1: Overall Costs … 1962–63 … 1963–64 … 1964–65 … 1965–66 … 1966–67”

[467] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 4:

The eligibility rules for participation were that the participants (1) be African-American; (2) have a low IQ (between 70 and 85) at study entry,6 and (3) be disadvantaged as measured by parental employment level, parental education, and housing density (people/room). The Perry study targeted families who were more disadvantaged than other African-American families in the U.S. but were representative of a large segment of the disadvantaged African-American population. …

Among children in the Perry Elementary School neighborhood, Perry program families were particularly disadvantaged. Table 1 shows that compared to other families with children in the Perry School catchment area, Perry program families were younger, had lower levels of parental education, and had fewer working mothers. Further, Perry program families had fewer educational resources, larger families, and greater participation in welfare, compared to the families with children in another neighborhood elementary school in Ypsilanti (the Erickson School).

6 Measured by the Stanford-Binet IQ test (1960s norming), which has approximate mean of 111 and standard deviation of 16 at study entry (ages 3–4).

[468] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 4: “Drawn from the community served by the Perry Elementary School, participants were located through a survey of families associated with that school, as well as through neighborhood group referrals, and door-to-door canvassing. Disadvantaged children living in adverse circumstances were identified using IQ scores and a family socioeconomic status (SES) index.”

[469] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 2: “The study was evaluated by the method of random assignment.”

[470] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The Perry data set contains 123 individuals, 58 in the treatment group and 65 in the control group.”

[471] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 2:

The Perry Preschool curriculum was based on the Piagetian concept of active learning, which is centered around play that is based on problem-solving and guided by open-ended questions. Children were encouraged to plan, carry out, and then reflect on their own activities. The topics in the curriculum were not based on specific facts or topics, but rather on key developmental factors related to planning, expression, and understanding. These factors were then organized into ten topical categories, such as “creative representation,” “classification” (recognizing similarities and differences), “number,” and “time.” These educational principles were reflected in the types of open-ended questions asked by teachers: for example, “What happened? How did you make that? Can you show me? Can you help another child?” (Schweinhart and others, 1993, p.33)

As the curriculum was developed over the course of the program, its details and application varied. While the first year involved “thoughtful experimentation” on the part of the teachers, experience with the program and series of seminars during subsequent years led to the development and systematic application of teaching principles with “an essentially Piagetian theory-base.” During the later years of the program, all activities took place within a structured daily routine intended to help children “to develop a sense of responsibility and to enjoy opportunities for independence,” (Schweinhart and others, 1993, pp. 32–33).

[472] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 4: “Beginning at age 3, and lasting two years, treatment consisted of a 2.5-hour educational preschool on weekdays during the school year, supplemented by weekly home visits by teachers.”

[473] “Web Appendix for The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. Elsevier, November 23, 2009. <jenni.uchicago.edu>

Page 4: “During each wave of the experiment, the preschool class consisted of 20–25 children, whose ages ranged from 3 to 4. This is true even of the first and last waves, as the first wave admitted 4-year-olds, who only received one year of treatment, and the last wave was taught alongside a group of 3-year-olds, who are not included in our data.”

[474] Calculated with data from:

a) “Web Appendix for The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. Elsevier, November 23, 2009. <jenni.uchicago.edu>

Page 4: “Classes were 2-1/2 hours every weekday during the regular school year (mid-October through May). … Home Visits. Home visits lasting 1-1/2 hours were conducted weekly by the preschool teachers.”

b) Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 4: “Program intensity was low compared to many subsequent early childhood development programs. Beginning at age 3, and lasting two years, treatment consisted of a 2.5-hour educational preschool on weekdays during the school year, supplemented by weekly home visits by teachers.”

c) Webpage: “Calculate Duration Between Two Dates—Results.” Accessed August 6, 2015 at <www.timeanddate.com>

From and including: Friday, October 15, 1965

To, but not including Tuesday, May 31, 1966

Result: 228 days

CALCULATIONS:

  • 228 days per year / 7 days per week = 33 weeks per year
  • (2.5 hour preschool classes × 5 weekdays) + 1.5 hour weekly visit = 14 hours per week
  • 33 weeks × 14 hours per week = 462 hours per year
  • 462 hours per year × 2 years = 924 hours
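The steps above can be reproduced in a few lines (a minimal check of the arithmetic; the 228-day school year is the mid-October-through-May span computed in item c):

```python
# Perry Preschool annual contact hours, following the steps above
school_year_days = 228                        # mid-October through May (item c)
weeks_per_year = round(school_year_days / 7)  # about 33 weeks

class_hours = 2.5 * 5    # 2.5-hour classes on each of 5 weekdays
visit_hours = 1.5        # one 1.5-hour home visit per week
hours_per_week = class_hours + visit_hours    # 14 hours per week

hours_per_year = weeks_per_year * hours_per_week  # 462 hours per year
total_hours = hours_per_year * 2                  # two years of treatment
print(total_hours)
```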

[475] “Web Appendix for The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. Elsevier, November 23, 2009. <jenni.uchicago.edu>

Page 4:

During each wave of the experiment, the preschool class consisted of 20–25 children, whose ages ranged from 3 to 4. This is true even of the first and last waves, as the first wave admitted 4-year-olds, who only received one year of treatment, and the last wave was taught alongside a group of 3-year-olds, who are not included in our data. …

The preschool teaching staff of four produced a child–teacher ratio ranging from 5 to 6.25 over the course of the program. Teaching positions were filled by public-school teachers who were “certified in elementary, early childhood, and special education,” (Schweinhart and others, 1993, p.32).

[476] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 6: “We use estimates of initial program costs reported in Barnett (1996). These include both operating costs (teacher salaries and administrative costs) and capital costs (classrooms and facilities). This information is summarized in Web Appendix C. In undiscounted year-2006 dollars, cost of the program per child is $17,759.”

[477] Webpage: “CPI Inflation Calculator.” United States Department of Labor, Bureau of Labor Statistics. Accessed July 1, 2023 at <www.bls.gov>

$17,759.00 in January 2006 has the same buying power as $26,792.54 in January 2023

$17,759.00 in January 2006 has the same buying power as $2,794.15 in January 1965

About the CPI Inflation Calculator

The CPI inflation calculator uses the Consumer Price Index for All Urban Consumers (CPI-U) U.S. city average series for all items, not seasonally adjusted. This data represents changes in the prices of all goods and services purchased for consumption by urban households.
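The calculator’s conversions follow the standard CPI ratio method. The sketch below reproduces them; the CPI-U index values (1982–84 = 100) for the three months are approximate figures supplied here as assumptions, not taken from the source:

```python
# Converting dollars across years with the CPI-U ratio method:
# value_target = value_base * (CPI_target / CPI_base).
# The index values below are approximate CPI-U readings for the
# months in question and are this sketch's assumptions.
cpi = {"1965-01": 31.2, "2006-01": 198.3, "2023-01": 299.170}

def adjust(amount, base, target):
    # Rescale an amount from the base month's prices to the target month's
    return amount * cpi[target] / cpi[base]

print(round(adjust(17_759, "2006-01", "2023-01"), 2))  # ≈ $26,792.54
print(round(adjust(17_759, "2006-01", "1965-01"), 2))  # ≈ $2,794.15
```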

[478] Calculated with data from: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

Expenditure per pupil in fall enrollment … Unadjusted dollars1 … Total expenditure3 … 1965–66 [=] $607 … 2019–20 [=] $15,518

1 Unadjusted (or “current”) dollars have not been adjusted to compensate for inflation … 3 Excludes “Other current expenditures,” such as community services, private school programs, adult education, and other programs not allocable to expenditures per student at public schools.

CALCULATION: $2,794 × $15,518 / $607 = $71,429
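The scaling above can be checked directly (all figures are quoted earlier in this footnote and in the preceding inflation conversion):

```python
# Scaling the 1965-dollar Perry cost by the growth in public-school
# expenditure per pupil between 1965-66 and 2019-20
perry_cost_1965 = 2_794     # per-child program cost in 1965 dollars
spend_1965_66 = 607         # expenditure per pupil, 1965-66 (unadjusted)
spend_2019_20 = 15_518      # expenditure per pupil, 2019-20 (unadjusted)

scaled = perry_cost_1965 * spend_2019_20 / spend_1965_66
print(round(scaled))  # ≈ 71,429
```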

[479] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 2: “Participants were followed through age 40. There are plans for an age-50 followup.”

Pages 3–4:

Data were collected at age 3, the entry age, and through annual surveys until age 15, with additional follow-ups conducted at ages 19, 27, and 40. Program attrition remains low through age 40, with over 90% of the original subjects interviewed. Numerous measures were collected on economic, criminal, and educational outcomes over this span as well as on cognition and personality.

[480] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 9:

For each subject, the Perry data provide a full record of arrests, convictions, charges and incarcerations for most of the adolescent and adult years. They are obtained from administrative data sources.37 The empirical challenges addressed in this section are twofold: obtaining a complete lifetime profile of criminal activities for each person, and assigning values to that criminal activity. Web Appendix H presents a comprehensive analysis of the crime data which we summarize in this section.

37 The earliest records cover ages 8–39 and the oldest cover ages 13–44. However, there are some limitations. At the county (Washtenaw) level, arrests, all convictions, incarceration, case numbers, and status are reported. At the state (Michigan) level, arrests are only reported if they lead to convictions. For the 38 Perry subjects spread across the 19 states other than Michigan at the time of the age-40 interview, only 11 states provided criminal records. No corresponding data are provided for subjects residing abroad.

[481] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482:

Researchers gathered data from four primary sources: interviews with subjects and parents, program-administered tests, school records, and criminal records. IQ tests were administered on an annual basis from program entry until age 10, and then once more at age 14. Information on special education, grade retention, and graduation status was collected from school records. Arrest records were obtained from the relevant authorities, supplemented with interview data on criminal behavior. Economic outcome data come primarily from interviews conducted at age 19, 27, and 40. Follow-up attrition rates for most variables were generally low, ranging between 0 to 10%.

NOTE: The tables on pages 1489–1492 provide data from the Perry program for ages 5, 6, 10, 12, 15, 17, 18, 19, 27, and 40.

[482] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 9: “In the case of the Perry study, there are approximately 25 observations per gender per treatment assignment group, and the distribution of observed measures is often highly skewed.”

Page 39: “In summary, our analysis shows that accounting for corrupted randomization, multiple-hypothesis testing and small sample sizes, there are strong effects of the Perry Preschool program on the outcomes of boys and girls. However, there are important differences by age in the strengths of treatment effects by gender.”

[483] Handbook of Statistics: Epidemiology and Medical Statistics. Edited by C.R. Rao and others. Elsevier, 2008.

Chapter 21: “The Multiple Comparison Issue in Healthcare Research.” By Lemuel A. Moyé. Pages 616–650.

Page 644:

The analysis of subgroups is a popular, necessary, and controversial component of the complete evaluation of a research effort. …

However useful and provocative these results can be, it is well-established that subgroup analyses are often misleading…. Assmann and others (2000) has demonstrated how commonly subgroup analyses are misused, while others point out the dangers of accepting subgroup analyses as confirmatory….

[484] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481: “This article focuses on the three prominent early intervention experiments: the Abecedarian Project, the Perry Preschool Program, and the Early Training Project.”

Page 1493: “As a final demonstration of the value of correcting for multiple inference, we conduct a stand-alone reanalysis of the Perry Preschool Project, arguably the most influential of the three experiments.”

Pages 1489–1492:

Table 4. Effects on Preteen IQ Scores

Table 5. Effects on Preteen Primary School Outcomes

Table 6. Effects on Teenage Academic Outcomes

Table 7. Effects on Teenage Economic and Social Outcomes

Table 8. Effects on Adult Academic Outcomes

Table 9. Effects on Adult Economic Outcomes

Table 10. Effects on Adult Social Outcomes

NOTE: The authors of this paper did not embolden statistically significant outcomes in their tables of results (cited above). However, based on the text of the paper, the authors treat results with q values (also called FDR [false discovery rate] q values) less than .10 as statistically significant. They also sometimes apply a stricter standard of q < .05 and refer to a gray zone in which q values up to .13 may be significant. Hence, Just Facts has listed all of the Perry preschool program results with q ≤ .13. For an Excel file containing the data in the tables above, contact us.
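For readers unfamiliar with FDR q values, the sketch below shows a minimal Benjamini–Hochberg computation of q values from a set of p-values. It is a generic illustration with hypothetical p-values, not the exact procedure used in the paper:

```python
# Minimal Benjamini-Hochberg sketch: convert p-values to FDR q-values
def bh_q_values(p_values):
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    q = [0.0] * m
    prev = 1.0
    # Walk from the largest p-value down, enforcing monotone q-values
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        prev = min(prev, p_values[i] * m / rank)
        q[i] = prev
    return q

# Hypothetical p-values for several measured outcomes
qs = bh_q_values([0.001, 0.02, 0.04, 0.30, 0.50])
# Apply the q < .10 threshold described in the note above
significant = [q <= 0.10 for q in qs]
```

Correcting at the q level rather than the raw p level is what guards against significant coefficients emerging by chance when many outcomes are tested, the concern the paper raises.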

[485] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481:

This article focuses on the three prominent early intervention experiments: the Abecedarian Project, the Perry Preschool Program, and the Early Training Project. …

But serious statistical inference problems affect these studies. The experimental samples are very small, ranging from approximately 60 to 120. Statistical power is therefore limited, and the results of conventional tests based on asymptotic theory may be misleading. More importantly, the large number of measured outcomes raises concerns about multiple inference: Significant coefficients may emerge simply by chance, even if there are no treatment effects. This problem is well known in the theoretical literature … and the biostatistics field … but has received limited attention in the policy evaluation literature. These issues—combined with a puzzling pattern of results in which early test score gains disappear within a few years and are followed a decade later by significant effects on adult outcomes—have created serious doubts about the validity of the results….

Page 1484:

[M]ost randomized evaluations in the social sciences test many outcomes but fail to apply any type of multiple inference correction. To gauge the extent of the problem, we conducted a survey of randomized evaluation works published from 2004 to 2006 in the fields of economic or employment policy, education, criminology, political science or public opinion, and child or adolescent welfare. Using the CSA Illumina social sciences databases, we identified 44 such articles in peer-reviewed journals. …

Nevertheless, only 3 works (7%) implemented any type of multiple-inference correction. … Although multiple-inference corrections are standard (and often mandatory) in psychological research … they remain uncommon in other social sciences, perhaps because practitioners in these fields are unfamiliar with the techniques or because they have seen no evidence that they yield more robust conclusions.

Pages 1490–1491:

The disaggregated [by sex] results suggest that early intervention improves high school graduation, employment, and juvenile arrest rates for females but has no significant effect on male outcomes. …

Unlike females, males show little evidence of positive effects as adults.

Pages 1493–1494:

As a final demonstration of the value of correcting for multiple inference, we conduct a stand-alone reanalysis of the Perry Preschool Project, arguably the most influential of the three experiments. …

… Do these findings replicate in the other two studies? In general, yes. The early male IQ effect replicates strongly in Abecedarian. The female high school graduation effect replicates in both Abecedarian and Early Training, and the early female IQ effect replicates weakly in Abecedarian and strongly in Early Training. …

In contrast to females, males appear to not derive lasting benefits from the interventions. …

[A] conventional research design [i.e., one that does not account for multiple inference problems] … adds eight more significant or marginally significant outcomes: female adult arrests, female employment, male monthly income, female government transfers, female special education rates, male drug use (in the adverse direction), male employment, and female monthly income. Of these eight outcomes, two (male and female monthly income) are not included in the other two studies [Abecedarian and Early Training]. The remaining six fail to replicate in either of the other studies. …

[Previous] researchers have emphasized the subset of unadjusted significant outcomes rather than applying a statistical framework that is robust to problems of multiple inference. …

Many studies in this field test dozens of outcomes and focus on the subset of results that achieve significance.

[486] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 2:

In a highly cited paper, Rolnick and Grunewald (2003) report a rate of return of 16 percent to the Perry program. Belfield and others (2006) report a 17 percent rate of return. …

… All of the reported estimates of rates of return are presented without standard errors, leaving readers uncertain as to whether the estimates are statistically significantly different from zero. The paper by Rolnick and Grunewald (2003) reports few details and no sensitivity analyses exploring the consequences of alternative assumptions about costs and benefits of key public programs and the costs of crime. The study by Belfield and others (2006) also does not report standard errors. It provides more details on how its estimates are obtained, but conducts only a limited sensitivity analysis.10

10 Barnett (1985) conducts a comprehensive analysis of the benefits and costs of the Perry program through age 19. He also conducts a sensitivity analysis to many of the assumptions he invokes. Our analysis is for the group through age 40. Our analysis builds on and extends this important analysis.

[487] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481:

[S]everal randomized early intervention experiments have reported striking increases in short-term IQ scores and long-term outcomes for treated children… This article focuses on the three prominent early intervention experiments: the Abecedarian Project, the Perry Preschool Program, and the Early Training Project. … But serious statistical inference problems affect these studies.

Page 1482: “Of the three early intervention projects, Abecedarian was by far the most intensive.”

Page 1483: “Nevertheless, there are some important differences in these studies’ findings. In particular, the Perry Preschool Program reported large, statistically significant reductions in juvenile and adult criminal behavior that were not replicated in the Abecedarian Program.”

Page 1492: “Abecedarian females … experience no significant reduction in conviction or incarceration rates by age 21.”

Page 1493: “Previous findings demonstrating significant long-term effects for boys, primarily from the Perry program, do not survive multiplicity [multiple inference] adjustment [for statistical significance] and do not replicate in the other experiments.”

[488] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 122: “Yet, the [Abecedarian] program did not produce gains in social and emotional development that elsewhere [the Perry program] have been found to account for a very large portion of potential benefits.”

[489] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 4: “[Perry] Program intensity was low compared to many subsequent early childhood development programs.4”

[490] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 28:

Tables 3–6 show many statistically significant treatment effects and gender differences that survive multiple hypothesis testing. In summary, females show strong effects for educational outcomes, early employment and other early economic outcomes, as well as reduced numbers of arrests. Males, on the other hand, show strong effects on a number of outcomes, demonstrating a substantially reduced number of arrests and lower probability of imprisonment, as well as strong effects on earnings at age 27, employment at age 40, and other economic outcomes recorded at age 40.

Page 24: “Table 3: Main Outcomes, Females: Part 1”

Page 25: “Table 4: Main Outcomes, Females: Part 2”

Page 26: “Table 5: Main Outcomes, Males: Part 1”

Page 27: “Table 6: Main Outcomes, Males: Part 2”

NOTE: Contact us for an Excel file containing the data in the tables above.

Page 38:

Proper analysis of the Perry experiment presents many statistical challenges. These challenges include small-sample inference, accounting for imperfections in randomization, and accounting for large numbers of outcomes. The last of these refers to the risk of selecting statistically significant outcomes that are “cherry picked” from a larger set of unreported results.

We propose and implement a combination of methods to account for these problems. We control for the violations of the initial randomization protocol and imbalanced background variables. We estimate family-wise error rates that account for the multiplicity of the outcomes. We consider the external validity of the program.

Page 39:

The pattern of treatment response by gender varies with age. Males exhibit statistically significant treatment effects for criminal activity, later life income, and employment (ages 27 and 40), whereas, female treatment effects are strongest for education and early employment (ages 19 and 27). …

In summary, our analysis shows that accounting for corrupted randomization, multiple-hypothesis testing and small sample sizes, there are strong effects of the Perry Preschool program on the outcomes of boys and girls. However, there are important differences by age in the strengths of treatment effects by gender.

[491] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 4:

The Compromised Randomization Protocol.

A potential problem with the Perry study is that after random assignment, treatment and controls were reassigned, compromising the original random assignment and making simple interpretation of the evidence problematic. In addition, there was some imbalance in the baseline variables between treatment and control groups.

[492] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 9: “In the case of the Perry study, there are approximately 25 observations per gender per treatment assignment group, and the distribution of observed measures is often highly skewed.10”

Page 36: “We estimate that 17% of the male cohort and 15% of the female cohort would be eligible for the Perry program if it were applied nationwide. This translates into a population estimate of 712,000 persons out of this 4.5 million black cohort resemble the Perry population.53

[493] “Margin of Error Calculator.” ComRes. Accessed May 23, 2019 at <bit.ly>

The margin of error shows the level of accuracy that a random sample of a given population has.

Our calculator gives the percentage points of error either side of a result for a chosen sample size.

It is calculated at the standard 95% confidence level. Therefore we can be 95% confident that the sample result reflects the actual population result to within the margin of error. This calculator is based on a 50% result in a poll, which is where the margin of error is at its maximum.

This means that, according to the law of statistical probability, for 19 out of every 20 polls the “true” result will be within the margin of error shown.

Population Size: 356,000 [= 712,000 (people eligible for the Perry program) / 2 (roughly half male and half female)]

Sample Size: 25

Margin of Error: 19.6
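The calculator’s result of 19.6 percentage points can be reproduced with the standard formula for the 95% margin of error of a proportion, optionally applying the finite-population correction (negligible here, since a sample of 25 is tiny relative to 356,000). A sketch:

```python
import math

def margin_of_error(n, N=None, p=0.5, z=1.96):
    """95% margin of error (in percentage points) for a proportion p
    estimated from a sample of size n. If a population size N is
    given, the finite-population correction is applied."""
    moe = z * math.sqrt(p * (1 - p) / n)
    if N is not None:
        moe *= math.sqrt((N - n) / (N - 1))  # negligible when N >> n
    return 100 * moe

margin_of_error(25, N=356_000)  # about 19.6 points, matching the calculator
```

As in the calculator, p = 0.5 is used because that is where the margin of error is at its maximum.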

[494] Book: Multiple Regression: A Primer. By Paul D. Allison. Pine Forge Press, 1998.

Chapter 3: “What Can Go Wrong With Multiple Regression?” <us.sagepub.com>

Pages 57–58:

Sample size has a profound effect on tests of statistical significance. With a sample of 60 people, a correlation has to be at least .25 (in magnitude) to be significantly different from zero (at the .05 level). With a sample of 10,000 people, any correlation larger than .02 will be statistically significant. The reason is simple: There’s very little information in a small sample, so estimates of correlations are very unreliable. If we get a correlation of .20, there may still be a good chance that the true correlation is zero. …

Statisticians often describe small samples as having low power to test hypotheses. There is another, entirely different problem with small samples that is frequently confused with the issue of power. Most of the test statistics that researchers use (such as t tests, F tests, and chi-square tests) are only approximations. These approximations are usually quite good when the sample is large but may deteriorate markedly when the sample is small. That means that p values calculated for small samples may be only rough approximations of the true p values. If the calculated p value is .02, the true value might be something like .08. …

That brings us to the inevitable question: What’s a big sample and what’s a small sample? As you may have guessed, there’s no clear-cut dividing line. Almost anyone would consider a sample less than 60 to be small, and virtually everyone would agree that a sample of 1,000 or more is large. In between, it depends on a lot of factors that are difficult to quantify, at least in practice.
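Allison’s thresholds (a correlation of .25 at n = 60, .02 at n = 10,000) follow from the t-test for a correlation coefficient. A sketch of that calculation, approximating the t critical value with z = 1.96:

```python
import math

def critical_r(n, z=1.96):
    """Smallest correlation (in magnitude) that is statistically
    significant at the two-sided .05 level in a sample of size n,
    using the t-test for a correlation and approximating the t
    critical value by z = 1.96."""
    df = n - 2
    return z / math.sqrt(df + z * z)

critical_r(60)     # about 0.25, matching Allison's figure
critical_r(10000)  # about 0.02
```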

[495] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 5: “As the oldest and most cited early childhood intervention evaluated by the method of random assignment, the Perry study serves as a flagship for policy makers advocating public support for early childhood programs.”

[496] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481:

The education literature contains dozens of papers showing inconsistent or low returns to publicly funded human capital investments…. In contrast to these studies, several randomized early intervention experiments have reported striking increases in short-term IQ scores and long-term outcomes for treated children… These results have been highly influential and often are cited as proof of efficacy for many types of early interventions…. The experiments underlie the growing movement for universal prekindergarten education….

This article focuses on the three prominent early intervention experiments: the Abecedarian Project, the Perry Preschool Program, and the Early Training Project.

Page 1493: “[T]he Perry Preschool Project [is] arguably the most influential of the three experiments.”

Page 1494: “[T]he most famous (and dramatic) preschool experiment [is] the Perry program….”

[497] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 2: “The case for universal pre-K is often based on the Perry study, even though the project only targeted a disadvantaged segment of the population.1”

NOTE: For examples of such claims, see the next three footnotes.

[498] Commentary: “Capitalists for Preschool.” By John E. Pepper, Jr. and James M. Zimmerman. New York Times, March 1, 2013. <www.nytimes.com>

“Research by the University of Chicago economist James J. Heckman, a Nobel laureate, points to a 7- to 10-percent annual return on investment in high-quality preschool.”

NOTE: The statement above refers to the following paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

[499] Report: “The Case for Pre-K in Education Reform: A Summary of Program Evaluation Findings.” By Albert Wat. Pew Center on the States, April 2010. <www.pewtrusts.org>

Page 2:

The short- and long-term benefits of high-quality pre-kindergarten have been well documented by researchers for the last 50 years. By now, even many outside the education field have heard about the academic and lifetime gains and the significant returns on investment yielded from the High/Scope Perry Preschool Project and the Chicago Child-Parent Centers.1

1 See for example: Albert Wat, “Dollars and Sense: A Review of Economic Analyses of Pre-K,” (Washington, DC: Pre-K Now, 2007).

[500] Commentary: “The Vague Promise of Obama’s Ambitious Preschool Plan.” By Jonathan Cohn. New Republic, February 15, 2013. <www.newrepublic.com>

“President Barack Obama visited Georgia on Thursday to tout his ambitious new proposal for universal preschool. … Obama’s plan comes from two ‘amazing preschools’—the Perry Preschool Project, in Michigan, and the Abecedarian Project, in North Carolina.”

[501] Textbook: Applied Statistics: From Bivariate Through Multivariate Techniques. By Rebecca M. Warner. Sage Publications, 2008.

Page 5:

Researchers in the behavioral and social sciences almost always want to make inferences beyond their samples; they hope that the attitudes or behaviors that they find in small groups of college students who actually participate in their studies will provide evidence about attitudes or behaviors in broader populations in the world outside the laboratory. Thus, almost all the statistics reported in journal articles are inferential statistics

However, in many types of research (such as experiments and small-scale surveys in psychology, education, and medicine), it is not practical to obtain random samples from the entire population of the country. Instead, researchers in these disciplines often use convenience samples when they conduct small-scale studies. …

When researchers obtain information about behavior from convenience samples, they cannot confidently use their results to make inferences about the responses of an actual, well-defined population.

Page 6:

Generalization of results beyond the sample to make inferences about a broader population is always risky, so researchers should be cautious in making generalizations. … It could be misleading, however, to generalize the results of the study to children or to older adults. …

To summarize, when a study uses data from a convenience sample, the researcher should clearly state that the nature of the sample limits the potential generalizability of the results.

It would be questionable to generalize about response to caffeine for populations that have drastically different characteristics from the members of the sample….

[502] Book: Multiple Regression: A Primer. By Paul D. Allison. Pine Forge Press, 1998.

Chapter 1: “What Is Multiple Regression?” <us.sagepub.com>

Page 9: “The most desirable data come from a probability sample from some well-defined population…. In practice, people often use whatever cases happen to be available. … Although it is acceptable to use such ‘convenience samples,’ you must be very cautious in generalizing the results to other populations.”

[503] Calculated with data from the paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 4:

The eligibility rules for participation were that the participants (1) be African-American; (2) have a low IQ (between 70 and 85) at study entry,6 and (3) be disadvantaged as measured by parental employment level, parental education, and housing density (people/room). The Perry study targeted families who were more disadvantaged than other African-American families in the U.S. but were representative of a large segment of the disadvantaged African-American population. …

Among children in the Perry Elementary School neighborhood, Perry program families were particularly disadvantaged. Table 1 shows that compared to other families with children in the Perry School catchment area, Perry program families were younger, had lower levels of parental education, and had fewer working mothers. Further, Perry program families had fewer educational resources, larger families, and greater participation in welfare, compared to the families with children in another neighborhood elementary school in Ypsilanti (the Erickson School).

6 Measured by the Stanford-Binet IQ test (1960s norming), which has approximate mean of 111 and standard deviation of 16 at study entry (ages 3–4).

Pages 35–36:

Comparability in later life outcomes between the restricted group and the Perry control group suggests that the Perry sample, while not necessarily representative of the African-American population as a whole, is representative of a particular subsample of that population. Specifically, this subsample reflects the eligibility requirements of the Perry program, such as low IQ of the child and a low parental SES [socio-economic status] index.

The US population in 1960 was 180 million people, of which 10.6% (19 million) were black.52 We use the NLSY79 [1979 National Longitudinal Survey of Youth] a representative sample of the total population that was born between 1957 and 1964, to estimate the number of persons in the US that resemble the Perry population at entry (age 3). According to the NLSY79, the black cohort born in 1957–1964 is composed of 2.2 million males and 2.3 million females. We estimate that 17% of the male cohort and 15% of the female cohort would be eligible for the Perry program if it were applied nationwide. This translates into a population estimate of 712,000 persons out of this 4.5 million black cohort resemble the Perry population.53 For further information on the comparison groups and their construction, see Web Appendix I and Tables I.1 and I.2 for details.

CALCULATIONS:

  • 712,000 / 4,500,000 = 15.8%
  • 10.6% × 15.8% = 1.7%
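The two calculations above, restated as a quick arithmetic check:

```python
# Just Facts' calculation: share of the 1957-1964 U.S. birth cohort
# resembling the Perry population, from the figures cited above.
eligible = 712_000        # estimated Perry-eligible persons (Heckman and others)
black_cohort = 4_500_000  # black cohort born 1957-1964 (NLSY79)
black_share_1960 = 0.106  # black share of the 1960 U.S. population

share_of_black_cohort = eligible / black_cohort            # about 15.8%
share_of_total = black_share_1960 * share_of_black_cohort  # about 1.7%
```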

[504] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 4:

Drawn from the community served by the Perry Elementary School, participants were located through a survey of families associated with that school, as well as through neighborhood group referrals, and door-to-door canvassing. Disadvantaged children living in adverse circumstances were identified using IQ scores and a family socioeconomic status (SES) index.

[505] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The Abecedarian Project recruited and treated four cohorts of children in the Chapel Hill, North Carolina area from 1972 to 1977. … The Abecedarian data set contains 111 children, 57 assigned to the treatment group and 54 assigned to the control group.”

[506] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 116: “The curricula are called ‘Learningames, The Abecedarian Curriculum’ and ‘Partners for Learning’ …. The curriculum emphasized language development, but addressed all developmental domains.4

[507] Article: “How Preschool Can Make You Smarter and Healthier.” By Madeline Ostrander. PBS, April 9, 2015. <www.pbs.org>

There was a sense of idealism in the air in 1971 when Craig Ramey, a psychologist in his late 20s with a newly minted Ph.D., took a job in Chapel Hill, North Carolina, to launch what would become one of the longest-running educational experiments in history. He became a lead researcher at the University of North Carolina’s Frank Porter Graham Child Development Center…. He and Joseph Sparling, the center’s senior investigator and associate director and a former school principal, wanted to study a sample of Chapel Hill children and test whether it was possible to change the course of a life by stepping in early, from infancy. They named their experiment the Abecedarian Project, from an obscure Latinate word for an alphabetical sequence.

[508] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Pages 115–116:

The study randomly assigned to a treatment or control condition 112 children, mostly African American, born between 1972 and 1977 and who were believed to be at risk of retarded intellectual and social development. Family background characteristics at study entry were: maternal education of approximately 10 yr, maternal IQ of 85, 25% of households with both parents, and 55% of households receiving Aid to Families with Dependent Children.

[509] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “Children were randomly assigned to treated and control groups.”

[510] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 116: “Random assignment occurred between 6 and 12 weeks of age.”

[511] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The [Abecedarian] program focused on developing cognitive, language, and social skills in classes of about six.”

[512] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 116: “The curricula are called ‘Learningames, The Abecedarian Curriculum’ and ‘Partners for Learning’ …. The curriculum emphasized language development, but addressed all developmental domains.”

[513] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The treated children entered the program very early (mean age, 4.4 months). They attended … until reaching schooling age.”

[514] Calculated with data from the paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The treated children entered the program very early (mean age, 4.4 months). They attended a preschool center for 8 hours per day, 5 days per week, 50 weeks per year until reaching schooling age.”

CALCULATIONS:

  • 8–10 hours per day × 5 days per week × 50 weeks per year = 2,000–2,500 hours per year
  • 2,000–2,500 hours per year × 4 years = 8,000–10,000 hours
  • 8,000–10,000 hours for the Abecedarian program / 924 hours for the Perry program = 8.7–10.8

NOTE: An Excel file containing more detailed calculations of preschool hours is available upon request.
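The hour ranges above combine two sourced figures for the Abecedarian day (8 hours per day per Anderson; 10 hours per day per Barnett and Masse) with Just Facts’ figure of 924 total hours for the Perry program. A sketch of the arithmetic:

```python
# Rough bounds on total Abecedarian contact hours, assuming 5 days/week,
# 50 weeks/year, over roughly 4 years from infancy to school entry.
PERRY_HOURS = 924  # total Perry program hours, per Just Facts' calculations

def abecedarian_hours(hours_per_day, years=4):
    return hours_per_day * 5 * 50 * years

low, high = abecedarian_hours(8), abecedarian_hours(10)        # 8,000 and 10,000
ratio_low, ratio_high = low / PERRY_HOURS, high / PERRY_HOURS  # about 8.7x to 10.8x
```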

[515] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 116: “The center was operated from 7:30 a.m. to 5:30 p.m., 5 days per week, and 50 weeks out of the year, with free transportation available. This constitutes 2500 h/yr and is compatible with the needs of most full-time working parents, in contrast to the typical part-day preschool program which might provide 450–540 h/yr (2.5–3 h/day, 180 days).”

[516] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 116: “The preschool program was center-based with teacher/child ratios that ranged from 1:3 for infants/ toddlers to 1:6 for older children.”

[517] Calculated with data from:

a) Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The treated children entered the program very early (mean age, 4.4 months). They attended a preschool center for 8 hours per day, 5 days per week, 50 weeks per year until reaching schooling age.”

b) Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 116: “The preschool program was center-based with teacher/child ratios that ranged from 1:3 for infants/ toddlers to 1:6 for older children.”

Page 117: “Average enrollment in the nursery was about 12 infants and the staff/child ratio was 1:3. Average age at entry was 4.4 months. In program years two and three group size averaged about seven children for both age groups and the staff/child ratio was 1:3.5. In program years four and five the average was 12 children per group at each age, and the staff/child ratio was 1:6.”

Page 122: “The Abecedarian program also had strong supervision, a well-designed curriculum, well-compensated staff (comparable to the public schools) and on-going evaluation.”

c) Report: “Kindergarten Entrance Age and Children’s Achievement: Impacts of State Policies, Family Background, and Peers.” By Todd E. Elder and Darren H. Lubotsky. RAND Corporation, June 2006. <citeseerx.ist.psu.edu>

Page 3: “Figures 1 and 2 present evidence that changes in kindergarten statutes have substantially increased average entrance ages. Figure 1 shows the population-weighted fraction of states with entrance cutoffs in six selected categories. In 1975, six states had cutoffs of September 14 or earlier, while 14 states had relatively late cutoffs between November 30 and January 1. An additional 15 states did not have any uniform state regulation and instead left such decisions up to individual school districts. From the mid-1970s to the mid-1990s, many states either moved their kindergarten birthday cutoff from December to September or instituted a September cutoff when there previously was no statewide mandate. By 2004, 29 states had cutoffs of September 14 or earlier, five states had cutoffs between November 30 and January 1, and only eight states had no uniform state law. … The most dramatic increases occurred among the 1969 to 1984 birth cohorts, who would have been affected by entrance cutoffs from roughly 1974 to 1989.”

d) Report: “Entering Kindergarten: A Portrait of American Children When They Begin School.” By Nicholas Zill and Jerry West. U.S. Department of Education, Office of Educational Research and Improvement, March 2001. <nces.ed.gov>

Page 10: “Figure 3.—Percentage distribution of first-time kindergartners, by age at kindergarten entrance: Fall 1998”

e) Dataset: “Table 203.90. Average Daily Attendance (ADA) as a Percentage of Total Enrollment, School Day Length, and School Year Length in Public Schools, by School Level and State: 2007–08 and 2011–12.” U.S. Department of Education, National Center for Education Statistics, May 2013. <nces.ed.gov>

“2011–12 … United States … total elementary, secondary, and combined elementary/secondary schools … Average hours in school day [=] 6.7 … Average days in school year [=] 179”

f) Webpage: “Teacher Characteristics and Trends.” U.S. Department of Education, National Center for Education Statistics. Accessed July 12, 2023 at <nces.ed.gov>

“In 2019, the pupil/teacher ratio in public schools was 15.9….”

g) Dataset: “Table 236.20. Total Expenditures for Public Elementary and Secondary Education and Other Related Programs, by Function and Subfunction: Selected Years, 1990–91 Through 2018–19.” U.S. Department of Education, National Center for Education Statistics, September 2021. <nces.ed.gov>

h) Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected School Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

“Expenditure per pupil in fall enrollment … Total expenditure3 … 2019–20 … Constant 2021–22 dollars [=] 17,0136”

NOTE: An Excel file containing the data and calculations is available upon request.

[518] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482:

Data collection [for the Abecedarian program] began immediately and has continued, with gaps, through age 21. The data come from three primary sources: interviews with subjects and parents, program-administered tests, and school records. Children received IQ tests on an annual basis from ages 2 through 8, and then once at age 12 and once at age 15. Researchers collected information on grade retention and special education at age 12 and 15 from school records. Data on high school graduation, college attendance, employment, pregnancy, and criminal behavior come from an interview at age 21.

NOTE: The tables on pages 1489–1492 provide data from the Abecedarian program for ages 5, 6.5, 12, 15, 18, 19, and 21.

[519] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The Abecedarian Project recruited and treated four cohorts of children in the Chapel Hill, North Carolina area from 1972 to 1977. … The Abecedarian data set contains 111 children, 57 assigned to the treatment group and 54 assigned to the control group. … Follow-up attrition rates are low, ranging from 3% to 6% for most outcomes.”

Page 1483: “Table 1. Summary statistics … Abecedarian … Percent female [=] 53.2”

[520] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 116: “By 1978, 104 participants remained in the study, and the follow-up at age 21 involved all 104 of these participants.”

[521] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481: “This article focuses on the three prominent early intervention experiments: the Abecedarian Project, the Perry Preschool Program, and the Early Training Project.”

Pages 1489–1492:

Table 4. Effects on Preteen IQ Scores

Table 5. Effects on Preteen Primary School Outcomes

Table 6. Effects on Teenage Academic Outcomes

Table 7. Effects on Teenage Economic and Social Outcomes

Table 8. Effects on Adult Academic Outcomes

Table 9. Effects on Adult Economic Outcomes

Table 10. Effects on Adult Social Outcomes

NOTE: The authors of this paper did not embolden statistically significant outcomes in their tables of results (cited above). However, based on the text of the paper, the authors treat results with q values (also called FDR q values) less than .10 as statistically significant. They also sometimes apply a stricter standard of q < .05 and refer to a gray zone in which q values up to .13 may be significant. Hence, Just Facts has listed all of the Abecedarian program results with q ≤ .13. For an Excel file containing the data in the tables above, contact us.
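The FDR q values referenced in this note come from a multiple-inference correction; the standard way to produce them is the Benjamini–Hochberg step-up procedure. As a minimal sketch of how such a correction changes conclusions, the following code (the p-values are hypothetical illustrations, not figures from the paper) computes q values and applies the q < .10 cutoff:

```python
def bh_q_values(p_values):
    """Benjamini-Hochberg FDR q-values for a list of p-values."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # indices, smallest p first
    q = [0.0] * m
    running_min = 1.0
    # Walk from the largest p-value down, enforcing monotone q-values:
    # q at rank r is min over ranks >= r of (m * p / rank).
    for rank in range(m, 0, -1):
        idx = order[rank - 1]
        running_min = min(running_min, p_values[idx] * m / rank)
        q[idx] = running_min
    return q

# Ten hypothetical outcome tests: three look significant unadjusted.
p = [0.02, 0.03, 0.04, 0.5, 0.6, 0.7, 0.8, 0.9, 0.95, 0.99]
q = bh_q_values(p)
unadjusted = sum(1 for pv in p if pv < 0.05)  # 3 pass the naive p < .05 test
adjusted = sum(1 for qv in q if qv < 0.10)    # 0 survive the q < .10 cutoff
```

This illustrates the paper’s central concern: when many outcomes are tested, some coefficients pass an unadjusted p < .05 threshold by chance, yet none may survive once q values account for the number of tests.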

[522] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481:

This article focuses on the three prominent early intervention experiments: the Abecedarian Project, the Perry Preschool Program, and the Early Training Project. …

But serious statistical inference problems affect these studies. The experimental samples are very small, ranging from approximately 60 to 120. Statistical power is therefore limited, and the results of conventional tests based on asymptotic theory may be misleading. More importantly, the large number of measured outcomes raises concerns about multiple inference: Significant coefficients may emerge simply by chance, even if there are no treatment effects. This problem is well known in the theoretical literature … and the biostatistics field … but has received limited attention in the policy evaluation literature. These issues—combined with a puzzling pattern of results in which early test score gains disappear within a few years and are followed a decade later by significant effects on adult outcomes—have created serious doubts about the validity of the results….

Page 1484:

[M]ost randomized evaluations in the social sciences test many outcomes but fail to apply any type of multiple inference correction. To gauge the extent of the problem, we conducted a survey of randomized evaluation works published from 2004 to 2006 in the fields of economic or employment policy, education, criminology, political science or public opinion, and child or adolescent welfare. Using the CSA Illumina social sciences databases, we identified 44 such articles in peer-reviewed journals.

Of these 44 articles, 37 (84%) reported testing 5 or more outcomes, and 27 (61%) reported testing 10 or more outcomes. These figures represent lower bounds for the total number of tests conducted, because many tests may be conducted but not reported. Nevertheless, only three works (7%) implemented any type of multiple-inference correction. … Although multiple-inference corrections are standard (and often mandatory) in psychological research … they remain uncommon in other social sciences, perhaps because practitioners in these fields are unfamiliar with the techniques or because they have seen no evidence that they yield more robust conclusions.

Page 1494:

[Previous] researchers have emphasized the subset of unadjusted significant outcomes rather than applying a statistical framework that is robust to problems of multiple inference. …

Many studies in this field test dozens of outcomes and focus on the subset of results that achieve significance.

[523] Paper: “The Rate of Return to the High/Scope Perry Preschool Program.” By James J. Heckman and others. U.S. National Library of Medicine, National Institutes of Health, February 2010. <www.ncbi.nlm.nih.gov>

Page 2:

In a highly cited paper, Rolnick and Grunewald (2003) report a rate of return of 16 percent to the Perry program. Belfield and others (2006) report a 17 percent rate of return. …

… All of the reported estimates of rates of return are presented without standard errors, leaving readers uncertain as to whether the estimates are statistically significantly different from zero. The paper by Rolnick and Grunewald (2003) reports few details and no sensitivity analyses exploring the consequences of alternative assumptions about costs and benefits of key public programs and the costs of crime. The study by Belfield and others (2006) also does not report standard errors. It provides more details on how its estimates are obtained, but conducts only a limited sensitivity analysis.10

10 Barnett (1985) conducts a comprehensive analysis of the benefits and costs of the Perry program through age 19. He also conducts a sensitivity analysis to many of the assumptions he invokes. Our analysis is for the group through age 40. Our analysis builds on and extends this important analysis.

[524] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481:

[S]everal randomized early intervention experiments have reported striking increases in short-term IQ scores and long-term outcomes for treated children… This article focuses on the three prominent early intervention experiments: the Abecedarian Project, the Perry Preschool Program, and the Early Training Project. … But serious statistical inference problems affect these studies.

Page 1482: “Of the three early intervention projects, Abecedarian was by far the most intensive.”

Page 1483: “Nevertheless, there are some important differences in these studies’ findings. In particular, the Perry Preschool Program reported large, statistically significant reductions in juvenile and adult criminal behavior that were not replicated in the Abecedarian Program.”

Page 1492: “Abecedarian females … experience no significant reduction in conviction or incarceration rates by age 21.”

Page 1493: “Previous findings demonstrating significant long-term effects for boys, primarily from the Perry program, do not survive multiplicity [multiple inference] adjustment [for statistical significance] and do not replicate in the other experiments.”

[525] Paper: “Comparative Benefit–Cost Analysis of the Abecedarian Program and Its Policy Implications.” By W.S. Barnett and Leonard N. Masse. Economics of Education Review, February 2007. Pages 113–125. <nieer.org>

Page 122: “Yet, the [Abecedarian] program did not produce gains in social and emotional development that elsewhere [the Perry program] have been found to account for a very large portion of potential benefits.”

[526] Paper: “A Reanalysis of the High/Scope Perry Preschool Program.” By James Heckman and others. University of Chicago, January 22, 2010. <www.researchgate.net>

Page 4: “[Perry] Program intensity was low compared to many subsequent early childhood development programs.4”

[527] Calculated with data from the paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1482: “The treated children entered the program very early (mean age, 4.4 months). They attended a preschool center for 8 hours per day, 5 days per week, 50 weeks per year until reaching schooling age.”

CALCULATIONS:

  • 8–10 hours per day × 5 days per week × 50 weeks per year = 2,000–2,500 hours per year
  • 2,000–2,500 hours per year × 4 years = 8,000–10,000 hours
  • 8,000–10,000 hours for the Abecedarian program / 924 hours for the Perry program = 8.7–10.8

NOTE: An Excel file containing more detailed calculations of preschool hours is available upon request.
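The arithmetic in the CALCULATIONS block above can be reproduced in a few lines (variable names are ours; the 8–10 hour range and the 924-hour Perry total are the figures used in the calculation above):

```python
# Re-derivation of the Abecedarian/Perry preschool-hours comparison.
hours_per_day = (8, 10)   # range used for the daily schedule
days_per_week = 5
weeks_per_year = 50
years = 4
perry_hours = 924         # Perry program total, as given above

totals = [h * days_per_week * weeks_per_year * years for h in hours_per_day]
ratios = [round(t / perry_hours, 1) for t in totals]
# totals -> [8000, 10000] hours; ratios -> [8.7, 10.8]
```

That is, the Abecedarian program provided roughly 8.7 to 10.8 times as many preschool hours as the Perry program.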

[528] Paper: “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects.” By Michael L. Anderson. Journal of the American Statistical Association, December 2008. Pages 1481–1495. <are.berkeley.edu>

Page 1481:

The view that the returns to educational investments are highest for early childhood interventions is widely held and stems primarily from several influential randomized trials—Abecedarian, Perry, and the Early Training Project—that point to super-normal returns to early interventions. … The experiments underlie the growing movement for universal prekindergarten education….

[529] Commentary: “The Vague Promise of Obama’s Ambitious Preschool Plan.” By Jonathan Cohn. New Republic, February 15, 2013. <www.newrepublic.com>

“President Barack Obama visited Georgia on Thursday to tout his ambitious new proposal for universal preschool. … Obama’s plan comes from two ‘amazing preschools’—the Perry Preschool Project, in Michigan, and the Abecedarian Project, in North Carolina.”

[530] Book: Encyclopedia of Human Services and Diversity. Edited by Linwood H. Cousins. Sage Publications, 2014. Article: “Educational Support Services.” By Stephen T. Schroth (Knox College).

Page 447: “All 50 states and the District of Columbia provide public education for children from kindergarten through grade 12. Additionally, many states also fund preschool programs that permit some children as young as 3 years of age to attend classes.”

[531] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvi: “In the lower-left cell of Table 1.1 are traditional public schools, which are government-funded and government-operated.”

[532] Dataset: “Table 235.10. Revenues for Public Elementary and Secondary Schools, by Source of Funds: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

“2019–20 … Percentage Distribution … Federal [=] 7.6% … State [=] 47.5% … Local (including intermediate sources below the state level) … Total [=] 44.9% … Property taxes [=] 36.5% … Other public revenue [=] 7.1% … Private1 [=] 1.2% … 1 Includes revenues from gifts, and tuition and fees from patrons.”

[533] Book: Educational Administration: Concepts and Practices (6th edition). By Fred Lunenburg and Allan Ornstein. Wadsworth Cengage Learning, 2012.

Page 343:

School Attendance

All fifty states have some form of compulsory school attendance law. These statutes provide the right of children residing in a district to receive a free public education up to a certain age and exact penalties for noncompliance on parents or guardians.

Compulsory Attendance Laws The courts have sustained compulsory attendance laws on the basis of the legal doctrine of parens patriae. Under this doctrine, the state has the legal authority to provide for the welfare of its children. In turn, the welfare of the state is served by the development of an enlightened citizenry.

Attendance at a public school is not the only way to satisfy the compulsory attendance law. Over eighty years ago, the U.S. Supreme Court in Pierce v. Society of Sisters invalidated an Oregon statute requiring children between the ages of eight and sixteen to attend public schools.67 The Court concluded that by restricting attendance to public schools, the state violated both the property rights of the school and the liberty interests of parents in choosing the plan of education for their children, protected by the Fourteenth Amendment to the Constitution.

Subsequent to Pierce, states have expanded the options available to parents (guardians) for meeting the compulsory attendance law. For example, currently in the state of Kentucky, parents are in compliance with that state’s statute by selecting from the following options: enrolling their children, who must regularly attend, in a private, parochial, or church-related day school; enrolling their children, who must regularly attend, in a private, parochial, church- or state-supported program for exceptional children; or providing home, hospital, institutional, or other regularly scheduled, suitable, equivalent instruction that meets standards of the state board of education.68

Parents or guardians who select one of the options to public school instruction must obtain equivalent instruction. For example, the Washington Supreme Court held that home instruction did not satisfy that state’s compulsory attendance law, for the parents who were teaching the children did not hold a valid teaching certificate.69 In its decision, the court described four essential elements of a school: a certified teacher, pupils of school age, an institution established to instruct school-age children, and a required program of studies (curriculum) engaged in for the full school term and approved by the state board of education. Subsequently, statutes establishing requirements for equivalent instruction (such as certified teachers, program of studies, time devoted to instruction, school-age children, and place or institution) generally have been sustained by the courts.70

Exceptions to Compulsory Attendance The prevailing view of the courts is that religious beliefs cannot abrogate a state’s compulsory attendance law. An exception is the U.S. Supreme Court ruling in Wisconsin v. Yoder, which prevented that state from requiring Amish children to submit to compulsory formal education requirements beyond the eighth grade.71 The Court found that this was a violation of the free exercise of religion clause of the First Amendment. However, most other attempts to exempt students from school based on religious beliefs have failed.

It is commonly held that married pupils, regardless of age, are exempt from compulsory attendance laws. The rationale is that married persons assume adult status, and consequently the doctrine of parens patriae no longer applies. The precedent in this area is based on two Louisiana cases in which fifteen- and fourteen-year-old married women were considered not “children” under the compulsory attendance law.72 A later New York case followed the rationale of the two Louisiana cases in declaring that the obligations of a married woman were inconsistent with school attendance.73 It should be noted, however, that a state cannot deny married minors the right to attend school if they wish.

[534] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvi: “In the lower-left cell of Table 1.1 are traditional public schools, which are government-funded and government-operated. Students within their boundaries are normally assigned to them, and they represent by far the largest number of American schools.”

Page xvii:

Perhaps surprisingly, an estimated one million youngsters (see the homeschooling chapter in this book) are now schooled at home. …

Also in the upper-right quadrant are for-profit tutoring and schooling. When families believe they lack the knowledge, skills, time, or desire to provide homeschooling, yet want things that they think the public schools do not adequately provide, they may voluntarily choose to pay for private tutoring. …

Non-profit private schools, both independent and sectarian, are a long-standing form of privately-funded and privately-operated choice. Parents place such value on the education and circumstances private schools offer that they pay the tuition to send their children to them. …

The upper left quadrant refers to rare schools that are privately operated with the partial or complete financial support of government, either for the school or for individual student tuition. An example is the provision made for autistic, severely physically handicapped, and other types of students with low-incidence, very special needs. Small districts that have insufficient numbers of such students to justify special schools may pay private schools within or outside their boundaries to educate them.

[535] Report: “Documentation to the NCES Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-3: “Charter School A school providing free public elementary and/or secondary education to eligible students under a specific charter granted by the state legislature or other appropriate authority, and designated by such authority to be a charter school.”

[536] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Majority: “Magnet schools are public schools operated by a local school board that emphasize a particular subject area, teaching method, or service to students.”

[537] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvii:

In the lower-right quadrant are charter schools. They are government-funded but governed and operated by private boards. The aim of charter-enabling state legislation is to promote educational diversity, effectiveness, and accountability. Charter boards may appoint their own staff or hire nonprofit or for-profit management organizations.

The extent to which charter schools are freed from conventional public school regulations and oversight varies substantially from state to state, but in all cases charter schools are accountable to their chartering authority for student achievement and progress. From their beginnings, charter schools were subject to closure for poor achievement performance, but now, if traditional public schools repeatedly fail to improve student achievement, they are also subject to NCLB [No Child Left Behind] sanctions and eventual closure or other means of restructuring.

[538] Book: The Education Gap: Vouchers and Urban Schools (Revised edition). By William G. Howell and Paul E. Peterson with Patrick J. Wolf and David E. Campbell. Brookings Institution Press, 2006 (first published in 2002). <www.brookings.edu>

Page 11:

The first major choice initiative emerged from the conflicts surrounding desegregation in the 1960s. So unpopular was compulsory busing with many Americans that the magnet school was developed as an alternative way of increasing racial and ethnic integration. According to magnet school theory, families could be enticed into choosing integrated schools by offering them distinctive, improved education programs. Although the magnet idea was initially broached in the 1960s, it was not until after 1984 that the magnet school concept, supported by federal funding under the Magnet Schools Assistance program, began to have a national impact.

[539] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvi:

The questions raised here are simplified in that they group several distinctive forms of school choice into a single category of chosen schools. Consider some fundamental distinctions among the major forms of school choice represented in Table 1.1. The four-fold classification categorizes schools according to the possible combinations of school governance and operation on one hand and school funding on the other. As in the case of universities, these distinctions are hardly crisp. Public universities, for example, receive private tuition and donations. Sizable fractions of private universities’ research budgets come from the federal government. Still, these terms are common and offer useful starting points for discussion before turning to more precise operational definitions in the following sections and chapters.

In the lower-left cell of Table 1.1 are traditional public schools, which are government-funded and government-operated. Students within their boundaries are normally assigned to them, and they represent by far the largest number of American schools. In school choice research and policy deliberations, such traditional public schools, also called “neighborhood schools,” are often compared to choice schools such as charter and private schools, which may be near to or far from a student’s home.

Page xvii:

Perhaps surprisingly, an estimated one million youngsters (see the homeschooling chapter in this book) are now schooled at home. (Again, such categorization isn’t precise since some primarily homeschooled students take supplementary classes and play sports in local public schools and colleges.)

Also in the upper-right quadrant are for-profit tutoring and schooling. When families believe they lack the knowledge, skills, time, or desire to provide homeschooling, yet want things that they think the public schools do not adequately provide, they may voluntarily choose to pay for private tutoring. At least in part, East Asia’s thriving private tutoring sector is often credited for that region’s top scores on international achievement tests. Private tutoring is also popular with East Asian immigrants to the United States, whose children tend to be highly successful students. …

The NCLB [No Child Left Behind] legislation has also accelerated the growth of for-profit companies, called educational management organizations, which operate schools for school districts and charter boards. They contract with local school districts to take over repeatedly failing public schools.

Non-profit private schools, both independent and sectarian, are a long-standing form of privately-funded and privately-operated choice. Parents place such value on the education and circumstances private schools offer that they pay the tuition to send their children to them. “Public vouchers” provide full or partial tuition at public expense to enable families, often poor and urban, to send their children to these schools. In more than 50 cities, “private vouchers” support such families with contributions from firms and wealthy individuals.

In the lower-right quadrant are charter schools. They are government-funded but governed and operated by private boards. The aim of charter-enabling state legislation is to promote educational diversity, effectiveness, and accountability. Charter boards may appoint their own staff or hire nonprofit or for-profit management organizations. …

Magnet schools arose in response to court-ordered racial desegregation plans that required involuntary bussing of students away from their racially isolated schools to maintain school racial percentages close to their overall district’s percentages. …

The upper left quadrant refers to rare schools that are privately operated with the partial or complete financial support of government, either for the school or for individual student tuition. An example is the provision made for autistic, severely physically handicapped, and other types of students with low-incidence, very special needs. Small districts that have insufficient numbers of such students to justify special schools may pay private schools within or outside their boundaries to educate them.

[540] Dataset: “Table 333.40. Total Revenue of Private Nonprofit Degree-Granting Postsecondary Institutions, by Source of Funds and Level of Institution: Selected Years 1999–2000 Through 2020–21.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

“2020–21 … Percentage Distribution … All levels … Tuition and fees (net of allowances1†) [=] 19.00% … Federal appropriations, grants, and contracts1,2 [=] 8.46% … State and local appropriations, grants, and contracts [=] 0.60% … 1 Private institutions typically report Pell grants as revenues from tuition and fees rather than as revenues from federal grants.”

NOTE: † This includes some government revenues:

Different institutions may classify certain funds differently as a scholarship or fellowship or as a pass-through. One common area of differences is [federal] Pell grants. Private institutions (and a few public institutions) operate under accounting standards adopted by the Financial Accounting Standards Board (FASB). The vast majority of public institutions use accounting standards adopted by the Governmental Accounting Standards Board (GASB). Public institutions using current GASB accounting standards are required to treat Pell grants as scholarships, using the logic that the institution is involved in the administration of the program (as evidenced by the administrative allowance paid to the institution). FASB standards give private institutions the option to treat Pell grants as scholarships or as pass-through transactions, using the logic that the federal government determines who is eligible for the grant, not the institution. Because of this difference in requirements, public institutions will report Pell grants as federal revenues and as allowances (reducing tuition revenues), whereas FASB institutions may do this as well or (as seems to be the majority) treat Pell grants as pass-through transactions. The result is that in the case where a FASB institution and GASB institution each receive the same amount of Pell grants on behalf of their students, the GASB institution will appear to have less tuition and more federal revenues, whereas the FASB institution treating Pell as pass-through will appear to have more tuition and less federal revenues.

[Webpage: “IPEDS [Integrated Postsecondary Education Data System] Finance Survey Tips Scholarships, Grants, Discounts, and Allowances.” U.S. Department of Education, National Center for Education Statistics. Accessed October 2, 2018 at <nces.ed.gov>]

[541] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Concurrence (O’Connor):

Federal aid to religious schools is also substantial. Although data for all States is not available, data from Minnesota, for example, suggest that a substantial share of Pell Grant and other federal funds for college tuition reach religious schools. Roughly one-third or $27.1 million of the federal tuition dollars spent on students at schools in Minnesota were used at private 4-year colleges. … The vast majority of these funds—$23.5 million—flowed to religiously affiliated institutions.

[542] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvi: “Sizable fractions of private universities’ research budgets come from the federal government.”

Page xvii: “The upper left quadrant refers to rare schools that are privately operated with the partial or complete financial support of government, either for the school or for individual student tuition. An example is the provision made for autistic, severely physically handicapped, and other types of students with low-incidence, very special needs.”

Page 80:

Hence, since the 19th century, the conception of public education within the United States has been slightly at odds with the conception held in most other industrialized democracies. Outside the U.S. public education implies that the state helps provide common content, regulation of attendance and public financing, but not necessarily public delivery. Within the U.S. public education implies public delivery as well as the prohibition of public financing of faith-based schools.

Page 81: “[T]he degree of private financing in non-state [K–12] schools also varies widely, ranging from 0% in France, Austria, Spain, and Hungary to 100% in the United States.”

[543] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Chapter 5: “International Perspectives on School Choice.” By Stephen P. Heyneman (Vanderbilt University). Pages 79–96.

Page 81: “[T]he degree of private financing in non-state schools also varies widely, ranging from 0% in France, Austria, Spain, and Hungary to 100% in the United States.”

Page 84: “In 1991, New Zealand shifted its traditional highly centralized school system to allow parents to send their children to whatever school they wish, without regard to school ownership or geographical catchment area.”

Page 85: “Without being as visible as either Chile or New Zealand, the approach to school choice in Australia may be more worthy of note. The federal government has funded nonreligious, non-state education since the 1970s.”

Pages 85–86:

Canada is a good source of evidence on school choice, in part, because each province has set its own policies. Public support for nongovernment and nonreligious schools began in the 1960s in Alberta. Today, up to one half of the recurring cost for educating a child in public education is offered to nongovernment schools in Alberta, Manitoba, Quebec, and British Columbia. On the other hand, no subsidy is offered to private schools in Newfoundland, Nova Scotia, Ontario, Saskatchewan, New Brunswick, or Prince Edward Island. One province, Alberta, provides public funding for homeschooling. In three provinces (Alberta, Ontario, and Saskatchewan), full public funding is supplied to Catholic or Protestant schools through school boards.

Pages 87–88: “This country [the Netherlands] has perhaps the oldest and most pervasive policy of school choice…. Today, 76% of Holland students attend non-state schools, and 90% of those are affiliated with either Catholic or Protestant churches.”

[544] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Chapter 5: “International Perspectives on School Choice.” By Stephen P. Heyneman (Vanderbilt University). Pages 79–96.

Page 81:

Fourth and most important is the issue of administrative autonomy. Some assume that non-state schools have more administrative autonomy, as do private and charter schools in the United States. But, in fact, there is a wide variation in the degree to which non-state schools are free of governmental regulation…. In Australia, non-state schools are financed by the government but experience a very low level of government regulation. In Italy and Greece, just the opposite pertains.

Page 82:

Toma argues that the critical difference is not whether the schools are government or non-government but their degree of administrative and financial latitude. She finds that, in general, students and non-government schools tend to perform better in mathematics, but that restrictions on the decision-making authority in nongovernment schools significantly reduce the performance advantage.

Page 83:

The evidence from middle and low income countries is generally consistent with the evidence from OECD [Organization for Economic Cooperation and Development] countries. Students tend to perform better academically in private schools, and schools with control over their own resources, and in school systems which have been administratively decentralized….

[545] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Chapter 5: “International Perspectives on School Choice.” By Stephen P. Heyneman (Vanderbilt University). Pages 79–96.

Page 80:

Much of the debate over school choice is based on the premise that there is a public monopoly over the provision of schooling and that schools are inefficient, in part, because of the absence of competition. If families could be treated as consumers and had the right to freely choose which kind of education they would prefer for their children, choice advocates assert that both government and non-government schools would improve…. Choice is believed to have the potential of a stimulant to better teaching, more creative curriculum, more attention to outcomes, and more transparency with respect to results. In short, competition is believed to represent a “tide which will lift all boats” (Hoxby, 2003).

[546] Textbook: Macroeconomics: A Contemporary Introduction (10th edition). By William A. McEachern. South-Western Cengage Learning, 2014.

Page 410:

Government officials sometimes decide that entrepreneurs are unable to generate the kind of economic growth the country needs. State enterprises are therefore created to do what government believes the free market cannot do. But state-owned enterprises may have objectives other than producing goods efficiently—objectives that could include providing jobs for friends and relatives of government officials. Also, with state enterprises there is little or no competition.

[547] Textbook: Economics: Private and Public Choice. By James D. Gwartney and others. South-Western Cengage Learning, 2009. Page 338:

As Adam Smith stressed long ago, when competition is present, even self-interested individuals will tend to promote the general welfare. Conversely, when competition is weakened, business firms will have more leeway to raise prices and pursue their own objectives and less incentive to innovate and develop better ways of doing things.

Competition is a disciplining force for both buyers and sellers. In a competitive environment, producers must provide goods at a low cost and serve the interests of consumers; if they don’t, other suppliers will. Firms that develop improved products and figure out how to produce them at low cost will succeed. Sellers that are unwilling or unable to provide consumers with quality goods at competitive prices will be driven from the market. This process leads to improved products and production methods and directs resources toward projects that create more value. It is a powerful stimulus for economic progress.

[548] Book: Quality Concepts for the Process Industry (2nd edition). By Michael Speegle. Delmar Cengage Learning, 2010.

Page 134:

When competition is low, producers can be inefficient without being penalized, and manpower and other resources can be wasted. A lack of competition and rank inefficiency was typical of the manufacturing climate of the former Soviet Union, where workers showed up for work if they felt like it, production lines shut down because parts did not arrive on time, and raw materials and energy were wasted because there was no need to be efficient. No one was penalized because the manufacturer had no competition.

[549] Book: Conditional Cash Transfers in Latin America. Edited by Michelle Adato and John Hoddinott. Johns Hopkins University Press, 2010.

Chapter 6: “The Economics of Conditional Cash Transfers.” By Jere R. Behrman and Emmanuel Skoufias. Pages 127–158.

Page 136:

The sectors that provide some types of services (for example, information, healthcare, and schooling) may produce inefficiently because institutional arrangements do not induce efficient production of an efficient basket of commodities. School teachers and staff, for example, might be oriented toward rewards established by the Ministry of Education or union negotiations based on tenure or credentials, not toward satisfying the demands of clients.

[550] Dataset: “Table 236.55. Total and Current Expenditures Per Pupil in Public Elementary and Secondary Schools: Selected Years, 1919–20 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

Expenditure per pupil in fall enrollment … Constant 2021–22 dollars2 … Total expenditure3 … 2019–20 [=] $17,013

2 Constant dollars based on the Consumer Price Index, prepared by the Bureau of Labor Statistics, U.S. Department of Labor, adjusted to a school-year basis.

3 Excludes “Other current expenditures,” such as community services, private school programs, adult education, and other programs not allocable to expenditures per student at public schools.

[551] Report: “Documentation to the NCES [National Center for Education Statistics] Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-6: “Elementary A general level of instruction classified by state and local practice as elementary, composed of any span of grades not above grade 8; preschool or kindergarten included only if it is an integral part of an elementary school or a regularly established school system.”

Page C-14: “Secondary The general level of instruction classified by state and local practice as secondary and composed of any span of grades beginning with the next grade following the elementary grades and ending with or below grade 12.”

[552] Click here for documentation that the following items are excluded from spending data published by the National Center for Education Statistics:

  • State administration spending
  • Unfunded pension benefits
  • Post-employment non-pension benefits like health insurance

[553] Calculated with data from:

a) Dataset: “Table 2.4.5U. Personal Consumption Expenditures by Type of Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

b) Dataset: “Table 1.1.5. Gross Domestic Product.” U.S. Bureau of Economic Analysis. Last revised April 27, 2023. <apps.bea.gov>

c) Dataset: “CPI—All Urban Consumers (Current Series).” U.S. Department of Labor, Bureau of Labor Statistics. Accessed January 27, 2023 at <www.bls.gov>

“Series Id: CUUR0000SA0; Series Title: All Items in U.S. City Average, All Urban Consumers, Not Seasonally Adjusted; Area: U.S. City Average; Item: All Items; Base Period: 1982–84=100”

d) Dataset: “Table 236.20. Total Expenditures for Public Elementary and Secondary Education and Other Related Programs, by Function and Subfunction: Selected Years, 1990–91 Through 2019–20.” U.S. Department of Education, National Center for Education Statistics, December 2022. <nces.ed.gov>

e) Dataset: “Table 105.20. Enrollment in Elementary, Secondary, and Degree-Granting Postsecondary Institutions, by Level and Control of Institution, Enrollment Level, and Attendance Status and Sex of Student: Selected Years, Fall 1990 Through Fall 2030.” U.S. Department of Education, National Center for Education Statistics, March 2022. <nces.ed.gov>

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • The next five footnotes provide important context for understanding the data and calculations used to determine this fact. In short, they document the following:
    • The result of this calculation is corroborated by a 1995 working paper published by the U.S. Department of Education.
    • The combination of “elementary” and “secondary” schools roughly encompasses grades K–12.
    • Private-sector spending on education is equal to the sum of these three measures reported by the federal government’s Bureau of Economic Analysis:
      1) personal consumption expenditures (PCE)
      2) gross private domestic investment (GPDI)
      3) net exports of goods and services
    • PCE is the “primary measure of consumer spending on goods and services” by private individuals and nonprofit organizations.
    • GPDI is a measure of private spending on “structures, equipment, and intellectual property products.”
    • Since private school education is not a service that is typically imported or exported, a valid approximation of spending on private K–12 schools can be obtained by summing PCE, GPDI, and government spending on private K–12 schools.
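The approximation described in the notes above can be sketched as a simple sum. The figures below are hypothetical placeholders (not the actual BEA/NCES values, which are in the cited datasets); the sketch only illustrates the structure of the calculation:

```python
# Hypothetical figures in billions of constant dollars -- placeholders only,
# NOT values from the cited BEA/NCES datasets.
pce = 50.0            # personal consumption expenditures on private K-12 schooling
gpdi = 5.0            # gross private domestic investment attributable to private K-12
gov_to_private = 2.0  # government spending directed to private K-12 schools

# Net exports are treated as negligible because private K-12 schooling is
# rarely imported or exported, so the approximation reduces to a sum:
total_private_k12 = pce + gpdi + gov_to_private
print(total_private_k12)  # 57.0
```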

[554] In 1995, the U.S. Department of Education (DOE) published a working paper which “estimated that the total expenditures for private schools in 1991–92 (including operating expenses and capital) were between $18.0 and $19.4 billion.”† Using the same methodology as in the footnote above, Just Facts calculated that total expenditures for private schools in the same year were $16.2 billion, or 10–16% lower than DOE’s estimates.‡ This difference fits with the DOE working paper’s statement that “we would be surprised if improved data changed our overall estimate of total expenditures on private education by more than perhaps 10 or 15%.” Some of the shortcomings of the data used in the DOE working paper are as follows:

  • “The main area of concern in the data for Catholic elementary and secondary schools is the response rate: each had a response rate far below 100%. (The response rate for the elementary survey was just above 50%, and for the secondary survey it was about 57%.)”
  • “In addition, our use of region as a proxy for geographic variation may be somewhat crude.”
  • “The principal caveat that needs to be attached to our estimates is that we are uncertain about the specific expenditures school officials included in their responses to the survey items we relied on in our analysis.”
  • “Nor do we know whether most schools responded to the survey items on the basis of a formal school budget or on the basis of less formal materials.”

NOTES:

  • † Working paper: “Estimates of Expenditures for Private K–12 Schools.” By Michael Garet, Tsze H. Chan, and Joel D. Sherman. U.S. Department of Education, National Center for Education Statistics, May 1995. <nces.ed.gov>
  • ‡ An Excel file containing the data and calculations is available here.
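The “10–16% lower” comparison stated above can be checked directly from the figures given in this footnote ($16.2 billion versus DOE’s range of $18.0–$19.4 billion):

```python
# Verify the stated gap between Just Facts' estimate and the DOE range
# for 1991-92 private school expenditures (all figures in billions).
just_facts = 16.2
doe_low, doe_high = 18.0, 19.4

pct_vs_low = (doe_low - just_facts) / doe_low * 100    # vs. the low DOE estimate
pct_vs_high = (doe_high - just_facts) / doe_high * 100 # vs. the high DOE estimate
print(round(pct_vs_low), round(pct_vs_high))  # 10 16
```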

[555] Report: “Documentation to the NCES [National Center for Education Statistics] Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2010–11, Version Provisional 2a.” U.S. Department of Education, National Center for Education Statistics, September 2012. <nces.ed.gov>

Page C-6: “Elementary A general level of instruction classified by state and local practice as elementary, composed of any span of grades not above grade 8; preschool or kindergarten included only if it is an integral part of an elementary school or a regularly established school system.”

Page C-14: “Secondary The general level of instruction classified by state and local practice as secondary and composed of any span of grades beginning with the next grade following the elementary grades and ending with or below grade 12.”

[556] Report: “Fiscal Year 2013 Analytical Perspectives, Budget of the U.S. Government.” White House Office of Management and Budget, February 12, 2012. <www.gpo.gov>

Page 471:

The main purpose of the NIPAs [national income and product accounts published by the U.S. Bureau of Economic Analysis] is to measure the Nation’s total production of goods and services, known as gross domestic product (GDP), and the incomes generated in its production. GDP excludes intermediate production to avoid double counting. Government consumption expenditures along with government gross investment—State and local as well as Federal—are included in GDP as part of final output, together with personal consumption expenditures, gross private domestic investment, and net exports of goods and services (exports minus imports).

[557] Report: “Concepts and Methods of the U.S. National Income and Product Accounts, Chapter 5: Personal Consumption Expenditures.” U.S. Bureau of Economic Analysis. Updated December 2022. <www.bea.gov>

Page 5-1:

Personal consumption expenditures (PCE) is the primary measure of consumer spending on goods and services in the U.S. economy.1 It accounts for about two-thirds of domestic final spending, and thus it is the primary engine that drives future economic growth. PCE shows how much of the income earned by households is being spent on current consumption as opposed to how much is being saved for future consumption.

PCE also provides a comprehensive measure of types of goods and services that are purchased by households. Thus, for example, it shows the portion of spending that is accounted for by discretionary items, such as motor vehicles, or the adjustments that consumers make to changes in prices, such as a sharp run-up in gasoline prices.2

Page 5-2:

PCE measures the goods and services purchased by “persons”—that is, by households and by nonprofit institutions serving households (NPISHs)—who are resident in the United States. Persons resident in the United States are those who are physically located in the United States and who have resided, or expect to reside, in this country for 1 year or more. PCE also includes purchases by U.S. government civilian and military personnel stationed abroad, regardless of the duration of their assignments, and by U.S. residents who are traveling or working abroad for 1 year or less.3

Page 5-69:

Nonprofit Institutions Serving Households

In the NIPAs [National Income and Product Accounts], nonprofit institutions serving households (NPISHs), which have tax-exempt status, are treated as part of the personal sector of the economy. Because NPISHs produce services that are not generally sold at market prices, the value of these services is measured as the costs incurred in producing them.

In PCE, the value of a household purchase of a service that is provided by a NPISH consists of the price paid by the household or on behalf of the household for that service plus the value added by the NPISH that is not included in the price. For example, the value of the educational services provided to a student by a university consists of the tuition fee paid by the household to the university and of the additional services that are funded by sources other than tuition fees (such as by the returns to an endowment fund).

[558] Report: “Measuring the Economy: A Primer on GDP and the National Income and Product Accounts.” U.S. Bureau of Economic Analysis, December 2015. <www.bea.gov>

Page 8: “Gross private domestic investment consists of purchases of fixed assets (structures, equipment, and intellectual property products) by private businesses that contribute to production and have a useful life of more than one year, of purchases of homes by households, and of private business investment in inventories.”

[559] Textbook: Antitrust Law (2nd edition). By Richard A. Posner. University of Chicago Press, 2001.

Pages 12–13:

The optimum monopoly price may be much higher than the competitive price, depending on the intensity of consumer preference for the monopolized product—how much of it they continue to buy at successively higher prices—in relation to its cost. And the monopoly output will be smaller.3

So we now know that output is smaller under monopoly4 than under competition but not that the reduction in output imposes a loss on society. After all, the reduction in output in the monopolized market frees up resources that can and will be put to use in other markets. There is a loss in value, however. The increase in the price of the monopolized product above its cost induces the consumer to substitute products that must cost more (adjusting for any quality difference) to produce (or else the consumer would have substituted them before the price increase), although now they are relatively less expensive, assuming they are priced at a competitive level, that is, at the economically correct measure of cost. Monopoly pricing confronts the consumer with false alternatives: the product that he chooses because it seems cheaper actually requires more of society’s scarce resources to produce. Under monopoly, consumer demands are satisfied at a higher cost than necessary.

This analysis identifies the cost of monopoly with the output that the monopolist does not produce, and that a competitive industry would. I have said nothing about the higher prices paid by those consumers who continue to purchase the product at the monopoly price. Those higher prices are the focus of the layperson’s concern about monopoly—an example of the often sharp divergence between lay economic intuition and economic analysis. Antitrust economists used to treat the transfer of wealth from consumer to monopoly producer as completely costless to society, on the theory that the loss to the consumer was exactly offset by the gain to the producer.6 The only cost of monopoly in that analysis was the loss in value resulting from substitution for the monopolized product, since the loss to the substituting consumers is not recouped by the monopolist or anyone else and is thus a net loss, rather than merely a transfer payment and therefore a mere bookkeeping entry on the social books. But the traditional analysis was shortsighted.7 It ignored the fact that an opportunity to obtain a lucrative transfer payment in the form of monopoly profits will attract real resources into efforts by sellers to monopolize and by consumers to avoid being charged monopoly prices (other than by switching to other products, the source of the cost of monopoly on which the conventional economic analysis of monopoly focused). The costs of the resources consumed in these endeavors are costs of monopoly just as much as the costs resulting from the substitution of products that cost society more to produce than the monopolized product, though we’ll see that there may sometimes be offsetting benefits in this competition to become or fend off a monopolist.

[560] Textbook: Economics: Private and Public Choice. By James D. Gwartney and others. South-Western Cengage Learning, 2009.

Page 338:

As Adam Smith stressed long ago, when competition is present, even self-interested individuals will tend to promote the general welfare. Conversely, when competition is weakened, business firms will have more leeway to raise prices and pursue their own objectives and less incentive to innovate and develop better ways of doing things.

Competition is a disciplining force for both buyers and sellers. In a competitive environment, producers must provide goods at a low cost and serve the interests of consumers; if they don’t, other suppliers will. Firms that develop improved products and figure out how to produce them at low cost will succeed. Sellers that are unwilling or unable to provide consumers with quality goods at competitive prices will be driven from the market. This process leads to improved products and production methods and directs resources toward projects that create more value. It is a powerful stimulus for economic progress.

[561] Textbook: Business Process Modeling, Simulation and Design. By Manuel Laguna and Johan Marklund. Pearson, 2011.

Page 55:

Each market segment where goods and services are sold establishes the basis for competition. The same product, for example, may be sold in different markets by emphasizing price in one, quality in another, functionality (attributes) in yet another, and reliability or service elsewhere. Free trade agreements among countries, such as the North Atlantic Free Trade Agreement (NAFTA), or within the European Union (EU) compound the complexity and the intensity of competition because governments are less willing to implement policies designed to protect the local industry. The good news for consumers is that this intense competition tends to drive quality up and prices down. The challenge for companies is that the level of efficiency in their operations must increase (to various degrees, depending upon the status quo), because companies must be able to compete with the world’s best.

[562] Book: Quality Concepts for the Process Industry (2nd edition). By Michael Speegle. Delmar Cengage Learning, 2010.

Page 134:

Government Regulation of Competition

Because consumers benefit from competition, the United States government has traditionally sought to maintain a competitive environment for business by promulgating laws or regulations intended to prevent abuses in specific areas. Many government regulations also deal with product standards, environmental impacts, and other matters not directly related to competition. Without competition, prices of goods and services tend to be higher than they would be with competition, plus manufacturing output is lower. See Figure 11.2 and follow the x-axis from left to right. On the left is a monopoly with low or nonexistent innovation and high prices. As competition is introduced across the x-axis, innovation increases and prices come down.

When competition is low, producers can be inefficient without being penalized, and manpower and other resources can be wasted. A lack of competition and rank inefficiency was typical of the manufacturing climate of the former Soviet Union, where workers showed up for work if they felt like it, production lines shut down because parts did not arrive on time, and raw materials and energy were wasted because there was no need to be efficient. No one was penalized because the manufacturer had no competition.

[563] Ruling: Abood v. Detroit Board of Education. U.S. Supreme Court, May 23, 1977. Decided 9–0 (with three separate concurrences from four Justices, who sometimes expressed opposing views to the Court’s opinion). <www.law.cornell.edu>

NOTE: This portion of the ruling was not disputed by any of the Justices.

Majority:

The appellants’ second argument is that in any event collective bargaining in the public sector is inherently “political” and thus requires a different result under the First and Fourteenth Amendments. This contention rests upon the important and often-noted differences in the nature of collective bargaining in the public and private sectors.24 A public employer, unlike his private counterpart, is not guided by the profit motive and constrained by the normal operation of the market. Municipal services are typically not priced, and where they are they tend to be regarded as in some sense “essential” and therefore are often price-inelastic. Although a public employer, like a private one, will wish to keep costs down, he lacks an important discipline against agreeing to increases in labor costs that in a market system would require price increases. A public-sector union is correspondingly less concerned that high prices due to costly wage demands will decrease output and hence employment.

The government officials making decisions as the public “employer” are less likely to act as a cohesive unit than are managers in private industry, in part because different levels of public authority—department managers, budgetary officials, and legislative bodies—are involved, and in part because each official may respond to a distinctive political constituency. And the ease of negotiating a final agreement with the union may be severely limited by statutory restrictions, by the need for the approval of a higher executive authority or a legislative body, or by the commitment of budgetary decisions of critical importance to others.

Finally, decisionmaking by a public employer is above all a political process. The officials who represent the public employer are ultimately responsible to the electorate, which for this purpose can be viewed as comprising three overlapping classes of voters—taxpayers, users of particular government services, and government employees. Through exercise of their political influence as part of the electorate, the employees have the opportunity to affect the decisions of government representatives who sit on the other side of the bargaining table. Whether these representatives accede to a union’s demands will depend upon a blend of political ingredients, including community sentiment about unionism generally and the involved union in particular, the degree of taxpayer resistance, and the views of voters as to the importance of the service involved and the relation between the demands and the quality of service. It is surely arguable, however, that permitting public employees to unionize and a union to bargain as their exclusive representative gives the employees more influence in the decisionmaking process than is possessed by employees similarly organized in the private sector. …

24 See, e. g., K. Hanslowe, The Emerging Law of Labor Relations in Public Employment (1967); H. Wellington & R. Winter, Jr., The Unions and the Cities (1971); Hildebrand, The Public Sector, in J. Dunlop and N. Chamberlain (eds.), Frontiers of Collective Bargaining 125–154 (1967); Rehmus, Constraints on Local Governments in Public Employee Bargaining, 67 Mich.L.Rev. 919 (1969); Shaw & Clark, The Practical Differences Between Public and Private Sector Collective Bargaining, 19 U.C.L.A.L.Rev. 867 (1972); Smith, State and Local Advisory Reports on Public Employment Labor Legislation: A Comparative Analysis, 67 Mich.L.Rev. 891 (1969); Summers, Public Employee Bargaining: A Political Perspective, 83 Yale L.J. 1156 (1974); Project, Collective Bargaining and Politics in Public Employment, 19 U.C.L.A.L.Rev. 887 (1972). The general description in the text of the differences between private- and public-sector collective bargaining is drawn from these sources.

[564] Paper: “Two Faces of Union Voice in the Public Sector.” By Morley Gunderson. Journal of Labor Research, Summer 2005. Pages 393–413. <link.springer.com>

Pages 404–405:

Public services, in contrast, are less subject to the pressures of globalization and trade liberalization, but they are not immune. Countries, and political jurisdictions within countries, are increasingly competing for business investment and the jobs associated with that investment,30 creating stronger incentives to provide public services and infrastructures more cost effectively. Physical capital, financial capital, and human capital are increasingly mobile and footloose—able to escape jurisdictions that have excessive taxes and costs for public services and infrastructures. They can increasingly vote with their feet, moving to jurisdictions that provide the Tiebout-type tax and public expenditure package that suits their needs and preferences, compelling governments to face a harder rather than softer budget constraint. Government may not go out of business, but they may not survive the next election.

[565] Paper: “Binding Interest Arbitration in the Public Sector: Is It Constitutional?” William & Mary Law Review, 1977. Pages 787–821. <scholarship.law.wm.edu>

Page 790:

Overburdened taxpayers, on the other hand, also have been harmed by higher costs of living. They surrender a material portion of their paychecks to the government and expect quality public services at reasonable rates. Public officials, in an effort to appease constituents, attempt to maximize the productivity of public employees as much as possible while holding public spending to a minimum.

[566] For facts about the importance of using experimental studies to determine the effects of public policies, see the introductory notes.

[567] The following footnotes contain the primary sources of all experimental (or quasi-experimental) school choice studies known to Just Facts. The studies are arranged from newest to oldest. To locate and sort through these studies, Just Facts:

  • conducts daily reviews of 15+ different government agencies, think tanks, and media outlets that span a broad ideological spectrum.
  • only includes fully published studies, not those that are published merely as “working papers.”
  • only includes studies that provide truly randomized results from which causal effects can be determined.
  • conducts an annual review of school choice studies compiled by the Friedman Foundation for Educational Choice, which is now known as EdChoice.†‡ In September 2015, Just Facts wrote to the Friedman Foundation to ask how it can be sure that there are no other random-assignment school choice studies beyond the ones the foundation has located. A senior fellow replied:

Obviously no literature review can ever be totally sure that it hasn’t overlooked something. That is why we lay out in the report the procedure we use to check our knowledge. We start with what we know of, then we use the procedure (which is described in the report) to search for any studies we don’t know about. However, that having been said, the amount of empirical scientific research on school choice programs is not very great, and the world of people who publish and discuss this research professionally is small, so it is unlikely that something as important as a random-assignment study could come out and not be noticed by the entire field.§

NOTES:

  • † Report: “A Win-Win Solution: The Empirical Evidence on School Choice.” By Greg Forster. Friedman Foundation for Educational Choice, May 2016. <www.edchoice.org>
  • ‡ Report: “The 123s of School Choice: What the Research Says About Private School Choice Programs in America.” By Paul DiPerna and others. EdChoice, June 26, 2023. <www.edchoice.org>
  • § Email from the Friedman Foundation to Just Facts, September 14, 2015.

[568] Paper: “School Vouchers and Student Outcomes: Experimental Evidence from Washington, DC.” By Patrick J. Wolf and others. Journal of Policy Analysis and Management, Spring 2013. Pages 246–270. <onlinelibrary.wiley.com>

Abstract:

Here we examine the empirical question of whether or not a school voucher program in Washington, DC, affected achievement or the rate of high school graduation for participating students. The District of Columbia Opportunity Scholarship Program (OSP) has operated in the nation’s capital since 2004, funded by a federal government appropriation. Because the program was oversubscribed in its early years of operation, and vouchers were awarded by lottery, we were able to use the “gold standard” evaluation method of a randomized experiment to determine what impacts the OSP had on student outcomes. Our analysis revealed compelling evidence that the DC voucher program had a positive impact on high school graduation rates, suggestive evidence that the program increased reading achievement, and no evidence that it affected math achievement.

Page 258: “Results are described as statistically significant or highly statistically significant if they reach the 95 percent or 99 percent confidence level, respectively.”

Page 260:

The attainment impact analysis revealed that the offer of an OSP scholarship raised students’ probability of graduating from high school by 12 percentage points (Table 3). The graduation rate was 82 percent for the treatment group compared to 70 percent for the control group. The impact of using a scholarship was an increase of 21 percentage points in the likelihood of graduating. The positive impact of the program on this important student outcome was highly statistically significant.
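The 12-point offer (intention-to-treat) effect and the 21-point effect of actually using a scholarship are linked by the scholarship usage rate via the standard Bloom adjustment. A minimal sketch in Python; the usage rate below is back-derived from the two reported figures as an illustration, not a number stated in the paper:

```python
# Bloom adjustment: the treatment-on-treated (TOT) effect equals the
# intention-to-treat (ITT) effect divided by the treatment take-up rate,
# under the assumption that the offer matters only through actual use.

def tot_from_itt(itt: float, takeup_rate: float) -> float:
    """Scale an ITT estimate up to a TOT estimate."""
    if not 0 < takeup_rate <= 1:
        raise ValueError("take-up rate must be in (0, 1]")
    return itt / takeup_rate

# Figures from the DC OSP graduation analysis quoted above:
itt_effect = 0.82 - 0.70   # 12-percentage-point effect of the offer
tot_effect = 0.21          # reported effect of using a scholarship

# Implied usage rate (an inference from the two numbers, not a reported figure):
implied_takeup = itt_effect / tot_effect
print(f"implied scholarship usage rate: {implied_takeup:.0%}")
```

The implied usage rate of roughly 57 percent is simply what makes the two reported effects arithmetically consistent.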

Page 261:

We observed no statistically significant evidence of impacts on graduation rates at the subgroup level for students who applied to the program from non-SINI [schools in need of improvement] schools, with relatively lower levels of academic performance, and male students. For all subgroups, the graduation rates were higher among the treatment group compared with the control group, but the differences did not reach the level of at least marginal statistical significance for these three student subgroups. …

Our analysis indicated a marginally statistically significant positive overall impact of the program on reading achievement after at least four years. No significant impacts were observed in math. The reading test scores of the treatment group as a whole averaged 3.9 scale score points higher than the scores of students in the control group, equivalent to a gain of about 2.8 months of additional learning. The calculated impact of using a scholarship was a reading gain of 4.8 scale score points or 3.4 months of additional learning (Table 4).

Page 262:

Reading … Adjusted impact estimate [=] 4.75 … p-value of estimates [=] .06 …

The reading impacts appeared to cumulate over the first three years of the evaluation, reaching the marginal level of statistical significance after two years and the standard level after three years. By that third-year impact evaluation, only 85 of the 2,308 students in the evaluation (3.7 percent) had graded-out of the impact sample, having exceeded 12th grade. Between the third-year and final-year evaluation, an additional 211 students (12.2 percent) graded-out of the sample, reducing the final test score analytic sample to a subgroup of the original analytic sample. Due to this loss of cases for the final test score analysis, the confidence interval around the final point estimates is larger than it was after three years, and the positive impact of the program on reading achievement was only statistically significant at the marginal level.

Page 266: “Here, in the form of the DC school voucher program, Congress and the Obama administration uncovered what appears to be one of the most effective urban dropout prevention programs yet witnessed.”

Page 267: “We did find evidence to suggest that scholarship use boosted student reading scores by the equivalent of about one month of additional learning per year. Most parents, especially in the inner city, would welcome such an improvement in their child’s performance.”

[569] Paper: “Private School Vouchers and Student Achievement: A Fixed Effects Quantile Regression Evaluation.” By Carlos Lamarche. Labour Economics, August 2008. Pages 575–590. <doi.org>

Page 575:

Fundamental to the recent debate over school choice is the issue of whether voucher programs actually improve students’ academic achievement. Using newly developed quantile regression approaches, this paper investigates the distribution of achievement gains in the first school voucher program implemented in the US. We find that while high-performing students selected for the Milwaukee Parental Choice program had a positive, convexly increasing gain in mathematics, low-performing students had a nearly linear loss. However, the program seems to prevent low-performing students from having an even bigger loss experienced by students in the public schools.

[570] Paper: “Effectiveness of School Choice: The Milwaukee Experiment.” By Jay P. Greene, Paul E. Peterson, and Jiangtao Du. Education and Urban Society, February 1999. Pages 131–258. <journals.sagepub.com>

Page 193:

Only a few studies of school effectiveness have been able to draw upon data from randomized experiments, probably because it is difficult to justify random denial of access to apparently desirable education conditions.3 The results from the Milwaukee choice program reported here are the first to estimate from a randomized experiment the comparative achievement effects of public and private schools.

Page 194:

The Milwaukee choice program, initiated in 1990, provided a voucher to a limited number of students from low-income families to pay tuition at their choice of secular private schools in Milwaukee. …

The number of producers was restricted by the requirement that no more than half of a school’s enrollment could receive vouchers. … Consumer choice was further limited by excluding the participation of religious schools (thereby precluding use of approximately 90% of the private school capacity within the city of Milwaukee). Coproduction was discouraged by prohibiting families from supplementing the voucher with tuition payments of their own. (But schools did ask families to pay school fees and make voluntary contributions.) Other restrictions also limited program size. Only 1% of the Milwaukee public schools could participate, and students could not receive a voucher unless they had been attending a public school or were not of school age at the time of application.

These restrictions significantly limited the amount of school choice that was made available. Most choice students attended fiscally constrained institutions with limited facilities and poorly paid teachers.5

Page 200:

Estimated effects of choice schools on mathematics achievement were slight for the first 2 years students were in the program. But after 3 years of enrollment students scored 5 percentile points higher; after 4 years, they scored 10.7 points higher than the control group. These differences between the two groups 3 and 4 years after their application to choice schools are .24 and .51 standard deviation of the national distribution of math test scores, respectively, statistically significant at accepted confidence levels.13

Differences on the reading test were between 2 and 3 percentile points for the first 3 years and increased to 5.8 percentile points in the fourth year. Results for the third and fourth year are statistically significant, when the two are jointly estimated.14

Pages 205–206:

The consistency of the results is noteworthy. Positive results are found for all years and for all comparisons except one. The results reported in the main analysis for both math and reading are statistically significant for students remaining in the program for 3 to 4 years, when these are jointly estimated.

These results after 3 and 4 years are moderately large, ranging from .1 of a standard deviation to as much as .5 of a standard deviation. Studies of educational effects interpret effects of .1 standard deviations as slight, effects of .2 and .3 standard deviation as moderate, and effects of .5 standard deviation as large…. Even effects of .1 standard deviation are potentially large if they accumulate over time…. The average difference in test performances of Whites and minorities in the United States is one standard deviation….

[571] Paper: “Private School Vouchers and Student Achievement: An Evaluation of the Milwaukee Parental Choice Program.” By Cecilia Elena Rouse. Quarterly Journal of Economics, May 1998. Pages 553–602. <doi.org>

Page 553:

In 1990 Wisconsin began providing vouchers to a small number of low-income students to attend nonsectarian private schools. Controlling for individual fixed-effects, I compare the test scores of students selected to attend a participating private school with those of unsuccessful applicants and other students from the Milwaukee public schools. I find that students in the Milwaukee Parental Choice Program had faster math score gains than, but similar reading score gains to, the comparison groups. The results appear robust to data imputations and sample attrition, although these deficiencies of the data should be kept in mind when interpreting the results.

Page 554:

In 1990 Wisconsin became the first state in the country to implement a school choice program that provides vouchers to low-income students to attend nonsectarian private schools.2 The number of students in any year was originally limited to 1 percent of the Milwaukee public schools membership, but was expanded to 1.5 percent in 1994. Only students whose family income was at or below 1.75 times the national poverty line were eligible to apply.

Page 558:

I find that students selected for the choice program scored approximately 1.5–2.3 extra percentile points per year in math compared with unsuccessful applicants and the sample of other students in the Milwaukee public schools. The achievement gains of those actually enrolled in the choice schools were quite similar. Given a (within-sample) standard deviation of about nineteen percentile points on the math test, this suggests effect sizes on the order of 0.080–0.120σ per year, or 0.320–0.480σ over four years, which are quite large for education production functions. I do not estimate statistically significant differences between sectors in reading scores.
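The effect-size conversion in the passage above can be checked directly: percentile-point gains divided by the within-sample standard deviation give effects in sigma units, which multiply across years.

```python
# Converting percentile-point gains into standard-deviation (sigma) units,
# reproducing the arithmetic in the passage above.

sd_points = 19.0                 # within-sample SD of the math test
annual_gain_points = (1.5, 2.3)  # extra percentile points per year

annual_sigma = tuple(g / sd_points for g in annual_gain_points)
four_year_sigma = tuple(4 * s for s in annual_sigma)

print(f"per year:     {annual_sigma[0]:.3f} to {annual_sigma[1]:.3f} sigma")
print(f"over 4 years: {four_year_sigma[0]:.3f} to {four_year_sigma[1]:.3f} sigma")
```

The results match the paper's rounded figures of 0.080–0.120σ per year and 0.320–0.480σ over four years.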

Page 561:

Table II. Numbers of Applicants, Selections, And Enrollments

Year of “First” Application    1990    1991    1992    1993
Number of applicants            583     558     558     559
Number selected                 376     452     321     395

[572] Paper: “A Modified General Location Model for Noncompliance With Missing Data: Revisiting the New York City School Choice Scholarship Program Using Principal Stratification.” By Hui Jin and others. Journal of Educational and Behavioral Statistics, April 2010. Pages 154–173. <jeb.sagepub.com>

Page 156:

In February 1997, the School Choice Scholarship Foundation (SCSF) launched the New York City School Choice Scholarship Program and invited applications from eligible low-income families interested in scholarships toward private school expenses; these scholarships offered up to $1,400 for the academic year 1997–1998. Eligibility requirements included that the children were attending public school in Grades K through 4 in New York City at the time of application and that their families were poor enough to qualify for free school lunch. The SCSF received applications from over 20,000 students. In a mandatory information session before the lottery to assign the scholarships, each family provided background information, and the children in Grades 1 through 4 took the Iowa Test of Basic Skills (ITBS), the pretest in reading and math. In the final lottery held in May 1997, about 1,000 students were randomly selected to the treatment group and were awarded offers of scholarships; about another 1,000 were selected to the control group without the scholarship. Both groups were followed up and strongly encouraged to take a posttest, again the ITBS, at the end of the 1997–1998 academic year.

Pages 168–170:

[B]oth models find that for compliers [students who moved to private schools] originally from schools with low average scores, attendance in private school will unambiguously improve their overall math performance … as compared to attendance in public school. Such an improvement is especially evident for children in Grade 1…. However, results from the two models differ in some other groups…. Using our model, we find that reading score was likely improved for children from Grade 4 of low average … and children from Grade 1 of high average schools … [T]he estimates of Barnard and others (2003) of the two groups … respectively, were much smaller.

[573] Paper: “School Choice as a Latent Variable: Estimating the ‘Complier Average Causal Effect’ of Vouchers in Charlotte.” By Joshua M. Cowen. Policy Studies Journal, May 2008. Pages 301–315. <onlinelibrary.wiley.com>

Page 307:

Incoming second- through eighth-grade students from low-income families in Charlotte were offered the opportunity to apply for a $1,700 scholarship to attend a private school for the 1999–2000 school year. Of the original applicants, 347 (30%) agreed to participate in a program evaluation the following spring. At the end of the school year, Iowa Tests of Basic Skills (ITBS) were administered to all students, while their parents completed surveys designed to obtain background information. There was no pretest. Families who had either lost the lottery or had chosen not to accept the voucher were offered $20 and a chance to win a new scholarship to attend the testing sessions.

Page 309:

I begin the analysis of a voucher impact by estimating a typical “Intention-to-Treat” (ITT) model. In this model, students are considered to receive the treatment regardless of whether they use the voucher….

The ITT results indicate a positive voucher impact of 5 points on math scores and roughly 6 points on reading scores, all else equal. …

Next, I … [estimate] the mean effect of voucher treatment using an IV analysis, where the instrument for treatment is the random voucher offer itself…. The results are similar to the ITT estimates in their statistical significance: A positive voucher effect appears evident at the p ≤ 0.10 for math achievement and p ≤ 0.05 for reading. The point estimates of the voucher effect increase from 5 to nearly 7 points in math, and from 6 to 8 points in reading.
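The IV step described in this study can be sketched with a Wald estimator, where the random offer instruments for actual voucher use. All numbers below are hypothetical, chosen only to illustrate why IV point estimates exceed the corresponding ITT estimates:

```python
# Wald/IV estimator: with the random voucher offer as the instrument, the
# effect of actually attending a private school is the ITT outcome gap
# divided by the first-stage gap in private-school attendance.
# All numbers are hypothetical, not taken from the Charlotte study.

def wald_iv(y_offered: float, y_control: float,
            used_offered: float, used_control: float) -> float:
    """ITT outcome difference scaled by the difference in treatment use."""
    return (y_offered - y_control) / (used_offered - used_control)

# Hypothetical: a 5-point ITT score gap, 80% voucher use among those
# offered, and 5% private-school attendance in the control group.
effect = wald_iv(y_offered=55.0, y_control=50.0,
                 used_offered=0.80, used_control=0.05)
print(f"IV estimate of attending a private school: {effect:.1f} points")
```

Because fewer than 100 percent of offered students actually use the voucher, the IV estimate is mechanically larger than the ITT estimate, matching the pattern reported above.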

[574] Paper: “Principal Stratification Approach to Broken Randomized Experiments: A Case Study of School Choice Vouchers in New York City.” By John Barnard and others. Journal of the American Statistical Association, June 2003. Pages 299–323. <biosun01.biostat.jhsph.edu>

Abstract:

Although this study benefits immensely from a randomized design, it suffers from complications common to such research with human subjects: noncompliance with assigned “treatments” and missing data. Recent work has revealed threats to valid estimates of experimental effects that exist in the presence of noncompliance and missing data, even when the goal is to estimate simple intention-to-treat effects. Our goal was to create a better solution when faced with both noncompliance and missing data. This article presents a model that accommodates these complications that is based on the general framework of “principal stratification” and thus relies on more plausible assumptions than standard methodology. Our analyses revealed positive effects on math scores for children who applied to the program from certain types of schools—those with average test scores below the citywide median. Among these children, the effects are stronger for children who applied in the first grade and for African-American children.

[575] NOTE: The following source conducted three different random-assignment school choice studies:

Book: The Education Gap: Vouchers and Urban Schools (Revised edition). By William G. Howell and Paul E. Peterson with Patrick J. Wolf and David E. Campbell. Brookings Institution Press, 2006 (first published in 2002). <www.brookings.edu>

Page 39: “We evaluated the privately funded voucher programs in New York City, Dayton, Ohio, and Washington, D.C., and the nationwide CSF [Children’s Scholarship Fund] program by means of randomized field trials (RFTs), a research design that is well known in the medical field. … In an RFT, subjects are randomly assigned to a treatment or control group.”

Page 44: “A total of 1,500 vouchers were offered to public school students in New York City, 811 in Washington, and 515 in Dayton.46 Because vouchers were allocated randomly, the characteristics of those offered vouchers did not differ significantly from members of the control group.”

Pages 145–146:

All impacts are calculated in terms of national percentile ranking (NPR) points, which vary between 0 and 100, with a national median of 50. … As mentioned, to produce more stable estimates, we provide estimates that combine reading and math scores. (However, impacts did not differ significantly by subject matter.) …

Table 6-1 … reveals no overall private school impact of switching to a private school on student test scores in the three cities. Nor does it reveal any private school impact on the test scores of students from other than African American backgrounds (mainly Hispanic students in New York and white students in Dayton). However, the table shows that the switch to a private school had significantly positive impacts on the test scores of African American students.

Table 6-1 shows that African Americans in all three cities gained, on average, roughly 3.9 NPR points after Year I, 6.3 points after Year II, and 6.6 points after Year III.21 Results for African American students varied by city. In Year I, the only significant gains were observed in New York City, where African Americans attending a private school scored, on average, 5.4 percentile points higher than members of the control group.22 In Year II, significant impacts on African American test scores were evident in all three cities, ranging from 4.3 percentile points in New York City, to 6.5 points in Dayton, to 9.2 points in Washington, D.C. The Year III impact of 9.2 points on African American students’ test scores in New York City is statistically significant. The –1.9 point impact in Year III in Washington, however, is not.

[576] Paper: “Vouchers in Charlotte.” By Jay P. Greene. Education Next, Summer 2001. Pages 55–60. <www.educationnext.org>

Page 55:

During the 1999–2000 school year, the private Children’s Scholarship Fund (CSF) offered partial scholarships to low-income students in Charlotte, North Carolina. The partial scholarships defrayed up to $1,700 in tuition expenses at the private elementary or secondary school of a family’s choosing. Scholarships were awarded by lottery to families who went through an application process, because not enough funds were available to provide them to all the interested families.

The awarding of scholarships by lottery created a rare opportunity in educational research: a field experiment in which students were assigned randomly to both public and private schools, thus allowing me to test the effects of receiving a voucher and, more generally, to compare the performance of public and private schools. The study used both standardized test scores and surveys of parents and students to evaluate the effect of the scholarship program on both academic performance and student and parental satisfaction.

Pages 56–57:

A lottery was used to select which students would be offered scholarships, creating, as the statistical analysis has confirmed, two groups that were nearly identical. While noncompliance and nonparticipation have created differences between the two groups, they are similar enough that adjusting for observed characteristics is likely to produce highly reliable results.

Results

After one year, the results show that students who used a scholarship to attend a private school scored 5.9 percentile points higher on the math section of the ITBS [Iowa Test of Basic Skills] than comparable students who remained in public schools. Choice students scored 6.5 percentile points higher than their public school counterparts in reading after one year….

Using a statistical technique known as instrumental analysis to adjust for the potential bias of noncompliance yields results that remain strong and positive. The results of this analysis show that, after only one year’s time, attending a private school improved student performance on standardized tests in math and reading by between 5.4 and 7.7 percentile points.

On average, a scholarship raised students from the 30th percentile to the 37th percentile. This is a fairly large gain—approximately 0.25 standard deviation in math and reading. To put this gain in perspective, the difference nationwide between minority and white students is approximately 1.0 standard deviation. The benefits of the Charlotte CSF program are roughly one-quarter as large at the end of only one year.

[577] Paper: “Experimentally Estimated Impacts of School Vouchers on Educational Attainments of Moderately and Severely Disadvantaged Students.” By Albert Cheng and Paul E. Peterson. Sociology of Education, April 2021. Pages 159–174. <journals.sagepub.com>

Page 160:

Educational outcomes are largely dependent on multiple forms of capital: human, social, economic, and cultural. Individuals and their communities possess combinations of these forms of capital to varying degrees….

But evaluations of programs expected to rectify educational disparities, including those involving school choice, seldom pay attention to distinctions in degree of deprivation. For example, using a randomized control trial to evaluate the School Choice Scholarships Foundation (SCSF) program, a school voucher intervention in New York City, Chingos and Peterson (2015) estimate modest positive effects of vouchers offered to low-income minority elementary students on college enrollments and four-year degree attainment. Yet this study and others do not explore theoretically significant heterogeneities by joint deprivation by ethnicity and socioeconomic status (SES) within that population….

In this article, we estimate these theoretically significant heterogeneous effects on college enrollment and degree attainment ignored by Chingos and Peterson (2015) in their study of the long-term effects of the same 1997 New York City voucher intervention. Our postsecondary outcome data come from the National Student Clearinghouse as of 2017, four years later than the time Chingos and Peterson observed college enrollment and attainment. The additional time span allows for identification of college attainment levels even if a student’s education was interrupted.

We detect no significant intention-to-treat (ITT) effects for severely disadvantaged students, that is, for ethnic-minority students from extremely low-income households or, separately, for ethnic-minority first-generation college students. However, we find effects of 8 percentage points on any college enrollment, and 7 percentage-point effects on any degree and four-year degree attainment, for ethnic-minority students whose mother attended college. Moreover, treatment-on-treated (TOT) effects, that is, the effect on attainment for moderately disadvantaged students who used the voucher opportunity to attend a private school, are 11 to 15 percentage points for enrollment at any college and as much as 10 percentage points for degree attainment at any college and at four-year institutions. Given the generally low levels of educational attainment among disadvantaged students, these effects are quite large. They are approximately 20 to 30 percent higher for enrollment at any college, over 40 percent higher for degree attainment at any college, and 67 percent higher for four-year degree attainment than what would otherwise be the case, as estimated by control group rates.

Page 163:

Attrition occurred when participating students were not found in the administrative data available from the National Student Clearinghouse (NSC). Fortunately, that attrition was very small. Information … needed to attempt a match to the NSC database was available for 2,634 students, or 98.8 percent of the 2,666 students in the analytic sample. …

Among students for whom we attempted a college enrollment match, 1,356 had been assigned to the treatment group and 1,278 to the control group.

[578] Report: “Evaluation of the DC Opportunity Scholarship Program: Impacts Three Years After Students Applied.” By Ann Webber and others. U.S. Department of Education, Institute of Education Sciences, May 2019. <files.eric.ed.gov>

Pages 1–2:

The District of Columbia (DC) Opportunity Scholarship Program (OSP) is the only federally funded program that provides vouchers to low-income families to send their children to private schools. Congress created the OSP in 2004 and reauthorized it in 2011 under the Scholarships for Opportunity and Results (SOAR) Act.1 As part of the 2011 SOAR Act, Congress required an independent evaluation of the OSP. This is the sixth2 and final report from that evaluation, describing how the OSP affected students and their parents three years after they applied to the program. Specifically, the report examines impacts on student achievement, student and parent satisfaction with schools, student and parent perceptions of school safety, and parent involvement with education.

Overview of the Program

The SOAR Act establishes criteria for student eligibility, the groups of students who receive priority for scholarships, and scholarship dollar amounts (exhibit 1). A program operator administers the OSP through a grant awarded by the U.S. Department of Education. Program operators establish protocols for families applying to the program, recruit applicants and schools, award scholarships, and place and monitor scholarship awardees in participating private schools (see appendix A-1 for more information). Participating private schools must agree to requirements regarding nondiscrimination in admissions, fiscal accountability, having teachers with at least a bachelor’s degree, and cooperation with an evaluation of the program.

Overview of the Evaluation

Congress required the evaluation to use “the strongest possible research design” to measure the impacts of being offered and using an OSP scholarship on key outcomes….3 To determine the OSP’s effectiveness, an experiment—considered the “gold standard” of evaluation methodology—was conducted that compared outcomes for two groups. The treatment group was comprised of students who applied for a scholarship and were offered one. The control group was comprised of students who applied for a scholarship but were not offered one. Lotteries were used to randomly award scholarships to applicants. Randomization helped to ensure that the two groups being compared were truly similar at the time of OSP application, and that—other than by chance—the only difference that could influence outcomes was whether applicants received a scholarship offer or not.
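The logic of lottery-based assignment described above can be illustrated with simulated data: random assignment balances baseline characteristics up to chance, so later outcome gaps can be attributed to the scholarship offer. A sketch (all data synthetic):

```python
# Why lottery assignment creates comparable groups: random assignment
# balances baseline characteristics up to chance, so later outcome
# differences can be attributed to the scholarship offer. Synthetic data.

import random

random.seed(42)

# Synthetic applicant pool with a baseline test score (percentile-like scale).
applicants = [{"baseline": random.gauss(50, 10)} for _ in range(2000)]

# Lottery: shuffle the pool and offer scholarships to the first half.
random.shuffle(applicants)
treatment, control = applicants[:1000], applicants[1000:]

def mean_baseline(group):
    return sum(a["baseline"] for a in group) / len(group)

# With random assignment, baseline means differ only by sampling noise
# (standard error here is about 0.45 points against a 10-point SD).
gap = mean_baseline(treatment) - mean_baseline(control)
print(f"baseline gap between groups: {gap:+.2f} points")
```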

Pages 4–5:

There were no statistically significant impacts on either reading or mathematics achievement three years after students applied to the program. Students in the group that received a scholarship offer scored 0.1 percentile points higher on the mathematics test, and 1.6 percentile points lower on the reading test, than students in the control group (figure 2) after three years. Students using a scholarship scored 0.2 percentile points higher on the mathematics test, and 2.1 percentile points lower on the reading test, than students in the control group. None of the differences were statistically significant. …

There were no statistically significant impacts on either reading or mathematics achievement for students in any of the study’s eight subgroups. In each case, those offered or using a scholarship had test scores that were similar to those not offered a scholarship (appendix tables C-1 and C-2). This included (1) students attending schools in need of improvement when they applied to the OSP and students not attending schools in need of improvement, (2) students entering elementary and secondary grades when they applied, (3) students scoring above or below the median11 in reading at the time of application, and (4) students scoring above or below the median in mathematics at the time of application.

[579] Paper: “Distributional Analysis in Educational Evaluation: A Case Study from the New York City Voucher Program.” By Marianne P. Bitler and others. Journal of Research on Educational Effectiveness, July–September, 2015. Pages 419–450. <www.ncbi.nlm.nih.gov>

Page 1:

We use quantile treatment effects estimation to examine the consequences of the random-assignment New York City School Choice Scholarship Program (NYCSCSP) across the distribution of student achievement. Our analyses suggest that the program had negligible and statistically insignificant effects across the skill distribution.

Pages 19–20:

In the first model of Panel 1 (column 1), we replicate Howell & Peterson’s estimates of the treatment effect for African-Americans (point estimates are identical, SEs [standard error] nearly so, differing due to our use of bootstrapping by family within strata for SEs). This analysis indicates that the voucher offer significantly improved black student math achievement in the study’s first and third years. (This analysis yields a positive, but not statistically significant, treatment effect for black students in Year 2.) Similarly, in the first model of Panel 2 (column 1), we attempt to replicate Krueger and Zhu’s racial categorization scheme to estimate the effects of the voucher offer for African-Americans. While this replication is not perfect (our sample sizes are 1 observation off from their reported sample sizes),12 it returns an estimate of the African-American treatment effect that is very close to Krueger and Zhu’s published findings. Using the Krueger and Zhu definition of African-American and also treating the 99s as valid percentile scores of 0, we find a positive and significant treatment effect on Math scores in Year 1, but no effects in subsequent years.
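The quantile-treatment-effects approach this study applies — comparing corresponding quantiles of the treatment and control outcome distributions rather than their means — can be sketched with synthetic data:

```python
# Quantile treatment effects: compare corresponding quantiles of the
# treatment and control outcome distributions instead of their means,
# revealing effects that differ across the skill distribution.
# All data below are synthetic.

import random

random.seed(0)

def quantile(xs, q):
    """Simple nearest-rank empirical quantile."""
    xs = sorted(xs)
    return xs[min(int(q * len(xs)), len(xs) - 1)]

control = [random.gauss(50, 10) for _ in range(1000)]
# Hypothetical treatment that helps the upper tail but not the lower tail.
treatment = [x + 0.1 * max(x - 50, 0) for x in control]

for q in (0.10, 0.50, 0.90):
    qte = quantile(treatment, q) - quantile(control, q)
    print(f"QTE at the {q:.0%} quantile: {qte:+.2f}")
```

A mean comparison would average these together; the quantile view shows the hypothetical effect is concentrated among high scorers, the kind of heterogeneity a mean-only analysis can miss.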

[580] Paper: “Using Experimental Economics to Measure the Effects of a Natural Educational Experiment on Altruism.” By Eric Bettinger and Robert Slonim. Journal of Public Economics, September 2006. Pages 1625–1648. <www.sciencedirect.com>

Abstract:

Economic research examining how educational intervention programs affect primary and secondary schooling focuses largely on test scores although the interventions can affect many other outcomes. This paper examines how an educational intervention, a voucher program, affected students’ altruism. The voucher program used a lottery to allocate scholarships among low-income applicant families with children in K–8th grade. By exploiting the lottery to identify the voucher effects, and using experimental economic methods, we measure the effects of the intervention on children’s altruism. We also measure the voucher program’s effects on parents’ altruism and several academic outcomes including test scores. We find that the educational intervention positively affects students’ altruism towards charitable organizations but not towards their peers. We fail to find statistically significant effects of the vouchers on parents’ altruism or [students’ math] test scores.

[581] Paper: “Another Look at the New York City School Voucher Experiment.” By Alan B. Krueger and Pei Zhu. American Behavioral Scientist, January 2004. Pages 658–698. <journals.sagepub.com>

Abstract:

This article reexamines data from the New York City school choice program, the largest and best-implemented private school scholarship experiment yet conducted. In the experiment, low-income public school students in kindergarten to Grade 4 were eligible to participate in a series of lotteries for a private school scholarship in May 1997. Data were collected from students and their parents at baseline and in the spring of each of the next 3 years.

Page 693:

Our reanalysis of the New York City school voucher experiment suggests that the positive effect of vouchers on the achievement of African American students emphasized by previous researchers is less robust than commonly acknowledged. Most important, if the cohort of students who were enrolled in kindergarten when the experiment began is included in the sample, the effect of vouchers is greatly attenuated. As the results in Table 5 indicate, treating mother’s and father’s race symmetrically further attenuates the effect of school vouchers for African American children. The evidence is stronger that the availability of private school vouchers raised achievement on math than on reading exams after 3 years, but both effects are relatively small if the sample includes students with missing baseline test scores and students who have at least one Black parent.

[582] Paper: “The Effects of the Louisiana Scholarship Program on Student Achievement and College Entrance.” By Heidi H. Erickson, Jonathan N. Mills, and Patrick J. Wolf. Journal of Research on Educational Effectiveness, August 11, 2021. Pages 861–899. <doi.org>

Page 861:

The Louisiana Scholarship Program (LSP) offers publicly funded vouchers to moderate- and low-income students in low-performing public schools to enroll in participating private schools. Established in 2008 as a pilot program in New Orleans, the LSP expanded statewide in 2012. Drawing upon the random lotteries that placed students in LSP schools, we estimate the causal impact of using an LSP voucher to enroll in a private school on student achievement on the state accountability assessments in math, English Language Arts, and science over a four-year period, as well as on the likelihood of enrolling in college.

Pages 862–863:

Our analyses use oversubscription lotteries to generate unbiased estimates of the participant impacts of the LSP.1 We use admission lotteries for student’s first-choice, or top-ranked,2 private school as instrumental variables to estimate the causal effect of using an LSP scholarship to enroll in a participating private school for applicants induced to attend that school as a result of winning the lottery…. Our analysis uses student-level data obtained via a data-sharing agreement with the state of Louisiana along with college enrollment data from the National Student Clearinghouse. We operationalize achievement as student performance on the criterion-referenced tests in English Language Arts, math, and science that the state mandates for public school accountability purposes. College entrance is a dichotomous measure defined if students enrolled, or not, in at least one semester within six months of their expected high school graduation. …

Using the same lottery-based analytical strategy, we also estimate the effect of LSP usage on the likelihood that students enter college. We estimate that 42.0% of scholarship users entered any college compared to 38.8% of their control group counterparts; however, this difference of 3.2 percentage points is not statistically distinguishable from zero. In addition, we find no significant difference in the rates that scholarship users and their control counterparts enter two- or four-year institutions.

Our research contributes to the existing literature on the participant effects of school voucher programs in multiple ways. First, it uses a highly rigorous lottery-based design to estimate causal programmatic effects by avoiding self-selection bias concerns. Second, ours is just the second lottery-based evaluation of a private school voucher program in the U.S. that includes both achievement and attainment results.3 Third, it is among the first evaluations of the new wave of statewide school voucher programs … which are replacing the more heavily studied urban initiatives of the past. Finally, this study examines medium-run student outcomes of achievement effects in ELA [English Language Arts], math, and science after four years, as well as initial educational attainment effects. These outcomes, taken jointly, provide a more thorough understanding of how the effects of this voucher program evolved over time than is the case for most other voucher evaluations.

1 We describe our research design as “lottery-based” and “causal.” We do not call it “experimental” because researchers disagree regarding whether or not treatment-on-the-treated estimates using lotteries as instrumental variables, such as the ones featured in our study, are truly “experimental.” We do not enter that debate here. We characterize our findings as “causal” because there is general agreement that random lotteries are the ideal instrumental variables and that sound instrumental variables recover unbiased causal estimates of the impact of a treatment on the self-selected complier subgroup of participants offered the treatment….

2 Eligible LSP applicants could submit up to five rank-ordered private school preferences. We focus on first-choice school lotteries to ensure independence of treatment assignment, as whether or not a student won a lottery for placement in a lower-choice school likely was influenced by factors such as the number and popularity of non-first-choice schools listed which could bias comparisons of “any-lottery winners” to “no-lottery winners.”

Pages 873–874:

In addition to the application data the LDOE [Louisiana Department of Education] provide, we use data from the National Student Clearinghouse (NSC) Student Tracker Service for the college entrance analysis. The NSC collects data on college entrance, persistence, and degree attainment from nearly all public and private post-secondary institutions in the U.S. The comprehensiveness of the NSC database allows us to capture records for students in our sample who attend college within or outside of Louisiana. We focus on college enrollment, in both two- and four-year institutions, as few students in our sample are old enough to have completed a degree. Seventh-grade LSP applicants, in the first year of the program, could have enrolled in one semester of college by fall 2018 and twelfth grade applicants could have enrolled in up to five and a half years of college by fall 2018. We consider students “entering college” if they enrolled in at least one semester of courses within six months following their expected high school graduation, as confirmed by the NSC data.12 We believe this is a conservative estimate of college entrance as many students enter college a year or two following high school graduation; however, we only observe the seventh-grade applicants six months past their expected high school graduation.

Page 882:

We estimate the effect of LSP usage on enrolling in a two- or four-year institution in separate models. Our results indicate that scholarship users enter two- and four-year institutions at similar rates as their control student counterparts. Scholarship users enroll at a slightly higher rate, by 3.3 percentage points, in two-year institutions and at a slightly lower rate, by 0.2 percentage points, in four-year institutions than their control counterparts, but these differences are statistically null….

[583] Report: “The Effects of Means-Tested Private School Choice Programs on College Enrollment and Graduation.” By Matthew M. Chingos and others. Urban Institute, July 2019. <www.urban.org>

Pages 17–18:

The DC Opportunity Scholarship Program (OSP), created by an act of Congress in January 2004, provides scholarships to low-income families (defined as those making no more than 185 percent of the federal poverty level) to attend private schools. Scholarships are available only to DC residents attending participating DC private schools. Participating schools must agree to such requirements as nondiscrimination in admissions, fiscal accountability, and the provision of data and information for evaluation purposes (Wolf and others 2005).

The program has enrolled between 1,000 and 2,000 students each year since its inception in 2004–05, with a peak of 1,930 in 2007–08 (Chingos 2018), and 1,653 enrolled in the most recent year for which data are available (2017–18).10 Scholarship amounts were initially capped at $7,500 (about $9,700 in 2017 dollars) (Wolf and others 2005); the maximum is now $9,022 for elementary and middle school and $13,534 for high school.11

Enrollment in the program is small relative to public school enrollment in DC. OSP enrollment has never exceeded 3 percent of total enrollment (district, public charter, and OSP) and is currently closer to 2 percent.12 This largely reflects the fact that program funding can accommodate only a limited number of students (roughly 1,200 scholarships per year in recent years).13

Data and Methods

The present study builds on existing work on the OSP by using administrative records to measure the college enrollment patterns of participants in the first two lotteries. We track the college enrollment outcomes of a subset of 1,776 students who applied for a scholarship in 2004 or 2005 and are now old enough to have potentially graduated from high school and enrolled in college.

Urban Institute researchers worked with the current OSP administrator, Serving Our Children (SOC), to reconstruct baseline files from the original lottery applications that the Washington Scholarship Fund (the original OSP administrator) used in 2004 and 2005 and matched them to college enrollment records from the National Student Clearinghouse. …

… The results in the present study are updated to include an additional 182 students who are now old enough to be observed for at least two years following expected high school graduation, for a total of 1,776 students. Descriptive statistics for this sample of students are provided in appendix table A.6. All results are weighted to reflect applicants’ likelihood of winning the lottery, as described in Chingos (2018).

These estimates are “intent to treat” (ITT) in that they capture the effects of being offered a scholarship, when in fact many students who were offered a scholarship did not use one. Among all students who won the lottery, 70 percent used a scholarship for at least one year.14 We report only ITT estimates throughout this study, but the effects of using a scholarship for at least one year can be calculated by dividing the ITT estimates by 0.7.15

… Overall, students offered a scholarship were somewhat less likely to enroll in college within two years of expected graduation from high school: 43 percent did compared with 46 percent of applicants who lost the lottery. None of these differences are statistically distinguishable from zero at conventional levels.

This pattern holds for both two- and four-year colleges and for four-year public and four-year private colleges…. Adding control variables has little impact on the results, as would be expected given random assignment.
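The report’s note that “the effects of using a scholarship for at least one year can be calculated by dividing the ITT estimates by 0.7” is the standard conversion from an intent-to-treat effect to a treatment-on-the-treated effect (a Bloom adjustment). A minimal sketch of that arithmetic; the function name `tot_from_itt` and the example figures are illustrative, not from the report:

```python
# Converting an intent-to-treat (ITT) estimate to a treatment-on-the-treated
# (TOT) estimate by scaling for the scholarship take-up rate, as described in
# the Urban Institute report (70 percent of lottery winners used a scholarship).

def tot_from_itt(itt_estimate: float, takeup_rate: float) -> float:
    """Scale an ITT effect by the take-up rate (Bloom adjustment)."""
    if not 0 < takeup_rate <= 1:
        raise ValueError("take-up rate must be in (0, 1]")
    return itt_estimate / takeup_rate

# Illustrative example: a -3 percentage-point ITT difference in college
# enrollment (43% offered vs. 46% not offered) and a 70% take-up rate.
itt = -3.0      # percentage points, offered minus not offered
takeup = 0.70   # share of lottery winners who used a scholarship
print(round(tot_from_itt(itt, takeup), 2))  # -4.29
```

This scaling assumes the offer affects outcomes only through scholarship use; as the report notes, the underlying differences here are not statistically distinguishable from zero, so the scaled estimate is likewise not significant.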

[584] Paper: “The Effects of the Louisiana Scholarship Program on Student Achievement and College Entrance.” By Heidi H. Erickson, Jonathan N. Mills, and Patrick J. Wolf. Journal of Research on Educational Effectiveness, August 11, 2021. Pages 861–899. <doi.org>

Page 861:

The Louisiana Scholarship Program (LSP) offers publicly funded vouchers to moderate- and low-income students in low-performing public schools to enroll in participating private schools. Established in 2008 as a pilot program in New Orleans, the LSP expanded statewide in 2012. Drawing upon the random lotteries that placed students in LSP schools, we estimate the causal impact of using an LSP voucher to enroll in a private school on student achievement on the state accountability assessments in math, English Language Arts, and science over a four-year period, as well as on the likelihood of enrolling in college. The results from our primary analytic sample indicate substantial negative achievement impacts, especially in math, that diminish after the first year but persist after four years.

Pages 862–863:

Our analyses use oversubscription lotteries to generate unbiased estimates of the participant impacts of the LSP.1 … We operationalize achievement as student performance on the criterion-referenced tests in English Language Arts, math, and science that the state mandates for public school accountability purposes. …

Our research contributes to the existing literature on the participant effects of school voucher programs in multiple ways. First, it uses a highly rigorous lottery-based design to estimate causal programmatic effects by avoiding self-selection bias concerns. Second, ours is just the second lottery-based evaluation of a private school voucher program in the U.S. that includes both achievement and attainment results.3 Third, it is among the first evaluations of the new wave of statewide school voucher programs … which are replacing the more heavily studied urban initiatives of the past. Finally, this study examines medium-run student outcomes of achievement effects in ELA [English Language Arts], math, and science after four years, as well as initial educational attainment effects. These outcomes, taken jointly, provide a more thorough understanding of how the effects of this voucher program evolved over time than is the case for most other voucher evaluations.

Previous research indicates large negative impacts of LSP voucher usage on ELA, math, and science achievement after one year of participation (Abdulkadiroglu et al., 2018; Mills, 2015) that appear to attenuate over the following two years (Mills & Wolf, 2017a, 2017b). Our study indicates that the trend of improvement in LSP scholarship usage effects observed in prior research did not continue into the fourth year of the program. Although the negative effects students experienced in the first year of the program are substantially reduced by year four, students who used LSP vouchers to enroll in LSP participating private schools at any point between 2012–13 and 2015–16 continue to score significantly worse on state assessments than their control group counterparts. These effect estimates are statistically significant in most models and are negative in all tested subjects.

1 We describe our research design as “lottery-based” and “causal.” We do not call it “experimental” because researchers disagree regarding whether or not treatment-on-the-treated estimates using lotteries as instrumental variables, such as the ones featured in our study, are truly “experimental.” We do not enter that debate here. We characterize our findings as “causal” because there is general agreement that random lotteries are the ideal instrumental variables and that sound instrumental variables recover unbiased causal estimates of the impact of a treatment on the self-selected complier subgroup of participants offered the treatment.

Page 873:

Our analysis of the LSP’s impact on student achievement uses LDOE- [Louisiana Department of Education] provided student application and administrative records. These data include information on students’ school choice sets, results of placement lotteries, demographic background, and achievement on Louisiana’s statewide assessments. The Louisiana state accountability system emphasizes test-based accountability. We use student performance on the Louisiana state assessments in grades three through eight as our outcome measure of interest in the achievement analysis. Private schools are required to test all students participating in the LSP using the state accountability assessment for any grade in which the public school system also tests its students. The 2011–12 (baseline), 2012–13 (Year 1 Outcome), and 2013–14 (Year 2 Outcome) assessment data contain student scores on the LEAP [Louisiana Educational Assessment Program] and iLEAP [Integrated Louisiana Educational Assessment Program] exams, criterion-referenced tests aligned to Louisiana state education standards. The 2014–15 (Year 3 Outcome) data instead provide student scores in ELA and math on the PARCC [Partnership for Assessment of Readiness in College and Career], a criterion referenced test aligned with the Common Core standards. In science, in contrast, students continued to take the LEAP/iLEAP exams aligned with state standards.11 Beginning in 2015–16 (Year 4 Outcome), Louisiana students took the LEAP 2025 assessments, a revamped version of the LEAP/iLEAP, in grades three through eight in ELA, math, and science. These assessments are aligned with the 2014–15 PARCC tests (Louisiana Department of Education, 2015).

Page 878:

We first present the results for our primary estimates of the impact of LSP scholarship usage on student achievement.22 Prior work reported large declines in both math and ELA achievement after one year of voucher usage (Abdulkadiroglu et al., 2018; Mills, 2015). These negative effects declined by half after Year 2 (Mills & Wolf, 2017a) and were not statistically significant after Year 3 (Mills & Wolf, 2017b). In contrast, the results presented here indicate large negative effects of ever using an LSP voucher by Year 4 on achievement on the Louisiana state assessments in ELA, math, and science. As with earlier work, the negative effect estimates are particularly large for math.

[585] Paper: “Free to Choose: Can School Choice Reduce Student Achievement?” By Atila Abdulkadiroglu, Parag A. Pathak, and Christopher R. Walters. American Economic Journal: Applied Economics, January 2018. <pubs.aeaweb.org>

Page 176:

This paper provides a striking contrast to the literature on lottery-based studies of school choice. We evaluate the Louisiana Scholarship Program (LSP), a school choice program that provides private school vouchers for disadvantaged Louisiana students attending low-performing public schools. Income-eligible students enrolled in public schools graded “C” or below on an achievement-based rating system may apply for an LSP voucher to cover tuition at an eligible private school. Private schools gain eligibility by applying to the Louisiana Board of Elementary and Secondary Education to host LSP students (Louisiana Department of Education 2015a). If the number of eligible applicants to a private school exceeds the available seats, LSP vouchers are distributed via stratified random lottery. We estimate causal effects of LSP vouchers by comparing outcomes for lottery winners and losers in 2013, the first year after the LSP expanded throughout Louisiana. Lottery-based estimates show that LSP vouchers dramatically reduce academic achievement. Attending an LSP-eligible private school lowers math scores by an average of 0.41 standard deviations (σ) and reduces reading, science, and social studies scores by 0.08σ, 0.26σ, and 0.33σ one year after the lottery. LSP participation shifts the distribution of scores downward in all four subjects, increasing the likelihood of a failing score by between 24 and 50 percent. These impacts are similar across family income levels and geographic locations. LSP voucher effects are more negative in earlier grades, though vouchers reduce achievement in later grades as well.

[586] Paper: “Private School Vouchers and Student Achievement: A Fixed Effects Quantile Regression Evaluation.” By Carlos Lamarche. Labour Economics, August 2008. Pages 575–590. <doi.org>

Page 575:

Fundamental to the recent debate over school choice is the issue of whether voucher programs actually improve students’ academic achievement. Using newly developed quantile regression approaches, this paper investigates the distribution of achievement gains in the first school voucher program implemented in the US. We find that while high-performing students selected for the Milwaukee Parental Choice program had a positive, convexly increasing gain in mathematics, low-performing students had a nearly linear loss. However, the program seems to prevent low-performing students from having an even bigger loss experienced by students in the public schools.

[587] Transcript: “Full Interview Between President Obama and Bill O’Reilly.” Fox News, February 3, 2014. <www.foxnews.com>

O’Reilly: The secret to getting a je—good job is education. And in these chaotic families, the children aren’t well-educated because it isn’t—it isn’t, um, encouraged at home as much as it is in other precincts. Now, school vouchers is a way to level the playing field. Why do you oppose school vouchers when it would give poor people a chance to go to better schools?

Obama: Actually—every study that’s been done on school vouchers, Bill, says that it has very limited impact if any—

O’Reilly: Try it.

Obama: On—it has been tried, it’s been tried in Milwaukee, it’s been tried right here in DC—

O’Reilly: [OVERLAP] And it worked here.

Obama: No, actually it didn’t. When you end up taking a look at it, it didn’t actually make that much of a difference. So what we have been supportive of is, uh, something called charters. Which, within the public school system gives the opportunity for creative experiments by teachers, by principals to-to start schools that have a different approach. And—

O’Reilly: [OVERLAP] You would revisit that? I—I just think—I used to be, teach in a Catholic school, a—and I just know—

Obama: [OVERLAP] Bill—you know, I—I’ve taken, I’ve taken—I’ve taken a look at it. As a general proposition, vouchers has not significantly improved the performance of kids that are in these poorest communities—

O’Reilly: [OVERLAP] [INAUDIBLE]

Obama: Some charters—some charters are doing great. Some Catholic schools do a great job, but what we have to do is make sure every child….

[588] Report: “Evaluation of the DC Opportunity Scholarship Program.” By Patrick Wolf and others. U.S. Department of Education, Institute of Education Sciences, June 2010. <ies.ed.gov>

Page xvii:

Guided by language in the statute, the evaluation of the OSP [Opportunity Scholarship Program] relied on lotteries of eligible applicants—random chance—to create two statistically equivalent groups who were followed over time and whose outcomes were compared to estimate Program impacts. A total of 2,308 eligible applicants in the first two years of Program implementation were entered into scholarship lotteries (492 in year one, called “cohort 1,” and 1,816 in year two, called “cohort 2”). Across the cohorts, 1,387 students were randomly assigned to the impact sample’s treatment group (offered a scholarship), while the remaining 921 were assigned to the control group (not offered a scholarship).

Pages xix–xxi:

Student Achievement

• Overall reading and math test scores were not significantly affected by the Program, based on our main analysis approach. On average over the 40-plus months of potential participation, the treatment group scored 3.90 points higher in reading and .70 points higher in math than the control group, but these differences were not statistically significant (figure ES-2).

• No significant impacts on achievement were detected for students who applied from SINI [Schools in Need of Improvement] 2003–05 schools, the subgroup of students for whom the statute gave top priority, or for male students, or those who were lower performing academically when they applied.

• The Program may have improved the reading but not math achievement of the other three of six student subgroups. These include students who came from not SINI 2003–05 schools (by 5.80 scale score points), who were initially higher performing academically (by 5.18 points), or who were female (5.27 points). However, the impact estimates for these groups may be due to chance after applying a statistical test to adjust for multiple comparisons.

High School Graduation (Educational Attainment)

• The offer of an OSP scholarship raised students’ probability of completing high school by 12 percentage points overall. The graduation rate based on parent-provided information6 was 82 percent for the treatment group compared to 70 percent for the control group (figure ES-3). There was a 21 percent difference (impact) for using a scholarship to attend a participating private school.

• The offer of a scholarship improved the graduation prospects by 13 percentage points for the high-priority group of students from schools designated SINI in 2003–05 (79 percent for the treatment group versus 66 percent for the control group) (figure ES-3). The impact of using a scholarship on this group was 20 percentage points.

• Two other subgroups had statistically higher graduation rates as a result of the Program. Those who entered the Program with relatively higher levels of academic performance had a positive impact of 14 percentage points from the offer of a scholarship and 25 percentage points from the use of a scholarship. Female students had a positive impact of 20 percentage points from the offer of a scholarship and 28 percentage points from the use of a scholarship.

• The graduation rates of students from the other subgroups were also higher if they were offered a scholarship, but these differences were not statistically significant.

[589] Webpage: “The Executive Branch.” White House. Accessed July 7, 2023 at <www.whitehouse.gov>

Under Article II of the Constitution, the President is responsible for the execution and enforcement of the laws created by Congress. Fifteen executive departments—each led by an appointed member of the President’s Cabinet—carry out the day-to-day administration of the federal government. …

Department of Education

The mission of the Department of Education is to promote student learning and preparation for college, careers, and citizenship in a global economy by fostering educational excellence and ensuring equal access to educational opportunity.

The Department administers federal financial aid for higher education, oversees educational programs and civil rights laws that promote equity in student learning opportunities, collects data and sponsors research on America’s schools to guide improvements in education quality, and works to complement the efforts of state and local governments, parents, and students.

The U.S. Secretary of Education oversees the Department’s 4,200 employees and $68.6 billion budget.

[590] Report: “Losing Our Future: How Minority Youth are Being Left Behind by the Graduation Rate Crisis.” By Gary Orfield and others. Civil Rights Project at Harvard University, Urban Institute, Advocates for Children of New York, and Civil Society Institute, February 25, 2004. <escholarship.org>

Page 2:

In an increasingly competitive global economy, the consequences of dropping out of high school are devastating to individuals, communities and our national economy. At an absolute minimum, adults need a high school diploma if they are to have any reasonable opportunities to earn a living wage. A community where many parents are dropouts is unlikely to have stable families or social structures. Most businesses need workers with technical skills that require at least a high school diploma. Yet, with little notice, the United States is allowing a dangerously high percentage of students to disappear from the educational pipeline before graduating from high school.

[591] Paper: “The Importance of the Ninth Grade on High School Graduation Rates and Student Success in High School.” By Kyle M. McCallumore and Ervin F. Sparapani. Education, March 2010. Pages 447–456. <eric.ed.gov>

Abstract: “[T]here is really not much appealing about the reality of the problems in the American education system that permeate beyond kindergarten. Graduation rates are one of the most troubling concerns.”

[592] Book: High School Dropout, Graduation, and Completion Rates: Better Data, Better Measures, Better Decisions. Edited by Robert M. Hauser and Judith Anderson Koenig. National Academies Press, 2011. <www.nap.edu>

Page 1: “High school graduation and dropout rates have long been used as indicators of educational system productivity and effectiveness and of social and economic well-being.”

[593] “2012 Democratic Party Platform.” Democratic National Committee, September 2012. <www.presidency.ucsb.edu>

Page 5:

Too many students, particularly students of color and disadvantaged students, drop out of our schools, and Democrats know we must address the dropout crisis with the urgency it deserves. The Democratic Party understands the importance of turning around struggling public schools. We will continue to strengthen all our schools and work to expand public school options for low-income youth, including magnet schools, charter schools, teacher-led schools, and career academies.

[594] Paper: “School Vouchers and Student Outcomes: Experimental Evidence from Washington, DC.” By Patrick J. Wolf and others. Journal of Policy Analysis and Management, February 19, 2013. Pages 246–270. <onlinelibrary.wiley.com>

Abstract:

Here we examine the empirical question of whether or not a school voucher program in Washington, DC, affected achievement or the rate of high school graduation for participating students. The District of Columbia Opportunity Scholarship Program (OSP) has operated in the nation’s capital since 2004, funded by a federal government appropriation. Because the program was oversubscribed in its early years of operation, and vouchers were awarded by lottery, we were able to use the “gold standard” evaluation method of a randomized experiment to determine what impacts the OSP had on student outcomes. Our analysis revealed compelling evidence that the DC voucher program had a positive impact on high school graduation rates, suggestive evidence that the program increased reading achievement, and no evidence that it affected math achievement.

Page 258: “Results are described as statistically significant or highly statistically significant if they reach the 95 percent or 99 percent confidence level, respectively.”

Page 260:

The attainment impact analysis revealed that the offer of an OSP scholarship raised students’ probability of graduating from high school by 12 percentage points (Table 3). The graduation rate was 82 percent for the treatment group compared to 70 percent for the control group. The impact of using a scholarship was an increase of 21 percentage points in the likelihood of graduating. The positive impact of the program on this important student outcome was highly statistically significant.

Page 261:

We observed no statistically significant evidence of impacts on graduation rates at the subgroup level for students who applied to the program from non-SINI [Schools in Need of Improvement] schools, with relatively lower levels of academic performance, and male students. For all subgroups, the graduation rates were higher among the treatment group compared with the control group, but the differences did not reach the level of at least marginal statistical significance for these three student subgroups. …

Our analysis indicated a marginally statistically significant positive overall impact of the program on reading achievement after at least four years. No significant impacts were observed in math. The reading test scores of the treatment group as a whole averaged 3.9 scale score points higher than the scores of students in the control group, equivalent to a gain of about 2.8 months of additional learning. The calculated impact of using a scholarship was a reading gain of 4.8 scale score points or 3.4 months of additional learning (Table 4).

Page 262:

Reading … Adjusted impact estimate [=] 4.75 … p-value of estimates [=] .06 …

The reading impacts appeared to cumulate over the first three years of the evaluation, reaching the marginal level of statistical significance after two years and the standard level after three years. By that third-year impact evaluation, only 85 of the 2,308 students in the evaluation (3.7 percent) had graded-out of the impact sample, having exceeded 12th grade. Between the third-year and final-year evaluation, an additional 211 students (12.2 percent) graded-out of the sample, reducing the final test score analytic sample to a subgroup of the original analytic sample. Due to this loss of cases for the final test score analysis, the confidence interval around the final point estimates is larger than it was after three years, and the positive impact of the program on reading achievement was only statistically significant at the marginal level.
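The passage's point that the confidence interval widened as students graded out follows from the usual 1/√n scaling of a standard error. A rough sketch using the sample counts quoted above (treating the analytic sample as a simple random sample, which is a simplification):

```python
import math

n_total = 2308
n_after_year3 = n_total - 85   # 85 students (3.7%) graded out by year 3
n_final = n_after_year3 - 211  # 211 more (12.2%) graded out by the final year

def interval_ratio(n_ref: int, n: int) -> float:
    """Relative confidence-interval width vs. a reference sample size."""
    return math.sqrt(n_ref / n)

# A ~9% smaller sample widens the interval by roughly 5%.
print(f"final-year interval ≈ {interval_ratio(n_after_year3, n_final):.2f}x the year-3 width")
```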

Page 266: “Here, in the form of the DC school voucher program, Congress and the Obama administration uncovered what appears to be one of the most effective urban dropout prevention programs yet witnessed.”

Page 267: “We did find evidence to suggest that scholarship use boosted student reading scores by the equivalent of about one month of additional learning per year. Most parents, especially in the inner city, would welcome such an improvement in their child’s performance.”

[595] Paper: “Better Schools, Less Crime?” By David J. Deming. Quarterly Journal of Economics, November 2011. Pages 2063–2115. <scholar.harvard.edu>

Page 2064:

In this article, I link a long and detailed panel of administrative data from Charlotte-Mecklenburg school district (CMS) to arrest and incarceration records from Mecklenburg County and the North Carolina Department of Corrections (NCDOC). In 2002, CMS implemented a district-wide open enrollment school choice plan. Slots at oversubscribed schools were allocated by random lottery. School choice in CMS was exceptionally broad-based.

Page 2065:

Across various schools and for both middle and high school students, I find consistent evidence that winning the lottery reduces adult crime.4 The effect is concentrated among African American males and youth who are at highest risk for criminal involvement. Across several different outcome measures and scalings of crime by severity, high-risk youth who win the lottery commit about 50% less crime. They are also more likely to remain enrolled and “on track” in school, and they show modest improvements on school-based behavioral outcomes such as absences and suspensions. However, there is no detectable impact on test scores for any youth in the sample.

Page 2070: “With over 150,000 students enrolled in the 2008–2009 school year, CMS is the 20th largest school district in the nation.”

Pages 2089–2090:

In Figure II, we see that winning the lottery leads to fewer felony arrests overall (p = .078), and the effect is concentrated among the highest risk youth (0.77 felony arrests for lottery losers, 0.43 for winners, p = .013). Similarly, the trimmed social cost of crime is lower overall for lottery winners (p = .040), but the effect is concentrated among the top risk quintile youth ($11,000 for losers, $6,389 for winners, p=.036). The concentration of effects in the top risk quintile is even more pronounced for the middle school sample. The social cost of arrested crimes is $12,500 for middle school lottery losers and $4,643 for winners (p = .020), and the effect for days incarcerated is similarly large and concentrated among high-risk youth (55.5 days for losers, 17.2 for winners, p = .003).

NOTE: Credit for bringing this paper to the attention of Just Facts belongs to Alex Adrianson of the Heritage Foundation. [Commentary: “School Choice a Crime Fighter.” By Alex Adrianson. InsiderOnline, March 2012.]

[596] “WWC Review of the Report ‘Better Schools, Less Crime?’ ” U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse, July 2013. <ies.ed.gov>

Page 2:

The research described in this report meets WWC [What Works Clearinghouse] evidence standards without reservations

Strengths: The intervention and comparison groups were formed by a well-implemented random process.

Cautions: The study had high levels of attrition for one outcome, the 2004 reading score. The study author demonstrated that students in the intervention and comparison groups were equivalent at baseline on reading achievement. Therefore, the analysis for this outcome meets WWC standards with reservations.

[597] Book: The Education Gap: Vouchers and Urban Schools (Revised edition). By William G. Howell and Paul E. Peterson with Patrick J. Wolf and David E. Campbell. Brookings Institution Press, February 14, 2006 (first published in 2002). <www.brookings.edu>

Rear cover:

William G. Howell is an associate professor in the Government Department at Harvard University and deputy director of the Program on Education Policy and Governance at Harvard. …

Paul E. Peterson is the Henry Lee Shattuck Professor of Government and director of the Program on Education Policy and Governance at Harvard, a senior fellow at the Hoover Institution, and editor-in-chief of Education Next.

Page 30:

No publicly funded voucher program offers all students within a political jurisdiction the opportunity to attend the private school of their choice. All are limited in size and scope, providing vouchers only to students who come from low-income families, who attend “failing” public schools, or who lack a public school in their community.

Page 194: “Most publicly funded voucher programs today are so small that they do little to enrich the existing educational market.”

Pages 196–197:

Most privately funded voucher programs operating today promise financial support for only three to four years.

In the short term, vouchers may yield some educational benefits to the low-income families that use them. But sweeping, systemic change will not materialize as long as small numbers of vouchers, worth small amounts of money, are offered to families for short periods of time. The claims of vouchers’ strongest advocates as well as those of the most ardent opponents, both of whom forecast all kinds of transformations, will be put to the test only if and when the politics of voucher programs stabilizes, support grows, and increasing numbers of educational entrepreneurs open new private schools.

[598] Article: “Spending in Nation’s Schools Falls Again, with Wide Variation Across States.” By Emma Brown. Washington Post, January 27, 2016. <www.washingtonpost.com>

“Per-pupil spending is ‘the gold standard in school finance,’ said Stephen Cornman, of the National Center for Education Statistics [NCES], which produced the analysis.”

NOTE: Cornman is a statistician who specializes in school finance. “Statistician … Administrative Data Division: Elementary and Secondary Branch, NCES … Specialties and Functions: School Finance and all Fiscal Surveys: National Public Education Finance Survey; School District Finance Survey” [Webpage: “Stephen Cornman.” U.S. Department of Education, National Center for Education Statistics. Accessed January 29, 2016 at <nces.ed.gov>]

[599] Book: Economics of Education. Edited by Dominic J. Brewer and Patrick J. McEwan. Academic Press (an imprint of Elsevier), 2010.

Chapter: “School Quality and Earnings.” By J. R. Betts (University of California, San Diego). Pages 52–59.

Page 52: “What factors contribute [to] high-quality schooling? A large literature studies the relation between class size, teacher qualifications, spending per pupil, and other measures of school inputs with gains in student achievement.”

Page 53: “The literature review below, unless stated otherwise, discusses the relatively large US literature. … The measure of school resources most typically used is school spending per pupil in the district attended, or in the worker’s state of birth.”

[600] Handbook of Research in Education Finance and Policy (2nd edition). Edited by Helen F. Ladd and Margaret E. Goertz. Routledge, 2015.

Chapter 15: “Measuring Equity and Adequacy in School Finance.” By Thomas A. Downes and Leanna Stiefel. Pages 244–259.

Page 244:

Over the past 45 years, researchers have devoted significant effort to developing ways to measure two important goals of state school finance systems: the promotion of equity and, more recently, the provision of adequacy. Equity, as the term is traditionally used in the school finance literature, is a relative concept that is based on comparisons of inputs (often aggregated into a per-pupil spending measure across school districts). Thus, an equitable finance system is one that reduces to a “reasonable level” the disparity in per-pupil spending across a state’s districts.

Page 245: “While the equity concepts are defined in terms of the treatment of individuals, school finance systems are designed for districts not individuals. Thus the concepts are translated from the individual to the district level by focusing on averages across groups of individuals.”

[601] Book: The Education Gap: Vouchers and Urban Schools (Revised edition). By William G. Howell and Paul E. Peterson with Patrick J. Wolf and David E. Campbell. Brookings Institution Press, 2006 (first published in 2002). <www.brookings.edu>

Pages 200–202:

Vouchers, by themselves, do not change the amount spent on public education, only the way it is distributed.

In fact, declining enrollments can actually help school systems that rely principally on local taxes because they secure the same amount of money to educate fewer students.

With state funding, the picture changes. For more than a century, almost all states have allocated most of their money to school districts on a per-pupil basis—apparently in the belief that the cost of schooling varies directly with the number of students being taught.

Fluctuations in student enrollments are not uncommon. Over the past few decades, changing birth rates have had marked consequences for public school enrollments across the nation. In 1971, nearly 51.3 million students were enrolled in elementary and secondary schools. A dozen years later, just under 45 million students attended public schools, a decline of more than 12 percent. …

… Although public schools lose state funding [when enrollment declines], they retain all of their local funding, and they have fewer students to teach. The net fiscal impact, therefore, need not cripple public schools. Indeed, it may actually prove salutary. Some simple math makes the point.

Assume that a district receives 45 percent of its funding from the local government, another 45 percent from the state, and 10 percent from the federal government. Next, assume that a voucher program is introduced and that 20 percent of public school students switch to private schools. The public schools automatically lose the state and federal aid that follows those students. Because the district retains all of its local funding while having fewer students to teach, however, per-pupil expenditures actually increase by roughly 11 percent.
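The "roughly 11 percent" figure in the passage above can be checked directly: hold local funding fixed, let state and federal aid follow the departing students, and divide the remaining budget by the smaller enrollment. A quick verification of the book's arithmetic:

```python
local, state, federal = 0.45, 0.45, 0.10  # shares of the original budget
leave_rate = 0.20                         # students switching to private schools

students_left = 1 - leave_rate
# Local funding is retained in full; state and federal aid is per-pupil
# and departs with the students who leave.
funding_left = local + (state + federal) * students_left

per_pupil_change = funding_left / students_left - 1
print(f"per-pupil spending change: {per_pupil_change:+.2%}")  # ≈ +11.25%
```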

[602] See the section above on the costs of public and private schools.

[603] Book: Economics of Education: Research and Studies. Edited by George Psacharopoulos. Pergamon Press, 1987.

Chapter: “Cost Analysis in Education.” By M. Woodhall. Pages 393–399.

Page 394:

The additional cost attributable to one extra student is called the marginal cost, or sometimes the incremental cost. It is measured by the increase in total costs which occurs as a result of increasing enrollment by one unit.

The relationship between average and marginal costs varies between different institutions and depends on the form of the cost function, that is, the relationship between cost and size. It is obvious that total costs will increase if the number of students enrolled in a school or other institution increases, but average and marginal costs may increase, decrease, or remain constant as the number of students changes. There are three possible ways in which average and marginal costs may change as a result of an increase in enrollment. The reason why average and marginal costs vary in different circumstances is that in schools, colleges, or other institutions some costs are fixed, while others are variable with respect to size or number of students. The way in which average and marginal costs change as the total number of students increases depends on whether the majority of costs are fixed or variable and whether all resources are fully utilized or whether there is any spare capacity, which would mean that the number of students could be increased without incurring additional fixed costs.

Whether costs are fixed or variable depends, of course, on the time scale. In the short run, the costs of teachers as well as buildings may be fixed, although the number of books, stationery, and other materials is variable with the number of students. In the long run, however, the number of teachers employed may be varied. The short-run marginal costs of education are therefore likely to be lower than the long-run marginal costs. The extra costs incurred when additional students are enrolled will also depend on the magnitude of the change involved. It may be impossible to measure the extra costs of enrolling one additional student, but perfectly possible to measure the additional costs of enrolling 50 or 100 students, or alternatively the marginal savings made by enrolling 50 or 100 fewer students. However, one recent study of the marginal costs of overseas students in the United Kingdom pointed out that:

clearly there is no such thing as a marginal cost or a marginal saving of overseas students: marginal costs and marginal savings are not discrete numbers but stepwise functions; the marginal costs of adding 100 students might be zero, whereas the marginal costs of adding 200 students might be considerable; moreover, the marginal costs of adding 1,000 students is not twice the marginal costs of adding 500 students. Similar remarks apply to the case of marginal savings. (Blaug 1981 p. 55)
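The fixed-versus-variable distinction Woodhall describes can be sketched as a stepwise cost function: capacity (say, classrooms) is a fixed cost that jumps in increments, while materials vary per student. The figures below are hypothetical, chosen only to illustrate why marginal cost is a stepwise function rather than a single number:

```python
import math

FIXED_PER_CLASSROOM = 30_000   # hypothetical cost of staffing one classroom
CLASSROOM_CAPACITY = 25        # hypothetical seats per classroom
VARIABLE_PER_STUDENT = 400     # hypothetical books and materials per student

def total_cost(n_students: int) -> int:
    classrooms = math.ceil(n_students / CLASSROOM_CAPACITY)
    return classrooms * FIXED_PER_CLASSROOM + n_students * VARIABLE_PER_STUDENT

# Marginal cost is stepwise: the student who forces a new classroom is
# expensive, while a student absorbed by spare capacity costs only materials.
print(total_cost(101) - total_cost(100))  # 30400: triggers a fifth classroom
print(total_cost(102) - total_cost(101))  # 400: spare capacity, variable cost only
```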

[604] Paper: “The Ripple Effect.” By David Figlio, Cassandra M. D. Hart, and Krzysztof Karbownik. Education Next, 2022. <www.educationnext.org>

What we really want to know is how market pressure affects the performance of local public schools over the long run. As a private-school choice program grows, how does increased competition affect educational outcomes for public-school students who don’t use scholarships or vouchers?

We examine these questions based on a rich dataset from the state of Florida, where a tax-credit scholarship program for low-income students has been operating since 2002. During that time, the number of participating students has grown sevenfold to nearly 110,000 as of 2017–18, or 4 percent of total K–12 school enrollment in the state. We construct an index of competitive pressure to measure the degree of market competition each student’s school faced prior to the program’s start. …

Instead, we find broad and growing benefits for students at local public schools as the school-choice program scales up. In particular, students who attend neighborhood schools with higher levels of market competition have lower rates of suspensions and absences and higher test scores in reading and math. And while our analysis reveals gains for virtually all students, we find that those most positively affected are students with the greatest barriers to school success, including those with low family incomes and less-educated mothers. …

We look at the program’s first 16 years, ending our analysis with the 2016–17 school year. Our data include students’ test scores, absences, and suspensions, as well as race, ethnicity, and whether they qualify for free or reduced-price school lunch. …

Our analysis finds consistent evidence that, as the scholarship program scaled up, academic and behavioral outcomes improved for students attending traditional public schools. More specifically, we find that students attending schools with more competitive pressure made larger gains as program enrollment grew statewide than did students at schools with less market competition. …

We also look at results according to the level of education of students’ mothers. As with income level, we find larger positive impacts among students whose mothers did not progress beyond high school compared to those whose mothers graduated from college. We then consider these factors in combination, along with other details such as whether Medicaid paid for the hospital bill at birth and the median income of the mother’s zip code at birth. We divide students into deciles based on their relative level of socioeconomic advantage to see whether the impacts of expanded competitive pressure differ along this spectrum of resources. While the effects are strongest for students in the bottom six deciles, students in every decile except the very top decile benefit from more competition. Notably, even students in the top decile do not suffer educational losses as a result of program expansion. …

We find consistent evidence that as more students use scholarships to attend private schools, students in public schools most likely to experience heightened competition due to the program see positive effects. Students at schools that face greater levels of market competition exhibited greater gains in reading and math tests compared to students attending public schools with less competitive pressure. …

We further find that program expansion and increased market pressure are associated with positive behavioral outcomes among non-scholarship students, which have not been well-explored in prior research on the effects of competition from voucher programs or charter schools.

Finally, we note that the public-school students who are most positively affected come from lower socioeconomic backgrounds, which is the set of students that schools would potentially lose to competing private schools under a scholarship or voucher program. However, in most cases smaller effects remain statistically significant, even for students who are very unlikely to qualify for scholarships themselves. This suggests that benefits may come partially through generalized school improvements rather than through improvements targeted solely at eligible students.

[605] Article: “Measuring Competitive Effects From School Voucher Programs: A Systematic Review.” By Anna J. Egalite. Journal of School Choice: International Research and Reform, December 2, 2013. Pages 443–464. <www.tandfonline.com>

Page 447: “This article reviews the complete set of studies that measure the effects of private school competition on public school students’ test scores as a result of school voucher or tax credit scholarship programs using one of the high-quality study designs described above or a similarly rigorous alternative specification.”

Page 449: “The third and final phase of this literature search took a systematic approach to ensure no studies had been overlooked.”

Page 452:

All but one of these 21 studies found neutral/positive or positive results. The only study to find no effects across all subjects was a 2006 study by Greene and Winters of the federal voucher program in Washington, DC. Although each choice program examined in this review takes place in a unique environment, the DC voucher program was exceptional because it was restricted to a relatively small number of participants in the year this study was conducted. Furthermore, a “hold-harmless” provision ensured that public schools were insulated from the financial loss from any students that transferred into private schools with a voucher. The absence of a positive competition effect is thus unsurprising, given these design features.

Page 460:

The strongest studies were those employing a regression discontinuity approach. This sophisticated quasi-experimental, empirical method should be used wherever possible … [because it] allows us to interpret estimates from this model as causal estimates of the competitive effects of a voucher program. Results from studies using this approach unanimously find positive impacts on student academic achievement.

[606] Article: “Competition Passes the Test: Vouchers Improve Public Schools in Florida.” By Marcus A. Winters and Jay P. Greene. Education Next, Summer 2004. Pages 66–71. <bit.ly>

Page 66:

The A+ program offers all the students in schools that chronically fail the Florida Comprehensive Assessment Test (FCAT) the opportunity to use a voucher to transfer to a private school. Schools face the threat of vouchers only if they are failing. They can remove the threat by improving their test scores. Comparing the performance of schools that were threatened with vouchers and the performance of those that faced no such threat gives a measure of how public schools respond to competition. …

Schools that receive a grade of F twice during any four-year period are deemed chronically failing. Their students then become eligible to receive vouchers, called opportunity scholarships, which they can use at another public school or at a private school. The vouchers are worth the lesser of per-pupil spending in the public schools or the cost of attending the chosen private school.

Page 67:

To analyze the program’s impact on public schools, we collected school-level test scores on the 2001–02 and 2002–03 administrations of the FCAT and the Stanford-9, a national norm-referenced test that is given to all Florida public school students around the same time as the FCAT. The results from the Stanford-9 are particularly useful for our analysis. Schools are not held accountable for their students’ performance on the Stanford-9. As a result, they have little incentive to manipulate the results by “teaching to the test” or through outright cheating. Thus, if gains are witnessed on both the FCAT and the Stanford-9, we can be reasonably confident that the gains reflect genuine improvements in student learning.

Page 68:

We compared the change in test-score performance for each of these groups relative to the rest of Florida public schools between the 2001–02 and 2002–03 administrations of the FCAT and the Stanford-9. …

Each of these results is statistically significant at a very high level, meaning that we can be highly confident that the test-score gains made by schools facing the actuality or prospect of voucher competition were larger than the gains made by other public schools.

[607] Webpage: “Top Organization Contributors.” Center for Responsive Politics. Accessed July 7, 2023 at <www.opensecrets.org>

Cycle: All cycles (1990–2022)

Rank 16

American Federation of Teachers

Total Contributions $127,218,032

To Dems & Liberals $126,596,375 (99.7%)

To Repubs & Conservs $371,005 (0.3%) …

Rank 20

National Education Assn

Total Contributions $108,586,479

To Dems & Liberals $104,350,943 (96.3%)

To Repubs & Conservs $4,014,467 (3.7%) …

Based on data relased [sic] by the FEC [U.S. Federal Election Commission] on March 20, 2023.

Totals on this page reflect donations from employees of the organization, its PAC and in some cases its own treasury. These totals include all campaign contributions to federal candidates, parties, political action committees (including super PACs), federal 527 organizations, and Carey committees. The totals do not include contributions to 501(c) organizations, whose political spending has increased markedly in recent cycles. Unlike other political organizations, they are not required to disclose the corporate and individual donors that make their spending possible. Only contributions to Democrats and Republicans or liberal and conservative outside groups are included in calculating the percentages the donor has given to either party. These figures do not include money transferred between affiliated organizations.

[608] Webpage: “Quick Answers to General Questions.” U.S. Federal Election Commission. Accessed September 22, 2015 at <bit.ly>

What Is a 527 Organization?

Entities organized under section 527 of the tax code are considered “political organizations,” defined generally as a party, committee or association that is organized and operated primarily for the purpose of influencing the selection, nomination or appointment of any individual to any federal, state or local public office, or office in a political organization. All political committees that register and file reports with the FEC [U.S. Federal Election Commission] are 527 organizations, but not all 527 organizations are required to file with the FEC. Some file reports with the Internal Revenue Service (IRS).

[609] Webpage: “Glossary: Hybrid PAC.” Federal Election Commission. Accessed April 18, 2019 at <www.fec.gov>

Hybrid PAC [political action committee] – A committee that, in addition to making contributions, establishes a separate bank account to deposit and withdraw funds raised in unlimited amounts from individuals, corporations, labor organizations and/or other political committees, consistent with the stipulated judgment in Carey v. FEC [U.S. Federal Election Commission]. The funds maintained in this separate account will not be used to make contributions, whether direct, in-kind or via coordinated communications, or coordinated expenditures, to federal candidates or committees.

[610] Book: Labor Relations in the Public Sector (5th edition). By Richard C. Kearney and Patrice M. Mareschal. CRC Press, 2014.

Page 40:

Public education is the largest public employer by far in state and local government—more than 5 million individuals work in public education. It is also the most expensive of all state and local services, consuming some $571 billion in expenditures.

Two organizations have dominated the union movement in education: the independent National Education Association (NEA) and the AFL–CIO [American Federation of Labor–Congress of Industrial Organizations]-affiliated American Federation of Teachers (AFT). The NEA, the largest labor union in the United States, claims a national membership of about 3 million active and retired members, but it lost more than 100,000 between 2010 and 2011, attributable to the Great Recession and its aftermath. The NEA operates with 14,000 local affiliates, with its primary strength in mid-sized cities and suburbs. Its largest state affiliate is the California Teachers Association, with a whopping 325,000 members. Eighty percent of its members are classroom teachers.

The AFT, with a membership roll of about 1.4 million, is concentrated in large cities such as New York, Boston, Chicago, Minneapolis, and Denver. The AFT has presented itself as an aggressive union seeking collective bargaining rights for teachers since its inception in 1919. In contrast, the NEA was born in 1857 as a professional organization open to both teachers and supervisory personnel. Even though many NEA locals function as unions today, and the national organization is officially labeled a “union” by the Bureau of Labor Statistics and the Internal Revenue Service, a large portion of the members are found in locals that do not engage in collective bargaining relationships.

[611] “Letter to the Democrats in the House and Senate on DC Vouchers.” By Dennis Van Roekel (President). National Education Association, March 05, 2009. <www.arizonaea.org>

“Opposition to vouchers is a top priority for NEA [National Education Association]. Throughout its history, NEA has strongly opposed any diversion of limited public funds to private schools. The more than 10,000 delegates who attend NEA’s national convention each year have consistently reaffirmed this position.”

[612] “2020 Democratic Party Platform.” Democratic National Committee, August 17, 2020. <www.presidency.ucsb.edu>

Charter schools were originally intended to be publicly funded schools with increased flexibility in program design and operations. Democrats believe that education is a public good and should not be saddled with a private profit motive, which is why we will ban for-profit private charter businesses from receiving federal funding. And we recognize the need for more stringent guardrails to ensure charter schools are good stewards of federal education funds. We support measures to increase accountability for charter schools, including by requiring all charter schools to meet the same standards of transparency as traditional public schools, including with regard to civil rights protections, racial equity, admissions practices, disciplinary procedures, and school finances. We will call for conditioning federal funding for new, expanded charter schools or for charter school renewals on a district’s review of whether the charter will systematically underserve the neediest students. And Democrats oppose private school vouchers and other policies that divert taxpayer-funded resources away from the public school system, including the program at issue in the recent Espinoza decision.

[613] “Resolution Regarding the Republican Party Platform.” Republican National Committee, August 22, 2020. <www.presidency.ucsb.edu>

WHEREAS, All platforms are snapshots of the historical contexts in which they are born, and parties abide by their policy priorities, rather than their political rhetoric …

RESOLVED, That the 2020 Republican National Convention will adjourn without adopting a new platform until the 2024 Republican National Convention.

[614] “2016 Republican Party Platform.” Republican National Committee, July 2016. <www.presidency.ucsb.edu>

We support options for learning, including home-schooling, career and technical education, private or parochial schools, magnet schools, charter schools, online learning, and early-college high schools. We especially support the innovative financing mechanisms that make options available to all children: education savings accounts (ESAs), vouchers, and tuition tax credits. Empowering families to access the learning environments that will best help their children to realize their full potential is one of the greatest civil rights challenges of our time.

[615] Constitution of the United States. Signed September 17, 1787. Enacted June 21, 1788. <www.justfacts.com>

Article II, Section 2, Clause 2: “[The President] with the Advice and Consent of the Senate, shall appoint Ambassadors, other public Ministers and Consuls, Judges of the supreme Court….”

[616] Constitution of the United States. Signed September 17, 1787. Enacted June 21, 1788. <justfacts.com>

Article III, Section 1: “The Judges, both of the supreme and inferior Courts, shall hold their Offices during good Behaviour….”

Article II, Section 4: “The President, Vice President and all civil Officers of the United States, shall be removed from Office on Impeachment for, and Conviction of, Treason, Bribery, or other high Crimes and Misdemeanors.”

Article I, Section 2, Clause 5: “The House of Representatives shall chuse their Speaker and other Officers; and shall have the sole Power of Impeachment.”

Article I, Section 3, Clause 6: “The Senate shall have the sole Power to try all Impeachments. … And no Person shall be convicted without the Concurrence of two thirds of the Members present.”

[617] Article: “Senate Republicans Deploy ‘Nuclear Option’ to Clear Path for Gorsuch.” By Matt Flegenheimer. New York Times, April 6, 2017. <www.nytimes.com>

After Democrats held together Thursday morning and filibustered President Trump’s nominee, Republicans voted to lower the threshold for advancing Supreme Court nominations from 60 votes to a simple majority. …

Senate Democrats in 2013 first changed the rules of the Senate to block Republican filibusters of presidential nominees to lower courts and to government positions. But they left the filibuster for Supreme Court nominees untouched, an acknowledgment of the court’s exalted status. On Thursday, that last pillar was swept away on a party-line vote, with all 52 Republicans choosing to overrule Senate precedent and all 48 Democrats and liberal-leaning independents pushing to keep it.

Democrats placed the blame squarely on Senator Mitch McConnell of Kentucky, the majority leader, and his Republican colleagues….

[618] Report: “Filibusters and Cloture in the Senate.” By Valerie Heitshusen and Richard S. Beth. Congressional Research Service, April 7, 2017. <digital.library.unt.edu>

Page 2 (of PDF):

The filibuster is widely viewed as one of the Senate’s most characteristic procedural features. Filibustering includes any use of dilatory or obstructive tactics to block a measure by preventing it from coming to a vote. The possibility of filibusters exists because Senate rules place few limits on Senators’ rights and opportunities in the legislative process. …

Senate Rule XXII, however, known as the cloture rule, enables Senators to end a filibuster on any debatable matter the Senate is considering. Sixteen Senators initiate this process by presenting a motion to end the debate. In most circumstances, the Senate does not vote on this cloture motion until the second day of session after the motion is made. Then, it requires the votes of at least three-fifths of all Senators (normally 60 votes) to invoke cloture. (Invoking cloture on a proposal to amend the Senate’s standing rules requires the support of two-thirds of the Senators present and voting, whereas cloture on nominations requires a numerical majority.)

Page 9:

Invoking cloture usually requires a three-fifths vote of the entire Senate—“three-fifths of the Senators duly chosen and sworn.” Thus, if there is no more than one vacancy, 60 Senators must vote to invoke cloture. In contrast, most other votes require only a simple majority (that is, 51%) of the Senators present and voting, assuming those Senators constitute a quorum. In the case of a cloture vote, the key is the number of Senators voting for cloture, not the number voting against. Failing to vote on a cloture motion has the same effect as voting against the motion: it deprives the motion of one of the 60 votes needed to agree to it.

There are two important exceptions to the three-fifths requirement to invoke cloture. First, under Rule XXII, an affirmative vote of two-thirds of the Senators present and voting is required to invoke cloture on a measure or motion to amend the Senate rules. This provision has its origin in the history of the cloture rule. Before 1975, two-thirds of the Senators present and voting (a quorum being present) was required for cloture on all matters. In early 1975, at the beginning of the 94th Congress, Senators sought to amend the rule to make it somewhat easier to invoke cloture. However, some Senators feared that if this effort succeeded, that would only make it easier to amend the rule again, making cloture still easier to invoke. As a compromise, the Senate agreed to move from two-thirds of the Senators present and voting (a maximum of 67 votes) to three-fifths of the Senators duly chosen and sworn (normally, and at a maximum, 60 votes) on all matters except future rules changes, including changes in the cloture rule itself.17 Second, pursuant to precedent established by the Senate on November 21, 2013, and April 6, 2017, the Senate can invoke cloture on a nomination by a majority of Senators voting (a quorum being present).18

[619] Webpage: “Rules of the Senate: Rule XXII: Precedence of Motions.” Accessed October 2, 2018 at <www.rules.senate.gov>

2. Notwithstanding the provisions of rule II or rule IV or any other rule of the Senate, at any time a motion signed by sixteen Senators, to bring to a close the debate upon any measure, motion, other matter pending before the Senate, or the unfinished business, is presented to the Senate, the Presiding Officer, or clerk at the direction of the Presiding Officer, shall at once state the motion to the Senate, and one hour after the Senate meets on the following calendar day but one, he shall lay the motion before the Senate and direct that the clerk call the roll, and upon the ascertainment that a quorum is present, the Presiding Officer shall, without debate, submit to the Senate by a yea-and-nay vote the question:

“Is it the sense of the Senate that the debate shall be brought to a close?” And if that question shall be decided in the affirmative by three-fifths of the Senators duly chosen and sworn—except on a measure or motion to amend the Senate rules, in which case the necessary affirmative vote shall be two-thirds of the Senators present and voting—then said measure, motion, or other matter pending before the Senate, or the unfinished business, shall be the unfinished business to the exclusion of all other business until disposed of.

Thereafter no Senator shall be entitled to speak in all more than one hour on the measure, motion, or other matter pending before the Senate, or the unfinished business, the amendments thereto, and motions affecting the same, and it shall be the duty of the Presiding Officer to keep the time of each Senator who speaks. Except by unanimous consent, no amendment shall be proposed after the vote to bring the debate to a close, unless it had been submitted in writing to the Journal Clerk by 1 o’clock p.m. on the day following the filing of the cloture motion if an amendment in the first degree, and unless it had been so submitted at least one hour prior to the beginning of the cloture vote if an amendment in the second degree. No dilatory motion, or dilatory amendment, or amendment not germane shall be in order. Points of order, including questions of relevancy, and appeals from the decision of the Presiding Officer, shall be decided without debate.

After no more than thirty hours of consideration of the measure, motion, or other matter on which cloture has been invoked, the Senate shall proceed, without any further debate on any question, to vote on the final disposition thereof to the exclusion of all amendments not then actually pending before the Senate at that time and to the exclusion of all motions, except a motion to table, or to reconsider and one quorum call on demand to establish the presence of a quorum (and motions required to establish a quorum) immediately before the final vote begins. The thirty hours may be increased by the adoption of a motion, decided without debate, by a three-fifths affirmative vote of the Senators duly chosen and sworn, and any such time thus agreed upon shall be equally divided between and controlled by the Majority and Minority Leaders or their designees. However, only one motion to extend time, specified above, may be made in any one calendar day.

[620] Article: “In Landmark Vote, Senate Limits Use of the Filibuster.” By Jeremy W. Peters. New York Times, November 21, 2013. <www.nytimes.com>

The Senate approved the most fundamental alteration of its rules in more than a generation on Thursday, ending the minority party’s ability to filibuster most presidential nominees….

… But when the vote was called, Senator Harry Reid, the majority leader who was initially reluctant to force the issue, prevailed 52 to 48.

Under the change, the Senate will be able to cut off debate on executive and judicial branch nominees with a simple majority rather than rounding up a supermajority of 60 votes. The new precedent established by the Senate on Thursday does not apply to Supreme Court nominations or legislation itself. …

Only three Democrats voted against the measure.

The changes will apply to all 1,183 executive branch nominations that require Senate confirmation—not just cabinet positions but hundreds of high- and midlevel federal agency jobs and government board seats.

[621] Article: “Senate Republicans Deploy ‘Nuclear Option’ to Clear Path for Gorsuch.” By Matt Flegenheimer. New York Times, April 6, 2017. <www.nytimes.com>

After Democrats held together Thursday morning and filibustered President Trump’s nominee, Republicans voted to lower the threshold for advancing Supreme Court nominations from 60 votes to a simple majority. …

Senate Democrats in 2013 first changed the rules of the Senate to block Republican filibusters of presidential nominees to lower courts and to government positions. But they left the filibuster for Supreme Court nominees untouched, an acknowledgment of the court’s exalted status. On Thursday, that last pillar was swept away on a party-line vote, with all 52 Republicans choosing to overrule Senate precedent and all 48 Democrats and liberal-leaning independents pushing to keep it.

Democrats placed the blame squarely on Senator Mitch McConnell of Kentucky, the majority leader, and his Republican colleagues….

[622] Constitution of the United States. Signed September 17, 1787. Enacted June 21, 1788. <www.justfacts.com>

Article III, Section 1: “The Judges, both of the supreme and inferior Courts, shall hold their Offices during good Behaviour….”

Article II, Section 4: “The President, Vice President and all civil Officers of the United States, shall be removed from Office on Impeachment for, and Conviction of, Treason, Bribery, or other high Crimes and Misdemeanors.”

Article I, Section 2, Clause 5: “The House of Representatives shall chuse their Speaker and other Officers; and shall have the sole Power of Impeachment.”

Article I, Section 3, Clause 6: “The Senate shall have the sole Power to try all Impeachments. … And no Person shall be convicted without the Concurrence of two thirds of the Members present.”

[623] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Syllabus:

Ohio’s Pilot Project Scholarship Program gives educational choices to families in any Ohio school district that is under state control pursuant to a federal-court order. The program provides tuition aid for certain students in the Cleveland City School District, the only covered district, to attend participating public or private schools of their parent’s choosing and tutorial aid for students who choose to remain enrolled in public school. Both religious and nonreligious schools in the district may participate, as may public schools in adjacent school districts. Tuition aid is distributed to parents according to financial need, and where the aid is spent depends solely upon where parents choose to enroll their children.

NOTE: Rehnquist was appointed by Nixon. O’Connor, Scalia, and Kennedy were appointed by Reagan. Thomas and Souter were appointed by G.H.W. Bush. Stevens was appointed by Ford. Ginsburg and Breyer were appointed by Clinton. [Webpage: “Justices 1789 to Present.” U.S. Supreme Court. Accessed March 7, 2022 at <www.supremecourt.gov>]

[624] Article: “The 2015 EdNext Poll on School Reform: Public Thinking on Testing, Opt Out, Common Core, Unions, and More.” By Michael B. Henderson, Paul E. Peterson, and Martin R. West. Education Next, Winter 2016. <educationnext.org>

These are among the many findings to emerge from the ninth annual Education Next survey, administered in May and June 2015 to a nationally representative sample of some 4,000 respondents, including oversamples of roughly 700 teachers, 700 African Americans, and 700 Hispanics (see methodology sidebar). …

The results presented here are based upon a nationally representative, stratified sample of adults (age 18 years and older) and representative oversamples of the following subgroups: teachers (693), African Americans (661), and Hispanics (734). Total sample size is 4,083. Respondents could elect to complete the survey in English or Spanish. Survey weights were employed to account for nonresponse and the oversampling of specific groups. …

The survey was conducted from May 21 to June 8, 2015, by the polling firm Knowledge Networks (KN), a GfK company. KN maintains a nationally representative panel of adults, obtained via address-based sampling techniques, who agree to participate in a limited number of online surveys.

NOTE: For facts about what constitutes a scientific survey and the factors that impact their accuracy, visit Just Facts’ research on Deconstructing Polls & Surveys.

[625] Poll: “Policy and Governance Survey 2015.” Commissioned by Education Next and the Program on Education Policy and Governance at the Harvard Kennedy School of Government. Conducted by Knowledge Networks during May–June 2015. <bit.ly>

Page 17:

21a. A proposal has been made that would give all families with children in public schools a wider choice, by allowing them to enroll their children in private schools instead, with government helping to pay the tuition. Would you favor or oppose this proposal?

                             Public   Parents   Teachers   Blacks   Hispanics   Whites
Completely Support            19%      26%       16%       27%       26%        17%
Somewhat Support              27       25        19        31        39         23
Somewhat Oppose               19       16        22        12         8         22
Completely Oppose             17       14        35         6         5         21
Neither Support nor Oppose    18       20         7        24        23         17

Page 18:

23. Thinking about the school-age children who currently live with you, what kinds of schools have they attended?

                             Public   Parents   Teachers   Blacks   Hispanics   Whites
Traditional public school     85%      85%       84%       91%       85%        83%
Charter school                 9        9        10        10        11          8
Private school                14       14        22        14         8         18
Home school                    6        6         8         7         4          7

[626] Press release: “Public Accountability & Private-School Choice.” By Adam Emerson. Thomas B. Fordham Institute, January 8, 2014. <fordhaminstitute.org>

The Fordham Institute supports school choice, done right. That means designing voucher and tax-credit policies that provide an array of high-quality education options for kids that are also accountable to parents and taxpayers. In that vein, Fordham has created the Public accountability & private-school choice toolkit to help with the design of strong outcomes-based accountability in private-school-choice programs.

[627] Report: “Where Do Public School Teachers Send Their Kids to School?” By Denis P. Doyle, Brian Diepold, and David A. DeSchryver. Thomas B. Fordham Institute, September 7, 2004. <edex.s3-us-west-2.amazonaws.com>

Page 2:

Across the states, 12.2 percent of all families (urban, rural, and suburban) send their children to private schools—a figure that roughly corresponds to perennial and well-known data on the proportion of U.S. children enrolled in private schools. But urban public school teachers send their children to private schools at a rate of 21.5 percent, nearly double the national rate of private-school attendance. Urban public school teachers are also more likely to send their children to private school than are urban families in general (21.5 vs. 17.5 percent).

Pages 6–7:

As commanded by the Constitution, every 10 years, the U.S. government undertakes a Census of the population. The 5 percent PUMS (Public Use Microdata Sample) data set is available in two media.

Public Use Microdata Areas (PUMAs) are the smallest unit within the PUMS. The PUMS data set provides access to answers from several hundred questions asked on the Census Long Form questionnaire. Because the Census Bureau must protect the privacy of individuals in the sample, one method is publishing the PUMS data in PUMAs, which are areas no smaller than 100,000 in population. …

The cities referenced in this paper are defined in two ways. This was necessary because the PUMAs do not seem to equal the political boundaries of cities, with a few exceptions such as Washington, D.C. The Census has defined each PUMA as belonging to a Metropolitan Statistical Area (MSA). The Census has also defined each PUMA as located in the central city if the entire PUMA is within the central city boundaries; located outside the central city if the entire PUMA is outside the city boundaries; or “unknown” for PUMAs that include areas inside and outside city boundaries. Subsequently, each household within a PUMA is assigned the same MSA and the same central city status. From this information, we are able to define the cities based on the MSA and the central city status.

[628] Article: “Biden Woos Teachers Union, Slams GOP.” By Brian Slodysko. Chicago Tribune, July 3, 2011. <articles.chicagotribune.com>

Biden tied current battles over public workers’ collective bargaining rights to teacher-specific issues such as smaller classes, private school voucher programs, and reduced benefits and wages.

Throughout the speech the message was clear, intent on boosting solidarity between Democrats and their traditional allies.

[629] Press release: “Joseph R. Biden ’61 Becomes First Auk Elected as President of the United States.” Archmere Academy, November 12, 2020. <www.archmereacademy.com>

Archmere Academy cultivates empathetic leaders who are prepared for every good work, and now the first alumnus of the school has been elected to this country’s highest office. Joseph Robinette Biden Jr. ’61 has been elected the 46th President of the United States of America. …

Three of Biden’s children attended Archmere, Hunter ‘88, Ashley ‘99, and the late Joseph R. (Beau) Biden III ’87. …

Archmere Academy is a private, Catholic, college preparatory co-educational academy, grades 9–12 founded in 1932 by the Norbertine Fathers.

[630] Webpage: “Joe Biden.” Office of the President-Elect. Accessed September 18, 2015. <bit.ly>

“Joseph Robinette Biden Jr., age 66, was born in Scranton, Pennsylvania, on November 20, 1942, to Joseph Sr. and Jean Biden, the oldest of four. In 1953, the Biden family moved from Pennsylvania to Claymont, Delaware. Biden attended parochial school at St. Helena’s School in Wilmington and the Archmere Academy in Claymont†.”

† NOTE: “Archmere Academy is a private, Catholic, college preparatory co-educational academy, grades 9–12 founded in 1932 by the Norbertine Fathers.” [Webpage: “About Us.” Archmere Academy. Accessed September 18, 2015 at <www.archmereacademy.com>]

[631] Transcript: “CNN/YouTube Democratic Presidential Debate (Part I).” CNN, July 24, 2007. <www.cnn.com>

BIDEN: My kids did go to private schools, because right after I got elected, my wife and daughter were killed. I had two sons who survived. My sister was the head of the history department. She was helping me raise my children at Wilmington Friends School.

BIDEN: When it came time to go to high school when they had come through their difficulties—I’m a practicing Catholic—it was very important to me they go to a Catholic school, and they went to a Catholic school.

My kids would not have gone to that school were it not for the fact that my wife and daughter were killed and my two children were under the care of my sister who drove them to school every morning.

[632] Article: “Obama Girls to Go to Sidwell.” Seattle Times, November 22, 2008. <www.seattletimes.com>

“President-elect Obama and his wife have chosen Sidwell Friends School for their daughters…. She also said that Sasha and Malia had become good friends with Vice President-elect Joseph Biden’s grandchildren, who go to the school.”

[633] Transcript: “Full Interview Between President Obama and Bill O’Reilly.” Fox News, February 3, 2014. <www.foxnews.com>

O’Reilly: The secret to getting a je—good job is education. And in these chaotic families, the children aren’t well-educated because it isn’t—it isn’t, um, encouraged at home as much as it is in other precincts. Now, school vouchers is a way to level the playing field. Why do you oppose school vouchers when it would give poor people a chance to go to better schools?

Obama: Actually—every study that’s been done on school vouchers, Bill, says that it has very limited impact if any—

O’Reilly: Try it.

OBAMA: On—it has been tried, it’s been tried in Milwaukee, it’s been tried right here in DC—

O’Reilly: [OVERLAP] And it worked here.

Obama: No, actually it didn’t. When you end up taking a look at it, it didn’t actually make that much of a difference. So what we have been supportive of is, uh, something called charters. Which, within the public school system gives the opportunity for creative experiments by teachers, by principals to-to start schools that have a different approach. And—

O’Reilly: [OVERLAP] You would revisit that? I—I just think—I used be, teach in a Catholic school, a—and I just know—

Obama: [OVERLAP] Bill—you know, I—I’ve taken, I’ve taken—I’ve taken a look at it. As a general proposition, vouchers has not significantly improved the performance of kids that are in these poorest communities—

O’Reilly: [OVERLAP] [INAUDIBLE]

Obama: Some charters—some charters are doing great. Some Catholic schools do a great job, but what we have to do is make sure every child….

[634] Article: “Hawaii Prep School Gave Obama Window to Success.” By Martin Kaste. National Public Radio, October 12, 2012. <www.npr.org>

“Punahou School was founded by missionaries in 1841…. Punahou occupies a privileged position, not just on the hillside, but in Hawaii society. In his memoir, Dreams From My Father, Barack Obama recalled how his grandfather pulled strings to get him in.”

[635] Commentary: “Education Secretary Duncan’s Children to Go to Chicago Private School He Attended.” By Valerie Strauss. Washington Post, July 9, 2015. <www.washingtonpost.com>

“President Obama’s two daughters attended the [private, prestigious University of Chicago Laboratory] school before moving to Washington in 2009.”

[636] Article: “Obama Girls to Go to Sidwell.” Seattle Times, November 22, 2008. <www.seattletimes.com>

“President-elect Obama and his wife have chosen Sidwell Friends School for their daughters, opting for a private institution that another White House child, Chelsea Clinton, attended.”

[637] Article: “Obama: D.C. Schools Don’t Measure Up to His Daughters’ Private School.” By Nick Anderson. Washington Post, September 27, 2010. <www.washingtonpost.com>

Obama made his comments on NBC’s “Today” show in response to a woman who asked whether Malia and Sasha Obama “would get the same kind of education at a D.C. public school” that they would get at the D.C. private school that has educated generations of the city’s elite.

“I’ll be blunt with you: The answer is no, right now,” Obama said. D.C. public schools “are struggling,” he said, but they “have made some important strides over the last several years to move in the direction of reform. There are some terrific individual schools in the D.C. system.”

[638] Article: “Clinton Urges ‘No’ Vote on School Voucher Initiative: Election: He Says the Plan Would Seriously Hurt Public Education. Backers of Prop. 174 Note That the President’s Daughter Attends a Private School.” By David Lauter. Los Angeles Times, October 5, 1993. <bit.ly>

President Clinton waded into the midst of one of the state’s most controversial political issues Monday, urging Californians to defeat Proposition 174—the school voucher initiative on the November ballot. …

“Wouldn’t it be ironic if, at the very moment we’re finally trying to raise standards” for public schools, the government would “turn around and start sending tax money to private schools that didn’t have to meet any standards at all?” Clinton said in a speech to the AFL–CIO [American Federation of Labor–Congress of Industrial Organizations] convention here.

“The people will regret this if they pass it,” Clinton said. “If I were a citizen of the state of California, I would not vote for Proposition 174.”

[639] Article: “School House to White House: The Education of the Presidents.” Prologue Magazine, Spring 2007. <www.archives.gov>

“William J. Clinton attended both private and public schools growing up in Arkansas and graduated from Hot Springs High School in 1964.”

[640] Article: “Obama Girls to Go to Sidwell.” Seattle Times, November 22, 2008. <www.seattletimes.com>

“President-elect Obama and his wife have chosen Sidwell Friends School for their daughters, opting for a private institution that another White House child, Chelsea Clinton, attended.”

[641] Position paper: “A Great Public School Education for Every Student.” By Elizabeth Warren. Medium, October 21, 2019. <medium.com>

Combating the Privatization and Corruption of Our Public Schools

To keep our traditional public school systems strong, we must resist efforts to divert public funds out of traditional public schools. Efforts to expand the footprint of charter schools, often without even ensuring that charters are subject to the same transparency requirements and safeguards as traditional public schools, strain the resources of school districts and leave students behind, primarily students of color. …

End federal funding for the expansion of charter schools: The Federal Charter School Program (CSP), a series of federal grants established to promote new charter schools, has been an abject failure. …

Ban for-profit charter schools: Our public schools should benefit students, not the financial or ideological interests of wealthy patrons like the DeVos and Walton families. I will fight to ban for-profit charter schools and charter schools that outsource their operations to for-profit companies.

[642] Article: “Finally, Democratic Candidates Talk About Education in a Debate, But Nobody Raised This Key Issue.” By Valerie Strauss. Washington Post, September 15, 2019. <www.washingtonpost.com>

[Quoting Elizabeth Warren:] “You know, I think I’m the only person on this stage who has been a public school teacher. … I had wanted to be a public school teacher since I was in second grade. And let’s be clear in all the ways we talk about this, money for public schools should stay in public schools, not go anywhere else.”

[643] Video: “Warren Denies Sending Son to Private School.” Washington Free Beacon, November 22, 2019. <www.youtube.com>

Time marker 0:01:

Sarah Carpenter: We want, we just, we’re going to have the same choice that you had for your kids because I read that your children went to private school.

Warren: No, my children went to public schools.

[644] Article: “Minority Voters Chafe as Democratic Candidates Abandon Charter Schools.” By Erica L. Green and Eliza Shapiro. New York Times, November 26, 2019. Updated 11/27/19. <www.nytimes.com>

Senators Elizabeth Warren of Massachusetts and Bernie Sanders of Vermont, the two leading liberals, have vowed to curb charter school growth if elected. …

The Freedom Coalition for Charter Schools, started by Howard Fuller, a longtime school-choice activist and former Milwaukee schools superintendent, was formed in July after Mr. Sanders announced his plan. It gained momentum after Ms. Warren began signaling her skepticism of federal funding for charter schools. …

The candidate vowed to review her education plan to make sure she “got it right.” But the exchange ignited another controversy when Ms. Warren told an activist that her children had attended public schools. Her campaign clarified in a statement that although her daughter attended public school, her son completed the majority of his education in private school.

[645] Article: “Warren’s Son Attended One of Nation’s Most Elite Private Schools.” By Collin Anderson. Washington Free Beacon, December 2, 2019. <freebeacon.com>

While Warren’s daughter, Amelia Warren Tyagi, attended public schools for the entirety of her elementary and high school education, her son, Alex Warren, spent the majority of his formative years at one of the country’s most elite private schools, the Haverford School, according to yearbooks obtained by the Washington Free Beacon. …

Alex Warren attended the school for six years, from 1988—when he began as a sixth grader—until his graduation in 1994, the yearbooks show. He spent his junior year in Boston when Warren accepted an invitation to teach at Harvard Law School for a year in 1992. In her book A Fighting Chance, Warren wrote that her son “took the opportunity to reinvent himself at a new high school.”

[646] Book: The Two-Income Trap: Why Middle-Class Parents Are Going Broke. By Elizabeth Warren and Amelia Warren Tyagi. Basic Books, 2003.

Page 34:

It is time to sound the alarm that the crisis in education is not only a crisis of reading and arithmetic; it is also a crisis in middle-class family economics. At the core of the problem is the time-honored rule that where you live dictates where you go to school. Any policy that loosens the ironclad relationship between location-location-location and school-school-school would eliminate the need for parents to pay an inflated price for a home just because it happens to lie within the boundaries of a desirable school district.

A well-designed voucher program would fit the bill neatly. A taxpayer-funded voucher that paid the entire cost of educating a child (not just a partial subsidy) would open a range of opportunities to all children. With fully funded vouchers, parents of all income levels could send their children—and the accompanying financial support—to the schools of their choice. Middle-class parents who used state funds to send their kids to school would be able to live in the neighborhood of their choice—or the neighborhood of their pocketbook. Fully funded vouchers would relieve parents from the terrible choice of leaving their kids in lousy schools or bankrupting themselves to escape those schools.

[647] Webpage: “Questionnaire Responses by John Fetterman—Candidate for Lt. Governor.” Reclaim Philadelphia, January 24, 2018. <medium.com>

9. Approximately 33% of Philadelphia students are enrolled in charter schools. What is your position on the expansion of charter schools? Should they be given public money via vouchers or similar programs (like ESAs [Education Savings Accounts])? No, I do not support expansion of publicly funded charter schools. There is very poor and inconsistent systems of accountability for charters, especially cyber-charters. Local control needs to be reinstated. Taking money away from public schools and putting them into charter and private schools does not solve anything.

[648] Press release: “John Fetterman Secures NEA Recommendation for U.S. Senate.” Pennsylvania State Education Association, June 22, 2022. <www.psea.org>

With strong support from the Pennsylvania State Education Association (PSEA), the National Education Association (NEA) has recommended Lt. Gov. John Fetterman in Pennsylvania’s race for the U.S. Senate.

PSEA President Rich Askey commended Fetterman for his leadership on education issues in Pennsylvania. Askey pointed out that Fetterman believes that addressing the educator and support professional shortage is a priority, supports reducing the number of federally mandated standardized tests, will advocate for broad-based student debt relief, and opposes tuition voucher programs.

[649] Article: “What Happens to Braddock if John Fetterman Becomes Pa.’s Lieutenant Governor?” By Colin Deppen. The Incline, May 21, 2018. <www.ydr.com>

“When I took office, we didn’t have a single functioning playground or community center or quality youth program,” Fetterman told The Incline. “We were not a safe community. We didn’t have a Free Store or 412 Food Rescue program [both founded by wife Gisele] that eliminated food insecurity in our community. …

Fetterman bristles a bit at this point, saying he’s lived in Braddock, fallen in love with it, raised a family here—his children attend a private school in Pittsburgh—and advocated for it relentlessly. It stings to hear someone question his motives.

[650] Facebook post: “Congrats to WT Parent Giselle Fetterman.” Winchester Thurston School, August 31, 2015. <www.facebook.com>

“Congrats to WT parent Giselle Fetterman and WT alumna parent Majestic Lane for being named to Pittsburgh’s 40 under 40 list for their commitment to shaping our region and making it a better place for everyone to live, work, and play.”

[651] Facebook post: “WT Mom Giselle Barreto Fetterman.” Winchester Thurston School, March 3, 2021. <www.facebook.com>

This winter, WT’s Programs Team combined forces with fiber artist and WT Mom Gisele Barreto Fetterman for a Virtual Extreme Punto De Cruz Workshop.

This unique and fun family workshop allowed children and adults to connect, engage, and learn together, and resulted in amazing and colorful works of art.

Special thanks to Gisele for her willingness to share her time, passion, and talent with WT families!

[652] Article: “Secretary Duncan Wants D.C. Kids to Keep Vouchers.” USA Today, March 4, 2009. <usatoday30.usatoday.com>

Duncan opposes vouchers, he said in an interview with The Associated Press. But he said Washington is a special case, and kids already in private schools on the public dime should be allowed to continue. … “I don’t think vouchers ultimately are the answer,” Duncan said.

[653] Commentary: “Education Secretary Duncan’s Children to Go to Chicago Private School He Attended.” By Valerie Strauss. Washington Post, July 9, 2015. <www.washingtonpost.com>

“Education Secretary Arne Duncan grew up in Chicago and attended the private, prestigious University of Chicago Laboratory Schools. … And in the fall, Duncan’s children will be attending Lab, too, while his wife works there.”

[654] Interview: Rahm Emanuel on “Public Affairs,” January 10, 2002. <www.youtube.com>

Time marker 25:40:

Jeff Berkowitz: You know the State Board of Education has said that one out of every two schools in the city of Chicago is still a failing school. … Why not say to those people who’ve got kids in a failing school district: “We want to give you some choice. We’re spending about $7,000 per person in grade school. Here, we’ll make it simple for you.” And take that money … just like your parents at one point made a choice [and allow them to use it at a] private school … The main thing is that it gives people a way out of failing public schools. I want to know, because I see you have the endorsement of the National Education Association—they don’t believe in the kind of choice I just said—does Rahm Emanuel believe in that choice?

Rahm Emanuel: I don’t believe in vouchers. I don’t think they’re the right solution. I don’t believe in abandoning public education. … My parents never asked for state-sponsorship of our private education, which is religious education as well.

[655] Article: “Rahm Emanuel, Obama’s Pick for Chief of Staff, Is Tough, Direct and Wedded to His Jewish Roots.” Jewish Journal, November 6, 2008. <jewishjournal.com>

“When his family lived in Chicago, he attended Bernard Zell Anshe Emet Day School, a Jewish day school. After his family moved to Wilmette, he attended public school: Romona School, Wilmette Junior High School, and New Trier High School.”

[656] Webpage: “About Us.” Bernard Zell Anshe Emet Day School. Accessed October 2, 2018 at <www.bernardzell.org>

Bernard Zell Anshe Emet Day School is an independent Jewish day school for the 21st century. From Early Childhood through Middle School, we inspire our students to love learning through innovative teaching, hands-on exploration and discovery. Plus the individualized attention your children will receive is unrivaled; we offer a 5:1 student-teacher ratio—the very lowest in the city.

[657] Article: “Emanuel Sending Kids to Private School.” By Kristen Mack. Chicago Tribune, July 22, 2011. <www.chicagotribune.com>

Mayor Rahm Emanuel will bypass Chicago Public Schools, like many high-profile politicians before him, and send his children to the University of Chicago Laboratory Schools in Hyde Park this fall. …

Emanuel’s children previously attended a private religious school in Chicago before moving to Washington while he served as President Obama’s White House chief of staff. They attended private schools in Washington and finished the school year there.

[658] Article: “Steyer Slams Private Education, Sent Kids to Pricey Prep School.” By Charles Fain Lehman. Washington Free Beacon, January 28, 2020. <freebeacon.com>

During a Tuesday morning appearance on C-SPAN’s Washington Journal, Steyer told a caller he believes that “quality public education from universal preschool through college” is “an absolute obligation of the government and a right for every American” and that “having a quality public education is the only way we can have justice and mobility in our society.” Steyer has promised to put government money where his mouth is, telling the Washington Post that he “does not support using public money for private or religious education.” …

Records indicate that Steyer’s four children—all of whom are now adults—attended San Francisco University High School as adolescents.

[659] Article: “Hedge Fund Chief Takes Major Role in Philanthropy.” By Stephanie Strom. New York Times, September 16, 2011. <www.nytimes.com>

“Until last year, Tom Steyer was just another billionaire hedge fund manager intent on keeping a low profile. … Just 13 months apart in age, the Steyer brothers [Tom and Jim] … shared a bedroom in an apartment on New York’s Upper East Side, attending the Buckley School, then Phillips Exeter Academy and Yale….”

[660] Article: “Where Democrats Stand: Education.” Washington Post, 2020. <www.washingtonpost.com>

“Tom Steyer (Dropped out), Billionaire activist … Steyer is no longer running for president. Steyer does not support using public money for private or religious education, he told the Post.”

[661] Webpage: “Oppose Taxpayer Subsidies for Religious Schools.” American Civil Liberties Union. Accessed September 23, 2015 at <www.aclu.org>

School Voucher Schemes Would Force All Taxpayers to Support Religious Beliefs and Practices with Which They May Strongly Disagree.

At the heart of these proposals are the goals of commingling church and state in the classroom and diverting public funds to private and parochial schools. Religious schools represent 85 percent of the total private school enrollment in the United States. These schools by their nature combine proselytization with education and therefore should not be funded by taxpayer dollars.

[662] Webpage: “Public Funding for Abortion.” American Civil Liberties Union. Accessed January 25, 2017 at <bit.ly>

What About Those Who Are Morally or Religiously Opposed to Abortion?

Our tax dollars fund many programs that individual people oppose. For example, those who oppose war on moral or religious grounds pay taxes that are applied to military programs. The congressional bans on abortion funding impose a particular religious or moral viewpoint on those women who rely on government-funded health care. Providing funding for abortion does not encourage or compel women to have abortions, but denying funding compels many women to carry their pregnancies to term. Nondiscriminatory funding would simply place the profoundly personal decision about how to treat a pregnancy back where it belongs—in the hands of the woman who must live with the consequences of that decision.

[663] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvi:

In the lower-left cell of Table 1.1 are traditional public schools, which are government-funded and government-operated. Students within their boundaries are normally assigned to them, and they represent by far the largest number of American schools. In school choice research and policy deliberations, such traditional public schools, also called “neighborhood schools,” are often compared to choice schools such as charter and private schools, which may be near to or far from a student’s home. (It may be misleading, however, to distinguish traditional public schools as “unchosen.” Some parents choose to live near excellent public schools and thereby choose their children’s schools by residential location. Thus, the terms of school choice nomenclature should be carefully and explicitly defined and analyzed.)

[664] Book: The Education Gap: Vouchers and Urban Schools (Revised edition). By William G. Howell and Paul E. Peterson with Patrick J. Wolf and David E. Campbell. Brookings Institution Press, 2006 (first published in 2002). <www.brookings.edu>

Page 23:

Low-income families do not have the earning power to buy into a neighborhood with high-quality schools. Quite the opposite—they often can afford a home or apartment only because it is located in a poorer neighborhood with lower-quality schools. Ironically, when a neighborhood serving a low-income community improves, land values rise and poor families often are displaced.

Page 57: “By requiring students to live within a specific district, school boards have created a system that is highly stratified by class and race.4 Families that are able and willing to pay what the housing market demands can buy good schools, while those that lack sufficient resources are consigned to poorer ones.”

Page 187: “Because students are assigned to schools on the basis of where they live, public schools inherit all of the racial inequalities that plague housing markets.”

[665] 21st Century Geography: A Reference Handbook (Volume 2). Edited by Joseph P. Stoltman. Sage Publications, 2012.

Page 515:

It is not just in developing countries that marginalized groups find themselves living in hazardous situations. In the United States, for instance, low-lying areas in and around New Orleans are occupied primarily by economically depressed populations with limited access to resources, meaning that such individuals have restricted choices on where they can live and are thus channeled into high risk areas.

[666] Report: “Affording a House in a Highly Ranked School Zone? It’s Elementary.” By Tommy Unger. Redfin, September 25, 2013. Updated October 6, 2020. <www.redfin.com>

Redfin took a look at homes on Multiple Listing Services (MLS), databases used by real estate brokers, that sold between May 1 and July 31, 2013 to calculate median sale price and price per square foot of homes within school zones. … The percentile rankings are based on test scores for each of the schools in this report. School and home coverage consisted of 10,811 elementary school zones across 57 metro areas and included 407,509 home sales. …

… In the United States, housing prices in the zones of highly ranked public schools are remarkably higher than those served by lower ranked schools. …

… When accounting for size, on average, people pay $50 more per square foot for homes in top-ranked school zones compared with homes served by average-ranked schools. This means that the price differences for similar homes located near each other but served by different schools can range from tens to hundreds of thousands of dollars.

[667] “An Interview with Education Secretary Arne Duncan.” Science, April 10, 2009. <bit.ly>

Q: As the second education secretary with school-aged kids, where does your daughter go to school, and how important was the school district in your decision about where to live?

A: She goes to Arlington [Virginia] public schools. That was why we chose where we live, it was the determining factor. That was the most important thing to me. My family has given up so much so that I could have the opportunity to serve; I didn’t want to try to save the country’s children and our educational system and jeopardize my own children’s education.

[668] Calculated with data from:

a) Dataset: “Median Family Income (in 2009 Inflation-Adjusted Dollars), United States, County by State and Puerto Rico.” 2009 American Community Survey, U.S. Census Bureau. <census.gov>

“Geographical Area [=] Arlington County … Median [=] 136,542”

b) Webpage: “CPI Inflation Calculator.” United States Department of Labor, Bureau of Labor Statistics. Accessed July 7, 2023. <www.bls.gov>

“$136,542 in January 2009 has the same buying power as $194,547.27 in January 2023 … The CPI inflation calculator uses the Consumer Price Index for All Urban Consumers (CPI-U) U.S. city average series for all items, not seasonally adjusted. This data represents changes in the prices of all goods and services purchased for consumption by urban households.”

NOTE: Like all Census Bureau measures of “money” income, this dataset doesn’t include noncash benefits like subsidized housing, food stamps, charitable services, and government or employer-provided health benefits. Also, the data are collected via government surveys, and low-income households substantially underreport their income on such surveys.

[669] Report: “Income and Poverty in the United States: 2009.” By Carmen DeNavas-Walt, Bernadette D. Proctor, and Jessica C. Smith. U.S. Census Bureau, September 2010. <www.census.gov>

Pages 2–3: “The income and poverty estimates shown in this report are based solely on money income before taxes and do not include the value of noncash benefits, such as nutritional assistance, Medicare, Medicaid, public housing, and employer-provided fringe benefits.”

[670] Article: “How VIPs Lobbied Schools.” By Azam Ahmed. Chicago Tribune, March 23, 2010. <www.chicagotribune.com>

Whispers have long swirled that some children get spots in the city’s premier schools based on whom their parents know. But a list maintained over several years in Duncan’s office and obtained by the Tribune lends further evidence to those charges. …

The log is a compilation of politicians and influential business people who interceded on behalf of children during Duncan’s tenure. …

After getting a request, he or another staffer would look up the child’s academic record. If the student met their standard, they would call the principal of the desired school.

Pickens said the calls from his office were not directives to the principals; no one was ever told they had to accept a student. Often, students did not get any of their top choices but were placed in larger, less competitive, but still desirable schools such as Lane Technical High School. …

The initials “AD” are listed 10 times as the sole person requesting help for a student, and as a co-requester about 40 times. Pickens said “AD” stood for Arne Duncan, though Duncan’s involvement is unclear.

[671] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Syllabus:

Ohio’s Pilot Project Scholarship Program gives educational choices to families in any Ohio school district that is under state control pursuant to a federal-court order. The program provides tuition aid for certain students in the Cleveland City School District, the only covered district, to attend participating public or private schools of their parent’s choosing and tutorial aid for students who choose to remain enrolled in public school. Both religious and nonreligious schools in the district may participate, as may public schools in adjacent school districts. Tuition aid is distributed to parents according to financial need, and where the aid is spent depends solely upon where parents choose to enroll their children.

[672] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Majority:

The Establishment Clause of the First Amendment, applied to the States through the Fourteenth Amendment, prevents a State from enacting laws that have the “purpose” or “effect” of advancing or inhibiting religion. Agostini v. Felton…. (“[W]e continue to ask whether the government acted with the purpose of advancing or inhibiting religion [and] whether the aid has the ‘effect’ of advancing or inhibiting religion” (citations omitted)). There is no dispute that the program challenged here was enacted for the valid secular purpose of providing educational assistance to poor children in a demonstrably failing public school system. Thus, the question presented is whether the Ohio program nonetheless has the forbidden “effect” of advancing or inhibiting religion.

[673] First Amendment to the Constitution of the United States. Ratified December 15, 1791. <justfacts.com>

“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

[674] Fourteenth Amendment to the Constitution of the United States. Ratified July 9, 1868. <justfacts.com>

Section 1. All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.

[675] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Majority:

The question presented is whether this program offends the Establishment Clause of the United States Constitution. We hold that it does not. …

In sum, the Ohio program is entirely neutral with respect to religion. It provides benefits directly to a wide spectrum of individuals, defined only by financial need and residence in a particular school district. It permits such individuals to exercise genuine choice among options public and private, secular and religious. The program is therefore a program of true private choice. In keeping with an unbroken line of decisions rejecting challenges to similar programs, we hold that the program does not offend the Establishment Clause.

[676] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Dissent (Souter):

Today, however, the majority holds that the Establishment Clause is not offended by Ohio’s Pilot Project Scholarship Program, under which students may be eligible to receive as much as $2,250 in the form of tuition vouchers transferable to religious schools. In the city of Cleveland the overwhelming proportion of large appropriations for voucher money must be spent on religious schools if it is to be spent at all, and will be spent in amounts that cover almost all of tuition. The money will thus pay for eligible students’ instruction not only in secular subjects but in religion as well, in schools that can fairly be characterized as founded to teach religious doctrine and to imbue teaching in all subjects with a religious dimension.2 Public tax money will pay at a systemic level for teaching the covenant with Israel and Mosaic law in Jewish schools, the primacy of the Apostle Peter and the Papacy in Catholic schools, the truth of reformed Christianity in Protestant schools, and the revelation to the Prophet in Muslim schools, to speak only of major religious groupings in the Republic.

[677] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Concurrence (O’Connor):

Although $8.2 million is no small sum, it pales in comparison to the amount of funds that federal, state, and local governments already provide religious institutions. Religious organizations may qualify for exemptions from the federal corporate income tax … the corporate income tax in many States … and property taxes in all 50 States … and clergy qualify for a federal tax break on income used for housing expenses…. In addition, the Federal Government provides individuals, corporations, trusts, and estates a tax deduction for charitable contributions to qualified religious groups. … Finally, the Federal Government and certain state governments provide tax credits for educational expenses, many of which are spent on education at religious schools. …

Most of these tax policies are well established … yet confer a significant relative benefit on religious institutions. The state property tax exemptions for religious institutions alone amount to very large sums annually….

These tax exemptions, which have “much the same effect as [cash grants] … of the amount of tax [avoided]” … are just part of the picture. Federal dollars also reach religiously affiliated organizations through public health programs such as Medicare … and Medicaid … through educational programs such as the Pell Grant program … and through child care programs such as the Child Care and Development Block Grant Program….

A significant portion of the funds appropriated for these programs reach religiously affiliated institutions, typically without restrictions on its subsequent use. … Federal aid to religious schools is also substantial. Although data for all States is not available, data from Minnesota, for example, suggest that a substantial share of Pell Grant and other federal funds for college tuition reach religious schools. …

Against this background, the support that the Cleveland voucher program provides religious institutions is neither substantial nor atypical of existing government programs.

[678] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Dissent (Stevens):

For the reasons stated by Justice Souter and Justice Breyer, I am convinced that the Court’s decision is profoundly misguided. Admittedly, in reaching that conclusion I have been influenced by my understanding of the impact of religious strife on the decisions of our forbears [sic] to migrate to this continent, and on the decisions of neighbors in the Balkans, Northern Ireland, and the Middle East to mistrust one another. Whenever we remove a brick from the wall that was designed to separate religion and government, we increase the risk of religious strife and weaken the foundation of our democracy.

[679] Ruling: Zelman v. Simmons-Harris. U.S. Supreme Court, June 27, 2002. Decided 5–4. Majority: Rehnquist, O’Connor, Scalia, Kennedy, Thomas. Dissenting: Stevens, Souter, Ginsburg, and Breyer. <caselaw.findlaw.com>

Concurrence (Thomas):

The wisdom of allowing States greater latitude in dealing with matters of religion and education can be easily appreciated in this context. Respondents advocate using the Fourteenth Amendment to handcuff the State’s ability to experiment with education. But without education one can hardly exercise the civic, political, and personal freedoms conferred by the Fourteenth Amendment. Faced with a severe educational crisis, the State of Ohio enacted wide-ranging educational reform that allows voluntary participation of private and religious schools in educating poor urban children otherwise condemned to failing public schools. The program does not force any individual to submit to religious indoctrination or education. It simply gives parents a greater choice as to where and in what manner to educate their children.5 This is a choice that those with greater means have routinely exercised.

Cleveland parents now have a variety of educational choices. There are traditional public schools, magnet schools, and privately run community schools, in addition to the scholarship program. Currently, 46 of the 56 private schools participating in the scholarship program are church affiliated (35 are Catholic), and 96 percent of students in the program attend religious schools. … Thus, were the Court to disallow the inclusion of religious schools, Cleveland children could use their scholarships at only 10 private schools.

[680] For a listing of school choice court cases and outcomes, visit the webpage: “Educational Choice.” Institute for Justice. Accessed October 3, 2018 at <ij.org>

[681] Ruling: Bush v. Holmes. Supreme Court of Florida, January 5, 2006. Decided 5–2. <efactssc-public.flcourts.org>

Majority:

The issue we decide is whether the State of Florida is prohibited by the Florida Constitution from expending public funds to allow students to obtain a private school education in kindergarten through grade twelve, as an alternative to a public school education. The law in question, now codified at section 1002.38, Florida Statutes (2005), authorizes a system of school vouchers and is known as the Opportunity Scholarship Program (OSP).

Under the OSP, a student from a public school that fails to meet certain minimum state standards has two options. The first is to move to another public school with a satisfactory record under the state standards. The second option is to receive funds from the public treasury, which would otherwise have gone to the student’s school district, to pay the student’s tuition at a private school. The narrow question we address is whether the second option violates a part of the Florida Constitution requiring the state to both provide for “the education of all children residing within its borders” and provide “by law for a uniform, efficient, safe, secure, and high quality system of free public schools that allows students to obtain a high quality education.” Art. IX, § 1(a), Fla. Const. …

Using the same term, “adequate provision,” article IX, section 1(a) further states: “Adequate provision shall be made by law for a uniform, efficient, safe, secure, and high quality system of free public schools.” For reasons expressed more fully below, we find that the OSP violates this language. It diverts public dollars into separate private systems parallel to and in competition with the free public schools that are the sole means set out in the Constitution for the state to provide for the education of Florida’s children. This diversion not only reduces money available to the free schools, but also funds private schools that are not “uniform” when compared with each other or the public system. Many standards imposed by law on the public schools are inapplicable to the private schools receiving public monies. In sum, through the OSP the state is fostering plural, nonuniform systems of education in direct violation of the constitutional mandate for a uniform system of free public schools.

[682] Ruling: Meredith v. Pence. Indiana Supreme Court, March 26, 2013. Decided 5–0. <www.in.gov>

Majority:

Asserting violation of three provisions of the Indiana Constitution, the plaintiffs challenge Indiana’s statutory program for providing vouchers to eligible parents for their use in sending their children to private schools. Finding that the challengers have not satisfied the high burden required to invalidate a statute on constitutional grounds, we affirm the trial court’s judgment upholding the constitutionality of the statutory voucher program. …

The plaintiffs contend that the school voucher program violates Article 8, Section 1,1 and Article 1, Sections 42 and 6,3 of the Indiana Constitution “both because it uses taxpayer funds to pay for the teaching of religion to Indiana schoolchildren and because it purports to provide those children’s publicly funded education by paying tuition for them to attend private schools rather than the ‘general and uniform system of Common Schools’ the Constitution mandates.” …

The school voucher program (denominated by the legislature as the “Choice Scholarship Program”) was enacted by the General Assembly in 2011 … and permits eligible students to obtain scholarships (also called “vouchers”) that may be used toward tuition at participating nonpublic schools in Indiana. … To be eligible for the voucher program, a student must live in a “household with an annual income of not more than one hundred fifty percent (150%) of the amount required for the individual to qualify for the federal free or reduced price lunch program.” …

… To be eligible to receive program students, a nonpublic school must meet several criteria, including accreditation from the Indiana State Board of Education (“Board of Education”) or other recognized accreditation agency, administration of the Indiana statewide testing for educational progress (ISTEP), and participation in the Board of Education’s school improvement program under Indiana Code Section…. Participation in the program does not subject participating schools to “regulation of curriculum content, religious instruction or activities, classroom teaching, teacher and staff hiring requirements, and other activities carried out by the eligible school” … except that the school must meet certain minimum instructional requirements which correspond to the mandatory curriculum in Indiana public schools and nonpublic schools accredited by the Board of Education. … The requirements include instruction in Indiana and United States history and government, social studies, language arts, mathematics, sciences, fine arts, and health. …

The program statute is silent with respect to religion, imposing no religious requirement or restriction upon student or school eligibility … and as of October 2011, most of the schools that had sought and received approval from the Department to participate in the voucher program were religiously affiliated….

… First, the voucher program expenditures do not directly benefit religious schools but rather directly benefit lower-income families with schoolchildren by providing an opportunity for such children to attend non-public schools if desired. Second, the prohibition against government expenditures to benefit religious or theological institutions does not apply to institutions and programs providing primary and secondary education. Summary judgment for the defendants was thus proper as to the plaintiffs’ Section 6 claims.

Conclusion:

We hold that the Indiana school voucher program, the Choice Scholarship Program, is within the legislature’s power under Article 8, Section 1, and that the enacted program does not violate either Section 4 or Section 6 of Article 1 of the Indiana Constitution. We affirm the grant of summary judgment to the defendants.

[683] Ruling: Espinoza v. Montana Department of Revenue. U.S. Supreme Court, June 30, 2020. Decided 5–4. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh. Concurrence: Thomas, Alito, and Gorsuch. Dissenting: Ginsburg, Breyer, Kagan, and Sotomayor. <www.supremecourt.gov>

Syllabus:

The Montana Legislature established a program that grants tax credits to those who donate to organizations that award scholarships for private school tuition. To reconcile the program with a provision of the Montana Constitution that bars government aid to any school “controlled in whole or in part by any church, sect, or denomination,” Art. X, §6(1), the Montana Department of Revenue promulgated “Rule 1,” which prohibited families from using the scholarships at religious schools. Three mothers who were blocked by Rule 1 from using scholarship funds for their children’s tuition at Stillwater Christian School sued the Department in state court, alleging that the Rule discriminated on the basis of their religious views and the religious nature of the school they had chosen. The trial court enjoined Rule 1. Reversing, the Montana Supreme Court held that the program, unmodified by Rule 1, aided religious schools in violation of the Montana Constitution’s no-aid provision. The Court further held that the violation required invalidating the entire program.

Held: The application of the no-aid provision discriminated against religious schools and the families whose children attend or hope to attend them in violation of the Free Exercise Clause of the Federal Constitution.

[684] First Amendment to the Constitution of the United States. Ratified December 15, 1791. <justfacts.com>

“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

[685] Ruling: Espinoza v. Montana Department of Revenue. U.S. Supreme Court, June 30, 2020. Decided 5–4. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh. Concurrence: Thomas, Alito, and Gorsuch. Dissenting: Ginsburg, Breyer, Kagan, and Sotomayor. <www.supremecourt.gov>

Syllabus:

Held: The application of the no-aid provision discriminated against religious schools and the families whose children attend or hope to attend them in violation of the Free Exercise Clause of the Federal Constitution. …

(a) The Free Exercise Clause “protects religious observers against unequal treatment” and against “laws that impose special disabilities on the basis of religious status.” Trinity Lutheran Church of Columbia, Inc. v. Comer…. In Trinity Lutheran, this Court held that disqualifying otherwise eligible recipients from a public benefit “solely because of their religious character” imposes “a penalty on the free exercise of religion that triggers the most exacting scrutiny.” … Here, the application of Montana’s no-aid provision excludes religious schools from public benefits solely because of religious status. As a result, strict scrutiny applies. …

(b) Contrary to the Department’s contention, this case is not governed by Locke v. Davey…. The plaintiff in Locke was denied a scholarship “because of what he proposed to do—use the funds to prepare for the ministry,” an essentially religious endeavor. … By contrast, Montana’s no-aid provision does not zero in on any essentially religious course of instruction but rather bars aid to a religious school “simply because of what it is”—a religious school. … Locke also invoked a “historic and substantial” state interest in not funding the training of clergy … but no comparable tradition supports Montana’s decision to disqualify religious schools from government aid.

[686] Ruling: Espinoza v. Montana Department of Revenue. U.S. Supreme Court, June 30, 2020. Decided 5–4. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh. Concurrence: Thomas, Alito, and Gorsuch. Dissenting: Ginsburg, Breyer, Kagan, and Sotomayor. <www.supremecourt.gov>

Majority:

The question for this Court is whether the Free Exercise Clause precluded the Montana Supreme Court from applying Montana’s no-aid provision to bar religious schools from the scholarship program. For purposes of answering that question, we accept the Montana Supreme Court’s interpretation of state law—including its determination that the scholarship program provided impermissible “aid” within the meaning of the Montana Constitution—and we assess whether excluding religious schools and affected families from that program was consistent with the Federal Constitution.2

The Free Exercise Clause, which applies to the States under the Fourteenth Amendment, “protects religious observers against unequal treatment” and against “laws that impose special disabilities on the basis of religious status.” Trinity Lutheran … see Cantwell v. Connecticut…. Those “basic principle[s]” have long guided this Court. … See, for example, Everson v. Board of Ed. of Ewing, … (a State “cannot exclude individual Catholics, Lutherans, Mohammedans, Baptists, Jews, Methodists, Non-believers, Presbyterians, or the members of any other faith, because of their faith, or lack of it, from receiving the benefits of public welfare legislation”); Lyng v. Northwest Indian Cemetery Protective Assn. … (the Free Exercise Clause protects against laws that “penalize religious activity by denying any person an equal share of the rights, benefits, and privileges enjoyed by other citizens”).

Most recently, Trinity Lutheran distilled these and other decisions to the same effect into the “unremarkable” conclusion that disqualifying otherwise eligible recipients from a public benefit “solely because of their religious character” imposes “a penalty on the free exercise of religion that triggers the most exacting scrutiny.” … In Trinity Lutheran, Missouri provided grants to help nonprofit organizations pay for playground resurfacing, but a state policy disqualified any organization “owned or controlled by a church, sect, or other religious entity.” … Because of that policy, an otherwise eligible church-owned preschool was denied a grant to resurface its playground. Missouri’s policy discriminated against the Church “simply because of what it is—a church,” and so the policy was subject to the “strictest scrutiny,” which it failed. … We acknowledged that the State had not “criminalized” the way in which the Church worshipped or “told the Church that it cannot subscribe to a certain view of the Gospel.” … But the State’s discriminatory policy was “odious to our Constitution all the same.” …

Here too Montana’s no-aid provision bars religious schools from public benefits solely because of the religious character of the schools. The provision also bars parents who wish to send their children to a religious school from those same benefits, again solely because of the religious character of the school. This is apparent from the plain text. The provision bars aid to any school “controlled in whole or in part by any church, sect, or denomination.” … The provision’s title—“Aid prohibited to sectarian schools”—confirms that the provision singles out schools based on their religious character. … And the Montana Supreme Court explained that the provision forbids aid to any school that is “sectarian,” “religiously affiliated,” or “controlled in whole or in part by churches.” … The provision plainly excludes schools from government aid solely because of religious status.

[687] Ruling: Espinoza v. Montana Department of Revenue. U.S. Supreme Court, June 30, 2020. Decided 5–4. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh. Concurrence: Thomas, Alito, and Gorsuch. Dissenting: Ginsburg, Breyer, Kagan, and Sotomayor. <www.supremecourt.gov>

Dissent (Sotomayor):

Even on its own terms, the Court’s answer to its hypothetical question is incorrect. The Court relies principally on Trinity Lutheran, which found that disqualifying an entity from a public benefit “solely because of [the entity’s] religious character” could impose “a penalty on the free exercise of religion.” … Trinity Lutheran held that ineligibility for a government benefit impermissibly burdened a church’s religious exercise by “put[ting it] to the choice between being a church and receiving a government benefit.” … Invoking that precedent, the Court concludes that Montana must subsidize religious education if it also subsidizes nonreligious education.3

The Court’s analysis of Montana’s defunct tax program reprises the error in Trinity Lutheran. Contra the Court’s current approach, our free exercise precedents had long granted the government “some room to recognize the unique status of religious entities and to single them out on that basis for exclusion from otherwise generally applicable laws.”

[688] Ruling: Espinoza v. Montana Department of Revenue. U.S. Supreme Court, June 30, 2020. Decided 5–4. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh. Concurrence: Thomas, Alito, and Gorsuch. Dissenting: Ginsburg, Breyer, Kagan, and Sotomayor. <www.supremecourt.gov>

Concurrence (Thomas):

There is mixed historical evidence concerning whether the Establishment Clause was understood as an individual right at the time of the Fourteenth Amendment’s ratification. … Even assuming that the Clause creates a right and that such a right could be incorporated, however, it would only protect against an “establishment” of religion as understood at the founding, i.e., “ ‘coercion of religious orthodoxy and of financial support by force of law and threat of penalty.’ ” … (quoting Lee v. Weisman … (Scalia, J., dissenting); emphasis deleted); American Legion v. American Humanist Assn. … (Thomas, J., concurring in judgment) … see also McConnell, Establishment and Disestablishment at the Founding, Part I: Establishment of Religion … McConnell, Coercion: The Lost Element of Establishment….1

Thus, the modern view, which presumes that States must remain both completely separate from and virtually silent on matters of religion to comply with the Establishment Clause, is fundamentally incorrect. Properly understood, the Establishment Clause does not prohibit States from favoring religion. They can legislate as they wish, subject only to the limitations in the State and Federal Constitutions. See Muñoz, The Original Meaning of the Establishment Clause and the Impossibility of Its Incorporation….

[689] Ruling: Espinoza v. Montana Department of Revenue. U.S. Supreme Court, June 30, 2020. Decided 5–4. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh. Concurrence: Thomas, Alito, and Gorsuch. Dissenting: Ginsburg, Breyer, Kagan, and Sotomayor. <www.supremecourt.gov>

Dissent (Breyer):

It is true that Montana’s no-aid provision broadly bars state aid to schools based on their religious affiliation. But this case does not involve a claim of status-based discrimination. The schools do not apply or compete for scholarships, they are not parties to this litigation, and no one here purports to represent their interests. We are instead faced with a suit by parents who assert that their free exercise rights are violated by the application of the no-aid provision to prevent them from using taxpayer-supported scholarships to attend the schools of their choosing. In other words, the problem, as in Locke, is what petitioners “ ‘propos[e] to do—use the funds to’ ” obtain a religious education. … (quoting Trinity Lutheran….).

Even if the schools’ status were relevant, I do not see what bearing the majority’s distinction could have here. There is no dispute that religious schools seek generally to inspire religious faith and values in their students. How else could petitioners claim that barring them from using state aid to attend these schools violates their free exercise rights? Thus, the question in this case—unlike in Trinity Lutheran—boils down to what the schools would do with state support. And the upshot is that here, as in Locke, we confront a State’s decision not to fund the inculcation of religious truths.

[690] Ruling: Espinoza v. Montana Department of Revenue. U.S. Supreme Court, June 30, 2020. Decided 5–4. Majority: Roberts, Thomas, Alito, Gorsuch, Kavanaugh. Concurrence: Thomas, Alito, and Gorsuch. Dissenting: Ginsburg, Breyer, Kagan, and Sotomayor. <www.supremecourt.gov>

Concurrence (Alito): “I join the opinion of the Court in full. The basis of the decision below was a Montana constitutional provision that, according to the Montana Supreme Court, forbids parents from participating in a publicly funded scholarship program simply because they send their children to religious schools. Regardless of the motivation for this provision or its predecessor, its application here violates the Free Exercise Clause.”

[691] Webpage: “School Choice in America.” Friedman Foundation for Educational Choice. Last modified April 17, 2023. <www.edchoice.org>

More states than ever offer families K–12 programs that help them access options like private school or a customized education that fits their needs. …

[692] Webpage: “About the Standards.” Common Core State Standards Initiative. Accessed April 18, 2019 at <www.thecorestandards.org>

This site is the official home of the Common Core State Standards. It is hosted and maintained by the Council of Chief State School Officers (CCSSO) and the National Governors Association Center for Best Practices (NGA Center). It provides parents, educators, policymakers, journalists, and others easy access to the actual standards, as well as supporting information and resources.

About the Common Core State Standards

The Common Core is a set of high-quality academic standards in mathematics and English language arts/literacy (ELA). These learning goals outline what a student should know and be able to do at the end of each grade. The standards were created to ensure that all students graduate from high school with the skills and knowledge necessary to succeed in college, career, and life, regardless of where they live. Forty-one states, the District of Columbia, four territories, and the Department of Defense Education Activity (DoDEA) have voluntarily adopted and are moving forward with the Common Core.

[693] Webpage: “About the Standards.” Common Core State Standards Initiative. Accessed October 10, 2015 at <www.thecorestandards.org>

Recognizing the value and need for consistent learning goals across states, in 2009 the state school chiefs and governors that comprise CCSSO [Council of Chief State School Officers] and the NGA [National Governors Association] Center coordinated a state-led effort to develop the Common Core State Standards. Designed through collaboration among teachers, school chiefs, administrators, and other experts, the standards provide a clear and consistent framework for educators.

[694] Report: “Frequently Asked Questions.” Common Core State Standards Initiative, June 5, 2014. <www.thecorestandards.org>

Page 1:

Who Led the Development of the Common Core State Standards?

The nation’s governors and education commissioners, through their representative organizations, the National Governors Association Center for Best Practices (NGA) and the Council of Chief State School Officers (CCSSO), led the development of the Common Core State Standards and continue to lead the initiative. Teachers, parents, school administrators, and experts from across the country, together with state leaders, provided input into the development of the standards. …

Who Will Manage the Common Core State Standards in the Future?

The Common Core State Standards are and will remain a state-led effort, and adoption of the standards and any potential revisions will continue to be a voluntary state decision. The National Governors Association Center for Best Practices and the Council of Chief State School Officers will continue to serve as the two leading organizations with ownership of the Common Core and will make decisions about the timing and substance of future revisions to the standards in consultation with the states.

[695] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative, June 21, 2021. Modified 1/24/13. <bit.ly>

Page 3: “The Standards are intended to be a living work: as new and better evidence emerges, the Standards will be revised accordingly.”

[696] Webpage: “About Us.” Council of Chief State School Officers. Accessed October 2, 2018 at <ccsso.org>

The Council of Chief State School Officers (CCSSO) is a nonpartisan, nationwide, nonprofit organization of public officials who head departments of elementary and secondary education in the states, the District of Columbia, the Department of Defense Education Activity, the Bureau of Indian Education and the five U.S. extra-state jurisdictions.

As an organization, we are committed to ensuring that all students participating in our public education system—regardless of background—graduate prepared for college, careers, and life. To realize this, we bring together dedicated leaders and exceptional ideas to achieve measurable progress for every student.

[697] Webpage: “FAQ.” National Governors Association. Accessed April 18, 2019 at <www.nga.org>

6. How Are NGA and the NGA Center for Best Practices Funded?

NGA has separate funding sources that provide revenues for NGA [National Governors Association] and the NGA Center for Best Practices (NGA Center). State dues fund the association’s advocacy and other activities. The NGA Center, a 501(c)(3) corporation, is an integral part of NGA and is the only policy research and development firm that directly serves the nation’s governors by developing innovative solutions to today’s most pressing public policy challenges. The NGA Center is funded through federal grants and contracts, fee-for-service programs, private and corporate foundation contributions, and the NGA Partners program.

[698] Webpage: “National Governors Association.” National Governors Association. Accessed March 8, 2022 at <www.nga.org>

Founded in 1908, the National Governors Association is the voice of the leaders of 55 states, territories, and commonwealths. Our nation’s Governors are dedicated to leading bipartisan solutions that improve citizens’ lives through state government. Through NGA, Governors identify priority issues and deal with matters of public policy and governance at the state, national and global levels.

NGA is the premier resource for not only Governors but also for their cabinet members, state policy experts, the U.S. Congress, and private enterprise. NGA offers an array of services to help collaboratively tell the states’ story. Thanks to decades of broad expertise, NGA teams are able to work side-by-side with state leaders to identify challenges, help Governors stay ahead of the curve and offer solutions before challenges become problems.

[699] Email from the National Governors Association to Just Facts, October 19, 2015.

NGA [National Governors Association] is an instrumentality of the states. State dues fund the National Governors Association’s lobbying, communications, management services, and a portion of general administration activities. NGA’s dues-supported staff are organized into five offices that provide cross-cutting services across program areas (executive director, federal relations, management consulting and training, communications, and administration and finance).

The NGA Center for Best Practices (NGA Center), a 501(c)(3) corporation, is an integral part of the National Governors Association and is the only research and development firm that directly serves the nation’s governors and their key policy staff. The NGA Center is funded through federal grants and contracts, fee-for-service programs, private and corporate foundation contributions, and NGA’s Corporate Fellows program.

[700] Webpage: “FAQ.” National Governors Association. Accessed April 18, 2019 at <www.nga.org>

6. How are NGA and the NGA Center for Best Practices funded?

NGA has separate funding sources that provide revenues for NGA and the NGA Center for Best Practices (NGA Center). State dues fund the association’s advocacy and other activities. The NGA Center, a 501(c)(3) corporation, is an integral part of NGA and is the only policy research and development firm that directly serves the nation’s governors by developing innovative solutions to today’s most pressing public policy challenges. The NGA Center is funded through federal grants and contracts, fee-for-service programs, private and corporate foundation contributions, and the NGA Partners program.

[701] Article: “How Bill Gates Pulled Off the Swift Common Core Revolution.” By Lyndsey Layton. Washington Post, June 7, 2014. <www.washingtonpost.com>

The pair of education advocates had a big idea, a new approach to transform every public-school classroom in America. By early 2008, many of the nation’s top politicians and education leaders had lined up in support.

But that wasn’t enough. The duo needed money—tens of millions of dollars, at least—and they needed a champion who could overcome the politics that had thwarted every previous attempt to institute national standards.

So they turned to the richest man in the world.

On a summer day in 2008, Gene Wilhoit, director of a national group of state school chiefs, and David Coleman, an emerging evangelist for the standards movement, spent hours in Bill Gates’s sleek headquarters near Seattle, trying to persuade him and his wife, Melinda, to turn their idea into reality. …

After the meeting, weeks passed with no word. Then Wilhoit got a call: Gates was in. …

The Bill and Melinda Gates Foundation didn’t just bankroll the development of what became known as the Common Core State Standards. With more than $200 million, the foundation also built political support across the country, persuading state governments to make systemic and costly changes.

Bill Gates was de facto organizer, providing the money and structure for states to work together on common standards….

The Gates Foundation spread money across the political spectrum, to entities including the big teachers unions, the American Federation of Teachers and the National Education Association, and business organizations such as the U.S. Chamber of Commerce….

… Gates money went to state and local groups, as well, to help influence policymakers and civic leaders. And the idea found a major booster in President Obama, whose new administration was populated by former Gates Foundation staffers and associates.

[702] Article: “Forbes’ 32nd Annual World’s Billionaires Issue.” Forbes, March 6, 2018. <www.forbes.com>

“Bill Gates, who has been the richest person in the world for 18 of the past 24 years, drops to No. 2 on the Forbes billionaires list. Gates has a fortune of $90 billion, up from $86 billion last year.”

[703] Press release: “Common Core State Standards Development Work Group and Feedback Group Announced.” Common Core State Standards Initiative, July 1, 2009. <www.nga.org>

The National Governors Association Center for Best Practices (NGA Center) and the Council of Chief State School Officers (CCSSO) today announced the names of the experts serving on the Common Core State Standards Development Work Group and Feedback Group and provided more detailed information on the college and career ready standards development process. …

The Work Group’s deliberations will be confidential throughout the process. States and national education organizations will have an opportunity to review and provide evidence-based feedback on the draft documents throughout the process.

The members of the mathematics Work Group are: [15 people listed]

Members of the English-language Arts Work Group are: [14 people listed]

Also, as a step in the standards development process, the NGA Center and CCSSO are overseeing the work of a Feedback Group. The role of this Feedback Group is to provide information backed by research to inform the standards development process by offering expert input on draft documents. Final decisions regarding the common core standards document will be made by the Standards Development Work Group. The Feedback Group will play an advisory role, not a decision-making role in the process.

[704] Press release: “Common Core State Standards Initiative Validation Committee Announced.” Common Core State Standards Initiative, September 24, 2009. <bit.ly>

The National Governors Association Center for Best Practices (NGA Center) and the Council of Chief State School Officers (CCSSO) today released the names of the members of the Validation Committee for the Common Core State Standards Initiative. This committee will immediately be tasked with reviewing and verifying the standards development process and the resulting evidence-based college- and career-readiness standards. …

Members of the validation committee were nominated by states and national organizations, with a group of six governors and six chief state school officers in the participating states selecting the final committee membership. The six governors were Colorado Gov. Bill Ritter; Connecticut Gov. M. Jodi Rell; Delaware Gov. Jack Markell; Georgia Gov. Sonny Perdue; Vermont Gov. Jim Douglas; and West Virginia Gov. Joe Manchin. The chief state school officers were: Maine Chief and CCSSO Board President Susan Gendron; Michigan Chief Michael Flanagan; Pennsylvania Chief Gerald Zahorchak; South Carolina Chief Jim Rex; and West Virginia Chief Steve Paine. …

The members of the Validation Committee are:

1. Bryan Albrecht, President, Gateway Technical College, Kenosha, Wisconsin

2. Arthur Applebee, Distinguished Professor, Center on English Learning & Achievement, School of Education, University at Albany, SUNY

3. Sarah Baird, 2009 Arizona Teacher of the Year, K–5 Math Coach, Kyrene School District

4. Jere Confrey, Joseph D. Moore Distinguished University Professor, William and Ida Friday Institute for Educational Innovation, College of Education, North Carolina State University

5. David T. Conley, Professor, College of Education, University of Oregon; CEO, Educational Policy Improvement Center (Co-Chair)

6. Linda Darling-Hammond, Charles E. Ducommun Professor of Education, Stanford University

7. Alfinio Flores, Hollowell Professor of Mathematics Education, University of Delaware

8. Brian Gong, Executive Director, Center for Assessment (Co-Chair)

9. Kenji Hakuta, Lee L. Jacks Professor of Education, Stanford University

10. Kristin Buckstad Hamilton, Teacher, Battlefield Senior High School, NEA [National Education Association]

11. Feng-Jui Hsieh, Associate Professor of the Mathematics Department, National Taiwan Normal University

12. Mary Ann Jordan, Teacher, New York City Dept of Education, AFT [American Federation of Teachers]

13. Jeremy Kilpatrick, Regents Professor of Mathematics Education, University of Georgia

14. Dr. Jill Martin, Principal, Pine Creek High School

15. Barry McGaw, Professor and Director of Melbourne Education Research Institute, University of Melbourne; Director for Education, OECD [Organization for Economic Cooperation and Development]

16. James Milgram, Professor Emeritus, Stanford University

17. David Pearson, Professor and Dean, Graduate School of Education, University of California, Berkeley

18. Steve Pophal, Principal, DC Everest Junior High

19. Stanley Rabinowitz, Senior Program Director, Assessment and Standards Development Services, WestEd

20. Lauren Resnick, Distinguished University Professor, Psychology and Cognitive Science, Learning Sciences and Education Policy, University of Pittsburgh

21. Andreas Schleicher, Head, Indicators and Analysis Division of the OECD Directorate for Education

22. William Schmidt, University Distinguished Professor, Michigan State University

23. Catherine Snow, Henry Lee Shattuck Professor of Education, Harvard Graduate School of Education

24. Christopher Steinhauser, Superintendent of Schools, Long Beach Unified School District

25. Sandra Stotsky, Professor of Education Reform, 21st Century Chair in Teacher Quality, University of Arkansas

26. Dorothy Strickland, Samuel DeWitt Proctor Professor of Ed., Emerita, Distinguished Research Fellow, National Institute for Early Education Research, Rutgers, The State University of NJ

27. Martha Thurlow, Director, National Center on Educational Outcomes, University of Minnesota

28. Norman Webb, Senior Research Scientist, Emeritus, Wisconsin Center for Education Research, University of Wisconsin

29. Dylan William, Deputy Director, Institute of Education, University of London

NOTE: The numbering above was added by Just Facts.

[705] Commentary: “Can This Country Survive Common Core’s College Readiness Level?” By R. James Milgram and Sandra Stotsky†, September 2013. <bit.ly>

Page 3:

As a condition of membership, all VC [validation committee] members had to agree to 10 conditions, among which were the following: …

I agree to maintain the deliberations, discussions, and work of the Validation Committee, including the content of any draft or final documents, on a strictly confidential basis and shall not disclose or communicate any information related to the same, including in summary form, except within the membership of the Validation Committee and to CCSSO [Council of Chief State School Officers] and the NGA [National Governors Association] Center.

As can be seen in the second condition listed above, members of the VC could never, then or in the future, discuss whether or not the VC discussed the meaning of college readiness or had any recommendations to offer on the matter.

† NOTE: See the next footnote.

[706] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 6:

R. James Milgram

Emeritus Professor at Stanford University’s Department of Mathematics

Milgram, one of the authors of the California Mathematics Standards and the California Mathematics Framework, has worked with a number of states, and with the Achieve Mathematics Advisory Panel, on standards in education. As a member of the National Board for Education Sciences, he has worked with the U.S. Department of Education on the math that pre-service K–8 teachers need to know and understand. …

Sandra Stotsky

Endowed Chair in Teacher Quality at the University of Arkansas’s Department of Education Reform and Chair of the Sadlier Mathematics Advisory Board

Stotsky has abundant experience in developing and reviewing ELA [English Language Arts] standards. As senior associate commissioner of the Massachusetts Department of Education, she helped revise pre-K–12 standards. She also served on the 2009 steering committee for NAEP [National Assessment of Educational Progress] reading and on the 2006 National Math Advisory Panel.

[707] Press release: “Common Core State Standards Initiative Validation Committee Announced.” Common Core State Standards Initiative, September 24, 2009. <bit.ly>

The National Governors Association Center for Best Practices (NGA Center) and the Council of Chief State School Officers (CCSSO) today released the names of the members of the Validation Committee for the Common Core State Standards Initiative. This committee will immediately be tasked with reviewing and verifying the standards development process and the resulting evidence-based college- and career-readiness standards. …

Members of the validation committee were nominated by states and national organizations, with a group of six governors and six chief state school officers in the participating states selecting the final committee membership. The six governors were Colorado Gov. Bill Ritter; Connecticut Gov. M. Jodi Rell; Delaware Gov. Jack Markell; Georgia Gov. Sonny Perdue; Vermont Gov. Jim Douglas; and West Virginia Gov. Joe Manchin. The chief state school officers were: Maine Chief and CCSSO Board President Susan Gendron; Michigan Chief Michael Flanagan; Pennsylvania Chief Gerald Zahorchak; South Carolina Chief Jim Rex; and West Virginia Chief Steve Paine. …

The members of the Validation Committee are:

1. Bryan Albrecht, President, Gateway Technical College, Kenosha, Wisconsin

2. Arthur Applebee, Distinguished Professor, Center on English Learning & Achievement, School of Education, University at Albany, SUNY

3. Sarah Baird, 2009 Arizona Teacher of the Year, K–5 Math Coach, Kyrene School District

4. Jere Confrey, Joseph D. Moore Distinguished University Professor, William and Ida Friday Institute for Educational Innovation, College of Education, North Carolina State University

5. David T. Conley, Professor, College of Education, University of Oregon; CEO, Educational Policy Improvement Center (Co-Chair)

6. Linda Darling-Hammond, Charles E. Ducommun Professor of Education, Stanford University

7. Alfinio Flores, Hollowell Professor of Mathematics Education, University of Delaware

8. Brian Gong, Executive Director, Center for Assessment (Co-Chair)

9. Kenji Hakuta, Lee L. Jacks Professor of Education, Stanford University

10. Kristin Buckstad Hamilton, Teacher, Battlefield Senior High School, NEA [National Education Association]

11. Feng-Jui Hsieh, Associate Professor of the Mathematics Department, National Taiwan Normal University

12. Mary Ann Jordan, Teacher, New York City Dept of Education, AFT [American Federation of Teachers]

13. Jeremy Kilpatrick, Regents Professor of Mathematics Education, University of Georgia

14. Dr. Jill Martin, Principal, Pine Creek High School

15. Barry McGaw, Professor and Director of Melbourne Education Research Institute, University of Melbourne; Director for Education, OECD [Organization for Economic Cooperation and Development]

16. James Milgram, Professor Emeritus, Stanford University

17. David Pearson, Professor and Dean, Graduate School of Education, University of California, Berkeley

18. Steve Pophal, Principal, DC Everest Junior High

19. Stanley Rabinowitz, Senior Program Director, Assessment and Standards Development Services, WestEd

20. Lauren Resnick, Distinguished University Professor, Psychology and Cognitive Science, Learning Sciences and Education Policy, University of Pittsburgh

21. Andreas Schleicher, Head, Indicators and Analysis Division of the OECD Directorate for Education

22. William Schmidt, University Distinguished Professor, Michigan State University

23. Catherine Snow, Henry Lee Shattuck Professor of Education, Harvard Graduate School of Education

24. Christopher Steinhauser, Superintendent of Schools, Long Beach Unified School District

25. Sandra Stotsky, Professor of Education Reform, 21st Century Chair in Teacher Quality, University of Arkansas

26. Dorothy Strickland, Samuel DeWitt Proctor Professor of Ed., Emerita, Distinguished Research Fellow, National Institute for Early Education Research, Rutgers, The State University of NJ

27. Martha Thurlow, Director, National Center on Educational Outcomes, University of Minnesota

28. Norman Webb, Senior Research Scientist, Emeritus, Wisconsin Center for Education Research, University of Wisconsin

29. Dylan William, Deputy Director, Institute of Education, University of London

NOTE: The numbering above was added by Just Facts.

[708] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 1: “The NGA [National Governors Association] Center [Center For Best Practices] and CCSSO [Council of Chief State School Officers], as part of the CCSSI [Common Core State Standards Initiative] convened a 25-member Validation Committee (VC) composed of leading figures in the education standards community.”

Page 4:

Certification

Based on the deliberations of, and review by, the Validation Committee, the National Governors Association Center for Best Practices and the Council of Chief State School Officers accept the Validation Committee’s certification that the Common Core State Standards in English language arts and mathematics are consistent with the criteria established in the charge to the Validation Committee.

Signed,

1. Bryan Albrecht

2. Arthur Applebee

3. Sarah Baird

4. Jere Confrey

5. David T. Conley

6. Linda Darling-Hammond

7. Brian Gong

8. Kenji Hakuta

9. Kristin Buckstad Hamilton

10. Feng-Jui Hsieh

11. Mary Ann Jordan

12. Jeremy Kilpatrick

13. Jill Martin

14. David Pearson

15. Steve Pophal

16. Stanley Rabinowitz

17. Lauren Resnick

18. Andreas Schleicher

19. William Schmidt

20. Catherine Snow

21. Christopher Steinhauser

22. Dorothy Strickland

23. Martha Thurlow

24. Norman Webb

NOTES:

  • The numbering above was added by Just Facts.
  • A comparison of this list of signatories to the list of people appointed to the validation committee shows that the following five names are missing: Alfinio Flores, Barry McGaw, James Milgram, Sandra Stotsky, and Dylan William. An Excel file cross-referencing these lists is available upon request.
  • Among the five people who did not certify the standards, McGaw left the committee before the report was issued (see next footnote), and the other four declined to certify the standards. Their names are listed among the authors of this report, and their biographies appear in pages 5–6 of the report as follows:

Alfinio Flores

Holowell Professor of Mathematics Education in the Department of Mathematical Sciences and School of Education at the University of Delaware’s College of Education & Public Policy

Flores is a nationally recognized expert in mathematics education and mathematics teaching and learning, curriculum development, and pre-and in-service preparation of teachers of mathematics.

R. James Milgram

Emeritus Professor at Stanford University’s Department of Mathematics

Milgram, one of the authors of the California Mathematics Standards and the California Mathematics Framework, has worked with a number of states, and with the Achieve Mathematics Advisory Panel, on standards in education. As a member of the National Board for Education Sciences, he has worked with the U.S. Department of Education on the math that pre-service K–8 teachers need to know and understand. …

Sandra Stotsky

Endowed Chair in Teacher Quality at the University of Arkansas’s Department of Education Reform and Chair of the Sadlier Mathematics Advisory Board

Stotsky has abundant experience in developing and reviewing ELA [English Language Arts] standards. As senior associate commissioner of the Massachusetts Department of Education, she helped revise pre–K–12 standards. She also served on the 2009 steering committee for NAEP [National Assessment of Educational Progress] reading and on the 2006 National Math Advisory Panel.

Dylan William

Director of the Learning and Teaching Research Center at the Educational Testing Service

William has taught Master’s and doctorate-level courses on educational assessment, research methods, and the use of information technology in academic research. He served as the academic coordinator for the Consortium for Assessment and Testing in Schools, which developed a variety of statutory and non-statutory assessments for the national curriculum of England and Wales. He is currently exploring how assessments can be used to support learning.

[709] E-mail from Professor Barry McGaw to Just Facts, November 2, 2015.

“I agreed to participate but, when the time came to review documents, was buried under work commitments, so I withdrew. That was early in the process and reflected no judgment on the common core.”

[710] “Testimony for the House Study Committee on the Role of Federal Government in Education.” By Sandra Stotsky (University of Arkansas, Department of Education Reform). Georgia General Assembly, House Study Committee on the Role of the Federal Government in Education. September 24, 2014. <bit.ly>

Page 1:

I begin with remarks on Common Core’s Validation Committee, on which I served from 2009–2010. This committee, which was created to put the seal of approval on Common Core’s standards, was invalid both in its membership and in the procedures it was told to follow. …

… Common Core’s standards did not emerge from a state-led process and were not written by nationally known experts, claims regularly made by its advocates. In fact, the people who wrote the standards were not qualified to draft K–12 standards at all. …

… Not only were no high school mathematics teachers involved, no English professors or high school English teachers were, either. Because everyone worked without open meetings or accessible public comment, their reasons for making the decisions they did are lost to history. To this day we do not know why Common Core’s high school mathematics standards do not provide a pathway to STEM [Science, Technology, Engineering, and Math] careers or why David Coleman was allowed to mandate a 50/50 division between literary study and “informational” text at every grade level from K–12 in the ELA [English Language Arts] standards, with no approval from English teachers across the country or from the parents of students in our public schools.

The absence of relevant professional credentials in the two standards-writing teams helps to explain the flaws in these standards, on which costly tests are based and scheduled to be given in Georgia in 2015–2016. The “lead” writers for the ELA standards, David Coleman and Susan Pimentel, had never taught reading or English in K–12 or at the college level. Neither has a doctorate in English, nor published serious work on curriculum and instruction. They were virtually unknown to English language arts educators and to higher education faculty in rhetoric, speech, composition, or literary study.

None of the three lead standards-writers in mathematics, Jason Zimba, William McCallum, and Phil Daro, the only member of this three-person team with teaching experience, had ever developed K–12 mathematics standards before. Who wanted these people as standards-writers and why, we still do not know. No one in the media showed the slightest interest in their qualifications or the low level of college readiness they aimed for on a grade 11 test.

Page 2:

The federal government could have funded an independent group of experts to evaluate the soundness and rigor of the standards it was incentivizing the states to adopt via the Race to the Top grant competition, but it did not do so. Instead, the private organizations that chose the standards writers and created Common Core’s standards also created their own Validation Committee (VC) in 2009 of 25–29 members to exercise this function. The VC contained almost no academic experts on ELA and mathematics standards; most were education professors or associated with testing companies, from here and abroad. There was only one mathematician on the VC—R. James Milgram—although there were many mathematics educators on it, i.e., people with appointments in an education school and/or who worked chiefly in teacher education. I was the only nationally recognized expert on English language arts standards by virtue of my work in Massachusetts and for Achieve, Inc.’s American Diploma Project.

Why didn’t I sign off on Common Core’s standards? Professor Milgram and I were two of the five members of the VC who did not sign off on the standards. So far as we could determine, the Validation Committee was intended to function as a rubber stamp even though we had been asked to validate the standards. Despite repeated requests, we did not get the names of countries whose standards were supposedly used as benchmarks for Common Core’s. So far as I could figure out, Common Core’s standards were intentionally not made comparable to the most demanding sets of standards elsewhere. It did not offer any research evidence to justify its omission of high school mathematics standards leading to STEM careers, its stress on writing over reading, its division of reading instructional texts into “information” and “literature,” its deferral of the completion of Algebra I to grade 9 or 10, and its experimental approach to teaching Euclidean geometry. Nor did Common Core offer evidence that its standards meet entrance requirements for most colleges and universities in this country or elsewhere—or for a high school diploma in many states.

[711] “Testimony About Issues-with-Core Math Standards.” By R. James Milgram (Professor of Mathematics, Stanford University). Arkansas State Legislature, July 23, 2013. <www.arkleg.state.ar.us>

Pages 1–2 (of PDF):

[T]he new Common Core national standards … are claimed to be research based, but the main reason I could not sign off on them was that there were too many areas where the writing team could not show me suitable research that justified their handling of key topics—particularly when they differed from standard approaches. …

The three most severe problem areas are

1. the beginning handling of whole numbers in particular adding, subtracting, multiplying, and dividing;

2. the handling of geometry in middle school and high school;

3. the very low level expectations for high school graduation that barely prepare students for attending a community college, let alone a 4-year university.

Unfortunately, these are the three most crucial areas where our math outcomes have to improve.

Core Standard’s approach to whole numbers is just the continuation of the approach pioneered in California in the early 1990’s that had such bad outcomes that it spawned the Math Wars. Moreover, the use of student-constructed algorithms is at odds with the practices of high-achieving countries and the research that supports student constructed algorithms appears highly suspect.

Additionally, the way Common Core presents geometry is not research-based—and the only country that tried this approach on a large scale, the old USSR [Union of Soviet Socialist Republics], rapidly abandoned it. The problem is that—though the outlined approach to geometry is rigorous—it depends on too many highly specialized topics, that even math majors at a four year university would not see until their second or more likely their third years. Again, there is no research with actual students that supports the Core Standards approach.

Tied in with the problems in geometry, there are also severe problems with the way Common Core handles percents, ratios, rates, and proportions—the critical topics that are essential if students are to learn more advanced topics such as trigonometry, statistics, and even calculus. …

The classic method of, for example, adding two-digit numbers is to add the digits in the “ones” column, carry the tens in the sum to the “tens” column, then add the “tens” digits, and so on. This “standard algorithm” works first time, every time. But instead of preparing for and teaching this method, by first carefully studying and understanding the meaning of our place value notation, as they do in the high achieving countries, Common Core creates a three-step process starting with student constructed algorithms guided by the absurd belief that all this content is somehow innate. …

I cannot emphasize enough that Common Core is using our children for a huge and risky experiment, one that consistently failed when tried by individual states such as California in the early 1990’s and even countries such as the old USSR in the 1970’s.

[712] Report: “Race to the Top Program Executive Summary.” U.S. Department of Education, November 2009. <files.eric.ed.gov>

Page 2:

On February 17, 2009, President Obama signed into law the American Recovery and Reinvestment Act of 2009 (ARRA), historic legislation designed to stimulate the economy, support job creation, and invest in critical sectors, including education. The ARRA lays the foundation for education reform by supporting investments in innovative strategies that are most likely to lead to improved results for students, long-term gains in school and school system capacity, and increased productivity and effectiveness.

The ARRA provides $4.35 billion for the Race to the Top Fund, a competitive grant program designed to encourage and reward States that are creating the conditions for education innovation and reform; achieving significant improvement in student outcomes, including making substantial gains in student achievement, closing achievement gaps, improving high school graduation rates, and ensuring student preparation for success in college and careers; and implementing ambitious plans in four core education reform areas….

[713] Webpage: “Race to the Top.” White House. Accessed October 30, 2015 at <obamawhitehouse.archives.gov>

To date, President Obama’s Race to the Top initiative has dedicated over $4 billion to 19 states that have created robust plans that address the four key areas of K–12 education reform as described below. …

Forty-six states and the District of Columbia submitted comprehensive reform plans to compete in the Race to the Top competition. While 19 states have received funding so far, 34 states modified state education laws or policies to facilitate needed change, and 48 states worked together to create a voluntary set of rigorous college- and career-ready standards.

[714] Code of Federal Regulations Title 34, Subtitle B, Chapter II: “Race to the Top Fund.” U.S. Government Printing Office, November 18, 2009. <www.govinfo.gov>

Page 59689:

In response to comments indicating that some States would have difficulty meeting a June 2010 deadline for adopting a new set of common, kindergarten-to-grade-12 (K–12) standards, this notice extends the deadline for adopting standards as far as possible, while still allowing the Department to comply with the statutory requirement to obligate all Race to the Top funds by September 30, 2010. As set forth in criterion (B)(1)(ii), the new deadline for adopting a set of common K–12 standards is August 2, 2010. States that cannot adopt a common set of K–12 standards by this date will be evaluated based on the extent to which they demonstrate commitment and progress toward adoption of such standards by a later date in 2010 (see criterion (B)(1) and Appendix B). Evidence supporting the State’s adoption claims will include a description of the legal process in the State for adopting standards, and the State’s plan, current progress against that plan, and timeframe for adoption.

For criteria (B)(1) and (B)(2) (proposed criteria (A)(1) and (A)(2), respectively), regarding the development and adoption of common, high-quality standards and assessments, the term “significant number of States” has been further explained in the scoring rubric that will be used by reviewers to judge the Race to the Top applications (see Appendix B). The rubric clarifies that, on this aspect of the criterion, a State will earn “high” points if its consortium includes a majority of the States in the country; it will earn “medium” or “low” points if its consortium includes one-half or fewer of the States in the country.

Further, for criterion (B)(2), concerning the development and implementation of common, high-quality assessments, States will be asked to present, as evidence, copies of their Memoranda of Agreement showing that the State is part of a consortium that intends to develop high-quality assessments aligned with the consortium’s common set of standards. This is similar to the evidence required for criterion (B)(1) concerning the development and adoption of common standards.

Pages 59711–12:

Discussion: The Department is encouraging States to develop a common set of high-quality K–12 standards that are internationally benchmarked and that build toward college- and career-readiness by the time of high school graduation. In addition, the Department is encouraging States to develop and implement common, high-quality assessments that are aligned with those standards. Thus, criterion (B)(1) assesses the extent to which a State has demonstrated its commitment to adopting a common set of high-quality standards, and criterion (B)(2) assesses the extent to which the State has demonstrated its commitment to improving the quality of its assessments. It is a State’s responsibility to determine the content of those standards and assessments, including whether to develop a common set of core STEM [Science, Technology, Engineering, and Math] standards and assessments. Likewise, States are responsible for establishing high school graduation requirements. Thus, whether or not four years of STEM courses are included as a requirement for graduation from high school is a decision that is made by States, not the Federal Government.

Page 59802:

B. Standards and Assessments

State Reform Conditions Criteria

(B)(1) Developing and adopting common standards: The extent to which the State has demonstrated its commitment to adopting a common set of high-quality standards, evidenced by (as set forth in Appendix B)—

(i) The State’s participation in a consortium of States that—

(a) Is working toward jointly developing and adopting a common set of K–12 standards (as defined in this notice) that are supported by evidence that they are internationally benchmarked and build toward college and career readiness by the time of high school graduation; and

(b) Includes a significant number of States; and

(ii)(a) For Phase 1 applications, the State’s high-quality plan demonstrating its commitment to and progress toward adopting a common set of K–12 standards (as defined in this notice) by August 2, 2010, or, at a minimum, by a later date in 2010 specified by the State, and to implementing the standards thereafter in a well-planned way; or

(b) For Phase 2 applications, the State’s adoption of a common set of K–12 standards (as defined in this notice) by August 2, 2010, or, at a minimum, by a later date in 2010 specified by the State in a high-quality plan toward which the State has made significant progress, and its commitment to implementing the standards thereafter in a well-planned way.12

(B)(2) Developing and implementing common, high-quality assessments: The extent to which the State has demonstrated its commitment to improving the quality of its assessments, evidenced by (as set forth in Appendix B) the State’s participation in a consortium of States that—

(i) Is working toward jointly developing and implementing common, high-quality assessments (as defined in this notice) aligned with the consortium’s common set of K–12 standards (as defined in this notice); and

(ii) Includes a significant number of States.

Reform Plan Criteria

(B)(3) Supporting the transition to enhanced standards and high-quality assessments: The extent to which the State, in collaboration with its participating LEAs [Local Education Agency] (as defined in this notice), has a high-quality plan for supporting a statewide transition to and implementation of internationally benchmarked K–12 standards that build toward college and career readiness by the time of high school graduation, and high-quality assessments (as defined in this notice) tied to these standards. State or LEA activities might, for example, include: developing a rollout plan for the standards together with all of their supporting components; in cooperation with the State’s institutions of higher education, aligning high school exit criteria and college entrance requirements with the new standards and assessments; developing or acquiring, disseminating, and implementing high-quality instructional materials and assessments (including, for example, formative and interim assessments (both as defined in this notice)); developing or acquiring and delivering high-quality professional development to support the transition to new standards and assessments; and engaging in other strategies that translate the standards and information from assessments into classroom practice for all students, including high-need students (as defined in this notice).

12 Phase 2 applicants addressing selection criterion (B)(1)(ii) may amend their June 1, 2010 application submission through August 2, 2010 by submitting evidence of adopting common standards after June 1, 2010.

[715] Article: “How Bill Gates Pulled Off the Swift Common Core Revolution.” By Lyndsey Layton. Washington Post, June 7, 2014. <www.washingtonpost.com>

Gates money went to state and local groups, as well, to help influence policymakers and civic leaders. And the idea found a major booster in President Obama, whose new administration was populated by former Gates Foundation staffers and associates. …

As Race to the Top was being drafted, the [Obama] administration and the Gates-led effort were in close coordination.

An early version highlighted the Common Core standards by name, saying that states that embraced those specific standards would be better positioned to win federal money. That worried Wilhoit, who feared that some states would consider that unwanted—and possibly illegal—interference from Washington. He took up the matter with Weiss. …

The words “Common Core” were deleted.

The administration said states could develop their own “college and career ready” standards, as long as their public universities verified that those standards would prepare high school graduates for college-level work.

Still, most states eyeing Race to the Top money opted for the easiest route and signed onto the Common Core.

[716] Letter from U.S. Education Secretary Arne Duncan to the Chief State School Officers, September 23, 2011. <www2.ed.gov>

Over the past few years, States and districts have initiated groundbreaking reforms and innovations to increase the quality of instruction and improve academic achievement for all students. Forty-four States and the District of Columbia have adopted a common set of State-developed college- and career-ready standards, and 46 States and the District of Columbia are developing high-quality assessments aligned with these standards.

[717] Webpage: “Common Core State Standards Adoption Map.” Certica Solutions. Accessed April 18, 2019 at <statestandards.certicasolutions.com>

State | Adopted | Adoption Type | Final Adoption
Alabama | Adopted with Modifications | Full | 11/28/2010
Alaska | Not Adopted | N/A |
Arizona | Adopted with Modifications | Incremental | 6/28/2010
Arkansas | Withdrawn | Incremental | 7/12/2010
California | Adopted with Modifications | Incremental | 8/2/2010
Colorado | Adopted with Modifications | Full | 8/30/2010
Connecticut | Adopted Verbatim | Incremental | 7/7/2010
Delaware | Adopted Verbatim | Incremental | 8/19/2010
District of Columbia | Adopted Verbatim | Incremental | 7/5/2010
Florida | Adopted with Modifications | Incremental | 7/27/2010
Georgia | Adopted with Modifications | Full | 7/8/2010
Hawaii | Adopted Verbatim | Incremental | 6/18/2010
Idaho | Adopted Verbatim | Full | 1/24/2011
Illinois | Adopted with Modifications | Incremental | 6/24/2010
Indiana | Withdrawn | N/A | 8/3/2010
Iowa | Adopted with Modifications | Incremental | 7/9/2010
Kansas | Adopted with Modifications | Incremental | 10/12/2010
Kentucky | Adopted Verbatim | Full | 2/10/2010
Louisiana | Withdrawn | Incremental | 7/10/2010
Maine | Adopted Verbatim | Incremental | 4/4/2011
Maryland | Adopted Verbatim | Incremental | 6/1/2010
Massachusetts | Adopted with Modifications | Incremental | 7/21/2010
Michigan | Adopted Verbatim | Full | 6/15/2010
Minnesota | Partially Adopted | Full | 9/1/2010
Mississippi | Adopted with Modifications | Incremental | 7/2/2010
Missouri | Withdrawn | Full | 6/15/2010
Montana | Adopted with Modifications | Full | 11/4/2011
Nebraska | Not Adopted | N/A |
Nevada | Adopted Verbatim | Incremental | 6/18/2010
New Hampshire | Adopted Verbatim | Full | 7/8/2010
New Jersey | Adopted with Modifications | Incremental | 6/16/2010
New Mexico | Adopted with Modifications | Full | 10/29/2010
New York | Withdrawn | Incremental | 7/19/2010
North Carolina | Withdrawn | Full | 6/4/2010
North Dakota | Withdrawn | Full | 6/24/2010
Ohio | Adopted with Modifications | Full | 6/7/2010
Oklahoma | Withdrawn | N/A | 6/24/2010
Oregon | Adopted with Modifications | Incremental | 10/28/2010
Pennsylvania | Adopted with Modifications | Incremental | 7/1/2010
Rhode Island | Adopted Verbatim | Full | 7/1/2010
South Carolina | Withdrawn | Incremental | 7/14/2010
South Dakota | Adopted Verbatim | Incremental | 11/29/2010
Tennessee | Withdrawn | Incremental | 7/30/2010
Texas | Not Adopted | N/A |
Utah | Adopted with Modifications | Incremental | 8/6/2010
Vermont | Adopted Verbatim | Incremental | 8/17/2010
Virginia | Not Adopted | N/A |
Washington | Adopted Verbatim | Incremental | 6/1/2012
West Virginia | Withdrawn | Incremental | 5/12/2010
Wisconsin | Adopted Verbatim | Incremental | 6/2/2010
Wyoming | Adopted Verbatim | Full | 6/15/2012

[718] Webpage: “Common Core State Standards.” National Conference of State Legislatures, May 1, 2014. <bit.ly>

How Did States Adopt the Common Core State Standards?

In most states, state law delegates to state boards of education the authority to establish or adopt academic standards for statewide K–12 public education systems. In four states, however, the legislature retains authority to grant final approval of academic standards. The table below contains a breakdown of the state actor who adopted the Standards. …

Government Actor/Agency Who Adopted the Standards:

Board of Education (or Comparable State Agency)

Alabama, Arizona, Arkansas, California, Colorado, Connecticut, Delaware, District of Columbia, Florida, Georgia, Hawaii, Indiana,† Iowa, Kansas, Louisiana, Maryland, Massachusetts, Michigan, Mississippi, Missouri, Montana, New Hampshire, New Jersey, New York, North Carolina, Northern Mariana Islands, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Utah, Vermont, West Virginia, Wyoming

Chief State Education Officer (or Similar State Actor)

Minnesota***, New Mexico, North Dakota, Wisconsin

***Minnesota adopts standards for individual academic subjects on an annual cycle as dictated by statute. During the 2009–2010 school year, the statute authorized the Commissioner of Education to “revise and align the state’s academic standards and high school graduation requirements in language arts” only. Minnesota last revised its mathematics standards during the 2006–2007 school year. The state’s mathematics standards will not be reviewed again until the 2015–2016 school year. (See Minn. Stat. § 120B.023(2)).

Legislative Directed, Reviewed or Gave Final Approval

Idaho (State Senate Education Committee approved Board of Education’s decision to adopt)

Illinois (reviewed by Joint Committee on Administrative Rules)

Kentucky (Senate Bill 1 (2009) directed the Kentucky Board of Education, the Council on Postsecondary Education, and the Education Professional Standards Board to revise the state’s academic standards based on college- and career-readiness criteria. The chairs of those three agencies later signed a formal resolution directing their agencies to implement the Standards, which finalized adoption.)

Maine (Legislative Document 12 (2011) authorized the Department of Education’s adoption of the Standards)

Nevada (The Council to Establish Academic Standards first adopted the Standards. The Board of Education formalized the adoption with their approval. A legislative commission then reviewed the Standards’ adoption.)

Oklahoma (Senate Bill 2033 (2010) directed the Board of Education to adopt the Standards by Aug. 1, 2010, which it did)

Washington (State Superintendent thru authorization from state legislature)

Did Not Adopt

Alaska, Nebraska, Texas, Virginia

† NOTE: As shown in the next footnote, Indiana had pulled out of Common Core before the publication date of this webpage.

[719] Article: “Indiana Drops Common Core Education Standards.” By Denver Nicks. Time, March 25, 2014. <time.com>

“Indiana Gov. Mike Pence signed legislation Monday making his the first state to withdraw from national Common Core education standards that have become a lightning rod for critics of federal government overreach.”

[720] From March 26 to 28, 2019, Just Facts conducted an extensive search of the status of Common Core in the states and found the following:

# | State | Common Core Adopted | Common Core Replaced | Sources
1 | Alabama | X | | (a) (q)
2 | Alaska | | | (a)
3 | Arizona | X | X | (a) (b) (q)
4 | Arkansas | X | X | (a) (c) (q)
5 | California | X | | (a) (q)
6 | Colorado | X | | (a)
7 | Connecticut | X | | (a)
8 | Delaware | X | | (a)
9 | District of Columbia | X | | (a)
10 | Florida | X | X | (a) (q)
11 | Georgia | X | | (a) (q)
12 | Hawaii | X | | (a)
13 | Idaho | X | | (a) (q)
14 | Illinois | X | | (a)
15 | Indiana | X | X | (a) (d) (q)
16 | Iowa | X | | (a) (q)
17 | Kansas | X | | (a)
18 | Kentucky | X | X | (a) (e)
19 | Louisiana | X | X | (a) (f) (q)
20 | Maine | X | | (a)
21 | Maryland | X | | (a)
22 | Massachusetts | X | X | (a) (g) (q)
23 | Michigan | X | | (a)
24 | Minnesota | ELA | | (a)
25 | Mississippi | X | | (a) (q)
26 | Missouri | X | X | (a) (h)
27 | Montana | X | | (a) (q)
28 | Nebraska | | | (a)
29 | Nevada | X | | (a)
30 | New Hampshire | X | | (a)
31 | New Jersey | X | X | (a) (i) (q)
32 | New Mexico | X | | (a)
33 | New York | X | X | (a) (j) (q)
34 | North Carolina | X | | (a) (k) (q)
35 | North Dakota | X | | (a) (q)
36 | Ohio | X | | (a) (q)
37 | Oklahoma | X | X | (a) (l) (q)
38 | Oregon | X | | (a)
39 | Pennsylvania | X | | (a) (q)
40 | Rhode Island | X | | (a)
41 | South Carolina | X | X | (a) (m) (q)
42 | South Dakota | X | X | (a) (n)
43 | Tennessee | X | X | (a) (o) (q)
44 | Texas | | | (a)
45 | Utah | X | | (a) (q)
46 | Vermont | X | | (a)
47 | Virginia | | | (a)
48 | Washington | X | | (a)
49 | West Virginia | X | X | (a) (p) (q)
50 | Wisconsin | X | | (a)
51 | Wyoming | X | | (a)
Total Number | | 47 | 14 |

SOURCES:

  (a) Webpage: “Map: Tracking the Common Core State Standards.” Education Week, September 18, 2017. <www.edweek.org>
  (b) Article: “State Board Votes to Replace Common Core Standards.” Arizona Department of Education, December 21, 2016. <www.azed.gov>
  (c) Article: “Schools to Start Using New Standards.” By Cynthia Howell. Arkansas Democrat Gazette, July 22, 2016. <www.arkansasonline.com>
  (d) Webpage: “Indiana Academic Standards.” Indiana Department of Education, March 26, 2019. <www.in.gov>
  (e) Webpage: “Kentucky Academic Standards.” Kentucky Department of Education, March 24, 2019. <education.ky.gov>
  (f) Webpage: “BESE Approves Louisiana Student Standards, Adopts 2016–17 Education Funding Formula.” Louisiana Board of Elementary and Secondary Education, March 4, 2016. <bese.louisiana.gov>
  (g) Press release: “Massachusetts Adopts Revised English Language Arts and Math Standards.” Massachusetts Department of Elementary and Secondary Education, March 28, 2017. <www.doe.mass.edu>
  (h) Webpage: “State Board Approves New Missouri Learning Standards.” Missouri Department of Elementary and Secondary Education, April 19, 2016. <dese.mo.gov>
  (i) Article: “N.J. Revises, Renames Common Core Academic Standards.” New Jersey Star Ledger, May 4, 2016. <www.nj.com>
  (j) Webpage: “Board of Regents P-12 Committee Approves Next Generation Learning Standards.” New York State Education Department, September 11, 2017. <www.nysed.gov>
  (k) “Minutes of the North Carolina State Board of Education, January 1–February 1, 2018.” North Carolina State Board of Education, February 1, 2018. <www.dpi.nc.gov>
  (l) Report: “Oklahoma Academic Standards (OAS).” Oklahoma State Department of Education, March 2017. <sde.ok.gov>
  (m) Webpage: “New Standards for English Language Arts and Mathematics.” South Carolina Department of Education. Accessed March 28, 2019 at <ed.sc.gov>
  (n) “Minutes of the South Dakota Board of Education Standards.” South Dakota Board of Education Standards, March 19, 2018. <doe.sd.gov>
  (o) Webpage: “Math and English Language Arts.” Tennessee State Board of Education. Accessed March 28, 2019 at <www.tn.gov>
  (p) Webpage: “Standards.” West Virginia Department of Education. Accessed March 28, 2019 at <wvde.us>
  (q) Report: “Strong Standards: A Review of Changes to State Standards Since the Common Core.” Achieve, November 13, 2017. Updated 3/29/2019. <www.achieve.org>
  (r) Article: “Common Core Math ‘Eradicated,’ Ivey Says, After Alabama School Board Vote.” By Tricia Powell. AL.com, December 12, 2019. <www.al.com>
  (s) Press release: “Governor DeSantis Eliminates Common Core.” Florida Department of Education, February 7, 2020. <www.fldoe.org>
  (t) Article: “Idaho Lawmakers Enact New Education Standards, Replacing Common Core.” CBS2 Idaho News, March 24, 2022. <idahonews.com>
  (u) Article: “Background on North Dakota’s Use of Common Core.” Journal Tioga-Tribune, April 26, 2022. <www.journaltrib.com>
  (v) Article: “Quarrel Over Common Core: A Pennsylvania Primer.” By Randy Kraft. WFMZ-TV 69 News, September 19, 2014. Updated 10/4/2019. <www.wfmz.com>

[721] Senate Bill 44: “Prohibiting the State From Requiring Implementation of Common Core Standards and Relative to the Amendment or Approval of Academic Standards.” State of New Hampshire, 2017 Regular Session. Signed into law by Chris Sununu on July 18, 2017. <legiscan.com>

Effective 09/16/2017 …

This bill prohibits the department of education and the state board of education from requiring the implementation of the common core standards in any school or school district in this state. …

(a) The minimum standards for public school approval for the areas identified in paragraph I shall constitute the opportunity for the delivery of an adequate education. The general court shall periodically, but not less frequently than every 10 years, review, revise, and update, as necessary, the minimum standards identified in paragraph I and shall ensure that the high quality of the minimum standards for public school approval in each area of education identified in paragraph I is maintained. Changes made by the board of education to the school approval standards through rulemaking after the effective date of this section shall not be included within the standards that constitute the opportunity for the delivery of an adequate education without prior adoption by the general court. The board of education shall provide written notice to the speaker of the house of representatives, the president of the senate, and the chairs of the house and senate education committees of any changes to the school approval standards adopted pursuant to RSA [Revised Statutes Annotated] 541-A.

(b) Neither the department of education nor the state board of education shall by statute or rule require that the common core state standards developed jointly by the National Governors Association Center for Best Practices and the Council of Chief State School Officers be implemented in any school or school district in this state.

[722] Press release: “Governor DeSantis Eliminates Common Core.” Florida Department of Education, February 7, 2020. <www.fldoe.org>

At the direction of Governor Ron DeSantis, the Department of Education released the proposed Florida B.E.S.T. (Benchmarks for Excellent Student Thinking) Standards for English Language Arts (ELA) and Mathematics, and announced that Common Core has been officially eradicated from Florida classrooms. The Commissioner is recommending that the State Board of Education formally adopt these standards February 12.

“Florida has officially eliminated Common Core. I truly think this is a great next step for students, teachers, and parents,” said Governor Ron DeSantis. “We’ve developed clear and concise expectations for students at every grade level and allow teachers the opportunity to do what they love most—inspire young Floridians to achieve their greatest potential. These standards create pathways for students that lead to great college and professional outcomes and parents will now be able to reinforce what their children are learning in the classroom every day. Florida’s B.E.S.T. Standards were made by Florida teachers for Florida students, and I know they will be a model for the rest of the nation.”

[723] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 1: “The Common Core State Standards represent what American students need to know and do to be successful in college and careers.”

[724] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 3: “These common standards are an important step in bringing about a real and meaningful transformation of the education system for the benefit of all students.”

[725] Webpage: “About the Standards.” Common Core State Standards Initiative. Accessed October 10, 2015 at <www.thecorestandards.org>

The Common Core is informed by the highest, most effective standards from states across the United States and countries around the world. The standards define the knowledge and skills students should gain throughout their K–12 education in order to graduate high school prepared to succeed in entry-level careers, introductory academic college courses, and workforce training programs.

The standards are:

1. Research- and evidence-based

2. Clear, understandable, and consistent

3. Aligned with college and career expectations

4. Based on rigorous content and application of knowledge through higher-order thinking skills

5. Built upon the strengths and lessons of current state standards

6. Informed by other top performing countries in order to prepare all students for success in our global economy and society

[726] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 6:

R. James Milgram

Emeritus Professor at Stanford University’s Department of Mathematics

Milgram, one of the authors of the California Mathematics Standards and the California Mathematics Framework, has worked with a number of states, and with the Achieve Mathematics Advisory Panel, on standards in education. As a member of the National Board for Education Sciences, he has worked with the U.S. Department of Education on the math that pre-service K–8 teachers need to know and understand. …

Sandra Stotsky

Endowed Chair in Teacher Quality at the University of Arkansas’s Department of Education Reform and Chair of the Sadlier Mathematics Advisory Board

Stotsky has abundant experience in developing and reviewing ELA [English Language Arts] standards. As senior associate commissioner of the Massachusetts Department of Education, she helped revise pre-K–12 standards. She also served on the 2009 steering committee for NAEP [National Assessment of Educational Progress] reading and on the 2006 National Math Advisory Panel.

[727] “Testimony for the House Study Committee on the Role of Federal Government in Education.” By Sandra Stotsky (University of Arkansas, Department of Education Reform). Georgia General Assembly, House Study Committee on the Role of the Federal Government in Education. September 24, 2014. <bit.ly>

Page 1:

I begin with remarks on Common Core’s Validation Committee, on which I served from 2009–2010. This committee, which was created to put the seal of approval on Common Core’s standards, was invalid both in its membership and in the procedures it was told to follow. …

… Common Core’s standards did not emerge from a state-led process and were not written by nationally known experts, claims regularly made by its advocates. In fact, the people who wrote the standards were not qualified to draft K–12 standards at all. …

… Not only were no high school mathematics teachers involved, no English professors or high school English teachers were, either. Because everyone worked without open meetings or accessible public comment, their reasons for making the decisions they did are lost to history. To this day we do not know why Common Core’s high school mathematics standards do not provide a pathway to STEM [Science, Technology, Engineering, and Math] careers or why David Coleman was allowed to mandate a 50/50 division between literary study and “informational” text at every grade level from K–12 in the ELA [English Language Arts] standards, with no approval from English teachers across the country or from the parents of students in our public schools.

The absence of relevant professional credentials in the two standards-writing teams helps to explain the flaws in these standards, on which costly tests are based and scheduled to be given in Georgia in 2015–2016. The “lead” writers for the ELA standards, David Coleman and Susan Pimentel, had never taught reading or English in K–12 or at the college level. Neither has a doctorate in English, nor published serious work on curriculum and instruction. They were virtually unknown to English language arts educators and to higher education faculty in rhetoric, speech, composition, or literary study.

None of the three lead standards-writers in mathematics, Jason Zimba, William McCallum, and Phil Daro, the only member of this three-person team with teaching experience, had ever developed K–12 mathematics standards before. Who wanted these people as standards-writers and why, we still do not know. No one in the media showed the slightest interest in their qualifications or the low level of college readiness they aimed for on a grade 11 test.

Page 2:

Why didn’t I sign off on Common Core’s standards? Professor Milgram and I were two of the five members of the VC [Validation Committee] who did not sign off on the standards. So far as we could determine, the Validation Committee was intended to function as a rubber stamp even though we had been asked to validate the standards. Despite repeated requests, we did not get the names of countries whose standards were supposedly used as benchmarks for Common Core’s. So far as I could figure out, Common Core’s standards were intentionally not made comparable to the most demanding sets of standards elsewhere. It did not offer any research evidence to justify its omission of high school mathematics standards leading to STEM careers, its stress on writing over reading, its division of reading instructional texts into “information” and “literature,” its deferral of the completion of Algebra I to grade 9 or 10, and its experimental approach to teaching Euclidean geometry. Nor did Common Core offer evidence that its standards meet entrance requirements for most colleges and universities in this country or elsewhere—or for a high school diploma in many states.

[728] “Testimony for a Hearing on Indiana Senate Bill No. 373.” By Sandra Stotsky. University of Arkansas, January 25, 2012. <www.justfacts.com>

Page 2:

Common Core’s “college readiness” standards are not content standards but simply empty skill sets. To judge by the reading levels of the high school examples of “complexity” in Common Core’s Appendix B, the average reading level of the passages on the common tests now being developed to determine “college-readiness” may be at about the grade 7 level.

… Common Core’s “college readiness” ELA/R [English Language Arts and Reading] standards were deliberately designed as empty skill sets to enable a large number of high school students to be declared “college ready” and to enroll in post-secondary institutions that will have no choice but to place them in credit-bearing courses. These institutions will then likely be under pressure from the USDE [U.S. Department of Education] to retain these students in order to increase college graduation rates even if they are reading at only middle school level.

Page 3:

After the Common Core Initiative was launched in early 2009, the National Governors Association and the Council of Chief State School Officers never explained to the public what the qualifications were for membership on the standards-writing committees or how it would justify the specific standards they created. Most important, they never explained why Common Core’s high school exit standards were equal to college admission requirements without qualification, even though this country’s wide-ranging post-secondary institutions use a variety of criteria for admission.

Eventually responding to the many charges of a lack of transparency, the names of the 24 members of the “Standards Development Work Group” were revealed in a July 1, 2009 news release. The vast majority, it appeared, work for testing companies. Not only did CCSSO [Council of Chief State School Officers] and NGA [National Governors Association] give no rationale for the composition of this Work Group, it gave no rationale for the people it put on the two three-member teams in charge of writing the grade-level standards.

[729] Testimony: “Issues With Core Math Standards.” By R. James Milgram (Professor of Mathematics, Stanford University). Arkansas State Legislature, July 23, 2013. <www.arkleg.state.ar.us>

Pages 1–2 (of PDF):

[T]he new Common Core national standards … are claimed to be research based, but the main reason I could not sign off on them was that there were too many areas where the writing team could not show me suitable research that justified their handling of key topics—particularly when they differed from standard approaches. …

The three most severe problem areas are

1. the beginning handling of whole numbers in particular adding, subtracting, multiplying, and dividing;

2. the handling of geometry in middle school and high school;

3. the very low level expectations for high school graduation that barely prepare students for attending a community college, let alone a 4-year university.

Unfortunately, these are the three most crucial areas where our math outcomes have to improve.

Core Standard’s approach to whole numbers is just the continuation of the approach pioneered in California in the early 1990’s that had such bad outcomes that it spawned the Math Wars. Moreover, the use of student-constructed algorithms is at odds with the practices of high-achieving countries and the research that supports student constructed algorithms appears highly suspect.

Additionally, the way Common Core presents geometry is not research-based—and the only country that tried this approach on a large scale, the old USSR [Union of Soviet Socialist Republics], rapidly abandoned it. The problem is that—though the outlined approach to geometry is rigorous—it depends on too many highly specialized topics, that even math majors at a four year university would not see until their second or more likely their third years. Again, there is no research with actual students that supports the Core Standards approach.

Tied in with the problems in geometry, there are also severe problems with the way Common Core handles percents, ratios, rates, and proportions—the critical topics that are essential if students are to learn more advanced topics such as trigonometry, statistics, and even calculus. …

The classic method of, for example, adding two-digit numbers is to add the digits in the “ones” column, carry the tens in the sum to the “tens” column, then add the “tens” digits, and so on. This “standard algorithm” works first time, every time. But instead of preparing for and teaching this method, by first carefully studying and understanding the meaning of our place value notation, as they do in the high achieving countries, Common Core creates a three-step process starting with student constructed algorithms guided by the absurd belief that all this content is somehow innate. …

I cannot emphasize enough that Common Core is using our children for a huge and risky experiment, one that consistently failed when tried by individual states such as California in the early 1990’s and even countries such as the old USSR in the 1970’s.
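The “standard algorithm” Milgram contrasts with student-constructed approaches (add the “ones” column, carry the tens, repeat for each higher column) can be illustrated with a short sketch. The function below is a hypothetical example written for this note, not anything from the testimony or the Common Core documents:

```python
def add_by_columns(a: int, b: int) -> int:
    """Standard column addition for non-negative integers:
    add digits right to left, carrying tens into the next column."""
    digits_a = [int(d) for d in str(a)]
    digits_b = [int(d) for d in str(b)]
    result_digits = []
    carry = 0
    while digits_a or digits_b or carry:
        column = (digits_a.pop() if digits_a else 0) \
               + (digits_b.pop() if digits_b else 0) + carry
        carry, ones = divmod(column, 10)  # carry the tens, keep the ones
        result_digits.append(ones)
    return int("".join(str(d) for d in reversed(result_digits)))

# 7 + 8 = 15: write 5, carry 1; then 4 + 3 + 1 = 8
assert add_by_columns(47, 38) == 85
```

As the testimony notes, this procedure “works first time, every time,” because each column sum is at most 9 + 9 + 1 = 19, so the carry into the next column is never more than 1.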

[730] Report: “Frequently Asked Questions.” Common Core State Standards Initiative. Common Core State Standards Initiative, June 5, 2014. <www.thecorestandards.org>

Page 4:

Are There Plans to Develop Common Standards in Other Areas in the Future?

CCSSO [Council of Chief State School Officers] and NGA [National Governors Association] are not leading the development of standards in other academic content areas. Below is information on efforts of other organizations to develop standards in other academic subjects.

• Science: States have developed Next Generation Science Standards in a process managed by Achieve, with the help of the National Research Council, the National Science Teachers Association, and the American Association for the Advancement of Science. …

• World languages: The American Council on the Teaching of Foreign Languages published an alignment of the National Standards for Learning Languages with the ELA [English Language Arts] Common Core State Standards. …

• Arts: The National Coalition for Core Arts Standards is leading the revision of the National Standards for Arts Education.

[731] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 10: “Decentralization is defined as the transfer of decision-making authority, responsibility, and tasks from higher to lower organizational levels or between organizations.”

Page 11: “There are three major forms of decentralization … Deconcentration … Delegation … Devolution…. Privatization is a form of devolution as responsibility and resources are transferred from public to private sector institutions (Rondinelli, 1990).”

[732] Book: Mapping the Terrain of Education Reform: Global Trends and Local Responses in the Philippines. By Vicente Chua Reyes, Jr. Routledge, 2016.

Page 11: “Centralized bodies of education systems proven to be ineffective in delivering its mandated services have steadily heard the clamor for transforming themselves, devolving and decentralizing its powers to those that could be most effective (Cuban, 1990; Tyack & Cuban, 1997).”

[733] Book: Common Core Meets Education Reform: What It All Means for Politics, Policy, and the Future of Schooling. Edited by Frederick M. Hess and Michael Q. McShane. Teachers College Press (Columbia University), 2014.

Chapter 5: “Accountability: A Story of Opportunities and Challenges.” By Deven Carlson. Pages 96–117.

Page 113:

On the one hand, supporters of more centralized education policy are enthusiastic about the fact that the Common Core Standards specify a single body of knowledge and skills that students in 45 states and the District of Columbia will be expected to possess, and they are willing to accept the fact that the Common Core Standards are not formal federal policy.

[734] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 10:

The principal arguments behind educational centralization are, as Winkler (1993) and Wheeler (1993) observed:

• … the equitable allocation of resources to reduce regional economic disparities;

• … the equitable allocation of policy and programmatic uniformity, to establish consistency and quality, programs and activities (for example, curriculum, hiring, examinations, delivery of administrative services) …

• … the equitable allocation of improved teaching-learning, a tightly controlled curriculum can be one policy response to the problem of poorly qualified teachers.

Page 15:

A potential danger of decentralizing a national educational system is that the regional or municipal systems may go their own way to the extent that intersystem continuity and articulation are significantly compromised. For example, different regions can develop distinct salary structures, with the better educators gravitating toward the higher-paying areas; algebra can be taught in the seventh grade in one region and the ninth in other; and the school calendar across the country can vary significantly.

[735] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 94: “Centralization and standardization, the opposite of loose coupling, are methods which facilitate equality. American higher education is decentralized and so is Swiss education. Such systems are characterized by great disparities between schools and between districts.”

Page 95:

Differences in educational opportunities exist in all countries, but they tend to be greater in decentralized systems than in centralized ones. … Local occupational structures obviously affect educational attainment in every society. Thus, communities that are characterized by high-status occupations will exhibit higher than average educational attainment. Whenever comparisons between communities are made concerning the net contribution of educational resources, occupational structure must be controlled. …

Localized adaptation may also foster idiosyncratic standards. If no standardization exists, schools can postulate anything as satisfying graduation requirements. The development of standardized testing constitutes a response to this problem in the United States, but many graduates are led to believe that they have received a certain kind of education when, in reality their achievement is low. In the United States, the situation prevails both at the secondary and university levels.

[736] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 10:

The principal arguments behind educational centralization are, as Winkler (1993) and Wheeler (1993) observed: …

• financial, to benefit through economies of scale … quality, programs and activities (for example, curriculum, hiring, examinations, delivery of administrative services);

• financial, to benefit through economies of scale … central placement of scarce human resources, to place strategically the scarce, skilled human resource at those points in the institution where their impact can reach across the entire educational system….

[737] Book: Encyclopedia of Educational Leadership and Administration (Volume 1). Edited by Fenwick W. English. Sage Publications, 2006. Article: “Finance, of Public Schools.” By Scott R. Sweetland.

Page 394:

Increased centralization of funding and organization structures was meant to save money and improve education. Economies of scale were realized when small schools and districts were consolidated into larger schools and districts. Greater numbers of children in classrooms and numbers of teachers in buildings, for example, reduced the costs of teachers per child and principals per teacher.

[738] Book: Educational Planning: The International Dimension. Edited by Jacques Hallak and Francoise Caillods. International Bureau of Education, International Institute for Educational Planning. Garland Publishing, 1995.

Chapter: “Managing Schools for Educational Quality and Equity: Finding the Proper Mix to Make it Work.” By Jacques Hallak. Pages 107–118.

Page 108:

Centralized systems have proved most effective in countries characterized by a politically and economically stable environment, strong administrative systems, good infrastructure, comparatively well-educated and compensated teachers, and a relatively homogeneous context for schooling. For example, this has been the case in the Republic of Korea and Japan.

However, in countries with long distances between individual schools in the center, great ethnic and linguistic diversity, and relatively poorly developed transportation and communication systems, rigid centralization blocks resources and information flows and leads to the inefficient and ineffective operation of the system.

[739] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 95: “By definition, a loosely coupled system is not coordinated. Thus, parts of the system may respond to the immediate environment but may fail to respond to broader or more distant aspects of it. A community may be unaware of changing requirements of the labor force and therefore may fail to adapt its curriculum.”

[740] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 10:

The principal arguments behind educational centralization are, as Winkler (1993) and Wheeler (1993) observed …

• financial, to benefit through economies of scale as well as the equitable allocation of the diffusion of innovation, to spread changes more rapidly through the entire system….

[741] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 94:

A loosely coupled system presents a number of very significant disadvantages, however. The first can be readily disposed of since, in practice, it does not constitute a problem in the American case. A loosely coupled system is characterized by numerous solutions to educational problems. Successful solutions, because they are localized, cannot readily spread throughout the entire system. This is not a problem because professional organizations and their publications ensure that successful solutions are diffused throughout the entire educational world.

[742] Book: Decentralization of Education: Politics and Consensus. By Edward B. Fiske. World Bank, 1996. <books.google.com>

Page 24: “As noted at the outset, virtually all proponents of school decentralization, whatever their stated and unstated objectives, claim that such reorganization will improve the quality of teaching and learning by locating decisions closer to the point at which they must be carried out and by energizing teachers and administrators to do a better job.”

[743] Book: Private and Public School Partnerships: Sharing Lessons About Decentralization. By Jean Madsen. Falmer Press (imprint of Taylor & Francis), 1997.

Page 1:

Site-based management is a business derivative of decentralization and participatory decision-making. The intent of site-based management is to improve student performance by making those closest to the delivery of services—teachers and principals—more autonomous, resulting in their being more responsive to parents and students concerns.

[744] Book: Globalization and Education: The Quest for Quality Education in Hong Kong. Edited by Joshua Ka-ho Mok and David Kin-keung Chan. Hong Kong University Press, 2002.

Chapter 7: “Towards School Management Reform: Organizational Values of Government Schools in Hong Kong.” By Nicholas Sun-keung Pang. Pages 171–194.

Page 174:

Under the external-control management (centralization) of education, many problems are identified. Some principals feel frustrated by the lack of flexibility in acquiring resources they need to do their jobs. The difficulty appears to be that principals responsible for the education of students have little authority in controlling educational resources, while the central government (Education Department), which does not deal directly with students, has authority to control resources. Moreover, principals do not feel particularly accountable and their roles are not clearly specified. A budgeting practice involves principals ordering nonessential items so that grants will be spent entirely, because surpluses cannot be carried forward. The rationale is that if the money is not spent, budgets will be reduced next year. Hence, the principals resort to spending inefficiently.

The adoption of decentralization can go a long way in solving this type of problem. The central government (i.e. Education Department) will be confined to policy matters more, while the central office staff will no longer have direct authority over schools. Decentralization would allow principals more flexibility in decision making. The new flexibility permits initiatives to be taken and encourages long-term school planning. Principals’ authority over many kinds of school resources will be increased and greater pride in their schools can result. Thus decentralization enables the schools to manage their own budgets and it requires them to examine carefully their priorities. Some costs may be reduced and the money saved can be expended on other items to benefit the school. On the whole, decentralization is often considered to be a means to achieve four goals, namely, organizational responsiveness, flexibility, accountability and productivity. All these will, in turn, raise the quality of education in schools.

[745] Paper: “Deregulation and Decentralization of Education in Japan.” By Hiromitsu Muta. Journal of Educational Administration, 2000. Pages 455–467. <www.emeraldinsight.com>

Page 455:

Japan in the late nineteenth century centralized its institutions, including education, in order to catch up with the Western industrialized nations. However, in the late twentieth century, in order to maintain its competitive edge as a world leader in the economic globalization process, the national leadership instituted a series of reforms to deregulate and decentralize the educational system. The objective is to provide sufficient flexibility and local control at the school level that creativity, individual initiative, and the spirit of entrepreneurship will become part of the teaching/learning process for each new generation of Japanese students.

[746] Book: Educational Planning: The International Dimension. Edited by Jacques Hallak and Francoise Caillods. International Bureau of Education, International Institute for Educational Planning. Garland Publishing, 1995.

Chapter: “Managing Schools for Educational Quality and Equity: Finding the Proper Mix to Make it Work.” By Jacques Hallak. Pages 107–118.

Pages 108–109:

Principals are largely excluded from decisions that affect their ability to improve student achievement. Curricula are designed centrally, often with little attention to the diversity of schools and students’ interests. Teachers are often appointed, assigned and evaluated centrally, leaving principals with little control over the choice or discipline of teachers.

The failure of teacher employment policies to take into account regional and local needs, subject-matter and grade-level needs undermines significantly the ability of principals to build and maintain an effective school environment. … The lack of full authority at the school level is most prevalent in highly centralized systems, but even in decentralized ones, authority is not always delegated below the intermediate levels.

[747] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 11: “The goal of increased efficiency through decentralization drove another Venezuelan initiative in the early 1990s but this time the focus was on reducing bureaucratic stagnation, centralized inefficiencies, and corruption (de la Cruz, 1992).”

[748] Book: Educational Planning: The International Dimension. Edited by Jacques Hallak and Francoise Caillods. International Bureau of Education, International Institute for Educational Planning. Garland Publishing, 1995.

Chapter: “Managing Schools for Educational Quality and Equity: Finding the Proper Mix to Make it Work.” By Jacques Hallak. Pages 107–118.

Page 108: “Administrative weaknesses generally arise when managers do not have sufficient authority and/or resources to do their job effectively, when communication channels are blocked, when roles and responsibilities are unclear, and when managers’ time is either consumed with routine tasks or with addressing, on an ad hoc basis, requests by the political structure.”

[749] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 94:

“A loosely coupled system should be relatively inexpensive to run because it takes time and money to coordinate people” (Weick, 1976:8). While we have seen earlier that increased size led to relatively lower administrative ratios, nevertheless the argument that little coordination need take place in loosely coupled systems is a persuasive one. A look at the number of administrators employed by state education organizations as well as by school districts reveals that, indeed, the number of supervisory personnel is small. Thus, it may be that loosely coupled systems are relatively inexpensive.

[750] Book: Encyclopedia of Educational Leadership and Administration (Volume 1). Edited by Fenwick W. English. Sage Publications, 2006. Article: “Finance, of Public Schools.” By Scott R. Sweetland.

Page 394: “Negative consequences also accompanied the structural shift toward the centralized education system. … Transaction costs, including those associated with communicating and coordinating among federal, state, and local authorities, increased.”

[751] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 94:

The greater self-determination of loosely coupled systems may lead to a greater sense of efficacy. There is little doubt that in many American communities the school constitutes the most central institution to which everyone belongs or has belonged. It is a social center, and the emotional attachment to that school is strong. People do tend to feel that the school is theirs, and there is substantial involvement on the part of the community members in school affairs. … If surveys were available on the subject, there is very little doubt that Americans would rank very high on involvement in school affairs.

[752] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 11:

Educational decentralization reforms typically have their roots in the political arena. For example, as nations make the transition from autocratic to democratic forms of government, an almost natural outcome is an effort to decentralize the educational system as one important mechanism of establishing citizen participation in government institutions (Hansen, 1996a).

[753] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 93:

While it would be easy to describe the French system of education as a structure, it is easy to describe the American system, both at the state and national levels, as loosely coupled. Thus, loose coupling does not apply to schools nor, often, to school districts. The concept applies to the state and national systems.

Five distinct advantages are outlined by Weick as relevant to educational organizations (Weick, 1976:6–8). The first is that “loosely coupled systems … ‘know’ their environments better than is true for more tightly coupled systems.” (p. 6). …

The consequence of knowing the environment well is that localized adaptations are possible. Thus, each school district can change its schedule to suit local conditions, can emphasize certain aspects of the curriculum over others (arts over sports, for example), can hire the kinds of teachers it needs, pay them what it wants, determine its particular bureaucratic arrangement, i.e., one which gives either a large or small amount of control to school principals, etc.

[754] Book: Encyclopedia of Educational Leadership and Administration (Volume 1). Edited by Fenwick W. English. Sage Publications, 2006. Article: “Finance, of Public Schools.” By Scott R. Sweetland.

Page 394: “Negative consequences also accompanied the structural shift toward the centralized education system. … State-prescribed curriculum, testing, and graduation requirements emphasized uniformity over specialized programs that were designed to meet local needs.”

[755] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 93:

Furthermore, if a bad or ineffective policy is selected by one school district it does not affect the others. Thus, if the director of elementary schools recommends that reading be taught in a certain way and if it should turn out that that method is ineffective, only children in that school district need be affected. This would be in sharp contrast to the French situation where, for example, all French children would be affected.

[756] Book: Multiple Regression: A Primer. By Paul D. Allison. Pine Forge Press, 1998.

Chapter 1: “What Is Multiple Regression?” <us.sagepub.com>

Pages 20–21:

There’s a more subtle aspect to this problem of statistical control: It’s not enough to be able to measure all the variables that we want to control. We also have to measure them well. … That may not be a serious problem when we’re dealing with variables like gender or age (based on official records), but there are lots of “fuzzy” variables in the social sciences that we can measure only crudely, at best, among them intelligence, depression, need for achievement, marital conflict, and job satisfaction.
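Allison’s point about measuring control variables well can be illustrated with a short simulation (this sketch is not from the cited book; the variable names, effect sizes, and noise levels are hypothetical). A confounder like intelligence drives both education and earnings; when researchers can only “control” for a noisy proxy of it, regression removes part of the bias but not all of it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical data-generating process: ability raises both education
# and earnings, and the true causal effect of education is set to zero.
ability = rng.normal(size=n)                      # true confounder
education = ability + rng.normal(size=n)
earnings = 2.0 * ability + rng.normal(size=n)

proxy = ability + rng.normal(size=n)              # confounder measured with error

def ols_slope(y, controls):
    """OLS coefficient on the first regressor in `controls`."""
    X = np.column_stack([np.ones(n)] + controls)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

naive = ols_slope(earnings, [education])           # no control: fully biased (~1.0)
partial = ols_slope(earnings, [education, proxy])  # noisy control: still biased (~0.67)
full = ols_slope(earnings, [education, ability])   # exact control: ~0, the true effect

print(f"naive={naive:.2f}  noisy control={partial:.2f}  exact control={full:.2f}")
```

Under these assumed noise levels, controlling for the fuzzy proxy removes only about a third of the bias, even though the “right” variable is nominally in the model — which is why crude measures of intelligence, depression, or job satisfaction cannot fully rule out confounding.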

[757] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 11: “There is no such thing as a truly decentralized educational system. In reality, most all decisions (for example, finance, personnel, curriculum) retain degrees of centralization and decentralization—the issue is finding the appropriate balance.”

[758] Book: Educational Planning: The International Dimension. Edited by Jacques Hallak and Francoise Caillods. International Bureau of Education, International Institute for Educational Planning. Garland Publishing, 1995.

Chapter: “Managing Schools for Educational Quality and Equity: Finding the Proper Mix to Make it Work.” By Jacques Hallak. Pages 107–118.

Page 109: “The lack of full authority at the school level is most prevalent in highly centralized systems, but even in decentralized ones, authority is not always delegated below the intermediate levels.”

[759] Book: Private and Public School Partnerships: Sharing Lessons About Decentralization. By Jean Madsen. Falmer Press (imprint of Taylor & Francis), 1997.

Page 1:

Site-based management [SBM] is a business derivative of decentralization and participatory decision-making. The intent of site-based management is to improve student performance by making those closest to the delivery of services—teachers and principals—more autonomous, resulting in their being more responsive to parents’ and students’ concerns.

Page 2:

While many schools in the United States claim to implement SBM, very little decision-making is truly decentralized. In most cases SBM is only a subset of the various types of decisions that are made at the district level. Thus, some districts may decentralize budget decisions but may maintain control of personnel and curriculum concerns. Other SBM plans give some autonomy about trivial issues like school safety, parent involvement, and career education. The illusion of autonomy based on SBM is often constrictive because the district office retains the final authority or limits the range of decision-making (Bimber, 1993).

[760] Book: Encyclopedia of Educational Leadership and Administration (Volume 1). Edited by Fenwick W. English. Sage Publications, 2006. Article: “Finance, of Public Schools.” By Scott R. Sweetland.

Page 394: “Hence, the structure maintained both decentralized and centralized characteristics of funding and organization. Each characteristic was dynamic, creating beneficial and detrimental implications for schooling.”

[761] Book: Educational Planning: The International Dimension. Edited by Jacques Hallak and Francoise Caillods. International Bureau of Education, International Institute for Educational Planning. Garland Publishing, 1995.

Chapter: “Managing Schools for Educational Quality and Equity: Finding the Proper Mix to Make it Work.” By Jacques Hallak. Pages 107–118.

Pages 108–109:

Centralized systems have proved most effective in countries characterized by a politically and economically stable environment, strong administrative systems, good infrastructure, comparatively well-educated and compensated teachers, and a relatively homogeneous context for schooling. For example, this has been the case in the Republic of Korea and Japan.

However, in countries with long distances between individual schools and the center, great ethnic and linguistic diversity, and relatively poorly developed transportation and communication systems, rigid centralization blocks resources and information flows and leads to the inefficient and ineffective operation of the system. In such circumstances, education systems are likely to be more efficient and effective if certain functions are devolved to the lower levels. Given the wide variety of administrative traditions in different societies, it is not possible to prescribe what these functions of responsibility should be. The challenge is to define criteria for the devolving functions and responsibilities while maintaining an overall effective organizational structure. …

Administrative weaknesses generally arise when managers do not have sufficient authority and/or resources to do their job effectively, when communication channels are blocked, when roles and responsibilities are unclear, and when managers’ time is either consumed with routine tasks or with addressing, on an ad hoc basis, requests by the political structure. …

Principals are largely excluded from decisions that affect their ability to improve student achievement. Curricula are designed centrally, often with little attention to the diversity of schools and students’ interests. Teachers are often appointed, assigned and evaluated centrally, leaving principals with little control over the choice or discipline of teachers.

The failure of teacher employment policies to take into account regional and local needs, subject-matter and grade-level needs undermines significantly the ability of principals to build and maintain an effective school environment. … The lack of full authority at the school level is most prevalent in highly centralized systems, but even in decentralized ones, authority is not always delegated below the intermediate levels.

[762] Book: Mapping the Terrain of Education Reform: Global Trends and Local Responses in the Philippines. By Vicente Chua Reyes, Jr. Routledge, 2016.

Pages 11–12: “Alongside the need to decentralize would be the equally important question of whether or not other stakeholders in education who have received the devolved and decentralized powers are ready to take on their new responsibilities.”

[763] Book: Educational Decentralization: Asian Experiences and Conceptual Contributions. Edited by Christopher Bjork. Springer, 2006.

Chapter 1: “Strategies of Educational Decentralization: Key Questions and Core Issues.” By E. Mark Hanson. Pages 9–26.

Page 21:

However, even though various studies have concluded that parents and educators seem to be more satisfied in a decentralized system, the research literature has not demonstrated a direct relationship between decentralization and increased student achievement….

Sharpe … argues that producing conclusive evidence is hardly possible because there are simply too many intervening variables between the management device of decentralization and improved student outcomes, such as parental attitudes, peer group support, school culture of learning, different teaching and learning styles, time-on-task, teacher motivation, and so forth.

[764] Book: Balancing Change and Tradition in Global Education Reform (2nd edition). Edited by Iris C. Rotberg. Rowman & Littlefield Education, 2010.

Chapter: “Concluding Thoughts: On Change, Tradition, and Choices.” By Iris C. Rotberg. Pages 381–403.

Page 385:

In many countries, immigration has led to increasingly diverse student populations, who are often concentrated in city centers or in the suburbs immediately surrounding. The dramatic increase in immigration (and, therefore, in the mix of racial/ethnic groups, cultures, and languages) has occurred in only a few decades. Even a country like the United States, with its long tradition of immigration and diversity, continues to have a significant increase in the proportion of students from minority populations. Indeed, in many parts of the country, the term minority is a misnomer.

The increasing diversity places new demands on school systems, which are often blamed for educational problems. Many school systems respond by attempting to reverse traditional practices, which are perceived as ineffective in serving new student populations. Thus, countries with highly centralized education systems have sought to relax central control in order to make schools more responsive to the diverse student population. Countries with the tradition of local control, on the other hand, have moved in the opposite direction and have increased central oversight—all in response to the perception that changing demographics require a shift from the status quo, whatever it was.

[765] Paper: “Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide.” By Martin Schlotter, Guido Schwerdt, and Ludger Woessmann. Education Economics, January 2011. <www.tandfonline.com>

Page 132:

In medical research, experimental evaluation techniques are a well-accepted standard device to learn what works and what does not. No one would treat large numbers of people with a certain medication unless it has been shown to work. Experimental and quasi-experimental studies are the best way to reach such an assessment. It is hoped that a similar comprehension is reached in education so that future education policies and practices will be able to better serve the students.

NOTE: See Just Facts’ detailed documentation about the importance of experimental studies and the pitfalls of observational studies.

[766] Book: Decentralization of Education: Politics and Consensus. By Edward B. Fiske. World Bank, 1996. <books.google.com>

Page 24: “In general, researchers have developed very little data showing a direct connection—one way or the other—between decentralization schemes and the performance of students on standardized tests.”

[767] In October 2015, Just Facts wrote to five different scholars who have authored research on educational centralization or decentralization. Just Facts asked if they were aware of “any experimental studies on the effects of educational centralization or decentralization.” One of them replied, and he was unaware of any experimental or quasi-experimental studies on the subject. The scholars include:

[768] Webpage: “About the Standards.” Common Core State Standards Initiative. Accessed October 10, 2015 at <www.thecorestandards.org>

For years, the academic progress of our nation’s students has been stagnant, and we have lost ground to our international peers. Particularly in subjects such as math, college remediation rates have been high. One root cause has been an uneven patchwork of academic standards that vary from state to state and do not agree on what students should know and be able to do at each grade level.

[769] Email from Just Facts to the Common Core State Standards Initiative, October 14, 2015.

Subject: Research on Common Core Standards [CCS]

I am researching the CCS and kindly request that you point me to the specific studies that support the following four CC principles and standards. I am familiar with the “Compendium of Research on the Common Core State Standards,” but it is not clear to me what research specifically supports each standard. Hence, I am looking to connect the research to the standards as if the standards were methodically footnoted.

1) “One root cause [for our nation’s stagnant academic progress] has been an uneven patchwork of academic standards that vary from state to state and do not agree on what students should know and be able to do at each grade level.” [“About the Standards,” <www.thecorestandards.org>] …

I am particularly interested in experimental studies (as opposed to observational studies). Thus, would you also please identify studies pertaining to the standards above that are experimental or quasi-experimental?

[770] On 10/16/15, Just Facts followed up with a phone call to CCSSI’s [Common Core State Standards Initiative] press office, received a call in return, and resent the email at CCSSI’s request. CCSSI replied with a generic email that did not answer the questions posed by Just Facts. Just Facts responded, pointed out that the reply did not answer the questions, and requested the “specific studies that support” CCSSI’s assertions. The press office replied, “Let me pass your question along to my colleague.”

On 10/21 Just Facts followed up with an email to CCSSI’s press office. The press office replied that it will “follow up.”

On 10/22, CCSSI’s press office wrote to Just Facts: “I reached out to my colleague, and she will be connecting with you soon.”

On 10/23, Just Facts followed up with an email to CCSSI’s press office.

[771] Dataset: “Table 235.10. Revenues for Public Elementary and Secondary Schools, by Source of Funds: Selected Years, 1919–20 Through 2019–20.” U.S. Department Of Education, National Center for Education Statistics, September 2022. <nces.ed.gov>

“Percentage distribution … 1919–20 … Federal [=] 0.3 … State [=] 16.5 … Local [=] 83.2 … 2019–20 … Federal [=] 7.6% … State [=] 47.5% … Local [=] 45.1%”

[772] Book: Encyclopedia of Educational Leadership and Administration (Volume 1). Edited by Fenwick W. English. Sage Publications, 2006. Article: “Finance, of Public Schools.” By Scott R. Sweetland.

Pages 393–394:

The funding structure for public education in the United States changed dramatically from 1920 to 2000. For example, the proportion of local government funding for schools decreased from 83.2% of total funding to 43.2%. This drop of nearly one half was accompanied by a tripling of the commitment in state funding. The proportion of state government funding for schools increased from 16.5% of total funding to 49.5% during the same period. These figures also indicate that increased state involvement in local schooling occurred. Whereas state government was always responsible for public schooling, early schools were primarily funded, controlled, and operated by localities. A progressive increase in state funding for schools naturally lent itself to greater assertion of state control over schools. Federal funding also increased from 1920 to 2000. The federal proportion of total funding for elementary and secondary schools grew from 0.3% to 7.3%. Localities accepted state and federal funding increases, but the shifting structure of funding among levels of government threatened to diminish local control over schools.

The organization structure of public education in the United States also changed dramatically. For example, the number of public school districts in 1940 was 117,108 and then dropped to 14,928 in 2000. From 1950 to 2000, the number of public elementary and secondary schools dropped from 152,767 to 94,580. These decreases resulted from the consolidation of public school districts and schools. Although, on the surface, this consolidation was not a direct indicator of increased state control over spending, the shifting structure of organization portrayed diminished local control over schools. There were fewer schools, meaning that many communities lost direct control over schooling to more broadly defined, centralized school districts. Had demand for education declined during this period, observed decreases in districts and schools might have reflected diminishing demand for schooling. To the contrary, public school enrollment increased from about 25 million to about 47 million during the period.

[773] Dataset: “Table 214.10. Number of Public School Districts and Public and Private Elementary and Secondary Schools: Selected Years, 1869–70 Through 2021–22.” U.S. Department of Education, National Center for Education Statistics, October 2022. <nces.ed.gov>

“School year [=] 2021–22 … Regular public school districts [=] 13,318”

[774] Book: Private and Public School Partnerships: Sharing Lessons About Decentralization. By Jean Madsen. Falmer Press (imprint of Taylor & Francis), 1997.

Page 2: “While many schools in the United States claim to implement SBM [site-based management], very little decision-making is truly decentralized. In most cases SBM is only a subset of the various types of decisions that are made at the district level. Thus, some districts may decentralize budget decisions but may maintain control of personnel and curriculum concerns.”

[775] Book: Encyclopedia of Educational Leadership and Administration (Volume 1). Edited by Fenwick W. English. Sage Publications, 2006. Article: “Finance, of Public Schools.” By Scott R. Sweetland.

Page 394: “Greater concentration of power at the state level made it possible to require and enforce uniform standards across grade levels, schools, and districts.”

[776] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A Garnier. Pages 87–98.

Page 93:

The concept of local control of schools is an old one in the United States and in England. While, in the United States, the legal authority for education is vested in the state, most states (with the exception of Hawaii) have delegated that responsibility to local authorities. Over the years, states have increased their role, particularly in matters of finance and teacher certification.

[777] Book: Human Resource Management in Public Service: Paradoxes, Processes, and Problems (4th edition). By Evan M. Berman, James S. Bowman, Jonathan P. West, and Montgomery R. Van Wart. SAGE Publications, 2013.

Page 444:

The institutional structure and legal rights related to collective bargaining vary by level of government, jurisdiction, and occupational groups. … Currently, 31 states and the District of Columbia authorize collective bargaining for public employees. Ten other states allow bargaining for some state and/or local employees (for example, public safety, teachers). The remaining nine states lack collective bargaining statutes for their state and local government employees (American Federation of State County & Municipal Employees, 2010). In some instances, however, executive orders or local ordinances confer rights to bargain or have representation.

[778] Article: “Walker Brings Unwanted Attention to Local Teacher.” By Erin Richards. Milwaukee Journal Sentinel, March 10, 2011. <archive.jsonline.com>

“In 2010, Megan Sampson was named an Outstanding First Year Teacher in Wisconsin,” Walker writes. “A week later, she got a layoff notice from Milwaukee Public Schools. Why would one of the best new teachers in the state be one of the first let go? Because her collective-bargaining contract requires staffing decisions to be made based on seniority.”

Sampson’s experience in Milwaukee Public Schools was the result of the “last hired, first fired” policy of the teachers union, in which seniority decides who gets cut in times of layoffs, no matter how great the skill of younger members.

[779] Paper: “Compulsory Arbitration: The Scope of Judicial Review.” By Victor Cohen. St. John’s Law Review, Spring 1977. Pages 604–631. <scholarship.law.stjohns.edu>

Page 606: “[A]s a means of settling contract disputes between governmental bodies and key public employees, a number of states have enacted statutes providing for compulsory interest arbitration.”

[780] Article: “Teachers Push for Binding Arbitration.” By Jennifer D. Jordan. Providence Journal, May 9, 2012. <www.providencejournal.com>

The fight to secure binding arbitration for Rhode Island teachers is being revived at the State House after last year’s failed attempt by labor leaders in the last days of the legislative session. …

House Bill 7617 would expand the scope of binding arbitration for teachers to include wages and other financial matters and would include non-teacher educational employees such as janitors and support staff. It would also permit either side—labor or management—to declare the move to binding arbitration.

[781] Article: “Found to Have Misbehaved with Pupils, but Still Teaching.” By David W. Chen and Patrick McGeehan. New York Times, April 5, 2012. <www.nytimes.com>

The New York City Education Department wanted to fire these teachers. But in these and 13 other cases in recent years in which teachers were accused of inappropriate behavior with students, the city was overruled by an arbitrator who, despite finding wrongdoing, opted for a milder penalty like a fine, a suspension or a formal reprimand.

As a result, 14 of those 16 teachers are still teaching and in contact with students, on either a daily or occasional basis. The other two were removed from their positions within the last month when new allegations of misbehavior surfaced against them, according to the Education Department. …

But to union officials, the right to an impartial hearing is sacrosanct, to protect teachers from losing their livelihoods because a principal or a student might have an ax to grind.

[782] Handbook on Human Service Administration. Edited by Jack Rabin and Marcia B. Steinhauer. CRC Press, 1988.

Chapter 9: “Employee Organizations in Human Services Administration.” By Paul E. Fitzgerald, Jr. Pages 309–340.

Pages 318–319:

Strikes are almost always illegal for public employees, although they have increased in number significantly over the past two decades. Most public employee strikes are by local government employees, and the one professional group of employees most likely to strike is teachers. …

As in the laws governing federal employees, often there are very severe penalties in state laws for striking workers and labor organizations, but these penalties often are not imposed.

[783] Report: “Digest of Education Statistics 2012.” By Thomas D. Snyder and Sally A. Dillow. U.S. Department of Education, National Center for Education Statistics, December 2013. <nces.ed.gov>

Page 63: “The Individuals with Disabilities Education Act (IDEA), enacted in 1975, mandates that children and youth ages 3–21 with disabilities be provided a free and appropriate public school education.”

[784] Webpage: “The No Child Left Behind Act of 2001.” U.S. Department of Education, January 7, 2002. <www2.ed.gov>

The NCLB [No Child Left Behind] Act will strengthen Title I accountability by requiring States to implement statewide accountability systems covering all public schools and students. These systems must be based on challenging State standards in reading and mathematics, annual testing for all students in grades 3–8, and annual statewide progress objectives ensuring that all groups of students reach proficiency within 12 years. Assessment results and State progress objectives must be broken out by poverty, race, ethnicity, disability, and limited English proficiency to ensure that no group is left behind. School districts and schools that fail to make adequate yearly progress (AYP) toward statewide proficiency goals will, over time, be subject to improvement, corrective action, and restructuring measures aimed at getting them back on course to meet State standards. Schools that meet or exceed AYP objectives or close achievement gaps will be eligible for State Academic Achievement Awards.

[785] Code of Federal Regulations Title 34, Subtitle B, Chapter II: “Race to the Top Fund.” Federal Register, November 18, 2009. <www.gpo.gov>

Page 59688:

Summary: The Secretary of Education (Secretary) announces priorities, requirements, definitions, and selection criteria for the Race to the Top Fund. The Secretary may use these priorities, requirements, definitions, and selection criteria in any year in which this program is in effect.

Dates: Effective Date: These priorities, requirements, definitions, and selection criteria are effective January 19, 2010.

Page 59802:

B. Standards and Assessments

State Reform Conditions Criteria

(B)(1) Developing and adopting common standards: The extent to which the State has demonstrated its commitment to adopting a common set of high-quality standards, evidenced by (as set forth in Appendix B)—

(i) The State’s participation in a consortium of States that—

(a) Is working toward jointly developing and adopting a common set of K–12 standards (as defined in this notice) that are supported by evidence that they are internationally benchmarked and build toward college and career readiness by the time of high school graduation; and

(b) Includes a significant number of States; and

(ii)(a) For Phase 1 applications, the State’s high-quality plan demonstrating its commitment to and progress toward adopting a common set of K–12 standards (as defined in this notice) by August 2, 2010, or, at a minimum, by a later date in 2010 specified by the State, and to implementing the standards thereafter in a well-planned way; or

(b) For Phase 2 applications, the State’s adoption of a common set of K–12 standards (as defined in this notice) by August 2, 2010, or, at a minimum, by a later date in 2010 specified by the State in a high-quality plan toward which the State has made significant progress, and its commitment to implementing the standards thereafter in a well-planned way.

[786] Report: “ESEA Flexibility.” U.S. Department of Education. Updated June 7, 2012. <www2.ed.gov>

In order to move forward with State and local reforms designed to improve academic achievement and increase the quality of instruction for all students in a manner that was not originally contemplated by the No Child Left Behind Act of 2001 (NCLB), a State educational agency (SEA) may request flexibility, on its own behalf and on behalf of its local educational agencies (LEAs), through waivers of ten provisions of the Elementary and Secondary Education Act of 1965 (ESEA) and their associated regulatory, administrative, and reporting requirements. In order to receive this flexibility, an SEA must meet the principles described in the next section. …

To receive flexibility through the waivers outlined above, an SEA must submit a request that addresses each of the following four principles, consistent with the definitions and timelines described later in this document, to increase the quality of instruction for students and improve student academic achievement in the State and its LEAs. In the SEA’s request, the SEA must describe how it will ensure that LEAs will fully implement these principles, consistent with the SEA’s authority under State law and the SEA’s request.

1. College- and Career-Ready Expectations for All Students

2. State-Developed Differentiated Recognition, Accountability, and Support

3. Supporting Effective Instruction and Leadership

4. Reducing Duplication and Unnecessary Burden

[787] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A. Garnier. Pages 87–98.

Page 93:

The concept of local control of schools is an old one in the United States and in England. While, in the United States, the legal authority for education is vested in the state, most states (with the exception of Hawaii) have delegated that responsibility to local authorities. Over the years, states have increased their role, particularly in matters of finance and teacher certification. Nevertheless, the American assumption is that communities constitute the unit most capable of running the schools. While the state may mandate that districts’ boundaries be redrawn, the notion that a particular state might be capable of running all schools within its boundaries is unthinkable in the American context. We will later examine the shortcomings of such a system, but in theory as well as in practice, American schools are locally run and the formal connections between school districts within the same state are virtually nonexistent (Wayland, 1973).

[788] Book: Private and Public School Partnerships: Sharing Lessons About Decentralization. By Jean Madsen. Falmer Press (imprint of Taylor & Francis), 1997.

Page 1:

Site-based management [SBM] is a business derivative of decentralization and participatory decision-making. The intent of site-based management is to improve student performance by making those closest to the delivery of services—teachers and principals—more autonomous, resulting in their being more responsive to parents and students concerns.

Page 2:

While many schools in the United States claim to implement SBM, very little decision-making is truly decentralized. In most cases SBM is only a subset of the various types of decisions that are made at the district level. Thus, some districts may decentralize budget decisions but may maintain control of personnel and curriculum concerns. Other SBM plans give some autonomy about trivial issues like school safety, parent involvement, and career education. The illusion of autonomy based on SBM is often constrictive because the district office retains the final authority or limits the range of decision-making (Bimber, 1993).

[789] “Common Core State Standards for Mathematics.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

[790] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 6:

R. James Milgram

Emeritus Professor at Stanford University’s Department of Mathematics

Milgram, one of the authors of the California Mathematics Standards and the California Mathematics Framework, has worked with a number of states, and with the Achieve Mathematics Advisory Panel, on standards in education. As a member of the National Board for Education Sciences, he has worked with the U.S. Department of Education on the math that pre-service K–8 teachers need to know and understand.

[791] Commentary: “Can This Country Survive Common Core’s College Readiness Level?” By R. James Milgram and Sandra Stotsky, September 2013. <bit.ly>

Page 2:

Milgram was the only mathematician on the VC [validation committee] (there were several mathematics educators, people with a doctorate in mathematics education holding an academic appointment in an education school or engaged full-time in teacher training and professional development), and Stotsky was the only expert on K–12 English language arts standards by virtue of her work in the Massachusetts Department of Education from 1999–2003 and with Achieve, Inc. on its American Diploma Project high school exit standards for English language arts in 2004 and its subsequent backmapped standards for earlier grade levels.

[792] Commentary: “What I Learned About the Opposition to the Common Core State Standards When I Testified in Indiana.” By Jason Zimba. Thomas B. Fordham Institute, August 9, 2013. <fordhaminstitute.org>

I don’t make much of the fact, which critics like to repeat, that Professor Milgram was the only mathematician on the validation committee and that he didn’t sign off on the standards. The fact is true, but the impression it gives that the standards are mathematically unsound is false. Professor Emeritus of mathematics Hung-Hsi Wu has said that “the statements of the standards are mathematically correct and the progression from topic to topic is logical.” …

Jason Zimba was a lead author of the Common Core State Standards for Mathematics and is a founding partner of Student Achievement Partners, a nonprofit organization. He holds a BA from Williams College with a double major in mathematics and astrophysics; an MSc by research in mathematics from the University of Oxford; and a PhD in mathematical physics from the University of California at Berkeley.

[793] Testimony: “Issues With Core Math Standards.” By R. James Milgram (Professor of Mathematics, Stanford University). Arkansas State Legislature, July 23, 2013. <www.arkleg.state.ar.us>

Pages 1–2 (of PDF):

[T]he new Common Core national standards … are claimed to be research based, but the main reason I could not sign off on them was that there were too many areas where the writing team could not show me suitable research that justified their handling of key topics—particularly when they differed from standard approaches. …

The three most severe problem areas are:

1. the beginning handling of whole numbers in particular adding, subtracting, multiplying, and dividing;

2. the handling of geometry in middle school and high school;

3. the very low level expectations for high school graduation that barely prepare students for attending a community college, let alone a 4-year university.

Unfortunately, these are the three most crucial areas where our math outcomes have to improve. Core Standard’s approach to whole numbers is just the continuation of the approach pioneered in California in the early 1990’s that had such bad outcomes that it spawned the Math Wars. Moreover, the use of student-constructed algorithms is at odds with the practices of high-achieving countries and the research that supports student constructed algorithms appears highly suspect.

Additionally, the way Common Core presents geometry is not research-based—and the only country that tried this approach on a large scale, the old USSR [Union of Soviet Socialist Republics], rapidly abandoned it. The problem is that—though the outlined approach to geometry is rigorous—it depends on too many highly specialized topics, that even math majors at a four year university would not see until their second or more likely their third years. Again, there is no research with actual students that supports the Core Standards approach.

Tied in with the problems in geometry, there are also severe problems with the way Common Core handles percents, ratios, rates, and proportions—the critical topics that are essential if students are to learn more advanced topics such as trigonometry, statistics, and even calculus. …

The classic method of, for example, adding two-digit numbers is to add the digits in the “ones” column, carry the tens in the sum to the “tens” column, then add the “tens” digits, and so on. This “standard algorithm” works first time, every time. But instead of preparing for and teaching this method, by first carefully studying and understanding the meaning of our place value notation, as they do in the high achieving countries, Common Core creates a three-step process starting with student constructed algorithms guided by the absurd belief that all this content is somehow innate. …

I cannot emphasize enough that Common Core is using our children for a huge and risky experiment, one that consistently failed when tried by individual states such as California in the early 1990’s and even countries such as the old USSR in the 1970’s.
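NOTE: To make the “standard algorithm” described in the testimony above concrete, the following is a minimal Python sketch (an illustration by the editor, not drawn from any cited source) of column-wise addition with carrying—add the ones column, carry the tens to the next column, and repeat:

```python
def add_with_carry(a: str, b: str) -> str:
    """Classic 'standard algorithm': add column by column, right to left, carrying tens."""
    # Pad both numbers to the same width so the place-value columns line up.
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    digits = []
    carry = 0
    # Ones column first; any ten produced is carried into the next column.
    for da, db in zip(reversed(a), reversed(b)):
        total = int(da) + int(db) + carry
        digits.append(str(total % 10))
        carry = total // 10
    if carry:
        digits.append(str(carry))
    return "".join(reversed(digits))

# For example, 47 + 38: the ones column gives 15, so write 5 and carry 1;
# the tens column gives 4 + 3 + 1 = 8, yielding 85.
```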

[794] Commentary: “Is Anybody Up for Defending the Common Core Math Standards?” By Rick Hess. Education Next, September 6, 2011. <www.edweek.org>

I’ve been executive editor of Education Next for more than a decade. In that role, one of the things I’ve done is coordinate our “forums” on various topics. Over the years, we’ve done 40-odd forums, and have usually gotten our first-choice authors. When we haven’t gotten them, we’ve almost invariably gotten our second choice. All of which makes it astonishing that, over the past three months, we’ve now asked six individuals involved in the Common Core math standards to pen a piece making the case for their rigor and quality, and each has declined in turn.

[795] Article: “Straight Up Conversation: Math Scholar Hung-Hsi Wu on the Common Core.” By Rick Hess. Education Next, October 5, 2011. <www.edweek.org>

A few weeks back, I penned a post about the lack of response we’d received regarding our in-the-works Education Next forum on the Common Core math standards. I heard from a number of individuals who offered to defend the standards. One was Hung-Hsi Wu, professor emeritus in mathematics from UC-Berkeley, who has just penned the cover story on this topic for AFT’s [American Federation of Teachers] magazine American Educator. Dr. Wu, who started teaching at Berkeley in 1973, has been actively involved in math education for the past two decades, helping write California’s 1999 Mathematics Framework and California’s Standards Tests. He was also a member of NAEP’s [National Assessment of Educational Progress] Mathematics Steering Committee, 2000–2001, that contributed to the revision of the NAEP Framework. …

Wu: …

The Common Core math standards place great emphasis on mathematical integrity, [in other words] the statements of the standards are mathematically correct and the progression from topic to topic is logical. In this regard, it is at least comparable to the best state standards, such as those of California and Massachusetts. …

… [P]art of Common Core math standards’ design to optimize mathematics learning by giving students enough time, whenever feasible, to absorb the material as well as time for teachers to teach the material. For children, the addition of fractions is so conceptually complicated that they need the time to internalize the whole process. This particular treatment of fraction addition is one of the outstanding features of the Common Core standards.

[796] Commentary: “The ‘New’ New Jersey Mathematics Standards—Circa 2009.” By Joseph G. Rosenstein. NJ Spotlight News, July 13, 2015. <www.njspotlightnews.org>

Unfortunately, the Common Core mathematics standards is based on the false assumption that all students should learn much of what is found in an Algebra II course. And that assumption has implications all the way down to the early grades, where it is manifested in what one educator called “a fanatical focus on fractions” in the Common Core mathematics standards.

A second inadequacy of the Common Core mathematics standards is that they essentially banish statistics, probability, and discrete mathematics to the later grades; these are topics that should be woven throughout the curriculum and all grade levels. …

Joseph G. Rosenstein is a Distinguished Professor of Mathematics at Rutgers University. His focus during the last 30 years has been on K–12 mathematics education. In addition to co-chairing the teams that produced the 1996 and 2002 New Jersey Mathematics Standards, he was the editor and a principal author of the New Jersey Mathematics Curriculum Framework. He was Director of the New Jersey Mathematics Coalition for over 15 years.

[797] Webpage: “Key Shifts in Mathematics.” Common Core State Standards Initiative. Accessed October 10, 2015 at <bit.ly>

Procedural skills and fluency: The standards call for speed and accuracy in calculation. Students must practice core functions, such as single-digit multiplication, in order to have access to more complex concepts and procedures. Fluency must be addressed in the classroom or through supporting materials, as some students might require more practice than others.

[798] “Common Core State Standards for Mathematics.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 13:

In Grade 1, instructional time should focus on four critical areas: (1) developing understanding of addition, subtraction, and strategies for addition and subtraction within 20; …

(1) Students develop strategies for adding and subtracting whole numbers based on their prior work with small numbers. … They use properties of addition to add whole numbers and to create and use increasingly sophisticated strategies based on these properties (for example, “making tens”) to solve addition and subtraction problems within 20. …

(2) Students develop, discuss, and use efficient, accurate, and generalizable methods to add within 100 and subtract multiples of 10. They compare whole numbers (at least to 100) to develop understanding of and solve problems involving their relative sizes. They think of whole numbers between 10 and 100 in terms of tens and ones (especially recognizing the numbers 11 to 19 as composed of a ten and some ones).

Page 15:

Add and subtract within 20, demonstrating fluency for addition and subtraction within 10. Use strategies such as counting on; making ten (for example, 8 + 6 = 8 + 2 + 4 = 10 + 4 = 14); decomposing a number leading to a ten (for example, 13 – 4 = 13 – 3 – 1 = 10 – 1 = 9); using the relationship between addition and subtraction (for example, knowing that 8 + 4 = 12, one knows 12 – 8 = 4); and creating equivalent but easier or known sums (for example, adding 6 + 7 by creating the known equivalent 6 + 6 + 1 = 12 + 1 = 13).
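NOTE: The “making ten” strategy quoted above can be written out mechanically. The sketch below (an editorial illustration, not part of the standards) reproduces the standards’ own worked example, 8 + 6 = 8 + 2 + 4 = 10 + 4 = 14, for any two single-digit addends whose sum exceeds ten:

```python
def make_ten_steps(a: int, b: int) -> str:
    """Show the 'making ten' strategy for a + b (single digits, sum between 10 and 20)."""
    to_ten = 10 - a        # the part of b needed to complete a ten
    leftover = b - to_ten  # what remains of b after the ten is made
    return f"{a} + {b} = {a} + {to_ten} + {leftover} = 10 + {leftover} = {10 + leftover}"
```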

[799] Article: “Homework Helper: Math Tips for the Common Core.” By Melissa Holmes, WGRZ (NBC – Buffalo, NY), September 3, 2014. <www.wgrz.com>

The Common Core aims to teach strategies beyond memorization, focusing on pictures, numbers, and words. Some students find the math lessons confusing, and sadly, many parents can’t help them because the lessons are foreign to them.

Fourth grade math teacher, Eileen Klag Ryan, from Maple West Elementary in the Williamsville School District explains different principles in 2 On Your Side’s Homework Helper series.

NOTE: Credit for bringing this to the attention of Just Facts belongs to Kelsey Lucas of the Heritage Foundation [<www.dailysignal.com>].

[800] Webpage: “About the Standards.” Common Core State Standards Initiative. Accessed October 10, 2015 at <www.thecorestandards.org>

The Common Core is informed by the highest, most effective standards from states across the United States and countries around the world. The standards define the knowledge and skills students should gain throughout their K–12 education in order to graduate high school prepared to succeed in entry-level careers, introductory academic college courses, and workforce training programs.

The standards are:

1. Research- and evidence-based

2. Clear, understandable, and consistent

3. Aligned with college and career expectations

4. Based on rigorous content and application of knowledge through higher-order thinking skills

5. Built upon the strengths and lessons of current state standards

6. Informed by other top performing countries in order to prepare all students for success in our global economy and society

[801] Email from Just Facts to the Common Core State Standards Initiative, October 14, 2015.

Subject: Research on Common Core Standards

Dear Sir or Madame:

I am researching the CCS and kindly request that you point me to the specific studies that support the following four CC principles and standards. I am familiar with the “Compendium of Research on the Common Core State Standards,” but it is not clear to me what research specifically supports each standard. Hence, I am looking to connect the research to the standards as if the standards were methodically footnoted. …

4) Students “use properties of addition to add whole numbers and to create and use increasingly sophisticated strategies based on these properties (for example, ‘making tens’) to solve addition and subtraction problems within 20. … They think of whole numbers between 10 and 100 in terms of tens and ones (especially recognizing the numbers 11 to 19 as composed of a ten and some ones). … Use strategies such as counting on; making ten (for example, 8 + 6 = 8 + 2 + 4 = 10 + 4 = 14); decomposing a number leading to a ten (for example, 13 – 4 = 13 – 3 – 1 = 10 – 1 = 9)” [Emphases added, “Common Core Standards for Mathematics.” <bit.ly>]

I am particularly interested in experimental studies (as opposed to observational studies). Thus, would you also please identify studies pertaining to the standards above that are experimental or quasi-experimental?

[802] On 10/16/15, Just Facts followed up with a phone call to CCSSI’s [Common Core State Standards Initiative] press office, received a call in return, and resent the email at CCSSI’s request. CCSSI replied with a generic email that did not answer the questions posed by Just Facts. Just Facts responded, pointed out that the reply did not answer the questions, and requested the “specific studies that support” CCSSI’s assertions. The press office replied, “Let me pass your question along to my colleague.”

On 10/21, Just Facts followed up with an email to CCSSI’s press office. The press office replied that it would “follow up.”

On 10/22, CCSSI’s press office wrote to Just Facts: “I reached out to my colleague, and she will be connecting with you soon.”

On 10/23, Just Facts followed up with an email to CCSSI’s press office.

[803] Footnote reserved if CCSSI should respond in the future.

[804] Footnote reserved if CCSSI should respond in the future.

[805] “Common Core State Standards for Mathematics.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 4:

These Standards define what students should understand and be able to do in their study of mathematics. Asking a student to understand something means asking a teacher to assess whether the student has understood it. But what does mathematical understanding look like? One hallmark of mathematical understanding is the ability to justify, in a way appropriate to the student’s mathematical maturity, why a particular mathematical statement is true or where a mathematical rule comes from. There is a world of difference between a student who can summon a mnemonic device to expand a product such as (a + b)(x + y) and a student who can explain where the mnemonic comes from. The student who can explain the rule understands the mathematics, and may have a better chance to succeed at a less familiar task such as expanding (a + b + c)(x + y). Mathematical understanding and procedural skill are equally important, and both are assessable using mathematical tasks of sufficient richness.

Page 23 [Grade 3]: “Identify arithmetic patterns (including patterns in the addition table or multiplication table), and explain them using properties of operations. For example, observe that 4 times a number is always even, and explain why 4 times a number can be decomposed into two equal addends.”

Page 27 [Grade 4]: “They develop fluency with efficient procedures for multiplying whole numbers; understand and explain why the procedures work based on place value and properties of operations; and use them to solve problems.”

Page 33 [Grade 5]:

Students also use the meaning of fractions, of multiplication and division, and the relationship between multiplication and division to understand and explain why the procedures for multiplying and dividing fractions make sense. (Note: this is limited to the case of dividing unit fractions by whole numbers and whole numbers by unit fractions.)

Page 39 [Grade 5]: “Students use the meaning of fractions, the meanings of multiplication and division, and the relationship between multiplication and division to understand and explain why the procedures for dividing fractions make sense.”

Page 66 [High School Algebra]:

Explain why the x-coordinates of the points where the graphs of the equations y = f(x) and y = g(x) intersect are the solutions of the equation f(x) = g(x); find the solutions approximately, for example, using technology to graph the functions, make tables of values, or find successive approximations. Include cases where f(x) and/or g(x) are linear, polynomial, rational, absolute value, exponential, and logarithmic functions.

NOTE: The examples above represent several of many in the Common Core math standards.
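NOTE: The high school algebra standard quoted above asks students to find solutions of f(x) = g(x) “approximately … using technology” and “successive approximations.” One minimal sketch of such a successive-approximation method (an editorial illustration; the standard does not prescribe a particular algorithm) is bisection on the difference h(x) = f(x) − g(x):

```python
def intersect_x(f, g, lo, hi, tol=1e-9):
    """Approximate an x where f(x) = g(x) by bisecting on h(x) = f(x) - g(x).

    Assumes h changes sign exactly once on [lo, hi]."""
    h = lambda x: f(x) - g(x)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # Keep the half-interval that still contains the sign change (the crossing).
        if h(lo) * h(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# For example, the graphs of y = x and y = 2 - x intersect where x = 2 - x,
# i.e., at x = 1; the bisection converges to that x-coordinate.
```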

[806] Teaching guide: “Instructional Support Tools for Achieving New Standards: 3rd Grade Mathematics, Unpacked Content.” North Carolina Department of Public Instruction. Updated September 2015. <bit.ly>

Page 1: “This document is designed to help North Carolina educators teach the Common Core (Standard Course of Study). NCDPI [North Carolina Department of Public Instruction] staff are continually updating and improving these tools to better serve teachers. This document was written by the NCDPI Mathematics Consultants with the collaboration of many educators from across the state.”

Page 15:

What do you notice about the numbers highlighted in pink in the multiplication table? Explain a pattern using properties of operations.

When (commutative property) one changes the order of the factors they will still gets the same product, example 6 x 5 = 30 and 5 x 6 = 30.

Common Core Math Verbalization Problem

NOTE: The grammar and punctuation above are exactly as they appeared in the teaching guide.

[807] Webpage: “W. Stephen Wilson.” Education Next. Accessed October 24, 2015 at <www.educationnext.org>

W Stephen Wilson is a Professor of Mathematics at the Johns Hopkins University and a Professor in the School of Education. He has been Chair of the Department of Mathematics. His 1972 Ph.D. in mathematics is from M.I.T. He spent 8 months of 2006 as the Senior Advisor for Mathematics, Office of Elementary and Secondary Education, United States Department of Education and was one of the coauthors of the Fordham Foundation Report: The State of State MATH Standards, 2005. He helped revise the Washington State K–12 mathematics standards and evaluate textbooks for the state. He has helped out with numerous smaller projects on standards, curricula, and textbooks. More recently he reviewed drafts of the new Common Core Mathematics Standards for the National Governors Association and the Council of Chief State School Officers and drafts of PARCC’s [Partnership for Assessment of Readiness for College and Careers] mathematics Framework. …

[808] Commentary: “The Common Core Math Standards: Are They a Step Forward or Backward?” By Ze`ev Wurman and W. Stephen Wilson. Education Next, Summer 2012. <educationnext.org>

Wilson: …

There is much to criticize about them [the Common Core standards], and there are several sets of standards, including those in California, the District of Columbia, Florida, Indiana, and Washington, that are clearly better. Yet Common Core is vastly superior—not just a little bit better, but vastly superior—to the standards in more than 30 states. …

Wurman: …

Steve [Wilson] sees the benefit of having Common Core standards that are better than those of “more than 30 states,” while I see the disadvantage of confining the whole nation to mediocre standards that are worse than those of highly rated states and high-achieving countries.

[809] Commentary: “The Common Core Math Standards: Are They a Step Forward or Backward?” By Ze`ev Wurman and W. Stephen Wilson. Education Next, Summer 2012. <educationnext.org>

Wilson: …

There will always be people who believe that you do not understand mathematics if you cannot write a coherent essay about how you solved a problem, thus driving future STEM [science, technology, engineering and math] students away from mathematics at an early age. A fairness doctrine would require English language arts (ELA) students to write essays about the standard [math] algorithms, thus also driving students away from ELA at an early age. The ability to communicate is NOT essential to understanding mathematics.

[810] “Common Core State Standards for Mathematics.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 4:

These Standards define what students should understand and be able to do in their study of mathematics. Asking a student to understand something means asking a teacher to assess whether the student has understood it. But what does mathematical understanding look like? One hallmark of mathematical understanding is the ability to justify, in a way appropriate to the student’s mathematical maturity, why a particular mathematical statement is true or where a mathematical rule comes from. There is a world of difference between a student who can summon a mnemonic device to expand a product such as (a + b)(x + y) and a student who can explain where the mnemonic comes from. The student who can explain the rule understands the mathematics, and may have a better chance to succeed at a less familiar task such as expanding (a + b + c)(x + y). Mathematical understanding and procedural skill are equally important, and both are assessable using mathematical tasks of sufficient richness.

[811] Email from Just Facts to the Common Core State Standards Initiative, October 14, 2015.

Subject: Research on Common Core Standards

Dear Sir or Madame:

I am researching the CCS and kindly request that you point me to the specific studies that support the following four CC principles and standards. I am familiar with the “Compendium of Research on the Common Core State Standards,” but it is not clear to me what research specifically supports each standard. Hence, I am looking to connect the research to the standards as if the standards were methodically footnoted. …

2) “One hallmark of mathematical understanding is the ability to justify, in a way appropriate to the student’s mathematical maturity, why a particular mathematical statement is true or where a mathematical rule comes from.” [“Common Core Standards for Mathematics.” <bit.ly>] …

I am particularly interested in experimental studies (as opposed to observational studies). Thus, would you also please identify studies pertaining to the standards above that are experimental or quasi-experimental?

[812] On 10/16/15, Just Facts followed up with a phone call to CCSSI’s [Common Core State Standards Initiative] press office, received a call in return, and resent the email at CCSSI’s request. CCSSI replied with a generic email that did not answer the questions posed by Just Facts. Just Facts responded, pointed out that the reply did not answer the questions, and requested the “specific studies that support” CCSSI’s assertions. The press office replied, “Let me pass your question along to my colleague.”

On 10/21, Just Facts followed up with an email to CCSSI’s press office. The press office replied that it would “follow up.”

On 10/22, CCSSI’s press office wrote to Just Facts: “I reached out to my colleague, and she will be connecting with you soon.”

On 10/23, Just Facts followed up with an email to CCSSI’s press office.

[813] Footnote reserved if CCSSI should respond in the future.

[814] Footnote reserved if CCSSI should respond in the future.

[815] “Common Core State Standards for Mathematics.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 15 [Grade 1]: “Use addition and subtraction within 20 to solve word problems involving situations of adding to, taking from, putting together, taking apart, and comparing, with unknowns in all positions, for example, by using objects, drawings, and equations with a symbol for the unknown number to represent the problem.”

Page 19 [Grade 2]: “Use addition and subtraction within 100 to solve one- and two-step word problems involving situations of adding to, taking from, putting together, taking apart, and comparing, with unknowns in all positions, for example, by using drawings and equations with a symbol for the unknown number to represent the problem.”

Page 25 [Grade 3]: “Add, subtract, multiply, or divide to solve one-step word problems involving masses or volumes that are given in the same units, for example, by using drawings (such as a beaker with a measurement scale) to represent the problem.”

Page 35 [Grade 5]: “Add, subtract, multiply, and divide decimals to hundredths, using concrete models or drawings and strategies based on place value, properties of operations, and/or the relationship between addition and subtraction; relate the strategy to a written method and explain the reasoning used.”

NOTE: The examples above represent several of many in the Common Core math standards.

[816] Paper: “Survey of Research on Learning Styles.” By Rita Dunn, Jeffrey S. Beaudry, and Angela Klavas. California Journal of Science Education, Spring 2002. Pages 75–98. <www.marric.us>

Page 75: “Learning style is a biologically and developmentally imposed set of personal characteristics that make the same teaching method effective for some and ineffective for others.”

Pages 77–78:

As new findings about left/right brain functions appeared, researchers investigated the connections between learning style and hemisphericity. The terms left/right, analytic/global, and inductive/deductive have been used interchangeably in the literature; descriptions of these pairs of variables parallel each other. Lefts/analytics/inductives appear to learn successively, in small steps leading to understanding; rights/globals/deductives more easily learn by obtaining meaning from a broad concept and then focusing on details.

Studies that examined the similarities and differences between hemispheric style and other elements of learning style revealed that, when concentrating on difficult academic material:

1) High school students who were less motivated than their classmates and who preferred working with distracters (music, low illumination, informal or casual seating, peers rather than alone or with the teacher, tactile rather than auditory or visual instructional resources) scored right-hemisphere significantly more often than left-hemisphere. Also, students who scored high on persistence invariably scored high as left processors (Dunn and others 1982). (The latter data may have implications for time-on-task research.)

2) Left-hemisphere youngsters in grades 5–12 preferred a conventional formal classroom seating design, more structure, less intake, and visual rather than tactile or kinesthetic resources during learning significantly more often than their right-preferenced classmates (Cody 1983).

3) Right-hemisphere 5th through 12th graders disliked structure and were not adult motivated but were strongly peer motivated. Gifted and highly gifted students were significantly more often right or integrated than left processors (Cody 1983).

4) Right-hemisphere community college adult math underachievers preferred learning with sound and intake. They wanted tactile and kinesthetic instructional resources and mobility significantly more often than their left-hemisphere counterparts, who preferred bright light and a formal design. [When the predominantly right-hemisphere students were taught alternately with both global and analytic lessons, they achieved statistically higher test scores through the global, rather than through the analytic, resources (Bruno 1988).]

Pages 80–81:

These correlational findings prompted researchers to conduct experimental studies to determine the effects of individual learning style on achievement, attitudes, and/or behavior. …

In addition to the instructional environment, sensory preferences influence the ways in which students learn. Eight studies within the past decade reveal that when youngsters were taught with instructional resources that both matched and mismatched their preferred modalities, they achieved statistically higher test scores in modality-matched, rather than mismatched, treatments (Dunn 1988; see fig. 2). In addition, when children were taught with multisensory resources, but initially through their most preferred modality and then were reinforced through their secondary or tertiary modality, their scores increased even more.

Perceptual preferences affect more than 70 percent of school-age youngsters. High school teachers who have translated their curriculum into electroboards, Flip chutes, multipart task cards, and Pick-A-Holes reported increased achievement and interest when such manipulatives were available for highly tactual students (Dunn and Griggs 1988).

Page 88: “Educational Leadership, 46, 6: 50–58, March 1989. Reprinted with permission from ASCD [Association for Supervision and Curriculum Development]. All rights reserved.”

[817] Book: The SAGE Encyclopedia of Educational Technology. Edited by J. Michael Spector. Sage Publications, 2015. Article: “Adaptive Learning Software and Platforms.” By Dr. Kinshuk. Pages 7–10.

Page 8:

The field of learning styles is complex, and although a lot of research has been conducted, there is still no consensus regarding an accepted definition of learning styles. It is clear that there are multiple ways in which students prefer to learn, and those preferences in turn affect the effectiveness of [the] learning process. Consideration of learning styles can decrease students’ effort in terms of time required for learning and increase overall student satisfaction.

[818] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 3: “These common standards are an important step in bringing about a real and meaningful transformation of the education system for the benefit of all students.”

[819] Email from Just Facts to the Common Core State Standards Initiative, October 14, 2015.

Subject: Research on Common Core Standards

Dear Sir or Madam:

I am researching the CCS and kindly request that you point me to the specific studies that support the following four CC principles and standards. I am familiar with the “Compendium of Research on the Common Core State Standards,” but it is not clear to me what research specifically supports each standard. Hence, I am looking to connect the research to the standards as if the standards were methodically footnoted. …

3) “Compose and decompose numbers from 11 to 19 into ten ones and some further ones, for example, by using objects or drawings, and record each composition or decomposition by a drawing or equation….” [Emphasis added, “Common Core Standards for Mathematics.” <bit.ly>] …

I am particularly interested in experimental studies (as opposed to observational studies). Thus, would you also please identify studies pertaining to the standards above that are experimental or quasi-experimental?

[820] On 10/16/15, Just Facts followed up with a phone call to CCSSI’s [Common Core State Standards Initiative] press office, received a call in return, and resent the email at CCSSI’s request. CCSSI replied with a generic email that did not answer the questions posed by Just Facts. Just Facts responded, pointed out that the reply did not answer the questions, and requested the “specific studies that support” CCSSI’s assertions. The press office replied, “Let me pass your question along to my colleague.”

On 10/21, Just Facts followed up with an email to CCSSI’s press office. The press office replied that it would “follow up.”

On 10/22, CCSSI’s press office wrote to Just Facts: “I reached out to my colleague, and she will be connecting with you soon.”

On 10/23, Just Facts followed up with an email to CCSSI’s press office.

[821] Footnote reserved if CCSSI should respond in the future.

[822] Footnote reserved if CCSSI should respond in the future.

[823] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 4: “Part of the motivation behind the interdisciplinary approach to literacy promulgated by the Standards is extensive research establishing the need for college and career ready students to be proficient in reading complex informational text independently in a variety of content areas.”

[824] Commentary: “Can This Country Survive Common Core’s College Readiness Level?” By R. James Milgram and Sandra Stotsky, September 2013. <bit.ly>

Page 3:

Milgram was the only mathematician on the [Common Core] VC [validation committee] … and Stotsky was the only expert on K–12 English language arts standards by virtue of her work in the Massachusetts Department of Education from 1999–2003 and with Achieve, Inc. on its American Diploma Project high school exit standards for English language arts in 2004 and its subsequent backmapped standards for earlier grade levels.

[825] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 6:

Sandra Stotsky

Endowed Chair in Teacher Quality at the University of Arkansas’s Department of Education Reform and Chair of the Sadlier Mathematics Advisory Board

Stotsky has abundant experience in developing and reviewing ELA [English Language Arts] standards. As senior associate commissioner of the Massachusetts Department of Education, she helped revise pre–K–12 standards. She also served on the 2009 steering committee for NAEP [National Assessment of Educational Progress] reading and on the 2006 National Math Advisory Panel.

[826] “Testimony for the House Study Committee on the Role of Federal Government in Education.” By Sandra Stotsky (University of Arkansas, Department of Education Reform). Georgia General Assembly, House Study Committee on the Role of the Federal Government in Education. September 24, 2014. <bit.ly>

Page 1:

I begin with remarks on Common Core’s Validation Committee, on which I served from 2009–2010. This committee, which was created to put the seal of approval on Common Core’s standards, was invalid both in its membership and in the procedures it was told to follow. …

… Common Core’s standards did not emerge from a state-led process and were not written by nationally known experts, claims regularly made by its advocates. In fact, the people who wrote the standards were not qualified to draft K–12 standards at all. …

… Not only were no high school mathematics teachers involved, no English professors or high school English teachers were, either. Because everyone worked without open meetings or accessible public comment, their reasons for making the decisions they did are lost to history. To this day we do not know why Common Core’s high school mathematics standards do not provide a pathway to STEM [science, technology, engineering, and math] careers or why David Coleman was allowed to mandate a 50/50 division between literary study and “informational” text at every grade level from K–12 in the ELA [English language arts] standards, with no approval from English teachers across the country or from the parents of students in our public schools.

The absence of relevant professional credentials in the two standards-writing teams helps to explain the flaws in these standards, on which costly tests are based and scheduled to be given in Georgia in 2015–2016. The “lead” writers for the ELA standards, David Coleman and Susan Pimentel, had never taught reading or English in K–12 or at the college level. Neither has a doctorate in English, nor published serious work on curriculum and instruction. They were virtually unknown to English language arts educators and to higher education faculty in rhetoric, speech, composition, or literary study.

None of the three lead standards-writers in mathematics, Jason Zimba, William McCallum, and Phil Daro, the only member of this three-person team with teaching experience, had ever developed K–12 mathematics standards before. Who wanted these people as standards-writers and why, we still do not know. No one in the media showed the slightest interest in their qualifications or the low level of college readiness they aimed for on a grade 11 test.

Page 2:

Why didn’t I sign off on Common Core’s standards? Professor Milgram and I were two of the five members of the VC [Validation Committee] who did not sign off on the standards. So far as we could determine, the Validation Committee was intended to function as a rubber stamp even though we had been asked to validate the standards. Despite repeated requests, we did not get the names of countries whose standards were supposedly used as benchmarks for Common Core’s. So far as I could figure out, Common Core’s standards were intentionally not made comparable to the most demanding sets of standards elsewhere. It did not offer any research evidence to justify its omission of high school mathematics standards leading to STEM careers, its stress on writing over reading, its division of reading instructional texts into “information” and “literature,” its deferral of the completion of Algebra I to grade 9 or 10, and its experimental approach to teaching Euclidean geometry. Nor did Common Core offer evidence that its standards meet entrance requirements for most colleges and universities in this country or elsewhere—or for a high school diploma in many states.

[827] “Testimony for a Hearing on Indiana Senate Bill No. 373.” By Sandra Stotsky. University of Arkansas, January 25, 2012. <www.justfacts.com>

Page 1:

I draw on much state and national experience with K–12 standards, curricula, and assessments. I was the senior associate commissioner in the Massachusetts Department of Education from 1999–2003 where, among other duties, I was in charge of the development or revision of all the state’s K–12 standards. I reviewed all states’ English language arts and reading standards for the Thomas B. Fordham Institute in 1997, 2000, and 2005. I co-authored Achieve’s American Diploma Project high school exit test standards for English in 2004. I served as a reviewer and advisor to Indiana on its 2006 Academic Standards and its 2008 Core Standards. I served on Common Core’s Validation Committee from 2009–2010. Finally, I am the author of The Death and Resurrection of a Coherent Literature Curriculum: What Secondary English Teachers Can Do, to be published by Rowman & Littlefield in June 2012.

Page 2:

Common Core’s “college readiness” standards are not content standards but simply empty skill sets. To judge by the reading levels of the high school examples of “complexity” in Common Core’s Appendix B, the average reading level of the passages on the common tests now being developed to determine “college-readiness” may be at about the grade 7 level. …

Common Core’s “college readiness” ELA/R [English Language Arts/Reading] standards were deliberately designed as empty skill sets to enable a large number of high school students to be declared “college ready” and to enroll in post-secondary institutions that will have no choice but to place them in credit-bearing courses. These institutions will then likely be under pressure from the USDE [U.S. Department of Education] to retain these students in order to increase college graduation rates even if they are reading at only middle school level.

[828] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 6:

The Standards set grade-specific standards but do not define the intervention methods or materials necessary to support students who are well below or well above grade-level expectations. No set of grade-specific standards can fully reflect the great variety in abilities, needs, learning rates, and achievement levels of students in any given classroom. However, the Standards do provide clear signposts along the way to the goal of college and career readiness for all students.

[829] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 6: “While the Standards focus on what is most essential, they do not describe all that can or should be taught. A great deal is left to the discretion of teachers and curriculum developers.”

[830] Webpage: “Key Shifts in English Language Arts.” Common Core State Standards Initiative. Accessed October 10, 2015 at <bit.ly>

Because the standards are the roadmap for successful classrooms, and recognizing that teachers, school districts, and states need to decide on the journey to the destination, they intentionally do not include a required reading list. Instead, they include numerous sample texts to help teachers prepare for the school year and allow parents and students to know what to expect during the year.

[831] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Pages 4–5:

Most of the required reading in college and workforce training programs is informational in structure and challenging in content; postsecondary education programs typically provide students with both a higher volume of such reading than is generally required in K–12 schools and comparatively little scaffolding.

The Standards are not alone in calling for a special emphasis on informational text. The 2009 reading framework of the National Assessment of Educational Progress (NAEP) requires a high and increasing proportion of informational text on its assessment as students advance through the grades.

Distribution of Literary and Informational Passages by Grade in the 2009 NAEP Reading Framework

Grade | Literary | Informational
4 | 50% | 50%
8 | 45% | 55%
12 | 30% | 70%

Source: National Assessment Governing Board. (2008). Reading framework for the 2009 National Assessment of Educational Progress. Washington, DC: U.S. Government Printing Office. …

Fulfilling the Standards for 6–12 ELA [English language arts] requires much greater attention to a specific category of informational text—literary nonfiction—than has been traditional.

[832] Webpage: “Key Shifts in English Language Arts.” Common Core State Standards Initiative. Accessed October 10, 2015 at <bit.ly>

Students must be immersed in information about the world around them if they are to develop the strong general knowledge and vocabulary they need to become successful readers and be prepared for college, career, and life. Informational texts play an important part in building students’ content knowledge. Further, it is vital for students to have extensive opportunities to build knowledge through texts so they can learn independently.

In K–5, fulfilling the standards requires a 50–50 balance between informational and literary reading. Informational reading includes content-rich nonfiction in history/social studies, sciences, technical studies, and the arts. The K–5 standards strongly recommend that texts—both within and across grades—be selected to support students in systematically developing knowledge about the world.

In grades 6–12, there is much greater attention on the specific category of literary nonfiction, which is a shift from traditional standards. To be clear, the standards pay substantial attention to literature throughout K–12, as it constitutes half of the reading in K–5 and is the core of the work of 6–12 ELA [English language arts] teachers. Also in grades 6–12, the standards for literacy in history/social studies, science, and technical subjects ensure that students can independently build knowledge in these disciplines through reading and writing. Reading, writing, speaking, and listening should span the school day from K–12 as integral parts of every subject.

[833] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 10 [Grades K–5 Reading Standards]: “Delineate and evaluate the argument and specific claims in a text, including the validity of the reasoning as well as the relevance and sufficiency of the evidence.”

Page 47 [Grades 6–12 Writing Standards]: “Apply grades 9–10 Reading standards to literary nonfiction (for example, ‘Delineate and evaluate the argument and specific claims in a text, assessing whether the reasoning is valid and the evidence is relevant and sufficient; identify false statements and fallacious reasoning’).”

[834] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 50 [Grades 6–12 Speaking and Listening Standards]: “Evaluate a speaker’s point of view, reasoning, and use of evidence and rhetoric, identifying any fallacious reasoning or exaggerated or distorted evidence.”

[835] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 3:

As specified by CCSSO [Council of Chief State School Officers] and NGA [National Governors Association], the Standards are (1) research and evidence based, (2) aligned with college and work expectations, (3) rigorous, and (4) internationally benchmarked. A particular standard was included in the document only when the best available evidence indicated that its mastery was essential for college and career readiness in a twenty-first-century, globally competitive society. The Standards are intended to be a living work: as new and better evidence emerges, the Standards will be revised accordingly.

[836] Webpage: “Myths vs. Facts.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Myth: These standards amount to a national curriculum for our schools.

Fact: The Common Core is not a curriculum. It is a clear set of shared goals and expectations for what knowledge and skills will help our students succeed. Local teachers, principals, superintendents, and others will decide how the standards are to be met. Teachers will continue to devise lesson plans and tailor instruction to the individual needs of the students in their classrooms.

[837] Book: The Modern Middle School: Addressing Standards and Student Needs. By Gilbert H. Hunt (Ph.D., Coastal Carolina University), Dennis G. Wiseman (Ph.D., Coastal Carolina University), and Sandra Pope Bowden (Ph.D., Charleston Southern University). Charles C Thomas, 2003.

Page 113:

No issue currently impacts the middle level school more than curriculum reform based on state and national standards. Across the nation each state’s legislature has demonstrated a willingness to create curriculum standards for each grade level and to create state assessment tools designed to measure each student’s progress toward stated criteria. In this environment, where such behaviors as publishing each school’s test scores in local newspapers, reassigning and even replacing teachers and principals based on student academic achievement as reflected in standardized test scores, and states taking over entire school districts based primarily on low test performance, it is little wonder that great pressures exist on educators to align the curriculum with state content-standards for the specific grade levels involved. In fact, the aligning of curriculum and instruction to specific state content standards has become a universal teaching skill now taught in colleges of education and practiced in literally all school districts. Does this mean that the content of middle level curriculum is being controlled by the content of state standards, and, to some degree, the content of the state tests that are based on these standards? Certainly, without a doubt. To the degree that these standards form the foundation of a developmentally sound and academically appropriate curriculum, this is either a good or a bad development.

[838] Article: “How Bill Gates Pulled Off the Swift Common Core Revolution.” By Lyndsey Layton. Washington Post, June 7, 2014. <www.washingtonpost.com>

The Bill and Melinda Gates Foundation didn’t just bankroll the development of what became known as the Common Core State Standards. With more than $200 million, the foundation also built political support across the country, persuading state governments to make systemic and costly changes.

Bill Gates was de facto organizer, providing the money and structure for states to work together on common standards….

The Gates Foundation spread money across the political spectrum, to entities including the big teachers unions, the American Federation of Teachers and the National Education Association, and business organizations such as the U.S. Chamber of Commerce….

… Gates money went to state and local groups, as well, to help influence policymakers and civic leaders. And the idea found a major booster in President Obama, whose new administration was populated by former Gates Foundation staffers and associates.

[839] Speech: “Bill Gates Before National Conference of State Legislatures.” Bill and Melinda Gates Foundation, July 21, 2009. <www.gatesfoundation.org>

Fortunately, the state-led Common Core State Standards Initiative is developing clear, rigorous common standards that match the best in the world. Last month, 46 Governors and Chief State School Officers made a public commitment to embrace these common standards.

This is encouraging—but identifying common standards is not enough. We’ll know we’ve succeeded when the curriculum and the tests are aligned to these standards.

Secretary Arne Duncan recently announced that $350 million of the stimulus package will be used to create just these kinds of tests—next-generation assessments aligned to the common core.

When the tests are aligned to the common standards, the curriculum will line up as well—and that will unleash powerful market forces in the service of better teaching. For the first time, there will be a large base of customers eager to buy products that can help every kid learn and every teacher get better. Imagine having the people who create electrifying video games applying their intelligence to online tools that pull kids in and make algebra fun.

[840] Report: “Reaching Higher: The Common Core State Standards Validation Committee.” National Governors Association Center for Best Practices and the Council of Chief State School Officers, June 2010. <bit.ly>

Page 1: “Once the standards are adopted and implemented, states will determine how best to measure and hold students accountable for meeting these standards.”

Page 3: “Alignment of curricula and assessments to the Common Core State Standards—the next great task facing the states—will be essential to the staying power and lasting impact of the standards.”

[841] Commentary: “Commend Common Core.” By Bill Gates. USA Today, February 12, 2014. <www.usatoday.com>

In fact, the standards were sponsored by organizations made up of governors and school officials. The major teacher unions and 48 states sent teams, including teachers, to participate. The Gates Foundation helped fund this process because we believe that stronger standards will help more students live up to their potential. …

These are standards, just like the ones schools have always had; they are not a curriculum. They are a blueprint of what students need to know, but they have nothing to say about how teachers teach that information. It’s still up to local educators to select the curriculum.

In fact, the standards will give teachers more choices. When every state had its own standards, innovators making new educational software or cutting-edge lesson plans had to make many versions to reach all students. Now, consistent standards will allow more competition and innovation to help teachers do their best work.

[842] Report: “Frequently Asked Questions.” Common Core State Standards Initiative, June 5, 2014. <www.thecorestandards.org>

Page 2 (of PDF): “The standards establish what students need to learn, but they do not dictate how teachers should teach. Teachers will devise their own lesson plans and curriculum, and tailor their instruction to the individual needs of the students in their classrooms.”

[843] Commentary: “Commend Common Core.” By Bill Gates. USA Today, February 12, 2014. <www.usatoday.com>

In fact, the standards were sponsored by organizations made up of governors and school officials. The major teacher unions and 48 states sent teams, including teachers, to participate. The Gates Foundation helped fund this process because we believe that stronger standards will help more students live up to their potential. …

These are standards, just like the ones schools have always had; they are not a curriculum. They are a blueprint of what students need to know, but they have nothing to say about how teachers teach that information. It’s still up to local educators to select the curriculum.

[844] “Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

Page 4:

By emphasizing required achievements, the Standards leave room for teachers, curriculum developers, and states to determine how those goals should be reached and what additional topics should be addressed. Thus, the Standards do not mandate such things as a particular writing process or the full range of metacognitive strategies that students may need to monitor and direct their thinking and learning. Teachers are thus free to provide students with whatever tools and knowledge their professional judgment and experience identify as most helpful for meeting the goals set out in the Standards.

[845] Entry: “metacognition.” Merriam-Webster. Accessed October 28, 2015 at <www.merriam-webster.com>

“awareness or analysis of one’s own learning or thinking processes”

[846] “Common Core State Standards for Mathematics.” Common Core State Standards Initiative. Accessed October 3, 2018 at <bit.ly>

NOTE: To determine whether the math standards contain a statement similar to the one quoted from the ELA [English Language Arts] standards, Just Facts surveyed the standards and searched for words like “metacognitive,” “cognitive,” “process,” and “strategies.” No equivalent statement was found. Unlike the English standards, the math standards require students to use specific strategies and processes, such as the following on page 15:

Add and subtract within 20, demonstrating fluency for addition and subtraction within 10. Use strategies such as counting on; making ten (for example, 8 + 6 = 8 + 2 + 4 = 10 + 4 = 14); decomposing a number leading to a ten (for example, 13 – 4 = 13 – 3 – 1 = 10 – 1 = 9); using the relationship between addition and subtraction (for example, knowing that 8 + 4 = 12, one knows 12 – 8 = 4); and creating equivalent but easier or known sums (for example, adding 6 + 7 by creating the known equivalent 6 + 6 + 1 = 12 + 1 = 13).
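The “making ten” strategy quoted above is mechanical enough to express as a short program. The sketch below is Just Facts’ illustration, not part of the standard; the function name and the single-digit restriction are our assumptions:

```python
def make_ten_sum(a, b):
    """Add two numbers (each 0-10) with the 'making ten' strategy:
    move just enough from b to complete a ten with a, then add the rest.
    E.g., 8 + 6 -> 8 + 2 + 4 -> 10 + 4 -> 14."""
    assert 0 <= a <= 10 and 0 <= b <= 10
    to_ten = 10 - a              # how much of b is needed to make a ten
    if b >= to_ten:
        return 10 + (b - to_ten)  # e.g., 8 + 6 = (8 + 2) + 4 = 14
    return a + b                  # no ten can be made; add directly
```

The branch mirrors the standard’s example: for 8 + 6, two is taken from the 6 to make 10, leaving 4, so the result is 10 + 4 = 14.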

[847] Commentary: “Straight Up Conversation: Math Scholar Hung-Hsi Wu on the Common Core.” By Rick Hess. Education Next, October 5, 2011. <www.edweek.org>

A few weeks back, I penned a post about the lack of response we’d received regarding our in-the-works Education Next forum on the Common Core math standards. I heard from a number of individuals who offered to defend the standards. One was Hung-Hsi Wu, professor emeritus in mathematics from UC-Berkeley, who has just penned the cover story on this topic for AFT’s [American Federation of Teachers] magazine American Educator. Dr. Wu, who started teaching at Berkeley in 1973, has been actively involved in math education for the past two decades, helping write California’s 1999 Mathematics Framework and California’s Standards Tests. He was also a member of NAEP’s [National Assessment of Educational Progress] Mathematics Steering Committee, 2000–2001, that contributed to the revision of the NAEP Framework. …

Wu: … [T]here is a profound common misunderstanding about something as basic as what it means to solve an equation. … The Common Core math standards, however, ask that students ‘understand solving equations as a process of reasoning’ and say explicitly what needs to be taught about this process (see Standard A-REI 1 in High School Algebra).

[848] Commentary: “Phoenix Rising: Bringing the Common Core Mathematics Standards to Life.” By Hung-Hsi Wu. American Educator, Fall 2011. Pages 3–13. <www.aft.org>

Pages 4–5:

Let us give two examples of the kind of change the CCSMS [Common Core math standards] (if properly implemented) will bring to the mathematics classroom. …

How should students add 1/8 + 5/6? …

In the CCSMS [Common Core math standards], adding fractions is spread through three grades, progressing from the simple to the complex, giving students time for complete mastery.* Briefly, in grade 3, students learn to think of a fraction as a point on the number line that is “so many copies” of its corresponding unit fraction. For example, 5/6 is 5 copies of the unit fraction 1/6 (and 1/6 is 1 copy). When we represent a fraction as a point on the number line, we place a unit fraction such as 1/6 on the division point to the right of 0 when the unit segment from 0 to 1 is divided into 6 equal segments. It is natural to identify such a point with the segment between the point itself and 0. Thus, as shown below, 1/6 is identified with the red segment between 0 and 1/6, 5/6 is identified with the segment between 0 and 5/6, etc. Then, the statement that “5/6 is 5 copies of 1/6” acquires an obvious visual meaning: the segment from 0 to 5/6 is 5 copies of the segment from 0 to 1/6.
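The quoted passage poses 1/8 + 5/6 but does not carry out the addition. As a check (not part of the source), the sum over the common denominator 24 can be verified with Python's standard `fractions` module:

```python
from fractions import Fraction

# Completing the example posed above (not part of the quoted source):
# 1/8 + 5/6 = 3/24 + 20/24 = 23/24.
total = Fraction(1, 8) + Fraction(5, 6)
print(total)  # 23/24
```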

[849] Webpage: “About.” Pathway to Equitable Math Instruction. Accessed March 28, 2022 at <equitablemath.org>

This toolkit was developed by a team of teachers, instructional coaches, researchers, professional development providers, and curriculum writers with expertise in mathematics education, English language development, and culturally responsive pedagogy. …

We also wish to thank the Bill and Melinda Gates Foundation for their generous financial support of this project.

[850] Webpage: “Curriculum & Instruction: English Learners.” Los Angeles County Office of Education. Accessed March 28, 2022 at <www.lacoe.edu>

Featured Items …

A Pathway to Equitable Math Instruction

LACOE is a proud partner in developing this toolkit because of its integrated approach to mathematics that centers Black, Latinx, and Multilingual students in grades 6–8, addresses barriers to math equity, and aligns instruction to grade-level priority standards.

[851] “Dismantling Racism in Mathematics Instruction: Exercises for Educators to Reflect on Their Own Biases to Transform Their Instructional Practice.” Pathway to Equitable Math Instruction, May 2021. <equitablemath.org>

Page 22:

Universal access to the content standards requires that educators apply a strong equity lens as they plan their instruction. This requires intentional focus on crafting teaching methods that are not only aligned to the standards, but that are designed with students—their identities, cultures, assets, and needs—at the center. “Equity has been the focus of more NCTM [National Council of Teachers of Mathematics] presidents’ messages [than] any other topic (Gojak, 2012), yet there is no mention of equity in the Common Core State Standards, and accommodations for ‘English/Language learners’ are in an appendix, something only the tenacious teacher would find.[”] (Gutierrez 2017)

Page 24:

White supremacy culture shows up in math classrooms when …

Only content standards guide learning in the classroom.

While access to grade-level content for every student is the responsibility of schools and essential for equity, a focus on content alone is insufficient for achieving meaningful mathematical power for all students. When only focusing on content without applying a culturally responsive lens or strategic scaffolding, there is a risk of perpetuating white supremacy culture and inequities. A hyperfocus on individual standards requires teachers to function under a system of urgency to “cover” all the material that will be on the test and not focus on actual learning of the big ideas. This approach is not only disengaging, it also limits opportunities for teachers to connect the content to students’ lives in meaningful, relevant ways.

[852] “Dismantling Racism in Mathematics Instruction: Exercises for Educators to Reflect on Their Own Biases to Transform Their Instructional Practice.” Pathway to Equitable Math Instruction, May 2021. <equitablemath.org>

Page 55:

White supremacy culture shows up in math classrooms when …

Students are required to “show their work” in standardized, prescribed ways.

Math teachers ask students to show work so that teachers know what students are thinking, but that can center the teacher’s need to understand rather than student learning. Teachers should seek to understand individual student perspectives and focus on students showing their work in ways that help students learn how to process information.

Instead …

Ask other questions that will demonstrate learning when it is not clear to you how students know the answer. …

Offer a variety of ways to demonstrate thinking and knowledge.

• Verbal Example: Show your thinking with words, pictures, symbols.

• Classroom Activity: Have students create TikTok videos, silent films, or cartoons about mathematical concepts or procedures.

• Professional development: Practice with math colleagues how to answer mathematical problems without using words or numbers.

[853] Book: Encyclopedia of Educational Psychology (Volume 1). Edited by Neil J. Salkind and Kristin Rasmussen. Sage Publications, 2008. Article: “Assessment.” By Marie Kraska. Pages 60–64.

Pages 60–61:

Assessment includes the collection, analysis, and interpretation of various kinds of information useful for educational decisions. Leonard Carmichael and Bette Caldwell suggested that assessment can produce direct benefits to students, as teachers use both formal and informal assessments to diagnose students’ strengths and weaknesses. Assessment can provide information that helps teachers to identify students who need additional instruction, special services, or more advanced work. Assessment also can serve as the basis for teacher reflections on their instructional effectiveness. Based on data collected through various kinds of assessments, teachers can make instructional decisions about re-teaching a lesson or unit or moving ahead with more challenging lessons. Results of assessments can provide feedback to students to indicate areas in which performance needs improvement and areas in which performance is satisfactory.

[854] Book: Incentives and Test-Based Accountability in Education. Edited by Michael Hout and Stuart W. Elliott. National Academies Press, 2011. <www.nap.edu>

Page 29:

Another part of the answer, of course, is that students are not always motivated to learn the things that may be useful or important for them to learn, so that the external signaling and motivation provided by grades can encourage students to learn skills and topics they might otherwise ignore. Sometimes the initial learning produced by external motivation will lead students to discover an interest in a new area that can lead to internal motivation for later learning.

[855] Paper: “Retrieval Practice Produces More Learning than Elaborative Studying with Concept Mapping.” By Jeffrey D. Karpicke and Janell R. Blunt. Science, February 11, 2011. Pages 772–775. <www.science.org>

Pages 772–773:

Educators rely heavily on learning activities that encourage elaborative studying, whereas activities that require students to practice retrieving and reconstructing knowledge are used less frequently. Here, we show that practicing retrieval produces greater gains in meaningful learning than elaborative studying with concept mapping. …

Eighty undergraduate students participated in Experiment 1. The students first studied a science text under one of four conditions within a single initial learning session. In the study-once condition, students studied the text in a single study period. In the repeated study condition, students studied the text in four consecutive study periods (8). In the elaborative concept mapping condition, students studied the text in an initial study period and then created a concept map of the concepts in the text. … Finally, in the retrieval practice condition, students studied the text in an initial study period and then practiced retrieval by recalling as much of the information as they could on a free recall test. After recalling once, the students restudied the text and recalled again. The total amount of learning time was exactly matched in the concept mapping and retrieval practice conditions (19). …

On the final test 1 week later, the repeated study, elaborative concept mapping, and retrieval practice conditions all outperformed the study-once condition on both verbatim and inference questions … Retrieval practice produced the best learning, better than elaborative studying with concept mapping, which itself was not significantly better than spending additional time reading. Collapsed across question type (verbatim and inference), the advantage of retrieval practice … over elaborative studying with concept mapping … represented about a 50% improvement in long-term retention scores.

Page 774:

Overall, 101 out of 120 students (84%) performed better on the final test after practicing retrieval than after elaborative studying with concept mapping. …

Retrieval practice [through tests] is a powerful way to promote meaningful learning of complex concepts commonly found in science education. …

Research on retrieval practice suggests a view of how the human mind works that differs from everyday intuition. Retrieval is not merely a read-out of the knowledge stored in one’s mind; the act of reconstructing knowledge itself enhances learning.

[856] Article: “To Really Learn, Quit Studying and Take a Test.” By Pam Belluck. New York Times, January 20, 2011. <www.nytimes.com>

The other—having students draw detailed diagrams documenting what they are learning—is prized by many teachers because it forces students to make connections among facts. …

[I]n “concept mapping” … [students study] with the passage in front of them … [while arranging] information from the passage into a kind of diagram, writing details and ideas in hand-drawn bubbles and linking the bubbles in an organized way.

The final group took a “retrieval practice” test. Without the passage in front of them, they wrote what they remembered in a free-form essay for 10 minutes. Then they reread the passage and took another retrieval practice test.

[857] Book: Incentives and Test-Based Accountability in Education. Edited by Michael Hout and Stuart W. Elliott. National Academies Press, 2011. <www.nap.edu>

Page 29: “[W]hy is the use of grades a standard part of the educational system? Part of the answer is that grades provide information to others (such as parents and college admissions officers) about how students have done, as well as information to the students themselves that they are learning the right things.”

[858] Book: Management. By Ranjay Gulati (Harvard Business School), Anthony Mayo (Harvard Business School), and Nitin Nohria (Harvard Business School). Cengage Learning, 2014.

Page 237:

Today’s companies utilize a wide variety of information to select the best job candidates. The process is similar to the process universities and colleges use to select students. In college admissions, schools collect a variety of both quantitative data such as test scores and grades and qualitative information such as recommendations, lists of activities and achievements, and notes from interviews with alumni and school representative[s]. Companies also collect a variety of quantitative and qualitative information from their job candidates. This may include results from cognitive ability and personality tests, college GPA [grade point average], standardized test scores, basic background information provided by the candidate, and information gathered from interviews and references.

[859] Book: Knowing What Students Know: The Science and Design of Educational Assessment. Edited by James W. Pellegrino, Naomi Chudowsky, and Robert Glaser. National Academies Press, 2001. <www.nap.edu>

Page 40:

Another common purpose of assessment is to help policy makers formulate judgments about the quality and effectiveness of educational programs and institutions (these assessments also fall under the category of summative assessment). Assessments are used increasingly to make high-stakes decisions not only about individuals, but also about institutions. For instance, public reporting of state assessment results by school and district can influence the judgments of parents and taxpayers about their schools.

[860] Book: Comparative Public Policy and Citizen Participation: Energy, Education, Health and Urban Issues in the U.S. and Germany. Edited by Charles R. Foster. Pergamon Press, 1980.

Chapter 7: “Education As Loosely Coupled Systems in West Germany and the United States.” By Maurice A. Garnier. Pages 87–98.

Page 95:

Localized adaptation may also foster idiosyncratic standards. If no standardization exists, schools can postulate anything as satisfying graduation requirements. The development of standardized testing constitutes a response to this problem in the United States, but many graduates are led to believe that they have received a certain kind of education when, in reality, their achievement is low. In the United States, the situation prevails both at the secondary and university levels.

[861] Paper: “The Effect Heterogeneity of Central Exams: Evidence from TIMSS, TIMSS-Repeat and PISA.” By Ludger Woessmann. Education Economics, June 2005. Pages 143–169. <www.tandfonline.com>

Pages 146–147:

The provision of education can be understood as a network of principal–agent relationships in which a principal (for example, the parents) commissions an agent (for example, a school director) to perform a service (the education of the children) on behalf of the principal. … For if the agent’s interests diverge from those of the principal, and if the principal is not fully informed about the agent’s real performance, then the agent may pursue his own interests instead of those of the principal, without the latter becoming aware of this behaviour and thus without him being able to sanction it.

Central examinations can contribute to mitigating the problem of incomplete monitoring of the actions of the agents in the education system by supplying information about the performance of individual students relative to the national (or regional) student population. They harmonise the incentives of the agents more strongly with the interests of the principal and thus with the objectives of the education system (cf. Woessmann, 2002). …

The danger of opportunism by decentralised decision-makers is thus limited to those decision-making areas in which their interests diverge from the objective of enhancing students’ knowledge. This is, for instance, imaginable whenever the decision concerns the financial position or the workload to be fulfilled by the schools. In such cases, it is rational for the school decision-makers to favour their own interests over the promotion of student performance as long as monitoring agencies such as school governors or parents have no information about the actual behaviour of the schools.

[862] Commentary: “The Common Core Math Standards: Are They a Step Forward or Backward?” By Ze’ev Wurman and W. Stephen Wilson. Education Next, Summer 2012. <educationnext.org>

Wurman: … Moreover, there are organizations that have reasons to work for lower and less-demanding standards, specifically teachers unions and professional teacher organizations. While they may not admit it, they have a vested interest in lowering the accountability bar for their members.

[863] Book: Encyclopedia of Educational Psychology (Volume 1). Edited by Neil J. Salkind and Kristin Rasmussen. Sage Publications, 2008. Article: “Assessment.” By Marie Kraska. Pages 60–64.

Page 63:

Reliability and validity are two important concepts in assessment. Reliability refers to the consistency or stability of a test in producing the same or similar scores over repeated administrations of the test. For example, if a student scores high or low on a test of word comprehension in the morning, then one should expect that if the same or a similar test were administered in the afternoon, the student would have the same or a similar score providing that no changes were made in the student’s level of knowledge or administration of the test to bias the results. If the two administrations of the test yielded the same or similar results, then the test would be reliable; however, if there were a great discrepancy between the two administrations, then the test would be unreliable. Reliability is affected by different sources of error, which are called random errors. Variations due to individual attributes such as general knowledge, ability, skills, motivation, health, and attention are all sources of error. In addition, random errors may be due to variations in the characteristics of the test.

Page 64:

Validity is the extent to which a test measures what it was designed to measure. This means that tests are designed for specific purposes, and each test must have its own validity for the purpose for which it was designed. A valid test must also be reliable; however, the converse is not true. Reliable tests are not necessarily valid. That is, a test may consistently measure the wrong thing. Establishing test validity is thought to be a more complex process than establishing test reliability because establishing validity depends on the judgments to be made based on test results and how the results will be used. It is necessary to collect information as evidence that a test provides a true measure of such abstractions. To validate that tests provide true measures, certain information or evidence must be collected depending on the type of validity to be determined. Three common types of validity are content validity, criterion-related validity, and construct validity.

[864] Book: Encyclopedia of Measurement and Statistics (Volume 1). Edited by Neil J. Salkind and Kristin Rasmussen. Sage Publications, 2007. Article: “Achievement Tests.” By Thomas Haladyna. Pages 7–11.

Page 10:

Standardized achievement tests … [are] designed to maximize information that will be validly interpreted and used by its recipients. … Standards are set validly using procedures that are widely known and accepted as producing valid results. The interpretation of test scores is consistent with the intended purpose of the test. …

The process of validation is a responsibility of the test developer and the sponsor of the testing program. The Standards for Educational and Psychological Testing are very clear about the conditions and evidence needed for validation. A technical report or test manual contains the argument and evidence supporting each intended test score interpretation or use. If subscores are used, each subscore should also be validated. Test scores should never be used for purposes that are not validated. …

All standardized achievement tests can be used for various purposes. However, each purpose should be validated.

[865] Speech: “Beyond the Bubble Tests: The Next Generation of Assessments.” By Education Secretary Arne Duncan. U.S. Department of Education, September 2, 2010. <bit.ly>

Today is the day that marks the beginning of the development of a new and much-improved generation of assessments for America’s schoolchildren. …

Earlier this morning, the Department announced the winners of the Race to the Top Assessment competition. I am glad to report that the 44 states and the District of Columbia that applied are all part of at least one winning grant. Two large state consortia have won awards totaling $330 million. …

As you know, the Partnership for Assessment of Readiness for College and Careers, or PARCC, is managed by Achieve. The PARCC consortium has 26 member states. Its proposal underwent a rigorous review by a panel of peer review experts—and came out a winner. The consortium is slated to receive a $170 million award. The SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced Assessment Consortium, with 31 member states, won a $160 million award. …

By the 2014–2015 school year, the assessments developed by these two winning state consortia will be in use in any state that chooses to use them. …

The Common Core standards developed by the states, coupled with the new generation of assessments, will help put an end to the insidious practice of establishing 50 different goalposts for educational success. In the years ahead, a child in Mississippi will be measured against the same standard of success as a child in Massachusetts.

[866] Webpage: “About.” Partnership for Assessment of Readiness for College and Careers. Accessed October 30, 2015 at <bit.ly>

The Partnership for Assessment of Readiness for College and Careers (PARCC) is a group of states working together to develop a modern assessment that replaces previous state standardized tests. It not only evaluates a student’s progress but also provides better information for teachers and parents to identify where a student needs help, or is excelling, so they are able to enhance instruction to meet individual student needs. …

The Partnership for Assessment of Readiness for College and Careers (PARCC) believes that assessments should work as tools for enhancing teaching and learning. Assessments that are aligned with the new, more rigorous Common Core State Standards help to ensure that every child is on a path to college and career readiness.

[867] Webpage: “What is SMARTER Balanced?” Smarter Balanced Assessment Consortium. Accessed October 4, 2018 at <bit.ly>

About Us

SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced is a public agency currently supported by its members. Through the work of thousands of educators, we created an online assessment system aligned to the Common Core State Standards (CCSS), as well as tools for educators to improve teaching and learning. SMARTER Balanced is housed at the University of California Santa Cruz Silicon Valley Extension.

[868] Letter from U.S. Education Secretary Arne Duncan to the Chief State School Officers, September 23, 2011. <www2.ed.gov>

[M]any of you are petitioning us for relief from the requirements of current law. …

For these reasons, I am writing to offer you the opportunity to request flexibility on behalf of your State, your LEAs [local education agencies], and your schools, in order to better focus on improving student learning and increasing the quality of instruction. This voluntary opportunity will provide educators and State and local leaders with flexibility regarding specific requirements of NCLB [No Child Left Behind Act of 2001] in exchange for rigorous and comprehensive State-developed plans designed to improve educational outcomes for all students, close achievement gaps, increase equity, and improve the quality of instruction. …

I invite each interested SEA [state education agency] to request this flexibility pursuant to the authority in section 9401 of the Elementary and Secondary Education Act of 1965 (ESEA), which allows me to waive, with certain exceptions, any statutory or regulatory requirement of the ESEA for an SEA that receives funds under a program authorized by the ESEA and requests a waiver. …

In addition to this letter, we have posted two documents on our Web site at <www.ed.gov>. The first document is titled ESEA Flexibility, which is also attached to this letter. This document contains three parts. First, it sets forth the statutory and regulatory requirements that would be waived in order to provide flexibility for SEAs and LEAs. Second, it lays out the principles to which SEAs and LEAs must adhere in order to receive that flexibility.

[869] Policy document: “ESEA Flexibility.” U.S. Department of Education, September 23, 2011. Updated 6/17/12. <www2.ed.gov>

In order to move forward with State and local reforms designed to improve academic achievement and increase the quality of instruction for all students in a manner that was not originally contemplated by the No Child Left Behind Act of 2001 (NCLB), a State educational agency (SEA) may request flexibility, on its own behalf and on behalf of its local educational agencies (LEAs), through waivers of ten provisions of the Elementary and Secondary Education Act of 1965 (ESEA) and their associated regulatory, administrative, and reporting requirements. In order to receive this flexibility, an SEA must meet the principles described in the next section. …

This document was originally issued on September 23, 2011. It has been updated to include two optional waivers that have been added to ESEA flexibility since that time and to reflect the implementation timeline for an SEA that requests this flexibility at the beginning of the 2012–2013 school year.

1. Flexibility Regarding the 2013–2014 Timeline for Determining Adequate Yearly Progress (AYP): An SEA would no longer need to follow the procedures in ESEA section 1111(b)(2)(E) through (H) for setting annual measurable objectives (AMOs) to use in determining AYP. Instead, an SEA would have flexibility to develop new ambitious but achievable AMOs in reading/language arts and mathematics in order to provide meaningful goals that will be used to guide support and improvement efforts for the State, LEAs, schools, and student subgroups.

2. Flexibility in Implementation of School Improvement Requirements: An LEA would no longer be required to comply with the requirements in ESEA section 1116(b) to identify for improvement, corrective action, or restructuring, as appropriate, its Title I schools that fail, for two consecutive years or more, to make AYP, and neither the LEA nor its schools would be required to take currently required improvement actions; however, an SEA may still require or permit an LEA to take such actions. An LEA would also be exempt from all administrative and reporting requirements related to school improvement under current law.

3. Flexibility in Implementation of LEA Improvement Requirements: An SEA would no longer be required to comply with the requirements in ESEA section 1116(c) to identify for improvement or corrective action, as appropriate, an LEA that, for two consecutive years or more, fails to make AYP, and neither the LEA nor the SEA would be required to take currently required improvement actions. An LEA would also be exempt from all associated administrative and reporting requirements related to LEA improvement under current law.

4. Flexibility for Rural LEAs: An LEA that receives Small, Rural School Achievement Program funds or Rural and Low-Income School Program funds would have flexibility under ESEA sections 6213(b) and 6224(e) to use those funds for any authorized purpose regardless of the LEA’s AYP status.

5. Flexibility for Schoolwide Programs: An LEA would have flexibility to operate a schoolwide program in a Title I school that does not meet the 40 percent poverty threshold in ESEA section 1114(a)(1) if the SEA has identified the school as a priority school or a focus school, and the LEA is implementing interventions consistent with the turnaround principles or interventions that are based on the needs of the students in the school and designed to enhance the entire educational program in the school, as appropriate.

6. Flexibility to Support School Improvement: An SEA would have flexibility to allocate ESEA section 1003(a) funds to an LEA in order to serve any priority or focus school, if the SEA determines such schools are most in need of additional support.

7. Flexibility for Reward Schools: An SEA would have flexibility to use funds reserved under ESEA section 1117(c)(2)(A) to provide financial rewards to any reward school, if the SEA determines such schools are most appropriate for financial rewards.

8. Flexibility Regarding Highly Qualified Teacher (HQT) Improvement Plans: An LEA that does not meet its HQT targets would no longer have to develop an improvement plan under ESEA section 2141 and would have flexibility in how it uses its Title I and Title II funds. An SEA would be exempt from the requirements regarding its role in the implementation of these plans, including the requirement that it enter into agreements with LEAs on the uses of funds and the requirement that it provide technical assistance to LEAs on their plan. This flexibility would allow SEAs and LEAs to focus on developing and implementing more meaningful evaluation and support systems. An SEA would not be exempt from the requirement of ESEA section 1111(b)(8)(C) that it ensure that poor and minority children are not taught at higher rates than other children by inexperienced, unqualified, or out-of-field teachers; however, once more meaningful evaluation and support systems are in place in accordance with principle 3 (described below), an SEA may use the results of such systems to meet that requirement.

9. Flexibility to Transfer Certain Funds: An SEA and its LEAs would have flexibility to transfer up to 100 percent of the funds received under the authorized programs designated in ESEA section 6123 among those programs and into Title I, Part A. Moreover, to minimize burden at the State and local levels, the SEA would not be required to notify the Department and its participating LEAs would not be required to notify the SEA prior to transferring funds.

10. Flexibility to Use School Improvement Grant (SIG) Funds to Support Priority Schools: An SEA would have flexibility to award SIG funds available under ESEA section 1003(g) to an LEA to implement one of the four SIG models in any priority school.

Optional Flexibility

In addition to its request for waivers of each of the requirements above, an SEA may wish to request flexibility through waivers related to the following:

11. Flexibility in the Use of Twenty-First Century Community Learning Centers (21st CCLC) Program Funds: An SEA would have flexibility under ESEA sections 4201(b)(1)(A) and 4204(b)(2)(A) to permit community learning centers that receive funds under the 21st CCLC program to use those funds to support expanded learning time during the school day in addition to activities during non-school hours or periods when school is not in session (i.e., before and after school or during summer recess).

12. Flexibility Regarding Making AYP Determinations: An SEA and its LEAs would no longer be required to comply with the requirements in ESEA sections 1116(a)(1)(A)-(B) and 1116(c)(1)(A) to make AYP determinations for LEAs and schools, respectively. Instead, an SEA and its LEAs must report on their report cards performance against the AMOs for all subgroups identified in ESEA section 1111(b)(2)(C)(v), and use performance against the AMOs to support continuous improvement in Title I schools.

13. Flexibility Regarding Within-District Title I Allocations: An LEA would have flexibility under ESEA section 1113(a)(3)-(4) and (c)(1) so that it may serve with Title I funds a Title I-eligible high school with a graduation rate below 60 percent that the SEA has identified as a priority school even if that school does not rank sufficiently high to be served based solely on the school’s poverty rate.

To receive flexibility through the waivers outlined above, an SEA must submit a request that addresses each of the following four principles, consistent with the definitions and timelines described later in this document, to increase the quality of instruction for students and improve student academic achievement in the State and its LEAs. In the SEA’s request, the SEA must describe how it will ensure that LEAs will fully implement these principles, consistent with the SEA’s authority under State law and the SEA’s request.

1. College- and Career-Ready Expectations for All Students …

To receive this flexibility, an SEA must demonstrate that it has college- and career-ready expectations for all students in the State by adopting college- and career-ready standards in at least reading/language arts and mathematics, transitioning to and implementing such standards statewide for all students and schools, and developing and administering annual, statewide, aligned, high-quality assessments, and corresponding academic achievement standards, that measure student growth in at least grades 3–8 and at least once in high school. An SEA must also support English Learners in reaching such standards by committing to adopt English language proficiency (ELP) standards that correspond to its college- and career-ready standards and that reflect the academic language skills necessary to access and meet the new college- and career-ready standards, and committing to develop and administer aligned ELP assessments. To ensure that its college- and career-ready standards are truly aligned with postsecondary expectations, and to provide information to parents and students about the college-readiness rates of local schools, an SEA must annually report to the public on college-going and college credit-accumulation rates for all students and student subgroups in each LEA and each high school in the State. …

2. State-Developed Differentiated Recognition, Accountability, and Support …

3. Supporting Effective Instruction and Leadership …

4. Reducing Duplication and Unnecessary Burden

[870] Webpage: “ESEA [Elementary & Secondary Education Act] Flexibility.” U.S. Department of Education. Last modified May 12, 2016. <www2.ed.gov>

The U.S. Department of Education has invited each State educational agency (SEA) to request flexibility regarding specific requirements of the No Child Left Behind Act of 2001 (NCLB) in exchange for rigorous and comprehensive State-developed plans designed to improve educational outcomes for all students, close achievement gaps, increase equity, and improve the quality of instruction.

• 45 states, the District of Columbia, Puerto Rico and the Bureau of Indian Education submitted requests for ESEA flexibility

• 43 States, the District of Columbia and Puerto Rico are approved for ESEA flexibility (see the list of peer reviewers)

[871] Press release: “Common Core State Standards Initiative Validation Committee Announced.” National Governors Association, September 24, 2009. <bit.ly>

For the college- and career-readiness standards, the Validation Committee will:

• Review the process used to develop the college- and career-readiness standards and recommend improvements in that process. These recommendations will be used to inform the K–12 development process.

• Validate the sufficiency of the evidence supporting each college- and career-readiness standard. Each member is asked to determine whether each standard has sufficient evidence to warrant its inclusion.

[872] Webpage: “Common Core State Standards Adoption Map.” Certica Solutions. Accessed April 18, 2019 at <statestandards.certicasolutions.com>

State | Adopted | Adoption Type | Final Adoption
Alabama | Adopted with Modifications | Full | 11/28/2010
Alaska | Not Adopted | N/A | N/A
Arizona | Adopted with Modifications | Incremental | 6/28/2010
Arkansas | Withdrawn | Incremental | 7/12/2010
California | Adopted with Modifications | Incremental | 8/2/2010
Colorado | Adopted with Modifications | Full | 8/30/2010
Connecticut | Adopted Verbatim | Incremental | 7/7/2010
Delaware | Adopted Verbatim | Incremental | 8/19/2010
District of Columbia | Adopted Verbatim | Incremental | 7/5/2010
Florida | Adopted with Modifications | Incremental | 7/27/2010
Georgia | Adopted with Modifications | Full | 7/8/2010
Hawaii | Adopted Verbatim | Incremental | 6/18/2010
Idaho | Adopted Verbatim | Full | 1/24/2011
Illinois | Adopted with Modifications | Incremental | 6/24/2010
Indiana | Withdrawn | N/A | 8/3/2010
Iowa | Adopted with Modifications | Incremental | 7/9/2010
Kansas | Adopted with Modifications | Incremental | 10/12/2010
Kentucky | Adopted Verbatim | Full | 2/10/2010
Louisiana | Withdrawn | Incremental | 7/10/2010
Maine | Adopted Verbatim | Incremental | 4/4/2011
Maryland | Adopted Verbatim | Incremental | 6/1/2010
Massachusetts | Adopted with Modifications | Incremental | 7/21/2010
Michigan | Adopted Verbatim | Full | 6/15/2010
Minnesota | Partially Adopted | Full | 9/1/2010
Mississippi | Adopted with Modifications | Incremental | 7/2/2010
Missouri | Withdrawn | Full | 6/15/2010
Montana | Adopted with Modifications | Full | 11/4/2011
Nebraska | Not Adopted | N/A | N/A
Nevada | Adopted Verbatim | Incremental | 6/18/2010
New Hampshire | Adopted Verbatim | Full | 7/8/2010
New Jersey | Adopted with Modifications | Incremental | 6/16/2010
New Mexico | Adopted with Modifications | Full | 10/29/2010
New York | Withdrawn | Incremental | 7/19/2010
North Carolina | Withdrawn | Full | 6/4/2010
North Dakota | Withdrawn | Full | 6/24/2010
Ohio | Adopted with Modifications | Full | 6/7/2010
Oklahoma | Withdrawn | N/A | 6/24/2010
Oregon | Adopted with Modifications | Incremental | 10/28/2010
Pennsylvania | Adopted with Modifications | Incremental | 7/1/2010
Rhode Island | Adopted Verbatim | Full | 7/1/2010
South Carolina | Withdrawn | Incremental | 7/14/2010
South Dakota | Adopted Verbatim | Incremental | 11/29/2010
Tennessee | Withdrawn | Incremental | 7/30/2010
Texas | Not Adopted | N/A | N/A
Utah | Adopted with Modifications | Incremental | 8/6/2010
Vermont | Adopted Verbatim | Incremental | 8/17/2010
Virginia | Not Adopted | N/A | N/A
Washington | Adopted Verbatim | Incremental | 6/1/2012
West Virginia | Withdrawn | Incremental | 5/12/2010
Wisconsin | Adopted Verbatim | Incremental | 6/2/2010
Wyoming | Adopted Verbatim | Full | 6/15/2012

[873] Speech: “Beyond the Bubble Tests: The Next Generation of Assessments.” By Education Secretary Arne Duncan. U.S. Department of Education, September 2, 2010. <bit.ly>

The PARCC [Partnership for Assessment of Readiness in College and Career] consortium has 26 member states. Its proposal underwent a rigorous review by a panel of peer review experts—and came out a winner. The consortium is slated to receive a $170 million award. The SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced Assessment Consortium, with 31 member states, won a $160 million award.

[874] Speech: “Beyond the Bubble Tests: The Next Generation of Assessments.” By Education Secretary Arne Duncan. U.S. Department of Education, September 2, 2010. <bit.ly>

“By the 2014–2015 school year, the assessments developed by these two winning state consortia will be in use in any state that chooses to use them.”

[875] Webpage: “States.” Partnership for Assessment of Readiness in College and Career. Accessed October 30, 2015 at <bit.ly>

In the 2014–15 school year, 5 million students in 11 states and the District of Columbia took the PARCC [Partnership for Assessment of Readiness in College and Career] annual assessments in grades 3–11, although not all participating states have students in all grades taking the test. Students in the following states took PARCC assessments in the 2014–15 school year: Arkansas, Colorado, District of Columbia, Illinois, Louisiana, Maryland, Massachusetts, Mississippi, New Jersey, New Mexico, Ohio, and Rhode Island.

[876] Article: “As Test Results Trickle In, States Still Ditching Common Core.” By Lauren Camera. U.S. News & World Report, September 21, 2015. <www.usnews.com>

“However, even that notion may be in danger, as many of the 18 states that administered the SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced test last school year have been reporting or framing results in their own ways.”

[877] Article: “As States Drop Out of PARCC’s Common Core Test, Faithful Carry On.” By Emma Brown. Washington Post, July 22, 2015. <www.washingtonpost.com>

But fewer than half of the states originally part of PARCC [Partnership for Assessment of Readiness in College and Career]—11 states and the District of Columbia—were still on board when the online tests rolled out this spring. Since then, Louisiana, Mississippi, Arkansas and Ohio all have dropped out and just seven states and the District plan to give the test in 2015–2016, raising questions about whether the consortium is in danger of completely falling apart.

[878] Article: “Missouri Drops SMARTER Balanced Common-Core Exam.” By Andrew Ujifusa. Education Week, June 3, 2015. <www.edweek.org>

“A provision of the state education budget signed by Missouri Gov. Jay Nixon cuts off funding for the SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced exam and require the state to develop a new test in English/language arts and math.”

[879] Article: “Other States Surpass California’s Test Scores.” By Maureen Magee. San Diego Union-Tribune, September 18, 2015. <www.sandiegouniontribune.com>

This year, 11 states and the District of Columbia administered PARCC [Partnership for Assessment of Readiness in College and Career] exams. Arkansas, Mississippi, and Ohio have since withdrawn from the exams. Of the 18 states that participated in the SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced test this year, at least three have decided to scrap one or all of the grade level tests.

[880] Press release: “Board of Elementary and Secondary Education Approves Path to Next-Generation MCAS.” Massachusetts Board of Elementary and Secondary Education, November 17, 2015. <www.doe.mass.edu>

The Board of Elementary and Secondary Education today voted 8–3 to transition to a next-generation MCAS [Massachusetts Comprehensive Assessment System] that would be given for the first time in spring 2017 and would use both PARCC [Partnership for Assessment of Readiness in College and Career] and MCAS items, along with items developed specifically for the Massachusetts tests. …

For spring 2016, districts that administered PARCC in spring 2015 will do so again, and the remainder of districts will continue with MCAS unless they affirmatively choose to administer PARCC. The MCAS tests in spring 2016 will be augmented with a limited number of PARCC items in order to help make statewide comparisons easier and to offer students and staff the opportunity to experience PARCC items while the new assessment is being developed.

[881] Article: “What Tests Did Each State Require in 2016–17?” By Catherine Gewertz. Education Week, February 15, 2017. <www.edweek.org>

In 2016–17, states’ testing systems have begun to settle down after several years of transition sparked by the Common Core State Standards. By now, most states have chosen not to use the PARCC [Partnership for Assessment of Readiness in College and Career] and SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced assessments, which were designed to reflect the common core. Under pressure to cut back on testing time, many opted for other, shorter tests, or chose to use the SAT [Scholastic Aptitude Test] or ACT [American College Testing] in high school instead. …

Which States Are Using PARCC or SMARTER Balanced?

California, Colorado, Connecticut, Delaware, District of Columbia, Hawaii, Idaho, Illinois, Maryland, Montana, Nevada, New Hampshire, New Jersey, New Mexico, North Dakota, Oregon, Rhode Island, South Dakota, Vermont, Washington, West Virginia …

Which States Are Using Non-Consortium Tests?

Alabama, Alaska, Arizona, Arkansas, Florida, Georgia, Indiana, Iowa, Kansas, Kentucky, Maine, Minnesota, Mississippi, Missouri, Nebraska, New York, North Carolina, Ohio, Oklahoma, Pennsylvania, South Carolina, Tennessee, Texas, Utah, Virginia, Wisconsin, Wyoming

Are Any States Mixing PARCC or SMARTER Balanced with Their Own Items?

Louisiana, Massachusetts, Michigan

[882] Webpage: “The Power of Partnerships and Collaboration.” SMARTER [Summative Multi-State Assessment Resources for Teachers and Educational Researchers] Balanced Assessment Consortium. Accessed July 11, 2023 at <smarterbalanced.org>

“Smarter Balanced Members … Bureau of Indian Education … California … Connecticut … Delaware … Hawaii … Idaho … Indiana … Michigan … Montana … Nevada … Oregon … South Dakota … U.S. Virgin Islands … Washington”

[883] Webpage: “States.” Partnership for Assessment of Readiness in College and Career. Accessed October 30, 2015 at <bit.ly>

In the 2014–15 school year, 5 million students in 11 states and the District of Columbia took the PARCC [Partnership for Assessment of Readiness in College and Career] annual assessments in grades 3–11, although not all participating states have students in all grades taking the test. Students in the following states took PARCC assessments in the 2014–15 school year: Arkansas, Colorado, District of Columbia, Illinois, Louisiana, Maryland, Massachusetts, Mississippi, New Jersey, New Mexico, Ohio, and Rhode Island.

[884] Webpage: “About.” Partnership for Assessment of Readiness in College and Career. Accessed February 22, 2019 at <bit.ly>

The Partnership for the Assessment of Readiness for College and Career (PARCC) is a collaboration of states that share a commitment to developing new-era assessments that measure students’ readiness for college and career. … The PARCC states make many of their high-quality resources available to the public through this Partner Resource Center.

PARCC and non-PARCC states may license and incorporate the PARCC summative assessment content into their state testing programs through a variety of flexible licensing options. …

In 2016, PARCC switched to a single, end of year administration. In 2017, the PARCC Governing Board selected New Meridian Corporation as the management and content development vendor for the next phase of the PARCC assessment system.

[885] Article: “Maryland Will Be the Next State to Drop the PARCC Common-Core Test.” By Stephen Sawchuk. Education Week, September 11, 2018. <www.edweek.org>

Here’s one tricky thing: The PARCC [Partnership for Assessment of Readiness in College and Career] consortium doesn’t exist in the same way it once did, with a governing board managed by states. In 2015, its leaders decided to go in a new direction, allowing states to license content like specific test questions, rather than having a rigid membership model in which member states gave the whole test. By 2017 a new contractor, New Meridian, won an RFP [request for proposal] to manage this new arrangement. …

Maryland, New Jersey, New Mexico, and the District of Columbia plan to administer the PARCC assessment, either the original or the shortened one. Two additional states, Colorado and Louisiana, blend PARCC questions with their own. … Illinois stepped away from PARCC in February but retains a license with New Meridian for 2018–19….

[886] Book: Knowing What Students Know: The Science and Design of Educational Assessment. Edited by James W. Pellegrino, Naomi Chudowsky, and Robert Glaser. National Academies Press, 2001. <www.nap.edu>

Page 39:

As described in the National Research Council (NRC) report High Stakes (1999a), policy makers see large-scale assessments of student achievement as one of their most powerful levers for influencing what happens in local schools and classrooms. Increasingly, assessments are viewed as a way not only to measure performance, but also to change it, by encouraging teachers and students to modify their practices. Assessment programs are being used to focus public attention on educational concerns; to change curriculum, instruction, and teaching practices; and to motivate educators and students to work harder and achieve at higher levels (Haertel, 1999; Linn, 2000).

A trend that merits particular attention is the growing use of state assessments to make high-stakes decisions about individual students, teachers, and schools. In 1998, 18 states required students to pass an exam before receiving a high school diploma, and 8 of these states also used assessment results to make decisions about student promotion or retention in grade (Council of Chief State School Officers, 1999). When stakes are high, it is particularly important that the inferences drawn from an assessment be valid, reliable, and fair (American Educational Research Association, American Psychological Association, and National Council on Measurement in Education, 1999; NRC, 1999a). Validity refers to the degree to which evidence and theory support the interpretations of assessment scores.

[887] Webpage: “About Us: David Coleman.” College Board. Accessed October 19, 2015 at <www.collegeboard.org>

In 2007, David left McGraw–Hill and cofounded Student Achievement Partners, a nonprofit that assembles educators and researchers to design actions based on evidence to improve student outcomes. Student Achievement Partners played a leading role in developing the Common Core State Standards in math and literacy. David left Student Achievement Partners in the fall of 2012 to become president of the College Board.

[888] “Testimony for the House Study Committee on the Role of Federal Government in Education.” By Sandra Stotsky (University of Arkansas, Department of Education Reform). Georgia General Assembly, House Study Committee on the Role of the Federal Government in Education. September 24, 2014. <bit.ly>

Page 1:

… David Coleman was allowed to mandate a 50/50 division between literary study and “informational” text at every grade level from K–12 in the ELA [English language arts] standards….

… The “lead” writers for the ELA standards, David Coleman and Susan Pimentel, had never taught reading or English in K–12 or at the college level. Neither has a doctorate in English, nor published serious work on curriculum and instruction. They were virtually unknown to English language arts educators and to higher education faculty in rhetoric, speech, composition, or literary study.

[889] Article: “How Bill Gates Pulled Off the Swift Common Core Revolution.” By Lyndsey Layton. Washington Post, June 7, 2014. <www.washingtonpost.com>

On a summer day in 2008, Gene Wilhoit, director of a national group of state school chiefs, and David Coleman, an emerging evangelist for the standards movement, spent hours in Bill Gates’s sleek headquarters near Seattle, trying to persuade him and his wife, Melinda, to turn their idea into reality. … After the meeting, weeks passed with no word. Then Wilhoit got a call: Gates was in.

[890] Transcript: “What Must Be Done in the Next Two Years.” By David Coleman. University of Pittsburgh, Learning Research and Development Center, Institute for Learning, December 9, 2011. <bit.ly>

Pages 3–4:

So if there are aspects of what I’m about to say that you disagree with and if the Common Core is kind of a pain in the ass, built on top of your other duties while funding is being cut, don’t blame me. Blame Lauren, because there is clearly no way this country would be talking about 46 states adopting a set of common standards had not a somewhat younger, still revolutionary mind thought that this nation needed a new set of standards. And for that I am most grateful to you. …

Lauren, though, has challenged me over the years with some more ideas, and some I want to put on the table for you, all in the sake of striking back. One of them is that these [Common Core] standards are worthy of nothing if the assessments built on them are not worthy of teaching to, period. This is quite a demanding charge, I might add to you, because it has within it the kind of statement—you know, “Oh, the standards were just fine, but the real work begins now in defining the assessment,” which if you were involved in the standards is a slightly exhausting statement to make.

But let’s be rather clear: we’re at the start of something here, and its promise—our top priorities in our organization, and I’ll tell you a little bit more about our organization, is to do our darnedest to ensure that the assessment is worthy of your time, is worthy of imitation. It was Lauren who propounded the great rule that I think is a statement of reality, though not a pretty one, which is teachers will teach towards the test. There is no force strong enough on this earth to prevent that. There is no amount of hand-waving, there’s no amount of saying, “They teach to the standards, not the test; we don’t do that here.” Whatever. The truth is—and if I misrepresent you, you are welcome to take the mic back. But the truth is teachers do. Tests exert an enormous effect on instructional practice, direct and indirect, and it’s hence our obligation to make tests that are worthy of that kind of attention. It is in my judgment the single most important work we have to do over the next two years to ensure that that is so, period. So when you ask me, “What do we have to do over the next years?” we gotta do that. If we do anything else over the next two years and don’t do that, we are stupid and shall be betrayed again by shallow tests that demean the quality of classroom practice, period.

[891] Webpage: “About Us: David Coleman.” College Board. Accessed October 19, 2015 at <www.collegeboard.org>

In 2007, David left McGraw–Hill and cofounded Student Achievement Partners, a nonprofit that assembles educators and researchers to design actions based on evidence to improve student outcomes. Student Achievement Partners played a leading role in developing the Common Core State Standards in math and literacy. David left Student Achievement Partners in the fall of 2012 to become president of the College Board.

[892] Webpage: “About Us.” College Board. Accessed March 14, 2022 at <about.collegeboard.org>

Each year, College Board helps more than seven million students prepare for a successful transition to college through programs and services in college readiness and college success—including the SAT [Scholastic Aptitude Test], the Advanced Placement Program, and BigFuture. The organization also serves the education community through research and advocacy on behalf of students, educators and schools.

[893] Webpage: “SAT Information and Resources.” Delaware Department of Education. Accessed January 6, 2017 at <www.doe.k12.de.us>

While high school grades are a very useful indicator of how students will perform in college, there is great variation in grading standards and course rigor within and across high schools. More than 80 years ago, the College Board created the first standardized college entrance test to help colleges and universities identify students who could succeed at their institutions and to connect students with educational opportunities beyond high school.

Today, the SAT [Scholastic Aptitude Test] is the benchmark standardized assessment of the critical reading, mathematical reasoning, and writing skills students have developed over time and that they need to be successful in college. Each year, more than two million students take the SAT. Nearly every college in America uses the test as a common and objective scale for evaluating a student’s college readiness.

[894] Webpage: “AP Courses.” College Board. Accessed October 19, 2015 at <apcentral.collegeboard.org>

AP [Advanced Placement] offers more than 30 courses across multiple subject areas. Each course is developed by a committee composed of higher education faculty and expert AP teachers who ensure that the course reflects college- and university-level expectations. These committees define the scope and goals of the AP course, articulating what students should know and be able to do upon completing it. …

AP courses are taught by highly qualified high school teachers who use the AP Course Descriptions to guide them. The course descriptions outline the course content, describe the curricular goals of the subject, and provide sample exam questions. While the course descriptions are a significant source of information about the course content on which the AP Exams will be based, AP teachers have the flexibility to determine how this content is presented.

[895] “An Announcement from College Board President David Coleman Regarding the SAT.” By David Coleman, February 26, 2013. <www.facebook.com>

In the months ahead, the College Board will begin an effort in collaboration with its membership to redesign the SAT [Scholastic Aptitude Test] so that it better meets the needs of students, schools, and colleges at all levels. We will develop an assessment that mirrors the work that students will do in college so that they will practice the work they need to do to complete college. An improved SAT will strongly focus on the core knowledge and skills that evidence shows are most important to prepare students for the rigors of college and career.

[896] “College Board Guide to Implementing the Redesigned SAT; Installment 2: Conversation Guides for Talking About the Assessment Redesign.” College Board, October 2014. <www.collegeboard.org>

Page 23:

The Redesigned SAT’s Relationship to the Common Core State Standards (CCSS)

The redesign of the SAT [Scholastic Aptitude Test] was driven by two key considerations: what the best available evidence indicates are the essential skills and knowledge for college and career readiness and success, and what would improve the test’s alignment to best instructional practices.

In gathering that evidence, we consulted numerous sources of evidence, including our own research, as well as feedback systematically collected from our partners in K–12 and higher education.

Is the SAT Aligned to the Common Core?

The redesigned SAT measures the skills and knowledge that evidence shows are essential for college and career success. It is not aligned to any single set of standards.

What Standards Draw From the Same Evidence Base?

In addition to the redesigned SAT, evidence-based college- and career-readiness skills are fundamental to state academic standards including the Common Core, the Texas Essential Knowledge and Skills, and the Virginia Standards for Learning—as well as in the best college-prep curricula.

[897] Webpage: “Compare SAT Specifications.” College Board. Accessed January 6, 2017 at <satsuite.collegeboard.org>

“Get an overview of how the new SAT [Scholastic Aptitude Test] differs from the one students took before March 2016.”

[898] Book: Encyclopedia of Education Economics & Finance. Edited by Dominic J. Brewer and Lawrence O. Picus. Sage, 2014. Article: “Homeschooling.” By Charisee Gulosino and Yongmei Ni. Pages 386–389.

Page 386: “Homeschooling, the oldest form of schooling, was the norm until the introduction of the common school movement in the 1840s.”

[899] Webpage: “Horace Mann.” PBS. Accessed July 9, 2015 at <www.pbs.org>

Horace Mann, often called the Father of the Common School, began his career as a lawyer and legislator. When he was elected to act as Secretary of the newly-created Massachusetts Board of Education in 1837, he used his position to enact major educational reform. He spearheaded the Common School Movement, ensuring that every child could receive a basic education funded by local taxes. His influence soon spread beyond Massachusetts as more states took up the idea of universal schooling.

[900] Article: “Mann, Horace.” Encyclopædia Britannica Ultimate Reference Suite 2004.

“U.S. educator, the first great American advocate of public education, who believed that, in a democratic society, education should be free and universal, nonsectarian, democratic in method, and reliant on well-trained, professional teachers.”

[901] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvi: “Diagonally across from traditional public schools, the upper right quadrant shows several forms of privately-funded and privately-operated schooling. The first is perhaps the most private form of education, self-schooling, exemplified by the famous autodidact President Abraham Lincoln.”

[902] Dataset: “Table 206.10. Number and Percentage of Homeschooled Students Ages 5 Through 17 with a Grade Equivalent of Kindergarten Through 12th Grade, by Selected Child, Parent, and Household Characteristics: Selected Years, 1999 Through 2019.” U.S. Department of Education, National Center for Education Statistics, December 2021. <nces.ed.gov>

“2019 … Number home-schooled (in thousands) [=] 1,457 … Percent home-schooled [=] 2.8 … Note: Data in 2019 excludes students who were enrolled for more than 24 hours a week…. Data for all years also exclude students who were homeschooled only due to a temporary illness.”

[903] Handbook of Research on School Choice. Edited by Mark Berends and others. Routledge, 2009.

Page xvii: “Perhaps surprisingly, an estimated one million youngsters (see the homeschooling chapter in this book) are now schooled at home. (Again, such categorization isn’t precise since some primarily homeschooled students take supplementary classes and play sports in local public schools and colleges.)”

[904] Book: Encyclopedia of Education Economics & Finance. Edited by Dominic J. Brewer and Lawrence O. Picus. Sage, 2014. Article: “Homeschooling.” By Charisee Gulosino and Yongmei Ni. Pages 386–389.

Page 389: “With the advent of publicly funded virtual schools, homeschooled students can now access a blended or hybrid model of homeschooling, combining elements of both homeschooling and traditional schooling. This hybrid model is a means of pursuing education from home while simultaneously being matriculated in a public school.”

[905] Dataset: “Table 206.10. Number and Percentage of Homeschooled Students Ages 5 Through 17 with a Grade Equivalent of Kindergarten Through 12th Grade, by Selected Child, Parent, and Household Characteristics: Selected Years, 1999 Through 2019.” U.S. Department of Education, National Center for Education Statistics, December 2021. <nces.ed.gov>

“2019 … Percent home-schooled … Highest education level of parents/guardians … High school diploma or less [=] 2.2% … Vocational/technical, associate’s degree, or some college [=] 2.9% … Bachelor’s degree/some graduate school [=] 3.3% … Graduate/professional degree [=] 3.1%”

[906] Book: The Principal’s Quick-Reference Guide to School Law (3rd edition). By Robert F. Hachiya, Robert J. Shoop, and Dennis R. Dunklee. Corwin/SAGE Publications, 2014.

Page 246:

While home schooling is permitted in all fifty states, there is considerable variation in how each state regulates this type of instruction. Regardless of the specifics of the various statutes, responsibility for ensuring that the quality of home instruction meets at least the minimum requirements prescribed by state law rests with local school superintendents. The majority of states require that parents notify the State Department of Education or the local school district that they plan to homeschool their child. However, other specific regulations vary widely from state to state. When issues relating to home instruction find their way into the courts, they generally focus on either (a) interpretations of the statute or (b) the rights of homeschooled students to receive some of the benefits, i.e. participation in coursework, available through the public school system.

[907] Book: Encyclopedia of Education Economics & Finance. Edited by Dominic J. Brewer and Lawrence O. Picus. Sage, 2014. Article: “Homeschooling.” By Charisee Gulosino and Yongmei Ni. Pages 386–389.

Page 386:

Homeschooling is legal in all fifty states. In many states, families must sign a release exempting their children from compulsory public school attendance. The release also frees the local public school from any obligation to educate a homeschooled child, but it may allow, at the discretion of the school, partial access to classes, activities, and sports. Although homeschool children are not eligible for local, state, or federal funds, a handful of states have regulations and policies in place to ensure access to education-related resources as well as compliance with academic content standards, standardized tests, attendance, and record keeping.

[908] Book: Educational Administration: Concepts and Practices (6th edition). By Fred Lunenburg and Allan Ornstein. Wadsworth Cengage Learning, 2012.

Page 343:

Parents or guardians who select one of the options to public school instruction must obtain equivalent instruction. For example, the Washington Supreme Court held that home instruction did not satisfy that state’s compulsory attendance law, for the parents who were teaching the children did not hold a valid teaching certificate.69 In its decision, the court described four essential elements of a school: a certified teacher, pupils of school age, an institution established to instruct school-age children, and a required program of studies (curriculum) engaged in for the full school term and approved by the state board of education. Subsequently, statutes establishing requirements for equivalent instruction (such as certified teachers, program of studies, time devoted to instruction, school-age children, and place or institution) generally have been sustained by the courts.70

[909] Paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 20:

To determine whether there is a difference in achievement for students in households where at least one parent had ever held a state-issued teaching certificate, parent education level was controlled. As shown in Table 17, the achievement levels across groups are remarkably similar. Controlling for parent education level, there is a significant difference in the achievement levels of homeschool students whose parents are certified and those that are not (F=11.155; df=11,10141; p=.001); the students having neither parent ever certified performed slightly better. Although statistically significant, whether either parent has ever been a certified teacher explains less than one-tenth of 1% of the variance in test scores. There is no significant interaction of parent certification status and grade (F =.274; df=10,10141; p=.987, n.s.).

[910] Handbook of Research in Education Finance and Policy. Edited by Helen F. Ladd and Edward B. Fiske. Routledge, 2008.

Chapter 26: “Home Schooling.” By Clive Belfield. Pages 467–474.

Page 468: “Finally, home-schooling is not distinctly American; most countries expressly permit or tolerate homeschooling.”

[911] Paper: “Home Education in Germany: An Overview of the Contemporary Situation.” By Thomas Spiegler. Evaluation and Research in Education, December 22, 2008. Pages 179–190. <www.tandfonline.com>

Page 179:

Compulsory school attendance exists in Germany and home education is not allowed. Contraventions are regarded as an administrative or an indictable offence. Nevertheless, about 500 children are home educated. This takes place in secret, with tacit toleration by the local authorities or with legal consequences, ranging from a fine to partial loss of child custody, or even the possibility of a prison sentence.

Page 180:

Until 1919, Germany had compulsory education, which could be fulfilled by private tuition or home education (Avenarius, 2000: 450). The first obligatory compulsory school attendance arose in the Weimar Republic. But even the primary school law of the German Reich (Reichsgrundschulgesetz) from 1920 included a special regulation which was used a lot to maintain the possibility of private tuition (Nave, 1980: 141). Only the law about compulsory school attendance from 1938 (Reichsschulpflichtgesetz) was the first general regulation in the German Reich without exceptions and with criminal consequences in case of contraventions (Habermalz, 2001: 218). This law had considerable influence on the formation of the contemporary laws about compulsory school attendance in the German Länder [states].

Legal Position Regarding Home Education

As mentioned above, it is only possible to fulfil compulsory school attendance by attending a public or a state-approved private school. Home education is not accepted as a reason for exemption from regular school attendance. Furthermore, it is stressed in several points that religious beliefs are not to serve as a basis for an excuse from compulsory schooling (Achilles, 2003; Avenarius, 2000: 453).

NOTE: The full paper is available at <www.homeschooling-forschung.de>

[912] Article: “Home-School Germans Flee to UK.” By Charlie Francis-Pape and Allan Hall. London Guardian, February 23, 2008. <www.theguardian.com>

Home-schooling has been illegal in Germany since it was outlawed in 1938. Hitler wanted the Nazi state to have complete control of young minds. Today there are rare exemptions, such as for children suffering serious illnesses or psychological problems. Legal attempts through the courts—including the European Court of Human Rights—have so far failed to overturn the ban.

Klaus Landahl, 41, who moved in January from the Black Forest in Germany to the Isle of Wight with his wife, Kathrin, 39, said they had no option but to leave their home, friends and belongings in order to educate their five children, aged between three and 12, legally and without fear. “It feels like persecution,” he said. “We had to get to safety to protect our family. We can never go back. If we do, our children will be removed, as the German government says they are the property of the state now.”

[913] Article: “Home Schooling German Family Allowed to Stay in US.” By Ben Waldron. ABC News, March 5, 2014. <abcnews.go.com>

Michael Donnelly of the Home School Legal Defense Association told ABC News his office received a call from the Department of Homeland Security informing him that Uwe and Hannelore Romeike and their seven children will not be deported. …

The Romeikes were initially granted asylum by a Memphis judge who believed that Germany had unfairly restricted the family’s religious freedom. That decision was challenged and overturned by the Obama administration on appeal, which argued that Germany’s home schooling ban did not constitute religious persecution and could not be used as a basis for asylum in the United States.

The family has had its “deferred action,” status extended indefinitely, which means as long as the Romeikes stay out of trouble and stay in contact with the Department of Homeland Security, they won’t be deported, Donnelly said.

[914] Paper: “Free to Learn: The Rationale for Legalizing Homeschooling in Albania.” By Timothy Hagen. Central European Journal of Public Policy, 2011. Pages 50–84. <www.researchgate.net>

Page 53:

Some countries, such as Bulgaria, Germany, the Netherlands, and Greece, do not allow homeschooling except in special circumstances. Other European nations, including Belgium, Denmark, Estonia, Finland, France, Ireland, Italy, Norway, Portugal, and Sweden, all allow homeschooling but with moderate to high levels of regulation and inspection (Petrie 2001; Blok and Karsten 2011).

[915] Report: “1.5 Million Homeschooled Students in the United States in 2007.” U.S. Department of Education, National Center for Education Statistics, December 2008. <nces.ed.gov>

Page 2: “Figure 2. Percentage and Confidence Interval Estimates of Homeschooled Students, Ages 5 Through 17 with a Grade Equivalent of Kindergarten Through 12th Grade, Whose Parents Reported Various Reasons for Homeschooling: 2003 and 2007.”

[916] Paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 4: “Students were included in the study if a parent affirmed that his or her student was ‘… taught at home within the past 12 months by his/her parent for at least 51% of the time in the grade level now being tested.’”

Page 7: “The target population was all families in the United States who were educating their school-age children at home and having standardized achievement tests administered to their children. … A total of 11,739 students provided usable questionnaires with corresponding achievement tests.”

[917] Email from Brian D. Ray to Just Facts, May 12, 2015.

“The median amount spent per this one year on the student’s education for textbooks, lesson materials, tutoring, enrichment services, testing, counseling, evaluation, and so forth is $400 to $599. Here is the frequency list regarding the answers. …”

[918] Calculated with data from the paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 7:

It was very challenging to calculate the response rate. One of the main problems was that, well into the study, it was discovered that many of the large-group test administrators were not communicating to their constituent homeschool families that they had been invited to participate in the study. Based on the best evidence available, the response rate was a minimum of 19% for the four main testing services with whom the study was originally planned, who worked fairly hard to get a good response from the homeschooled families, and whose students accounted for 71.5% (n = 8,397) of the participants in the study. That is, of the students who were tested and whose parents were invited to participate in the study, both test scores and survey responses were received for this group. It is possible that the response rate was higher, perhaps as much as 25% to these four testing services. For the other testing services and sources of data, the response rate was notably lower, at an estimated 11.0%. These testing services and other sources of data used a less-concentrated approach to soliciting participation and following-up with reminders to secure participation.

NOTES:

  • An Excel file containing the data and calculations is available upon request.
  • For facts about what constitutes a scientific survey and the factors that impact their accuracy, visit Just Facts’ research on Deconstructing Polls & Surveys.
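The figures quoted above imply a blended overall response rate. This is a hypothetical back-of-envelope sketch, not the study's own calculation (which is in the Excel file mentioned above): the four main testing services supplied 8,397 participants at a minimum 19% response rate, and the remaining sources supplied the rest at roughly 11%.

```python
# Back-of-envelope blended response rate implied by the figures quoted
# above (an illustration under stated assumptions, not the study's own
# calculation).
total_participants = 11_739
main_participants = 8_397                 # 71.5% of usable responses
main_rate = 0.19                          # minimum estimated response rate
other_participants = total_participants - main_participants   # 3,342
other_rate = 0.11                         # estimated rate for other sources

# Estimate how many families were invited by each channel, then blend.
invited = main_participants / main_rate + other_participants / other_rate
blended_rate = total_participants / invited
print(f"{blended_rate:.1%}")  # roughly 15.7%
```

Because the 19% figure is a stated minimum, this blended estimate is itself a floor.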

[919] Textbook: Mind on Statistics (4th edition). By Jessica M. Utts and Robert F. Heckard. Brooks/Cole Cengage Learning, 2012.

Pages 164–165:

Surveys that simply use those who respond voluntarily are sure to be biased in favor of those with strong opinions or with time on their hands. …

According to a poll taken among scientists and reported in the prestigious journal Science … scientists don’t have much faith in either the public or the media. … It isn’t until the end of the article that we learn who responded: “The study reported a 34% response rate among scientists, and the typical respondent was a white, male physical scientist over the age of 50 doing basic research.” … With only about a third of those contacted responding, it is inappropriate to generalize these findings and conclude that most scientists have so little faith in the public and the media.

[920] Textbook: Sampling: Design And Analysis (2nd edition). By Sharon L. Lohr. Brooks/Cole Cengage Learning, 2010.

Pages 5–6:

The following examples indicate some ways in which selection bias can occur. …

… Nonresponse distorts the results of many surveys, even sources that are carefully designed to minimize other sources of selection bias. Often, nonrespondents differ critically from the respondents, but the extent of that difference is unknown unless you can later obtain information about the nonrespondents. Many surveys reported in newspapers or research journals have dismal response rates—in some, the response rate is as low as 10%. It is difficult to see how results can be generalized to the population when 90% of the targeted sample cannot be reached or refuses to participate.

[921] Paper: “Response Rates to Mail Surveys Published in Medical Journals.” By David A. Asch and others. Journal of Clinical Epidemiology, 1997. Pages 1129–1136. <www.jclinepi.com>

Page 1129:

The purpose of this study was to characterize response rates for mail surveys published in medical journals…. The mean response rate among mail surveys published in medical journals is approximately 60%. However, response rates vary according to subject studied and techniques used. Published surveys of physicians have a mean response rate of only 54%, and those of non-physicians have a mean response rate of 68%. … Although several mail survey techniques are associated with higher response rates, response rates to published mail surveys tend to be moderate. However, a survey’s response rate is at best an indirect indication of the extent of non-respondent bias. Investigators, journal editors, and readers should devote more attention to assessments of bias, and less to specific response rate thresholds.

Page 1135:

The level of art and interpretation in calculating response rates reflects the indirect and therefore limited use of the response rate in evaluating survey results. So long as one has sufficient cases for statistical analyses, non-response to surveys is a problem only because of the possibility that respondents differ in a meaningful way from non-respondents, thus biasing the results.22,23 Although there are more opportunities for non-response bias when response rates are low than high, there is no necessary relationship between response rates and bias. Surveys with very low response rates may provide a representative sample of the population of interest, and surveys with high response rates may not.

Nevertheless, because it is so easy to measure response rates, and so difficult to identify bias, response rates are a conventional proxy for assessments of bias. In general, investigators do not seem to help editors and readers in this regard. As we report, most published surveys make no mention of attempts to ascertain non-respondent bias. Similarly, some editors and readers may discredit the results of a survey with a low response rate even if specific tests limit the extent or possibility of this bias.

[922] Paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 27:

• The median income for home-educating families ($75,000 to $79,999) was similar to all married-couple families nationwide with one or more related children under age 18 (median income $74,049 in 2006 dollars; or roughly $78,490 in 2008 dollars).

• Homeschool parents have more formal education than parents in the general population; 66.3% of the fathers and 62.5% of the mothers had a college degree (i.e., bachelor’s degree) or a higher educational attainment. In 2007, 29.5% of all adult males nationwide ages 25 and over had finished college and 28.0% of females had done so.

• These homeschool families are notably larger—68.1% have three or more children—than families nationwide.

• The percent of homeschool students in this study who are White/not-Hispanic (91.7%) is disproportionately high compared to public school students nationwide.

• Almost all homeschool students (97.9%) are in married couple families. Most homeschool mothers (81%) do not participate in the labor force; almost all homeschool fathers (97.6%) do work for pay.

NOTE: For a rough point of comparison, the next footnote shows how many children in the general population lived with two married parents in 2012.

[923] Report: “America’s Families and Living Arrangements: 2012.” By Jonathan Vespa, Jamie M. Lewis, and Rose M. Kreider. U.S. Census Bureau, August 2013. <www.census.gov>

Page 25:

Table 10. Children’s Economic Situation by Family Structure: CPS 2012¹ (Numbers in thousands)

Total [=] 73,817 … Living with two parents … Married [=] 47,330

¹ All people under age 18, excluding group quarters, householders, subfamily reference people, and their spouses or unmarried partners

CALCULATION: 47,330 / 73,817 = 64%

[924] Webpage: “CPI Inflation Calculator.” United States Department of Labor, Bureau of Labor Statistics. Accessed July 11, 2023. <www.bls.gov>

$400 in August 2007 has the same buying power as $575.56 in January 2023

$599 in August 2007 has the same buying power as $861.90 in January 2023

About the CPI Inflation Calculator

The CPI inflation calculator uses the Consumer Price Index for All Urban Consumers (CPI-U) U.S. city average series for all items, not seasonally adjusted. This data represents changes in the prices of all goods and services purchased for consumption by urban households.
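The adjustment the BLS calculator performs is a simple ratio of index values: a dollar amount is scaled by the new period's CPI-U divided by the old period's. As a sketch, the index values below are the published CPI-U (all items, not seasonally adjusted) figures for those months as best known; verify against bls.gov before relying on them. They reproduce the $575.56 and $861.90 figures quoted above.

```python
# CPI-U index values (all items, U.S. city average, not seasonally
# adjusted). Assumed published figures for these months; verify at bls.gov.
CPI_AUG_2007 = 207.917
CPI_JAN_2023 = 299.170

def adjust_for_inflation(dollars: float, cpi_old: float, cpi_new: float) -> float:
    """Convert a dollar amount from one period's prices to another's,
    as the BLS CPI Inflation Calculator does: new = old * (CPI_new / CPI_old)."""
    return round(dollars * cpi_new / cpi_old, 2)

print(adjust_for_inflation(400, CPI_AUG_2007, CPI_JAN_2023))  # 575.56
print(adjust_for_inflation(599, CPI_AUG_2007, CPI_JAN_2023))  # 861.9
```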

[925] Paper: “What Have We Learned About Homeschooling?” By Eric J. Isenberg. Peabody Journal of Education, December 5, 2007. Pages 387–409. <www.tandfonline.com>

Page 398:

Parents make school choice decisions based on preferences, the quality of local schools, and constraints of income and available leisure time. Separating the causal effect of each variable on school choice requires holding the others constant. For instance, if two families with identical preferences, income, and leisure time choose different schools, the difference can be ascribed to the local education market. Families who live in the same area with the same time and income constraints but who choose different schools must have different preferences.

Page 404:

Using aggregate data or child-level data, there is some evidence that poorer academic quality of public schools and decreased choice of private schools both contribute to an increase in homeschooling. Isenberg (2003) used test score data to measure academic school quality in Wisconsin. The results indicate that in small towns, a decrease in math test scores in a school district increases the likelihood of homeschooling. The magnitude of this effect is significant. A decrease in math scores from the 1 standard deviation above the mean to 1 standard deviation below the mean increases homeschooling by 29%, from 1.9 percentage points to 2.4 percentage points, all else equal. A decrease from 2 standard deviations above to 2 standard deviations below increases homeschooling by 65%, from 1.6 percentage points to 2.7 percentage points.

Page 405:

If parents are dissatisfied with the public schools for academic, religious, or other reasons, they must choose between homeschooling and private schooling. Private school has tuition costs; homeschooling has opportunity costs of time. Isenberg (2006) showed the ways in which mothers are motivated by the amount of disposable time they have, the opportunity cost of time, and income constraints. The results are summarized in Table 3.

If a mother has preschool children as well as a school-age child, she is predisposed to stay home, decrease her work hours, or even stay out of the labor force entirely and therefore more likely to homeschool. Of course, small children require a great deal of time to care for, but this pull on a mother’s time is dominated by the incentive to withdraw from the labor force, freeing daytime hours and eliminating commute time, thereby increasing the likelihood of homeschooling. All else equal, having a preschool child younger than 3 years old increases the probability of homeschooling a school-age sibling by 1.2 percentage points; a toddler age 3 to 6 increases the probability of homeschooling by 0.5 percentage points.

Having school-age siblings also increases the likelihood that a child is homeschooled. Each additional sibling beyond the first sibling increases the probability that a particular child is homeschooled. All else equal, a child with two other school-age siblings is 1.2 percentage points more likely to be homeschooled than a child with one school-age sibling, and a child with three or more siblings in school is an additional 1.7 percentage points more likely to be homeschooled than a child with two siblings. There appear to be economies of scale in homeschooling.

The presence of other adults in the household also has a significant effect on the likelihood of homeschooling. This may be because these extra adults take over household tasks, giving the mother more disposable time. Other adults in the household, including but not limited to a husband, increase the likelihood of homeschooling by 0.5 percentage points per extra adult.

[926] Paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 5:

The standardized academic achievement tests most used in this study were the Iowa Tests of Basic Skills (ITBS, Form A) and California Achievement Tests (CAT). The ITBS is published by Riverside Publishing Company. …

Several organizations in the United States provide assessment (testing) services to homeschool families and their students on a fee-for-service basis. Several of these cooperated with the researcher in the present study to gather achievement test and demographic data on the students.

Page 7:

The target population was all families in the United States who were educating their school-age children at home and having standardized achievement tests administered to their children. …

The researcher began with four notably large testing services that work with families nationwide, and then included a few more smaller testing services in the study. The expectation was this approach would provide a more robust sampling by utilizing several testing services from across the nation.

Pages 25–26:

The scores of all students tested by three of the four major testing services were sent to the researcher. The scores of these students, a total of 22,584, nearly all of whom were home educated, are presented in Table 22. (That is, the testing services reported that a tiny minority might have been taught in small private schools.) These comprise the scores of both those who participated and those who did not participate in the present study.

Table 22

Mean z-Scores and Corresponding National Percentile by Subtest for All Students From Three Major Testing Services (i.e., Participants and Non-Participants)

Subject          N       Mean z   Std. Deviation   National Percentile
Reading          22362   1.1150   0.83183          87
Language         22515   0.8744   0.88439          81
Math             22343   0.8358   0.90915          80
Science          12830   0.8985   0.80392          82
Social Studies   12814   0.8526   0.86598          80
Core             21445   1.1038   0.85266          84
Composite        12602   0.9537   0.83149          83

Page 28: “Developing a sample from the widest source ever of homeschool student test scores, this study offers plentiful information concerning the students’ demographics and achievement.”
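The national percentiles in Table 22 correspond approximately to the standard normal cumulative distribution function evaluated at the mean z-scores. This is an approximation for illustration only: the testing services' own norm tables need not be exactly normal, and at least one row (Core) deviates from this simple mapping.

```python
import math

def z_to_percentile(z: float) -> int:
    """Percentile rank implied by a z-score under the standard normal CDF,
    computed via the error function: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))."""
    return round(100 * 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Mean z-scores from Table 22; most rows match the published percentiles.
print(z_to_percentile(1.1150))  # Reading: 87
print(z_to_percentile(0.8358))  # Math: 80
```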

[927] Webpage: “About NHERI.” National Home Education Research Institute. Accessed October 4, 2018 at <www.nheri.org>

NHERI [National Home Education Research Institute] conducts and collects research about homeschooling (home-based education, home schooling), and publishes the research journal called the Home School Researcher. …

Brian D. Ray, Ph.D. and others founded the institute in 1990 as a 501(c)3 non-profit research organization, and is the president of the institute. He holds his Ph.D. in science education from Oregon State University, his M.S. in zoology from Ohio University, and his B.S. in biology from the University of Puget Sound. Dr. Ray has been a middle school and high school classroom teacher in both public and private schools, an undergraduate college professor, and a university professor at the graduate level. He is a leading international expert with regard to homeschool (home school, home education) research. Dr. Ray executes and publishes research, speaks to the public, testifies before legislators, and serves as an expert witness in courts.

[928] Paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>

Page 29:

As previously noted, the results of the present study are consistent with preceding studies of the academic achievement of homeschool students (Ray, 1990, 1994, 1997, 2000; Rudner, 1999; van Pelt, 2003). The above-average nature of these achievement test scores is also consistent with state-provided data in states that have mandated or used testing of the home educated (for example, Alaska Department of Education, 1993; Arkansas Department of Education, 1998; Oregon Department of Education, 1999). Comparisons between home-educated students and institutional school students nationwide should, however, be interpreted with thoughtfulness and care. As stated at the beginning of this report, this is a nationwide cross-sectional, descriptive study (Johnson, 2001). It is not an experiment and readers should be careful about assigning causation to anything.

One could say, as Rudner (1999) wrote: “This study simply shows that those parents choosing to make a commitment to home schooling are able to provide a very successful academic environment.” On the other hand, it may be that something about the typical nature and practice of home-based education causes higher academic achievement, on average, than does institutional state-run schooling (Ray, 1997; 2000, p. 91–100; 2005). Similar to what Holt (1983) suggested nearly three decades ago, academic leaders could entertain this possibility and consider what those ingredients might be, and how the theoreticians and practitioners involved in conventional institutional schools might be informed by their counterparts in the parent-led home-based education community.

[929] Paper: “Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide.” By Martin Schlotter, Guido Schwerdt, and Ludger Woessmann. Education Economics, January 2011. <www.tandfonline.com>

Page 132:

In medical research, experimental evaluation techniques are a well-accepted standard device to learn what works and what does not. No one would treat large numbers of people with a certain medication unless it has been shown to work. Experimental and quasi-experimental studies are the best way to reach such an assessment. It is hoped that a similar comprehension is reached in education so that future education policies and practices will be able to better serve the students.

NOTE: See Just Facts’ detailed documentation about the importance of experimental studies and the pitfalls of observational studies.

[930] Book: Encyclopedia of Education Economics & Finance. Edited by Dominic J. Brewer and Lawrence O. Picus. Sage, 2014. Article: “Homeschooling.” By Charisee Gulosino and Yongmei Ni. Pages 386–389.

Pages 388–389:

More rigorous empirical work is needed regarding the “black box” of homeschooling before definitive conclusions are drawn.

At issue are several limitations for the study of homeschooler outcomes. First, there has been no empirical study thus far based on data obtained from a random sample of all homeschoolers. This means that the findings cannot be generalized from the study samples to the entire homeschooling population. …

In addition, reliance on self-reporting of scores from tests administered by the parent weakens the validity, the reliability, and the potential to generalize about student achievement. The same considerations apply to other measures and outcomes of homeschooling, such as civic socialization.

NOTE: The standardized test results from the 2010 study in Academic Leadership are not vulnerable to the criticism of using parent-administered or self-reported test scores, because testing services administered the tests, and the researcher obtained the scores directly from the testing services:

Several organizations in the United States provide assessment (testing) services to homeschool families and their students on a fee-for-service basis. Several of these cooperated with the researcher in the present study to gather achievement test and demographic data on the students. …

Electronic copy of the test results and survey questionnaire results were sent from the testing services and the online survey administrator to the researcher.

[Paper: “Academic Achievement and Demographic Traits of Homeschool Students: A Nationwide Study.” By Brian D. Ray. Academic Leadership, Winter 2010. <www.nheri.org>. Pages 5–6.]

[931] Report: “Keeping Pace with K–12 Digital Learning: An Annual Review of Policy and Practice.” By John Watson and others. Evergreen Education Group, November 3, 2014. <static1.squarespace.com>

Page 176:

Digital learning is any instructional practice in or out of school that uses digital technology to strengthen a student’s learning experience and improve educational outcomes. Our use of the term is broad and not limited to online, blended, and related learning. It encompasses a wide range of digital tools and practices, including instructional content, interactions, data and assessment systems, learning platforms, online courses, adaptive software, personal learning enabling technologies, and student data management systems to provide timely and rich data to guide personalized learning.

[932] Paper: “The Effectiveness of Online and Blended Learning: A Meta-Analysis of the Empirical Literature.” By Barbara Means and others. Teachers College Record, March 2013. <agronomy.unl.edu>

Page 3:

Frequently, the motivation for online learning programs entails (1) increasing the availability of learning experiences for learners who cannot or choose not to attend traditional face-to-face offerings, (2) assembling and disseminating instructional content more cost-efficiently, and/or (3) providing access to qualified instructors to learners in places where such instructors are not available. Online learning advocates argue further that additional reasons for embracing this medium of instruction include current technology’s support of a degree of interactivity, social networking, collaboration, and reflection that can enhance learning relative to normal classroom conditions.

[933] Report: “Keeping Pace with K–12 Digital Learning: An Annual Review of Policy and Practice.” By John Watson and others. Evergreen Education Group, November 3, 2014. <static1.squarespace.com>

Page 16:

In California, county offices of education operate independent study programs that are often heavily based on online courses or other digital content and tools. Independent study programs are not tied to seat-time restrictions in California, so they usually require limited attendance at a physical school. About 330,000 students are in independent study programs across the state. No data exist about the extent of the use of digital tools and content within these programs, but most such programs are using at least some digital instruction. In some cases independent study programs are mostly online. These alternative education programs are recognized by the state as schools (i.e. they have a school code and receive an Academic Performance Index score.)

Page 58:

Course choice fills a critical need for students who do not have access to a wide range of courses—or access to a specific course they are seeking—within their school. Many schools lack advanced courses in math and science, challenging electives, and world language courses. …

Online courses can fill the gaps for these students who are attending schools without a wide range of available courses. In addition, some students prefer to take a course online in order to create flexibility in their schedules, perhaps to meet the time demands of a job, sport, or other extracurricular activity.

Page 176: “Online learning: Teacher-led education that takes place over the Internet, with the teacher and student separated geographically, using an online instructional delivery system. It may be accessed from multiple settings (in school and/or out of school buildings).”

[934] Report: “Data Snapshot: College and Career Readiness.” U.S. Department of Education Office for Civil Rights, March 21, 2014. <www2.ed.gov>

Page 1:

Nationwide, only 50% of high schools offer calculus, and only 63% offer physics.

… Nationwide, between 10–25% of high schools do not offer more than one of the core courses in the typical sequence of high school math and science education—such as Algebra I and II, geometry, biology, and chemistry.

… A quarter of high schools with the highest percentage of black and Latino students do not offer Algebra II; a third of these schools do not offer chemistry. Fewer than half of American Indian and Native-Alaskan high school students have access to the full range of math and science courses in their high school.

Page 8:

Eighty-one percent (81%) of Asian-American high school students and 71% of white high school students attend high schools where the full range of math and science courses are offered (Algebra I, geometry, Algebra II, calculus, biology, chemistry, physics). However, fewer than half of American Indian and Native-Alaskan high school students have access to the full range of math and science courses in their high schools.

[935] Book: Encyclopedia of Education Economics & Finance. Edited by Dominic J. Brewer and Lawrence O. Picus. Sage, 2014. Article: “Homeschooling.” By Charisee Gulosino and Yongmei Ni. Pages 386–389.

Page 389: “With the advent of publicly funded virtual schools, homeschooled students can now access a blended or hybrid model of homeschooling, combining elements of both homeschooling and traditional schooling. This hybrid model is a means of pursuing education from home while simultaneously being matriculated in a public school.”

[936] Paper: “The Effectiveness of Online and Blended Learning: A Meta-Analysis of the Empirical Literature.” By Barbara Means and others. Teachers College Record, March 2013. <agronomy.unl.edu>

Page 3:

Online learning has become popular because of its potential for providing more flexible access to content and instruction at any time, from any place. Frequently, the motivation for online learning programs entails (1) increasing the availability of learning experiences for learners who cannot or choose not to attend traditional face-to-face offerings, (2) assembling and disseminating instructional content more cost-efficiently, and/or (3) providing access to qualified instructors to learners in places where such instructors are not available.

Page 4:

[T]oday’s online learning applications … can take advantage of a wide range of web resources, including web-based applications (for example, audio/video streaming, learning management systems, 3D simulations and visualizations, multiuser games) and new collaboration and communication technologies (for example, Internet telephony, chat, wikis, blogs, screen sharing, shared graphical whiteboards).

[937] Report: “Keeping Pace with K–12 Digital Learning: An Annual Review of Policy and Practice.” By John Watson and others. Evergreen Education Group, November 3, 2014. <static1.squarespace.com>

Pages 10–11: “In some cases the district offers enough online courses to provide a student’s entire education online for hospitalized, homebound, pregnant, incarcerated, or other students in similar uncommon circumstances.”

Page 176:

State virtual schools are created by legislation or by a state-level agency, and/or administered by a state education agency, and/or receive state appropriation or grant funding for the purpose of providing online learning opportunities across the state. They also may charge course fees to help cover costs. …

Fully online schools, also called cyber schools and virtual schools, work with students who are enrolled primarily (often only) in the online school. Online schools typically are responsible for ensuring their students take state assessments, and for their students’ scores on those assessments.

[938] Book: The SAGE Encyclopedia of Educational Technology. Edited by J. Michael Spector. Sage Publications, 2015. Article: “Adaptive Learning Software and Platforms.” By Dr. Kinshuk. Pages 7–10.

Page 7:

Adaptive learning software and platforms refer to the provision of technological solutions that cater to the individual needs of different types of learners. Even in earlier times when education was largely seen as restricted to formal institution, with students coming from relatively homogeneous backgrounds, individual differences in the ways students learn effectively required different instructional strategies. …

It is very difficult if not impossible to meet the different needs of all students in traditional classroom-based instructional environments, where implementing different teaching strategies becomes prohibitively overburdening for the teachers, especially in classes with a large number of students. Adaptive learning software and platforms aim at providing instruction that fits the individual needs of students.

Page 8:

With appropriate learner profiling, such systems customize (or adapt and personalize) the learning experience by modifying the presentation, navigation, and other aspects of learning space to suit individual students. The terms customization, personalization, and adaptation all express a similar goal—to transform the information or learning material to a presentation that best meet the needs of the learners. Adaptive learning software and platforms, due to their ability to change the content and representations according to a student’s needs, resemble the situation when a personal instructor is available for each individual student.

Adaptive learning software and platforms take into account different student features when adapting learning experiences for an individual student. Major features covered in the existing adaptive learning software and platforms can be categorized into four main categories: (1) learning styles, (2) cognitive abilities, (3) affective states, and (4) the context of the learning.

[939] Commentary: “Meet the Mind-Reading Robo Tutor in the Sky.” By Eric Westervelt. National Public Radio, October 13, 2015. <www.npr.org>

“We think of it like a robot tutor in the sky that can semi-read your mind and figure out what your strengths and weaknesses are, down to the percentile,” says Jose Ferreira, the founder and CEO of ed-tech company Knewton.

The company believes its new, free online tutoring platform will radically transform how teachers personalize instruction.

Knewton claims to offer automated, digital tutoring that responds to each student’s needs.

“We can take the combined data power of millions of students—all the people who are just like you—[who] had to learn a particular concept before, that you have to learn today—to find the best pieces of content, proven most effective for people just like you, and give that to you every single time,” he says.

[940] Paper: “The Effectiveness of Online and Blended Learning: A Meta-Analysis of the Empirical Literature.” By Barbara Means and others. Teachers College Record, March 2013. <agronomy.unl.edu>

Page 3: “Frequently, the motivation for online learning programs entails … assembling and disseminating instructional content more cost-efficiently….”

Page 5:

The terms blended learning and hybrid learning are used interchangeably and without a broadly accepted precise definition. Bonk and Graham (2005) described blended learning systems as a combination of face-to-face instruction and computer-mediated instruction. The 2003 Sloan Survey of Online Learning (Allen & Seaman, 2003) provided somewhat more detail, defining blended learning as a “course that is a blend of the online and face-to-face course. Substantial proportion of the content is delivered online, typically uses online discussions, typically has some face-to-face meetings” (p. 6). Horn and Staker (2010) defined blended learning as “any time a student learns at least in part in a supervised brick-and-mortar location away from home and at least in part through online delivery with some element of student control over time, place, path and/or pace” (p. 3).

Blended approaches do not eliminate the need for a face-to-face instructor and usually do not yield cost savings as purely online offerings do. To justify the additional time and costs required for developing and implementing blended learning, policy makers want evidence that blended learning is not just as effective as, but actually more effective than, traditional face-to-face instruction.

Page 7: “Purely online instruction serves as a replacement for face-to-face instruction (for example, a virtual course), with attendant implications for school staffing and cost savings.”

[941] Book: The SAGE Encyclopedia of Educational Technology. Edited by J. Michael Spector. Sage Publications, 2015. Article: “Adaptive Learning Software and Platforms.” By Dr. Kinshuk. Pages 7–10.

Page 10: “Adaptive learning software and platforms are used in different settings and modes of learning, including desktop-based and mobile/ubiquitous/pervasive learning; formal, nonformal, and informal learning; individual and collaborative learning; and instruction-based, assessment-based, and game-based learning.”

[942] Report: “Keeping Pace with K–12 Digital Learning: An Annual Review of Policy and Practice.” By John Watson and others. Evergreen Education Group, November 3, 2014. <static1.squarespace.com>

Page 5: “The most easily identifiable schools that combine online instruction with required attendance at a physical school have been created by individual charter schools, charter management organizations, and pioneering districts.”

Page 18: “[T]he fully online charter schools … generally … do not have a physical building that students attend regularly. …”

[943] Handbook of Online Learning. Edited by Kjell Erik Rudestam and Judith Schoenholtz-Read. Sage Publications, 2010.

Chapter 6: “Media Psychology Controls the Mouse that Roars.” By Bernard Luskin and James Hirsen. Pages 161–172.

Page 163: “Internet-based social networking, and media and learning psychology are providing more student-friendly forms, while the rising cost of transportation due to fuel inflation is rapidly conditioning society to view learning opportunities that do not require physical presence in a classroom positively.”

[944] “WHO Director-General’s Opening Remarks at the Media Briefing on Covid-19.” World Health Organization, March 11, 2020. <bit.ly>

[Dr. Tedros Adhanom Ghebreyesus:] …

WHO [World Health Organization] has been assessing this outbreak around the clock and we are deeply concerned both by the alarming levels of spread and severity, and by the alarming levels of inaction.

We have therefore made the assessment that COVID-19 can be characterized as a pandemic.

[945] Report: “Snapshot 2020: A Review of K–12 Online, Blended, and Digital Learning.” Digital Learning Collaborative, March 2020. <static1.squarespace.com>

Page 17:

Supplemental Online Courses with Online Teachers

What They Are

• Full courses that provide credit towards grade advancement or graduation.

• They include content (text, graphics, videos, etc.) and assessments.

• The course includes an online teacher, often employed by the course provider, who is in regular contact with students via online communications tools and telephone. …

An online course provides the entire course content, interaction with the teacher, and curriculum progression via online content, sometimes with additional print materials. Student are engaged entirely online for that portion of their education, while typically taking courses at a brick-and-mortar school in their remaining time.

Page 19:

Online courses that include online teachers are among the earliest use cases of digital learning. In the late 1990s and early 2000s, about half of all states created a state virtual school. Fast forward to 2020, state virtual schools remain an important part of the digital learning landscape. Among the reasons that they are significant is that they provide publicly available data about their course usage that private providers are (understandably) less willing to share. State virtual schools are entities created by legislation or by state-level agencies, usually funded partially or entirely by state appropriations, course fees, and/or grants. As of early 2020, state virtual schools operate in 21 states … collectively serving 1,015,760 course enrollments.

Page 20: “Table 1: State Virtual Schools … TOTAL course enrollments served by all state virtual schools (2017–18) 1,015,760”

[946] Calculated with data from the report: “Snapshot 2022: An Inflection Point for Digital Learning?” Digital Learning Collaborative, January 2022. <s3.us-east-1.amazonaws.com>

Page 13:

State Virtual School Summary Table. Enrollment numbers are for 2019–2020 (pre-pandemic).

Program Name | Semester Course Enrollments
ACCESS Virtual Learning (AL) | 71,351
Virtual Arkansas | 31,437
Colorado Digital Learning Solutions | 4,353
Florida Virtual School | 502,232
Georgia Virtual School | 57,703
Idaho Digital Learning | 35,286
Illinois Virtual School | 9,338
Michigan Virtual | 32,689
Montana Digital Academy | 6,772
Virtual Learning Academy Charter School (NH) | 26,609
NCVirtual | 102,368
North Dakota Center for Distance Education | 5,850
VirtualSC | 84,148
Vermont Virtual Learning Cooperative (VTVLC) | 2,131
West Virginia Virtual School | 22,583
Wisconsin Virtual School | 9,291
Total | 1,004,141

† Calculated by Just Facts

[947] “WHO Director-General’s Opening Remarks at the Media Briefing on Covid-19.” World Health Organization, March 11, 2020. <bit.ly>

[Dr. Tedros Adhanom Ghebreyesus:] …

WHO [World Health Organization] has been assessing this outbreak around the clock and we are deeply concerned both by the alarming levels of spread and severity, and by the alarming levels of inaction.

We have therefore made the assessment that COVID-19 can be characterized as a pandemic.

[948] Report: “Snapshot 2020: A Review of K–12 Online, Blended, and Digital Learning.” Digital Learning Collaborative, March 2020. <static1.squarespace.com>

Page 9: “Full-time public online schools that enroll students from across regions or states operate in 32 states…. During school year 2018–2019 they collectively enrolled 375,000 FTE [full-time equivalent] students (less than 1% of all K–12 students in the United States).”

Page 10:

States With Statewide Fully Online Schools

Figure 2: Number of Student Enrollments by State and Percentage of State’s K–12 Population

State | % of State K–12 Population* | Number of Enrollments in School Year 2018–19
CA | 0.7% | 41,109
PA | 2.2% | 38,591
FL | 1.3% | 36,429
MI | 1.2% | 25,823
OH | 1.5% | 25,692
OK | 3.6% | 25,525
GA | 1.0% | 16,875
CO | 1.9% | 16,815
IN | 1.5% | 16,653
TX | 0.3% | 15,950
AZ | 1.4% | 15,676
OR | 2.2% | 12,894
SC | 1.4% | 11,111
UT | 1.4% | 9,478
KS | 1.6% | 8,353
WA | 0.7% | 7,953
ID | 2.3% | 7,268
WI | 0.8% | 7,055
MN | 0.8% | 6,864
NC | 0.4% | 5,844
AL | 0.7% | 5,475
NV | 1.1% | 5,367
LA | 0.7% | 5,089
NM | 0.9% | 3,056
MA | 0.3% | 2,733
TN | 0.2% | 2,292
AR | 0.5% | 2,162
VA | 0.2% | 2,000
IA | 0.2% | 1,047
WY | 1.1% | 1,013
ME | 0.5% | 826
NH | 0.2% | 400
DC | 0.2% | 191

[949] Report: “Snapshot 2022: An Inflection Point for Digital Learning?” Digital Learning Collaborative, January 2022. <s3.us-east-1.amazonaws.com>

Page 18:

During the pandemic, of course, most school districts offered remote learning for their own students, and for many students remote learning was the only option for extended periods. Those data are not included in this section because they were mostly temporary.

In school year 2021–22, many districts—likely more than 1,000—are reporting that they are creating or significantly growing their own online schools for their own students. We believe that in fact many of these district-run schools have an onsite component, such that we would label them hybrid schools. As of late 2021, there is little data regarding how many students are enrolled in these district-run online schools, and only limited data on how many districts (and which ones) are offering these online schools.

For these reasons, we continue to focus our reporting on the online schools that are serving students statewide, building on our pre-pandemic data sets. In our last Snapshot, we reported that 32 states allowed such online schools, and collectively they enrolled about 375,000 students, in school year 2018–19. Those numbers were flat for the year ending in 2019–20, and then grew by 75% in school year 2020–21, to 656,000 student enrollments, as hundreds of thousands of students left their prior district of enrollment and moved to online schools. The map on the next page shows the 35 states that will allow statewide online schools as of school year 2022–23, and the statewide enrollment numbers in school year 2020–21.

Page 19:

States With Statewide Fully Online Schools

Number of Student Enrollments by State and Percentage of State’s K–12 Population

State | % of State K–12 Population | Number of Enrollments in School Year 2020–21
FL | 2.90% | 80,987
PA | 3.55% | 61,760
MI | 3.80% | 57,048
OK | 5.72% | 39,584
OH | 2.22% | 37,612
CA | 0.62% | 37,334
AZ | 2.97% | 32,646
TX | 0.57% | 29,521
CO | 3.32% | 28,424
ID | 7.53% | 22,924
GA | 1.03% | 17,327
OR | 3.03% | 16,994
SC | 2.27% | 16,950
WI | 1.93% | 16,020
KS | 2.85% | 14,240
UT | 2.32% | 13,941
MN | 1.47% | 12,848
WA | 1.05% | 12,674
TN | 1.11% | 10,581
AL | 1.36% | 9,741
IN | 0.72% | 7,974
VA | 0.56% | 7,000
NM | 2.20% | 6,816
AR | 1.42% | 6,708
NC | 0.40% | 5,769
LA | 0.81% | 5,415
NV | 0.82% | 3,887
MA | 0.39% | 3,457
WY | 2.83% | 2,602
IA | 0.53% | 2,555
MO | 0.24% | 2,150
ME | 0.51% | 888
NH | 0.27% | 478
DC | 0.27% | 210

… Michigan, Kansas, and Washington State did not report data for SY [school year] 2020–21. For these states, we applied an adjustment assuming that that [sic] enrollments in these states grew at the same rate as the average of all other states.

We include California as a state allowing online schools to operate statewide, even though in fact online schools are limited to serving the county in which the school is located, and contiguous counties. All students in California, however, have access to at least one online school, so for simplicity we include California in the states with statewide online schools.

Page 20:

In the five or so years leading up to the pandemic, changes in digital learning policy were generally incremental. In most years a state or two would allow full-time online schools for the first time. States with extensive online learning activity would create new policies that would tweak funding levels, change attendance accounting requirements, address quality and accountability, and so forth. … States with state virtual schools generally supported these programs to provide supplemental online courses to students statewide, but very few states were adding new state programs to support digital learning.

Since the pandemic hit in spring 2020, we have had the final months of the 2020 legislative sessions (in some cases extended into special sessions to address pandemic-related issues), the full 2021 legislative sessions, and extensive activity by State Boards of Education. The end result, however, is much of the same, in the sense that the pace of change is about the same as it was pre-pandemic. Overall, most states that supported digital learning before the pandemic still do so, and most states that were restrictive still have those restrictions in place, despite the increase in interest in new options among students and families.

[950] “WHO Director-General’s Opening Remarks at the Media Briefing on Covid-19.” World Health Organization, March 11, 2020. <bit.ly>

[Dr. Tedros Adhanom Ghebreyesus:] …

WHO [World Health Organization] has been assessing this outbreak around the clock and we are deeply concerned both by the alarming levels of spread and severity, and by the alarming levels of inaction.

We have therefore made the assessment that COVID-19 can be characterized as a pandemic.

[951] Article: “Covid-19-Associated School Closures and Related Efforts to Sustain Education and Subsidized Meal Programs, United States, February 18–June 30, 2020.” By Nicole Zviedrite and others. PLoS ONE, September 14, 2021. <doi.org>

The first COVID-19–associated school closure occurred on February 27, 2020 in Washington state. By March 30, 2020, all but one US public school districts were closed, representing the first-ever nearly synchronous nationwide closure of public K–12 schools in the US. Approximately 100,000 public schools were closed for ≥8 weeks because of COVID-19, affecting >50 million K–12 students. Of 600 districts sampled, the vast majority offered distance learning (91.0%) and continued provision of subsidized meal programs (78.8%) during the closures. …

Between mid- and late-March, statewide mandates or recommendations for public SCs [school closures] were issued in every state, with the earliest announced in Kentucky, Maryland, Michigan, and Oregon on March 12, 2020. Among the 50 states and DC, 24 states and DC (49.0%) had public SCs go into effect on March 16. An additional 21 states had public SCs go into effect later that same week (March 17–21, 2020), and the remaining five states had public SCs go into effect during the week of March 22–28. Idaho was the last state to issue a mandated statewide public SC, which went into effect on Tuesday, March 24….

[952] Dataset: “Table 311.22. Number and Percentage of Undergraduate Students Taking Distance Education or Online Classes and Degree Programs, by Selected Characteristics: Selected Years, 2003–04 Through 2015–16.” U.S. Department of Education, National Center for Education Statistics, May 2018. <nces.ed.gov>

“2015–16 … Percent of undergraduate students taking online classes … Total, any online classes [=] 43.1% … Entire degree program is online1 [=] 10.8% … 1 Excludes students not in a degree or certificate program”

NOTE: As of 7/11/23, this is the latest data on online classes for undergraduate students.

[953] Dataset: “Table 311.30. Number and Percentage of Graduate Students Taking Night, Weekend, or Online Classes, by Selected Characteristics: 2011–12.” U.S. Department of Education, National Center for Education Statistics, March 2014. <nces.ed.gov>

“Percent of students taking night, weekend, or online classes … Online classes … Any online classes Total [=] 36.0% … Exclusively online classes Total [=] 20.1%”

NOTE: As of 7/11/23, this is the latest data on online classes for graduate school students.

[954] Paper: “The Effectiveness of Online and Blended Learning: A Meta-Analysis of the Empirical Literature.” By Barbara Means and others. Teachers College Record, March 2013. <agronomy.unl.edu>

Page 2:

The meta-analysis was conducted on 50 effects found in 45 studies contrasting a fully or partially online condition with a fully face-to-face instructional condition. …

… Studies using blended learning also tended to involve additional learning time, instructional resources, and course elements that encourage interactions among learners. This confounding leaves open the possibility that one or all of these other practice variables contributed to the particularly positive outcomes for blended learning.

Page 12:

This meta-analysis was conducted to examine the effectiveness of both purely online and blended versions of online learning as compared with traditional face-to-face learning. Our approach differs from prior meta-analyses of distance learning in several important respects:

• Only studies of web-based learning have been included (i.e., eliminating studies of video- and audio-based telecourses or stand-alone, computer-based instruction).

• Only studies with random-assignment or controlled quasi-experimental designs have been included to draw on the best available evidence.

• All effects have been based on objective and direct measures of learning (i.e., discarding effects for student or teacher perceptions of learning, their satisfaction, retention, attendance, etc.).

Page 13: “Relevant studies [used in this meta-analysis] were located through a comprehensive search of publicly available literature published from 1996 through July 2008. We chose 1996 as a starting point for the literature search because web-based learning resources and tools became widely available around that time.”

Page 17: “The precision of each effect estimate was determined by using the estimated standard error of the mean to calculate the 95% confidence interval for each effect.”

Page 18: “The review of the 99 studies to obtain the data for calculating effect size produced 50 independent effect sizes (27 for purely online vs. face-to-face and 23 for blended vs. face-to-face) from 45 studies. Fifty-four studies did not report sufficient data to support calculating an effect size.”

Page 20: “For the total set of 50 contrasts and for each subset of contrasts being investigated, a weighted mean effect size (Hedges’ g+) was computed by weighting the effect size for each study contrast by the inverse of its variance.”

NOTE: See the next footnote for an explanation of Hedges’ g.
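The inverse-variance weighting and the confidence-interval calculation quoted above can be sketched in a few lines. This is a hypothetical illustration: the function name and the three effect-size/variance pairs are invented for the example, not taken from the paper.

```python
# Hypothetical illustration (values invented, not from the paper):
# a weighted mean effect size in which each study's effect is weighted
# by the inverse of its variance, with a 95% confidence interval
# computed from the standard error, as in the passages quoted above.

def weighted_mean_effect(effects, variances):
    """Inverse-variance weighted mean of study effect sizes."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5           # standard error of the weighted mean
    ci = (mean - 1.96 * se, mean + 1.96 * se)  # 95% confidence interval
    return mean, se, ci

# Three made-up study contrasts (effect size, variance):
mean, se, ci = weighted_mean_effect([0.10, 0.35, 0.20], [0.02, 0.05, 0.04])
```

Because more precise studies (smaller variance) get larger weights, they pull the mean toward their estimates.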

Page 21: “Among the 50 individual contrasts between online and face-to-face instruction, 11 were significantly positive, favoring the online or blended learning condition. Three significant negative effects favored traditional face-to-face instruction. That multiple comparisons were conducted should be kept in mind when interpreting this pattern of findings.”

Page 22: “Figure 3. Effect sizes for contrasts in the meta-analysis.”

NOTE: This figure shows the discrete effect sizes for all 50 contrasts.

Page 29:

The overall finding of the meta-analysis is that online learning (the combination of studies of purely online and of blended learning) on average produces stronger student learning outcomes than learning solely through face-to-face instruction. The mean effect size for all 50 contrasts was +0.20, p < .001.

Next, separate mean effect sizes were computed for purely online versus face-to-face and blended versus face-to-face contrasts. The mean effect size for the 27 purely online versus face-to-face contrasts was not significantly different from 0 (g+ = +0.05, p = .46). The mean effect size for the 23 blended versus face-to-face contrasts was significantly different from 0 (g+ = +0.35, p < .0001).

NOTE: See the next footnote for an intuitive explanation about the size of these effects.

Page 33:

To investigate whether online learning is more advantageous for some types of learners than for others, the studies were divided into three sub-sets of learner type: K–12 students, undergraduate students (the largest single group), and other types of learners (graduate students or individuals receiving job-related training). … In summary, for the range of student types for which controlled studies are available, online learning appeared more effective than traditional face-to-face instruction in both older and newer studies, with both younger and older learners, and in both medical and other subject areas.

Page 35:

The corpus of 50 effect sizes extracted from 45 studies meeting meta-analysis inclusion criteria was sufficient to demonstrate that in recent applications, purely online learning has been equivalent to face-to-face instruction in effectiveness, and blended approaches have been more effective than instruction offered entirely in face-to-face mode.

Page 38:

Even with this expected expansion of the research base, however, meta-analyses of online learning effectiveness studies will remain limited in several respects. Inevitably, they do not reflect the latest technology innovations. The cycle time for study design, execution, analysis, and publication cannot keep up with the fast-changing world of Internet technology. In the present case, important technology practices of the last five years, notably the use of social networking technology to create online study groups and recommend learning resources, are not reflected in the corpus of published studies included in this meta-analysis.

[955] Webpage: “Difference Between Two Means.” By David M. Lane. Online Statistics Education (Developed by Rice University, University of Houston Clear Lake, and Tufts University). Accessed November 3, 2015 at <onlinestatbook.com>

When the scale of a dependent variable is not inherently meaningful, it is common to consider the difference between means in standardized units. That is, effect size is measured in terms of the number of standard deviations the means differ by. Two commonly used measures are Hedges’ g and Cohen’s d. Both of these measures consist of the difference between means divided by the standard deviation. They differ only in that Hedges’ g uses the version of the standard deviation formula in which you divide by N–1, whereas Cohen’s d uses the version in which you divide by N. The two formulas are given below. …

Standardized measures such as Cohen’s d and Hedges’ g have the advantage that they are scale free. That is, since the dependent variable is standardized, the original units are replaced by standardized units and are interpretable even if the original scale units do not have clear meaning. …

It is natural to ask what constitutes a large effect. Although there is no objective answer to this question, the guidelines suggested by Cohen (1988) stating that an effect size of 0.2 is a small effect, an effect size of 0.5 is a medium effect, and an effect size of 0.8 is a large effect have been widely adopted. Based on these guidelines, the effect size of 0.87 is a large effect.

It should be noted, however, that these guidelines are somewhat arbitrary and have not been universally accepted. For example, Lenth (2001) argued that other important factors are ignored if Cohen’s definition of effect size is used to choose a sample size to achieve a given level of power.
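The distinction Lane draws between Hedges’ g and Cohen’s d can be shown numerically. This toy sketch uses invented data and, for simplicity, pools the two groups into a single sample when computing the standard deviation; it illustrates only the N–1 versus N divisor difference described above.

```python
import math

# Toy sketch (data invented): both statistics are the difference
# between means divided by a standard deviation. Hedges' g uses the
# N-1 divisor in the standard deviation formula; Cohen's d uses N.
# Pooling both groups into one sample is a simplification here.

def sd(xs, ddof):
    """Standard deviation with divisor len(xs) - ddof."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - ddof))

def effect_sizes(group1, group2):
    diff = sum(group1) / len(group1) - sum(group2) / len(group2)
    pooled = group1 + group2
    g = diff / sd(pooled, 1)  # Hedges' g: divide by N - 1
    d = diff / sd(pooled, 0)  # Cohen's d: divide by N
    return g, d

g, d = effect_sizes([5, 6, 7], [4, 5, 6])
```

Since the N–1 divisor yields a slightly larger standard deviation, g is always a bit smaller than d for the same data; the gap shrinks as the sample grows.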

Just Facts | 3600 FM 1488 Rd. | Suite 120 #248 | Conroe, TX 77384 | Contact Us | Careers

Copyright © Just Facts. All rights reserved.
Just Facts is a nonprofit 501(c)3 organization.
Information provided by Just Facts is not legal, tax, or investment advice.
justfacts.com | justfactsdaily.com