
Undermeasuring: College and Career Readiness Indicators May Not Reflect College and Career Outcomes

Introduction

For more than a decade, state education leaders have worked to better align academic content between high school and higher education and to measure not only whether high schools help students earn a diploma, but also whether those graduates are college and career ready. Many have gone further, incentivizing high schools to focus on improving students’ preparation for postsecondary opportunities. Since the passage of the Every Student Succeeds Act (ESSA) in 2015, the majority of states have incorporated college and career readiness (CCR) indicators into their accountability systems for high schools—a key tool state policymakers use to signal which student outcomes are critically important and to support school leaders and educators in improving those outcomes.

However, there is no standardized way by which states measure college and career readiness. Instead, states have developed their own approaches using different metrics and setting different benchmarks for students to meet. Further, the readiness measures states usually use are, in effect, proxy measures meant to predict successful postsecondary outcomes; high schools are less likely to have and use actual postsecondary data, such as college enrollment, remediation, persistence, and completion, to gauge how well their former students were prepared.

This report compares statewide CCR rates (measures meant to predict postsecondary readiness) with college enrollment and college remediation rates (measures providing evidence of postsecondary readiness) for a cohort of high school students in all 50 states and Washington, DC. Because states take such varied approaches to measuring college and career readiness, we often had to select the best available data to use or, in some cases, use data from the ACT or College Board in the absence of state-reported data (learn more in “About Our Data” below). This comparison sheds light on racial disparities in readiness and enrollment rates, and on gaps between postsecondary readiness and outcomes. While we expected to see some disparities, the gaps also appear to be related to how readiness is measured. Specifically:

Although states have made large strides in collecting data on students’ college and career readiness and linking K–12 and postsecondary data systems, these findings show there are legitimate concerns regarding whether states are relying on the “right” measures to evaluate readiness. As school accountability systems resume after a two-year pause due to the pandemic, states should examine past data on students’ readiness compared to actual postsecondary outcomes and redesign their CCR indicators so they better reflect the skills and knowledge students need to succeed in higher education and the workforce. We hope this report and our recommendations to improve how states report postsecondary readiness and develop more nuanced CCR indicators will help state leaders begin those conversations.

About Our Data

For all 50 states and Washington, DC, we sought to match and compare state-level CCR rates to college enrollment rates and college remediation rates for a cohort of high school students. Given COVID-related disruptions in schooling, we sought data for a particular cohort: high school students who were in 12th grade during the 2018–19 school year (or those who graduated from high school in 2019) and who would have been able to enroll in college during the 2019–20 school year. However, because state agencies have different data reporting schedules, we used data for the previous cohort (students in grade 12 during the 2017–18 school year) in some states. As this report is based on data from several years ago, it is possible states’ data reporting practices and college and career readiness policies may have changed since then.

Additionally, CCR and postsecondary enrollment data are more frequently updated and reported on report cards (see “The Use of ‘Report Card’ in This Report” below) and/or other state government websites than remediation data. Thus, when postsecondary enrollment and remediation data were both available but for different cohorts, we prioritized matching CCR and postsecondary enrollment data for the same cohort of students rather than falling back on outdated CCR and postsecondary enrollment data.
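The prioritization described above can be sketched in a few lines of Python. This is a simplified illustration of the decision rule, not our actual data pipeline; the cohort labels and function name are hypothetical.

```python
# Hedged sketch of the cohort-matching preference described above.
# Given the cohorts for which each data type is available, prefer a
# three-way match when one exists; otherwise match CCR and enrollment
# data for the same cohort rather than using outdated data.

def choose_cohorts(ccr_cohorts, enrollment_cohorts, remediation_cohorts):
    """Return (ccr, enrollment, remediation) cohort labels to use."""
    # Best case: all three data types exist for the same cohort.
    full_match = set(ccr_cohorts) & set(enrollment_cohorts) & set(remediation_cohorts)
    if full_match:
        cohort = max(full_match)  # most recent matching cohort
        return cohort, cohort, cohort
    # Otherwise, prioritize matching CCR and enrollment data for one cohort.
    ccr_enroll = set(ccr_cohorts) & set(enrollment_cohorts)
    if ccr_enroll:
        cohort = max(ccr_enroll)
        remediation = max(remediation_cohorts) if remediation_cohorts else None
        return cohort, cohort, remediation
    # Fall back to the most recent cohort available for each data type.
    return (max(ccr_cohorts), max(enrollment_cohorts),
            max(remediation_cohorts) if remediation_cohorts else None)

print(choose_cohorts(["2018-19"], ["2018-19"], ["2017-18"]))
# -> ('2018-19', '2018-19', '2017-18')
```

This mirrors, for example, how a state with matched 2018–19 CCR and enrollment data but only 2017–18 remediation data is treated in Table 1 below.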

The Use of “Report Card” in This Report

State report card(s). In this report, “state report card” refers to the education data on the annual state report card for the state as a whole that a state educational agency prepares and disseminates to the public under section 1111(h)(1) of ESSA.

School report card(s). “School report card” refers to the education data on the annual report card that a local educational agency (LEA) prepares and disseminates to the public under section 1111(h)(2) of ESSA. It includes information on each school served by the LEA. In practice, state educational agencies (SEAs) disseminate school report cards on behalf of LEAs.

Report card(s). “Report card” (without any modifiers) refers generally to the education data included on annual report cards at all levels (i.e., report cards for a state as a whole, for a school district as a whole, and for each school).

As a result, as depicted in Figure 1 below, for 49 states (including Washington, DC), we used CCR and postsecondary enrollment data for the same high school cohort.

In the final two states (Vermont and Wyoming, shown in gray in Figure 1), our data set included different high school cohorts for CCR and postsecondary enrollment data. Neither state reported remediation data. Specifically, for Vermont, we used CCR data for the 2018–19 cohort because the state only started reporting CCR data in the 2018–19 school year, while our enrollment data included the 2017–18 cohort, the most recent data available. For Wyoming, we used CCR data for the 2018–19 cohort but could only find postsecondary enrollment data averaged for the 2020–21, 2019–20, 2018–19, 2017–18, and 2016–17 cohorts.

Since statewide data normally does not fluctuate dramatically from one year to the next, we are more confident that the data is a good proxy when there is only a one-year mismatch. However, our analysis may not be as accurate for states where there is a larger time gap between cohorts.

Figure 1. High School Graduating Cohort for College and Career Readiness, Postsecondary Enrollment, and Remediation Data in Our Analysis
Table 1. High School Graduating Cohort for College and Career Readiness, Postsecondary Enrollment, and Remediation Data in Our Analysis

| State | High School Cohort for CCR Data | High School Cohort for Postsecondary Enrollment Data | Matching Cohort for CCR and Postsecondary Enrollment Data | High School Cohort for Remediation Data | Matching Cohort for CCR, Postsecondary Enrollment, and Remediation Data |
| --- | --- | --- | --- | --- | --- |
| Alabama | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Alaska | 2018–19 | 2018–19 | Yes | NA | NA |
| Arizona | 2018–19 | 2018–19 | Yes | NA | NA |
| Arkansas | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| California | 2017–18 | 2017–18 | Yes | NA | NA |
| Colorado | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Connecticut | 2018–19 | 2018–19 | Yes | 2015–16 | No |
| Delaware | 2018–19 | 2018–19 | Yes | 2014–15 | No |
| Florida | 2018–19 | 2018–19 | Yes | NA | NA |
| Georgia | 2017–18 | 2017–18 | Yes | 2017–18 | Yes |
| Hawaii | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Idaho | 2018–19 | 2018–19 | Yes | NA | NA |
| Illinois | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Indiana | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Iowa | 2018–19 | 2018–19 | Yes | Average of 2018–19, 2017–18, and 2016–17 | No |
| Kansas | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Kentucky | 2018–19 | 2018–19 | Yes | NA | NA |
| Louisiana | 2018–19 | 2018–19 | Yes | NA | NA |
| Maine | 2018–19 | 2018–19 | Yes | 2014–15 | No |
| Maryland | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Massachusetts | 2018–19 | 2018–19 | Yes | NA | NA |
| Michigan | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Minnesota | 2018–19 | 2018–19 | Yes | 2017–18 | No |
| Mississippi | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Missouri | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Montana | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Nebraska | 2018–19 | 2018–19 | Yes | NA | NA |
| Nevada | 2018–19 | 2018–19 | Yes | 2017–18 | No |
| New Hampshire | 2018–19 | 2018–19 | Yes | NA | NA |
| New Jersey | 2018–19 | 2018–19 | Yes | NA | NA |
| New Mexico | 2017–18 | 2017–18 | Yes | 2017–18 | Yes |
| New York | 2018–19 | 2018–19 | Yes | NA | NA |
| North Carolina | 2017–18 | 2017–18 | Yes | NA | NA |
| North Dakota | 2018–19 | 2018–19 | Yes | 2017–18 | No |
| Ohio | 2017–18 | 2017–18 | Yes | 2017–18 | Yes |
| Oklahoma | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Oregon | 2017–18 | 2017–18 | Yes | NA | NA |
| Pennsylvania | 2018–19 | 2018–19 | Yes | NA | NA |
| Rhode Island | 2018–19 | 2018–19 | Yes | NA | NA |
| South Carolina | 2018–19 | 2018–19 | Yes | NA | NA |
| South Dakota | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Tennessee | 2018–19 | 2018–19 | Yes | NA | NA |
| Texas | 2018–19 | 2018–19 | Yes | 2017–18 | No |
| Utah | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Vermont | 2018–19 | 2017–18 | No | NA | NA |
| Virginia | 2018–19 | 2018–19 | Yes | NA | NA |
| Washington | 2018–19 | 2018–19 | Yes | 2018–19 | Yes |
| Washington, DC | 2018–19 | 2018–19 | Yes | NA | NA |
| West Virginia | 2018–19 | 2018–19 | Yes | 2016–17 | No |
| Wisconsin | 2018–19 | 2018–19 | Yes | NA | NA |
| Wyoming | 2018–19 | Average of 2020–21, 2019–20, 2018–19, 2017–18, and 2016–17 | No | NA | NA |

College and Career Readiness Rates

1. Data Sources

We gathered state-reported, statewide CCR data for high school seniors from 39 states (including Washington, DC). (In collecting CCR data, we preferred data for students in the 12th grade or for high school completers or graduates, as opposed to data for all high school students or multiple grade levels.) Specifically:

  • For 34 of these states, we collected CCR data that each state educational agency (SEA) published on its ESSA-required state report card (see “The Use of ‘Report Card’ in This Report” above).
  • For three states (Alabama, Maryland, and Ohio), we used data from downloadable data files made available by the SEA. (Alabama’s state report card provided CCR data for students in all grades combined; we used data for 12th graders from downloadable data files. Maryland did not report any state-level data related to its CCR indicators on its state report card; we found state-level data for one of its CCR measures in downloadable data files. Ohio reported the points earned on the CCR indicator on its state report card, which are not suitable for this analysis; we found the actual percentage of students who were prepared for success in downloadable data files.)
  • For Hawaii, we gathered CCR data from the Hawaii Data eXchange Partnership, a cross-agency data-sharing partnership of multiple state agencies. (Hawaii did not use a CCR indicator in its ESSA accountability system, so we used the data point collected by the Hawaii Data eXchange Partnership that is the closest proxy for an overall rate of postsecondary readiness: the percentage of high school completers earning diplomas with academic, CTE, or STEM honors.)
  • For Nevada, we gathered CCR data from the Nevada System of Higher Education.

For the remaining 12 states, state-reported statewide CCR data was not available. Thus, we used a proxy measure: the percentage of grade 12 students meeting college-ready benchmarks on the ACT or SAT, as reported in state reports produced by ACT and the College Board.

2. Excluded College and Career Readiness Measures

“On-track” measures, which are usually based on the number of credits students earn and/or successful completion of specific courses by the end of 9th or 10th grade, have been shown to be predictive of students’ likelihood of graduating from high school. However, we did not consider these indicators in our analysis, because we focused on students’ readiness for success beyond high school graduation. Moreover, comparisons of states’ estimates of students’ postsecondary readiness in grade 12 to students’ actual postsecondary outcomes beyond high school (e.g., college enrollment) are more relevant for this analysis than data gathered earlier in students’ high school careers. Ten states (Arkansas, Connecticut, Delaware, Idaho, Illinois, Maryland, Nevada, Oregon, Washington, and West Virginia) used and reported “on-track” indicators and/or measures in their accountability systems. Of these states, Oregon was the only one that did not use any other CCR indicator for accountability (we used ACT-reported data instead).

Additionally, because this analysis compares estimates of students’ postsecondary readiness in high school to actual postsecondary results like enrollment and placement in credit-bearing coursework, we did not treat “postsecondary outcome” measures (such as college enrollment and successful transitions to the workforce) as CCR data in the analysis unless they were just one measure within a broader CCR indicator. For example, Connecticut’s “postsecondary entrance” indicator measured the percentage of high school graduates who enrolled in a postsecondary institution during their first year after high school graduation. For our analysis, we used a different CCR indicator in Connecticut: “preparation for college and career readiness,” which measured the percentage of students in grades 11 and 12 who met CCR benchmarks on the ACT, SAT, an AP exam, or an IB exam. (Similarly, Vermont’s “college/career outcomes” indicator measured whether high school graduates enrolled in college, were employed, or enlisted in the military within 16 months of graduation. In this analysis, we used another indicator Vermont developed for its accountability system, “performance on college/career assessments,” as the best CCR data point for the state.) On the other hand, one of Georgia’s CCR indicators included five options for high school graduates to demonstrate readiness, including entering the Technical College System of Georgia or the University System of Georgia without needing remediation. In this analysis, we used the percentage of graduates meeting any of the five options as Georgia’s CCR rate, even though one of the options is technically a “postsecondary outcome” of the kind we compared CCR rates against.

3. Selection Process for the Most Suitable College and Career Readiness Data

Excluding “on-track” and “postsecondary outcome” measures, we found that 20 states (shown in dark purple in Figure 2 below) included one CCR indicator in their accountability systems and reported the percentage of students statewide deemed ready on that indicator overall. (Indiana had both a state accountability system and a federal accountability system; the CCR indicator we used in this analysis is from the state system.) For these states, we simply used the state-reported CCR rate on the accountability indicator in our analysis. For example, Alabama students were deemed ready on its CCR indicator if they (1) met the college-ready benchmark on at least one ACT section; (2) scored a 3 or higher on an AP exam or a 4 or higher on an IB exam; (3) scored at the silver level on ACT WorkKeys; (4) earned transcripted college credits through dual enrollment; (5) earned an industry-recognized credential; or (6) enlisted in the military. The state reported that 83% of students in the 2018–19 high school graduating cohort demonstrated readiness by meeting at least one of the six requirements.

Figure 2. Number of Readiness Indicators and Availability of State-Reported Readiness Data Across States

In addition, 14 states (shown in green in Figure 2) included one or multiple CCR indicators but reported CCR data in a way that required us to identify the most suitable readiness data point from multiple options. Some of these states used multiple CCR indicators for accountability, requiring us to choose the better-suited CCR measure for this particular analysis. Others did not report the percentage of students deemed ready overall on the CCR indicator(s); instead, they reported data on the individual components or measures within the indicator. For these states, we selected the indicator, component, or measure we believe is the best proxy for an overall CCR rate and best suited for our analysis (see Table 2 below for further explanation of our selection process).

Table 2. Summary of Our Selection Process for the Best-Suited College and Career Readiness Data

State

Selected One of Multiple CCR Indicators

Selected One of Multiple Measures Within a CCR Indicator

Selection Process

Arkansas

No

Yes

Arkansas had one CCR indicator that included multiple measures. The state reported data on each CCR measure but did not report an overall readiness rate. We used the percentage of students achieving a composite score of 19 or higher on the ACT, because this is the measure given the most weight in the overall indicator.

Connecticut

Yes

No

Connecticut had two CCR indicators and reported overall readiness rates for both. We used the percentage of students attaining benchmark scores on SAT, ACT, AP, or IB exams, because performance on these exams is more closely related to postsecondary success than participation in the exams.

Georgia

Yes

No

Georgia had three CCR indicators and reported overall readiness rates for all of them. We used the “college and career readiness” indicator because it includes rigorous college-ready and career-ready measures.

Iowa

No

Yes

Iowa had one CCR indicator that included multiple measures. The state reported data on each CCR measure but did not report an overall readiness rate. We used the percentage of students participating in college-level, postsecondary, or advanced coursework, because this is the measure given the most weight in the overall indicator.

Louisiana

Yes

Yes

Louisiana had two CCR indicators and reported overall readiness rates for both. We chose the “strength of diploma” index, because it measures readiness for graduating seniors (as opposed to students in grade 11). Further, we used the percentage of grade 12 students achieving measures within the Index that are “stronger” than a regular high school diploma.

Maryland

Yes

Yes

Maryland had two indicators that included CCR measures alongside non-CCR measures. Within each indicator, the state did not report data for any CCR measure. We used the percentage of graduates meeting University of Maryland course requirements, one of the CCR options included within one of the CCR measures, because it was the only CCR option for which data was available.

Mississippi

Yes

Yes

Mississippi had two CCR indicators and reported overall readiness rates for both. We used the average of the percentage of students meeting the ACT college-ready benchmarks in English or reading and math because it was the only readiness measure made available to all students.

Nevada

No

Yes

Nevada had one CCR indicator that included three measures. The state reported data for one of the measures but did not report an overall readiness rate. Thus, we used the percentage of students earning an advanced or a college- and career-ready diploma.

New Mexico

No

Yes

New Mexico had one CCR indicator that included two measures. The state reported data for both measures but did not report an overall readiness rate. We used the percentage of students demonstrating readiness in at least one CCR activity, because success in these activities is more closely related to postsecondary success than participation in them.

New York

No

Yes

New York had one CCR indicator that included multiple measures, with different weights for different measures. The state reported rates of students receiving each weight. We used the percentage of students receiving a weight of 2 in the index, because these measures include demonstrations of readiness most associated with postsecondary success.

Ohio

No

Yes

Ohio had one CCR indicator that included multiple measures, with different points for different measures. The state reported rates of students receiving different index points. We used the percentage of students earning “basic” points by achieving preparedness benchmarks, as opposed to the percentage earning “bonus” points, because the “basic” CCR measures are the ones most valued by the state.

Rhode Island

No

Yes

Rhode Island had one CCR indicator that included two measures. The state reported data for both measures but did not report an overall readiness rate. We used the percentage of students demonstrating “postsecondary success” by attaining college credit and industry credentials instead of the measure based on state English and math assessments.

Utah

No

Yes

Utah had one CCR indicator that included two measures. The state reported data for both measures but did not report an overall readiness rate. We used the percentage of students earning a “C” or better in an AP, IB, or concurrent enrollment course, or completing a CTE pathway, because this measure focuses on graduating seniors (as opposed to students in grade 11).

Washington, DC

Yes

Yes

Washington, DC had two CCR indicators but did not report an overall readiness rate for either indicator. We chose the indicator based on the SAT, because it was the only readiness measure made available to all students. Further, we used the percentage of students scoring at least 480 in evidence-based reading and writing and at least 530 in math, because these benchmarks have been validated by the College Board (unlike the Washington, DC-specific benchmarks).

Two states, Michigan and New Hampshire, used one CCR indicator in their accountability systems but did not report any data related to the indicator (overall or for an individual component) at the state level (shown in yellow in Figure 2). For both states, we instead used the percentage of students who met both college-ready benchmarks on the SAT. In Michigan, this was reported as part of the state’s academic achievement indicator (the SAT is administered to all students in grade 11); for New Hampshire, we used data reported by the College Board, as we could not find any usable statewide CCR data on any state website.

Pennsylvania was the only state using a CCR indicator for accountability that measured students’ career readiness but not college readiness (shown in gray in Figure 2). Instead of using data from the career-ready accountability indicator, we used another state-reported CCR measure that considered both college and career preparation but was not a part of the accountability system: “rigorous courses of study,” which was based on the percentage of 12th graders who participated in an AP or IB course, a dual enrollment course, or a concentrated CTE program of study.

Three states (Hawaii, Illinois, and Virginia) did not measure high school students’ postsecondary readiness in their accountability systems, but like Pennsylvania, they reported CCR data on state report cards for informational purposes (shown in magenta in Figure 2). Specifically, for Hawaii, we used a data point that considers both college and career readiness: the percentage of high school completers earning a high school diploma with academic, CTE, or STEM honors. For Illinois and Virginia, we only found one usable, state-reported, CCR data point: the percentage of students in the graduating cohort taking early college courses and the percentage of students in the cohort earning an advanced diploma, respectively.

Finally, 11 states did not include CCR in their accountability systems and did not report any postsecondary readiness data for high school students at the state level (shown in blue in Figure 2). For these 11 states (and New Hampshire, as described above), we used either ACT or SAT data retrieved from ACT and College Board state reports. While both data points are available for these states, we considered statewide participation rates in choosing between them. 

  • Six states (Arizona, Kansas, Minnesota, Missouri, Nebraska, and Wisconsin) had higher participation rates on the ACT, so we used the percentage of students in the high school cohort meeting all four college-ready benchmarks on the ACT. 
  • The remaining five states (Alaska, Colorado, Maine, New Jersey, and Oregon) had higher participation rates on the SAT, so we used the percentage of students in the high school cohort who met both SAT college-ready benchmarks.
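The selection rule above can be sketched in a few lines of Python. The field names and figures here are hypothetical placeholders, not actual state data.

```python
# Hedged sketch: for a state without state-reported CCR data, choose ACT
# or SAT benchmark data based on which test had the higher statewide
# participation rate. All rates below are illustrative, not real figures.

def pick_readiness_source(state_rates):
    """Return ('ACT' or 'SAT', readiness_rate) for the higher-participation test."""
    if state_rates["act_participation"] >= state_rates["sat_participation"]:
        # Use % of the cohort meeting all four ACT college-ready benchmarks.
        return "ACT", state_rates["act_all_four_benchmarks"]
    # Otherwise use % of the cohort meeting both SAT college-ready benchmarks.
    return "SAT", state_rates["sat_both_benchmarks"]

example = {
    "act_participation": 0.95,       # hypothetical
    "sat_participation": 0.10,       # hypothetical
    "act_all_four_benchmarks": 0.26, # hypothetical
    "sat_both_benchmarks": 0.41,     # hypothetical
}
print(pick_readiness_source(example))  # -> ('ACT', 0.26)
```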

In sum, as Figure 3 shows, readiness rates in our data set for 35 states included student participation and/or performance on measures besides, or in addition to, college admission tests, such as earning dual enrollment credit or industry-recognized credentials in high school. For the remaining 16 states (including Washington, DC), the readiness data in our analysis only considered the percentage of students in the cohort meeting college-ready benchmarks on the ACT or SAT. In 12 of the 16 states, we used data directly from the ACT or College Board in the absence of state-reported data, because we could not find the statewide rate of students deemed college and career ready, or a similar measure, on any state website. Further, 11 of the 16 states did not use CCR indicators in their accountability systems. The other five used CCR indicators for accountability but either (1) did not report any college- and career-ready data statewide (New Hampshire) or (2) did not report the state’s overall readiness rate but reported ACT or SAT data (Arkansas, Michigan, Mississippi, and Washington, DC).

Figure 3. Type of College and Career Readiness Measures in Our Analysis

4. Disaggregated Data for Black, Latinx, and White Students

We found readiness data for Black, Latinx, and White students for 43 states (including Washington, DC) from the same data sources where we collected readiness data for all students. The remaining eight states did not report CCR data for different racial/ethnic groups of students: Hawaii, Illinois, Indiana, Maryland, Montana, Nevada, North Dakota, and Pennsylvania.

In seven of the 43 states, the disaggregated readiness data differs slightly from the data for all students.

  • In six states where our CCR data comes from the ACT state report (Arizona, Kansas, Minnesota, Missouri, Nebraska, and Wisconsin), the ACT reported readiness for Black, Latinx, and White students as the percentage of students in the graduating cohort who met three or four ACT college-ready benchmarks, whereas the data for all students reflects the percentage of students in the high school cohort who met all four benchmarks.
  • The seventh state, Oklahoma, reported disaggregated readiness rates for 11th and 12th graders combined, whereas the rate for all students included 12th graders only.

Postsecondary Enrollment Rates

1. Data Sources

We gathered state-reported, statewide postsecondary enrollment data for recent high school graduates for 49 states, including Washington, DC (summarized in Table 3 below). 

  • For 31 of these states, we used statewide postsecondary enrollment data published on state report cards, as required by ESSA.  
  • For 18 states, we used statewide postsecondary enrollment data published by the SEA on another part of its website or available on other state websites (e.g., the state agency supervising or governing higher education institutions).
    • Two of these states, Arkansas and Iowa, also reported statewide college-going data on their state report cards, but we did not use this data for our analysis. Arkansas published college-going rates for recent high school graduates attending in-state institutions on its report cards, while the college-going rates published by the Arkansas Division of Higher Education include both in-state and out-of-state institutions. Iowa published postsecondary enrollment data combined for the three most recent graduating cohorts on its report cards; however, we used separate data from the Iowa Department of Education for the high school graduating Class of 2019. 
    • Our Alabama data came from a nonprofit research center, the Public Affairs Research Council of Alabama (PARCA), which receives its data from the Alabama Commission on Higher Education (ACHE); we used the PARCA data because the postsecondary enrollment data for the high school graduating cohort in our analysis was no longer available on the ACHE website.
  • Notably, 35 of the 49 states (over 70%) reporting postsecondary enrollment data indicated they produced the data for recent high school graduates by querying the National Student Clearinghouse, a nonprofit education organization that gathers student enrollment information at colleges and universities nationwide.  

For the remaining two states (New Mexico and New York), enrollment data for recent high school graduates in the state was not reported on any state government website. Instead, we calculated estimates for college enrollment of recent high school graduates in these states using data from the Current Population Survey School Enrollment Supplement, a national survey sponsored by the National Center for Education Statistics, the Bureau of Labor Statistics, and the Census Bureau. To calculate estimated rates for both states, we used the estimated number of students who completed high school as the denominator and the number of students who completed high school and enrolled in a college or university in the fall immediately after high school graduation as the numerator.
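As a rough illustration of that calculation (with made-up survey counts, not actual Current Population Survey figures):

```python
# Hedged sketch of the enrollment-rate estimate described above:
# denominator = estimated recent high school completers;
# numerator = completers who enrolled in college the fall immediately
# after graduation. The counts here are hypothetical, not CPS data.

def estimated_enrollment_rate(completers, completers_enrolled):
    """Estimated college-going rate for recent high school graduates."""
    if completers == 0:
        raise ValueError("no high school completers in sample")
    return completers_enrolled / completers

# e.g., 20,000 estimated completers, 12,400 enrolled by the fall
rate = estimated_enrollment_rate(20_000, 12_400)
print(f"{rate:.1%}")  # -> 62.0%
```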

2. College Enrollment Period

States reported postsecondary enrollment data for recent high school graduates within different periods of time, ranging from the fall immediately following high school graduation to two years after high school graduation. At least 16 states tracked subsequent college enrollment at several points in time for an individual high school class; for those states, we used data for the shortest enrollment period available. (Our data sources for postsecondary enrollment rates may not be the only places such data is available; additional states may report subsequent college enrollment data elsewhere and thus are not counted here.)

Specifically, 42 states reported the percentage of high school graduates entering higher education within a year after high school graduation, capturing college-going scenarios where students begin in the fall and/or spring semesters. 

  • About half of these states (22) reported college enrollment for the high school cohort by the fall semester following high school graduation. 
  • Four states, including Washington, DC, reported college enrollment within six months of graduation.
  • For another 16 states, postsecondary enrollment data included students who enrolled within one year of high school graduation. 

In contrast, nine states reported the percentage of students entering postsecondary education within a longer period of time. 

  • In seven states, the enrollment period is within 16 months of graduation.
  • In two states, the enrollment period is within two years of graduation.

3. Higher Education Institutions Included

For 44 states, including Washington, DC, our postsecondary enrollment data included high school graduates who entered public or private, four-year or two-year institutions nationwide, largely thanks to the National Student Clearinghouse, whose repository of postsecondary data covers 97% of all students enrolled in public and private institutions across the country.

However, seven states reported postsecondary enrollment data for high school graduates attending a more limited swath of institutions. 

  • Three states (Florida, Oklahoma, and Texas) reported postsecondary enrollment data for recent high school graduates attending in-state institutions only. 
  • Another three states (Mississippi, Montana, and Utah) reported enrollment data for students attending in-state public institutions only. 
  • Wyoming’s data is even more limited; it reported postsecondary enrollment rates for recent graduates attending in-state community colleges only. 

For these states, our postsecondary enrollment rates are likely an undercount to varying degrees because the data excludes students attending out-of-state colleges and (in some cases) certain in-state institutions of higher education.

4. Disaggregated Data for Black, Latinx, and White Students

We collected postsecondary enrollment data for Black, Latinx, and White students for 38 states (including Washington, DC) from the same data sources where we collected college-going data for all students. However, we were not able to find disaggregated postsecondary enrollment data for 13 states: Alabama, Idaho, Illinois, Kentucky, Missouri, Nebraska, New Mexico, New York, Oklahoma, South Carolina, Tennessee, Utah, and Wyoming.

In four of the 38 states (Arizona, Arkansas, Delaware, and Kansas), college enrollment data reported for Black, Latinx, and White students was different from the data for all students. The disaggregated data included a limited swath of institutions, whereas the data for all students included all types of institutions nationwide. 

  • Arizona reported disaggregated college enrollment data for students attending four-year institutions only.
  • Arkansas reported disaggregated college enrollment data for students attending in-state institutions only.
  • Delaware and Kansas reported disaggregated college enrollment data for students attending in-state public institutions only.

Table 3. Overview of Statewide Postsecondary Enrollment Data in Our Analysis

| State | Data Source | College Enrollment Period After High School Graduation | Higher Education Institutions Included | Disaggregated Data for Black, Latinx, and White Students |
| --- | --- | --- | --- | --- |
| Alabama | Public Affairs Research Council of Alabama (which gets data from Alabama Commission on Higher Education) | Fall after graduation | All institutions | No |
| Alaska | State report card | Within one year of graduation | All institutions | Yes |
| Arizona | Arizona Board of Regents | Within one year of graduation | All institutions | Yes |
| Arkansas | Arkansas Division of Higher Education | Fall after graduation | All institutions | Yes |
| California | California Department of Education | Within one year of graduation | All institutions | Yes |
| Colorado | Colorado Department of Higher Education | Fall after graduation | All institutions | Yes |
| Connecticut | State report card | Within one year of graduation | All institutions | Yes |
| Delaware | State report card | Fall after graduation | All institutions | Yes |
| Florida | State report card | Within one year of graduation | In-state institutions | Yes |
| Georgia | Governor's Office of Student Achievement | Within one year of graduation | All institutions | Yes |
| Hawaii | State report card | Fall after graduation | All institutions | Yes |
| Idaho | Idaho State Board of Education | Fall after graduation | All institutions | No |
| Illinois | State report card | Within one year of graduation | All institutions | No |
| Indiana | Indiana Commission for Higher Education | Within one year of graduation | All institutions | Yes |
| Iowa | Iowa Department of Education | Fall after graduation | All institutions | Yes |
| Kansas | Kansas Board of Regents | Within one year of graduation | All institutions | Yes |
| Kentucky | State report card | Within six months of graduation | All institutions | No |
| Louisiana | State report card | Fall after graduation | All institutions | Yes |
| Maine | State report card | Within one year of graduation | All institutions | Yes |
| Maryland | State report card | Within one year of graduation | All institutions | Yes |
| Massachusetts | State report card | Within one year of graduation | All institutions | Yes |
| Michigan | State report card | Within six months of graduation | All institutions | Yes |
| Minnesota | State report card | Within 16 months of graduation | All institutions | Yes |
| Mississippi | State report card | Within one year of graduation | In-state public institutions | Yes |
| Missouri | State report card | Within six months of graduation | All institutions | No |
| Montana | State report card | Fall after graduation | In-state public institutions | Yes |
| Nebraska | State report card | Fall after graduation | All institutions | No |
| Nevada | Nevada Department of Education | Fall after graduation | All institutions | Yes |
| New Hampshire | State report card | Fall after graduation | All institutions | Yes |
| New Jersey | State report card | Fall after graduation | All institutions | Yes |
| New Mexico | Current Population Survey | Fall after graduation | All institutions | No |
| New York | Current Population Survey | Fall after graduation | All institutions | No |
| North Carolina | State report card | Within 16 months of graduation | All institutions | Yes |
| North Dakota | State report card | Within one year of graduation | All institutions | Yes |
| Ohio | Ohio Department of Education | Within two years of graduation | All institutions | Yes |
| Oklahoma | Oklahoma State Regents for Higher Education | Fall after graduation | In-state institutions | No |
| Oregon | State report card | Within 16 months of graduation | All institutions | Yes |
| Pennsylvania | State report card | Within 16 months of graduation | All institutions | Yes |
| Rhode Island | State report card | Within one year of graduation | All institutions | Yes |
| South Carolina | State report card | Fall after graduation | All institutions | No |
| South Dakota | State report card | Within 16 months of graduation | All institutions | Yes |
| Tennessee | State report card | Fall after graduation | All institutions | No |
| Texas | Texas Education Agency | Fall after graduation | In-state institutions | Yes |
| Utah | State report card | Within two years of graduation | In-state institutions | No |
| Vermont | State report card | Within 16 months of graduation | All institutions | Yes |
| Virginia | State report card | Within 16 months of graduation | All institutions | Yes |
| Washington | Washington's Education Research and Data Center | Within one year of graduation | All institutions | Yes |
| Washington, DC | State report card | Within six months of graduation | All institutions | Yes |
| West Virginia | West Virginia Higher Education Policy Commission | Fall after graduation | All institutions | Yes |
| Wisconsin | Wisconsin Department of Public Instruction | Fall after graduation | All institutions | Yes |
| Wyoming | Wyoming Community College Commission | Fall after graduation | In-state community colleges | No |

Remediation Rates

1. Data Sources

Unlike postsecondary enrollment data, college remediation data is not a required reporting element under ESSA: states do not have to report the percentage of recent high school graduates needing remediation in higher education. We gathered state-level college remediation rate data for recent high school completers for 28 states but could not find similar data in the other 23 states (including Washington, DC).[7] The remediation rate data we used was published by SEAs, state agencies supervising or governing higher education institutions, or statewide longitudinal data systems (summarized in Table 4 below).

[7] Twenty-six of the 28 states reported the percentage of students taking remedial courses. The remaining two states, Arkansas and Maryland, reported the percentage of students assessed to need remediation, which may be different from the percentage of students actually taking remedial courses.

2. Higher Education Institutions Included

For all states where remediation data was available, the data only included students enrolling in in-state institutions, whereas most states included all higher education institutions when reporting college enrollment data. 

  • Four states (Arkansas, Delaware, Michigan, and Minnesota) reported data for students attending all in-state institutions. 
  • The majority (21 states) reported data for in-state public institutions only.
  • Two states (Illinois and Maine) reported data for in-state community colleges only.
  • Washington included in-state four-year universities only when reporting remediation rates.

3. Remedial Course Subjects

Twenty-five of the 28 states reported the percentage of students who took remedial education in either math or English (i.e., writing, reading, or English courses). The remaining three states, Georgia, Hawaii, and Utah, did not report an overall remediation rate; instead, they reported separate percentages of students taking remedial math and remedial English. In these states (and in every other state where remediation data was available separately by subject), the percentage of students taking remedial math was higher than the percentage taking remedial English, meaning the math remediation rate is closer to the overall remediation rate. Therefore, in this analysis, we used the math remediation rate as a proxy for the overall remediation rate in Georgia, Hawaii, and Utah.
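The proxy choice described above can be expressed as a simple rule. The sketch below uses hypothetical subject-level rates (the report's actual figures are not reproduced here), and the function name is ours, not the report's.

```python
# Sketch of the proxy rule used for Georgia, Hawaii, and Utah: when only
# subject-level remediation rates are reported, take the larger of the two.
# Because every student needing remediation in any subject counts toward the
# overall rate, the larger subject rate is a lower bound on (and the closest
# available approximation of) the overall rate. In the report's data, the
# math rate was always the larger one. Rates below are hypothetical.

def overall_remediation_proxy(math_rate: float, english_rate: float) -> float:
    """Use the larger subject-level rate as a lower-bound proxy."""
    return max(math_rate, english_rate)

print(overall_remediation_proxy(34.0, 21.0))  # the math rate is used
```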

4. College Enrollment and Remedial Course Enrollment Periods

Some states specified that their remediation rate data covered high school graduates who enrolled in higher education within a certain period after completing high school.

  • Eight states (Alabama, Maryland, Missouri, New Mexico, Ohio, Oklahoma, South Dakota, and West Virginia) reported remediation data for students who enroll in colleges in the fall semester following high school graduation. 
  • Seven states (Georgia, Iowa, Kansas, Michigan, Nevada, North Dakota, and Texas) reported data for students who enroll in colleges within one year after high school graduation. 
  • Three states (Connecticut, Mississippi, and Montana) reported remediation data for students who enroll in colleges within 16 months of graduation. 
  • Only Minnesota reported remediation data for students who enroll in colleges within two years of graduation.
  • The remaining nine states did not specify a postsecondary enrollment period. 

Even fewer states specified whether recent high school completers took remedial courses within a certain period of time of enrolling in college.

  • Maine reported remediation rate data for students who enroll in remedial courses during their first semester of college. 
  • Four states (Colorado, Mississippi, South Dakota, and Texas) reported data for students who take remedial courses in the first year of enrollment.
  • Connecticut reported data for students who take remedial courses in the first two years of enrollment. 
  • The remaining 22 states did not specify a remedial course enrollment period.

5. Disaggregated Data for Black, Latinx, and White Students

Among the 28 states that reported remediation data for all high school completers, only 10 states reported data disaggregated for Black, Latinx, and White students: Connecticut, Delaware, Indiana, Iowa, Michigan, Minnesota, Missouri, Montana, Texas, and Washington. With such a small sample size, and given the inconsistencies in the data noted above, we did not run any analyses involving disaggregated remediation data.

Table 4. Overview of Statewide Remediation Data in Our Analysis

| State | Data Source | Higher Education Institutions Included | College Enrollment Period After High School Graduation | Remedial Course Enrollment Period | Disaggregated Data for Black, Latinx, and White Students |
| --- | --- | --- | --- | --- | --- |
| Alabama | Alabama Commission on Higher Education | In-state public institutions | Fall after graduation | Unspecified | No |
| Arkansas | State report card | In-state institutions | Unspecified | Unspecified | No |
| Colorado | Colorado Department of Higher Education | In-state public institutions | Unspecified | First year | No |
| Connecticut | Connecticut Preschool Through 20 and Workforce Information Network | In-state public institutions | Within 16 months of graduation | First two years | Yes |
| Delaware | Delaware Department of Education | In-state institutions | Unspecified | Unspecified | Yes |
| Georgia | Governor's Office of Student Achievement | In-state public institutions | Within one year of graduation | Unspecified | No |
| Hawaii | Hawaii Data eXchange Partnership | In-state public institutions | Unspecified | Unspecified | No |
| Illinois | State report card | In-state community colleges | Unspecified | Unspecified | No |
| Indiana | Indiana Commission for Higher Education | In-state public institutions | Unspecified | Unspecified | Yes |
| Iowa | Iowa Department of Education | In-state public institutions | Within one year of graduation | Unspecified | Yes |
| Kansas | Kansas Board of Regents | In-state public institutions | Within one year of graduation | Unspecified | No |
| Maine | Maine Community College System | In-state community colleges | Unspecified | First semester | No |
| Maryland | Maryland Higher Education Commission | In-state public institutions | Fall after graduation | Unspecified | No |
| Michigan | MI School Data | In-state institutions | Within one year of graduation | Unspecified | Yes |
| Minnesota | Minnesota Office of Higher Education | In-state institutions | Within two years of graduation | Unspecified | Yes |
| Mississippi | State report card | In-state public institutions | Within 16 months of graduation | First year | No |
| Missouri | Missouri Department of Higher Education and Workforce Development | In-state public institutions | Fall after graduation | Unspecified | Yes |
| Montana | Montana Office of Public Instruction | In-state public institutions | Within 16 months of graduation | Unspecified | Yes |
| Nevada | Nevada System of Higher Education | In-state public institutions | Within one year of graduation | Unspecified | No |
| New Mexico | New Mexico Higher Education Department | In-state public institutions | Fall after graduation | Unspecified | No |
| North Dakota | ND Insight | In-state public institutions | Within one year of graduation | Unspecified | No |
| Ohio | Ohio Department of Higher Education | In-state public institutions | Fall after graduation | Unspecified | No |
| Oklahoma | Oklahoma State Regents of Higher Education | In-state public institutions | Fall after graduation | Unspecified | No |
| South Dakota | South Dakota Board of Regents | In-state public institutions | Fall after graduation | First year | No |
| Texas | Texas Education Agency | In-state public institutions | Within one year of graduation | First year | Yes |
| Utah | Utah System of Higher Education | In-state public institutions | Unspecified | Unspecified | No |
| Washington | Washington's Education Research and Data Center | In-state four-year universities | Unspecified | Unspecified | Yes |
| West Virginia | West Virginia Higher Education Policy Commission | In-state public institutions | Fall after graduation | Unspecified | No |

Findings

Finding 1

Across the country, statewide college and career readiness rates vary more than postsecondary enrollment rates and the rates of students bypassing remedial coursework in college.

State education leaders have been working toward the same goal: equipping students with the knowledge and skills needed to succeed beyond high school. Although it is not surprising to see disparities in educational outcomes across states, we found much greater variation in statewide CCR rates than in immediate postsecondary enrollment rates and the rates of recent high school graduates bypassing remedial coursework in college (Figure 4). In our data set, the percentage of students deemed college and career ready ranged from 18% to 89% across states, a whopping 71-percentage-point difference.[8] Even when we excluded the 16 states where our readiness data was based on ACT or SAT scores alone, the gap was still wide: 57 percentage points (32% to 89%). In other words, in one state, only one in three students in the graduating class was prepared for college and a career, whereas in another state, nine in ten students were considered ready for postsecondary opportunities.

[8] Arizona, which had the lowest CCR rate in our data set, did not include CCR indicators in its accountability system and did not report any readiness data for high school at the state level. Therefore, we used the percentage of students in the graduating cohort meeting all four college-ready benchmarks on the ACT, as reported on Arizona's ACT state report. See "About Our Data" for more information.

In contrast, differences in postsecondary enrollment rates for recent high school completers were much smaller. Across states, the percentage of high school completers who enrolled in college following high school graduation ranged from 35% to 75%, a 40-percentage-point difference.[9] Meanwhile, across the 28 states where remediation data for recent high school completers was available, the percentage of recent completers who could enroll in credit-bearing coursework without needing remediation ranged from 35% to 91%, a 56-percentage-point difference.

[9] Wyoming, which had the lowest postsecondary enrollment rate in our data set, only reported college enrollment for recent graduates attending in-state community colleges, while most states reported enrollment data for students attending colleges and universities nationwide. Had Wyoming reported postsecondary enrollment at additional institutions, the 40-percentage-point range across states in our data set would have been smaller. See "About Our Data" for more information.

Figure 4. Variation in Statewide Readiness, Postsecondary Enrollment, and Remediation Rates for Recent High School Completers

Admittedly, students’ performance on all sorts of educational indicators is associated with demographic factors that naturally vary from state to state. However, state-by-state differences of such magnitude, especially in CCR rates, indicate that state policies may play a critical role in accounting for these gaps. In particular, unlike high school graduation rates, states measure college and career readiness in wildly different ways. Because there is no standardized or uniform way to measure readiness, states have developed their own approaches, using different indicators, including different early postsecondary opportunities, and setting different expectations for student success.

Finding 2

Black and Latinx students are less likely to be considered ready for postsecondary experiences than White students, and readiness gaps are more pronounced for Black students.

Forty-three states, including Washington, DC, reported readiness data for Black, Latinx, and White students, as well as other racial/ethnic student groups. Across these states, average readiness rates were lower for Black and Latinx students than for White students: on average, 33% of Black students and 41% of Latinx students in a state were deemed college and career ready, compared to 61% of White students.

Additionally, as Figure 5 demonstrates, there was more variation in readiness rates for students of color than for White students: a 72-percentage-point range for both Black and Latinx students, compared to a 50-percentage-point range for White students. Specifically, readiness rates ranged from 6% to 78% for Black students and from 14% to 86% for Latinx students.[10] In contrast, the lowest statewide readiness rate for White students was 40%, compared to a high of 90%.

[10] Wisconsin and Nebraska had the lowest readiness rates for Black and Latinx students, respectively, in our data set. Neither state included college and career readiness indicators in its accountability system, and neither reported any readiness data for high school at the state level. Therefore, we used the percentage of Black, Latinx, and White students in the graduating cohort meeting three or four college-ready benchmarks on the ACT, as reported on Wisconsin's and Nebraska's ACT state reports. Note that ACT does not report the percentage of subgroups of students meeting all four benchmarks. See "About Our Data" for more information.

Figure 5. Variation in Statewide Readiness Rates for Black, Latinx, and White Students

We also compared the percentage of each state’s Black and Latinx students deemed college and career ready in our data set with the percentage of White students in the state who were ready (Figure 6). In every state, Black students were less likely to demonstrate college and career readiness compared to White students, and in every state except Vermont, Latinx students were less likely to demonstrate readiness compared to White students.

Figure 6. Readiness Gaps Between Black and White Students and Between Latinx and White Students, by State

Although Black and Latinx students were less likely to be considered college and career ready in our data set than White students in nearly every state, the readiness gap was more severe for Black students than for Latinx students (Table 5). The Black-White readiness gap was never smaller than 10 percentage points, while five states had a Latinx-White gap below that threshold. On the other end of the spectrum, the gap was more than 40 percentage points in six states (including Washington, DC) for Black students, but only in Washington, DC for Latinx students.
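The gap categories in the table that follows come from a simple binning of each state's White-minus-group readiness difference. A minimal sketch of that binning, using hypothetical state rates (the report's state-by-state figures are not reproduced here) and half-open bins at the boundaries, which the report does not specify:

```python
# Illustrative sketch of how readiness gaps are sorted into the categories
# used in Table 5. The state rates below are hypothetical examples, not the
# report's actual data, and the boundary handling (half-open bins) is an
# assumption on our part.

def gap_category(white_rate: float, group_rate: float) -> str:
    """Classify the readiness gap (White rate minus group rate), in points."""
    gap = white_rate - group_rate
    if gap <= 0:
        return "No gap"
    if gap < 10:
        return "Less than 10 percentage points"
    if gap < 20:
        return "Between 10 and 20 percentage points"
    if gap < 30:
        return "Between 20 and 30 percentage points"
    if gap < 40:
        return "Between 30 and 40 percentage points"
    return "More than 40 percentage points"

# Hypothetical statewide readiness rates (percent), for illustration only.
states = {
    "State A": {"White": 61, "Black": 33, "Latinx": 41},
    "State B": {"White": 55, "Black": 47, "Latinx": 56},
}

for state, rates in states.items():
    for group in ("Black", "Latinx"):
        print(state, group, gap_category(rates["White"], rates[group]))
```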

Table 5. Comparing the Size of Readiness Gaps Between Black and White Students and Between Latinx and White Students Across States

| Size of Readiness Gap | Number of States with a Black-White Readiness Gap of This Size | Number of States with a Latinx-White Readiness Gap of This Size |
| --- | --- | --- |
| No gap (readiness rate for Black or Latinx students equal to or higher than for White students) | 0 | 1 |
| Less than 10 percentage points | 0 | 5 |
| Between 10 and 20 percentage points | 11 | 17 |
| Between 20 and 30 percentage points | 12 | 13 |
| Between 30 and 40 percentage points | 14 | 6 |
| More than 40 percentage points | 6 | 1 |

Moreover, the readiness gaps between students of color and White students tended to be larger in states where our data relied on student performance on college entrance exams (Figure 7). Among the 43 states that reported readiness data for Black, Latinx, and White students, our data set relied on student performance on the ACT or SAT as the primary measure of readiness in 16 states (including Washington, DC). In these states (shown in dark purple in Figure 7), the readiness gap for Black students was more than 20 percentage points in all but one state. For Latinx students, the gap was larger than 20 percentage points in 12 of the 16 states (including Washington, DC).

Figure 7. Readiness Gaps Between Black and White Students and Between Latinx and White Students, by State and Type of Readiness Measures

Finding 3

Recent Black and Latinx high school completers are less likely to enroll in postsecondary education than White students, and college-going gaps are larger for Latinx students.

We gathered postsecondary enrollment data for Black, Latinx, and White students in 38 states (including Washington, DC).[11] Across these states, postsecondary enrollment rates for Black and Latinx students were lower, on average, than for White students, just as postsecondary readiness rates were (see Finding 2). On average, 53% of Black students and 49% of Latinx students who recently completed high school in a state went on to enroll in college, compared to 62% of White students.

[11] We were not able to find disaggregated postsecondary enrollment data for the remaining 13 states. See "About Our Data" for more information.

However, unlike statewide readiness rates (see Figure 5 above), the variation in postsecondary enrollment rates across states was fairly consistent for all three student groups. As Figure 8 shows, postsecondary enrollment rates ranged from 26% to 70% for Black students; 20% to 61% for Latinx students; and 37% to 78% for White students. In other words, for all three groups of students, the range for postsecondary enrollment rates was roughly 40 percentage points, whereas the variation in CCR rates was much wider for Black and Latinx students than for White students.

Figure 8. Variation in Statewide Postsecondary Enrollment Rates for Black, Latinx, and White Students

The variation in college enrollment rates among different racial groups depicted in Figure 8 was also smaller than the variation in readiness rates among Black, Latinx, and White students shown in Figure 5. The readiness rate range for White students (50 percentage points)—the narrowest of any racial group—was wider than the postsecondary enrollment rate range for any group of students. This further reinforces our earlier finding (Finding 1) that overall college and career readiness rates vary much more than postsecondary enrollment rates, where states tend to use more similar measures and data definitions.

We also compared the percentage of each state's recent Black and Latinx high school completers who enroll in college with the percentage of White students who do so. As Figure 9 shows, in 36 of 38 states, a lower percentage of Black graduates than White graduates enrolled in higher education, and in 37 of 38 states, a lower percentage of Latinx graduates than White graduates did so.

Figure 9. Postsecondary Enrollment Gaps Between Black and White Students and Between Latinx and White Students, by State

Although both Black and Latinx students were less likely to enroll in college than White students, the share of Latinx students enrolling in college lagged further behind in most states. This is the opposite of the trend we observed in our college and career readiness data, where Black students were least likely to be deemed ready among the three groups (see Figure 6). As Table 6 shows, college-going gaps between Black and White students were smaller than 10 percentage points in 14 states, between 10 and 20 percentage points in 20 states, and larger than 20 percentage points in two states (including Washington, DC). However, gaps in college enrollment for Latinx students were under 10 percentage points in just eight states, between 10 and 20 percentage points in 20 states, and above 20 percentage points in nine states (including Washington, DC).

Table 6. Comparing the Size of Postsecondary Enrollment Gaps Between Black and White Students and Between Latinx and White Students Across States

| Size of Postsecondary Enrollment Gap | Number of States with a Black-White Postsecondary Enrollment Gap of This Size | Number of States with a Latinx-White Postsecondary Enrollment Gap of This Size |
| --- | --- | --- |
| No gap (postsecondary enrollment rate for Black or Latinx students equal to or higher than for White students) | 2 | 1 |
| Less than 10 percentage points | 14 | 8 |
| Between 10 and 20 percentage points | 20 | 20 |
| More than 20 percentage points | 2 | 9 |

Finding 4

Many states’ college and career readiness indicators are undermeasuring students’ potential to enroll in college.

Although K–12 leaders hope their graduates leave high school ready for college and a career—and typically select CCR indicators using measures associated with better odds of succeeding in postsecondary education—students are not required to be college and career ready to enroll in college. Perhaps unsurprisingly, the percentage of students deemed college and career ready was lower than the percentage of recent high school completers enrolling in higher education in 34 states (including Washington, DC) in our data set. As Figure 10 shows, the degree of undermeasuring varied from state to state, ranging from 5 percentage points in Pennsylvania to 53 percentage points in Nebraska.

Conversely, in 17 states, the percentage of students deemed college and career ready was higher than the percentage of recent high school completers ultimately enrolling in college. The size of the gap in these states also varied, from 1 percentage point in Delaware to 44 percentage points in Idaho. Note that seven states (Florida, Mississippi, Montana, Oklahoma, Texas, Utah, and Wyoming) reported postsecondary enrollment data for high school graduates attending a limited swath of institutions, as opposed to nearly all universities and colleges nationwide.[12] As a result, the postsecondary enrollment data we used for those states is likely an undercount. If a more comprehensive set of institutions had been included in the data, it is possible that the statewide readiness rates would have been lower than the college-going rates in Florida, Montana, Oklahoma, Texas, Utah, and Wyoming, and that the undermeasuring phenomenon would have been even more severe in Mississippi.

[12] Postsecondary enrollment data for these seven states excluded students attending out-of-state colleges and, in some cases, certain in-state institutions of higher education. See "About Our Data" for more information.
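Put simply, the undermeasuring comparison is the enrollment rate minus the readiness rate: a positive difference means more graduates enrolled in college than were deemed ready. A minimal sketch, using hypothetical state rates rather than the report's actual figures:

```python
# Sketch of the comparison in Finding 4: a positive difference means the
# state's CCR indicator undercounts its graduates' college enrollment.
# The rates below are hypothetical examples, not the report's data.

def undermeasure_gap(enrollment_rate: float, readiness_rate: float) -> float:
    """Percentage points by which the CCR rate undercounts enrollment."""
    return enrollment_rate - readiness_rate

hypothetical = {
    "State A": {"readiness": 40, "enrollment": 62},  # readiness undercounts
    "State B": {"readiness": 70, "enrollment": 58},  # readiness exceeds enrollment
}

for state, r in hypothetical.items():
    gap = undermeasure_gap(r["enrollment"], r["readiness"])
    label = "undermeasures" if gap > 0 else "does not undermeasure"
    print(f"{state}: {gap:+} points ({label})")
```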

Figure 10. Comparing Readiness Rates to Postsecondary Enrollment Rates Among All Students, by State

Finding 5

The degree to which college and career readiness indicators undermeasure college enrollment is more severe in states where our data relied exclusively on college entrance exams.

In 16 states (including Washington, DC), student performance on the ACT or SAT was the best publicly available statewide CCR measure we could find for this analysis.[13] These states are shown in dark green in Figure 11. In all of these states except Arkansas, the percentage of students deemed college and career ready in our data set was lower than the percentage of recent high school completers who ultimately enrolled in college; further, nine of the 10 states with the largest gaps between readiness rates and college-going rates had readiness data limited to ACT or SAT performance.

[13] In 12 of the 16 states, we could not find the rate of students deemed college and career ready on any state website. Instead, we used data from the ACT or College Board in the absence of state-reported data. In Arkansas, Washington, DC, Michigan, and Mississippi, the state administered the ACT or SAT to all students and reported student results as one of its readiness measures. See "About Our Data" for more information.

For the remaining 35 states in our data set, the best available readiness rate reported statewide considered other indicators, often a basket of measures that incorporated multiple data points (sometimes including the ACT or SAT) or a single measure unique to the state (such as earning the state’s advanced diploma). A little more than half of those states (19) undermeasured their students’ postsecondary potential, reporting a lower percentage of high school completers deemed prepared than the percentage of students enrolling in college. The reverse was true for the other 16 states.

Keep in mind, the ACT and College Board college readiness benchmarks are the minimum scores those organizations have determined students need in order to have a good chance of success in first-year, credit-bearing college courses. Because some colleges and universities do not require ACT or SAT scores for admission, or use different minimum benchmarks in admissions decisions, it is not entirely surprising that the percentage of students earning benchmark scores on the ACT or SAT may undermeasure students' ability to enroll in college. That said, high school students' performance on college entrance exams is a good predictor of their success in college-level coursework, not just of whether they enroll in postsecondary education. Performance on these exams is therefore a key readiness indicator that education leaders should monitor and strive to improve.

Figure 11. Comparing Readiness Rates to Postsecondary Enrollment Rates Among All Students, by State and Type of Readiness Measures

Finding 6

College and career readiness measures are more likely to undermeasure Black and Latinx students’ ability to enroll in college, compared to their White peers.

We also compared statewide college and career readiness rates and postsecondary enrollment rates for Black, Latinx, and White students in the 31 states (including Washington, DC) where both data points were disaggregated by racial/ethnic student group.[14] States were more likely to report lower readiness rates than postsecondary enrollment rates for Black and Latinx students than for White students. As Figure 12 shows, 26 states (including Washington, DC) undermeasured Black graduates' likelihood of enrolling in college, and 22 states (including Washington, DC) did so for Latinx graduates. Fewer states (19) reported lower CCR rates than college-going rates for White students.

[14] In the remaining 20 states, one or both data points were not available for Black, Latinx, and White students. See "About Our Data" for more information.

Figure 12. Comparing Readiness Rates to Postsecondary Enrollment Rates for Black, Latinx, and White Students, by State

That said, the undermeasuring phenomenon was more severe for Black students than for other student groups. For example, as Table 7 demonstrates, in 11 states, the readiness-enrollment gap was more than 30 percentage points for Black students, while only five states had a gap that large for Latinx students and no states had that large of a gap for White students.

Table 7. Comparing the Gap Between Postsecondary Readiness and Postsecondary Enrollment for Black, Latinx, and White Students Across States

| Size of the Gap (Postsecondary Readiness Rate Lower Than Enrollment Rate) | Number of States: Black Students | Number of States: Latinx Students | Number of States: White Students |
| --- | --- | --- | --- |
| No gap (readiness rate equal to or higher than postsecondary enrollment rate) | 5 | 9 | 12 |
| Less than 10 percentage points | 2 | 4 | 7 |
| Between 10 and 20 percentage points | 6 | 7 | 6 |
| Between 20 and 30 percentage points | 7 | 6 | 6 |
| More than 30 percentage points | 11 | 5 | 0 |
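The comparison behind Table 7 is simple arithmetic: for each state, subtract the readiness rate from the enrollment rate and bin the difference. A minimal sketch of that bucketing follows; the state names and rates are hypothetical, not the report’s underlying data (and the treatment of boundary values like exactly 10 points is our assumption, since the table does not specify it).

```python
def gap_bucket(readiness_rate, enrollment_rate):
    """Classify the readiness-enrollment gap, in percentage points,
    using bins like Table 7's. A positive gap means the readiness
    measure undercounts students who actually enroll."""
    gap = enrollment_rate - readiness_rate
    if gap <= 0:
        return "no gap"
    elif gap < 10:
        return "less than 10 points"
    elif gap <= 20:
        return "10 to 20 points"
    elif gap <= 30:
        return "20 to 30 points"
    else:
        return "more than 30 points"

# Hypothetical (readiness %, enrollment %) pairs for three states.
states = {"State A": (38, 72), "State B": (60, 55), "State C": (50, 62)}
buckets = {name: gap_bucket(r, e) for name, (r, e) in states.items()}
# State A lands in the "more than 30 points" bin; State B shows no gap.
```

Counting how many states land in each bin, separately for Black, Latinx, and White students, yields a tally in the shape of Table 7.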

Moreover, the trend of undermeasuring in Finding 5 for all students held true among racial/ethnic groups of students: readiness rates in states where our data relied solely on the ACT or SAT exams were an especially poor proxy for whether Black and Latinx students enrolled in college. Out of the 31 states (including Washington, DC) where both postsecondary readiness and enrollment data for racial/ethnic groups were reported, ACT or SAT scores were the best available measure in 14 states (including Washington, DC); these states are shown in blue in Figure 13.

Among Black students, all 14 states reported lower readiness rates than postsecondary enrollment rates. Similarly, all but one state (Alaska) had a lower readiness rate than college-going rate for Latinx students. For White students, the number was slightly lower; 10 out of 14 states reported lower statewide readiness rates than postsecondary enrollment rates. Moreover, seven out of the 10 states where undermeasuring was most prominent for Black and Latinx students were those where ACT or SAT data was the only measure used in our data set. For White students, six of the 10 states with the largest readiness-enrollment gaps came from this group.

Figure 13. Comparing Readiness Rates to Postsecondary Enrollment Rates for Black, Latinx, and White students, by State and Type of Readiness Measures

Finding 7

Many states’ college and career readiness indicators are undermeasuring students’ potential to bypass remedial coursework in college.

Many states’ college and career readiness rates are also undermeasuring students’ preparedness for college-level credit-bearing courses. In 25 of the 28 states where statewide remediation data was available, the percentage of high school completers deemed college and career ready was lower than the percentage of those students who could enter credit-bearing courses without remediation. All 28 states reported remediation data for students attending a limited swath of institutions of higher education. In particular, no state reported remediation rates for students attending out-of-state colleges or universities.[15] (All 28 states reported remediation data for recent high school completers attending either all in-state institutions or a subset of in-state institutions. See “About Our Data” for more information.) We did not speculate how close, for any given state, the reported remediation rate was to the true rate (i.e., the percentage of recent high school completers who enrolled in any type of higher education institution in the country and could complete credit-bearing courses without needing remediation).

As Figure 14 shows, among the 25 states, the gaps between readiness rates and the rates of students bypassing remedial coursework ranged from nearly zero (0.4 percentage points) in Delaware to 55 percentage points in Missouri. Keep in mind, the degree of undermeasuring could be influenced by states’ developmental education policies. Many states have started to encourage or require the use of corequisite courses, which allow students who are underprepared to take college-level coursework that integrates additional academic support. These states may be overcounting the number of students bypassing remedial courses, because some students enrolling in credit-bearing courses are not fully ready for college-level classes and still need extra support.

Figure 14. Comparing Readiness Rates and Rates of Students Bypassing Remedial Coursework Among All Students, by State

Moreover, similar to the undermeasuring trend shown in Finding 5, in states where our readiness data relied on student performance on the ACT or SAT (shown in dark blue in Figure 15), the gaps between CCR rates and rates of students bypassing remediation tended to be even bigger, compared to states where students are deemed ready in other ways. Four out of the five states with the largest gaps (Colorado, Minnesota, Kansas, and Missouri) came from this group.

Figure 15. Comparing Readiness Rates and Rates of Students Bypassing Remedial Coursework, by State and Type of Readiness Measures

Discussion and Policy Considerations

Recommendations for Statewide Data Reporting

We encourage states to publish school-, district-, and state-level data related to college and career readiness, postsecondary enrollment, and college remediation for recent high school completers—for all students and disaggregated by each subgroup of students required by ESSA—on report cards.

To better understand how well schools are preparing students for success in life after high school and better align K–12 and higher education, policymakers and the public need to see postsecondary data side by side with high school data. Accordingly, report cards, the most public-facing resource of K–12 education data for states, districts, and schools, should be a one-stop shop for stakeholders to access important pieces of information about both student readiness during high school and success after high school. While the majority of states already report some information on their report cards, states can improve the quality of public data related to college and career readiness and postsecondary outcomes by publishing more comprehensive, easy-to-digest, and accessible data.

Because our analyses are based on statewide data, our recommendations for data reporting mostly focus on state-level data reporting on state report cards. However, we recognize the importance of publishing disaggregated, school-level data, especially as school report cards may be more relevant to parents and families than statewide data and may be more actionable if they reveal resource and opportunity gaps in local communities. Every year, the Data Quality Campaign reviews school report cards from all states and evaluates whether they meet federal requirements and provide information valuable to the public, including information on college and career readiness and postsecondary enrollment.

College and Career Readiness Data

College and career readiness data can provide meaningful information for education leaders and policymakers to evaluate the preparedness of high school students and identify inequities in access and success in various postsecondary pathways across student groups. However, such data can only be helpful to the public if the data is easy to find and understand. To gather statewide readiness data for this analysis, we looked in multiple places on state websites (such as report cards, data centers for downloadable files, and state-published CCR reports), because not all states collected and published statewide readiness data and not all states published all of their data on report cards. To that end, we encourage states to:

1. Report college and career readiness data—for all students and disaggregated by each subgroup of high school students required by ESSA—on report cards.

ESSA does not explicitly require states to publish state-level college and career readiness data about high school students. However, ESSA requires states to include performance information—for all students and disaggregated by each subgroup of students—on indicators of school quality and student success (SQSS) they use in school accountability systems on state report cards. Thus, if a state chooses to incorporate CCR measures as part of the SQSS indicators in its school accountability system, it is required by the federal law to publish that data on its state report card. Yet, in examining state report cards for the 2018–19 school year,[16] (In five of the 37 states that used CCR indicators, namely California, Georgia, New Mexico, North Carolina, and Ohio, we looked at state report cards for the 2017–18 school year. See “About Our Data” for more information on the year of data used in this analysis.) we found that only 32 of the 37 states using CCR indicators for school accountability reported statewide readiness data for all students (see Table 8). The remaining five states did not include any statewide data related to the readiness indicators they use for school accountability on their state report cards. However, for the five outlier states, we found school-level data related to their readiness indicators on school report cards. Thus, it should be fairly easy for these states to publish the aggregated data at the state level. Additionally, of the 32 states that reported statewide readiness data for all students, six states did not disaggregate the data by student groups.

Table 8. Data Reporting on College and Career Readiness Accountability Indicators on State Report Cards

States That Reported State-Level Data on CCR Indicator(s) for All Students and Student Groups (26 states): Alabama, Arkansas, California, Connecticut, Delaware, Florida, Georgia, Idaho, Iowa, Kentucky, Louisiana, Massachusetts, Mississippi, New Mexico, New York, North Carolina, Oklahoma, South Dakota, Tennessee, Texas, Utah, Vermont, Washington, Washington, DC, West Virginia, Wyoming

States That Reported State-Level Data on CCR Indicator(s) for All Students But Not Student Groups (6 states): Indiana, Montana, North Dakota, Ohio, Pennsylvania, South Carolina

States That Did Not Report State-Level Data on CCR Indicator(s) (5 states): Maryland, Michigan*, Nevada, New Hampshire, Rhode Island

States That Did Not Use CCR Indicator(s) for Federal Accountability (14 states): Alaska, Arizona, Colorado, Hawaii, Illinois**, Kansas, Maine, Minnesota, Missouri, Nebraska, New Jersey, Oregon, Virginia***, Wisconsin

*Michigan’s CCR indicator had two measures: (1) percentage of students in 11th and 12th grade successfully completing advanced coursework and (2) percentage of students in the cohort enrolling in postsecondary education. Michigan reported statewide data on the second measure only, which we do not consider readiness data for the purpose of this analysis.

**Illinois reported readiness data for all students on the state report card despite not using college and career readiness indicators for accountability.

***Virginia reported readiness data overall and disaggregated by student groups on the state report card despite not using college and career readiness indicators for accountability.

Presumably, states that use readiness indicators for accountability purposes are already collecting readiness data. However, they should go one step further by publishing the readiness data for each school, each district, and the state on their report cards. Meanwhile, states that do not use CCR indicators for accountability purposes should consider adopting CCR indicators and collecting and reporting readiness data on their report cards. When reporting readiness data on report cards, states should describe their definition of college and career readiness and clearly communicate which measures are included.

Further, states should disaggregate readiness data for subgroups of students on their report cards. Disaggregated data is essential for policymakers and education leaders to identify and understand inequities in postsecondary preparedness. For example, based on our analysis of disaggregated statewide readiness rates, we found that in most states, Black and Latinx students are less likely than their White peers to be considered ready for postsecondary experiences (Finding 2).

State Examples: Florida and Virginia

Florida’s readiness indicator, “college and career acceleration,” looked at the percentage of graduates achieving a passing score on an AP, IB, or Advanced International Certificate of Education (AICE) exam; earning a grade of C- or better in a dual enrollment course; or receiving an industry-recognized credential. Its state report card provided statewide readiness data for all students and groups of students for the most recent five school years. The disaggregated data included all student groups required by ESSA (i.e., economically disadvantaged students, students from major racial and ethnic groups, children with disabilities, and English learners), plus students in foster care, students experiencing homelessness, migrant students, students from military families, female students, and male students.

Although Virginia did not use a CCR indicator for accountability, its state report card has a “College & Career Readiness” section where the state published the percentage of students in the graduating cohort earning advanced diplomas, which included additional requirements aligned to postsecondary readiness compared with Virginia’s standard diploma.[17] (In Virginia, students entering 9th grade in the 2018–19 school year and beyond are required to complete at least one of the following additional readiness options to receive advanced diplomas: (1) completing an AP, IB, honors, or dual enrollment course; (2) completing a work-based learning experience; or (3) earning a state-approved CTE credential.) The data was available for all students as well as individual groups of students, including students from major racial/ethnic groups, students with disabilities, economically disadvantaged students, English learners, migrant students, students experiencing homelessness, students from military families, students in foster care, female students, and male students.

2. Report the percentage of students in the high school cohort meeting each readiness option, as well as the overall readiness rate.

Usually, states use readiness indicators that include multiple measures, giving students various options to demonstrate their preparedness for college and careers and recognizing that different postsecondary pathways may require different kinds of preparation. For example, a state may choose to include both acquisition of an industry credential (a career-oriented measure) and passing an AP or IB exam (a college-oriented measure) in its readiness indicator. Reporting data on students completing each readiness option within a state’s CCR indicator or definition allows policymakers and education leaders to understand differences in student performance across readiness options, as well as identify inequities in access and success across student groups within each option.

Meanwhile, because some students demonstrate readiness through multiple measures, an overall readiness rate that captures all of the students deemed ready in a graduating cohort paints a bigger picture. In practice, many states report one or the other—the rates of students completing individual readiness options or an overall readiness rate—on their report cards, but far fewer report both. Additionally, in practice, a state’s overall readiness rate is usually the percentage of students deemed college or career ready. 

Further, if more states reported more nuanced readiness data, future work could explore more complicated and sophisticated analyses than those in this report. This could include examining racial gaps in postsecondary preparation across various pathways (e.g., whether Black and Latinx students are more likely to demonstrate readiness via “career-oriented” options than White students) and determining whether undermeasuring is more, or less, prominent depending on the type of readiness measure used (e.g., comparing postsecondary enrollment to college and career readiness via dual enrollment credit completion versus earning an advanced diploma versus completing a CTE pathway).

In order to better understand which pathways students are prepared for and which experiences in high school best support that preparation, states should consider reporting the percentage of students deemed college and career ready in addition to the overall readiness rate (i.e., the percentage of students deemed college or career ready). States should also report information on each individual option for demonstrating readiness if there are multiple ways to show readiness for college or a career and disaggregate that information for individual student groups.

State Example: Ohio

Ohio’s readiness indicator, “prepared for success,” had three primary measures and three bonus measures. A school or district earned 1 point for every student who completed one of the primary measures and 0.3 points for every student who completed one of the bonus measures in addition to completing one of the primary measures. Ohio reported—for all students and for student groups—the percentage of students earning 1 point, earning 1.3 points, and completing each of the six readiness measures individually. For example, for the high school Class of 2018, 27% of all students in the state earned remediation-free scores on the ACT or SAT (one of the three primary measures), but only 6% of Black students and 13% of Latinx students did. Also, 6% of all students earned an industry credential (one of the three primary measures), while 4% of Black students and 7% of Latinx students did. Such data can show racial disparities across different pathways and can help policymakers make informed decisions when they channel resources to promote equity in college and career pathways. These data points were available in a downloadable data file. To make the data easier to find, Ohio should consider adding a link to the resource in the “prepared for success” section on its state report card.
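The point scheme described above can be sketched in a few lines: 1 point per student completing a primary measure, plus 0.3 for a bonus measure completed on top of a primary one. The cohort below is hypothetical, for illustration only.

```python
def prepared_for_success_points(primary, bonus):
    """Points a school earns for one student under an Ohio-style
    scheme: 1.0 for completing any primary measure, plus 0.3 for a
    bonus measure completed in addition to a primary measure."""
    if not primary:
        return 0.0  # a bonus measure alone earns nothing
    return 1.3 if bonus else 1.0

# Hypothetical four-student cohort: (completed primary?, completed bonus?)
cohort = [(True, False), (True, True), (False, True), (False, False)]
total = sum(prepared_for_success_points(p, b) for p, b in cohort)
# total == 2.3 points across the four students
```

Note the design choice the scheme encodes: bonus measures only add value for students who have already demonstrated readiness through a primary measure.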

3. Report the percentage of students in the high school cohort demonstrating college and career readiness, in addition to the points, index scores, or ratings earned on the college and career readiness indicators in the accountability system.

Not all readiness measures are equally rigorous or strong predictors of student postsecondary success. When it comes to accountability indicators, as we describe further below, we support states using their indicators to incentivize students to pursue and succeed in certain postsecondary pathways by giving completion of those pathways a higher weight, number of points, or score in their school ratings. As a result, a state’s CCR indicator can be far more complicated than a simple percentage of students in the cohort who are deemed ready for postsecondary opportunities. However, such a college- and career-ready index score, reported on its own, can be misleading and confusing to the public. For example, some states reported an index score as their overall readiness “rate,” as opposed to the actual percentage of students in the state who are deemed ready. For those states, although they technically reported statewide readiness data, we still had to look for alternative data (i.e., percentages of students completing readiness measures) that is easier to interpret.

On the other hand, for accountability purposes, states may also translate the percentage of students deemed ready into more user-friendly labels, like A–F letter grades and 1 to 5 stars. Yet even though report card users may know that a B+ grade for college and career readiness is good, they are left without knowing exactly how many students are prepared. Therefore, states should publish the underlying data—in addition to how that data translates into scores or ratings for accountability purposes—to paint a fuller picture of student preparation.

State Examples: New York and California

New York’s readiness indicator, the “college, career, and civic readiness index,” used a complex system to assign different weights to students completing different postsecondary readiness activities (from 0 to 2 points), based on the rigor of those activities. New York’s report cards provided the index scores schools, districts, and the state earned next to the actual number of students receiving each weight and the total number of students in the cohort, but it can be hard to determine whether a school’s score is exemplary, needs improvement, or falls somewhere in between. The numerator (i.e., the number of students receiving each weight) and the denominator (i.e., the number of students in total) provided report card users some important context, alongside whether a school was meeting long-term goals and measures of interim progress, but New York did not calculate the percentage of students in the high school cohort completing the individual activities associated with each weight on its report cards. Because there were usually multiple options associated with the same weighting, users could not fully distinguish between the percentage of students completing one option versus another, or the percentage of students deemed ready overall. To make the data more useful and comprehensive, we recommend New York consider adding the percentage of students achieving each readiness option to its report cards.
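One plausible reading of a weighted index like New York’s is a weighted head count divided by cohort size. The formula, weight values, and counts below are our assumptions for illustration; New York’s actual calculation may differ.

```python
def readiness_index(weight_counts, cohort_size):
    """Assumed index formula: sum over each weight of
    (weight x number of students receiving that weight),
    divided by cohort size. Illustrative only, not New York's
    published methodology."""
    return sum(w * n for w, n in weight_counts.items()) / cohort_size

# Hypothetical counts of students at each weight in a 100-student cohort.
counts = {2.0: 40, 1.5: 20, 1.0: 25, 0.5: 5, 0.0: 10}
index = readiness_index(counts, cohort_size=100)  # 1.375
```

The sketch makes the reporting gap concrete: the index collapses the per-weight counts into a single score, so without the underlying percentages, report card users cannot recover how many students completed each activity.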


In California, schools and districts received one of five color-coded performance levels on each of its accountability indicators: Blue, Green, Yellow, Orange, and Red (from highest to lowest). The state’s methodology for calculating these performance levels on the CCR indicator was somewhat complicated. First, the indicator included multiple measures, some of which needed to be achieved in combination, in order for a student to be deemed ready. Further, performance levels were determined by not only current year data, but also how the data compared to prior year data from the previous cohort. Despite this complicated method, California’s state report card clearly presented performance levels (i.e., color) on the readiness indicator for all students and for individual groups of students, alongside the number and percentage of students deemed ready. For example, English learners in the state achieved the Orange level in the 2017–18 school year. While that did not provide much specific information about their preparedness, report card users could view the accompanying data and see that 14.5% of English learners were deemed college and career ready. To further improve the state’s data reporting, California could publish the percentage of students—in the aggregate and disaggregated—completing each readiness option. Even better, this should be an easy update, because the data was already available on another webpage maintained by the agency. Although report card users can find this webpage by clicking the “View Additional Reports” button at the top of the report card page and selecting “College/Career Measures Report” from a menu of options, the agency could place a more prominent link to this report in the CCR section on its report cards to alert users about the more detailed information.

Postsecondary Enrollment Data

While readiness measures, like completion of dual enrollment and other advanced coursework, are predictors of student success beyond high school, postsecondary education and workforce data provides evidence of actual student outcomes. Although almost all states have made postsecondary enrollment rates for recent high school completers publicly available, during our data collection process we saw many inconsistencies in where and how states presented this data. Specifically, states should:

1. Report postsecondary enrollment data—for all students and disaggregated by each subgroup of students, as required by ESSA—on state report cards.

Under ESSA (see below), states are required to report postsecondary enrollment data for recent high school completers in the aggregate and disaggregated by each subgroup of students, for each high school, on state report cards. However, the statutory requirement gives states an “out”: it only expects states to report postsecondary enrollment data “where available.” In addition, the requirement is limited to enrollment in public postsecondary institutions in the state; states can report enrollment data for private colleges and out-of-state institutions “if data are available and to the extent practicable.”

Section 1111(h)(1)(C)(xiii) of the Elementary and Secondary Education Act, as amended by the Every Student Succeeds Act, reads (emphasis added):

(C) MINIMUM REQUIREMENTS.—Each State report card required under this subsection shall include the following information:

(xiii) Where available, for each high school in the State, and beginning with the report card prepared under this paragraph for 2017, the cohort rate (in the aggregate, and disaggregated for each subgroup of students defined in subsection (c)(2)), at which students who graduate from the high school enroll, for the first academic year that begins after the students’ graduation—

(I) in programs of public postsecondary education in the State; and

(II) if data are available and to the extent practicable, in programs of private postsecondary education in the State or programs of postsecondary education outside of the State.

Perhaps unsurprisingly, 18 states did not publish postsecondary enrollment data for the 2018–19 high school cohort on their state report cards.[18] (For seven states, California, Georgia, New Mexico, North Carolina, Ohio, Oregon, and Vermont, we looked at postsecondary enrollment data for the 2017–18 high school cohort. See “About Our Data” for more information on the year of data used in this analysis.) As Table 9 shows, five of these states reported college-going rates at the school and/or district level only (and are, therefore, capable of also reporting statewide rates), whereas the remaining 13 states did not include any college-going data in their report cards. For most of these states, adding the data to their report cards would be relatively easy; 11 of the 13 states published the data on another part of the agency’s website (and could simply add a link to that page from their report card or publish the data in two places) or on other state websites.[19] (Alabama’s postsecondary enrollment data was available on the Alabama Commission on Higher Education website. However, data for the 2018–19 high school cohort was no longer available. Thus, we used college-going data for the cohort published by the Public Affairs Research Council of Alabama, a nonprofit research center.) The remaining two states, New Mexico and New York, did not report postsecondary enrollment data for recent high school completers on any state government website.

Table 9. Data Reporting on Postsecondary Enrollment Rates for Recent High School Graduates on State Report Cards

States That Published Postsecondary Enrollment Data and Fully Disaggregated the Data on State Report Cards (25 states): Alaska, Arkansas, Connecticut, Delaware, Florida, Hawaii, Iowa, Louisiana, Massachusetts, Maryland, Maine, Michigan, Minnesota, Mississippi, Montana, North Carolina, North Dakota, New Hampshire, New Jersey, Pennsylvania, Rhode Island, South Dakota, Virginia, Vermont, Washington, DC

States That Published Postsecondary Enrollment Data on State Report Cards But Did Not Disaggregate the Data (8 states): Illinois, Kentucky, Missouri, Nebraska, Oregon, South Carolina, Tennessee*, Utah

States That Did Not Publish Postsecondary Enrollment Data on State Report Cards but Published the Data on School and/or District Report Cards (5 states): Arizona, Colorado, Georgia, Kansas, Ohio

States That Did Not Publish State-, District-, or School-Level Postsecondary Enrollment Data on Report Cards (13 states): Alabama, California, Idaho, Indiana, Nevada, New Mexico, New York, Oklahoma, Texas, Washington, Wisconsin, West Virginia, Wyoming

*Tennessee partially disaggregated postsecondary enrollment data by combining Black, Latinx, and Native American students into a supergroup.

That said, when it comes to disaggregated postsecondary enrollment rates, eight of the 33 states that published statewide college-going data for recent high school completers did not fully disaggregate the data by each group of students required by ESSA.

2. Report enrollment data for high school completers enrolling in postsecondary education within one year of high school graduation.

ESSA requires states to report postsecondary enrollment data for students who graduate from high school and enroll in postsecondary education for “the first academic year” that begins after high school graduation on their report cards.[20] (Section 1111(h)(1)(C)(xiii) of the Every Student Succeeds Act, 20 U.S.C. § 6301 (2015).) Yet some states choose to be more “generous” regarding the college enrollment period; these states report enrollment data for high school completers enrolling in college within 16 or 24 months. Understandably, college enrollment rates within a longer enrollment period are higher than immediate enrollment rates, but so much variability in reporting is inconsistent with ESSA and, most importantly, undermines the comparability of the data from state to state. For states that hope to capture college enrollment patterns over a longer period of time, an easy solution would be to post the immediate enrollment rate (for consistency and compliance with the law) and publish updated enrollment rates over later academic terms or years.

State Example: Maryland

Maryland’s state report card provided college enrollment data for recent high school graduates in the aggregate and disaggregated by each subgroup of students as required by ESSA. The state tracked college enrollment rates for students enrolling within 12, 16, and 24 months after high school graduation.

3. Break down postsecondary enrollment data by location (i.e., in-state versus out-of-state) and type of institution (i.e., four-year versus two-year and public versus private).

While many states comply with ESSA in reporting postsecondary enrollment rates on their state report cards, most are doing the bare minimum by publishing just an overall postsecondary enrollment rate in the aggregate and disaggregated by each required group of students. To provide more meaningful information to the public, state policymakers, and district and school leaders, we encourage states to go beyond ESSA’s requirements. For example, in this analysis, we found many states’ CCR indicators are undermeasuring students’ potential to enroll in higher education (Finding 4). If more states disaggregated enrollment data by location and type of institution, we could further compare differences in the degree of undermeasuring for different types of institutions.

State Example: Mississippi

Mississippi’s state report card provided a comprehensive report on high school graduates’ outcomes. For those enrolled in postsecondary education, the report showed the percentage of students enrolled in any institution nationwide, as well as the percentage of students enrolled in an in-state, out-of-state, two-year, four-year, public in-state, and private in-state institution, respectively. Moreover, the state tracked whether students were enrolled full time or part time and included information regarding the areas of study students were pursuing at in-state, public institutions.

Remediation and Other Postsecondary Outcomes Data

Compared to readiness and postsecondary enrollment, the availability of college remediation data for recent high school completers is poor, as ESSA does not require states to publish this data. Nearly half of states (23, including Washington, DC) did not report statewide postsecondary remediation data for recent high school completers on any state government website, let alone their ESSA-required report cards. For states that did publish remediation data, there are great inconsistencies in how the data was collected. For example, some states reported data for students attending all in-state institutions, while some states reported data for in-state, public institutions only. And often, college remediation data for high school completers is maintained and reported on the webpage of the state’s higher education system, as opposed to its K–12 agency. We recommend states:

1. Track and publish additional postsecondary outcomes data, such as remediation, persistence, and completion rates, on report cards.

While most states already gather college enrollment data for high school graduates, it is equally, if not more, important for states to track those students’ success throughout their higher education journey. To better understand how well high schools are preparing students for college, states should look at data points beyond enrollment. Key indicators to consider include the percentage of students completing credit-bearing college-level courses without remediation, persisting in college after the first year, and ultimately earning a credential or degree. Admittedly, some of these data points will not be available until a couple of years after students leave high school, and it can be challenging for states to track students’ performance in out-of-state and/or private institutions due to a lack of data-sharing agreements. However, states could start with gathering and publishing remediation data for students attending in-state, public institutions and expand their data collection from there. Including additional postsecondary outcomes alongside college enrollment data on report cards would paint a fuller picture of how well high schools are preparing students for college.

2. Report the percentage of students—overall and disaggregated by each required group of students—entering the workforce or the military immediately following high school graduation on report cards.

Because higher education is just one of the options students have beyond high school, it is important to present information about students who choose otherwise. For example, by examining disaggregated college-going rates, we found that, compared to their White peers, Black and Latinx high school completers are less likely to enroll in college (Finding 3). However, due to the lack of workforce data, we know little about students who choose not to pursue higher education, let alone potential racial disparities in career outcomes. Also, as most states with CCR indicators include career-oriented pathways in those indicators, data on workforce and military outcomes for recent high school graduates can help policymakers and the public better understand whether the career and military pathways are effective and what skills and credentials are most valued by employers.

Given the state of states’ data systems today, it will likely be challenging for a state education agency to develop a secure and sustainable data-sharing process with the state’s labor or workforce agency and the armed services without additional resources. Still, we encourage investments in data infrastructure to develop high-quality linkages among agencies. With a robust data infrastructure in place, educational leaders can use workforce and military outcomes data to make evidence-based decisions about postsecondary pathways leading to careers.

State Example: Mississippi

In addition to college enrollment data, Mississippi’s state report card presented the percentage of recent high school graduates enrolling in remedial courses in college, passing all coursework in their first year of college enrollment, remaining enrolled in college after their first year, and earning an associate degree within three years of graduating from high school. The state also included outcomes for students not enrolling in college. For example, it reported the percentage of students employed in Mississippi, as well as the percentage who were neither employed in Mississippi nor enrolled in any postsecondary institution after high school graduation.

Considerations for Federal Policymakers

At the federal level, policymakers can support states in improving the breadth and quality of the data they report related to college and career readiness, including by changing statutory requirements; improving oversight, guidance, and monitoring; and providing additional resources for data infrastructure.

1. When ESSA is reauthorized, Congress should require states to publish school-, district-, and state-level college and career readiness data for all students in the graduating cohort, disaggregated by each subgroup of students.

As discussed above, limitations in the availability and usability of CCR data on ESSA report cards can be partially attributed to a lack of a statutory requirement to report such data, as well as a lack of clarity in the statutory language related to reporting SQSS indicators. Because states are only required to report CCR data if they choose to include it as an SQSS indicator for accountability, some states include little to no information about students’ postsecondary readiness on their report cards. Others include some information, but in ways that are not especially usable because they report an index score or indicator rating instead of the underlying data.

Congress can foster more robust data reporting on postsecondary readiness by amending ESSA to explicitly require states to publish school-, district-, and state-level readiness data for all students in the graduating cohort and disaggregated by each subgroup of students, regardless of whether this data is also used for school accountability. Accurate postsecondary readiness data has value for policymakers beyond accountability systems. This data can be used to drive conversations about access to early postsecondary coursework, development of college and career pathways, school curriculum and improvement, student supports, and more. Given the lack of consensus and varied approaches states use to measure readiness currently, Congress need not mandate a uniform definition of readiness; however, the statute could give examples of data sources for states to consider, such as completing advanced coursework (AP, IB, and dual enrollment), attaining benchmark scores on assessments aligned with college readiness, or obtaining postsecondary or industry-recognized credentials.

Additionally, Congress could consider requiring states to report the percentage of students in the graduating cohort proficient in both English language arts and mathematics on the states’ assessments by the end of high school. Because ESSA requires states to develop academic content standards aligned with credit-bearing coursework at colleges and universities and administer statewide assessments aligned to those content standards (Section 1111(b)(1)(D)(i) of the Every Student Succeeds Act, 20 U.S.C. § 6301 (2015)), proficiency on state tests in high school could be a fairly reliable proxy for students’ postsecondary readiness. As states already report student proficiency on statewide assessments annually (i.e., the percentage of high school test takers scoring proficient in each subject in a given year), it would be relatively easy to also report this data for students in the graduating cohort (i.e., the percentage of students in the cohort scoring proficient on both subjects by the end of high school, regardless of the year they took the assessments).

Further, Congress should refine the reporting requirements for accountability indicators to improve the availability and usability of accountability data, including data on postsecondary readiness. First, it should clarify that states must report school performance on their accountability indicators (e.g., index score, performance level, or rating on the indicator), as well as the underlying, raw data that inform each indicator (e.g., the percentage of students in the graduating cohort who are college and career ready). Congress should also clarify, in cases where multiple measures are included in a single indicator, that states must publish data on the overall indicator as well as each individual component measure, disaggregated by student groups.

2. When ESSA is reauthorized, Congress should strengthen the existing requirement for reporting postsecondary enrollment data and add new requirements for reporting postsecondary outcomes.

Although ESSA requires states to publish postsecondary enrollment data on their report cards, it is less a requirement and more a suggestion, because the requirement is conditioned on the data being “available.” When ESSA is reauthorized, Congress should strengthen the existing requirement by striking “where available” and promoting additional disaggregation of enrollment data. 

As our analysis shows, it is clear that postsecondary enrollment data is readily available. All but two states already publish this data for recent high school completers on a state website, if not on their ESSA-required state report cards. Moreover, most states partner with the National Student Clearinghouse, a nonprofit education organization, to track student enrollment and success at colleges and universities nationwide, not just at in-state, public institutions. Therefore, federal policymakers should update the law to reflect current and best practices by requiring state and LEA report cards to include postsecondary enrollment data in all cases (not just “where available” or “to the extent practicable”) at both in-state, public institutions as well as private and out-of-state colleges and universities. 

In addition, Congress should amend ESSA to suggest states consider disaggregating postsecondary enrollment data in new ways. For example, Congress could amend ESSA to encourage states to report full-time versus part-time enrollment where available, as well as enrollment in two-year versus four-year institutions. Together, these enhancements would expand not only the availability of data, but also its utility in identifying trends and patterns in postsecondary enrollment between institutions and between groups of students.

Finally, college enrollment is only one postsecondary measure and may not reveal as much about students’ preparedness for college and careers as other outcomes, such as enrollment in credit-bearing, non-remedial college coursework or persistence in higher education. Given the limited availability of this data, it may not be feasible at the moment to require all states to report these kinds of data for recent high school completers on their report cards. However, Congress could consider encouraging states to include additional postsecondary outcomes data, where available, on report cards in the next reauthorization of ESSA. This could build demand for more, and better, linkages between K–12, higher education, and workforce data and expand availability of postsecondary outcomes data beyond current practice.

3. Policymakers should provide more funding to incentivize states to invest in and enhance their data infrastructure.

Statewide longitudinal data systems (SLDS) are critical data infrastructure, helping states gather and monitor student-level data over time, from early childhood and K–12 education to postsecondary education and the workforce. As the Data Quality Campaign explains, “using longitudinal data is like being able to watch a video of student progress over time rather than in a series of snapshots.” SLDS facilitate data sharing across agencies, enable states to effectively manage and use education data, and allow policymakers and the public to make data-driven decisions and identify trends in student outcomes. The SLDS Grant Program, run by the Department of Education’s Institute of Education Sciences, has awarded competitive, cooperative agreement grants to states to help increase their capacity to link data systems. Similarly, the Department of Labor’s Workforce Data Quality Initiative (WDQI) helps states strengthen their SLDS by integrating education and workforce data, critical infrastructure for understanding whether students are career ready as well as college ready.

The Biden administration requested $33.5 million for the SLDS program and $6 million for WDQI in its fiscal year 2023 budget. However, greater investments in states’ data infrastructure are needed, particularly to create and sustain linkages necessary in every state to track student outcomes beyond high school into higher education and the workforce. In future appropriations bills, Congress should consider investing at least $100 million in the SLDS program and at least $40 million in the WDQI fund. This will help states (1) address immediate data needs (such as incorporating pandemic-related data into their SLDS) and (2) fund large-scale data projects to modernize and link outdated data systems spanning preK–12 education, postsecondary education, and the workforce. For example, in addition to increasing funding through existing federal programs, Congress could fund integrated state data systems across early childhood, K–12, postsecondary, workforce, and health sectors, in a more sustainable, effective way, such as by establishing grants for a cross-agency data governance body to link legacy data systems as opposed to grants exclusively for state educational agencies.

4. The Department of Education should promote more comprehensive, digestible college and career readiness data, including by updating its report card guidance, monitoring compliance with ESSA’s reporting provisions, and touting the value of this data to education leaders.

In its report card guidance, published in September 2019, the Department of Education (the Department) provided a checklist showing which report card elements and student subgroup disaggregation are required at the school, district, and state levels. According to the checklist, states must report performance on the other academic indicator and SQSS indicator(s) in their accountability systems at the school, district, and state levels, for all students and subgroups of students, even though these indicators are typically only used for school accountability (there is no district accountability required in ESSA). This checklist is referred to extensively throughout the guidance and is the only place where the Department explicitly clarifies that accountability-related indicators must be aggregated above the school level. Yet such an important clarifying statement appears once in an appendix on page 56 of a 66-page document. Given our findings that some states fail to report state- and/or district-level data related to CCR indicators, we recommend the Department update its guidance to move this checklist to the beginning of the document and to add additional questions making clear that report cards must include school-, district-, and state-level data on SQSS indicators (for example, in question E-3). 

Readiness data is often most useful when the overall percentage of students in the cohort prepared for postsecondary is presented alongside information about each way students demonstrated readiness. Accordingly, question E-4 of the guidance encourages states to “report on the individual measures or components within the [SQSS] indicator as well as the indicator overall.” However, many states failed to report this level of detail in the past. As states resume their accountability systems following COVID-related disruptions and begin reporting their SQSS indicators again, we urge the Department to further emphasize this suggestion in its technical assistance and support to state agencies. For example, the Department could highlight states that go beyond the statutory requirements and produce more detailed college- and career-ready data (including postsecondary enrollment and/or remediation data) on their report cards on its blog, in webinars, or in another public forum.

Finally, the Department should increase its monitoring of state report cards required under Title I of ESSA. Given the number of states failing to report state-, district-, and school-level postsecondary enrollment and/or SQSS indicator data for all students and for each subgroup of students, the Department should use its oversight authority to ensure all ESSA reporting requirements, particularly those requiring disaggregated data, are met for the 2021–22 school year. If states do not report this data, leaders will not be able to address resource and preparation gaps and direct more resources to students who need additional support, particularly following COVID-19 educational disruptions.

Recommendations for College and Career Readiness Accountability Indicators

States should incorporate college and career readiness indicators into school accountability systems to incentivize all schools to offer and encourage student participation and success in college and career pathways and signal the effectiveness of those pathways.

An educated workforce is needed more than ever for the U.S. economy to recover from the pandemic. According to the National Center for Education Statistics, in 2021, the employment rate among young adults was 86% for those with a bachelor’s degree, compared to 68% for those who completed high school. In 2020, the median earnings of young adults with a bachelor’s degree were 63% higher than the earnings of those who completed high school.

However, despite increasing high school graduation rates, college enrollment rates have been stagnant for the past decade, and the immediate college enrollment rates for the Class of 2020 fell drastically compared to the previous class, especially for students from high-poverty schools. Meanwhile, college completion rates are persistently low, especially for historically underserved students. There is an obvious disconnect between high school education and high school graduates’ success in postsecondary education. Fortunately, states can use their accountability systems to signal how well schools are performing in key areas, including preparing students for postsecondary opportunities. Therefore, states should assess how effectively high schools are helping students develop knowledge, skills, and experiences they need to be successful after high school and hold schools accountable for ensuring students graduate prepared. Specifically, we recommend states:

1. Use college and career readiness indicators for school accountability that consider multiple options for students to demonstrate readiness, as opposed to relying on a single measure.

ESSA does not require states to measure or report college and career readiness for high school students, but it does encourage states to consider including student access to and completion of advanced coursework and/or postsecondary readiness as an SQSS indicator in their accountability systems for schools (Section 1111(c)(4)(B)(v)(II) of the Every Student Succeeds Act, 20 U.S.C. § 6301 (2015)). During the 2018–19 school year, 37 states (including Washington, DC) incorporated CCR indicators into their high school accountability systems. (For five of these states, California, Georgia, New Mexico, North Carolina, and Ohio, we looked at state report cards for the 2017–18 school year; see “About Our Data” for more information on the year of data used in this analysis.) The remaining 14 states should strongly consider doing the same, particularly as states look to restart their ESSA accountability systems following COVID-19 data disruptions. Incorporating CCR indicators into school accountability systems can incentivize all schools to offer and encourage student participation and success in college and career pathways and signal the effectiveness of those pathways.

In analyzing statewide readiness and postsecondary enrollment rates, we found that the degree of undermeasuring is more severe in states where our data relied exclusively on college entrance exams (Finding 5). This is especially true for Black and Latinx students (see Figure 13 in Finding 6). In other words, readiness rates are a better proxy for college enrollment in states that measure readiness using a basket of indicators, such as performance on AP and IB exams, earning dual credits, and completing a career pathway with an industry credential. For states that are not using readiness indicators but are considering using them in the future, it is important to include multiple options for students to demonstrate their readiness, recognizing the variety of postsecondary options available to them.

State Example: South Carolina

South Carolina’s readiness indicator had ten measures, including six college-oriented options and four career-oriented options. Students were deemed ready if they demonstrated readiness on any one of the ten measures. The six college-ready measures were: (1) earning a composite score of 20 or higher on the ACT; (2) earning a total score of 1020 or higher on the SAT; (3) earning a score of 3 or higher on an AP exam; (4) earning a score of 4 or higher on an IB exam; (5) earning a C or higher on Advanced Level Cambridge or selected Advanced Subsidiary Level exams; and (6) completing six hours of dual credit coursework with a C or better. The four career-ready measures were: (1) earning a Silver on a Worldwide Interactive Network National Career Readiness certificate or an ACT WorkKeys exam; (2) earning a score of 31 on the Armed Services Vocational Aptitude Battery (ASVAB) test; (3) completing an approved work-based learning experience; and (4) completing a CTE pathway with a state or nationally recognized industry credential from a selected list.

2. Include actual postsecondary outcomes in college and career readiness indicators.

Although proxy measures such as advanced course-taking and dual enrollment are good predictors of students’ postsecondary success, data on actual postsecondary outcomes, such as postsecondary enrollment rates and college remediation rates, provides school leaders, educators, and the public with actual evidence of how well high schools are preparing students for success.

In analyzing statewide data on proxy measures of readiness and postsecondary outcomes, our findings suggest that the proxy measures states use are not doing a great job of “previewing” students’ postsecondary outcomes. We found greater variation in statewide readiness rates than in the rates of recent high school completers enrolling in higher education and the rates of them avoiding college remediation (Finding 1). Further, we found readiness rates based on proxy measures are likely to underestimate the percentage of students enrolling in college and the percentage of those not needing remediation (see Findings 4 and 7), especially for Black and Latinx students (Finding 6).

Understandably, states may be reluctant to use data on postsecondary outcomes for high school accountability, as there are concerns around holding high schools accountable for students’ experience after high school. To ameliorate this concern, states could consider giving these measures relatively low weight at first.

Leaders may also be concerned about data quality, especially if they cannot account for students attending private or out-of-state colleges or entering the military or workforce after high school; therefore, states should invest in their data infrastructure to improve these data linkages. Also, states may not be able to collect data on certain postsecondary outcomes for recent high school completers, such as first-year persistence and retention rates, until at least two years after their high school graduation. As such, more states would likely need to use lagged data in their accountability systems if they incorporated postsecondary outcomes data (i.e., using data for 2020–21 high school completers in accountability determinations for the 2021–22 school year). However, given the importance of postsecondary outcomes for long-term student success and the current availability of this data, states should at least include measures such as rates of recent high school graduates entering higher education, the workforce, and the military immediately after graduation, while working toward the inclusion of college remediation and/or college persistence data.

State Example: Georgia

One of Georgia’s three college and career readiness indicators was based on the percentage of students entering the Technical College System of Georgia or the University System of Georgia without needing remediation; achieving a readiness score on the ACT or SAT; earning a score of 3 or higher or 4 or higher on two AP or IB exams, respectively; passing an end of pathway assessment that results in a national or state industry credential; or completing a work-based learning program.

3. Identify college and career preparatory experiences that are associated with the strongest postsecondary outcomes and reward schools with bonus points for accountability purposes to incentivize student completion in those pathways.

We recommend states refine their readiness indicators to reward schools that are helping students complete pathways and experiences that result in particularly strong postsecondary outcomes. For example, states could use a readiness index where schools receive bonus points for students reaching “advanced” levels of preparedness (e.g., distinguishing between students completing two dual enrollment courses versus earning an associate degree in high school). Additionally, schools could be rewarded in the accountability system for students who either achieve a higher standard on a given readiness measure (e.g., earning a 4 or 5 on an AP exam instead of a 3) or complete more measures than required (e.g., completing both college- and career-ready experiences). Due to data availability issues, we were not able to conduct deeper, more sophisticated analyses that could reveal which college and career readiness options are associated with better postsecondary outcomes.

However, states should review existing research on readiness measures, analyze their own postsecondary readiness and outcome data, identify readiness measures that lead to better outcomes, and incentivize schools to promote those measures through their high school accountability systems. For example, ample evidence shows that students who successfully complete early college or multiple dual credit courses are more likely to enroll in postsecondary education and attain a credential. We encourage states to use evidence-based measures, such as early college and dual credit courses, in addition to actual postsecondary outcomes in their readiness indicators. Meanwhile, states’ reviews of their own readiness data may reveal additional evidence-based measures to include and/or weigh more heavily if their analyses suggest certain measures are associated with better postsecondary results.

State Examples: Delaware, Ohio, New York, and Louisiana

Delaware’s CCR indicator had four college-ready and five career-ready options (including one military-ready option). Students were deemed ready if they completed at least one of the nine options. However, schools received bonus points for students completing one college-ready option and one career-ready option. Similarly, recall that Ohio’s CCR indicator, “prepared for success,” had three primary measures and three bonus measures. A school or district earned 1 point for every student who completed one of the primary measures (i.e., achieving remediation-free scores on the ACT or SAT; earning an honors diploma; or earning 12 points through an industry-recognized credential in a high-demand career field). It earned 0.3 extra points for every student who also completed one of the bonus measures (i.e., scoring a 3 or higher or 4 or higher on an AP or IB exam, respectively, or earning at least three college credits) in addition to completing one of the primary measures.
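As a rough illustration, an Ohio-style point calculation like the one described above can be sketched as follows. This is our simplification, not the state’s official business rules; the field names and data shape are hypothetical:

```python
def prepared_for_success_points(students):
    """Sketch of an Ohio-style "prepared for success" calculation.

    students: list of dicts with boolean flags:
      'primary' - completed a primary measure (e.g., remediation-free
                  ACT/SAT score, honors diploma, industry credential)
      'bonus'   - completed a bonus measure (e.g., qualifying AP/IB
                  exam score, at least three college credits)
    """
    total = 0.0
    for s in students:
        if s["primary"]:
            total += 1.0       # 1 point for completing a primary measure
            if s["bonus"]:
                total += 0.3   # 0.3 extra only alongside a primary measure
    return total

# Note that a student completing only a bonus measure earns the school
# no points under this rule:
points = prepared_for_success_points([
    {"primary": True, "bonus": True},    # 1.3 points
    {"primary": True, "bonus": False},   # 1.0 point
    {"primary": False, "bonus": True},   # 0 points
])
```

The asymmetry is the point of the design: bonus measures can only amplify, never substitute for, a primary readiness measure.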

Both Delaware and Ohio rewarded schools with bonus points for students who completed more measures than they needed to in order to be deemed ready. However, they did not fully differentiate between pathways that lead to the strongest postsecondary outcomes, nor did they give bonus points for higher achievement within a given pathway.

New York and Louisiana, on the other hand, have done a better job. New York’s CCR indicator, the “college, career, and civic readiness index,” gave different weights to students completing different postsecondary readiness activities (from 0 to 2 points), awarding extra credit for students who demonstrated higher levels of readiness. For instance, students who scored a 3 on an AP exam received a weight of 2, while students who took an AP course but did not pass any AP exams received a weight of 1.5.

One of Louisiana’s CCR indicators, the “strength of diploma index,” assigned students more points for achieving higher levels of postsecondary preparation and completing more measures than needed. Notably, students who earned an associate degree received 160 points, the highest number of points assigned to a single measure, signaling that the state valued this measure more than others. Students who scored a 3 on an AP exam received 150 points, whereas students who passed an AP course received 110 points. Similarly, students who earned an advanced statewide Jump Start credential received 150 points, whereas students who earned a basic statewide Jump Start credential received 110 points. Meanwhile, students received extra points for completing more than one measure. Students were assigned 160 points if they scored a 3 on an AP exam and earned an advanced statewide Jump Start credential, compared to 150 points if they completed either one of the two options.
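The tiered point structure described above can be sketched as a simple lookup. This sketch covers only the measures named in the example, with measure names of our own invention; Louisiana’s actual index includes many more options:

```python
def strength_of_diploma_points(measures):
    """Illustrative "strength of diploma"-style scoring.

    measures: set of strings naming what a student completed.
    Point values follow the examples above; names are hypothetical.
    """
    if "associate_degree" in measures:
        return 160  # highest-valued single measure
    if {"ap_exam_3", "jumpstart_advanced"} <= measures:
        return 160  # completing both options earns more than either alone
    if "ap_exam_3" in measures or "jumpstart_advanced" in measures:
        return 150  # advanced level of preparation
    if "ap_course_passed" in measures or "jumpstart_basic" in measures:
        return 110  # basic level of preparation
    return 0

strength_of_diploma_points({"ap_exam_3"})                        # 150
strength_of_diploma_points({"ap_exam_3", "jumpstart_advanced"})  # 160
```

Because the index awards each student the value of their strongest combination, schools have an incentive to push students toward higher-value measures rather than stopping at the minimum.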

4. Distinguish whether students are prepared for postsecondary education, the workforce, or the military, in addition to relying on an overall readiness rate for school accountability.

Students take different paths after high school, and different paths require different skill sets and preparation. For example, a student may be ready for English coursework at an open admissions community college, but need English remediation and support at a private, nonprofit four-year university. Likewise, a student in a cybersecurity or technology pathway may require different preparation than a student interested in pursuing a nursing or medical career. Thus, most states using CCR indicators look at a range of college-, career-, and/or military-ready options, so that students with different postsecondary goals and aspirations can all be captured in the readiness indicator. Typically, students are deemed ready in the readiness indicator if they achieve any of the readiness options. Among the 37 states that used CCR indicators for accountability purposes:

  • 36 states used college-ready measures, such as achieving college-ready benchmark scores on college entrance exams like the ACT and SAT and/or completing advanced coursework, including AP, IB, and dual enrollment.
  • 35 states used career-ready measures, such as completing a CTE pathway and earning an industry-recognized credential. Most of these states (34) considered career measures in addition to college-ready measures, while one state, Pennsylvania, used a career-ready measure on its own.
  • Only 13 states used military-ready measures, such as earning a passing score on ASVAB and being enlisted in the military. All of them used military-ready measures in combination with college and career measures.

However, by aiming to account for so many different postsecondary possibilities, most states’ indicators paint an overly simplified picture of college and career readiness by treating different pathways the same. Students are considered ready whether they earn benchmark scores on the ACT (a measure designed to capture college readiness) or whether they earn an industry-recognized credential (a measure designed to capture career readiness). In other words, most states’ indicators measure whether students are college or career ready, not whether they are college and career ready.

While states can continue to use accountability indicators that consider college-, career-, and military-ready options in determining students’ readiness, states should also consider using sub-indicators for college, career, and military readiness separately. They can also use their accountability systems to incentivize schools to support students acquiring both college and career skills so students are not limited in their postsecondary options.

State Example: North Dakota

North Dakota’s CCR indicator, “choice ready,” had three pathways (“postsecondary ready,” “workforce ready,” and “military ready”) plus “essential skills,” the skills critical to helping students succeed after high school. To be considered a “choice ready” graduate, a student needed to acquire “essential skills” and demonstrate readiness in at least two of the three pathways. The primary “postsecondary ready” option included meeting state-set benchmark scores on the ACT or SAT and earning a grade point average (GPA) of 2.8. Students could also demonstrate readiness in the “postsecondary ready” pathway by achieving two of the additional options, such as passing an AP or IB exam and earning a GPA of 3.0 in core courses required for admission to the North Dakota University System. The “workforce ready” options included, for example, completing three CTE courses, completing 40 hours of workplace learning experience, and earning an industry credential. The “military ready” options included passing the ASVAB test and either completing two credits of Junior Reserve Officer’s Training Corps (JROTC) or Civil Air Patrol or completing two of the options included under “postsecondary ready” or “workforce ready.” Although the “choice ready” framework enabled the state to distinguish and report on the three readiness pathways separately, the accountability calculation was still based on students’ overall readiness (i.e., the percentage of students attaining at least two of the readiness areas).
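North Dakota’s “choice ready” determination is, at its core, a simple rule: essential skills plus at least two of the three pathways. A minimal sketch of that logic in Python (the function and parameter names are illustrative, not the state’s actual data schema):

```python
# Hypothetical sketch of North Dakota's "choice ready" rule: a graduate
# must have the essential skills AND demonstrate readiness in at least
# two of the three pathways.

def is_choice_ready(essential_skills,
                    postsecondary_ready,
                    workforce_ready,
                    military_ready):
    """Return True if a graduate meets the 'choice ready' definition."""
    pathways_met = sum([postsecondary_ready, workforce_ready, military_ready])
    return essential_skills and pathways_met >= 2

# Essential skills plus two pathways -> choice ready
print(is_choice_ready(True, True, True, False))   # True
# Two pathways but no essential skills -> not choice ready
print(is_choice_ready(False, True, True, False))  # False
```

As the report notes, a state could report each pathway flag separately (sub-indicators) while still computing the accountability rate from the overall result.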

5. Expect students to complete college-ready measures, in addition to career-ready or military-ready measures, to be deemed ready.

Career pathways that require little postsecondary education or training and result in credentials of low value often lead to low-skill, low-wage jobs with few options for career advancement. A rigorous academic curriculum tied to students’ desired career themes is key to their success in the workforce. High schools should not settle for graduating students prepared for entry-level jobs that do not require any college credentials, and states should use their accountability systems to promote pathways to the middle class. We found that 34 states measured students’ career readiness in combination with college readiness, and 13 states measured students’ readiness for college, career, and the military. However, most states do not expect students to demonstrate readiness for college coursework if they opt for a career- or military-ready option. Instead, states should encourage students who participate in career or military pathways to also demonstrate readiness for college-level coursework, so that those students are prepared for the postsecondary credentials and training they need to succeed in the modern workforce. In the long run, states should work together to identify and promote high-value industry credentials that count toward postsecondary education and training and thus lead to career advancement.

State Examples: California and Tennessee

In California, students could demonstrate readiness through eight options: (1) completing a CTE pathway with a C- or better in the capstone course, (2) scoring level 3 or higher on the Smarter Balanced Assessments in both English language arts and math, (3) completing two semesters, three quarters, or three trimesters of college credit coursework with a C- or better, (4) earning a score of 3 or higher on two AP exams, (5) earning a score of 4 or higher on two IB exams, (6) completing the courses that meet the University of California or California State University a-g criteria required for admission to the state’s four-year universities with a C or better, (7) earning the State Seal of Biliteracy, and (8) completing two years of Leadership/Military Science. Options 1, 6, 7, and 8 required students to achieve additional measures to be deemed ready. For example, students who hoped to demonstrate readiness through a CTE pathway (option 1) needed, in addition to completing the pathway with a C- or better in the capstone course, to either score a level 3 or higher on English or math on the Smarter Balanced Assessments and a level 2 in the other subject or complete one semester (or two quarters/trimesters) of college coursework with a C- or better in academic or CTE subjects where college credits are awarded. Similarly, students who chose to complete at least two years of Leadership/Military Science (option 8) also needed to score a level 3 or higher on English or math and a level 2 in the other subject.
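California’s CTE option pairs pathway completion with an additional academic measure. A simplified sketch of that compound requirement, assuming illustrative field names of our own (not the state’s actual reporting fields):

```python
# Hypothetical sketch of California's option 1: completing a CTE pathway
# (C- or better in the capstone) counts only when combined with an
# additional academic measure.

def meets_cte_option(completed_cte_pathway,
                     sbac_ela_level,
                     sbac_math_level,
                     college_terms_completed):
    """Return True if a student meets readiness via the CTE pathway option."""
    # Level 3+ in one Smarter Balanced subject and level 2+ in the other
    sbac_combo = ((sbac_ela_level >= 3 and sbac_math_level >= 2)
                  or (sbac_math_level >= 3 and sbac_ela_level >= 2))
    # ...or at least one semester of college coursework with a C- or better
    additional_measure = sbac_combo or college_terms_completed >= 1
    return completed_cte_pathway and additional_measure

print(meets_cte_option(True, 3, 2, 0))   # True: pathway + SBAC combination
print(meets_cte_option(True, 2, 2, 0))   # False: pathway alone is not enough
```

The sketch makes the report’s point concrete: unlike in most states, the career option here cannot be satisfied without also clearing an academic bar.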

Tennessee’s CCR indicator, “ready graduate,” had four pathways: (1) earning a composite score of 21 or higher on the ACT or 1060 or higher on the SAT; (2) completing four early postsecondary opportunities (EPSOs); (3) earning an industry certification and completing two EPSOs; and (4) passing the ASVAB test and completing two EPSOs. The state recognized eight EPSO options; seven of them were considered academic (e.g., AP, IB, and dual enrollment) and the remaining one was an industry credential. To strengthen this indicator, the state should consider encouraging students who opt to demonstrate readiness through the career-ready and military-ready pathways to complete at least one of the seven college-ready EPSOs.
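Tennessee’s four “ready graduate” pathways amount to an any-of rule. A hedged sketch of that rule in Python (the function and parameter names are our own, not the state’s):

```python
# Hypothetical sketch of Tennessee's "ready graduate" rule: a student is
# ready if ANY of the four pathways is met.

def is_ready_graduate(act_composite=None, sat_total=None,
                      epso_count=0, has_industry_cert=False,
                      passed_asvab=False):
    """Return True if a student meets any of the four pathways."""
    # Pathway 1: ACT composite >= 21 or SAT total >= 1060
    test_ready = ((act_composite is not None and act_composite >= 21)
                  or (sat_total is not None and sat_total >= 1060))
    return (test_ready
            or epso_count >= 4                          # Pathway 2
            or (has_industry_cert and epso_count >= 2)  # Pathway 3
            or (passed_asvab and epso_count >= 2))      # Pathway 4

print(is_ready_graduate(act_composite=21))                    # True
print(is_ready_graduate(epso_count=2, has_industry_cert=True))  # True
print(is_ready_graduate(epso_count=3))                        # False
```

Because the pathways are joined by `or`, a student on the certification or ASVAB pathway never has to satisfy an academic measure, which is the gap the report suggests closing.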

Conclusion

State education leaders have worked hard to report and measure not only whether schools are helping their students earn a high school diploma, but also whether those graduates are ready for postsecondary opportunities. By publishing data related to postsecondary readiness and outcomes, states can provide meaningful information for education leaders and the public to evaluate the preparedness of high school students and identify inequities in access and success in various postsecondary pathways across student groups. By incorporating CCR indicators into their accountability systems for schools, states can signal the value and importance of various college and career pathways and incentivize high schools to improve students’ level of preparedness for success beyond high school.

Although the majority of states have already been doing the work, further improvement is needed, given the racial disparities we see in college and career preparation and outcomes and the undermeasuring of students’ postsecondary potential we observed. States should work on publishing more comprehensive, easy-to-digest data on postsecondary readiness and outcomes for all students, as well as for individual groups of students, at the school, district, and state levels. States can also work together and learn from one another—nationally or regionally—to develop and share best practices and common approaches to measuring college and career readiness. Meanwhile, as school accountability resumes after a two-year pause, states should seize the opportunity to redesign their CCR indicators, so that they are aligned with the skills, knowledge, and experience students need to be successful after high school.

Footnotes

  • 1
    In collecting CCR data, we preferred data for either students in the 12th grade or high school completers or graduates, as opposed to data for all high school students or multiple grade levels.
  • 2
    Alabama’s state report card provided CCR data for students in all grades combined; we used data for 12th graders from downloadable data files. Maryland did not report any state-level data related to its CCR indicators on its state report card; we found state-level data for one of its CCR measures from downloadable data files. Ohio reported the points earned on the CCR indicator on its state report card, which are not suitable for this analysis; we found the actual percentage of students who were prepared for success from downloadable data files.
  • 3
    Hawaii did not use a CCR indicator in its ESSA accountability system, so we used one of the CCR data points collected by the Hawaii Data eXchange Partnership that is the closest proxy to an overall rate of postsecondary readiness: the percentage of high school completers earning diplomas with academic, CTE, or STEM honors.
  • 4
    Similarly, Vermont’s “college/career outcomes” indicator measured whether high school graduates enrolled in college, were employed, or enlisted in the military within 16 months of graduation. In this analysis, we used another indicator Vermont developed for its accountability system (“performance on college/career assessments”) as the best CCR data point for the state.
  • 5
    Indiana had a state accountability system and a federal accountability system. The CCR indicator we used in this analysis is from the state system.
  • 6
    Our data sources for postsecondary enrollment rates may not be the only sources where such data is available. Additional states may report subsequent college enrollment data elsewhere and thus are not included.
  • 7
    Twenty-six of the 28 states reported the percentage of students taking remedial courses. The remaining two states, Arkansas and Maryland, reported the percentage of students assessed to need remediation, which may be different from the percentage of students actually taking remedial courses.
  • 8
    Arizona, which had the lowest CCR rate in our data set, did not include CCR indicators in its accountability system and did not report any readiness data for high school at the state level. Therefore, we used the percentage of students in the graduating cohort meeting all four college-ready benchmarks on the ACT, as reported on Arizona’s ACT state report. See “About Our Data” for more information.
  • 9
    Wyoming, which had the lowest postsecondary enrollment rate in our data set, only reported college enrollment for recent graduates attending in-state community colleges, while most states reported enrollment data for students attending colleges and universities nationwide. Had Wyoming reported postsecondary enrollment at additional institutions, the 40-percentage-point range across states in our data set would have been smaller. See “About Our Data” for more information.
  • 10
    Wisconsin and Nebraska had the lowest readiness rates for Black and Latinx students, respectively, in our data set. Neither state included college and career readiness indicators in its accountability system or reported any readiness data for high school at the state level. Therefore, we used the percentage of Black, Latinx, and White students in the graduating cohort meeting three or four college-ready benchmarks on the ACT, as reported on Wisconsin’s and Nebraska’s ACT state reports. Note that ACT does not report the percentage of subgroups of students meeting all four benchmarks. See “About Our Data” for more information.
  • 11
    We were not able to find disaggregated postsecondary enrollment data for the remaining 13 states. See “About Our Data” for more information.
  • 12
    Postsecondary enrollment data for these seven states excluded students attending out-of-state colleges and, in some cases, certain in-state institutions of higher education. See “About Our Data” for more information.
  • 13
    In 12 of the 16 states, we could not find the rate of students deemed college and career ready on any state website. Instead, we used data from the ACT or College Board in the absence of state-reported data. In Arkansas, Washington, DC, Michigan, and Mississippi, the state administered the ACT or SAT to all students and reported student results as one of its readiness measures. See “About Our Data” for more information.
  • 14
    In the remaining 20 states, one or both data points were not available for Black, Latinx, and White students. See “About Our Data” for more information.
  • 15
    All 28 states reported remediation data for recent high school completers attending either all in-state institutions or a subset of in-state institutions. See “About Our Data” for more information.
  • 16
    In five states (California, Georgia, New Mexico, North Carolina, and Ohio) out of the 37 that used CCR indicators, we looked at their state report cards for the 2017–18 school year. See “About Our Data” for more information on the year of data used in this analysis.
  • 17
    In Virginia, students entering 9th grade in the 2018–19 school year and beyond are required to complete at least one of the additional readiness options to receive advanced diplomas: (1) completing an AP, IB, honors, or dual enrollment course; (2) completing a work-based learning experience; and (3) earning a state-approved CTE credential.
  • 18
    For seven states (California, Georgia, New Mexico, North Carolina, Ohio, Oregon, and Vermont), we looked at postsecondary enrollment data for the 2017–18 high school cohort. See “About Our Data” for more information on the year of data used in this analysis.
  • 19
    Alabama’s postsecondary enrollment data was available on the Alabama Commission on Higher Education website. However, data for the 2018–19 high school cohort was no longer available. Thus, we used college-going data for the cohort published by the Public Affairs Research Council of Alabama, a nonprofit research center.
  • 20
    Section 1111(h)(1)(C)(xiii) of the Every Student Succeeds Act, 20 U.S.C. § 6301 (2015).
  • 21
    Section 1111(b)(1)(D)(i) of the Every Student Succeeds Act, 20 U.S.C. § 6301 (2015).
  • 22
    Section 1111(c)(4)(B)(v)(II) of the Every Student Succeeds Act, 20 U.S.C. § 6301 (2015).
  • 23
    In five states (California, Georgia, New Mexico, North Carolina, and Ohio) out of the 37 that used CCR indicators, we looked at the state report cards for the 2017–18 school year. See “About Our Data” for more information on the year of data used in this analysis.

Acknowledgement

This report would not have been possible without the collaboration and contributions of many people, especially Anne Hyslop, who edited and provided valuable feedback on this report. We would like to thank Lynne Graziano, Brennan Parton, and Ryan Reyna for reviewing the report. We also thank the Joyce Foundation for its generous support of this work.



Ziyu Zhou

Research and Data Specialist
