Information Interventions and Postsecondary Enrollment: Evidence from Appalachian Ohio

This paper examines a series of high school-level interventions designed to encourage college attendance in a historically underperforming region, Appalachian Ohio. High schools received competitive grants to combat information frictions regarding postsecondary enrollment through campus visits, college fairs, financial aid seminars, and similar activities. I estimate the effect of these competitive grants on postsecondary enrollment. Only Appalachian high schools were eligible for the program, and I exploit this policy-induced variation in treatment allocation, using a difference-in-differences framework to compare college attendance rates for high schools that received funding with those of similar, non-Appalachian high schools that were ineligible for the program. Leveraging multiple datasets and treatment specifications, I document two findings: i) while college attendance generally rose during treatment, no evidence indicates that the grants increased attendance relative to similar yet untreated schools, and ii) there is no evidence that attendance patterns shifted toward higher-quality institutions.


INTRODUCTION
Recent economic research suggests that the opportunity for intergenerational economic mobility differs spatially (Chetty et al., 2014). The Appalachian region of Ohio ranks among the worst on this measure. 1 Parental incomes are better predictors of child income and college attendance in this region than in most of the country, promoting generational gaps in wealth and opportunity between local high- and low-income families. Furthermore, one is on average 20 percent less likely to be a top income earner if raised in Appalachian Ohio than if raised elsewhere in the U.S. 2

What factors contribute to the relative lack of economic mobility in this region? Lower postsecondary educational attainment could be one cause. Chetty et al. (2014) find no association between parental income and proximity to higher education institutions, but inhibitors to college enrollment other than distance could contribute to the income and college attendance disparity in the region. While education rates in Appalachia have been persistently lower than national averages (Isserman, 1997), increases in human capital during the 1980s and 1990s stemmed the widening wage gap between Appalachia and the rest of the U.S. (Bollinger et al., 2011).

This paper examines school-level interventions that occurred within Appalachian Ohio during the 1990s and 2000s. These interventions were intended to increase college enrollment throughout the region and combat a documented information deficiency regarding higher education. Concerned with Appalachian students' lack of college attendance, the Ohio Board of Regents commissioned a study in 1991 to discover potential inhibitors to postsecondary achievement. The report cited three key areas, all of which have been explored in the economics literature, in which information frictions significantly prevented students from attending college: cost and financial aid (Horn et al., 2003; Kelchen and Goldrick-Rab, 2015; Evans and Boatman, 2019), the application process (Avery et al., 2014), and a student's awareness of his or her own ability (Hoxby and Avery, 2012).

* I thank Shawn Kantor, Carl Kitchens, Luke Rodgers, and numerous conference participants for valuable comments. In addition, I thank Jake Bapst, Brenda Haas, and Laura Risler for providing institutional background and data on the OACHE program. All shortcomings and conclusions of the paper are my own. Wallace is an Assistant Professor of Economics at Georgia College & State University, Milledgeville, GA 31061. Corresponding Author: C. T. Wallace, E-mail: cullen.wallace@gcsu.edu
1 For purposes of this study, Appalachia consists of counties as defined by the Appalachian Regional Development Act of 1965.
© Southern Regional Science Association 2021. ISSN 1553-0892, 0048-49X (online). www.srsa.org/rrs
In response to the Board of Regents' report, the Ohio General Assembly created and funded the Ohio Appalachian Center for Higher Education (OACHE) in 1993 to promote higher educational attainment in the region. 3 The OACHE sought to achieve these goals through two primary means: i) funding information centers at regional colleges and universities and ii) providing competitive grants to individual high schools to promote college enrollment through campus visits, financial aid seminars, career planning, and the like. While the OACHE did not lower the monetary cost of attending college, it addressed gaps in information that persisted among high school students. I estimate how OACHE-funded high school information interventions affected college attendance trends in the region by exploiting policy- and timing-induced variation in high school grant eligibility. The OACHE grants were competitively awarded exclusively to Appalachian public high schools within Ohio. Schools in rural non-Appalachian counties, while similar on many dimensions, were ineligible for the grants, creating a reasonable control group.
To estimate the relationship between the OACHE grants and college attendance decisions, I incorporate two complementary datasets from the Ohio Department of Higher Education. One reports annual counts of students from each Ohio county enrolled in each public higher education institution in Ohio from 1990 to 2012. This dataset is useful because of its long timespan and institution-level detail. The second dataset records the number of seniors in each school district who immediately enroll in any Ohio public institution in the summer or fall semester following high school graduation. 4 Because I can directly measure the share of students enrolling in college from each school district, this dataset provides a cleaner metric to gauge the success of the OACHE grants. Unfortunately, these data are only available for 2001-2012. I employ both datasets separately to measure the impact of the grants in a difference-in-differences specification. Effectively, I estimate the change in college attendance patterns for treated high schools relative to untreated, yet similar, control high schools.
While college enrollment increased within the region, I find no evidence that the program differentially increased college enrollment compared to similar yet untreated schools. This finding is robust to multiple treatment specifications, and the result holds when using county-and district-level attendance data. Additionally, estimates suggest no compositional change in enrollment patterns to higher quality schools. First, there is no shift in enrollment between two-and four-year institutions after treatment. Second, even among four-year institutions, there is no change in enrollment between main campuses and branch campuses after treatment.
This research contributes to the information literature in part because of the unique setting: a historically underperforming region of the country in terms of earnings and postsecondary educational attainment. Most studies target students based on one of two criteria: i) their income or propensity to attend college, regardless of geographic location, or ii) the researchers' ability to conduct research in the region where the student lives. This paper is unique in that policymakers specifically selected this region for treatment based on prior college enrollment trends.
Information interventions may be more or less effective in such a setting-Appalachia specifically but rural areas generally. On one hand, the average student may have a greater need for information (Bell et al., 2009;Tierney and Venegas, 2009;Kelchen and Goldrick-Rab, 2015;Bowman et al., 2018), so a campaign may have greater success; on the other hand, additional factors may affect college enrollment, necessitating further action than solely information provision (Domina, 2009;Wells et al., 2019;Holland, 2020).
Furthermore, this paper comments on the linking of education and place regarding one's economic mobility. One's hometown closely correlates with future earnings (Chetty et al., 2014), and underinvestment in human capital can underlie wage disparity and regional poverty traps (Bollinger et al., 2011; Durlauf, 2012). A substantial factor in the link between place and economic mobility could be the perception, accessibility, necessity, and takeup of higher education. Identifying place-specific barriers to postsecondary education can considerably advance our understanding of impediments to intergenerational mobility.
A final contribution centers on the nature of the OACHE grants. These grants allowed school representatives to tailor the funding to their students' needs, rather than imposing a broad, uniform approach across all high schools. Most interventions impose a standard curriculum or regimen that may or may not be effective across schools. This freedom in design provides an opportunity to observe outcomes when local administrators and guidance counselors, who are best positioned to understand the specific problems at a given high school, have the ability to direct resources.

BACKGROUND INFORMATION
2.1. Appalachian Access and Success Report

Figure 1 depicts the percent of adults with a four-year degree as a fraction of the national average (20.3 percent) in 1990. 5 All but one Ohio Appalachian county fell below the U.S. average, and the rate was below half of the U.S. average for 21 counties. In 1991, the Ohio General Assembly formed a consortium of ten two- and four-year Appalachian institutions in response to lagging college enrollment rates in the Appalachian region of the state. The consortium commissioned a study, Appalachian Access and Success, to thoroughly document the state of higher education in the region and present potential means to address such issues (Crowther et al., 1992). A brief summary of the report follows.

The Education Gap
The report outlines the stark gap in educational attainment for Appalachian Ohio. For each of the 88 counties in the state, the Ohio Department of Education estimated the fraction of high school seniors who continued into some form of postsecondary education during the school years 1985-86 to 1990-91. The average participation rate for Ohio over the six years was 53.8 percent; for the 29 Appalachian counties, the estimated college attendance rate was 43.4 percent. 6 The national average college attendance rate in 1991 was 62.4 percent. Only four Appalachian counties placed in the top half of the Ohio county ranking, and nine of the bottom ten counties were Appalachian. The report also considered economic factors, noting a gap between the region's level of human capital and the skills necessary to compete in the contemporary labor force. Much of the regional economy depended on coal and manufacturing, two waning industries in the early 1990s. The report succinctly describes the link between education and the changing economy: "Continuing low levels of participation in higher education will only add to the size of the unskilled and semiskilled labor pool for which demand is declining."

Information Frictions
The report concludes that a lack of information along many dimensions represents the largest factor affecting college attendance. Students and families cited a lack of knowledge regarding available financial aid. Furthermore, students overestimated the cost of attending college, creating a perceived hurdle to postsecondary education rather than a true obstacle. Nearly half of surveyed seniors estimated one year's tuition at a four-year institution to be greater than $10,000, yet the total cost for colleges within the region was listed between $6,000 and $7,000. The same overestimation occurred for two-year colleges.
A second broad area of misinformation pertains to the educational process. Students wished for more information linking their desired careers to education programs and technical training. Furthermore, while nearly all students cited the support of their parents to attend college, most parents held no degree themselves. This lack of institutional knowledge may present challenges in understanding the college application process and applying for financial aid.
Another facet of the information problem centers on students' ability and readiness for college. The report noted that teachers and school administrators largely encouraged those students who appeared prepared for college but did not promote college among less capable seniors, who still may have been good candidates for a two-year degree (if not a four-year one). This dichotomy is problematic if teachers hold antiquated notions of a college-ready student; many options existed in the early 1990s for a wide variety of students. In addition, seniors underestimated their ability to perform well in college. Fewer than 30 percent considered themselves to have above-average intelligence, compared to the national average of 58 percent.
In addition, the report details an information gap between local higher education institutions and area high schools. High school personnel reported inadequate information on tuition costs and available financial aid from local institutions. Students, parents, and high school staff did not deem the recruitment efforts of local colleges effective.

The Ohio Appalachian Center for Higher Education
As a consequence of Appalachian Access and Success, the Ohio General Assembly created the Ohio Appalachian Center for Higher Education (OACHE) in 1993 (Ohio 120th General Assembly, 1994). Notable features of the program included competitive grants of approximately $5,000-$10,000 that high schools could use for college promotion. These grants were awarded every two years, and the first round was given in the 1993-94 school year (state fiscal year 1994). High schools submitted applications detailing how they would spend the money to encourage college enrollment, such as organizing campus visits, establishing career and mentoring programs, and assisting students with identifying and applying to colleges (Schwartz, 2004). The program's mission was to "raise the college-going rate of Appalachia Ohio" (Ohio Appalachian Center for Higher Education, 2009).

A "model program" was developed by one of the first high school awardees. The high school's college attendance rate increased from 28 to 72 percent in the three years after implementation of the program (Schwartz, 2004). The program was replicated with similar success at other schools. It consisted of a complete pathway from ninth grade to senior year. After four years, students would have been exposed to a battery of postsecondary preparation, including any or all of the following: field trips to multiple businesses, career surveys, career and college fairs, multiple college campus visits, a research project pertaining to their postsecondary options, ACT/SAT prep classes, meetings with parents and school personnel, scholarship information, and a mentor program with a business in the community (Ohio Appalachian Center for Higher Education, 2019). High schools were free to adopt the model program completely in their application or submit their own plans.
Figure 2 shows college attendance rates for Tuscarawas County (home of the initial successful high school) compared to those of non-Appalachian rural Ohio counties. After successive rounds of treatment, enrollment from Tuscarawas County relatively increased when compared to similar counties not receiving treatment.
Importantly, only high schools in the Appalachian region could request funding. This creates policy-induced variation that I use to measure the program's impact. I compare college attendance rates for Appalachian counties to those of rural non-Appalachian Ohio. These regions are outlined in Figure 3.
As a result of the apparent success of the OACHE, several states within the Appalachian area formed the Appalachian Higher Education Network (AHEN). The OACHE was named a top public-service initiative by the Public Employees Roundtable in May 2001, and two years later, Harvard University's John F. Kennedy School of Government awarded the center its "Innovations in American Government" award.
The program lasted for several years but lost funding in 2009 due to the Great Recession. Smaller, regional initiatives have persisted at some colleges, but the coordinated effort to raise Appalachian enrollment numbers has ceased.
In the next section, I discuss data sources used to measure the impact of the center.
Note: Metropolitan status is as delineated in 1990.
DATA

The Student Inventory Data report the number of enrolled students (undergraduates, graduates, dual-enrollers, professionals) for each public college and university by the student's home county at the time of application. 7 Figure 4 shows the 60 public universities in the sample. Figure 5 illustrates Appalachian college attendance at public institutions over time.
The Student Inventory dataset is the primary source of college attendance data for this paper because of its long timespan and reporting at the institution level, allowing me to measure changes in enrollment for university main campuses, university branch campuses, community colleges, and technical colleges. 8 The High School to College Transition Reports detail the number of high school graduates who enroll in an Ohio public college in the summer or fall of their graduation year. This dataset captures a more direct measure of college enrollment, useful for identifying the impact of an OACHE grant on college attendance. Unfortunately, these data are reported only from 2001 onward, so I use them to supplement the county-level data.

Note (Figure 5): Shown is the total number of Appalachian students attending Ohio two- and four-year public institutions.
Transition Reports are released by high school and by school district. While measurement at the high school level is optimal, data are missing for 2006 and 2007. Additionally, school openings, closings, and mergers present challenges for estimating the treatment effect. For these reasons, I use school district data. Over 75 percent of districts in my sample contain only one high school, so this level of aggregation remains a relatively precise measure of transitions from high school to college.
Note that I am only able to observe attendance at public institutions. Enrollment data for private institutions are not available by a student's home county or school district for the time period in question, so this analysis cannot capture the extent to which the OACHE program encouraged enrollment at private institutions. Although enrollment at private colleges is beyond the scope of this paper, it is reassuring that nearly 4 in 5 Ohio students enrolled in higher education attended public institutions in 2000, the first year for which such aggregate data are available. Roughly 22 percent of Ohio students at that time were enrolled in private nonprofit institutions, and only 0.6 percent of total students were enrolled in a for-profit private college. Thus, observing enrollment changes at public institutions captures the dominant form of college attendance at the time of the program. 9

OACHE Grants
I collected grant information from multiple sources. For state fiscal years 1994-2001, data come from a study conducted by the OACHE (Inman, 2000). Archived versions of the OACHE's website detail treatment for years 2002-2005. Information regarding grants in years 2008 and 2009 comes from the 2007-08 OACHE Annual Report (Ohio Appalachian Center for Higher Education, 2008). 10 Figure 6 depicts the number of treated high schools in a given year and shows the average percent of treated seniors in a treated county by year. 11
8 I fix the categories of institutions at 1992 definitions. However, in 1994 and 1995, five technical schools (out of thirteen) converted to community colleges. I therefore present results for university main campuses, university branch campuses, and two-year institutions. I label the Ohio State University Agricultural Technical Institute as a branch campus.
9 It is also likely that state institutions would be the most heavily targeted by the OACHE for i) the perceived lower cost of tuition; ii) the variety of programs; and iii) because the Board of Regents (overseeing public institutions) instituted the program.
10 No data are available on funding for years 2006 or 2007.
11 The OACHE primarily awarded grants to high schools, but recipients also included other entities: county consortia, elementary schools, multiple-county vocational high schools, and research projects. I include only awards to traditional high schools.

Other Variables
Additional control variables include county-level population and per capita personal income from the Bureau of Economic Analysis (BEA), plus the labor force and unemployment rate from the Bureau of Labor Statistics (BLS).
For the county-level analysis, I include county-level educational variables from the National Center for Education Statistics (NCES) Elementary and Secondary Information System (ElSi): the number of high school students, public high schools, share of black students (all grades), and share of white students (all grades).
When using district-level data, I control for the number of seniors, FTE teachers (districtlevel), schools, share of black high school students, and share of white high school students. All of these variables are from ElSi.

Comparison Group
What counties constitute the best comparison group for the treated counties? Given a standard difference-in-differences framework, the control group is the best approximation for how enrollment trends would have evolved in the treated counties had they not received treatment.
Non-Appalachian rural counties comprise an intuitive control group: similar to the treated counties on observables but ineligible for the funding due to the scope of the OACHE's mission. Table 1 shows descriptive statistics for this control group as well as the treatment group. While some differences in means between the two groups are statistically significant, untreated rural counties match the treatment group most closely on nearly all variables. Furthermore, the means of some variables (e.g., share of white students or labor force per capita), though statistically different, do not differ in meaningful ways. 12

With a difference-in-differences specification, selection of high schools into the program can prove concerning. If the OACHE awarded grants to high schools with the intent of securing the largest increase in enrollment or, conversely, on the basis of greatest need, the estimates would be biased upward or downward, respectively. While the grants were awarded competitively, many schools applied for the program each year, and most Appalachian counties were eventually awarded a grant. Applications did not have an official scoring process, and one cannot rule out differences in subjective judgment stochastically affecting the selection process. In the next section, I demonstrate that, conditional on control variables, treated and control counties have parallel trends in enrollment prior to treatment, allaying concerns that better- or worse-performing schools were favored.
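Several of the descriptive statistics in Table 1 are weighted group means (for example, per capita income weighted by county population). As a minimal sketch of that computation, with hypothetical county values:

```python
def weighted_mean(values, weights):
    """Weighted group mean, e.g., per capita income weighted by
    county population when comparing treated and control groups."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical per capita incomes and populations for three counties:
incomes = [21_000, 24_000, 30_000]
pops = [10_000, 20_000, 70_000]
print(weighted_mean(incomes, pops))  # pulled toward the largest county
```

Weighting by population keeps small counties from dominating a group mean that is computed over counties rather than people.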

Treatment Effect on Net College Attendance
To estimate how the OACHE information intervention impacted overall college attendance, I employ a difference-in-differences strategy, shown in Equation 1:

Enrollment_ct = δ GrantYear_ct + X_ct β + γ_c + τ_t + ε_ct   (1)

Enrollment_ct is the logged number of students from county c in year t who attended a public institution in Ohio. GrantYear_ct takes the value of one for years in which a high school in county c has an active OACHE grant, and zero otherwise. X_ct contains county-level control variables: demographic, economic, and educational indicators. I include county (γ_c) and year (τ_t) fixed effects. The time period for estimation is school year 1989-90 (denoted 1990) to 2011-12 (2012), three years after the program ended in 2009. Standard errors are clustered at the county level.
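The intuition behind δ can be seen in a minimal two-group, two-period calculation on hypothetical enrollment counts; the actual specification adds controls, county and year fixed effects, and county-clustered standard errors, which this sketch omits:

```python
from math import log
from statistics import mean

def did_estimate(rows):
    """rows: list of (treated, post, enrollment) tuples.
    Returns the 2x2 difference-in-differences estimate on log enrollment."""
    cells = {}
    for treated, post, enrollment in rows:
        cells.setdefault((treated, post), []).append(log(enrollment))
    def growth(group):
        return mean(cells[(group, 1)]) - mean(cells[(group, 0)])
    return growth(1) - growth(0)

# Hypothetical counts: both groups grow 10 percent, so the estimate is
# near zero, mirroring the paper's finding of no differential increase.
data = [
    (1, 0, 400), (1, 1, 440),   # treated county: pre, post
    (0, 0, 500), (0, 1, 550),   # control county: pre, post
]
print(round(did_estimate(data), 6))
```

Because enrollment enters in logs, the estimate reads approximately as the percentage-point difference in enrollment growth between treated and control counties.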
The estimate of δ is unbiased provided that attendance trends for untreated counties serve as an adequate comparison group, conditional on X, for attendance trends in treated counties absent the OACHE intervention.
Non-Appalachian rural Ohio counties comprise the primary control group; these counties were ineligible to receive an OACHE grant. This group best matches the treatment group on pre-treatment characteristics and satisfies the parallel trends assumption, shown with raw enrollment numbers in Figure A1 and with regression output in Figure 7. Effectively, similar counties in the southeastern part of the state were eligible for funding, and counties in the northwestern section were not. Additionally, ever-treated counties that are not currently treated also enter the control group. Conditional on control variables, this policy-induced variation over space and time identifies the effect of the OACHE grants.

Note (Table 1): Population, the number enrolled in higher education, the number of high school students, and the number of high schools are group averages. Shares of black and white students are group averages weighted by the number of students in all grades. Real per capita income and labor force per capita are group averages weighted by the county's population. Unemployment rate is the group average weighted by the county's labor force. Shares of enrollment are group means weighted by the total enrollment of the county. All means are averaged across 1990-1992. Standard errors in parentheses. *p<0.05, **p<0.01, ***p<0.001

Table 2 displays treatment effects from estimation of Equation 1. Figure 7 represents the same equation except that time-to-treatment binary variables replace GrantYear. The relative time period is the year before a county's first OACHE grant. In Table 2, the estimated treatment effects for years of OACHE grant treatment are negative yet statistically indistinguishable from zero. Results fail to show an increase in college enrollment when OACHE grants were awarded, the same intuition gained from studying Figure 7. These estimates suggest that the OACHE grants did not increase attendance on the extensive margin.
This result holds when using alternative control groups, changes in levels, and district-level data. 13 The next section explores whether the program altered enrollment patterns on the intensive margin, between different categories of institutions.

Note (Figure 7): Error bars represent the 95 percent confidence interval. For the same specification without a logged dependent variable, see Figure A4.
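The time-to-treatment indicators behind the event-study version of Equation 1 can be sketched as follows; the five-year window and the 1996 first-grant year are illustrative assumptions:

```python
def relative_time(first_grant_year, year, window=(-5, 5)):
    """Event time relative to a county's first OACHE grant year.
    The omitted category in the regression is event time -1, the year
    before the first grant. Returns None for never-treated counties."""
    if first_grant_year is None:
        return None
    lo, hi = window
    return max(lo, min(hi, year - first_grant_year))

# Hypothetical county first treated in fiscal year 1996:
print([relative_time(1996, y) for y in (1990, 1995, 1996, 2003)])
```

Years far from treatment are binned at the window endpoints so that each indicator retains enough observations to be estimated.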

Treatment Effects by Institution Type
Beyond just an increase or decrease in total postsecondary enrollment, one may want to understand whether the program affected the composition of college attendance over time or whether students substituted between four-year and two-year institutions. I alter the dependent variable to estimate how these attendance patterns changed. Equation 2 estimates enrollment changes for three mutually exclusive categories of institutions: public university main campuses, public university branch campuses, and public two-year institutions (see Footnote 8).
EnrollmentType_ct = δ GrantYear_ct + X_ct β + γ_c + τ_t + ε_ct   (2)

Table 3 shows results for the three dependent variables. Figure 8 depicts coefficients from similar regressions with GrantYear replaced by treatment-relative year indicator variables. Estimates suggest that no substitution occurred among the three types of college attendance. No evidence shows that OACHE grants affected students' decisions regarding college type. 14 Estimates imply the program did not succeed in altering college attendance patterns on either the extensive or intensive margin.
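Substitution across institution types can be inspected by comparing each category's share of total enrollment before and after treatment; the counts below are hypothetical:

```python
def enrollment_shares(counts):
    """counts: dict mapping institution category -> enrolled students.
    Returns each category's share of total enrollment."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# Hypothetical pre/post counts for one county; shares are stable,
# the pattern consistent with no substitution across types.
pre = enrollment_shares({"main": 300, "branch": 100, "two_year": 200})
post = enrollment_shares({"main": 330, "branch": 110, "two_year": 220})
print(pre == post)
```

Stable shares alongside a stable total are jointly consistent with no effect on either the intensive or the extensive margin.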
On average, students did not attend college at a greater rate following receipt of the grants, and attendance trends by institution type remained unaffected. In the next two sections, I perform various robustness checks. First, I alter the treatment definition. Following that, I use school district-level college attendance data (available beginning in 2001) from a different dataset to verify results.

Alternative Treatment Definitions
While the initial specification captures the change in county-level enrollment after a high school in that county is treated, this treatment definition has shortcomings. It makes no distinction between counties with one high school and those with five. In particular, one would expect treatment intensity, the number of seniors exposed to an OACHE grant, to affect county-level college attendance.
I explore two additional treatment definitions: the ratio of seniors enrolled in a treated high school to the overall number of seniors in a county and a count of treated high schools in a county. Additionally, I test whether attendance decreased at the end of the program, after a county stopped receiving grants. I estimate Equation 3, where FractionTreatedSeniors_ct equals the fraction of treated seniors among total seniors in county c and year t, and zero otherwise. During the program, as many as 100 percent of a county's seniors received treatment; the weighted average hovered between 20 and 25 percent (see Figure 6(b)). Enrollment_ct is logged postsecondary attendance for county c in year t.

Enrollment_ct = θ FractionTreatedSeniors_ct + X_ct β + γ_c + τ_t + ε_ct   (3)

Note (Table 3): The dependent variable in the first three columns is the logged number of students from each Ohio county attending Ohio public postsecondary institutions by type. The variable is not logged in Column 4, and the regression is weighted by the county's number of high school students. Data are from ODHE. The control group consists of the 28 non-Appalachian rural Ohio counties, in addition to currently untreated ever-treated counties (24 counties are treated: (28+24 counties) × 23 years = 1,196 observations). Regressions include county and year fixed effects and a control for logged county population (BEA). Economic controls consist of logged real per capita personal income (BEA), logged labor force (BLS), and the logged unemployment rate (BLS). Education controls are at the county level and consist of the logged number of high school students (Columns 1-3), the number of public high schools, the share of black students (all grades), and the share of white students (all grades), all of which are from ElSi. Population and labor force variables are not logged in Column 4. Data span 1990-2012. Some early educational variables are missing for certain counties.
Standard errors are clustered at the county level and are in parentheses. *p<0.05, **p<0.01, ***p<0.001

Note (Figure 8): Error bars represent the 95 percent confidence interval.
The estimate of θ captures the change in college enrollment as the proportion of treated seniors in a county increases. Table 4 shows the estimation results from Equation 3. Similar to the primary results, there are no positive treatment effects; the coefficient for FractionTreatedSeniors is negative and indistinguishable from zero. As the fraction of treated seniors increased, postsecondary enrollment did not correspondingly increase.
To determine the marginal effect of an additional treated high school on college enrollment, I estimate Equation 4, where TreatedHS_ct is the number of treated high schools in county c and year t, and zero otherwise:

Enrollment_ct = θ TreatedHS_ct + X_ct β + γ_c + τ_t + ε_ct   (4)

Results are presented in Table 5 and point to no increase in enrollment because of the program.
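Both intensity measures, FractionTreatedSeniors and the count of treated high schools, can be constructed from high-school-level grant records; the schools and senior counts below are hypothetical:

```python
def county_treatment(high_schools):
    """high_schools: list of (treated, n_seniors) tuples for the high
    schools in one county-year. Returns the pair
    (fraction_treated_seniors, n_treated_high_schools)."""
    total = sum(n for _, n in high_schools)
    treated = sum(n for t, n in high_schools if t)
    count = sum(1 for t, _ in high_schools if t)
    return (treated / total if total else 0.0, count)

# Hypothetical county with three high schools, one holding a grant:
frac, n = county_treatment([(True, 80), (False, 120), (False, 100)])
print(frac, n)   # 80 of 300 seniors are in a treated school
```

A county-year with no treated high school yields (0.0, 0), matching the zero values of the treatment variables in Equations 3 and 4.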

End of Program Effects
An alternative method for identifying treatment effects is analyzing attendance patterns at the end of the program. If the intervention temporarily boosted college attendance, one would expect enrollment to decrease after the OACHE grants expire, or at least after any lingering effects of the program (e.g., new information for faculty, culture change) diminished. Figure 9 shows coefficients for year indicators relative to the last year a county received a grant, time t. The coefficient for time t + 1 estimates the change in enrollment for the first year without a grant. Results suggest that, on average, college attendance did not change after a county stopped receiving OACHE funding, up to seven years after the funding stopped.

Table note: Observations by column: 1,196; 1,196; 1,193; 1,193. The dependent variable in the first three columns is the logged number of students from each Ohio county attending Ohio public postsecondary institutions. The variable is not logged in Column 4, and the regression is weighted by the county's number of high school students. Data are from ODHE. The control group consists of the 28 non-Appalachian rural Ohio counties, in addition to currently untreated ever-treated counties.
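The indicators in this end-of-program design index years relative to a county's last grant year rather than its first; a sketch, with a hypothetical 2009 final grant year and leads capped at seven years:

```python
def post_program_time(last_grant_year, year, max_lead=7):
    """Event time relative to the LAST year a county held a grant;
    positive values index years after funding ended. Returns None
    for never-treated counties."""
    if last_grant_year is None:
        return None
    return min(year - last_grant_year, max_lead)

# Hypothetical county whose final grant year was 2009:
print([post_program_time(2009, y) for y in (2009, 2010, 2012, 2020)])
```

If treatment effects were temporary, coefficients at positive event times would turn negative; flat post-program coefficients instead indicate no fade-out because there was no boost to fade.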

Alternative Units of Observation: High Schools and School Districts
The county-level college attendance data used first are helpful because of their long time series and institution-level granularity. One can examine college attendance trends before the OACHE program began and understand how attendance at various types of institutions changed over time. A downside to using these data is the imprecision when identifying treatment at the high school level. I use a coarse definition of treatment in Equation 1; GrantYear is equal to one for a county when any high school in that county is treated. Only changes in county-level attendance can be detected, yet treatment is at the high school level. Analyzing district-level attendance data provides a better setting for estimating the true change in enrollment after treatment. The Ohio Department of Higher Education provides this information in College Readiness Reports beginning in 2001, seven years after the OACHE's creation. The data measure the number of spring high school graduates from Ohio public high schools who immediately enroll in an Ohio public institution for the summer or fall semester of the same year.
The ODHE releases these fall enrollment data for high schools and school districts annually. However, data are missing at the high school level in 2006 and 2007. Because these two years fall during the treatment period and the dataset starts after the program began, in 2001, this lack of data is unfortunate. Additionally, schools opened, closed, and merged throughout the time period. Therefore, to identify school-level changes in college enrollment, I use public school districts with only one high school throughout the sample period. This subset constitutes more than 75 percent of my school district sample. In addition, over 70 percent of treated districts were one-high-school districts. To identify district-level changes in college enrollment, I use the full sample of school districts, comprised of districts that appear in all twelve years of the dataset (2001-2012). The district data are available in 2006 and 2007, and analysis at the district level mitigates concerns regarding school mergers or closings. In this section, I show results using untreated Appalachian districts as the control group, which are the closest in comparison to treated districts. Results with other control groups are presented in the Appendix, with similar conclusions.
Using these data, I estimate Equation 5. In this specification, ψ captures the immediate change in college freshmen enrollment from district d in time t following receipt of an OACHE grant. Table 6 displays the estimated coefficients.
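Since Equation 5 is not printed in this excerpt, a plausible rendering of the district-level specification, based on the description above, could take the following form (symbols inferred; functional form of the dependent variable may differ in the published paper):

```latex
% Hedged sketch of Equation 5 as described: district-level enrollment on
% grant receipt, with district and year fixed effects. Notation is inferred.
\mathit{Enroll}_{dt} = \psi \, \mathit{Grant}_{dt}
  + X_{dt}'\beta + \alpha_d + \delta_t + \varepsilon_{dt}
```

Here ψ captures the immediate change in college freshman enrollment from district d in year t following receipt of an OACHE grant, with α and δ denoting district and year fixed effects.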
Estimates are consistent with those of Table 2. The coefficients on the treatment variable are very close to zero, suggesting that districts receiving an OACHE grant did not send more students to college, on average, relative to control districts. Figure 10 plots coefficients from a variation of Equation 5, with relative-to-treatment-year indicators substituted for the treatment variable. Estimates are set relative to the last untreated year. Combining the granularity of the high school data with the completeness of the school district data, the estimates reinforce the primary conclusions: relative to control groups, average attendance for treated high schools did not increase following receipt of an OACHE grant.

DISCUSSION AND CONCLUSION
I estimate the causal effect of a regional, school-level information intervention on college enrollment in an area with historical economic and educational challenges. The program originated to address a lack of college attendance by students in Appalachian Ohio. It specifically targeted the dearth of information pertaining to the college attendance process. It did so in part by funding high school-level grants designed to promote college attendance; this often took the form of campus visits, college fairs, application assistance, financial aid and scholarship information, and mentorships.
While college attendance generally increased within the region, I find no evidence that these high school grants on average effectively increased college attendance relative to similar but untreated schools. I also find no evidence that the grants on average affected the composition of college attendance relative to similar yet untreated schools. These findings,
although in line with recent studies, contradict anecdotal evidence of substantial success. I give three reasons for the discrepancy. First, the empirical approach of this paper measures causal changes in college enrollment, that is, fluctuations in college attendance relative to a control group. The counterfactual counties represent the change in college enrollment that likely would have occurred absent OACHE treatment. While enrollment from treated counties did increase over the time frame, it did not increase relative to the control group of counties, and this distinction may dilute the reported success of the program. Second, "summer melt" serves as another potential explanation for the discrepancy between these results and the Center's reports. In the summer after high school and before college, large proportions (up to one-third) of low-income students reconsider enrolling in colleges that they previously had committed to attend (Arnold et al., 2009). Interventions over the summer after high school have demonstrated marked increases in college enrollment (Castleman et al., 2012; Castleman and Page, 2017; Page and Gehlbach, 2017). Summer melt can account for differences between success stories and these findings to the extent that students reported plans to attend college prior to graduating high school and subsequently failed to enroll.
Lastly, variations in treatment could explain differences in results. A representative of the program, when asked why some treated high schools increase college enrollment while others do not, noted that in many cases an increase in postsecondary attendance directly correlates with the dedication and enthusiasm of the high school's grant coordinator (Ash Center for Democratic Governance and Innovation, 2011). To that end, reported improvements in college attendance may be attributable to extraordinary efforts by dedicated grant coordinators, while results on average are mixed.
The findings of this paper contribute to our understanding of information interventions in three specific ways. First, the program spanned 16 years and treated more than 23,000 seniors. It did so at the high school level with relatively intensive treatment compared to nudge-centric interventions. This finding agrees with Bird et al. (2019): interventions that succeed when administered at relatively small scales are often ineffective when scaled up and coordinated at higher levels.
This study also illuminates paths forward regarding college attendance in regions with historically low enrollment. The results suggest that a more holistic approach may be necessary to affect enrollment trends. One potential avenue for success may include ensuring students follow through with plans made during their senior year. One survey of Appalachian-area schools reported that 80 percent of seniors intended to enroll in college upon graduation, but follow-up surveys indicate that only 88 percent of those who planned to enroll actually did so in the following summer or fall (Ohio University, 2009), implying that roughly 70 percent (0.80 × 0.88 ≈ 0.70) of all seniors enrolled. Even this is likely an upper bound, given response bias in the follow-up surveys.
Third, it is concerning that the program failed to increase college attendance relative to untreated schools despite its customizable structure. Local administrators and guidance counselors, who may be best positioned to understand the specific problems at a given high school, were unable to influence college enrollment with the resources allotted. This raises new questions for study: Are the grant amounts too small? Are the initiatives ill-suited to the setting? Are complementary interventions needed alongside information to increase college enrollment?
This research joins a growing body of literature suggesting that filling information gaps alone may not be adequate for increasing college attendance, particularly when coordinated at the state or national level. In addition, the difficulty of this task can vary with the setting. Understanding what is and is not effective at reducing educational disparity, and where, remains a critical task and presents ample opportunity for further study.
Observations: 1,196; 1,196; 1,193; 1,193. Note: The dependent variable in the first three columns is the logged number of students from each Ohio county attending any Ohio public postsecondary institution. The variable is not logged in Column 4, and the regression is weighted by the county's number of high school students. Data are from ODHE. The control group consists of the 28 non-Appalachian rural Ohio counties, in addition to currently untreated ever-treated counties (24 counties are treated: (28+24 counties)×23 years=1,196 observations). An indicator for the 1993 unemployment rate quartile among ever-treated counties is interacted with GrantYear in Equation 1. Regressions include county and year fixed effects and a control for logged county population (BEA). Economic controls consist of logged real per capita personal income (BEA), logged labor force (BLS), and the logged unemployment rate (BLS). Education controls are at the county level and consist of the logged number of high school students (Columns 1-3), the number of public high schools, the share of black students (all grades), and the share of white students (all grades), all of which are from ElSi. Population and labor force variables are not logged in Column 4. Data span 1990-2012. Some early educational variables are missing for certain counties. Standard errors are clustered at the county level and are in parentheses. *p<0.05, **p<0.01, ***p<0.001

APPENDIX
Note: Average attendance is the group mean of the total number of attenders from a county, weighted by the high school student population. Attendance data are from ODHE. Population data are from BEA. Treatment begins in 1994 and continues through 2009. The base year is 1993. For pre-trends conditional on control variables relative to treatment year, see Figure 8. "Treated Counties" are those that are ever treated.
Note: The figure plots estimates by time to treatment. The dependent variable is the number of students from each Ohio county attending any Ohio public postsecondary institution. Data are from ODHE. The control group consists of the 28 non-Appalachian rural Ohio counties, in addition to currently untreated ever-treated counties. The regression includes county and year fixed effects and a control for county population (BEA), logged real per capita personal income (BEA), labor force (BLS), logged unemployment rate (BLS), number of public high schools (ElSi), share of black students (all grades) (ElSi), and share of white students (all grades) (ElSi). The regression is weighted by the number of high school students in the county. Data span 1990-2012. The regression includes the maximal number of observable leads and lags from the first year of treatment. Some early educational variables are missing for certain counties. Standard errors are clustered at the county level. N=1,193. Error bars represent the 95 percent confidence interval.

(c) Two-Year Colleges
Note: The dependent variable is the number of students from each Ohio county attending Ohio public postsecondary institutions; data are from ODHE. The control group consists of the 28 non-Appalachian rural Ohio counties, in addition to currently untreated ever-treated counties. The regression includes county and year fixed effects and a control for county population (BEA), logged real per capita personal income (BEA), labor force (BLS), logged unemployment rate (BLS), number of high school students (ElSi), number of public high schools (ElSi), share of black students (all grades) (ElSi), and share of white students (all grades) (ElSi). Data span 1990-2012. Some early educational variables are missing for certain counties. Standard errors are clustered at the county level. Error bars represent the 95 percent confidence interval.