Ohio Policy


Last year’s biennial budget (HB 64) required Ohio to define what it means to be a “consistently high-performing teacher” by July 1, a date that is fast approaching. This particular provision aimed to make life easier for such teachers by excusing them from the requirement to complete additional coursework (and shell out extra money) each time they renew their licenses. It also exempts them from additional professional development requirements prescribed by their districts or schools. Who could oppose freeing some of our best teachers from a couple of burdensome mandates?

More people than you might think, starting with the group tasked with defining “consistently high-performing”: the Ohio Educator Standards Board. The board recently “voted unanimously to oppose the law,” according to Gongwer News—never mind that the law passed last year and contesting it now is futile. Chairwoman Sandra Orth said that defining a high-performing teacher was disrespectful and unproductive. Ohio Federation of Teachers (OFT) President Melissa Cropper also weighed in, calling it “another slap in the face to our profession.” Meanwhile, state board of education member A.J. Wagner characterized this provision as a “law that was made to be broken” and urged fellow members to follow in the footsteps of the standards board and refuse to approve a definition. That’s not an option in the face of a statutory deadline, but it shows how far some are willing to go in order to grandstand.

What aspect of the high-performing teacher definition is so disconcerting, exactly? It doesn’t eliminate professional development, as the provision’s opponents misleadingly suggest; it eliminates specific requirements for Ohio’s top-performing teachers, which is meant to free them up so they can seek out the learning opportunities that they deem most worthwhile. The backlash from teachers’ unions and their allies is both disappointing and predictable. They’ve been outspoken critics for years as Ohio legislators have attempted to alter the teacher evaluation system in an effort to better differentiate teacher performance (even if it hasn’t done so very well). Even though the provision would free certain teachers from onerous requirements, opponents are critical because it would necessitate a differentiation of performance within the teaching profession. Still, it’s an awfully low-stakes provision to protest (in contrast to, for example, teacher performance designations tied to layoffs). And it doesn’t advance the cause of Ohio’s teachers, especially its top performers, at all.

There’s a divide between those who believe teacher performance can and should be measured and those who argue the opposite—perhaps because they don’t want it to be measured. The best strategy to contest a system that attempts to reward high performers is to call into question the idea that it’s possible to measure or define high performance in the first place. The standards board seems to be employing this strategy—Chairwoman Orth said that a high-performing teacher can’t be defined with “any kind of validity or consistency,” as did the OFT in its ponderously titled blog post, “Does a 'consistently high performing teacher' exist?” Cropper said that the OFT will be lobbying for the removal of the definition in law. That’s right: The mere designation of a high-performing teacher—and any incentives or rewards attached to it—is so insidious that the OFT is willing to spend taxpayer resources to lobby against it. All of this serves as further evidence that vested interests feel a knee-jerk need to defend the status quo, even in areas that are relatively low-stakes and where top-performing teachers could benefit.

The recent spat also underscores a rift between those who believe public education needs freedom from regulation (as Fordham articulated a year ago in our Getting out of the Way report) and those who want to see regulation applied uniformly. Teachers’ unions and those in the latter camp champion one-size-fits-all mandates selectively. In one breath, they call for expensive class-size requirements, the application of traditional school mandates to charter schools, and treating teachers equally. In the next, they complain about state and federal accountability mandates that they perceive to threaten local autonomy and exalt the principles of freedom and creativity (look at the Ohio Education Association’s vision for accountability under ESSA, premised on “upholding creativity over standardization”). It is ironic indeed that many of the same people calling for freedom and trust on high-stakes accountability matters actively oppose a low-stakes provision that would allow Teacher-of-the-Year winners to determine what additional training to pursue (instead of the state prescribing it for them).

Despite the standards board’s abdication of responsibility, Ohio statute—and the impending deadline—remains. The Ohio Department of Education went ahead and developed a definition that was approved at the June state board of education meeting.[1] At least one group decided to meet its statutory responsibilities. The latest mini-scuffle about teacher quality in Ohio is a reminder that teachers’ unions sometimes behave in ways that are surprisingly anti-teacher and—in that they selectively bemoan certain one-size-fits-all policies while simultaneously defending ones that treat their members in one-size-fits-all fashion—hypocritical.


[1] Teachers must receive the highest summative rating on the Ohio Teacher Evaluation System for four out of the last five years and meet at least one additional leadership criterion for three out of five years. The leadership criteria are: possession of a senior or lead professional education license; holding a locally recognized teacher leadership role; serving in a leadership role for a national or state professional academic education organization; and/or receiving a state or national educational recognition or award.
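The footnote’s two-part test is simple enough to encode in a few lines. The sketch below is illustrative only: the function and field names are hypothetical, and “Accomplished” stands in for the highest OTES summative rating.

```python
# Illustrative sketch of the "consistently high-performing" test described
# in footnote [1]. Function and field names are hypothetical, not ODE's.

def is_consistently_high_performing(ratings, leadership_years):
    """ratings: the teacher's last five annual OTES summative ratings;
    leadership_years: five booleans, True if the teacher met at least
    one leadership criterion that year."""
    top_rating_years = sum(1 for r in ratings if r == "Accomplished")
    leadership_count = sum(leadership_years)
    # Highest rating in four of five years AND leadership in three of five
    return top_rating_years >= 4 and leadership_count >= 3
```

For example, a teacher with the top rating in four of the last five years and a leadership role in three of them would qualify; one with the top rating in only three years would not.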

 

Traditional districts that serve as charter school sponsors are often glossed over in the debate over Ohio’s charter sector. But given their role in two recent reports, it’s an opportune time to take a closer look at their track record.  

First, a Know Your Charter report covered the failings of a number of Buckeye charters receiving federal startup funds (either they closed or never opened). Though the report itself didn’t draw attention to it, we pointed out that school districts sponsored more than 40 percent of these closed schools. Meanwhile, the auditor of state released a review of charter school attendance; among the three schools referred for further action because of extraordinarily low attendance, two had district sponsors (the third was sponsored by an educational service center).

With all of the talk about charters being created to privatize education, it might surprise you to learn that Ohio school districts have long had the authority to sponsor (a.k.a. authorize) charters. In fact, the Buckeye State allows districts to sponsor either conversion or startup charters within certain geographic limitations (e.g., a school must be located within a district’s jurisdiction or in a district nearby).[1] Throughout our eighteen-year charter history, there have been 105 district-sponsored charters—almost one-fifth of Ohio charters ever opened—authorized by sixty-eight districts. Presently, there are forty-two active district sponsors—roughly 7 percent of districts in the state—that together authorize sixty-two schools, the majority of which are dropout recovery schools.

This article takes a closer look at district-sponsored charters along two dimensions: school performance and school closure. Each charter school is linked to the sponsor of record as reported in ODE’s 2014–15 Community Schools Annual Report (Table 2). For closed or suspended schools, the sponsor of record is identified via ODE’s Closed School Directory.[2]

School performance

It’s vitally important to examine the academic quality of the schools in sponsors’ portfolios. Interestingly, CREDO’s 2014 study on Ohio charters contains an analysis that linked a school’s impact to its sponsor. In short, the analysis did not find appreciable differences in charter school impact based on sponsor type (e.g., non-district versus district). But the data for that analysis end with the 2012–13 school year, so it might be useful to start by taking a look at the most recently available report card data. Of course, we acknowledge that the following comparisons are not nearly as rigorous as student-level analyses like CREDO’s.

General education schools (i.e., non-dropout-recovery schools)

Districts only rarely sponsor general education charter schools, meaning that only a small number of their schools can be compared to non-district-sponsored ones. Along the value-added measure (i.e., student growth), only seventeen district-sponsored charter schools received a rating in both 2013–14 and 2014–15. The majority of these schools were sponsored by either the Cleveland or Reynoldsburg school districts. We elect to use the value-added measure, rather than a proficiency-based one, since it is more likely to reflect actual school performance.[3]

With those qualifications in mind, Table 1 displays the distribution of schools’ value-added ratings by sponsor type for 2013–14 and 2014–15. As you can see, district-sponsored schools have a slight edge with respect to the percentage of schools earning A ratings on value-added: Thirty-five percent of their schools received such a rating in both years, versus 25 and 20 percent for non-district-sponsored schools. The strong performance for districts can be partly attributed to Cleveland, which sponsors several of the high-performing Breakthrough charters (evidence that districts can and do sponsor excellent charter schools). The results across the other rating categories are generally inconclusive—either very similar (B and D ratings) or inconsistent across the two years (C and F).

Table 1: Charter school performance on Ohio’s value-added measure by sponsor type, 2013–14 and 2014–15

Dropout-recovery schools

Because districts tend to sponsor dropout-recovery schools, we should take stock of those results as well. In the 2014–15 school year, thirty-five district-sponsored dropout-recovery schools received a progress rating—a measure that is akin to value-added but uses a norm-referenced exam (for more on this measure, see here). As you can see from Table 2, district-sponsored schools somewhat outperform their counterparts: Fifty-one percent received the Meets Standards rating, versus 26 percent of non-district-sponsored schools. However, the fact that only one dropout-recovery school overall received the top rating on this dimension (Exceeds) raises questions about whether any of the dropout-recovery schools—district-sponsored or otherwise—are truly measuring up, or whether there are challenges with the measure that demand closer attention. (The 2014–15 school year was the first year of implementation.) In sum, it is fair to say that the jury is out on the overall quality of district-sponsored dropout-recovery schools, both in absolute terms and in relation to non-district sponsors.

Table 2: Dropout-recovery charter school performance on Ohio’s progress measure, by sponsor type, 2014–15

School closure

Closure is not the only—or even best—way to define sponsorship quality, but it is a useful data point. Sponsors are responsible for vetting and overseeing schools; as such, good sponsors shouldn’t be linked to an overabundance of closed schools. Of course, chronically low-performing charter schools should close—this is an essential though difficult part of responsible authorizing—so some closures are to be expected. Figure 1 shows that over the life of Ohio’s charter sector, a considerably larger share of district-sponsored schools than non-district-sponsored ones has closed (45 percent versus 32 percent). In other words, almost half of the schools that districts have sponsored no longer exist today.

Figure 1: Percentage of charters closed: district sponsors versus non-district sponsors, 1998–99 to 2014–15

But perhaps some of these closed schools opened early in the history of Ohio’s charter program and operated for, say, eight or nine years. Another way of slicing the data is to zero in on schools that closed shortly after opening—the infamous “fly-by-night” schools. A school that closes before reaching its fifth anniversary is much more likely to have had flaws that could have been identified during a rigorous application process, and its closure is more likely to indicate an error in the sponsor’s judgment. When using a five-year threshold, district sponsors again appear to lag slightly behind. Figure 2 shows that a greater percentage of district-sponsored schools closed before reaching this mark (30 percent) than those with a non-district sponsor (21 percent). Taken together, Figures 1 and 2 show that district sponsors are not necessarily more successful at authorizing schools that remain open. Many will find this surprising given that school districts have overseen schools for a century.

Figure 2: Percentage of charters closed before reaching five years of operation: district sponsors versus non-district sponsors, 1998–99 to 2014–15

***

A closer look at district sponsorship reveals some of the same warts as those of the charter sector as a whole. The academic performance of district-sponsored schools varies in much the same way as charters sponsored by other entities. Traditional districts, like their non-district counterparts, have sponsored schools that closed shortly after launch. The struggles of district-sponsored charters shouldn’t be ignored. (Conversely, sponsors of successful schools deserve our praise.) Fortunately, Ohio’s recent charter reforms will force low-capacity sponsors—district and non-district alike—out of the authorizing business. Their exit, along with others, may not be an altogether bad thing.

For better or worse, school districts have played an important role in the development of Ohio’s charter sector. When talking about charter sponsorship, let’s not let districts fly under the radar.


[1] Generally speaking, a conversion charter school refers to the conversion of an existing district school (fully or partially) into a charter school; a start-up charter is a new school.

[2] Three schools without sponsors of record in both files were identified through OEDS-R. A school can switch from a district to a non-district sponsor or vice-versa. In cases such as these, the following analysis assumes that a school’s last sponsor of record is the one that should be held accountable.

[3] The same reasoning applies in the section on dropout-recovery schools.

 

One of the most controversial aspects of school accountability is how to identify and improve persistently low-performing schools. Under NCLB, states were required to identify districts and schools that failed to make the federal standard known as adequate yearly progress. Failure led to a cascading set of consequences that were viewed by many as inflexible and ineffective.

The passage of a new national education law—the Every Student Succeeds Act (ESSA), signed by President Obama in December—has shifted more of the responsibility for identifying and intervening in persistently low-performing schools to states (though the Department of Education’s regulations attempt to pull some of that responsibility back to Washington—more on that later).

School identification under ESSA is determined by a state’s “system of meaningful differentiation.” This is based on the state’s accountability system, including indicators of student proficiency, student growth, graduation rates, and English language proficiency. The use of these indicators isn’t optional, though the weight of each (and the methodology crafted from them that is then used to identify schools) is left up to states. Using their chosen methodology, states are required to identify a minimum of two statewide categories of schools: comprehensive support and improvement schools and targeted support and improvement schools.[1]

Comprehensive support schools must be identified at least once every three years beginning in the 2017–18 school year. This category must include the lowest-performing 5 percent of Title I schools (it’s possible for states to include a higher percentage, but likely politically unfeasible) and all public high schools that fail to graduate 67 percent or more of their students. States are responsible for notifying districts of any schools that are identified as part of the comprehensive category. Once districts have been notified, they must work with local stakeholders to develop and implement a comprehensive support and improvement plan. By law, these plans must be informed by all the indicators in the state’s accountability system, include evidence-based interventions, be based on a school-level needs assessment, and identify resource inequities that must be addressed.[2] The plan must be approved by the school, the district, and the state. Once approved, the state is responsible for monitoring and reviewing implementation of the plan.
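To make the identification rule concrete, here is a minimal sketch of how a state might flag comprehensive support schools. It assumes a hypothetical list of school records with a single accountability index; real systems combine multiple weighted indicators, so this is an illustration, not anyone’s actual methodology.

```python
# Minimal sketch of ESSA's comprehensive-support identification rule:
# the lowest-performing 5 percent of Title I schools, plus any public
# high school graduating fewer than 67 percent of its students.
# School records and the single "index" score are hypothetical.

def comprehensive_support_schools(schools):
    """schools: list of dicts with 'name', 'title_i' (bool), 'index'
    (accountability score, higher is better), 'is_high_school' (bool),
    and 'grad_rate' (0-1). Returns the set of flagged school names."""
    title_i = sorted((s for s in schools if s["title_i"]),
                     key=lambda s: s["index"])
    cutoff = max(1, round(0.05 * len(title_i)))  # lowest 5 percent
    flagged = {s["name"] for s in title_i[:cutoff]}
    # All public high schools below the 67 percent graduation threshold
    flagged |= {s["name"] for s in schools
                if s["is_high_school"] and s["grad_rate"] < 0.67}
    return flagged
```

Note that the graduation-rate test applies to all public high schools, Title I or not, which is why it is a separate pass rather than a filter on the Title I list.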

Targeted support schools are identified as such when any subgroup[3] of their students is labeled “consistently underperforming” by the state.[4] Similar to comprehensive schools, the state must notify districts of any schools that have been identified as targeted. However, while comprehensive support schools are subject to plans crafted and implemented by the district, targeted support and improvement plans are developed and implemented by the school rather than the district. The parameters for the plan are largely the same: The school must seek stakeholder input, and the plan must be informed by all indicators and include evidence-based interventions. Schools that have a subgroup performing as poorly as the bottom 5 percent of schools in the state must also identify resource inequities. The plan must be approved by the district, and the district is responsible for monitoring implementation. If the plan ends up being unsuccessful after a certain number of years (as determined by the district), “additional action” is authorized by ESSA.

States are tasked with a few additional responsibilities regarding identified schools. First, they must establish statewide exit criteria for schools that are identified as comprehensive and targeted. If these criteria aren’t satisfied within a certain number of years (determined by the state, but not to exceed four years), the school is subject to more rigorous state-determined action and intervention. Second, states must periodically review resource allocation intended for school improvement for districts that serve a significant number of comprehensive and targeted schools. Third, states must provide technical assistance to each district that serves a significant number of schools implementing comprehensive or targeted support plans. ESSA also permits (but does not require) states to “initiate additional improvement” in any district with a significant number of schools that are consistently identified as comprehensive and fail to meet exit criteria, or in any district with a significant number of targeted schools. 

Interestingly, if a targeted school with a subgroup performing as poorly as the bottom 5 percent of schools in the state fails to satisfy exit criteria within the state-determined number of years, the school as a whole must be identified as a comprehensive support and improvement school. The implications of this provision are enormous: A school that the public perceives as high-performing could land in the statewide comprehensive improvement category by failing to significantly improve outcomes for a particular subgroup of students. While NCLB already required states to disaggregate achievement data based on certain subgroups, ESSA added three new subgroups to the mix: homeless students, foster care students, and children of active duty military personnel. Given that a single persistently low-performing subgroup can force an entire school into the comprehensive category, these subgroups—data for many of which have either not been compiled or have flown under the radar—could draw serious attention from districts and schools. Sample sizes will matter immensely, and in some cases they may be the only thing that stands between dozens of schools and their identification as comprehensive.[5]

The Department of Education’s (USDOE) recently released proposed regulations will, if enacted, add some additional requirements and responsibilities. Here are a few of the most significant proposed regulations related to school identification and support:

  • Performance on a “school quality or a student success” indicator (like teacher or student engagement) cannot be used to justify the removal of a school from the comprehensive or targeted categories unless the school or subgroup is also making significant progress on at least one of the academic indicators that ESSA requires to be more heavily weighted—achievement, growth, graduation rate, or English language proficiency.
  • While states can create their own definitions for what constitutes a “consistently underperforming” subgroup, the regulations provide suggested definitions.
  • The identification of consistently underperforming subgroups that can lead to a school’s designation as targeted for support and intervention must begin in 2018–19.
  • Any district that is notified by the state of a comprehensive school identification must notify parents of each enrolled student about the identification, the reason(s) for identification, and an explanation of how they can be involved.
  • The proposed regulations list components that each comprehensive support and improvement plan must have, including a description of how stakeholder input was solicited and taken into account.

Overall, while ESSA may have eliminated some of NCLB’s more controversial school intervention provisions, there are still plenty of mandates that states and districts will need to be aware of in order to comply with the new law. Furthermore, although there is significant potential for innovation, personalized support, and stakeholder input in the more localized intervention process, there is also an incredible amount of risk. Without strict oversight and discipline, some states (and districts and schools) could opt to simply go through the motions. That’s a future we just can’t afford.


[1] States are permitted to add additional statewide categories for schools.

[2] Identifying resource inequities can include a review of district- and school-level budgeting.

[3] ESSA requires the following subgroups to be reported on: race and ethnicity, gender, English language proficiency, migrant status, disability status, low-income status, homeless students, foster care students, and children of active duty military personnel.

[4] The state determines consistently underperforming subgroups based on the indicators used in the state’s accountability system.

[5] ESSA leaves selecting a sample size up to states, but the USDOE’s proposed regulations call for states to “submit a justification” for any size larger than thirty students.

 

School choice advocates have long agreed on the importance of understanding what parents value when selecting a school for their children. A new study from Mathematica seeks to add to that conversation and generally finds the same results as prior research. What makes this study unusual, however, is that its analysis is based on parents’ rank-ordered preferences on a centralized school application rather than on self-reported surveys.

To analyze preferences, researchers utilized data from Washington, D.C.’s common enrollment system, which includes traditional district schools and nearly all charters. D.C. families that want to send their children to a school other than the one they currently attend (or are zoned to attend) must submit a common application on which they rank their twelve most preferred schools. Students are then matched to available spaces using a random assignment algorithm.
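To make the mechanics concrete, here is a toy sketch of lottery-based matching over ranked choice lists. It is a deliberately simplified stand-in (a serial-dictatorship-style pass in random order), not the actual algorithm D.C.’s system uses, and all names in the example are hypothetical.

```python
import random

# Toy illustration of matching students to schools from ranked choice
# lists via a random lottery order. This is NOT the actual D.C. matching
# algorithm; it only shows how ranked preferences plus random ordering
# can produce capacity-respecting assignments.

def match(preferences, capacities, seed=0):
    """preferences: {student: [school, ...]} ranked best-first;
    capacities: {school: seats}. Returns {student: school or None}."""
    rng = random.Random(seed)
    order = list(preferences)
    rng.shuffle(order)                 # random lottery order
    seats = dict(capacities)
    assignment = {}
    for student in order:
        assignment[student] = None     # unmatched unless a seat is found
        for school in preferences[student]:
            if seats.get(school, 0) > 0:
                seats[school] -= 1     # take the highest-ranked school
                assignment[student] = school
                break
    return assignment
```

With ample capacity, every applicant simply receives his or her first choice; the lottery order only matters when a listed school runs out of seats.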

The study tests for five domains of school choice factors: convenience (measured by commute distance from home to school),[1] school demographics (the percentage of students in a school who are the same race or ethnicity as the chooser), academic indicators (including a school’s proficiency rate from the previous year), school neighborhood characteristics (crime rates and measures of residents’ socioeconomic status), and other school offerings (including average class size, uniform policies, and the availability of before- and after-school care). Findings suggest that, of the five factors, convenience, school academic performance, and student body composition are the most predictive of how parents rank school alternatives. (The analysis focuses on only entry grade levels—pre-K and kindergarten for elementary schools, grades five and six for middle school, and grade nine for high school—since these are the most common levels for which families submit applications.)

In terms of subgroup breakdowns, the economic status of choosers affected preferences for elementary and middle school applicants, but not high school applicants. For example, in elementary school, higher-income applicants preferred schools with high percentages of students of the same race and lower percentages of low-income students; low-income applicants didn’t share the same preferences. In middle school, both low- and higher-income applicants were influenced by school academic performance; however, low-income choosers focused on school proficiency rates (which were observable on the application website) while higher-income choosers were more influenced by accountability ratings (which were not immediately available on the application site). Breakdowns for the three largest race/ethnicity groups (white, Hispanic, and African American) in elementary school showed that while white choosers preferred schools with larger percentages of students from the same racial group, African American choosers “essentially showed indifference for own-group racial composition.” In middle school, however, every group except Hispanic applicants had a “pronounced own-group preference and a slight preference for diversity.”

To round out their analysis, the researchers use their model to predict how parents would rank schools under alternative scenarios. For example, if capacity constraints were eased so that more applicants were able to attend their most preferred schools, enrollment in high-performing schools would increase and segregation by race and income would decrease. Closing the lowest-performing schools would also increase enrollment in high-performing schools and decrease segregation.

Overall, while there are limitations to this particular study and others like it, it’s a valuable analysis of what parents look for in schools—and the importance of expanding their options.

SOURCE: Steven Glazerman and Dallas Dotter, “Market Signals: Evidence on the Determinants and Consequences of School Choice from a Citywide Lottery,” Mathematica Policy Research (June 2016).


[1] To ensure accurate distance measurements between schools and residences, the study restricts the student sample to only lottery applicants with a valid D.C. address (approximately 98 percent of applicants).

 

On the heels of national research studies that have uncovered troubling findings on the performance of virtual charter schools, a new report provides solid, commonsense policy suggestions aimed at improving online schools and holding them more accountable for results. Three national charter advocacy organizations—NAPCS, NACSA, and 50CAN—united to produce these joint recommendations.  

The paper’s recommendations focus on three key issues: authorizing, student enrollment, and funding. When it comes to authorizers, the authors suggest restricting oversight duties for statewide e-schools to state or regional entities, capping authorizing fees, and creating “virtual-specific goals” to which schools are held accountable. Such goals, which would be part of the authorizer-school contract, could include matters of enrollment, attendance, achievement, truancy, and finances. On enrollment, the authors cite evidence that online education may not be a good fit for every child, leading them to suggest that states study whether to create admissions standards for online schools (in contrast to open enrollment). They also recommend limits to enrollment growth based on performance; a high-performing school would have few, if any, caps on growth, while a low-performer would face strict limits. Finally, the report touches on funding policies, including recommendations to fund online schools based on their costs and to implement performance-based funding, an approach that four states have already piloted for online schools (Florida, Minnesota, New Hampshire, and Utah). Interestingly, the report notes how the design of the performance-based funding model varies from state to state. New Hampshire, for example, takes a mastery-based approach (with the teacher verifying mastery), while Florida requires the passage of an end-of-course exam—as determined by the state—to trigger payment.

Perhaps the paper’s most intriguing idea is that states consider decoupling virtual schools from their charter laws. The authors write, “States may need to consider governing full-time virtual schools outside of the state’s charter school law, simply as full-time virtual public schools.” Indeed, laws and regulations crafted with brick-and-mortar charter schools in mind may be poorly suited to the unique environment of online schools. Enrollment and funding policies are just two examples where it would be helpful to have a separate set of rules (e-schools, however, should not be held to different academic standards).

State policy makers, including those in our home state of Ohio (which has a large e-school sector), should pay close attention to this report. As my colleague Chad Aldis noted, “Virtual schools have become and will remain an important part of our education system.” If policy misalignment is indeed at least partly behind the poor results we’ve observed, adopting the report’s recommendations would be a major step forward for online education.

Source: National Alliance for Public Charter Schools (NAPCS), the National Association of Charter School Authorizers (NACSA), and 50CAN, “A Call to Action to Improve the Quality of Full-Time Virtual Charter Public Schools,” (June 2016).

A short article published this week in the Columbus Dispatch makes serious reporting mistakes that leave readers with a distorted view of school finance. According to the article, a Columbus citizen millage panel recently discussed a state policy known as the funding “cap.” In brief, this policy limits the year-to-year growth in state revenue for any particular school district. As we’ve stated in the past, funding caps are poor public policy because they deny districts revenue they ought to receive under the state’s funding formula. State lawmakers should kill the cap; it circumvents the state’s own formula, it’s unfair to capped districts, and it ultimately shortchanges kids.

The article would’ve been right to stop there. Yet somehow charter schools got pulled into the discussion, and that is where the coverage went way off track. The Dispatch writes:

But the formula for one class of school [i.e., Columbus district schools] is now capped, while the other [Columbus charters] isn’t.…But today Columbus charters get $142.4 million from the state to teach 18,000 students, while the district is left with $154.4 million to teach the remaining 52,000 kids, many of whom rank among the poorest in the state. 

The Dispatch seems to have forgotten that Ohio’s school districts are financed in fundamentally different ways from charters. Districts are part of a hybrid funding system—raising revenue from both state and local tax sources—while charters depend on state revenue alone. Any discussion that juxtaposes district and charter funding must keep that basic difference in mind (something that charter critics like to ignore as well). Let’s take a closer look at the problems with these statements and provide a fuller picture.

It is true that Ohio charters are not subject to state funding caps. But they are also not funded according to the district funding formula. To allocate state revenue to Ohio districts, legislators have created a very intricate funding formula, part of which includes adjustments that account for districts’ wealth (their local tax bases). These wealth-related factors, in addition to changes in enrollment, can determine whether a district is capped. A district with a declining tax base would receive more state aid, all else equal, but that increase in aid might be capped.

Charters, however, don’t have the authority to levy taxes; as such, they receive state funding on a per-student basis with no adjustments for local wealth. Inasmuch as caps are unfair to districts, establishing a cap on charters would be even more inappropriate. A cap could directly deny charters funding when they enroll additional students. Moreover, since they don’t have local funding to fall back on, it would put them at an even greater financial disadvantage.

Misunderstanding the state funding formula, for districts and charters alike, could be considered excusable. (It’s complicated!) But the second statement, comparing the revenue received by Columbus charters ($142 million) to that received by the Columbus City Schools ($154 million), is unpardonable. It misleads readers into believing that charters receive much more revenue per pupil—living high on the hog—than the district (roughly $7,900 versus $3,000 when you do the math). But these figures leave out the massive amount of local tax revenue that the district raises—and charters do not. Hence, the assertion wildly underreports the amount of public funding received by Columbus City Schools.
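A quick check of the per-pupil arithmetic, using only the figures reported in the article (state aid alone, before any local revenue), shows where those rough per-pupil numbers come from:

```python
# Per-pupil state aid implied by the Dispatch's own figures.
# These totals cover state dollars only -- local tax revenue is excluded.
charter_state_aid = 142_400_000    # state aid to Columbus charters
charter_students = 18_000
district_state_aid = 154_400_000   # state aid to Columbus City Schools
district_students = 52_000

charter_per_pupil = charter_state_aid / charter_students      # ~ $7,911
district_per_pupil = district_state_aid / district_students   # ~ $2,969

print(f"Charters:  ${charter_per_pupil:,.0f} per pupil in state aid")
print(f"District:  ${district_per_pupil:,.0f} per pupil in state aid")
```

The gap is real but tells only half the story, because state aid is the whole of a charter’s public funding and only one slice of a district’s.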

Let’s set the record straight: According to the district’s financial statements, Columbus City Schools raised over $370 million in local property tax revenue in the 2015 fiscal year—hundreds of millions of dollars that go to educate district pupils. In fact, according to the state’s Cupp Reports (one of the best sources for financial data) the district received more than $15,000 per student in state and local funding combined in 2014–15. This amount far exceeds what a typical Columbus charter school receives in public aid; it is simply inaccurate to suggest that charters receive more generous taxpayer funding than their nearest district.
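To illustrate, a back-of-the-envelope sketch using the article’s figures shows how adding just the local property tax levy reverses the comparison (the Cupp Report’s $15,000-plus figure covers all state and local revenue streams and official enrollment counts, so it runs higher still):

```python
# Rough sketch: fold the district's FY2015 local property tax revenue,
# which the Dispatch comparison omitted, into its state aid.
district_state_aid = 154_400_000    # state aid to Columbus City Schools
district_local_levy = 370_000_000   # local property tax revenue (FY2015)
district_students = 52_000

combined_per_pupil = (district_state_aid + district_local_levy) / district_students
print(f"District, state aid + local levy: ${combined_per_pupil:,.0f} per pupil")
# ~ $10,085 -- already well above the ~$7,900 per pupil in state aid that
# charters receive, even before counting the district's other revenue streams.
```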

In sum, here are two simple ideas to improve our discourse around school funding. First, let’s acknowledge the fundamental differences in how charters and districts are financed. Second, let’s focus on the overall taxpayer dollars that are being used to support the learning of Ohio’s schoolchildren. Getting the fundamentals right is a necessary first step before tackling the more complicated funding issues.
