Ohio Education Gadfly
Volume 5, Number 15
September 21, 2011
Event recap: Assuring Highly Effective Teachers for All Ohio Students
Report card recap
A Better Measure of Skills Gaps
SpongeBob, SB 5, and ACT
Terry Ryan / September 21, 2011
Ohio has been a national leader in using value-added measures of student academic growth. The current value-added system was piloted in 2007 and fully integrated into the state accountability system in 2008. Yet since then, Ohio’s value-added system has come under increasing scrutiny and criticism from some district superintendents and others in the field.
Critics of Ohio’s value-added system have raised concerns about its methodology and its usability, while others have criticized the Ohio Department of Education for giving too much weight to value-added when it rates local schools and districts. We’ve been tracking these issues and reporting on them since August 2008, when we published Ohio Value-Added Primer: A User’s Guide. As Ohio moves toward launching teacher evaluation systems in the fall of 2013 that, by law, must incorporate value-added analysis where available, Ohio’s current value-added metrics warrant additional attention.
Ohio’s measurement of value-added is based on the SAS Institute’s Education Value-Added Assessment System (EVAAS). EVAAS develops a customized prediction of each student’s progress based, if possible, on the student’s own academic record as well as that of other students over multiple years, with statewide test performance serving as an anchor. In short, EVAAS measures student academic growth in reading and math for grades four through eight using a complex calculation. The state uses these measurements to assign schools and districts one of three ratings:
1. Above expected growth – indicates that the students in a school or a district made greater progress than expected. These schools and districts are “adding value.”
2. Met expected growth – indicates that students made the amount of progress expected.
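In simplified terms, the classification step works by comparing measured growth against expected growth and assigning one of the state's three rating categories (above, met, or below expected growth). The sketch below is an illustration of that classification logic only, not the EVAAS statistical model itself; the threshold rule using a standard error as the margin is an assumption for illustration.

```python
def growth_rating(gain: float, expected: float, std_error: float) -> str:
    """Classify measured growth against expected growth.

    A simplified illustration: EVAAS's actual model is a complex
    longitudinal analysis. Here we just compare an estimated gain to
    the expected gain, using the standard error as the margin.
    """
    if gain > expected + std_error:
        return "Above expected growth"
    if gain < expected - std_error:
        return "Below expected growth"
    return "Met expected growth"

# A school whose students gained well beyond the expected amount:
print(growth_rating(gain=2.4, expected=1.0, std_error=1.0))
# prints "Above expected growth"
```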
Jamie Davies O'Leary / September 21, 2011
Last week Fordham, along with the Nord Family Foundation and Ohio Grantmakers Forum, convened two public discussions in Lorain and Cleveland on how districts across the state can improve teacher effectiveness. Panelists for the two events included Eric Gordon, CEO of the Cleveland Metropolitan School District; Mike Miles, superintendent of Harrison School District 2 in Colorado; Robert Sommers, director of the Governor’s Office of 21st Century Learning; and Kate Walsh, president of the National Council on Teacher Quality (NCTQ).
With Ohio’s new requirements surrounding evaluations – districts and charter schools have until the 2013-14 school year to adopt new evaluations based on a state model, slated to come out by the end of this year – the discussions delved into details of teacher evaluations and personnel policies tied to them.
Both events fostered productive conversations around this key issue. Be sure to check out footage of the September 13 event at the Cleveland City Club, or a brief video of Superintendent Mike Miles describing his district’s reforms to teacher evaluation and compensation. To view Twitter coverage of the events, search for the hashtag #EffectiveTeachers, or check out the newsfeed from our Twitter account, @OhioGadfly.
What can Ohio districts learn from Colorado’s Harrison School District 2?
Ohio school district leaders as well as state policymakers and education leaders should pay attention to what’s happening in the Harrison school district just outside of Colorado Springs.
September 21, 2011
This year, with the help of researchers from Public Impact in North Carolina, we continued our tradition of conducting an annual analysis of student achievement in Ohio’s Big 8 districts and charters. State report card data were released in late August, and we released a quick-turnaround analysis on Big 8 charters and traditional schools. However, this year we decided to dive deeper into the data and stagger various analyses day-by-day on the Fordham blog. This approach allowed us to develop a deeper and more nuanced perspective than a one- or two-day analysis simply could deliver. Many interesting findings emerged from the data; a few are highlighted below.
- Fewer students in Ohio’s urban district and charter schools attended a school rated D or F in 2010-11 (40 percent this year, down from 47 percent last year).
- Dayton student performance saw an uptick, with far fewer students attending an F-rated school, and more students meeting or exceeding value-added gains.
- Charter schools were some of the highest rated schools (according to both absolute achievement and growth) in the Ohio Urban 8, as well as some of the lowest rated.
- Cincinnati once again was the highest rated urban district, holding onto its “B” rating for a second year in a row. Only four percent of CPS students attended a school rated “F.”
- While 60 percent of both charter and district schools made expected growth on the state’s value-added measure, charter schools generally outperformed their district counterparts on this measure: a smaller percentage of charters (16 percent) than district schools (20 percent) failed to meet growth targets.
Jamie Davies O'Leary / September 21, 2011
AYP, or “adequate yearly progress,” has become one of the most derided parts of the federal No Child Left Behind Act and the accountability requirements it set in motion for states. Simply put, a school makes AYP if it is progressing adequately toward meeting NCLB’s goal of having 100 percent of children proficient in key tested subjects by 2014, and fails to make AYP if it isn’t. States set annual targets and have different methods for calculating whether schools are meeting these targets. Ohio, for example, is one of nine states under the federal “Growth Model Pilot Project” allowed to incorporate its growth model into AYP calculations.
But meeting AYP is like trying to ride an escalator that speeds up with each step you take. Most states set fairly low proficiency targets in earlier years and steeper ones in the years leading up to 2014 – making it increasingly difficult for schools to meet targets and increasingly likely that they will be labeled as failing. Take a look at Ohio’s changing target proficiency rates for math in a handful of grades over the past five years, and looking ahead to 2014.
Target proficiency rates in math among Ohio’s fourth, sixth, and tenth graders over time
Source: Ohio Department of Education website
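The escalator logic can be sketched in a few lines. In the simplest case – ignoring growth-model adjustments, subgroup rules, and safe-harbor provisions – a school makes AYP in a given year only if its proficiency rate meets that year’s target. The target values below are illustrative placeholders, not Ohio’s actual figures.

```python
# Simplified AYP check: a school makes AYP if its proficiency rate meets
# or exceeds the state's annual target. Targets here are illustrative
# only, rising toward the NCLB goal of 100 percent in 2014.
ANNUAL_TARGETS = {2010: 73.0, 2011: 79.0, 2012: 86.0, 2013: 93.0, 2014: 100.0}

def makes_ayp(proficiency_rate: float, year: int) -> bool:
    """Return True if the school's proficiency rate meets the year's target."""
    return proficiency_rate >= ANNUAL_TARGETS[year]

# A school holding steady at 85 percent proficient passes early targets
# but "fails" as the bar rises, even though nothing about it has changed.
print([makes_ayp(85.0, y) for y in sorted(ANNUAL_TARGETS)])
# prints [True, True, False, False, False]
```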
Clearly, each year it gets harder for schools to meet these targets. Even those schools serving kids well will have an increasingly difficult time getting the last 15, 10, or five percent of students to proficiency. On a national level, Secretary Duncan predicted earlier this year
September 21, 2011
As the nation attempts to pull itself out of economic recession, leaders and policymakers alike are struggling to find ways to integrate the millions of unemployed back into the world of work. The longer these individuals remain unemployed, the more likely it is that their current skill sets will continue to deteriorate. Along with this stark reality, educators and policymakers are also realizing that a substantial portion of today’s labor force does not possess the necessary skills for gainful employment.
The phrase most often used to describe this problem is “skills gap.” A skills gap is a measurement of the difference between the skills needed for a job and the skills actually possessed by a prospective worker. This recent report by ACT attempts to create a simpler definition of the term “skills gap” and conducts a skills gap analysis looking at four major industries.
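Measured this way, a skills gap is simply the shortfall between the skill level a job requires and the level a worker actually has. The sketch below uses a hypothetical numeric skill scale and made-up skill names; the report itself relies on ACT’s own skill-level framework, so treat this as an illustration of the measurement, not ACT’s method.

```python
# Hypothetical skills-gap calculation: for each skill a job requires,
# the gap is the shortfall between the required level and the level
# the worker possesses (zero when the worker meets or exceeds the bar).
def skills_gap(required: dict, possessed: dict) -> dict:
    """Return per-skill shortfalls between job requirements and a worker."""
    return {skill: max(0, level - possessed.get(skill, 0))
            for skill, level in required.items()}

job = {"applied_math": 5, "reading": 4, "locating_information": 5}
worker = {"applied_math": 4, "reading": 5, "locating_information": 3}
print(skills_gap(job, worker))
# prints {'applied_math': 1, 'reading': 0, 'locating_information': 2}
```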
To quantify gaps at the national level, researchers looked at four industries – manufacturing, healthcare, energy, and construction – which together represent 25 percent of total U.S. employment. They combined analyses of national staffing patterns and long-term occupational outlooks to determine the skills needed in each of the four industries. Occupations representing at least 10 percent of an industry’s overall employment were chosen as target occupations to analyze.
The results were somewhat surprising and indicate that more education does not necessarily mean fewer gaps in on-the-job skills: individuals with higher levels of education often saw significant skill gaps. For example, in the
September 21, 2011
Education, Demand and Unemployment in Metropolitan America explores the relationship between high unemployment in U.S. cities and “education gaps” – instances in which employer demand for educated workers exceeds the supply of such workers.
For this report, researchers gathered education-level and unemployment data from the U.S. Census Bureau and the U.S. Bureau of Labor Statistics for all 366 U.S. metropolitan areas with a population of at least 500,000. The report focused mainly on findings from the largest 100 metropolitan areas, which included several cities in Ohio (Akron, Cincinnati, Cleveland, Columbus, Toledo, and Youngstown).
The study found that from 2005 to the peak of the recession in 2009, employers – on average – sought workers with higher levels of education. It was a buyer’s market, and they wanted better-educated workers. Furthermore, cities with larger education gaps had higher unemployment rates. This result reflects the declining demand for less-educated workers, who are often employed in industries like construction and manufacturing. At the same time, more elastic industries like education and health care actually saw job gains, resulting in relatively low levels of unemployment in cities like Washington, D.C. and Columbus, both of which have smaller education gaps. In fact, Columbus ranks first in Ohio in predicted growth, largely due to its better-educated citizens and diversified industry. Meanwhile, cities like Youngstown and Toledo that suffer from large education gaps and a declining manufacturing base have seen unemployment figures skyrocket.
Boosting educational attainment across the board is imperative, especially in cities like Toledo, Youngstown, and Dayton. From 2005 to 2009, Toledo was among
September 21, 2011
- State-by-state ACT results for the 2011 graduating class shed light on the potential of our future workforce. Ohio’s state report looked at 92,313 students who took the ACT last year. Of those students, only 28 percent met all four ACT college-readiness benchmark scores. Furthermore, fewer than 50 percent of tested students met the benchmark score for mathematics, and only 35 percent met the benchmark score for science. While Ohio’s results are slightly above the national average (25 percent of students reached all four college-readiness benchmarks), they are still troubling, as more and more students are entering college or the workplace without the skills necessary to compete.
- An increase in lawsuits related to bullying is causing a serious drain on some schools’ discretionary spending budgets. This shift of funds from educational efforts into the court system is causing administrators and teachers to take a more serious look at bullying problems.
- Innovative use of the iPad is generating impressive progress in schools, especially within the field of special education. With approximately 40,000 educational applications available on the iPad, this new form of technology is serving large numbers of students, increasing both efficiency and engagement.
- SpongeBob SquarePants is too fast, according to a recent study. The frenzied nature of this children’s favorite has proven to have a negative effect on kids’ ability to follow rules and complete tasks when compared with its milder counterparts.