Just the Facts: A Longitudinal Analysis of ASD Schools Before and After Takeover

Written by Ezra Howard

Intro

The Achievement School District (ASD) took over its first schools two years ago. Its goal is to move the bottom 5% of schools in Tennessee to the top 25% in five years. All but one of its takeover schools have been in Memphis. The ASD has released data on its performance for the past two years, and even by its own acknowledgment the results have been subpar at best.

Many news outlets have reported on the district's achievement results since the takeover, yet no one has attempted to analyze whether the schools taken over by the ASD are doing better, worse, or the same as they were prior to takeover. This is a vital question given the tremendous disruption caused by the ASD and the millions being spent on this turnaround effort.

This level of analysis is not a simple task, however. Attempting a longitudinal examination of the ASD is like trying to take a census in an ant colony. For one, the district's composition isn't the same from year to year; it is constantly adding new schools. Analyzing any one school is fairly straightforward, but gauging the efficacy of the district as a whole is difficult because the data is constantly changing. Fortunately, the Tennessee Department of Education makes all of its school data public in Excel spreadsheets, making such an analysis possible as long as you're willing to invest the time and energy.

My analysis suggests that ASD schools aren't doing significantly better in terms of student growth than they were before state takeover. In fact, in many cases the schools' pre-takeover growth outpaced their growth under the ASD. These findings have significant implications for the future of the ASD, for how we should move forward with continued takeovers, and for future turnaround efforts in general.

 

Abridged Methods

The methods that the state and the ASD use for accountability measures are, in my opinion, flawed for the 2014 TCAP results. In 2013, they simply compared the district's achievement rates for proficient or advanced (P/A) against those in 2012, accounting for grades that were not taught by the ASD due to the phase-in model. In 2014, however, the ASD not only had additional schools in the district, but also additional grades in the phase-in schools that joined the district in 2013. For example, Corry Middle was a 6-8 school before the ASD took it over, but after the takeover it started with grade 5 alone. We therefore cannot call the before-after comparison a valid one, since no student who attended Corry before the takeover is actually included in the new ASD school. Relying on methods appropriate in 2013 yields a skewed view of achievement in 2014, so I found it necessary to reconsider the data. The state's data can be broken down by individual grade and subject, which permits a more detailed comparison. My results for 2013 growth will differ slightly from the state's because of this modification.

Essentially, I replicated the methods used in 2013 and added the new schools and grades for the ASD phase-in schools. I tabulated the achievement rates for each school incorporated into the ASD in 2014, using the TCAP results from 2010, 2011, and 2012. For 2013, I simply used achievement data for each school in the ASD during that academic year, while fully accepting the limitations of the analysis. This provided an interesting perspective on trends in school achievement before and after a school was absorbed into the ASD. There were, however, notable quirks in the data collection process, the effects of which rippled through the analysis. These are noted at the end, in the Unabridged Methods section.
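To make the grade-matching step concrete, here is a minimal Python sketch of the comparison. The records and counts are hypothetical stand-ins for rows in the state's spreadsheets, not the actual figures; only the school name and its phase-in grades come from the list in the Unabridged Methods section.

```python
# Minimal sketch of the grade-matched P/A comparison (hypothetical counts).
# Each record: (school, year, grade, valid_tests, prof_adv_count).
records = [
    ("Brick Church", 2012, 5, 88, 22),
    ("Brick Church", 2012, 6, 91, 25),
    ("Brick Church", 2012, 7, 85, 30),  # excluded: grade 7 not yet phased in
    ("Brick Church", 2013, 5, 90, 28),
    ("Brick Church", 2013, 6, 89, 27),
]

# Grades actually taught by the phase-in school in 2013-14.
matched_grades = {"Brick Church": {5, 6}}

def pa_rate(school, year):
    """Percent proficient/advanced, restricted to the matched grades only."""
    rows = [r for r in records
            if r[0] == school and r[1] == year and r[2] in matched_grades[school]]
    valid = sum(r[3] for r in rows)
    return 100.0 * sum(r[4] for r in rows) / valid if valid else None

# Year-over-year gain in P/A, computed over the same grades in both years.
print(pa_rate("Brick Church", 2013) - pa_rate("Brick Church", 2012))
```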

You can find the entire data set from 2010 to 2014 here. I specifically examine data for Math and English Language Arts (ELA), giving close attention to gains and the trends thereof. Social Studies was not included in the study, as the state did not consider it a factor for accountability until 2012-13. Schools were not held accountable for Science results until 2011-12, so Science is also excluded; however, since the state has reported science results since 2010, they are included in the linked spreadsheets for the sake of the curious. The data for 2010-2012 are for the schools under their Local Education Agency (LEA); the data for 2013-2014 are for the ASD.

 

Results: Math

You can see the raw data below, along with graphs showing achievement and growth over time:

[Table: Math results]

[Chart: Math proficiency over time]

[Chart: Math gains]

 

The LEA surpassed the ASD in many respects. Chart 1 illustrates the gains made under the LEA and the ASD. The last year of LEA growth, 4.48% in 2012, exceeds any single year of growth under the ASD. Comparing two years of growth under each district, the gains made by the LEA actually exceed those of the ASD by almost 2 percentage points: 7.75% compared to 5.84%. Chart 2 illustrates the rate of growth for these schools since 2010. In summary, achievement gains have not accelerated under the ASD; they continue to follow a trend that was already established in the two years before the ASD took over.

Results: ELA

You can see the raw data below, along with graphs showing achievement and growth over time:

[Table: ELA results]

[Chart: ELA proficiency over time]

[Chart: ELA gains]

 

Once again, the LEA exceeded the ASD. Much discussion has been given to the regression of ELA scores in the first year of the ASD. But in examining the total growth of the same schools under the two districts, it's readily apparent that the LEA outperformed the ASD by more than 3 percentage points: 4.64% total gains in P/A compared to 1.44%. Even the growth in the last year under the ASD, 3.40% in 2014, is less than that in the LEA's last year before takeover, 3.71% in 2012. Chart 4 exhibits the trend of growth for ELA, illustrating that the ASD failed to sustain the LEA's momentum of rising P/A rates the way it did with math scores.

The results illustrate that the ASD has failed to outperform the LEA in both primary accountability subjects, Math and ELA. Total gains under the LEA in both subjects were greater than those under the ASD (charts 1 and 3), and the LEA's rate of growth exceeded the ASD's (charts 2 and 4).

What's the ASD's take on all this? Chris Barbic, the ASD superintendent, recently tweeted to Gary Rubinstein, a fellow education blogger, on the subject of the 2014 TCAP results: "Growth is growth. Better than the alternative." However, as we can see from this data, the alternative might actually have been an improvement over the ASD had it been allowed to continue at the same rate.

 

Policy Implications

These results raise three questions in my mind. First, can the ASD reach 55% P/A in order to be in the top quartile? Maybe. To reach that magic number of 55% P/A in both of these subjects, the ASD would have to average 11.07% gains in Math and 12.67% gains in ELA over the five-year period. In the last two years, however, the ASD has averaged 2.92% gains in Math and 0.72% gains in ELA. Given the past two years, I'm extremely skeptical as to whether this still is, or ever was, an attainable goal for the ASD.
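As a back-of-the-envelope sketch of that arithmetic: the required pace is simply the distance to the 55% target divided by the years remaining. In the sketch below, the current P/A value is an illustrative placeholder back-solved from the cited figure, not a published number, and it assumes three years remain in the five-year window.

```python
# Sketch: average annual P/A gain needed to reach the 55% target.
def required_avg_gain(current_pa, target_pa=55.0, years_remaining=3):
    return (target_pa - current_pa) / years_remaining

# A current math P/A around 21.8% (illustrative, back-solved from the
# figures above) implies the ~11.07-point average annual gain cited.
print(round(required_avg_gain(21.8), 2))  # 11.07
```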

Second, this data also raises the question of whether or not state takeovers are worth it. The ASD has spent $18 million of the $22 million awarded to it through Race to the Top (RTTT) and has very little to show for it other than continuing growth trends already established in the schools it took over. That's on top of per-pupil funding and resources provided by donations or grants. Sticking with RTTT funds: if we compare the average gains of the ASD to the LEA, the federal government should have saved its (and, by proxy, our) money. The LEA was averaging 3.88% gains in Math and 2.32% in ELA. If I take the $18 million and divide it by four (Math, ELA, Science, and Social Studies), that's $4.5 million per subject area. If I then divide that $4.5 million by the total gains in each subject area, I'm paying over $770,000 for each percentage point gained in Math and over $3.1 million for each percentage point gained in ELA. Of course, district finances are not as simple as slicing a pie, but you can see my point: an exorbitant amount is being spent on results that are, at best, no different from what the data suggests we could have expected had these schools not been taken over by the ASD.
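For the curious, here is the cost-per-point arithmetic spelled out as a short sketch, using only the figures already cited above:

```python
# Sketch of the cost-per-point arithmetic described above.
rttt_spent = 18_000_000            # RTTT dollars spent so far
per_subject = rttt_spent / 4       # split evenly across the 4 tested subjects
math_gain, ela_gain = 5.84, 1.44   # total ASD P/A gains, in percentage points

print(f"${per_subject / math_gain:,.0f} per point in Math")  # ~$770,548
print(f"${per_subject / ela_gain:,.0f} per point in ELA")    # ~$3,125,000
```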

Third, we must consider whether or not there will even be a year five for the ASD. If the current trends continue, there will be consequences. Barbic was recently quoted by WPLN, stating, "There'll be reckoning… You'll be talking to another person on the end of this phone if that doesn't happen in year five." If the ASD doesn't quickly turn itself around this next year, this experiment may end before then. I'm also concerned about the continued takeover of schools. I personally think it would be wiser for the district to halt its absorption of additional schools and double its efforts on the schools it currently operates, rather than continuing the cycle of disruption and turnover it currently employs, especially since that strategy hasn't shown significantly different results at this point.

In the end, it's fair to say that the Achievement School District has been a disappointment in its first two years. In terms of achievement, the results have been moderate at best (Math) and regressive at worst (ELA). Demonstrated growth beyond what we could have expected, essential to the defense of any form of school takeover, has been nonexistent. The ASD's performance adds to the ongoing debate over state takeover and the increasing implementation of the portfolio model. One of the ASD's precursors, the Recovery School District in Louisiana, is seeing only modest results, even with a drastically different student population following Hurricane Katrina. The Education Achievement Authority in Michigan is exhibiting results just as unimpressive as the ASD's. Some states, like Texas, have considered their own state-run districts. All of this should lead us to ask: do the country's struggling schools need state takeover to improve, or do they simply need additional time and support? More immediately, does Tennessee really need an Achievement School District, or should we simply entrust our local districts with the same resources to aid their ailing schools? Only further research will tell.

By Ezra Howard

Follow Bluff City Education on Twitter @bluffcityed and look for the hashtags #iteachiam and #TNedu to find more of our stories. Please also like our page on Facebook. The views expressed in this piece are solely those of the author and do not represent those of any affiliated organizations or Bluff City Ed writers. Inflammatory or defamatory comments will not be posted on this story.

———————————————————————————————————-

Unabridged Methods

First, I compiled a list of schools within the ASD. I was specifically interested in the schools with tested grades in the district’s 2013-2014 school year. Some of these schools are phase-in schools, taking only a grade or two at a time. As such, in looking at TCAP results from previous years, I felt compelled to isolate certain grades for an accurate year-to-year comparison. The state, for accountability measures, does the same (more on that later).

The following is a list of the phase-in schools and the grades available for testing:

- Brick Church, in Nashville, managed by the LEAD charter network: grades 5 and 6.
- Cornerstone Prep, formerly Lester School, in Memphis, managed by the Capstone Education Group, a local charter: grades 3 through 5.
- Humes Middle, operated by Gestalt Community Schools: started with grade 6 in 2012-13 and taught grades 6 through 8 in 2013-14.

There were also several schools where the entire population was taken over by the ASD. Corning Elementary, Frayser Elementary, Westside Middle, Georgian Hills Elementary, and Whitney Elementary are all run directly by the ASD, and the Aspire network manages Hanley Elementary.

A few schools were not included in the study. Klondike Elementary, run by Gestalt, includes only Kindergarten and 1st grade, which are non-tested grades. Shannon Elementary, under KIPP, has only Pre-K and Kindergarten, also non-tested grades. As a side note, I have no idea how the state is tracking accountability for schools that slowly phase in lower, untested grades. Additionally, and most perplexingly, there is Corry Middle, another KIPP school. Despite Corry's history as a 6-8 school, KIPP taught only 5th grade there in its first year with the ASD. As a result, I'm not sure from which school(s) the students at Corry came, so it was left out of the analysis. In the same vein, Humes taught 5th and 6th grade in its first year with the ASD; however, Humes has never had a 5th grade before, so there is nothing against which to compare those results. If I find out which schools feed into the ASD schools, I'll add in the data. Lastly, Westside Middle didn't have a 6th grade in 2010, or at least that information was left off the reported data, so only 7th and 8th grade data were used.

I emulated the methods the state uses to measure accountability on the state report card. As directed by the state's data department, I used the Accountability Base spreadsheets readily available on its website, pulling data from 2010, 2011, 2012, and 2013. These files provide the exact number of students in each grade who took the TCAP exam, as well as the number of students at each level of proficiency. Because some of the ASD schools were phase-in schools, isolating certain grades for a proper comparison was paramount. The state operates the same way: the TCAP results of Cornerstone Prep – Lester Campus, which tested only 3rd grade in 2013, were compared only to Lester's 3rd grade results in 2012. I verified this by consulting the 2013 report card's section on accountability.
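If you want to replicate the pull yourself, something like the following works with pandas. The file name and column labels here are assumptions for illustration; the state's actual spreadsheet headers vary by year and should be checked against the download.

```python
import pandas as pd

# Hypothetical file name and column labels; check the actual headers in the
# Accountability Base spreadsheet downloaded from the state's website.
df = pd.read_excel("accountability_base_2012.xlsx")

# Isolate one school and grade, mirroring the Lester 3rd-grade comparison.
lester_3rd = df[(df["school_name"] == "Lester") & (df["grade"] == 3)]
print(lester_3rd[["valid_tests", "proficient", "advanced"]])
```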

It's quite easy to replicate the methods the state uses for accountability measures. As previously mentioned, the state compares the TCAP results of the tested grades in the current year to the results of the same grades in the previous year. To expand upon these methods, I simply compiled the TCAP results from 2010 to 2013 using the grades taught at the 2013-14 ASD schools. The numbers of students tested in each grade are exact, and after that it's basic arithmetic. I added up the number of valid tests for each academic year, as well as the number of students whose results fell within each proficiency level: below basic, basic, proficient, and advanced. I then divided the number of students at each proficiency level by the number of valid tests, which yielded a percentage for each level. To tabulate the percentage of proficient and advanced students, I first added together the raw counts in those two levels and then divided by the total number of valid tests, rather than summing the two rounded percentages, to ensure accuracy. I did this for both Math and ELA.
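Spelled out, the aggregation looks like the following; a minimal sketch assuming rows shaped like the spreadsheet columns just described:

```python
# Sketch: sum raw counts across grades and years first, divide once at the end.
def proficiency_rates(rows):
    """rows: dicts with 'valid_tests' plus a raw count per proficiency level."""
    valid = sum(r["valid_tests"] for r in rows)
    levels = ("below_basic", "basic", "proficient", "advanced")
    rates = {lvl: 100.0 * sum(r[lvl] for r in rows) / valid for lvl in levels}
    # P/A is computed from the raw counts, not by adding the two rounded
    # percentages, which avoids compounding rounding error.
    pa = sum(r["proficient"] + r["advanced"] for r in rows)
    rates["prof_adv"] = 100.0 * pa / valid
    return rates
```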

There was an interesting hiccup in the data when I recreated the accountability measures from the state's report card in order to verify my own methods. Everything matched until I calculated the results for Westside Middle, which did not line up with the state's published figures. For the life of me, I cannot find the root of this inconsistency and am at a loss as to whether the error is mine or the state's. I've contacted the state's data department for clarification but haven't heard back. I can't yet hold it against them; I'm sure they have better things to do, like getting school-level data ready for release.

The results provide a fairly accurate comparison, with one caveat: the 2013 TCAP results cover the six original ASD schools, while the other academic years include the additional schools taught in 2014. Once the school-level data is released, it will be quite simple to isolate and remove these results for a deeper level of analysis, examining the results for each cohort of schools within the ASD. Even so, this method provides an accurate, if generalized, view of the performance of these schools prior to and during their tenure with the ASD.

 

The Need for Further Research

First and foremost, I would like to update my findings when: 1) I hear back about my concerns regarding Westside, Corry, and Humes; and 2) the school-level data is released. Then I would like to add different levels to the analysis: isolating the schools in their first and second years with the ASD, comparing charter-run schools to those directly run by the ASD, and looking at phase-in versus non-phase-in schools. I think we could learn a lot by looking at trends within these sub-groups.

Next, I would like to compare and contrast the ASD with similar initiatives, particularly the Izone schools of Shelby County Schools (SCS). A lot of discussion has been dedicated to comparing the ASD to SCS as a whole or to the state. I don't think either of these comparisons is fair: the populations are too different, and both include higher-performing schools for which gains are naturally harder to attain and come more slowly. Sometimes, as in the case of Hume-Fogg Academic and MLK Magnet, the two top public high schools in Tennessee, there is little point in analyzing gains at all because achievement levels are already so high. The Izone, to my knowledge, is the only comparable program in the state. With the same time, money, and mission, the Izone and the ASD are natural comparisons, yet their implementation is drastically different; the Izone has not relied on charters, nor does it need the phase-in model. With two years of data on hand, we shall see which approach proves more effective and beneficial to students and communities.
