Schools often boast of their students’ achievements, but does this reflect well on the school itself, or only on the students? After all, schools with more money or prestige can recruit high-achieving students. So when the high test scores come in, do they reflect recruitment success or success on the part of teachers and school administration?

With twenty years of data and over a million students’ demographics, grades, and test scores, we can decompose student achievement into three components:

- the student’s ability
- the school’s ability to educate students
- an idiosyncratic “match effect”

Everyone has a story of a particular teacher who, for better or for worse, significantly changed the course of their academic career. If you had happened to be assigned geographically to a different high school, that new school wouldn’t have offered you the chance (or misfortune) of interacting with that particular teacher. This is what is meant by idiosyncratic match effects.

Suppose there are J_{t} schools and I_{t} students in each time period t, for t = 1,…,T. Suppressing the time index for clarity, we model the potential outcome of student i at school j as:

Y_{ij} = α_{j} + β_{j}^{T} X_{i} + ε_{ij}

where α_{j} is a school-specific intercept, β_{j} is a vector of school-specific slopes on the student’s observed covariates X_{i}, and ε_{ij} is an idiosyncratic error. We define a student’s ability A_{i} as their average potential outcome across the J schools:

A_{i} = J^{-1} ∑_{j} Y_{ij} = α + β^{T} X_{i} + ε_{i}

where α = J^{-1} ∑_{j} α_{j}, β = J^{-1} ∑_{j} β_{j}, and ε_{i} = J^{-1} ∑_{j} ε_{ij} denote averages across schools.

With this notion of ability, we can add and subtract A_{i} to express potential outcomes as

Y_{ij} = A_{i} - A_{i} + α_{j} + β_{j}^{T} X_{i} + ε_{ij}

Y_{ij} = A_{i} + (α_{j} - α) + (β_{j} - β)^{T} X_{i} + ε_{ij} - ε_{i}

Y_{ij} = A_{i} (Ability) + ATE_{j} + M_{ij} (Match Effect)

which depends on the student’s ability A_{i}, the average treatment effect of school j, ATE_{j} = α_{j} - α, and the idiosyncratic match effect M_{ij} = (β_{j} - β)^{T} X_{i} + ε_{ij} - ε_{i}. Letting j(i) denote the school that student i attends, conditioning on attendance at school j gives
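The decomposition above can be checked numerically. The following is a minimal simulation sketch (all parameter choices are invented for illustration, not taken from our data) that builds potential outcomes Y_{ij}, forms A_{i}, ATE_{j}, and M_{ij} as defined above, and verifies that the decomposition reproduces Y_{ij} exactly:

```python
import numpy as np

# Illustrative simulation of the decomposition Y_ij = A_i + ATE_j + M_ij.
# All parameter values here are invented for demonstration purposes.
rng = np.random.default_rng(0)
I, J, K = 1000, 5, 3                      # students, schools, covariates

alpha = rng.normal(0.0, 1.0, size=J)      # school intercepts alpha_j
beta = rng.normal(0.0, 0.5, size=(J, K))  # school slopes beta_j
X = rng.normal(0.0, 1.0, size=(I, K))     # student covariates X_i
eps = rng.normal(0.0, 1.0, size=(I, J))   # idiosyncratic errors eps_ij

# Potential outcomes: Y_ij = alpha_j + beta_j^T X_i + eps_ij
Y = alpha[None, :] + X @ beta.T + eps     # shape (I, J)

# Ability: a student's average potential outcome across schools
A = Y.mean(axis=1)                        # A_i = J^{-1} sum_j Y_ij

# ATE_j = alpha_j - alpha (deviation of the intercept from its mean)
ATE = alpha - alpha.mean()

# Match effect: M_ij = (beta_j - beta)^T X_i + eps_ij - eps_i
M = X @ (beta - beta.mean(axis=0)).T + (eps - eps.mean(axis=1, keepdims=True))

# The decomposition is exact, and ATE_j averages to zero across schools
print(np.allclose(Y, A[:, None] + ATE[None, :] + M))  # True
```

Note that ATE_{j} is mean zero by construction, so it measures school quality relative to the average school rather than on an absolute scale.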

E[Y_{ij} | j(i) = j ] = E[A_i | j(i) = j] + ATE_{j} + E[M_{ij} | j(i) = j]

E[Y_{ij} | j(i) = j ] = Q_{j} + ATE_{j} + M_{j}

where Q_{j} represents the average ability of students attending school j, ATE_{j} is defined above, and M_{j} captures the average match effect of students attending school j. Returning to our motivating example, a school that attracts high-achieving students will be a school with high Q_{j} but not necessarily high ATE_{j}. The ATE_{j}, in contrast, captures the average effect of attending school j relative to the average school in the district. Thus, a school with ATE_{j} > 0 is above average in terms of improving the academic achievement of its students, while a school with ATE_{j} < 0 is below average. Whatever its value, ATE_{j} is the more adequate notion of school quality, since it represents the average impact of school j on its students’ achievement regardless of the students’ underlying ability.

It is important to emphasize that publicly available data on school achievement report E[Y_{ij} | j(i) = j ] and therefore implicitly combine the three components we defined. As a result, some schools may appear to be high-performing because they attract students with high average ability Q_{j} despite having low ATE_{j}. A school with these characteristics would not improve the academic trajectory of its students, but would appear high-performing because its average student would do well regardless of the school they attended. Thus, when discussing within-district inequality we’ll refer to differences in the ATE_{j} over time as opposed to differences in E[Y_{ij} | j(i) = j ].
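As a hypothetical illustration of this conflation (numbers invented, not drawn from our data), the sketch below simulates two schools where above-median-ability students sort into the school with the lower ATE_{j}: the raw average E[Y_{ij} | j(i) = j] ranks that school first even though it is below average at educating its students.

```python
import numpy as np

# Hypothetical two-school example: school 0 has a negative ATE but
# attracts above-median-ability students, so its raw average looks best.
rng = np.random.default_rng(1)
I = 2000
ATE = np.array([-0.5, 0.5])               # school 0 is below average

A = rng.normal(0.0, 1.0, size=I)          # student ability A_i
school = (A < np.median(A)).astype(int)   # above-median ability -> school 0
eps = rng.normal(0.0, 0.2, size=I)        # mean-zero match noise
Y = A + ATE[school] + eps                 # observed outcomes

Q = np.array([A[school == j].mean() for j in range(2)])      # avg ability Q_j
avg_Y = np.array([Y[school == j].mean() for j in range(2)])  # raw averages

# School 0 wins on raw averages (high Q_j) despite the lower ATE_j
print(avg_Y[0] > avg_Y[1], ATE[0] < ATE[1])  # True True
```

Ranking schools by avg_Y here rewards recruitment, not instruction, which is exactly why we track ATE_{j} instead.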

In 2012, Governor Jerry Brown proposed the Local Control Funding Formula (LCFF) with a threefold aim: requiring school districts to focus on eight key areas to help students succeed, providing extra funding for students with greater challenges, and giving school districts more flexibility in spending money to improve schools. The measure pledged 18 billion dollars to California schools over six years. Furthermore, the LCFF included the Local Control and Accountability Plan (LCAP), under which school districts were meant to engage the local community in specific, measurable, and timely implementation of the LCFF.

Despite the LCFF’s aims, a report by Education Trust West found that although poorer districts received more funding on average, inequality gaps have widened in some schools. Critics have also argued that basing the funding on average daily attendance (ADA) instead of total enrollment disproportionately affects low-income communities, which were the original proposed targets of the funding. Others have raised concerns that the LCFF is a “dump truck” solution, i.e. money goes to the districts or schools without ensuring it benefits the specific students it was intended for, as opposed to a “backpack” model, in which the students whose attendance merits the funding also receive the proportional benefit.

Through our novel decomposition of student test scores, we find not only disparities in student outcomes between racial groups, but also within-school differences in value added. Furthermore, our results differ from the Education Trust West study: we find that the LCFF decreased inequality, and the data suggest this trend continuing into 2019-2020. Despite finding substantial mismanagement in the allocation and disbursal of the funds, we’re hopeful that the full effects of the funding on reducing inequality will be seen in the years to come.