California State Auditor

K–12 Local Control Funding
The State’s Approach Has Not Ensured That Significant Funding Is Benefiting Students as Intended to Close Achievement Gaps

Report Number: 2019-101

Audit Results

California’s Approach to Overseeing Supplemental and Concentration Funds Has Not Always Ensured That Those Funds Benefit Intended Student Groups

The State’s implementation of LCFF has not yet proven effective at increasing transparency and accountability for the supplemental and concentration funds that CDE allocates to districts. Specifically, state law does not explicitly require districts to use unspent supplemental and concentration funds in the following year to benefit intended student groups, nor does it require that they track their spending of these funds. Furthermore, existing state law has allowed the districts we reviewed to identify hundreds of millions of dollars in LCFF funding as base rather than supplemental and concentration funds during the phase‑in period. If the Legislature intends for districts to use all of the supplemental and concentration funds it allocates to them to specifically increase or improve services for intended student groups, it should amend state law to establish this requirement. Additionally, districts do not always include clear information in their LCAPs regarding their use of supplemental and concentration funds, even though LCAPs are a key accountability tool for ensuring that they budget and spend these funds to increase and improve services for the intended student groups. The LCAPs’ lack of clarity has reduced transparency and resulted in some stakeholders submitting formal complaints and filing lawsuits in court. Until the State ensures that districts spend all supplemental and concentration funds to benefit the intended student groups, and that they provide clear, accessible information regarding that spending in their LCAPs, the intended student groups may not receive the services necessary to close the State’s persistent achievement gaps.

The State Has Not Ensured That Districts Spend Supplemental and Concentration Funds on Services for Intended Student Groups

As we discuss in the Introduction, the State instituted LCFF to provide districts with supplemental and concentration funds to improve the educational outcomes of the intended student groups and to increase transparency and accountability related to education funding. However, LCFF has not yet successfully accomplished these goals. As we discuss later, available data show improvements in some student outcome measures since the State implemented LCFF, although achievement gaps persist. In acknowledgment of the fact that educating the intended student groups is more costly, the State apportions the additional funds to districts based on their intended student group populations; as a result, we would have expected districts to track their spending of these funds. Instead, a series of impediments hinders stakeholders’ ability to determine with assurance the amount of supplemental and concentration funds that districts actually spend for the benefit of the intended student groups, even though these are the students for whom they receive the additional funding.

Most significantly, the current requirement districts must meet when spending supplemental and concentration funds is essentially meaningless. State law does not explicitly require that districts spend all their supplemental and concentration funds for intended student groups; instead, it states that they must use the funds to increase or improve services for those students in proportion to the amount of supplemental and concentration funds they receive. For example, in its fiscal year 2018–19 LCAP, Clovis Unified calculated that its supplemental funds represented an increase of 8.65 percent over its LCFF base funds. Consequently, state law requires Clovis Unified to increase or improve services for intended student groups by 8.65 percent as compared to all students. However, it is unclear how a district would demonstrate that it increased or improved services by a specific percentage. In fact, two of the three districts we visited stated that measuring objectively whether they have increased or improved services by a specific percentage for any one student group in comparison to all students is difficult. Furthermore, neither the county offices nor CDE is responsible for verifying that districts have achieved the required proportional increases.
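The proportionality calculation that state law requires is simple arithmetic. The following is a rough illustration; the dollar amounts are hypothetical, not Clovis Unified’s actual figures:

```python
# Illustrative sketch of the proportionality percentage state law requires
# districts to calculate; the dollar figures below are hypothetical
# examples, not Clovis Unified's actual amounts.
def proportionality_percentage(supplemental_and_concentration: float,
                               base_funds: float) -> float:
    """Minimum percentage by which services for intended student groups
    must be increased or improved relative to services for all students."""
    return supplemental_and_concentration / base_funds * 100

# For example, $17.3 million in supplemental funds over $200 million in
# base funds would yield the 8.65 percent figure discussed above.
pct = proportionality_percentage(17_300_000, 200_000_000)
print(f"{pct:.2f}%")  # 8.65%
```

As the text notes, the difficulty is not this arithmetic but demonstrating that services actually increased or improved by the resulting percentage.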

In other words, state law created a mechanism to give additional funds to districts that have higher proportions of intended student groups, but it did not explicitly require or provide a means of ensuring that those districts actually spent their additional funds on the specific student groups for whom they were allocated. When we discussed this lack of an explicit requirement with the State Board’s deputy policy director and assistant counsel, he stated that he believes state law reflects a recognition that investing to improve the overall education program at a school site or within a district can be an effective way to meet the needs of intended student groups. He added that tying a legal obligation to dollar‑to‑dollar increases in expenditures would discourage districts from implementing approaches that would improve core programs in ways that better meet the needs of intended students. Although we do not disagree with this premise, LCFF’s intent and CDE’s own regulations make clear that supplemental and concentration funds are fundamentally different from base funds—districts must use them to increase or improve services for intended student groups.

The second impediment to ensuring that districts use supplemental and concentration funds for the intended student groups is that the State does not require districts to track how they spend these funds. Of the three districts we reviewed, only Clovis Unified generally tracked its supplemental funds in its accounting system. The other two districts’ accounting systems tracked supplemental and concentration funds inconsistently. At these districts, we struggled to locate the financial information necessary to determine how much supplemental and concentration funding they had received and whether they had spent it to benefit the intended student groups.

Furthermore, budget and expenditure information for supplemental and concentration funds that districts include in their LCAPs is not always transparent. Specifically, although the LCAP template requires districts to include budgeted expenditures and estimates of actual expenditures for each service they provide, it does not require them to present summary‑level expenditure information in a manner that would allow stakeholders to compare districts’ total budgeted expenditures to their total estimated actual expenditures without significant effort. Consequently, the LCAPs for the three districts we visited do not enable stakeholders to easily identify whether the districts spent all of their supplemental and concentration funds as planned. In fact, within their LCAPs, districts reported numerous individual expenditures. Although the LCAP template asks districts to explain “material differences” between the individual expenditure amounts they budgeted and the estimated actual amounts they spent, the template does not include a place for districts to report the overall total differences between their budgeted and estimated actual expenditures of supplemental and concentration funds.

State Law Deferred Full Implementation of Supplemental and Concentration Funds, Resulting in Significantly Lower Amounts Than the Funding Formulas Would Have Provided

Given the lack of clear information in the accounting systems and LCAPs of the three districts we reviewed, we used the numerous expenditures they reported in their LCAPs to manually sum the amounts of supplemental and concentration funds they had budgeted and spent. We made two key observations based on our calculations of these expenditures. First, we found that although the formulas in state law for calculating supplemental and concentration funds are based on a district’s proportions of intended student groups relative to its total enrollment, the regulations that the State Board adopted for local educational agencies to follow during the phase-in period do not consider these proportions. As we explain in the Introduction, before the State fully funded LCFF in fiscal year 2018–19, it adopted regulations that required districts to annually estimate how much base, supplemental, and concentration funding they expected to receive. Districts then used these estimates to describe in their LCAPs the services they planned to provide to intended student groups. Under these regulations, districts were to base their estimates on prior-year spending. Specifically, they were to base their fiscal year 2014–15 estimates on the amounts they spent in fiscal year 2013–14 on services for intended student groups, which had to be greater than or equal to the amount of Economic Impact Aid they spent in fiscal year 2012–13. The Economic Impact Aid program was a state categorical program for kindergarten through grade 12 that provided additional English language acquisition programs, support and services for students with limited English proficiency, and State Compensatory Education services for educationally disadvantaged youth. In subsequent years, districts based their estimates on the prior year’s spending.

Because the regulations did not base the estimated amounts on districts’ populations of intended student groups, the resulting amounts of funding districts identified as supplemental and concentration were significantly less than the amounts we calculated using the proportions of intended student groups. Figure 6 presents a hypothetical example to illustrate the different approaches for determining how much of the LCFF funding is supplemental and concentration funds. When we applied the same proportions of base, supplemental, and concentration funds that exist in state law to the total LCFF funds the three districts received, we identified significant amounts of supplemental and concentration funds that the districts otherwise would have included in their LCAPs, as Figure 7 demonstrates. In fact, since the State implemented LCFF in fiscal year 2013–14, the regulations have led the three districts to identify approximately $320.6 million of LCFF funding as base rather than supplemental and concentration funds. If all districts statewide estimated supplemental and concentration funds at rates similar to those of the three districts we reviewed, the difference between using the approach required by regulations and basing their estimates on the proportions of intended student groups would have amounted to billions of dollars since LCFF’s implementation. According to State Board documents, because districts had been using various funding sources to provide services to intended student groups before the adoption of LCFF, the use of prior-year spending allows a local educational agency to estimate the actual services provided. In other words, by directing districts to base their estimates on prior-year spending, the regulations led districts to identify amounts of supplemental and concentration funds for increasing or improving services for intended student groups that were similar to the amounts they had already been providing before LCFF. Therefore, by deferring LCFF’s full implementation, the State likely also deferred improvements in performance outcomes for intended student groups.

Figure 6
Regulations Led to Different Proportions of Base, Supplemental, and Concentration Funds Than Full Implementation of State Law Would Have Provided

Figure 6, a chart depicting the different proportions of base, supplemental, and concentration funds that result from three approaches: the target LCFF amount; applying the target proportions of base, supplemental, and concentration funds to the transition LCFF amount; and following the regulations adopted by the State Board, which are based on the prior year’s spending.

Source: Analysis of state law, CDE documents, and district documents.


Figure 7
State Regulations Resulted in Less Supplemental and Concentration Funds for the Three Selected Districts

Figure 7, a chart depicting a $320.6 million difference in funds identified as supplemental and concentration from fiscal years 2013–14 through 2018–19 that resulted from following state regulations compared to basing supplemental and concentration funds on the funding formulas in state law.

Source: Analysis of state law, CDE’s principal apportionment data, and district LCAPs.

* When a district did not report the amount of supplemental and concentration funds it budgeted in its LCAP, we used the amount of expenditures it reported, as we show in Appendix B.


Furthermore, the State has not established adequate accountability controls over these funds. To ensure that districts accurately estimated all the supplemental and concentration funds they would receive, we expected that the State would have established a process to validate the amounts districts identified when following the regulations. However, the State has not established such a validation process. As a result, the districts’ estimates are the only method for identifying the amount of supplemental and concentration funds, and the amounts they included in their LCAPs are the only source of information about how much of the LCFF funding the State provided was treated as supplemental and concentration funds. In Appendix B, we provide information about each of the three districts’ LCFF funding for fiscal years 2013–14 through 2018–19.

The second key observation we identified in our analyses of the LCAPs of the three districts we reviewed is that even when two of these districts included supplemental and concentration funds in their LCAP budgets, they often did not fully spend those funds during the year in question. For example, in fiscal year 2017–18, San Diego Unified underspent by 3 percent, or $3.5 million, and Oakland Unified underspent by 6 percent, or $4 million. This is problematic because we could find no requirement under current law for districts to continue using unspent supplemental and concentration funds in the following year to increase or improve services for intended student groups; the unspent funds essentially can be used for any purpose in subsequent years. Although the amounts in question represent a small percentage of the two districts’ total LCFF funding, they could have used the funding to provide additional resources for intended student groups, such as English language support staff or college counselors. Without direction from the State to do so, San Diego Unified identified unspent supplemental and concentration funds from fiscal year 2017–18 and included that amount with the funding identified in its fiscal year 2018–19 LCAP to provide services for intended student groups. However, if districts statewide underspend their supplemental and concentration funds by just 1 percent each year, they will fail to provide about $87 million in services annually for the intended student groups, based on fiscal year 2018–19 funding levels.
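The statewide figure follows from a back-of-the-envelope calculation. The sketch below assumes an implied statewide total of roughly $8.7 billion in supplemental and concentration funds, a number inferred from the $87 million figure rather than stated in the report:

```python
# Hedged sketch of the statewide underspending arithmetic. The statewide
# total below is an assumption inferred from the $87 million figure
# (1 percent of it); the report does not state this total directly.
statewide_supplemental_concentration = 8_700_000_000  # dollars, assumed
underspend_rate = 0.01  # 1 percent underspent each year

unspent = statewide_supplemental_concentration * underspend_rate
print(f"${unspent / 1_000_000:.0f} million in services not provided")
# $87 million in services not provided
```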

Determining whether or how districts used their unspent supplemental and concentration funds in the following year is difficult because of their inconsistent tracking; however, in the absence of a requirement to carry over unspent supplemental and concentration funds for the same purposes, districts can spend them for general purposes rather than specifically for the direct support of intended student groups. Two of the three county offices we visited acknowledged that current law allows districts to potentially use unspent supplemental and concentration funds for more general purposes.

Although state law requires county offices to review whether districts’ budget expenditures are sufficient to implement their planned LCAP services, it does not require the county offices to examine whether districts budget and spend all of their supplemental and concentration funds and provides little guidance for their review. To approve the financial portions of districts’ LCAPs, the three county offices we reviewed use the CCSESA LCAP approval manual as a guide. The fiscal year 2018–19 manual includes guidelines for a compliance‑based review of expenditures that would confirm, for example, that a district identified expenditure amounts, sources, and budget references for each service. The manual does not include steps that county offices should take to ensure that districts budget and spend all of their supplemental and concentration funds. Consequently, the Fresno and San Diego county offices did not compare districts’ budgeted expenditures with their estimated actual expenditures to identify potential underspending. Ultimately, the financial reviews county offices are required to perform appear to be a compliance exercise rather than a critical analysis of the expenditures.

Requirements for Spending Supplemental and Concentration Funds Districtwide

Enrollment of intended student groups that is 55 percent or more of total enrollment: A district may spend supplemental and concentration funds districtwide if the following are true:

Enrollment of intended student groups that is less than 55 percent of its total enrollment: A district may spend supplemental and concentration funds districtwide if:

Source: State regulations.

Districts Do Not Always Clearly Describe in Their LCAPs How the Supplemental and Concentration Funds They Spend Districtwide Principally Benefit Intended Students

Districts may spend supplemental and concentration funds for districtwide purposes by upgrading the entire educational program, thereby benefiting more than just intended student groups. However, districts can spend the funds districtwide only if they follow the requirements we list in the text box. Given these requirements, we would expect districts to sufficiently describe in their LCAPs how they had principally directed their districtwide spending of supplemental and concentration funds toward intended student groups. The LCAPs for the three districts we visited indicate that the districts intended to spend supplemental and concentration funds to pay for varying proportions of districtwide services, such as reducing class sizes and providing parent and community resource centers. The proportions of the districts’ services that were districtwide ranged from 39 to 88 percent. However, the districts did not always clearly describe in their LCAPs how they principally directed those funds toward intended student groups. Specifically, for 37 of the 53 expenditures we reviewed from fiscal years 2017–18 and 2018–19, the information that the three districts provided in their LCAPs was not sufficient for us to determine whether the districtwide services on which they planned to spend supplemental and concentration funds would principally benefit intended student groups.

For example, for six of the 11 services we tested from its 2018–19 LCAP, Clovis Unified stated that services will be principally directed without explaining how the district will principally direct them. In one instance, Clovis Unified wrote in its LCAP that it would “provide professional development…, train highly qualified teachers, and develop new curriculum units and assessments… to ensure all students, principally directed toward [the intended student groups], achieve at a high level.” When we asked Clovis Unified for clarification, an assistant superintendent stated that the district focused these services—including professional development for teachers—on helping intended student groups. The district attributed the lack of clarity in its LCAP to the vagueness in regulations about how districts can principally direct services toward intended student groups. However, when districts fail to clearly explain in their LCAPs how they plan to use supplemental and concentration funds on districtwide services to benefit intended student groups, they reduce transparency and accountability.

In addition, a lack of clarity puts the districts at risk of stakeholders’ submitting complaints or filing lawsuits alleging that they have inappropriately spent the funds. In fact, CDE’s records indicate that since August 2016, it has issued reports for 10 complaints against districts—five from January through April 2019 alone—in which stakeholders raised concerns about districts’ intended use of supplemental and concentration funds. One of those complaints resulted in a lawsuit, which the parties involved ultimately settled with the district agreeing to change how it uses supplemental and concentration funds in the future.

Districts have not always clearly demonstrated how they planned to spend supplemental and concentration funds districtwide, likely in part because the requirements for doing so are vague. The LCFF regulations, the CCSESA LCAP approval manual, and the LCAP template all fail to explain or provide examples of ways a district can successfully demonstrate how its districtwide spending is “principally directed” toward intended student groups. Although the LCFF regulations regarding districtwide spending of supplemental and concentration funds have been in place since 2014, stakeholder complaints demonstrate that some districts struggle to successfully describe how they principally direct those funds toward intended student groups.

Despite the difficulties that some districts have faced in implementing the spending requirements, CDE has not fully incorporated into its key guidance documents the position that it has taken in its complaint reports regarding satisfying the spending regulations. On at least eight occasions since May 2017, CDE has presented a consistent position in its reports about how districts can comply with the regulations for districtwide spending. We summarize CDE’s position in the text box. Because CDE’s comments in its complaint reports provide more specific advice to the districts named in the reports regarding how they can demonstrate in their LCAPs their compliance with the spending regulations, we would have expected it to include this information in the LCAP template instructions to ensure consistent understanding among all districts.

Furthermore, based on the complaints and appeals it receives, CDE could identify common pitfalls for districts to avoid and best practices for them to follow and could include this information in key guidance documents, such as the LCAP template and its instructions. According to the administrator of CDE’s Local Agency Systems Support Office, CDE has included in recent presentations information from relevant complaints regarding the requirement to principally direct services toward intended student groups. Nevertheless, 21 of the 28 descriptions of the districtwide services we reviewed from our selected districts’ fiscal year 2018–19 LCAPs were not in accordance with the guidance in CDE’s complaint reports.

In addition, districts sometimes did not clearly demonstrate how districtwide expenditures of supplemental and concentration funds principally benefited intended student groups because they used these funds for base services that they provide to all students. Specifically, all three districts and all three county offices we reviewed indicated that LCFF base funding amounts do not cover all necessary base costs, which can put pressure on districts to use supplemental and concentration funds to provide such services. Consequently, we observed that districts used supplemental and concentration funds to pay for what appear to be base services. For instance, San Diego Unified budgeted $5.2 million in supplemental and concentration funds for library services at all schools within the district. It justified the expenditure by mentioning that such services create equitable access to learning tools, resources, materials, and technology. According to the district’s LCAP, providing library services on campus allows intended student groups an equitable opportunity to succeed educationally through access to computers, laptops, books, reference materials, and educational software. Although we recognize the benefits of base services and the dilemma districts face when they lack the funding necessary to pay for them, this description fails to sufficiently explain how San Diego Unified principally directed these services toward intended student groups.

LCAPs Have Not Consistently Provided Transparency or Facilitated Accountability

The information districts include in their LCAPs is often overly complex and unclear, resulting in LCAPs that are not consistently transparent and that do not facilitate accountability. For example, the LCAP template and instructions prompt districts to connect their identified needs with goals based on those needs and then to identify services to meet those goals. However, we rarely found this logical connection in the LCAPs we reviewed. Likewise, districts often did not effectively analyze in their LCAPs whether services that they had already implemented had been successful. The lack of clear information within the LCAPs raises concerns about the ability of stakeholders to hold the districts accountable for the services they provide, even though enabling such accountability is one of the fundamental purposes that the LCAPs should serve. Weaknesses in the template and limited reviews required of the county offices have also contributed to the LCAPs’ lack of transparency.

The LCAPs We Reviewed Did Not Clearly Communicate Whether the Districts Had Effectively Met Students’ Needs

Guidance for developing quality LCAPs states that an LCAP should establish a clear understanding of the services that each district will provide to its students and should offer a simple and complete story of that district’s needs, goals, services, and investments in positive outcomes for its students. We believe that to be clear and effective, an LCAP should logically connect a district’s needs and goals, include sufficiently detailed descriptions of the related services, and present understandable content. However, the LCAPs we reviewed were unclear in a number of ways.

First, the three districts did not always base the goals and services in their LCAPs on clearly articulated needs. This approach limits transparency because stakeholders cannot decipher which problems the districts intend the goals to address or how planned services will help the districts achieve those goals. The primary causes of this misalignment are broad goals and a lack of articulation about how certain services connect to the overarching need and goal. In particular, each of the 15 goals we reviewed from the three districts’ 2018–19 LCAPs was broad. For example, Clovis Unified’s first goal is “Maximize achievement for ALL students,” which does not convey any information about which types of services would lead to achieving that goal. Clovis Unified based this goal on a specific need—its students do not currently all perform at or above grade level in mathematics and English language arts, and achievement gaps exist for intended student groups. However, it is not clear how certain services Clovis Unified includes under this need, such as reducing the charges for students to attend performing arts and athletic events, would directly contribute to achievement in mathematics and English language arts. Although Clovis Unified writes in its LCAP that reducing these attendance charges will encourage greater student participation, we believe a more specific need and goal would better align with services like this—for instance, a need to improve students’ participation and a goal to achieve a certain increase in participation.

The two other districts we reviewed also included generally broad goals, although Oakland Unified created more specific “subgoals” underlying its main goals. Additionally, only Oakland Unified had a goal that was specific to an intended student group: English learners. We believe that districts should articulate a clear connection between needs, goals, and underlying services, and that county offices’ reviews should ensure that such connections are in place.

Furthermore, the districts often did not effectively analyze in their LCAPs whether the services they provided had been successful, which makes it difficult for stakeholders to hold them accountable for continuing to fund effective services and eliminating ineffective services. State law requires the LCAP template to include an assessment of the effectiveness of the specific services described in the LCAP toward achieving the goals. Although the Analysis subsection of the LCAP template requires districts to explain the overall effectiveness of the services in achieving the related goal, the template does not require districts to provide analysis specific to each service but rather to each goal. Because a single goal can include more than 30 services, determining which particular services were effective in improving overall outcomes can be difficult.

In fact, the amount of detail in the Analysis subsections we reviewed varied widely and did not always provide information about any specific services, further limiting the usefulness of the information that stakeholders can obtain from the LCAP. In some instances, the districts addressed specific services. For example, in the Analysis subsection for one of its goals, Oakland Unified explained that students who participated in its Pathways Program, which included opportunities such as a skilled trades service, had graduation rates more than 25 percentage points higher than those who did not participate. In contrast, Clovis Unified’s Analysis subsection for its first goal lacks specific details about the results of its services; it includes only a brief summary statement regarding the overall results of the implementation of all its services. Similarly, Clovis Unified described the overall effectiveness of its first goal’s 38 services by reporting four metrics from the dashboard and simply stating that those results are “due to the effective implementation of Goal 1 actions and services.” Perhaps of even greater concern, 60 percent of the goal‑level outcomes we reviewed in districts’ LCAPs either were pending or presented outdated data; two of the districts stated that this is because they must formulate their LCAPs in the spring, before end‑of‑year and statewide data are available.

In addition, the reviews that county offices are required to perform of the LCAPs are insufficient to ensure that districts include the information necessary to ensure their accountability to stakeholders. State law requires that county offices consider only three criteria when approving LCAPs: whether a district’s LCAP adheres to the LCAP template and its instructions, whether that district’s budgeted expenditures in its LCAP are feasible given the funds available in its budget, and whether the LCAP adheres to expenditure regulations related to supplemental and concentration funds. Each of the county offices we visited met the legal requirements for approving LCAPs. However, state law does not require county offices to ensure that districts write LCAPs that articulate a logical connection between the districts’ needs and goals, provide sufficiently detailed descriptions of services within the LCAP’s Analysis subsection, and are easily understandable.

The Alameda County Office of Education took steps beyond those required by state law. According to an executive director at the Alameda county office, it includes an exemplary category in its review to encourage its districts to prepare higher‑quality LCAPs. This exemplary category includes checks for readability and understandability; a review of whether a district has thoughtfully described how its services address the needs of its students, student subgroups, and specific school sites; and an expectation that the district will provide insightful and easily understood descriptions of how its services address the needs of its intended student groups. We consider a county office’s including such steps in its review of an LCAP to be a best practice.

The Lengthy and Complex LCAPs We Reviewed Reduced Transparency

We believe that, to be effective at providing transparency, an LCAP needs to—among other things—provide a simple, brief, and coherent story of the district’s goals and be understandable to an audience of parents and community members. However, all three districts we reviewed produced 2018–19 LCAPs that are hundreds of pages long: Clovis Unified’s LCAP is nearly 260 pages, San Diego Unified’s is 320 pages, and Oakland Unified’s is nearly 600 pages. LCAPs of these lengths cannot tell a simple, brief, and coherent story of each district’s goals; rather, their length and complexity reduce readability and transparency. In fact, without any requirement to do so, Clovis Unified and San Diego Unified have both created shortened versions of their LCAPs—such as infographics—that should be easier for their stakeholders to understand.

Because the LCAP template requires districts to present similar information in multiple places, it contributes to LCAPs’ excessive lengths. As Figure 8 illustrates, several subsections within the LCAP template appear multiple times. We determined that the LCAPs we reviewed could have been as much as 40 percent shorter had they not contained duplicative information. For example, the Annual Update section and the Goals, Actions, and Services section contain similar information and together accounted for 466 of the 592 pages in Oakland Unified’s LCAP. Combining those sections in the LCAP template could have reduced Oakland Unified’s total page count by around 40 percent. The State Board and CDE are considering a revision to the LCAP template that would merge the Annual Update section with the Goals, Actions, and Services section; if the State Board adopts it, we believe that revision could resolve some of the duplication we noted.

Figure 8
LCAPs Consist of Five Main Sections, Two of Which Contain Duplicative Information

Figure 8, a graphic presenting the five main sections of the LCAP and identifying the two that contain duplicative information.

Source: Analysis of the LCAP template that CDE prepared in October 2016.

* The budget summary no longer exists in the version of the LCAP template that CDE prepared in January 2019.


As we previously discuss, districts have also added complexity to their LCAPs by including numerous services for each of their identified goals. Having numerous services related to a single goal obscures whether any particular service was effective in helping the district meet that goal. Each of the three districts we reviewed had at least one goal for which it identified 11 or more services. For example, in its fiscal year 2018–19 LCAP, Clovis Unified included 38 specific services for its goal of maximizing achievement for all students. These services include providing intervention summer school and reducing the charges for students to attend some performing arts and athletic events. Its description of these 38 services is 76 pages long, or about 30 percent of the length of its entire LCAP. With so many services for just one broad goal, determining which ones are the most critical to achieving the goal or even how some relate to the goal is difficult.

We also found that the three districts sometimes included mistakes and discrepancies in their LCAPs that made the documents less transparent and useful. For instance, Oakland Unified indicated in its fiscal year 2017–18 LCAP that it would implement some services districtwide but at the same time stated it would provide those services at only certain school sites, making it unclear which was correct. Similarly, for certain expenditures in its fiscal year 2018–19 LCAP, Clovis Unified included line items for services that, when totaled, did not match the sum it reported; thus, it was unclear which amounts were correct. According to an assistant superintendent at Clovis Unified, these discrepancies occurred because the electronic tool the State provided to assist districts in filling out the LCAP did not automatically sum the expenditures that Clovis Unified entered. She stated that often data previously saved in the e‑template would disappear upon reopening the file and that inefficiencies of the e‑template made it difficult to validate the data.

Additionally, the districts sometimes used jargon that made it difficult to understand how they planned to spend their supplemental and concentration funds. San Diego Unified provided one particularly difficult description: “Integrated Multi‑Tiered Systems of Support (I‑MTSS) will be implemented in Grades TK–12 through the Academics and Agency (A²) model by ensuring the essential elements and solution seeking processes are in place at all schools.” We could not determine from that description whether and to what extent San Diego’s expenditure of supplemental and concentration funds would affect the intended student groups.

The State Currently Lacks Information That Would Better Enable It to Measure the Effectiveness of LCFF

The State has recently made a number of significant changes to its statewide assessment system and accountability system, including the implementation of the dashboard and new academic assessments. As a result of these changes, identifying clear trends in achievement gaps statewide will require additional time and data. Further, policymakers and other stakeholders still lack adequate information to assess the impact of supplemental and concentration funds on the educational outcomes of the intended student groups. However, by collecting and reporting additional information about districts’ uses of supplemental and concentration funds, the State could begin to determine how districts’ spending of those funds affects students and whether it should take further action to close persistent achievement gaps.

Because the Dashboard Data Are New and Not Tied to Local Spending, the State Has Limited Ability to Measure LCFF’s Effectiveness in Closing Achievement Gaps

The State’s current accountability system does not yet allow stakeholders to adequately assess LCFF’s effectiveness in improving student educational outcomes and closing achievement gaps for intended student groups. The State implemented LCFF in part to improve the outcomes of the intended student groups and to close the achievement gaps that exist between certain student groups and students overall. As we discuss in the Introduction, the State measures student outcomes—including those of intended student groups—through the dashboard, which is a key accountability tool for LCFF. However, the State did not release the dashboard until 2017, four years after it implemented LCFF. The State also transitioned to new academic assessments, reported new dashboard indicators, and changed methodologies for calculating certain existing indicators, making identifying and assessing trends related to student outcomes even more difficult. In addition, CDE does not incorporate year‑to‑year growth for individual students into its calculations for certain dashboard indicators, an omission that may obscure LCFF’s impact on students over time. However, CDE has been exploring a student growth model for the dashboard. Given these developments, we believe additional time and more dashboard data are necessary to identify clear trends in closing achievement gaps statewide.

Further, the State is in the early stages of planning and developing a data system that could provide additional information regarding LCFF’s effectiveness. Unlike some other states, California does not yet have a statewide system that connects K–12 data—such as the data that contribute to the dashboard—to postsecondary and workforce data. However, in 2019 the State authorized funding to plan for such a statewide data system, which could allow it to report additional outcomes related to students’ participation in college and the workforce after leaving the K–12 system. For instance, such data could build upon the dashboard indicator for college/career preparedness, which reports the percentage of students who are prepared for college or the workforce but does not report whether students have actually participated or succeeded in those domains.

The State’s current data make clear, however, that achievement gaps still persist under LCFF. The available data show improvements in some student outcome measures since the State implemented LCFF, including modest reductions in certain statewide achievement gaps. Additionally, two recent case studies report that San Diego Unified has improved its outcomes; one report cited increases in graduation rates between 2014 and 2016 while the other cited greater rates of college and career readiness over the last six years. However, the dashboard indicates that significant achievement gaps still exist statewide for the intended student groups. For example, the 2018 dashboard shows that the statewide graduation rate for all students was nearly 84 percent but that the statewide graduation rate for youth in foster care was only 59 percent. Similarly, according to the dashboard’s college/career indicator, less than 15 percent of English learners in the graduating class of 2018 were prepared for college or the workforce, versus about 42 percent of all high school students in the class of 2018.

Given that the data show these persistent achievement gaps, we would expect the State to have a method to determine whether supplemental and concentration funds, and possibly other funding, are helping to improve the performance of the intended student groups. However, the State has not required districts to track and report their expenditures of supplemental and concentration funds in a way that aligns with dashboard indicators. It therefore lacks a means of determining directly whether or how well districts are spending those funds to reduce achievement gaps. For instance, the dashboard does not indicate whether the 7 percentage point increase from 2017 to 2018 in the graduation rate for students from households with low incomes at Oakland Unified was associated with any specific district effort, nor does it reveal whether declines in English and math assessment scores for English learners at Oakland Unified were the result of the amounts of supplemental and concentration funds the district directed toward those students. When we asked CDE for its perspective, the director of its Analysis, Measurement, and Accountability Reporting Division indicated that state law provided for the establishment of the dashboard to allow county offices and districts to evaluate strengths and weaknesses and identify areas that require improvement; it does not require CDE to determine whether LCFF is working. Nonetheless, we believe that as part of its responsibility to improve public education programs, it would be reasonable for CDE to have a method for doing so.

For each goal in their LCAPs, districts are to report both estimated actual expenditures and actual outcomes. However, districts often do not effectively analyze in their LCAPs whether specific services have been successful—as we previously discussed. At times, districts articulated in their LCAPs how their expenditures of supplemental and concentration funds affected student outcomes. For example, according to its 2018–19 LCAP, Oakland Unified reported that it spent an estimated $250,000 in supplemental and concentration funds to provide a five‑week summer literacy program and that participating students averaged three months of reading growth. However, districts do not consistently provide this type of information. Moreover, even if they and other local educational agencies consistently measured the effectiveness of their spending of supplemental and concentration funds and reported those results in their LCAPs, it would be onerous for CDE to aggregate, summarize, and report that information on a statewide basis; the source information would exist in the more than two thousand LCAPs local educational agencies prepare each year, each of which could contain dozens of individual expenditures. As we describe in the next section, collecting and aggregating these data is critical for understanding how funding affects students and for determining whether the State should take additional actions to close achievement gaps.

By Implementing Certain Tracking Mechanisms, the State Could Better Understand How LCFF Funding Affects Student Outcomes

Since implementing LCFF in fiscal year 2013–14, the State has allocated billions of dollars in supplemental and concentration funds each year, yet policymakers still lack adequate information to assess the impact of those funds on the educational outcomes of the intended student groups. We acknowledge that a key principle of LCFF is local control, and we do not advocate undermining that principle. However, because districts do not always clearly describe how the supplemental and concentration funds they spend principally benefit intended student groups and because achievement gaps still exist for those student groups, we believe the State should do more to obtain data that would help policymakers and other stakeholders better assess the impact of the funds the State distributes. By collecting and reporting additional information about districts’ uses of supplemental and concentration funds, the State could ensure that it and other stakeholders better understand how the districts’ spending of these funds affects intended student groups and whether further action is necessary to close persistent achievement gaps.

As an initial step, the State could collect and report data on the total amount of supplemental and concentration funds each district spends to assess whether they spend all of it. As we discuss in the first section of this report, because regulations directed them to use prior‑year spending amounts, the districts we visited did not include in their LCAPs all of the supplemental and concentration funds that they would have included had they based their estimates on the percentages in state law. Further, they did not spend all of the supplemental and concentration funds they did include in their LCAPs. As a result, the extent to which hundreds of millions of dollars benefited the intended student groups is unclear. To provide assurance that districts spend all of their supplemental and concentration funds, the Legislature could require CDE to identify a common methodology—for instance, using resource codes in CDE’s already existing account code structure—for districts to track and report the total amount of supplemental and concentration funds that they receive and spend each year.
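To illustrate the mechanics of such a methodology, the sketch below shows how a district could total the supplemental and concentration funds it received and spent if CDE designated a resource code for those funds. The resource code value, the expenditure records, and the dollar amounts are all invented for this illustration; they do not reflect any actual account code or district data.

```python
# Hypothetical illustration of tracking supplemental and concentration
# (S&C) funds through a designated resource code in a district's
# accounting system. All codes and amounts below are invented.

SC_RESOURCE_CODE = "0740"  # hypothetical resource code for S&C funds

expenditures = [
    {"resource": "0740", "description": "Intervention summer school", "amount": 1_200_000},
    {"resource": "0740", "description": "English learner tutors",     "amount": 850_000},
    {"resource": "0000", "description": "Base program staffing",      "amount": 4_000_000},
]

sc_allocated = 2_500_000  # hypothetical S&C allocation for the year

# Summing by resource code yields the total S&C spending, which the
# district could then report alongside its allocation.
sc_spent = sum(e["amount"] for e in expenditures if e["resource"] == SC_RESOURCE_CODE)
sc_unspent = sc_allocated - sc_spent

print(f"S&C spent:   ${sc_spent:,}")    # S&C spent:   $2,050,000
print(f"S&C unspent: ${sc_unspent:,}")  # S&C unspent: $450,000
```

Because the calculation filters on a single standardized code, CDE could aggregate the same figures statewide without districts needing the kind of months‑long manual compilation described in the next paragraph.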

A standardized methodology for tracking supplemental and concentration funds could also help districts more easily compile the information that they report in their LCAPs. Because the districts we reviewed did not consistently track all of their supplemental and concentration funds in their accounting systems, they sometimes had to use time‑consuming processes to calculate the amounts of these funds they reported in their LCAPs. For example, Oakland Unified’s former LCAP coordinator stated that she developed a process—which she said took about three months to complete—that involved using a spreadsheet to manually compile LCAP expenditure information. She said that when the State first transitioned to LCFF, it provided limited information about how districts should generate expenditure information for their LCAPs and that such guidance would have been helpful. According to Oakland Unified’s chief academic officer, the district has a new accounting system that now tracks supplemental and concentration funds more accurately and can provide useful information for the district.

Beyond simply accounting for the total amount of districts’ budgeted and spent supplemental and concentration funds, the State could begin to determine the impact of those funds by gathering information about the types of services the districts provided with the funds and then comparing that information to student outcomes. To know where to expect supplemental and concentration funds to contribute to improvements in the dashboard’s indicators, the State and other stakeholders need to know the types of services districts have provided—such as math support or English learner tutors—using those funds. However, as we note previously, the State has not required districts to track and report their expenditures of supplemental and concentration funds in ways that correspond with dashboard indicators. To address this gap between funding and outcomes, the State needs to collect additional spending information from districts, as Figure 9 indicates. For example, if a district provided English learner tutors for its intended student groups, it could report expenditures for these tutors as supplemental and concentration funds and as targeted toward English learners. The State and other stakeholders could then compare this spending information with the appropriate dashboard indicators—in this case, the English Learner Progress indicator.
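The comparison described above can be sketched in miniature: if districts reported expenditures by service category and CDE aligned each category with a dashboard indicator, spending could be aggregated per indicator. The category names, the mapping, and the amounts below are invented for this illustration; only the dashboard indicator names come from the report.

```python
# Hypothetical illustration: aggregating categorized S&C spending by
# the dashboard indicator each service category is aligned with.
# Categories, mapping, and amounts are invented for this sketch.

category_to_indicator = {
    "english_learner_tutoring": "English Learner Progress",
    "graduation_support":       "Graduation Rate",
}

district_spending = [
    {"category": "english_learner_tutoring", "amount": 850_000},
    {"category": "graduation_support",       "amount": 300_000},
    {"category": "english_learner_tutoring", "amount": 150_000},
]

# Roll up spending to the indicator level so it can sit alongside
# the corresponding dashboard outcomes.
spending_by_indicator = {}
for record in district_spending:
    indicator = category_to_indicator[record["category"]]
    spending_by_indicator[indicator] = spending_by_indicator.get(indicator, 0) + record["amount"]

print(spending_by_indicator)
# {'English Learner Progress': 1000000, 'Graduation Rate': 300000}
```

With spending rolled up this way, the State and other stakeholders could compare, for example, English learner tutoring expenditures against movement in the English Learner Progress indicator.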

Figure 9
By Collecting and Reporting Districts’ Spending Information, the State Could Strengthen the Links Between Spending and Outcomes

Figure 9, a graphic presenting the missing links between the information in districts’ LCAPs, their accounting systems, and the dashboard.

Source: Analysis of the 2018 California School Dashboard Technical Guide, the 2019 California School Accounting Manual, and LCAPs and financial information from the selected districts.


To standardize these tracking procedures, the Legislature could require CDE to identify categories for the types of services that districts provide with their supplemental and concentration funds. CDE could align certain categories with dashboard indicators and provide guidance to districts to ensure that they categorize expenditures consistently. When we asked about the best way to collect this spending information, the State Board’s deputy policy director and assistant counsel and the administrator of CDE’s Financial Accountability and Information Services Office stated that it may be more feasible to create a new computer‑based reporting tool through which districts and other local educational agencies could enter information about expenditures that they may already report on paper in their LCAPs.

County offices and districts could also use the categorized spending information when they participate in the differentiated assistance process. As we indicate in the Introduction, a district is eligible for differentiated assistance—the State’s primary process for ensuring that districts receive individualized support—if the dashboard shows that the district has any student groups that do not meet performance standards for two or more dashboard performance indicators. During the differentiated assistance process, the county office works with the district to identify possible causes of these achievement gaps. If a district tracked and reported its expenditures of supplemental and concentration funds as we have proposed, the district and county office could use those data to inform their analyses of achievement gaps. For instance, the data might suggest that a district’s lack of spending for services to meet certain goals has contributed to poor outcomes or that the services on which the district has spent funds are ineffective.

Categorized spending data could also be useful for broader policy discussions about LCFF. In the course of our review, we observed that districts reported expenditures related to academic needs, as well as to other, more fundamental needs—such as physical safety and mental well‑being. For example, according to their LCAPs, the three districts we visited used millions of dollars of supplemental and concentration funds to address students’ basic needs such as food, health, and safety; the districts’ LCAPs associated these funds with services such as a child nutrition program, nurses, mental health staff, and school security officers. The State could measure the amount of supplemental and concentration funds that districts direct toward these basic needs by including in its tracking mechanism appropriate categories that focus on issues like health and wellness. These data could be useful for policymakers if they wanted to consider increasing LCFF base funding or redirecting other funding sources to address these fundamental needs.

The State and local entities would not sacrifice local control by collecting and reporting spending data related to districts’ uses of supplemental and concentration funds. Districts would report spending information after they have decided how to spend their funds; because the tracking mechanism would be informational, not prescriptive, it would not represent a return to categorical funding. Moreover, a precedent exists for tracking funds that are generally free of state control: education funding provided under the California State Lottery Act of 1984 is not subject to state control, yet state law still requires each district to track the lottery funds it receives and spends. In fact, CDE has created a standardized accounting code for districts’ unrestricted lottery funds. These requirements exist even though lottery funds are similar to supplemental and concentration funds in that both are unrestricted and have few spending requirements. Further, CDE’s LCAP template already contains sections for districts to record the intended student groups and the state or local priorities that the districts intend to address through their expenditures of supplemental and concentration funds. A standardized tracking mechanism would merely be a way for the State to collect similar information electronically, thus allowing it to aggregate those data on a broader scale and then align them with dashboard outcomes at the school, district, and statewide level.

We recognize that drawing links between certain types of expenditures of supplemental and concentration funds and districts’ dashboard outcomes may be challenging. For example, a single expenditure may support services related to social‑emotional learning as well as to academic and career mentorship. Further, that same expenditure may affect more than one dashboard indicator. However, the complexities of education funding and of local control should not prevent the State from gathering, summarizing, and sharing information about how districts actually use supplemental and concentration funds meant to benefit intended student groups.

The State has an opportunity to take an important step toward learning more about the effectiveness of billions of dollars that it allocates for K–12 education. Tracking and summarizing the districts’ use of supplemental and concentration funds would provide useful data that would be a critical step toward establishing direct connections between the State’s appropriations of these funds and LCFF’s effectiveness in closing persistent achievement gaps related to the intended student groups. Without this information, we believe that the State will continue to struggle to determine whether it needs to do more to close those gaps. We provide several recommendations to help the State better ensure that intended student groups receive maximum benefit from the supplemental and concentration funds it allocates, which we summarize in Figure 10.

Figure 10
By Implementing Our Key Legislative Recommendations, the State Could Better Ensure That Supplemental and Concentration Funds Benefit Intended Student Groups

Figure 10, a chart describing the current concerns we identified regarding supplemental and concentration funds, LCAPs, and intended student group outcomes. The chart also describes our recommendations related to each concern and the intended results of implementing our recommendations.


Recommendations

Legislature

To increase the transparency of LCAPs and ensure that stakeholders can provide an adequate level of oversight, the Legislature should amend state law to require districts and other local educational agencies to identify in their LCAPs the specific amounts of budgeted and estimated actual supplemental and concentration expenditures for each service that involves those funds.

To ensure that intended student groups receive the maximum benefit from supplemental and concentration funds, the Legislature should take the following actions:

To provide additional data for the State and other stakeholders and to align spending information with the dashboard indicators or other student outcomes, the Legislature should take the following actions:

State Board

To increase the transparency of LCAPs, by February 2020, the State Board should make the following changes to the LCAP template:

To ensure that districts and other local educational agencies produce clear and effective LCAPs and to reduce the likelihood of stakeholder complaints, by April 2020 the State Board should revise the instructions for the LCAP template as follows:

We conducted this audit under the authority vested in the California State Auditor by Government Code 8543 et seq. and according to generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives specified in the Scope and Methodology section of the report. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Respectfully submitted,

ELAINE M. HOWLE, CPA
California State Auditor

November 5, 2019





