CALIFORNIA STATE AUDITOR
Bureau of State Audits
Data Reliability
State Agencies’ Computer-Generated Data Varied in
Their Completeness and Accuracy

October 2012 Report 2012-401

The first five copies of each California State Auditor report are free. Additional copies are $3 each, payable by
check or money order. You can obtain reports by contacting the Bureau of State Audits at the following address:
California State Auditor
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, California 95814
916.445.0255 or TTY 916.445.0033
OR
This report is also available on the World Wide Web http://www.auditor.ca.gov
The California State Auditor is pleased to announce the availability of an online subscription service. For
information on how to subscribe, please contact the Information Technology Unit at 916.445.0255, ext. 456,
or visit our Web site at www.auditor.ca.gov.
Alternate format reports available upon request.
Permission is granted to reproduce reports.
For questions regarding the contents of this report,
please contact Margarita Fernández, Chief of Public Affairs, at 916.445.0255.

Elaine M. Howle, State Auditor
Doug Cordiner, Chief Deputy

CALIFORNIA STATE AUDITOR
Bureau of State Audits
555 Capitol Mall, Suite 300
Sacramento, CA 95814
916.445.0255 | 916.327.0019 fax
www.auditor.ca.gov

October 25, 2012

2012‑401

The Governor of California
President pro Tempore of the Senate
Speaker of the Assembly
State Capitol
Sacramento, California 95814
Dear Governor and Legislative Leaders:
This letter report presents a summary of the results of the California State Auditor’s (state auditor)
assessments of the reliability of data in a wide variety of databases used by the state auditor for
the purposes of its audits. The U.S. Government Accountability Office (GAO), whose standards
we follow, requires us to assess and report on the reliability of computer‑processed information
that we use to support our audit findings, conclusions, and recommendations. Data reliability
refers to the accuracy and completeness of the data, given our intended purposes for the data’s
use. The State uses these data in many ways, which include reporting on its programs, processing
payroll and personnel transactions, and managing state licensing. Although we disclosed these
data reliability assessments in 20 audit reports that we issued during 2010 and 2011, this report is
intended to call attention both to areas of concern, where important data are not always reliable,
and to instances in which information has been reliable.
Many systems had reliable data for our purposes, but some important systems did not. During
the 20 audits covered by this report, we assessed the reliability of specific data in 53 instances.
For 13 assessments, we concluded that the data were reliable and that the likelihood of significant
errors or incompleteness is minimal and that using the data would not lead to an incorrect or
unintentional conclusion. We found, for example, that the California Department of Corrections
and Rehabilitation’s payroll data maintained by the State Controller’s Office was reliable, allowing
us to present information related to employee overtime.
However, for 14 assessments, we reported the data were not sufficiently reliable, meaning that
using the data would probably lead to an incorrect or unintentional conclusion and that key data
fields have significant errors or are incomplete, given the audit topics and intended uses of the
data. For instance, we found significant errors in the accuracy of the data in the Department of
Mental Health’s (Mental Health) Sex Offender Commitment Program Support System (Mental
Health’s database), which Mental Health uses to track information related to each inmate the Sex
Offender Commitment Program evaluates to determine if the inmate meets the sexually violent
predator criteria. Specifically, we found that in our random sample of 29 records, Mental Health’s
database contained an unacceptably high number of errors in six key fields.
For 26 assessments, circumstances prevented us from concluding one way or the other as to
the reliability of the data. In many cases, our conclusion that data were of undetermined
reliability arose from our decision to limit testing due to the prohibitively high cost
to fully test a database when source documents were housed at numerous locations
throughout the State. For instance, we did not conduct accuracy or completeness testing for

2

California State Auditor Report 2012-401

October 2012

the Department of Health Care Services’ California Medicaid
Management Information System because the source documents
required for this testing are stored at medical providers’ offices
located throughout the State, making such testing cost‑prohibitive.
The table on pages 8 through 11 summarizes selected information
from the pages referenced in the Appendix. The data reliability
assessment relates to the purpose for which we tested the system’s
data during the audit, as described in the Appendix. The agency’s
use of the system’s data usually, but not always, is similar to our use
of the system’s data.


Introduction
Information technology (IT) systems are increasingly important for
efficient and effective business practices. The State has an ongoing
need for its IT systems to keep pace with technological changes and
to develop and use systems and databases where they have not
existed in the past. Equally important, however, is state agencies’
day‑to‑day use of existing IT systems for purposes that can have
significant impacts on the State’s operations, such as reporting on
programs, processing payroll and personnel transactions, tracking
and monitoring licensees, disbursing funds, and reaching program
decisions. In October 2008 and in August 2010 we issued reports
that addressed the reliability of the data from systems we tested as
part of audits issued in 2006 through 2009. The reliability of the
data from systems tested during audits issued in 2010 and 2011 is
the subject of this report.
The U.S. Government Accountability Office (GAO),
whose standards we follow, requires us to assess
and report on the reliability of computer‑processed
information that we use to support our audit
findings, conclusions, and recommendations. Data
reliability refers to the accuracy and completeness
of the data, given the intended purposes for
their use. The GAO defines the three possible
assessments we can make—sufficiently reliable
data, not sufficiently reliable data, and data of
undetermined reliability. (See the text box for
definitions.) In assessing data reliability, we take
several factors into consideration, including the
degree of risk involved in the use of the data and
the strength of corroborating evidence. A single
database may have different assessments because
information that we use for one purpose is accurate
and complete, whereas data fields needed for a
separate purpose are not.

Definitions Used in Data Reliability Assessments
Sufficiently Reliable Data—Based on audit work, an
auditor can conclude that the likelihood of significant errors
or incompleteness is minimal and that using the data would
not lead to an incorrect or unintentional message given the
research question and intended use of the data.
Not Sufficiently Reliable Data—Based on audit work, an
auditor can conclude that results indicate significant errors
or incompleteness in some or all of the key data elements, and
that using the data would probably lead to an incorrect or
unintentional message, given the research question and the
intended use of the data.
Data of Undetermined Reliability—Based on audit work,
an auditor can conclude that use of the data may or may
not lead to an incorrect or unintentional message, given the
research question and intended use of the data.

We may employ various procedures for determining the reliability
of computer‑processed data we report and use to reach audit
conclusions. For example, if we want to use data to identify the
costs of contracted specialty health care for inmates, we might
test the California Prison Health Care Services’1 Contract Medical
Database in the following ways:

1 In January 2011 California Prison Health Care Services changed its name to California Correctional
Health Care Services.

•	 Performing electronic testing of key fields to identify any issues
with the fields that would materially affect our analysis. For
example, we might verify that all records contain valid types
of services.
•	 Verifying the accuracy of the data by selecting a random sample
of inmate health care costs and agreeing the data to source
documents. For example, we might verify the amounts paid
in the data agree with the invoice or contract amounts for a
specified service.
•	 Ensuring the completeness of data by selecting a sample of
source documents and verifying that corresponding entries exist
in the data. For example, we might haphazardly select hard copy
invoices and verify they are included in the data.
In the case of the California Prison Health Care Services, we tested
its Contract Medical Database for all these elements and found it to
be reliable for the purposes of our audit.
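
To make these procedures concrete, the following is a minimal sketch, in Python, of how the three tests might be scripted against a data extract. It is purely illustrative: the field names (service_type, amount_paid, invoice_id) and the set of valid service codes are hypothetical, not drawn from the actual Contract Medical Database.

# Illustrative sketch only; the schema below is hypothetical, not the
# actual Contract Medical Database.
import pandas as pd

VALID_SERVICE_TYPES = {"INPATIENT", "OUTPATIENT", "SPECIALTY"}  # hypothetical codes

def electronic_test(records: pd.DataFrame) -> pd.DataFrame:
    """Return records whose key fields hold values that would materially
    affect the analysis (invalid service types, missing or negative amounts)."""
    bad_type = ~records["service_type"].isin(VALID_SERVICE_TYPES)
    bad_amount = records["amount_paid"].isna() | (records["amount_paid"] < 0)
    return records[bad_type | bad_amount]

def accuracy_sample(records: pd.DataFrame, n: int, seed: int = 0) -> pd.DataFrame:
    """Draw a random sample of records; an auditor would then agree each
    sampled record to its source invoice or contract amount by hand."""
    return records.sample(n=n, random_state=seed)

def completeness_check(records: pd.DataFrame, sampled_invoice_ids: set) -> set:
    """Given invoice IDs from a haphazard sample of hard-copy source
    documents, return the IDs with no corresponding entry in the data."""
    return sampled_invoice_ids - set(records["invoice_id"])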
To provide the appropriate perspective about information derived
from computer‑based systems, GAO standards require us to
disclose the results of our data reliability testing and the limitations
of the data we use.


Audit Results
Many Automated Systems Had Reliable Data for the Purposes of
the Audits
In assessing data reliability for audit purposes in 53 instances,
we determined that the data were reliable in 13 assessments.
Therefore, for these assessments, we were able to use the data to
draw conclusions and to quote the data without qualifications about
the accuracy or completeness of the information. For example,
we were able to use the Employment Development Department’s
leave accounting data maintained by the State Controller’s Office’s
(SCO) California Leave Accounting System to identify the amount
of leave used and accrued by unemployment insurance program
representative staff. We also concluded that the California
Department of Corrections and Rehabilitation’s (Corrections)
payroll data maintained by the SCO’s Uniform State Payroll System
were sufficiently reliable for us to report data on overtime. At the
California Housing Finance Agency, we were able to identify bond
issuance amounts and dates, rate types, and hedge status for our
audit period because we found the data from its debt management
system sufficiently reliable.
Many Automated Systems Were Not Sufficiently Reliable for the
Purposes of the Audits
For 14 data reliability assessments, we concluded that the data were not
sufficiently reliable. One reason for this conclusion was that the number of
errors caused by inaccurate data exceeded the acceptable error threshold we
had established for deeming the data reliable for our purposes.
For instance, we found significant errors in the accuracy of the data
in the Department of Mental Health’s (Mental Health) Sex Offender
Commitment Program Support System (Mental Health’s database),
which Mental Health uses to track information related to each inmate
the Sex Offender Commitment Program evaluates to determine if the
inmate meets the sexually violent predator criteria. Specifically, we
tested a random sample of 29 records and found that Mental Health’s
database contained three errors in six of 21 key fields. At Corrections,
our electronic testing revealed that its CalParole Tracking System
(CalParole) contained a significant amount of erroneous data that
prevented us from drawing conclusions using the data. For example,
of the nearly 645,000 parolee employment records from CalParole
that we reviewed, at least 33,000 (5 percent of all records) contained
errors in the data field that is designed to identify a parolee’s current
employer. Additionally, the employer address fields were blank for
more than 20,000 of the more than 290,000 records that appeared
to contain a valid employer. Our testing of hard‑copy parolee files
further confirmed the limitations of CalParole employment data. We
reviewed the files of 36 parolees from 12 parole units located across
the State whom CalParole listed as employed. For 31 of 36 parolees, the
hard‑copy files did not contain pay stubs, letters from employers, or
any other reliable documents that would offer proof of employment to
support the employment information recorded in CalParole.
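
For readers who want a concrete picture of what this kind of electronic testing involves, here is a minimal sketch in Python. The column names (current_employer, employer_address) and the placeholder values are hypothetical stand-ins; they are not the actual CalParole schema.

# Illustrative sketch only; hypothetical CalParole-style extract.
import pandas as pd

PLACEHOLDER_EMPLOYERS = {"", "UNKNOWN", "N/A"}  # hypothetical junk values

def employment_error_rates(jobs: pd.DataFrame) -> dict:
    """Compute the share of records with an erroneous employer entry and,
    among records that appear to name a valid employer, the share with a
    blank employer address."""
    erroneous = jobs["current_employer"].fillna("").isin(PLACEHOLDER_EMPLOYERS)
    valid = jobs[~erroneous]
    blank_address = valid["employer_address"].isna() | (
        valid["employer_address"].str.strip() == ""
    )
    return {
        "erroneous_employer_share": erroneous.mean(),  # the audit found about 5 percent
        "blank_address_share": blank_address.mean(),
    }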
In some circumstances we recommended that the audited agency
take corrective action. For example, to improve the reliability of
employment data contained in CalParole, we recommended that
Corrections ensure that parole agents correctly follow procedures
related to populating the data fields and maintaining CalParole. In
addition, we recommended that supervisors of parole agents conduct
periodic reviews of parolee files to verify whether employment fields
are completed appropriately and whether employment is documented
adequately. Finally, as Corrections prepares to move CalParole data
into the planned Strategic Offender Management System (SOMS),
which will consolidate existing databases and records to provide
a fully automated system and replace manual paper processes, we
recommended it modify existing employment‑related fields and add to
SOMS new fields that are currently not available in CalParole so that
Corrections can minimize the opportunity for erroneous data entries
and make employment data more reliable.
We Were Unable to Determine the Reliability of Data for Some Audits
For 26 data reliability assessments, we concluded that the data had
undetermined reliability. For 17 of the 26 assessments for which the
data were of undetermined reliability, the causes were not reasons
for concern. In many cases, the determination that data were of
undetermined reliability arose from our decision to limit testing
due to the prohibitively high cost to fully test a database when
source documents were housed at numerous locations throughout
the State. In other cases, testing was not possible because the
department, in accordance with its record retention policy, did not
retain the source documentation.
For example, data from two Department of Health Care Services’
(Health Care Services) systems were of undetermined reliability. We
did not conduct accuracy or completeness testing for Health Care
Services’ California Medicaid Management Information System
because the source documents required for this testing are stored
at medical providers’ offices located throughout the State, making
such testing cost‑prohibitive. For Health Care Services’ Service
Utilization Review, Guidance, and Evaluation (SURGE) computer
application, we could not assess data reliability by tracing to and
from supporting documents because the system is partly paperless.
Alternatively, following GAO guidance, we could have reviewed
selected system controls—the underlying structures and processes


of the computer where data are maintained—to determine whether
data were entered reliably. However, the SURGE data was provided
from a data warehouse,2 rather than from production data taken
directly from the SURGE database. System controls that apply to the
SURGE database can be overridden in a data warehouse, so in this
instance a test of system controls would not have been meaningful;
therefore, we did not test these controls.
For nine of the 26 data reliability assessments where the data were
of undetermined reliability, we identified concerns associated with
the audited agencies’ data or practices. For example, data from the
Department of General Services’ Division of the State Architect’s
(division) Tracker database was of undetermined reliability in part
because the dates it records in its database for the start and end of
construction can come from different and sometimes undocumented
sources. For example, the division may draw on several sources for
the construction end date, which may not accurately or consistently
represent the actual end of construction. Because the division did
not have a consistent method for identifying the date construction
ended, we were unable to test the accuracy of this field.
The Appendix Provides Specific Information About Each of the Data
Assessments That We Reported
The Appendix to this report contains tables that summarize the results
of the data reliability assessments for state‑administered programs
we discuss in audit reports issued in 2010 and 2011. The tables in the
Appendix are preceded by brief summaries of their related reports
and are organized by oversight agency, if applicable, and date order of
reports issued. They indicate the agency audited and the name of the
database we examined. The tables also include the following:
•	 Our purpose (or intended use) of testing the data, our assessment
based on our intended use, the audited agency’s purpose for the
data, and recommendations for corrective actions, if any. Although
our purpose for the data is sometimes the same as that of the
agency, our purpose occasionally differs. We report the results of
these assessments to inform others who may try to use the data.
•	 The agency’s response to our recommendations, if applicable. The
response date listed generally corresponds to the date noted in
the annual report to legislative subcommittees about the corrective
actions that the agency took to address our data reliability
recommendations. We issued our most recent report to the
subcommittees in March 2012. Therefore, some agencies may have
taken more recent corrective actions that we do not report here.
2 A data warehouse is a system that stores, retrieves, and manages large volumes of data.


Finally, the tables disclose information that provides context about
the significance of the data we obtained during our audits. For
example, the data from the California Department of Education’s
Consolidated Application Data System, which we used to determine
the number of traditional and charter schools and their students
eligible for free and reduced‑price meals, indicated that nearly 3.3
million of the more than 6.1 million students enrolled were eligible
to receive free or reduced‑price meals.
The table below summarizes the 36 data reliability assessments
included in the Appendix. We excluded the 17 data reliability
assessments associated with systems where the data were of
undetermined reliability because we chose to limit our testing.
We made this choice either because it was too costly to conduct
the tests or the source documents needed for testing were no
longer available due to the department’s retention policy. We do
not consider these instances to be problematic. The table lists the
agency and department associated with each database, our data
reliability assessment, the agency or department’s purpose for the
database, and the page number for each database’s data reliability
assessment table in the Appendix. In some cases we performed
more than one data reliability assessment of a database. If a database
with multiple assessments received the same rating more than once,
we list that rating only once in the summary table. For example, we
found Corrections’ Offender Based Information System was not
sufficiently reliable in one assessment and of undetermined reliability
in three other assessments; thus, in the table we list the assessment
simply as “No, Undetermined” to summarize the results of the
four assessments.
Table
Summary of Reliability Assessments for Audits Issued in 2010 and 2011

AGENCY | SYSTEM | RELIABLE FOR AUDIT PURPOSES? | AGENCY PURPOSE OF DATA | PAGE

BUSINESS, TRANSPORTATION AND HOUSING

California Housing Finance Agency (CalHFA) | Debt management system | Yes | To serve as a repository for CalHFA’s bond and swap information, and as a reporting and analysis tool. | 16
California Housing Finance Agency (CalHFA) | Mortgage reconciliation system | Yes | To reconcile the loan payments received and to serve as a source for reporting. | 16
Transportation, California Department of (Caltrans) | California Transportation Improvement and Programming System | Undetermined | To provide a shared database for Caltrans District Offices, Headquarters, Regional Transportation Planning Agencies, Federal Highway Administration, and Metropolitan Planning Organizations to manage the programming and allocation of funds for the State Transportation Improvement Program, State Highway Operation and Protection Program, and local projects. | 18

CORRECTIONS AND REHABILITATION, CALIFORNIA DEPARTMENT OF

Corrections and Rehabilitation, California Department of (Corrections) | Offender Based Information System | No, Undetermined | To capture and maintain all adult offender information from the time that the offenders are committed to Corrections through the time of their discharge. | 20, 25, 28, 30
Corrections | Offender Based Information Tracking System | No | To record and track youth offender information such as demographics, movement within and between facilities, and the occurrence and processing of youth behavioral violations. | 20
Corrections | Corrections’ payroll data maintained by the State Controller’s Office’s (SCO) Uniform State Payroll System | Yes | To process the State’s payroll and personnel transaction documents. | 20
Corrections | Corrections’ leave accounting data maintained by the SCO’s California Leave Accounting System | Yes | To perform a variety of functions necessary to accurately track leave system eligibility, state service credits, and leave benefit activity. | 21
Corrections | Corrections’ authorized positions maintained by the SCO’s Position Roster File for fiscal years 2004–05 through 2008–09 | Yes | The Position Roster File is a file of authorized positions used by departments to track positions. | 21
Corrections | CalParole Tracking System (CalParole) | No | To track parolee progress through the parole process and to update, add, or retrieve parolee information. In addition, CalParole shares parolee data through interfaces with the Department of Justice and local law enforcement agencies. | 24
Corrections | Correctional Offender Management Profiling for Alternative Sanctions database | No | To assist criminal justice practitioners in the placement, supervision, and case management of offenders in community and secure settings. | 30

EDUCATION, CALIFORNIA DEPARTMENT OF

Education, California Department of (Education) | Consolidated Application Data System | No | To provide a process for local educational agencies to complete Education’s Consolidated Application for Funding Categorical Aid Programs, which Education uses to distribute categorical funds from various state and federal programs. | 32
Education | Child Nutrition Information and Payment System (CNIPS) database | No | To help local program sponsors administer state and federal nutrition programs. CNIPS enables sponsors to submit online reimbursements, view the status of applications and meal reimbursement claims, and access site and sponsor information across programs. | 33

HEALTH AND HUMAN SERVICES

Public Health, Department of (Public Health) | Electronic Licensing Management System | Undetermined | To provide Public Health’s Licensing and Certification Division with an intranet application tool to manage the State’s licensing of over 30 types of health care facilities. | 36
Developmental Services, Department of | Uniform Fiscal System | Yes | To establish and track regional center authorization and billing data including vendor number, purchase authorization number, consumer identification and eligibility information, service code, service rate, claim amount, and claim date. | 38
Mental Health, Department of | Sex Offender Commitment Program Support System | No | To track data related to each inmate that meets the sexually violent predator criteria and all of their referrals for civil commitment. | 28

LABOR AND WORKFORCE DEVELOPMENT

Employment Development Department (EDD) | Streamline Tracking System (streamline database) | No | To track the length of time required for EDD’s training benefits program determinations. The streamline database also captures information related to the training and sponsoring program provided on a training benefits program application, the reason that an application cannot be processed, and the final processing status of an application (completed, unprocessed, or denied). | 40
EDD | EDD’s position information maintained by the SCO’s Position Roster File | Yes | The Position Roster File is a file of authorized positions used by departments to track positions. | 41
EDD | EDD’s leave accounting data maintained by the SCO’s California Leave Accounting System | Yes | To perform a variety of functions necessary to accurately track leave system eligibility, state service credits, and leave benefit activity. | 41
EDD | EDD’s payroll data maintained by the SCO’s Uniform State Payroll System | Yes | To process the State’s payroll and personnel transaction documents. | 41

STATE AND CONSUMER SERVICES

General Services, Department of | iVOS | Yes, No | To document claims information for all of its insurance programs, including the Foster Family Home and Small Family Home Insurance Fund. | 44
General Services, Department of | Tracker database | Undetermined | To manage the projects submitted by school districts. The Tracker database tracks project applications, key dates, the inspectors assigned to projects, and the types of project closure. The database also generates invoices and calculates the various fees owed to the Division of the State Architect for certain aspects of its work. | 46

OTHER DEPARTMENTS, JUDICIARY & COMMISSIONS

California Prison Health Care Services | Contract Medical Database | Yes | To combine data from 33 institutions into a centralized database to provide real‑time access to information, and to streamline the maintenance of contracts and vendor data. | 21
Marin Superior Court | Beacon case management database | No, Yes | To manage civil, family law, juvenile, probate, and small claims cases and to maintain filing and disposition data about these cases in accordance with direction provided by the Administrative Office of the Courts (AOC). | 48
Sacramento Superior Court | Sustain case management database | No, Undetermined | To generate calendars, minute orders, out cards, and statistics. | 48
Sacramento Superior Court | Office of Family Court Services’ database | No | To assign cases to mediators, to send notices regarding upcoming mediation appointments, to track activities performed by mediators related to cases, and to generate bills for evaluation services. | 49
AOC | Oracle financial system | Undetermined | To record, process, and store AOC’s financial transactions. | 52
AOC | AOC’s payroll data maintained by the SCO’s Uniform State Payroll System | Yes | To process the State’s payroll and personnel transaction documents. | 52
Commission on Teacher Credentialing | Credentialing Automation System Enterprise data | No | To track teacher data, applications, documents, exams, case information, and organization information. | 54
State Lands Commission (commission) | Application Lease Information Database (ALID) | No | To store the commission’s lease information, including the lessee name, lease term and type, lease location, rental amount, lease history, and bond and insurance information. The commission also uses tickler dates within ALID to remind staff when leases are eligible for a five‑year rent review. | 56
Justice, Department of | Sex and Arson Registry | Undetermined | To serve as the statewide repository for all sex and arson offender registration information. | 58

Respectfully submitted,

ELAINE M. HOWLE, CPA
State Auditor



Appendix
The tables on the following pages detail the results of the California
State Auditor’s assessments of the reliability of data discussed in
audits issued during 2010 and 2011. In addition, the tables briefly
summarize the main conclusions of each assessment. We excluded
the 17 data reliability assessments for which we did not identify
concerns but categorized the data as of undetermined reliability
because we chose to limit our testing.
Index

AGENCY | AUDIT NUMBER | PAGE NUMBER

Business, Transportation and Housing
California Housing Finance Agency | 2010‑123 | 15
Transportation, California Department of | 2010‑122 | 17

Corrections and Rehabilitation, California Department of
Corrections and Rehabilitation, California Department of | 2009‑107.2 | 19
Corrections and Rehabilitation, California Department of | 2010‑118 | 23
Corrections and Rehabilitation, California Department of | 2010‑116 | 27
Corrections and Rehabilitation, California Department of | 2010‑124 | 29

Education, California Department of
Education, California Department of | 2010‑104 | 31

Health and Human Services
Public Health, Department of | 2010‑108 | 35
Developmental Services, Department of | 2009‑118 | 37
Mental Health, Department of | 2010‑116 | 27

Labor and Workforce Development
Employment Development Department | 2010‑112 | 39

State and Consumer Services
General Services, Department of | 2010‑121 | 43
General Services, Department of | 2011‑116.1 | 45

Other Departments, Boards, and Commissions
California Prison Health Care Services | 2009‑107.2 | 19
Marin Superior Court | 2009‑109 | 47
Sacramento Superior Court | 2009‑109 | 47
Administrative Office of the Courts | 2010‑102 | 51
Commission on Teacher Credentialing | 2010‑119 | 53
State Lands Commission | 2010‑125 | 55
Justice, Department of | 2011‑101.1 | 57



CALIFORNIA HOUSING FINANCE AGENCY
Most Indicators Point to Continued Solvency Despite Its Financial Difficulties Created,
in Part, by Its Past Decisions
Date: February 24, 2011 | Report: 2010‑123

BACKGROUND

Overseen by a 14‑member board, the California Housing Finance Agency (CalHFA) uses the proceeds from the sale of
bonds to fund low‑interest‑rate home loans for single‑family and multifamily housing for low‑ and moderate‑income
persons and families. CalHFA repays the bonds that it issues with revenues generated through borrowers’ repayment
of mortgage loans and uses the remaining funds for its operating costs and other programs that promote affordable
housing for low‑income Californians. Unless a loan is otherwise insured by the federal government, CalHFA carries
the risk if a borrower stops paying. With plummeting home values and high levels of unemployment, CalHFA
has experienced increased delinquencies in mortgage payments from its borrowers. Although profitable for many
years, CalHFA suffered losses of $146 million and $189 million in fiscal years 2008–09 and 2009–10, respectively.
Because CalHFA is self‑sustaining, its financial problems will not affect the State’s General Fund.
KEY FINDINGS

Our review of CalHFA’s financial position, including decisions and actions that contributed to its current financial
condition and its future solvency, revealed the following:
•	 Although it will continue to face significant risks, such as its high level of variable‑rate debt, CalHFA’s major
housing programs and its operating fund should remain solvent under most foreseeable circumstances. However,
the fund that provides insurance on its mortgages will become insolvent by summer 2011.
•	 Past decisions by CalHFA, such as its use of variable‑rate bonds and launching new mortgage products in 2005 and 2006,
contributed to its current financial difficulties.
»» One of the biggest threats to CalHFA’s solvency is the amount of variable‑rate bond debt it holds—as of
June 30, 2010 it accounted for $4.5 billion or 61 percent of its total bond debt.
»» Although CalHFA’s new mortgage products—a 35‑year and a 40‑year loan launched in 2005 and 2006—were
easier for borrowers to qualify for, the delinquency rates for borrowers with these loans are presently twice as
high as CalHFA’s traditional mortgages.
»» The decision to implement what turned out to be risky loan products was never brought before the board for
a vote because CalHFA’s board delegates these decisions to staff.
•	 The composition of the CalHFA board specified in statute does not appear to require certain financial expertise
necessary to provide adequate guidance to CalHFA on complex financial matters.
KEY RECOMMENDATIONS

We recommended that the Legislature consider amending statutes regarding the composition of the CalHFA board so
that appointees include individuals who have knowledge of housing finance agencies, single‑family mortgage lending,
bonds and related instruments, and risk management. We also recommended that CalHFA’s board provide better
oversight of CalHFA including writing policies for approving new debt‑issuance strategies or mortgage products
prior to implementation, and that it restrict staff’s actions regarding debt strategies and mortgage products to those
specified in its annual delegations, approved business plans, and resolutions.


California Housing Finance Agency (2010-123, February 24, 2011)

Description of Data: An extract from the California Housing Finance Agency’s (CalHFA) debt management system for the period January 1998 through June 2010.
Agency Purpose of Data: To serve as a repository for CalHFA’s bond and swap information, and as a reporting and analysis tool. The debt management system contained information for nearly 2,200 bonds.
Purpose of Testing: To identify bond issuance amounts, dates, rate types, and hedge statuses.
Data Reliability Determination: Sufficiently reliable.

Description of Data: An extract from CalHFA’s mortgage reconciliation system for the period January 2000 through August 2010.
Agency Purpose of Data: To reconcile the loan payments received and to serve as a source for reporting. The mortgage reconciliation system contained information for more than 59,000 single‑family mortgage loans purchased between January 1, 2000, and September 1, 2010, in the amount of nearly $11.1 billion.
Purpose of Testing: To compare lending volumes and delinquency rates among CalHFA’s different loan products.
Data Reliability Determination: Sufficiently reliable.


CALIFORNIA DEPARTMENT OF TRANSPORTATION
Its Capital Outlay Support Program Should Strengthen Budgeting Practices, Refine its Performance Measures,
and Improve Internal Controls
Date: April 28, 2011 | Report: 2010‑122

BACKGROUND

Providing support activities—engineering, design, environmental studies, right‑of‑way, and construction management
of state highway projects—for nearly 2,500 capital outlay projects, the California Department of Transportation
(Caltrans) and its 12 districts provide the funding and resources through its Capital Outlay Support Program (support
program) to develop and deliver the projects to construction as well as to administer and oversee the projects once
they are under construction. Support program functions for a project begin after the California Transportation
Commission (commission) programs funding for a project and continue until the project is complete. Caltrans
has two primary programs that provide funding for capital outlay projects: the State Transportation Improvement
Program (STIP), which is a five‑year plan of projects, and the State Highway Operation and Protection Program
(SHOPP), which is a four‑year plan. Each project receives funding through multiple budget acts, and the support
program budget reflects the total for the support activities in a given year.
KEY FINDINGS

During our review of the performance, management, efficiency, and budget of Caltrans’ support program, we noted
the following:
•	 The capital outlay support costs for 62 percent of the projects that completed construction in fiscal years 2007–08
through 2009–10 exceeded their respective budgets—such overruns totaled more than $305 million of the
$1.4 billion in total support costs for these projects.
»» The average support cost overrun for STIP projects we reviewed was about $1.5 million per project and
$329,000 for the SHOPP projects, which is more than 25 percent of their respective budgets.
»» Caltrans has done little analysis to determine the frequency or magnitude of support cost budget overruns
and has not notified stakeholders of overruns.
»» An increase in the hourly rate for support costs was the primary cause for the cost overruns in the projects
we reviewed—one project was about 14,600 hours under budget yet exceeded its budgeted cost by nearly
$6.8 million, or 83 percent (a hypothetical illustration of this dynamic follows this list).
»» Some of the districts’ project managers monitor their budgets based primarily on the hours charged and not
the dollars spent, while others monitor budgets using a combination of hours and the planned versus actual
project schedule.
•	 Although Caltrans has established a goal of reducing total support costs to 32 percent of total capital costs, it has
historically failed to use a consistent method to calculate the ratio over time, and has generally not met its goal for
the last three fiscal years.
•	 Caltrans’ time‑reporting system does not prevent its employees from charging time to projects to which they are
not assigned, and lacks strong internal controls to ensure that its employees charge their time appropriately.
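
The hourly-rate finding above can be counterintuitive, so here is a hypothetical illustration (all numbers invented, not actual Caltrans figures) of how a project can use fewer hours than budgeted yet still overrun its support-cost budget when the hourly rate rises.

# Hypothetical numbers only, for illustration; not actual Caltrans data.
budgeted_hours, budgeted_rate = 100_000, 80.0  # dollars per hour
actual_hours, actual_rate = 85_400, 160.0      # 14,600 fewer hours, higher rate
overrun = actual_hours * actual_rate - budgeted_hours * budgeted_rate
print(f"Hours under budget: {budgeted_hours - actual_hours:,}")
print(f"Cost overrun: ${overrun:,.0f}")  # positive despite the saved hours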
KEY RECOMMENDATIONS

We make numerous recommendations to Caltrans including that it improve accountability internally and with the
public such as by creating and incorporating an analysis of support cost budget overruns and reporting it quarterly
and annually to the Legislature and governor. Further, we recommend various actions to improve performance
metrics related to the support program including developing and publicizing the goals. Moreover, we provide
recommendations geared at better developing, managing, and monitoring support budgets for projects.


Transportation, California Department of (2010-122, April 28, 2011)

Description of Data: California Department of Transportation’s (Caltrans) California Transportation Improvement and Programming System (CTIPS).
Agency Purpose of Data: To provide a shared database for Caltrans District Offices, Headquarters, Regional Transportation Planning Agencies, Federal Highway Administration, and Metropolitan Planning Organizations to manage the programming and allocation of funds for the State Transportation Improvement Program (STIP), State Highway Operation and Protection Program (SHOPP), and local projects. For fiscal years 2004–05 through 2009–10, CTIPS showed more than $5.6 billion in STIP and SHOPP budgeted support costs for projects that reached either the ready to list milestone or the construction contract complete milestone.
Purpose of Testing: To obtain budgeted project support costs and hours.
Data Reliability Determination: Undetermined reliability—To ensure the data Caltrans provided us was accurate, we compared the data provided to the actual information in Caltrans’ systems for a random sample of 36 projects. We were able to verify project programmed budget data in CTIPS for 26 of the 36 projects by comparing the information to actual STIP and SHOPP documents. However, we were unable to verify the project budget data for the remaining 10 projects because, according to the information provided by Caltrans, these projects were generally the result of projects that subsequently divided into multiple projects or combined with existing projects. Thus, information for these types of projects was no longer available.
Agency Response Date: October 2011
Corrective Action Recommended: To improve accountability internally and with the public, Caltrans should develop a system to report on the total budgets of support program projects—including initial project support budgets—of projects that have been divided into multiple projects or combined into a larger project.
Status of Corrective Action: Fully implemented—Caltrans stated that it has developed improved business practices to allow for easier tracking of project budgets. Specifically, Caltrans provided a project management directive outlining a process for managing project funding and costs when projects are split or combined into one or more construction contracts. The process allows for tracking the origin of projects split into multiple projects or combined into one project. That directive took effect in August 2011.


CALIFORNIA DEPARTMENT OF CORRECTIONS AND REHABILITATION
Inmates Sentenced Under the Three Strikes Law and a Small Number of Inmates Receiving Specialty
Health Care Represent Significant Costs
Date: May 18, 2010 | Report: 2009‑107.2

BACKGROUND

With annual expenditures of $10 billion or 10 percent of the State’s General Fund in fiscal year 2007–08—a 32 percent
increase over the previous three years—the California Department of Corrections and Rehabilitation (Corrections)
operates California’s prisons, oversees community correctional facilities, supervises parolees, and operates the juvenile
justice system. Health care to inmates is provided at each adult facility and through external contractors. The inmate
health care function transitioned to a federal court‑appointed receiver and is now overseen by the California Prison
Health Care Services (Health Care Services). In reviewing the effect of California’s prison population on the State’s
budget and its operations, the state auditor issued the first of this two‑part report in September 2009 and concluded
that Corrections fails to track and use data that would allow it to more effectively monitor and manage its operations.
KEY FINDINGS

Further review of the effect of Corrections’ operations on the budget revealed the following:
•	 43,500 inmates currently sentenced under the three strikes law (striker inmates) make up 25 percent of the total
inmate population. Further, with regards to striker inmates:
»» On average, they receive sentences that are nine years longer—resulting in approximately $19.2 billion in
additional costs (a back‑of‑the‑envelope check of this figure follows this list).
»» More than half are currently imprisoned for convictions that are not classified as strikes.
»» Many were convicted of committing multiple serious or violent offenses on the same day, while some
committed one or more of these offenses as a juvenile.
•	 Health Care Services has not fully estimated potential savings from its proposed cost containment strategies.
Further, a significant portion of the cost of housing inmates is for providing health care, which includes contracted
specialty health care.
»» Roughly 41,000 of the 58,700 inmates that incurred specialty health care costs averaged just more than
$1,000 per inmate and cost $42 million in total. The remaining 17,700 inmates incurred costs of more than
$427 million in the same year.
»» Specialty health care costs averaged $42,000 per inmate for those inmates that incurred more than $5,000 for
such costs and were age 60 and older.
»» The specialty health care costs associated with inmates that died during the last quarter of the fiscal year were
significantly greater than any specific age group—ranging from $150 for one inmate to more than $1 million
for another.
•	 Nearly 32 percent of overtime costs in fiscal year 2007–08, or $136 million, were related to medical guarding and
transportation for health care.
•	 Custody staff’s growing leave balances—due in part to vacancies, errors in Corrections’ staffing formula, and
exacerbated by the State’s furlough program—represent a future liability to the State of at least $546 million and
could be more than $1 billion.
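
As a rough check of the $19.2 billion figure in the first bullet, the reported inmate count and added sentence length imply an annual cost on the order of $49,000 per inmate-year. This is derived arithmetic, not a number stated in the report:

# Back-of-the-envelope check; the per-inmate annual cost is implied, not stated.
striker_inmates = 43_500
extra_years = 9
additional_cost = 19.2e9  # dollars, as reported
implied_annual_cost = additional_cost / (striker_inmates * extra_years)
print(f"Implied cost: ${implied_annual_cost:,.0f} per inmate-year")  # about $49,000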
KEY RECOMMENDATIONS

We made several recommendations to Health Care Services including that it continue to explore methods of
reducing the cost of inmate medical care and to monitor overtime to ensure that it does not pose a safety issue. We
also recommended that Corrections provide staff the opportunity to use leave by updating its staffing formulas.
Further, we recommended Corrections better communicate to policy makers information concerning its annual leave
balance liability.


Corrections and Rehabilitation, California Department of (2009-107.2, May 18, 2010)

Description of Data: California Department of Corrections and Rehabilitation’s (Corrections) Offender Based Information System (OBIS).
Agency Purpose of Data: To capture and maintain all adult offender information from the time that the offenders are committed to Corrections through the time of their discharge. OBIS contained information related to nearly 28 million inmate movement and status change records.
Purpose of Testing: To identify the striker inmates incarcerated for a current offense that is a nonserious and nonviolent felony and striker inmates who committed multiple strikes during one criminal offense. Further, to determine the additional cost of inmates incarcerated under the three strikes law.
Data Reliability Determination: Not sufficiently reliable—To test the accuracy of the data, we selected a random sample of inmates and traced key data elements to source documents. During the testing, we identified errors in the inmate identification information that we used for associating striker inmates with crimes they committed in the past. Further, we did not conduct completeness testing because the source documents required for this testing are stored at the 33 institutions located throughout the State.
Corrective Action Recommended: We did not recommend corrective action.
Status of Corrective Action: N/A

Description of Data: Corrections’ Division of Juvenile Justice’s Offender Based Information Tracking System (OBITS).
Agency Purpose of Data: To record and track youth offender information such as demographics, movement within and between facilities, and the occurrence and processing of youth behavioral violations. OBITS contained more than 76,000 records of youth offenders and nearly 79,000 records of youth offenders’ initial and subsequent referrals.
Purpose of Testing: To identify striker inmates who received a strike as a juvenile.
Data Reliability Determination: Not sufficiently reliable—To test the data’s accuracy, we selected a random sample of inmates and traced key data elements to source documents. During the testing, we identified errors in the inmate identification information that we used for associating striker inmates with crimes they committed in the past. In addition, we did not conduct completeness testing because the source documents required for this testing are stored at multiple juvenile facilities throughout the State.
Corrective Action Recommended: We did not recommend corrective action.
Status of Corrective Action: N/A

Description of Data: Corrections’ payroll data maintained by the State Controller’s Office’s (SCO) Uniform State Payroll System (payroll system).
Agency Purpose of Data: To process the State’s payroll and personnel transaction documents. The payroll system contained nearly 47.4 million state payroll transactions for the period July 2003 through September 2009.
Purpose of Testing: To present data on overtime.
Data Reliability Determination: Sufficiently reliable.

Description of Data: Corrections’ leave accounting data maintained by the SCO’s California Leave Accounting System (leave accounting system).
Agency Purpose of Data: To perform a variety of functions necessary to accurately track leave system eligibility, state service credits, and leave benefit activity. For the period July 2004 through January 2010, the leave accounting system contained nearly 6.7 million leave benefit transactions for Corrections’ employees.
Purpose of Testing: To identify the amount of leave used and accrued by custody staff.
Data Reliability Determination: Sufficiently reliable.

Description of Data: Corrections’ authorized positions maintained by the SCO’s Position Roster File for fiscal years 2004–05 through 2008–09.
Agency Purpose of Data: The Position Roster File is a file of authorized positions used by departments to track positions. The Position Roster File indicated an average of 2,700 vacant Corrections’ custody staff positions for fiscal years 2004–05 through 2008–09.
Purpose of Testing: To identify the number of custodial positions authorized by Corrections.
Data Reliability Determination: Sufficiently reliable.

California Prison Health Care Services (2009-107.2, May 18, 2010)

Description of Data: California Prison Health Care Services’ Contract Medical Database (CMD) for fiscal year 2007–08.
Agency Purpose of Data: To combine data from 33 institutions into a centralized database to provide real‑time access to information, and to streamline the maintenance of contracts and vendor data. CMD contained nearly 403,000 expenditure transactions.
Purpose of Testing: To identify the costs associated with adult health care.
Data Reliability Determination: Sufficiently reliable.



CALIFORNIA PRISON INDUSTRY AUTHORITY
It Can More Effectively Meet Its Goals of Maximizing Inmate Employment, Reducing Recidivism, and
Remaining Self‑Sufficient
Date: May 24, 2011 | Report: 2010‑118

BACKGROUND

Created to offer inmates the opportunity to develop effective work habits and occupational skills and to reduce the
operating costs of the California Department of Corrections and Rehabilitation (Corrections), the California Prison
Industry Authority (CALPIA) employs over 5,000 inmates statewide. The inmates produce various products including
license plates, furniture, and agricultural commodities; and provide services such as laundry and printing through
CALPIA’s 25 manufacturing, service, and agricultural enterprises located at 20 of the 33 correctional institutions in the
State. CALPIA sells primarily to state agencies with the majority of its revenue—66 percent—coming from Corrections
and 10 percent from purchases by the Department of Motor Vehicles. Although state agencies may, if cost‑beneficial,
purchase from the private sector goods that may be available from CALPIA, they must first consider whether
CALPIA can meet their needs.
KEY FINDINGS

Our review of the operations of CALPIA revealed the following:
•	 It lacks the ability to measure the quantity and types of post‑release employment that its participants obtain and
thus, cannot determine the effectiveness of its programs.
»» Although its new tracking system contains information about inmate workers’ assignments and training,
neither it nor a consultant it hired could match employment data from the Employment Development
Department with data in the tracking system to determine post‑release employment.
»» Corrections’ parole tracking system—CalParole—could provide employment information for parolees;
however, we found that it often contained erroneous, inappropriate data. Of nearly 645,000 parolee
employment records we reviewed, at least 33,000 contained erroneous data.
•	 It developed comprehensive performance indicators for fiscal year 2010–11; however, it did not finalize a matrix
to track these indicators until March 2011—limiting their usefulness—and some indicators are not measurable or
are vague.
•	 The recidivism rates for inmates who worked at a CALPIA enterprise are lower than those for Corrections’
general‑population parolees, but other factors may have contributed to the lower recidivism rates.
•	 Even though the prices for almost half of CALPIA’s products and services that we evaluated were above the average
prices for comparable items, its five largest state agency customers saved an estimated $3.1 million during fiscal
year 2009–10 by purchasing those products and services from CALPIA.
•	 Participation in CALPIA’s enterprises has declined by 9.3 percent from June 2004 to June 2010: since 2004 CALPIA
has closed, deactivated, or reduced the capacity of six existing enterprises at 10 locations.
KEY RECOMMENDATIONS

We made several recommendations to CALPIA to allow it to measure progress in meeting its goals, including that
it ensure all its performance indicators are clear, measurable, and consistently tracked, and that it create a process
for management to review performance results. We further recommend that it maintain source documentation
when performing analyses for establishing product prices and calculating savings it brings to the State. We also made
recommendations to Corrections that were aimed at improving the reliability of the employment data contained in
CalParole, conducting periodic reviews of the parolee files to verify accuracy, and ensuring certain modifications
are made to its new Strategic Offender Management System database before CalParole data is transferred to the
new system.


Corrections and Rehabilitation, California Department of (2010-118, May 24, 2011)
Description of Data

Agency Purpose of Data

California Department of Corrections and Rehabilitation’s
(Corrections) CalParole Tracking System (CalParole)

To track parolee progress through the parole process and to update, add, or retrieve
parolee information. In addition, CalParole shares parolee data through interfaces with
the Department of Justice and local law enforcement agencies.
CalParole data contained more than 300 records pertaining to parole units, more
than 508,000 records pertaining to parolees, and nearly 645,000 records pertaining to
parolees’ job information.

Purpose of Testing

Data Reliability Determination

To assess an inmates’ post‑release employment success.

Not sufficiently reliable—Our electronic testing revealed that CalParole contains a
significant amount of erroneous data that precluded us from performing any viable
analysis. In addition, our review of hard‑copy parolee files further confirmed the
limitations of CalParole employment data.
Specifically, our review found that the employment fields within CalParole often
contained erroneous, inappropriate data. For example, of the nearly 645,000 parolee
employment records from CalParole that we reviewed, at least 33,000 (5 percent of
all records) contain erroneous data in the records’ field that is designed to identify a
parolee’s current employer.
Moreover, employer address fields for employed parolees often show no information
at all. Specifically, for more than 290,000 records that appeared to contain a valid
employer, the employer address fields for more than 20,000 of these records are blank.
Our testing of hard‑copy parolee files further confirmed the limitations of CalParole
employment data. We reviewed the files of 36 parolees from 12 parole units located
across the State whom CalParole listed as employed. For 31 of the 36 parolees, the
hard‑copy files did not contain pay stubs, letters from employers, or any other reliable
documents that would offer proof of employment to support the employment
information recorded in CalParole.

Agency Response Date

January 2012

Corrective Action Recommended

Status of Corrective Action

To improve the reliability of employment data contained
in CalParole, Corrections should ensure that parole agents
correctly follow procedures related to populating the data
fields and maintaining CalParole.

Pending—According to Corrections, it intends to release a policy memorandum in
April 2012 to provide direction to field staff about entering offender data into CalParole,
which will include detail on the integrity of employment information. Further, Corrections
indicates that it will release another policy memorandum in April 2012 outlining the
use of the parole performance index (PPI), a new tool used to monitor data input within
CalParole. The policy memorandum is to include instructions for managers to audit the
frequency and quality of CalParole updates. As of January 12, 2012, Corrections indicates
that executive management is using PPI while it is being finalized for release to parole staff
for general use.

In addition, supervisors of parole agents should conduct
periodic reviews of parolee files to verify whether
employment fields are completed appropriately and
whether employment is documented adequately.

Pending—In addition to existing department procedures that require parole agent
supervisors to review all cases subject to active supervised parole, Corrections indicated
that the new PPI is a secondary monitoring tool for parole agent supervisors to ensure
data put into CalParole is correct. As previously stated, the PPI is currently being used by
executive management while being finalized for release to parole staff for general use.

As Corrections prepares to move CalParole data into the
Strategic Offender Management System (SOMS), it should
modify existing employment‑related fields and add to SOMS
new fields that are currently not available in CalParole so that
Corrections can minimize the opportunity for erroneous data
entries and make employment data more reliable.

Pending—According to Corrections, it is in the process of modifying existing
employment‑related fields in SOMS to capture information in a more thorough and
detailed manner than CalParole currently does.


Description of Data

Agency Purpose of Data

Corrections’ Offender Based Information System (OBIS)

To capture and maintain all adult offender information from the time that the
offenders are committed to Corrections through the time of their discharge.
OBIS contained more than 29.3 million inmate movement and status change records.

Purpose of Testing

Data Reliability Determination

To track inmate movements into and out of Corrections’
institutions for the purpose of calculating the recidivism
rates for both Corrections’ general‑population parolees and
the California Prison Industry Authority’s (CALPIA) parolees.

Undetermined reliability—To test the accuracy of the data, we randomly selected
a sample of 29 records from the OBIS data files obtained from Corrections and
conducted tests to ensure the data contained in those records could be matched to
source documents. For five of our 29 sample items, Corrections was unable to provide
documentation supporting the entries it keyed into the data fields used to specify
the type of inmate movement, the date the movement occurred, and the institution
reporting the movement. In addition, we did not test the completeness of the OBIS
data because the source documents required for this testing lack a centralized storage
location and are instead stored at the 33 institutions located throughout the State.
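
Where the report mentions randomly selecting 29 records for accuracy testing, the underlying procedure is a
simple random sample whose records are then traced to source documents. A minimal, reproducible sketch in
Python follows; the seed and record count are illustrative, not the audit’s actual parameters.

    import random

    def draw_accuracy_sample(num_records, sample_size=29, seed=2012):
        # Draw a reproducible simple random sample of row indexes; each
        # sampled record is then traced to its hard-copy source document.
        rng = random.Random(seed)
        return sorted(rng.sample(range(num_records), sample_size))

    # Usage: sample 29 of the ~29.3 million OBIS movement records.
    sample_rows = draw_accuracy_sample(29_300_000)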

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A


SEX OFFENDER COMMITMENT PROGRAM
Streamlining the Process for Identifying Potential Sexually Violent Predators Would Reduce
Unnecessary or Duplicative Work
Date: July 12, 2011	

Report: 2010‑116

BACKGROUND

Sex offenders who are identified and designated, through the Sex Offender Commitment Program (program), as
sexually violent predators (SVPs)—those who represent the highest risk to public safety due to mental disorders—may
be committed by the courts to a state hospital for treatment rather than released from prison. The program consists of
various key players. The Department of Corrections and Rehabilitation (Corrections) and its Board of Parole Hearings
(Parole Board) review certain sex offenders scheduled for release or parole to determine whether the offenders meet
the criteria for SVPs as defined by law. If Corrections and its Parole Board determine an offender meets the criteria, the
law requires that they refer the offender to the Department of Mental Health (Mental Health). Mental Health assesses
potential SVPs using administrative reviews, clinical screenings, and evaluations to determine whether to recommend
an offender to the designated county counsel, who files a petition to commit the offender if the counsel agrees with
the recommendation.
KEY FINDINGS

During our review of the program, we noted the following:
•	 Current inefficiencies in the program’s process of evaluating potential SVPs are partly due to Corrections’
interpretation of state law and were compounded by Jessica’s Law—a proposition approved by voters in 2006.
»» Corrections refers all offenders convicted of sexually violent offenses to Mental Health rather than assessing
whether offenders’ crimes were predatory and if the offenders meet other criteria before referring them as
potential SVPs. Of the nearly 31,500 referrals Corrections made over a five‑year period, less than 2 percent
were ultimately recommended to designated counsel for commitment.
»» Jessica’s Law added more crimes to the list of sexually violent offenses and reduced the number of victims
required for SVP designation, resulting in many more offenders becoming potentially eligible for commitment
under the program—the number of Corrections’ referrals to Mental Health ballooned from 1,850 in 2006 to
8,871 in 2007.
•	 Corrections does not consider whether Mental Health determined, based on a prior referral, that an offender did
not meet the criteria to be an SVP, and thus it re‑refers the offender to Mental Health. Of the offenders Corrections
referred between 2005 and 2010, 45 percent were referred at least twice and 8 percent of those were referred
between five and 12 times.
•	 During a three‑year period, Corrections failed to refer many offenders to Mental Health at least six months before
their scheduled release as required by law—in one case, the referral came one day before the scheduled release.
•	 Because it has made limited progress in hiring and training more staff, Mental Health has used between 46 and
77 contractors each year to perform its evaluations and clinical screenings and has not reported to the Legislature
about its efforts to hire state employees as evaluators or the effect of Jessica’s Law.
KEY RECOMMENDATIONS

We make several recommendations aimed at eliminating duplicative effort and increasing program efficiency, including
that Corrections not make unnecessary referrals to Mental Health and that Corrections and Mental Health jointly
revise the screening instrument used to refer offenders to Mental Health. Further, we recommend that Corrections
promptly make referrals to allow Mental Health sufficient time to complete screenings and evaluations. We also
recommend that Mental Health continue its efforts to obtain enough qualified staff to perform evaluations and report
its progress to the Legislature.


Corrections and Rehabilitation, California Department of (2010-116, July 12, 2011)
Description of Data

Agency Purpose of Data

California Department of Corrections and Rehabilitation’s
(Corrections) Offender Based Information System (OBIS)

To capture and maintain all adult offender information from the time that the
offenders are committed to Corrections through the time of their discharge.
OBIS contained more than 1.3 million records related to offenders and more than
3 million records related to offenses.

Purpose of Testing

Data Reliability Determination

To identify the number of referrals that ultimately resulted
in an offender being committed as a sexually violent
predator (SVP), and to calculate the recidivism rate of those
not committed as SVPs.

Undetermined reliability—For accuracy testing, we selected a random sample of
29 offenders and tested the accuracy of 12 key fields related to these offenders,
finding eight errors. However, we did not perform completeness testing because the
documents needed are stored at the 33 correctional institutions throughout the State,
making such testing impractical.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A

Mental Health, Department of (2010-116, July 12, 2011)
Description of Data

Agency Purpose of Data

Department of Mental Health’s (Mental Health) Sex
Offender Commitment Program Support System (database)

To track data related to each inmate who meets the SVP criteria and all of that inmate’s
referrals for civil commitment.
Mental Health’s database contained more than 37,000 records gathered in support
of identifying, screening, and evaluating cases, in addition to approximately 26,000
records related to the scheduling, disposition, and outcome of evaluations.

Purpose of Testing

Data Reliability Determination

To identify the number of referrals made by Corrections
to Mental Health, the number of referrals at each step in
the SVP commitment process, and the extent to which
contractors perform evaluations.

Not sufficiently reliable—For accuracy testing, we selected a random sample of
29 referrals and tested the accuracy of 21 key fields for these referrals. Of the 21 key
fields tested, we found three errors in six key fields.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A


DEPARTMENT OF CORRECTIONS AND REHABILITATION
The Benefits of Its Correctional Offender Management Profiling for Alternative
Sanctions Program Are Uncertain
Date: September 6, 2011	

Report: 2010‑124

BACKGROUND

Charged with overseeing an estimated 163,000 inmates and 107,000 parolees, the Department of Corrections and
Rehabilitation (Corrections) intends to use the Correctional Offender Management Profiling for Alternative Sanctions
(COMPAS) software to identify factors that cause inmates to commit crimes so they can participate in rehabilitative
programs such as substance abuse treatment or vocational education programs. The goal is to reduce the likelihood
of reoffending, thereby reducing overcrowding in the State’s prisons and lowering its recidivism rates. There are
two assessments—a core assessment that identifies the needs of inmates entering prison and a reentry assessment
that evaluates inmates who are about to reenter society on parole.
KEY FINDINGS

During our review of Corrections’ use of COMPAS, we determined the following:
•	 It is uncertain whether COMPAS will help Corrections ultimately reduce prison overcrowding and lower its
recidivism rates.
»» Although Corrections began using COMPAS assessments at all of its 12 reception centers in 2008,
eight centers indicated that these assessments do not play a significant role in deciding where inmates
should be housed and, by extension, which rehabilitative programs inmates might access at those facilities.
»» Corrections lacks rehabilitative programs that address all five COMPAS‑identified needs—it currently
has rehabilitative programs that only address two—academic/vocational education and substance
abuse treatment.
»» Its academic/vocational education programs have more than five times the capacity of its substance abuse
treatment programs, even though substance abuse is the need most often cited by COMPAS.
»» Inmates are not consistently compelled to follow their COMPAS case plan as a condition of parole and parole
agents do not routinely use the information to develop case plans or supervise parolees.
»» Corrections has not issued the required regulations on COMPAS nor provided training to some staff on how
to use the tool.
•	 COMPAS assessments do not seem to be a key factor in determining whether an inmate gets into the in‑prison
substance abuse program—only 800 of the 2,600 inmates who had a COMPAS‑identified moderate to high substance
abuse treatment need and were housed in the institutions that offer this program were assigned to the program.
A larger number of inmates in the program either did not receive an assessment or were assessed as having a low
substance abuse treatment need.
•	 A significant number of inmates—73.5 percent of those incarcerated between July 1, 2010, and
February 20, 2011—had not received a COMPAS core assessment.
•	 Corrections does not have records that show how much it cost to deploy and administer COMPAS to its parole
units and reception centers and did not establish an accounting system to track such costs. Further, it has reported a
total of $14.6 million in actual COMPAS costs that it could not explain.
KEY RECOMMENDATIONS

We make many recommendations, including that Corrections suspend its use of COMPAS assessments until it has
issued regulations and updated its operations manual for using COMPAS. We also recommend that Corrections
develop a plan to measure and report COMPAS’s effect on reducing recidivism and that, once it resumes its use of
COMPAS, it provide ongoing training to staff who administer assessments. Additionally, Corrections should disclose
that it has not tracked the costs of COMPAS and should develop policies to ensure appropriate tracking and reporting
of the costs of future information technology projects.


Corrections and Rehabilitation, California Department of (2010-124, September 6, 2011)
Description of Data

Agency Purpose of Data

California Department of Corrections and Rehabilitation’s
(Corrections) Offender Based Information System (OBIS)

To capture and maintain all adult offender information from the time that the
offenders are committed to Corrections through the time of their discharge.
OBIS contained more than 29.8 million inmate movement and status change records.

Purpose of Testing

Data Reliability Determination

To identify the number of unique inmates who were
released to parole during the period of July 1, 2007,
through February 20, 2011.

Undetermined reliability—For accuracy testing, we selected a random sample of
29 transactions and tested the accuracy of nine key fields. Of the nine key fields tested,
we found errors in two key fields. However, we did not perform completeness testing
of OBIS because the source documents for this system are stored at the 33 adult
inmate institutions located throughout the State, making such testing impractical.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A

Description of Data

Agency Purpose of Data

Corrections’ Correctional Offender Management Profiling
for Alternative Sanctions (COMPAS) database

To assist criminal justice practitioners in the placement, supervision, and case
management of offenders in community and secure settings.
COMPAS contained information related to more than 221,000 core assessments and
more than 43,000 reentry assessments.

Purpose of Testing
To identify the number of inmates who received at least
one COMPAS assessment and the number of inmates
housed in an institution or camp on February 20, 2011,
who were identified as having a moderate to high criminal
risk factor in any of the five areas assessed by the COMPAS
core assessment.

Data Reliability Determination*
Not sufficiently reliable†—We did not perform accuracy and completeness testing of
COMPAS because it is a paperless system, and thus, hard‑copy source documentation
was not available for us to review. Instead, following U.S. Government
Accountability Office (GAO) guidelines, we reviewed selected system controls, which
included general and business process application controls. General controls are the
policies and procedures that apply to all or a large segment of Corrections’ information
systems and help ensure their proper operation. Business process application
controls are directly related to a specific computerized application, COMPAS in this
case, and help to ensure that transactions are complete, accurate, and available. In
conducting our review, we identified significant weaknesses in the general controls
Corrections implemented over its information systems, which we reported in a
confidential management letter due to the sensitivity of the information provided.
Further, the strength of general controls is a significant factor in determining the
effectiveness of business process application controls. Therefore, because we
identified pervasive weaknesses in the general controls Corrections implemented
over its information systems, we did not perform any testing of the COMPAS business
process application controls.

Corrective Action Recommended

Status of Corrective Action

Corrections should ensure that all policy requirements
included in the State Administrative Manual, Chapter 5300,
are fully implemented and updated on a regular
basis to strengthen the general controls over its
information systems.

Corrections’ response regarding its efforts to implement our recommendation was not
due until October 3, 2012; therefore, we did not include its response in this report.

*	 We reported the specifics of our review of the COMPAS database in a separate management letter, rather than a publicly available report.
†	 Our data reliability assessment, which relied on a review of selected system controls, based the determination of not sufficiently reliable
on Section 7.70b(1) of the GAO’s July 2007 version of Government Auditing Standards, which states that evidence is not sufficient or
not appropriate when using the evidence carries an unacceptably high risk that it could lead to an incorrect or improper conclusion.


CALIFORNIA’S CHARTER SCHOOLS
Some Are Providing Meals to Students, but a Lack of Reliable Data Prevents the
California Department of Education From Determining the Number of Students Eligible
for or Participating in Certain Federal Meal Programs
Date: October 21, 2010	

Report: 2010‑104

BACKGROUND

Although part of the public school system and serving students in kindergarten through grade 12, California’s 815 active
charter schools operate independently from the existing school district structure. For example, charter schools are not
subject to the law that provides for needy students to receive one nutritionally adequate free or reduced‑price meal
during each school day. Similar to school districts, participation by charter schools in the federal School Breakfast
Program (breakfast program) and the National School Lunch Program (lunch program) is voluntary. The California
Department of Education (Education) maintains several databases that provide various levels of information regarding
traditional and charter schools.
KEY FINDINGS

Our review of the California charter schools’ child nutrition programs revealed the following:
•	 Although Education maintains numerous databases with varying information relating to schools, students,
applications, and child nutrition, we could not rely on the databases to determine the exact number of charter
schools and their students participating in the breakfast and lunch programs.
»» Its paperless application database system lacks an internal control process to ensure the accuracy of
certain data.
»» It does not verify certain information on the schools’ site applications—such as the site type—and we found
errors related to certain codes and site types.
»» It allows school food authorities to combine information for their sites before entering it into the database
and thus, it cannot differentiate between charter school students and students from traditional schools who
participate in the programs.
•	 Despite the data limitations, we identified 815 active charter schools—over half (451) appear to participate in the
breakfast and lunch programs, and 151 appear to provide instruction outside the classroom and thus would not
participate in the programs.
•	 Of the remaining 213 charter schools, 133 responded to our survey. Of those, 39 did not provide meals because they
lack resources such as funding, staff, and facilities to prepare and deliver meals, while 46 offer an alternative meal
program. The remaining schools stated that they participate in the programs, do not provide meals because of the
structure of the school, or have students whose ages make them ineligible to participate in the programs.
KEY RECOMMENDATIONS

We made several recommendations to Education regarding the charter schools’ child nutrition programs, including
the following:
•	 Establish internal control processes within its electronic application system to ensure the reliability of
certain information such as the number of students enrolled and students’ eligibility for receiving free and
reduced‑price meals.
•	 Ensure the accuracy of the child nutrition information and payment system by discontinuing allowing school food
authorities to combine information from more than one school site, modifying its review tools to verify information
on schools’ applications, and requiring school food authorities to establish review procedures for data they enter into
one of its systems.


Education, California Department of (2010-104, October 21, 2010)
Description of Data

Agency Purpose of Data

California Department of Education’s (Education)
Consolidated Application Data System (ConApp database)

To provide a process for local educational agencies to complete Education’s
Consolidated Application for Funding Categorical Aid Programs (application),
which Education uses to distribute categorical funds from various state and
federal programs.
The ConApp database showed enrollment of more than 6.1 million students between
the ages of 5 and 17, of which nearly 3.3 million students were eligible to receive free
or reduced‑price meals.

Purpose of Testing

Data Reliability Determination

To determine the number of traditional and charter schools and their students eligible
for free and reduced‑price meals.

Not sufficiently reliable—The ConApp database is a paperless system, meaning
the local educational agencies and direct‑funded charter schools enter the data
directly into the database. Typically, we assess the reliability of paperless databases
by reviewing the adequacy of system controls in place. However, Education has
not established an internal control process over the ConApp database to ensure
the accuracy of the three data fields designed to capture the number of students
enrolled at the school level, the number of enrolled students who are eligible
to receive free meals, and the number of enrolled students who are eligible to
receive reduced‑price meals. Although we expected Education to have an internal
control process of this kind, such as a systematic audit or review of supporting
documentation, it had not established one.
Further, Education’s ConApp database instructions require the local educational
agencies and direct‑funded charter schools to electronically certify that they have
fulfilled the requirements; the instructions do not state that they should retain
the documentation.
Education requires local educational agencies applying for categorical aid program
funds to submit their information into the ConApp database. However, according to
an administrator in its Data Management Division, there is no state or federal law that
gives Education the authority to require charter schools to submit the application.
Therefore, complete data on the number of charter schools and their students eligible
for free and reduced‑price meals may not be available.
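
A systematic review of the three fields could start with an automated cross‑field check before any documentation
is examined. The sketch below is illustrative only, with hypothetical field names; it flags entries whose eligibility
counts are internally inconsistent with enrollment:

    def conapp_sanity_check(row):
        # row: one school's entry with hypothetical keys 'enrolled',
        # 'free_eligible', and 'reduced_eligible' (all integers).
        problems = []
        if min(row["enrolled"], row["free_eligible"], row["reduced_eligible"]) < 0:
            problems.append("negative count")
        if row["free_eligible"] + row["reduced_eligible"] > row["enrolled"]:
            problems.append("eligible students exceed enrollment")
        return problems

    example = {"enrolled": 500, "free_eligible": 320, "reduced_eligible": 210}
    print(conapp_sanity_check(example))  # ['eligible students exceed enrollment']

Such a check cannot substitute for reviewing supporting documentation, but it would catch the most obvious
entry errors automatically.
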
Agency Response Date

November 2011

Corrective Action Recommended

Status of Corrective Action

To ensure the reliability of the ConApp database fields
related to the number of students enrolled at the school
level, the number of those enrolled students who are
eligible to receive free meals, and the number of those
students who are eligible to receive reduced‑price meals,
Education should do the following:
•	Modify its ConApp database instructions to require
local educational agencies and direct‑funded charter
schools to retain their documentation supporting the
three data fields for a specified period of time.

Fully implemented—Education modified its ConApp instructions to require local
educational agencies and direct‑funded charter schools to retain documentation
supporting reported data in accordance with state and federal records retention
requirements. The clause requires each recipient of federal funds to maintain records
that will facilitate an effective financial or programmatic audit for three years after the
completion of the activity for which the funds are used.

•	Establish an internal control process such as a
systematic review of a sample of the local educational
agencies’ and direct‑funded charter schools’
supporting documentation.

No action taken—Education stated that to strengthen existing internal control
processes, it reviews a sample of the local educational agencies’ and direct‑funded
charter schools’ supporting documents as a part of its Coordinated Review Effort
(CRE) process. However, Education’s procedures for its CRE process specifically state it
does not review information in its ConApp database. Therefore, Education has yet to
adequately address our recommendation.


Description of Data

Agency Purpose of Data

Education’s Nutrition Services Division’s (nutrition
services) Child Nutrition Information and Payment
System (CNIPS) database

To help local program sponsors administer state and federal nutrition programs. CNIPS
enables sponsors to submit online reimbursements, view the status of applications and
meal reimbursement claims, and access site and sponsor information across programs.
CNIPS contained information pertaining to 407 charter schools participating in the
Federal Reduced Price Meal Program.

Purpose of Testing

Data Reliability Determination

To identify the number of charter schools and their
students currently participating in the breakfast or
lunch program.

Not sufficiently reliable—We identified omissions in a key data field during our electronic
logic testing. Specifically, we found that the county‑district‑school (CDS) code data
field was blank in 12.5 percent of the instances. Further, we could not conduct accuracy
testing because nutrition services no longer updates its hard‑copy documents.
Therefore, we could not verify data in the system against source documents. Nutrition
services performs administrative reviews to meet federal regulations related to the
lunch program. However, its review does not include the data elements we consider
key to this analysis. To test the completeness of the data, we haphazardly selected
a sample of 29 charter schools identified as participating in the breakfast and lunch
programs by obtaining their applications on file at nutrition services to ensure that they
were included in the data we received. In all but one instance we were able to find the
unique identifier associated with a charter school. In that one instance the school did
not appear in the data because its application was pending the school food authority’s
verification for fiscal year 2009–10, which had not been completed by the date of the
data we received. However, we were not able to verify the charter school name in three
of 29 instances due to the lack of updated source documents. We also attempted to
identify charter school students participating in the breakfast and lunch programs
by obtaining information from Education’s CNIPS database. However, Education does
not require the school food authorities to report monthly claims for each of their sites
separately. Therefore, although Education can report the total number of students, it
cannot differentiate between charter school students and traditional students who are
participating in these programs.
Finally, during our review we noted that nutrition services requires the school food
authorities to enter the CDS codes for their public school district sites but not for
other site types, such as charter schools. Consequently, three charter schools had
CDS codes in the CNIPS database that did not match the CDS codes in the Charter
Schools Database, and eight charter schools had no CDS codes in the CNIPS database.
Also, two charter schools participating in the breakfast and lunch programs were
misidentified on the school food authorities’ applications, one as a private school
and one as a county office of education. Nutrition services performs reviews of a
sample of the schools under the jurisdiction of the school food authorities each year,
in accordance with federal regulations, to ensure that the requirements of the lunch
program are being met. However, nutrition services’ review tool does not include a
procedure for verifying the accuracy of the CDS code or the site type reflected on
the schools’ site applications. Nutrition services stated that it is the charter schools’
responsibility to enter the CDS code into the CNIPS database but that there is no
requirement for them to do so.
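
The blank‑field and mismatch findings above are the kind of result a simple cross‑database comparison produces.
A sketch follows, assuming hypothetical field names and using school names as the join key between the two
data sources:

    def check_cds_codes(cnips_rows, charter_db):
        # charter_db: mapping of school name -> CDS code from the
        # Charter Schools Database (hypothetical structure).
        blank, mismatched = [], []
        for row in cnips_rows:
            cds = (row.get("cds_code") or "").strip()
            name = row["school_name"]
            if not cds:
                blank.append(name)
            elif name in charter_db and charter_db[name] != cds:
                mismatched.append(name)
        return blank, mismatched

A recurring query of this general form is essentially what Education’s Data Management Unit later adopted, as
noted in the corrective action status below.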


Agency Response Date

December 2011

Corrective Action Recommended

Status of Corrective Action

To ensure the accuracy of the CNIPS database, Education
should do the following:
•	Direct the school food authorities to establish
internal control procedures to ensure the accuracy
of the application information they enter into the
CNIPS database.

Fully implemented—Education’s CNIPS application includes a “certification” check
box that school food authorities must check in order to submit the application. In
addition, Education posted a notice on the first screen of the CNIPS advising sponsors
of their responsibility to ensure that they report accurate information. Education also
stated that beginning with the 2011–12 school year it will further ensure the accuracy
of the application information by including a clause in the annual instructions to
remind school food authorities of their responsibility to ensure that they report
accurate CNIPS information, to clarify that charter schools be identified as such and
not as public schools, and to suggest that a second person review the information for
accuracy before the school food authorities submit the information to Education.

•	Direct nutrition services to modify the tool used to
review a sample of the school food authorities’ schools
to include a procedure for verifying the accuracy
of the CDS code and site type reflected on the
schools’ applications.

Fully implemented—Education’s nutrition services’ Data Management Unit has a
procedure in place to run a query every month that identifies charter schools and
public schools that are not displaying CDS codes in the CNIPS database. In addition,
the query ensures the name and address data in the CNIPS database matches the
information on the Charter School Web site and in the online Public School Directory.
Education’s staff are to resolve any discrepancies.

To ensure that it maximizes the benefits from the State’s
investment in the CNIPS database, Education should do
the following:
•	Require the school food authorities to submit
a monthly Claim for Reimbursement for each
site under their jurisdiction in addition to their
consolidated claims.

Partially implemented—Education’s nutrition services has updated its New Sponsor
Applications desk manual to instruct analysts to set new agencies, schools, and
Residential Child Care Institutions to site‑level reporting. Education also requires these
entities to submit their monthly claims for reimbursement at the site level. However,
Education does not plan to require existing school food authorities to submit their
monthly claims for reimbursement until July 1, 2012.

•	Establish a timeline for the school food authorities to
comply with the requirement.

Partially implemented—Education stated that site‑level reporting will be mandatory
for all school food authorities on July 1, 2012. Education stated it has communicated
the transition to site‑level reporting via personal discussions and mass e‑mails when
deemed necessary. In addition, Education stated it has announced the July 1, 2012,
site‑level reporting start during training presentations at various conferences. Further,
Education stated it expects to send a Management Bulletin in December 2011 to
inform school food authorities of the mandatory site‑level reporting requirement.


DEPARTMENT OF PUBLIC HEALTH
It Reported Inaccurate Financial Information and Can Likely Increase Revenues for the
State and Federal Health Facilities Citation Penalties Accounts
Date: June 17, 2010	

Report: 2010‑108

BACKGROUND

More than 2,500 long-term health care facilities (facilities) are licensed and monitored by the Department of
Public Health (Public Health)—previously known as the California Department of Health Services. In addition to
ensuring that these facilities comply with state requirements, Public Health also ensures that facilities accepting
Medicare and Medicaid payments meet federal requirements through a cooperative agreement with the Centers
for Medicare and Medicaid Services (CMS)—an agency within the U.S. Department of Health and Human Services.
Public Health may impose monetary penalties, known as a Civil Money Penalty, on facilities to address noncompliance
it finds during an inspection or complaint investigation of a facility. Although Public Health determines whether
facilities comply with applicable state and federal requirements, it generally issues and collects monetary penalties
resulting from noncompliance with state requirements only—CMS is responsible for assessing and collecting
monetary penalties resulting from noncompliance with federal requirements.
KEY FINDINGS

During our review of the State and Federal Health Facilities Citation Penalties accounts (state and federal accounts)
and Public Health’s management of those accounts, we noted that Public Health:
•	 Excluded certain financial information when preparing the federal account’s fund condition statements for at least
five years, causing an overstatement of $9.9 million as of June 30, 2009.
•	 Made significant prior-year adjustments attempting to resolve inaccuracies, yet the federal account fund balance
will be near insolvency by June 30, 2011—the projected balance is expected to decline to roughly $249,000.
•	 Inappropriately reduced the monetary penalties for 135 citations because of a calculation made by its automated
system, resulting in a loss of approximately $70,000 in revenue for the state account.
•	 May have created an incentive for facilities to appeal citations due to delays in processing those appeals and its
history of significantly reducing original penalties imposed.
»» As of February 2010, more than 600 citations were backlogged awaiting citation review conferences—some were
contested roughly eight years ago; other types of hearings take precedence over citation review conferences.
»» As of March 15, 2010, nearly $9 million in monetary penalties were still under appeal, and for 243 of
313 appealed citations, Public Health granted reductions amounting to $2.7 million. In one case, Public Health
reduced the original monetary penalty from $100,000 to just $1,000.
•	 Could have generated nearly $95,000 in interest from nearly $6.7 million in monetary penalties under appeal if it
could establish an interest-bearing account in which it would deposit appealed monetary penalties at the time the
citations are contested.
•	 Could increase revenue—some monetary penalties have not been adjusted for inflation for as many as 25 years,
and Public Health could have collected nearly $3.3 million more than it actually did if state law had adjusted
monetary penalties to reflect the Consumer Price Index. Also, it has not surveyed the majority of facilities to ensure
their compliance with state requirements.
KEY RECOMMENDATIONS

We made several recommendations to the Legislature and Public Health to increase revenue for both the state
and federal accounts through changes to state law and by ensuring that Public Health adheres to current law. Such
changes to state law include periodically adjusting monetary penalties to reflect the rate of inflation; amending
the citation review conference process to reflect the federal process more closely; and requiring facilities that
contest their monetary penalties to pay the penalties upon appeal, with Public Health allowed to deposit the
penalties in an interest‑bearing account. Other recommendations include changing Public Health’s policies and
procedures to ensure fund balances are not overstated and conducting all state surveys of facilities every two years,
as required by law.


Public Health, Department of (2010-108, June 17, 2010)
Description of Data

Agency Purpose of Data

Department of Public Health’s (Public Health) Electronic
Licensing Management System (ELMS) for the period
covering fiscal year 2003–04 through March 15, 2010

To provide Public Health’s Licensing and Certification Division with an intranet
application tool to manage the State’s licensing of over 30 types of health
care facilities.
ELMS contained information related to approximately 5,400 penalties from
July 1, 2003, through March 15, 2010, and more than 18,000 health care facilities.

Purpose of Testing

Data Reliability Determination

To determine the number of citations for which penalties
were imposed and collected, the amounts of the
penalties imposed and collected, the number of appeals
and the monetary amounts associated with them, and the
timeliness of payments.

Undetermined reliability—In our electronic testing of key data elements, we identified
seven instances where ELMS indicated that a citation had received a decision by Public
Health or an external party but was not coded as having first been appealed.
Further, other ELMS coding issues prevented us from determining which entity (i.e.,
Public Health or an external party) rendered a decision on 48 appealed citations.
Finally, we did not conduct completeness testing because the source documents
required for this testing are stored at Public Health’s Licensing and Certification
Division’s 18 district offices located throughout the State.
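
The coding issue described above is a cross‑field consistency failure: a decision is recorded for a citation that
was never coded as appealed. A check of that form can be expressed in a few lines of Python; the field names
here are hypothetical, not ELMS’s actual schema:

    def flag_inconsistent_citations(citations):
        # A citation carrying a review decision should also be coded
        # as having been appealed in the first place.
        return [c["citation_id"] for c in citations
                if c.get("decision_entity") and not c.get("appealed")]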

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A


DEPARTMENT OF DEVELOPMENTAL SERVICES
A More Uniform and Transparent Procurement and Rate‑Setting Process Would Improve the
Cost‑Effectiveness of Regional Centers
Date: August 24, 2010	

Report: 2009‑118

BACKGROUND

Approximately 240,000 Californians with developmental disabilities (consumers) receive community‑based services
from California’s network of 21 regional centers, which are private, nonprofit entities created by law in order to
allow the State to meet its responsibility for providing services and support to consumers. The Department of
Developmental Services (Developmental Services) oversees the regional centers through five‑year contracts. Regional
centers authorize vendors to provide services to consumers, and establish many reimbursement rates for the vendor
services. Regional centers can also enter into contracts for certain services.
KEY FINDINGS

During our review of Developmental Services and six regional centers, we noted the following:
•	 Developmental Services conducts various reviews of regional centers, but provides little oversight of vendor
selection and how rates are negotiated or established even though regional centers set the rates for 96 of the
155 types of services.
•	 Although the regional centers’ expenditures that we reviewed were generally allowable, they did not always
maintain documentation of their processes.
•	 Regional centers set rates using different methodologies, often do not keep documentation demonstrating how
rates were set, and in certain instances gave the appearance of favoritism or fiscal irresponsibility. Of the 61 rates we
examined, we found the following:
»» We could not determine how rates were set for 26, and only 18 were established using a detailed cost
statement from the vendor—a method we considered a best practice.
»» Five rates set at four of the six regional centers we visited appeared to violate a rate freeze required by law—in
two instances the regional center approved rates almost twice as high as the statewide median rate for the
same service.
•	 Regional centers do not have written policies indicating when they are to use rate agreements and when they are to
use contracts, nor do they document their rationale for selecting certain vendors. One regional center paid a vendor
almost $1 million through a rate agreement without adequately specifying the deliverables to be received.
•	 Of the 33 contracts we evaluated, only nine were advertised, and only four showed evidence of a competitive
process—the type of process that ensures the State is getting the best value.
•	 Almost half of the roughly 400 regional center employees who responded to our survey do not feel safe reporting
suspected improprieties. Also, many indicated that the regional centers do not create an atmosphere of mutual trust
or establish open communication.
•	 At the time of our fieldwork, we were unable to test Developmental Services’ process for responding to complaints
from regional center employees because it did not log, track, or have a written process for such complaints.
KEY RECOMMENDATIONS

We made numerous recommendations to Developmental Services including that it should provide more oversight
and issue more guidance to regional centers for preparing and adhering to written procedures regarding rate‑setting,
vendor selection, and procurement processes to ensure consumers receive high‑quality, cost‑effective services that
meet the goals of the consumers and the program. Other recommendations included that Developmental Services
monitor the regional centers’ adherence to laws, regulations, and new processes by enhancing the level of reviews to
include examining rate‑setting, vendor selection, and procurement practices at the regional centers and to adhere to its
newly documented process for receiving, tracking, and investigating complaints from regional center employees.


Developmental Services, Department of (2009-118, August 24, 2010)
Description of Data

Agency Purpose of Data

Department of Developmental Services’ Uniform Fiscal
System (UFS)

To establish and track regional center authorization and billing data including vendor
number, purchase authorization number, consumer identification and eligibility
information, service code, service rate, claim amount, and claim date.
The UFS contained records for nearly 340,000 regional center vendors.

Purpose of Testing

Data Reliability Determination

To identify the total expenditures for claims against
purchase‑of‑service expenditures.

Sufficiently reliable.


EMPLOYMENT DEVELOPMENT DEPARTMENT
Its Unemployment Program Has Struggled to Effectively Serve California’s Unemployed in the Face of
Significant Workload and Fiscal Challenges
Date: March 24, 2011	

Report: 2010‑112

BACKGROUND

Unemployed workers (claimants) who meet certain requirements find temporary financial assistance through the unemployment
insurance program (unemployment program). Administered by the Employment Development Department (department), the
unemployment program is financed by employers who pay state unemployment taxes. A federal unemployment tax goes directly
to the federal government to pay for administering the unemployment program. The rising unemployment rate—up
132 percent from June 2007 to June 2010—and the 148 percent increase in the number of claims the department
processed over the same period were two reasons why, in January 2009, the State’s Unemployment Fund became insolvent. To
continue making benefit payments, the department obtained loans from the federal government. However, this outstanding loan
balance could ultimately cost California employers an estimated $6 billion annually.
KEY FINDINGS

Our review of the department’s administration of the unemployment program revealed the following:
•	 Due to its prolonged poor performance related to core measures, the United States Department of Labor (federal labor
department) designated California as an “At Risk” state. From 2002 to 2011, the department did not meet acceptable
performance levels as set by the federal labor department.
»» It failed to make timely first payments to claimants—in 2007 the department was making fewer than 80 percent of its
first payments within 14 days of the first allowable week and only reached this goal 62 percent of the time in 2010.
»» It did not determine whether a claimant could receive benefits within 21 days from when it detected issues—by 2010
only 43 percent of these determinations were made timely.
•	 Recently, the department improved its performance when it increased the number of staff and allowed overtime.
However, most of its long‑term corrective actions have not improved its ability to issue timely first payments or
promptly determine eligibility.
»» Only one of its nine major information technology projects has been fully implemented and is available to claimants.
»» Two other projects may ultimately improve the department’s performance, but have not been implemented.
»» Despite developing a new phone system to increase the public’s access to unemployment services, callers may
continue to experience difficulties in reaching agents.
•	 The State may risk forfeiting $839 million in federal stimulus funds if the department does not implement, by
September 2012, certain changes to its base period that could allow additional claimants to qualify for benefits.
•	 The department has taken an average of four or more weeks to determine if a claimant is eligible for the California Training
Benefits program (training benefits program), during which time the claimant did not receive unemployment benefits.
KEY RECOMMENDATIONS

We made numerous recommendations to the department, including the following:
•	 Enhance its corrective action planning process to improve the unemployment program by identifying those
actions that address timeliness measures, developing specific milestones for corrective action, and establishing key
performance benchmarks.
•	 Assess the more robust management information now available to develop strategies and measurable goals related
to limiting calls that require agent intervention and to ensure that callers are able to access the voice response
portion of the new phone system.
•	 Maximize federal funding by closely monitoring its resources and project schedule to avoid delays in implementing the
alternate base period by the federal deadline.
•	 Better track and improve timeliness and assist claimants in understanding requirements to enroll in the training
benefits program.


Employment Development Department (2010-112, March 24, 2011)
Description of Data

Agency Purpose of Data

Employment Development Department’s (EDD) California
Training Benefits program’s (training benefits program)
Streamline Tracking System (streamline database)

To track the length of time required for EDD’s training benefits program
determinations. The streamline database also captures information related to the
training and sponsoring program provided on a training benefits program application,
the reason that an application cannot be processed, and the final processing status of
an application (completed, unprocessed, or denied).
The streamline database contained nearly 2,900 records related to training benefits
program applications for the period March 2010 through June 2010.

Purpose of Testing

Data Reliability Determination

To determine the average duration for EDD to process an
application from receipt until a determination was made.

Not sufficiently reliable—We identified omissions in three key data fields during
our electronic logic testing. In 5 percent of the records we analyzed, we found
that although the determination status indicated it was complete, the fields for
the training benefits program determination decision and the date EDD made the
eligibility determination were blank. Similarly, in 6 percent of the records we analyzed,
we found claim records identified as complete in which the field specifying the
program under which the training was conducted was blank. Finally, to test the accuracy of
the streamline database data, we randomly selected a sample of 29 records from the
streamline database and traced key data elements to source documents. We identified
several errors during this accuracy test. We found two instances in which the data
fields identifying the date EDD completed a determination and the training benefits
program determination decision did not match hard‑copy source documents. In
addition, we found one instance in which the field containing the date EDD received a
Training Enrollment Verification (TEV) form did not match the source documentation.
After finding this error, we increased our accuracy sample from 29 to 46 records for
this particular data field. Our testing subsequently identified another error, for a total
of two of 46 records containing errors related to the date EDD received a TEV form.
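
The omissions described above follow a single pattern: records marked complete whose determination fields are
blank. A sketch of that logic test follows, with hypothetical field names standing in for the streamline
database’s actual schema:

    REQUIRED_WHEN_COMPLETE = ("determination_decision", "determination_date",
                              "training_program")

    def find_incomplete_completions(records):
        # Return (application id, missing fields) for every record whose
        # status says complete but whose determination fields are blank.
        bad = []
        for rec in records:
            if rec.get("status") == "complete":
                missing = [f for f in REQUIRED_WHEN_COMPLETE
                           if not (rec.get(f) or "").strip()]
                if missing:
                    bad.append((rec["application_id"], missing))
        return bad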

Agency Response Date

November 2011

Corrective Action Recommended

Status of Corrective Action

To better track and improve the timeliness of
determinations for the training benefits program and to
assist claimants in understanding self‑arranged training
requirements, EDD should take measures to ensure that
its staff correctly enter all data into the training benefits
program’s streamline database.

Pending—EDD indicated in its 60‑day response that it has taken actions involving
both procedures and updates to automated processes to ensure staff correctly enter
all data into the training benefits program’s streamline database to better track
determination timeliness for training program participants. After we asked EDD to
support this assertion, it was unable to demonstrate that the actions it has taken thus
far have fully addressed our recommendation. Specifically, despite its claims related to
taking actions involving procedures, EDD was only able to provide us with the same
procedures that were in place at the time of our audit, which thus are not indicative
of a corrective action. In addition, EDD provided a “guide card,” which it asserted is a
comprehensive guide to processing incoming streamline mail. However, our review
concluded that it provides a high‑level overview of processing steps, and it does not
clearly identify the data fields that are required for processing.
Moreover, EDD provided us with a compact disc that we found to be a source code
dump that did not include programmer’s notes or other documentation explaining
the code. Thus, without investing a considerable amount of time by our Information
Technology Audit Support unit, we cannot confirm that the streamline database is
working as intended.


Description of Data

Agency Purpose of Data

EDD’s position information maintained by the State
Controller’s Office’s (SCO) Position Roster File

The Position Roster File is a file of authorized positions used by departments to
track positions.
The Position Roster File contained nearly 34,000 records related to EDD’s positions for
the period July 2007 through April 2010.

Purpose of Testing

Data Reliability Determination

To identify the number of paid employment program
representative (program representative) positions by
month for the period July 2007 through June 2010.

Sufficiently reliable.

Description of Data

Agency Purpose of Data

EDD’s leave accounting data maintained by the SCO’s
California Leave Accounting System (leave accounting system)

To perform a variety of functions necessary to accurately track leave system eligibility,
state service credits, and leave benefit activity.
For the period July 2007 through June 2010, the leave accounting system contained
nearly 1.5 million leave benefit transactions for EDD’s employees.

Purpose of Testing

Data Reliability Determination

To identify the amount of leave used and accrued by EDD’s
program representative staff.

Sufficiently reliable.

Description of Data

Agency Purpose of Data

EDD’s payroll data maintained by the SCO’s Uniform State
Payroll System (payroll system)

To process the State’s payroll and personnel transaction documents.
The payroll system contained nearly 23 million records related to state payroll
transactions for the period July 2007 through May 2010.

Purpose of Testing

Data Reliability Determination

To present data on overtime.

Sufficiently reliable.


FOSTER FAMILY HOME AND SMALL FAMILY HOME INSURANCE FUND
Expanding Its Coverage Will Increase Costs and the Department of Social Services Needs to Improve Its
Management of the Insurance Fund
Date: September 29, 2011	

Report: 2010‑121

BACKGROUND

The Department of Social Services (Social Services) manages California’s county‑administered foster care program.
It issues licenses to the foster family homes and small family homes (licensed homes), in which the county welfare
departments place foster children, and to foster family agencies (FFAs)—organizations that recruit, certify, and train
parents who provide foster family homes the State has not licensed (certified homes). Nearly 24,000 foster children
were placed in over 11,400 homes early this year—75 percent of these children were placed in certified homes.
California offers liability protection to licensed homes through the Foster Family Home and Small Family Home
Insurance Fund (insurance fund), but certified homes are not eligible for coverage under the insurance fund. Social
Services contracts with the Department of General Services (General Services) to process claims and perform the
accounting for the insurance fund.
KEY FINDINGS

In our review of Social Services’ administration of the insurance fund, we reported the following:
•	 The majority of the licensed foster parents are unaware of the insurance fund and the protections that it provides.
Based on our survey, we estimate that almost 90 percent of those parents are not aware that the insurance fund
exists and, for about a third of the licensed homes, the possibility of liability claims against them makes them less
likely to continue as foster parents.
•	 Most FFAs use private insurance to protect themselves and the homes they certify against liability, and those
that do not frequently cite the high cost of such coverage. Expanding the insurance fund’s coverage to the FFA’s
certified homes will significantly increase the number of homes eligible for the coverage and could cost the State
approximately $1 million each year.
•	 Social Services neither ensures that General Services processes claims efficiently nor that it provides certain data.
»» Over a five and a half‑year period, General Services took an average of 51 days to process the majority of the
claims we reviewed, but exceeded the state‑mandated 180‑day deadline to process 14 percent of the claims—it
took between 182 and 415 days to process these claims.
»» General Services has not provided Social Services with claims data and thus, Social Services has not been able
to accurately budget for the insurance fund’s needs.
•	 Social Services overestimated the insurance fund’s needs and, as of December 21, 2010, its fund balance had grown
to $5.4 million—significantly more than the $1 million we estimate it needs as a reserve under current conditions.
KEY RECOMMENDATIONS

We make several recommendations to Social Services, including that it develop more effective methods to inform
and remind licensed homes about the availability of the insurance fund and that it ensure General Services approves
or rejects all claims within the mandated deadline and provides Social Services with the claims data per their agreement.
We also recommend that Social Services determine the annual amount needed for the insurance fund to meet its
anticipated liabilities, establish a written policy and procedures for determining the fund’s financial needs, and set
an adequate reserve amount for the insurance fund. Further, we recommend that the Legislature consider amending
state law to expressly provide claimants the option of litigating against the insurance fund when claims are not
processed timely.


General Services, Department of (2010-121, September 29, 2011)
Description of Data

Agency Purpose of Data

Summary report generated from the Department of
General Services’ (General Services) claims database, iVOS,
for all general liability claims

To document claims information for all of its insurance programs, including the Foster
Family Home and Small Family Home Insurance Fund (insurance fund).
The summary report contained 486 general liability claims that were entered into iVOS
between July 1, 2005, and December 31, 2010.

Purpose of Testing

Data Reliability Determination

To identify the number of insurance fund claims filed
and paid, the types of claims, and the amounts paid for
damages between July 1, 2005, and December 31, 2010.

Sufficiently reliable.

To identify the amounts paid for legal and investigation
services for the insurance fund claims.

Not sufficiently reliable—In performing our accuracy testing, we reviewed the source
documentation for the 126 claims the summary report identified as insurance fund
claims. Our review found that the amounts the summary report identified as paid
for legal and investigation services were inaccurate for nine of the claims.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A


DEPARTMENT OF GENERAL SERVICES
The Division of the State Architect Lacks Enforcement Authority and Has Weak Oversight Procedures,
Increasing the Risk That School Construction Projects May Be Unsafe
Date: December 8, 2011	

Report: 2011‑116.1

BACKGROUND

The Division of the State Architect (division), within the Department of General Services (General Services), is
responsible for supervising the design and construction of projects at K‑12 schools and community colleges to certify
that they comply with the Field Act (act) and certain building standards. During fiscal years 2008–09 through 2010–11,
there were nearly 18,000 school construction projects, costing an estimated $44.5 billion, active throughout the State.
To oversee the construction phase, the division’s field engineers (licensed structural engineers) make periodic visits to
construction sites and communicate with division‑approved project inspectors who ensure that school districts
comply with division‑approved plans and specifications. When construction is completed according to approved plans
and required documents are filed, the division certifies the projects.
KEY FINDINGS

During our review of the division’s implementation of the act, we noted the following:
•	 It has limited authority to penalize school districts for noncompliance with the act—school districts can occupy
projects regardless of whether projects are certified. Nearly 25 percent of school construction projects closed during
the last three fiscal years were uncertified.
•	 Although it can take some steps to mitigate the risks that uncertified projects may pose—such as ordering districts
to stop work on projects when the division identifies a potential threat to public safety—the division rarely does so. In
fact, the division issued only 23 orders to comply and six stop work orders during the last three fiscal years.
•	 Even though over 16,000 projects remain uncertified, the division neither documents the reasons for classifying
some uncertified projects as having safety issues nor prioritizes actions related to projects with safety concerns.
•	 Its school construction oversight is neither effective nor comprehensive. Of the 24 closed projects we reviewed, we did
not see any evidence of a site visit on file for three projects—which lasted between five and 32 months and had estimated
costs as high as $2.2 million—and found evidence of only one site visit each for another eight closed projects.
•	 The division does not provide the same level of construction oversight in fire and life safety and accessibility as it
does for structural safety, even though it reviews plans for school construction projects for all three disciplines.
•	 Although it relies on project inspectors to ensure proper construction, we noted concerns with the division’s
oversight of inspectors.
»» School districts sometimes proceed with projects before the division approves their inspectors—on 22 of
34 projects we reviewed, the inspector was not approved until well after construction began.
»» It sometimes excuses inspectors from required trainings, does not always ensure inspectors have passed
all parts of the latest certification examination, and has not always clearly documented verification of an
inspector candidate’s prior experience.
»» The division does not have a formal evaluation process for inspectors and thus, may not be consistently
and adequately addressing performance issues, and may also be unable to defend its disciplinary actions
against inspectors.
KEY RECOMMENDATIONS

We made several recommendations to General Services, including that the division make better use of the enforcement
tools at its disposal, such as orders to comply and stop work orders, to enforce compliance with the act. We also
recommended that it modify its current policies for classifying uncertified projects with safety concerns and use that
information to prioritize its follow‑up on projects based on risk. Further, to ensure it provides adequate oversight of
school construction projects, the division should develop an overall strategy that establishes specific expectations for
field engineers’ site visits. Additionally, it should streamline its inspector approval process to ensure inspectors are
approved before construction begins and should re‑establish a formal process for evaluating inspectors.


General Services, Department of (2011-116.1, December 8, 2011)
Description of Data

Agency Purpose of Data

Department of General Services’ Division of the State
Architect’s (division) Tracker database (database)

To manage the projects submitted by school districts. The database tracks project
applications, key dates, the inspectors assigned to projects, and the types of
project closure. The database also generates invoices and calculates the various
fees owed to the division for certain aspects of its work.
The database contained information related to more than 50,000 applications for
construction projects submitted to the division since 1997.

Purpose of Testing

Data Reliability Determination

To identify the number and estimated cost of projects that
were in the construction oversight or project close‑out
phases in fiscal years 2008–09 through 2010–11. Further,
to identify which projects received close‑out letters and to
determine the amount of time between construction
completion and June 30, 2011, for projects that had not
begun the close‑out process as of that date.

Undetermined reliability—Our review of existing information identified two data
limitations. The division’s database does not track information on any projects
submitted to the division before November 1997. Further, the database does not
track whether projects reopen, regardless of whether the project was initially recorded in
the database. Because some projects are required to pay a fee when they reopen, we
were able to identify a portion of the reopened projects using the fee information. In
addition, we found minor errors in our electronic testing, some of which we were able
to correct. We also tested the accuracy of the database by testing key data elements
for a random sample of 29 projects and tracing the selected elements to the project
files. In this sample, we found one error, so we continued testing until we had tested
a total of 47 randomly selected projects and found no additional errors. However,
because the division did not have a consistent method for identifying the date
construction ended, we were unable to test the accuracy of this field.

Agency Response Date

June 2012

Corrective Action Recommended

Status of Corrective Action

To ensure it is providing adequate oversight of school
district construction projects, the division should establish
consistent criteria for entering data into its database on key
aspects of projects, such as the dates for the start and end
of construction.

Pending—According to the division, it has developed proposals for a standard for the start
and end dates of construction. These proposals are currently under review by the
division’s senior management.
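
The accuracy testing described in the determination above—drawing a random sample of 29 projects, tracing key
fields to the project files, and extending the sample to 47 after finding an error—follows a simple stop‑and‑extend
pattern. A sketch of that pattern in Python; the matches_source callback is a hypothetical stand‑in for the manual
step of comparing a database record against the hard‑copy project file:

    import random

    def accuracy_test(records, matches_source, initial=29, extended=47):
        # Draw the initial random sample and trace each record to source.
        sample = random.sample(records, initial)
        errors = [r for r in sample if not matches_source(r)]
        if errors:
            # An error in the initial sample: extend testing to a larger
            # sample before drawing a conclusion, as the auditors did here.
            remaining = [r for r in records if r not in sample]
            sample += random.sample(remaining, extended - initial)
            errors = [r for r in sample if not matches_source(r)]
        return len(sample), len(errors)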


SACRAMENTO AND MARIN SUPERIOR COURTS
Both Courts Need to Ensure That Family Court Appointees Have Necessary Qualifications, Improve
Administrative Policies and Procedures, and Comply With Laws and Rules
Date: January 20, 2011	

Report: 2009‑109

BACKGROUND

The superior court in each of California’s 58 counties has jurisdiction over family law matters, which it typically
handles within its family court. Judges assigned to the family courts decide various family law matters, such as the
dissolution of marriages, and, where child custody or a determination of the legal relationship between natural or
adoptive parents and a child is at issue, the family court may issue an order for child custody and visitation. At the
Sacramento family court, where more than 92,500 family law cases were filed during the four‑year period we reviewed,
court staff conducted the mediations and certain evaluations that the family court ordered, and the court appointed
private mediators, evaluators, and minor’s counsel. In contrast, the Marin family court, which opened 2,352 cases
involving child custody and visitation during the same four‑year period, had staff who performed only child custody
and visitation mediations, and it appointed private evaluators and minor’s counsel in contested child custody and
visitation cases. The Family Code requires family courts to design all child custody and visitation orders to reflect
what is in the best interest of the child.
KEY FINDINGS

Our audit of the Sacramento and Marin County Superior Courts’ processes for identifying, assessing, and evaluating
court appointees in child custody disputes during the four‑year period—from April 1, 2006 through March 31, 2010—
revealed the following:
•	 The Sacramento County Superior Court could not demonstrate that its staff performing mediations and evaluations
and the private mediators, evaluators, and minor’s counsel it appoints are qualified or trained.
•	 The Marin County Superior Court could not demonstrate that its mediators always met the minimum
qualifications or training requirements or that its private evaluators were qualified and met certain
training requirements. Further, the family court did not ensure that minor’s counsel were qualified before
making appointments.
•	 Although both family courts have a process for reviewing and resolving complaints about their mediators or
evaluators, neither court kept logs of the complaints it received. In addition, neither family court consistently
followed its processes for dealing with complaints about its mediators.
•	 Even though the courts may pay for minor’s counsel when they determine that the parties cannot pay, both courts
need to improve their processes. The Sacramento family court did not always make the legally required determination
about the parties’ ability to pay, and the Marin Superior Court did not have a policy outlining the costs it reimburses.
KEY RECOMMENDATIONS

We make numerous recommendations to the Sacramento and Marin County Superior and Family Courts to ensure
that the individuals who provide mediation and evaluation services and who act as minor’s counsel in cases before
these family courts are qualified and trained. Further, we recommend that both the Sacramento and Marin family
courts keep a log of the complaints they receive, track all complaints properly, and review them promptly. Moreover,
both family courts need to improve their policies and rules for receiving, reviewing, and resolving complaints. We
also recommend that the Sacramento Superior Court improve its procedures for billing and for determining and
reviewing parties’ ability to pay the costs of appointed minor’s counsel.


Marin Superior Court (2009-109, January 20, 2011)
Description of Data

Agency Purpose of Data

Marin Superior Court’s Beacon case management database
(Beacon database)

To manage civil, family law, juvenile, probate, and small claims cases and to maintain
filing and disposition data about these cases in accordance with direction provided by
the Administrative Office of the Courts.
The Beacon database contained nearly 2,400 child custody and visitation related cases
that were opened between April 1, 2006, and March 31, 2010.

Purpose of Testing

Data Reliability Determination

To identify custody and visitation cases and contested
custody and visitation cases.

Not sufficiently reliable—For our accuracy testing, we randomly selected 29 records
and traced key data elements to the source documentation in the court’s case files.
We identified errors in one of the key fields needed for our analysis. Specifically, we
identified two errors in the data element for the case subtype. Because we relied
on this field to determine if a case was a custody and visitation case or a contested
custody and visitation case, we cannot be sure that we included all relevant cases in
our analysis.

To identify cases opened after April 1, 2006, and cases that
remained open as of March 31, 2010.

Sufficiently reliable.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A

Sacramento Superior Court (2009-109, January 20, 2011)
Description of Data

Agency Purpose of Data

Sacramento Superior Court’s Sustain case management
database (Sustain database)

To generate calendars, minute orders, out cards, and statistics.
The Sustain database contained records of more than 430,000 Family Law cases,
nearly 555,000 Civil Law cases, and more than 31,000 Probate cases.

Purpose of Testing

Data Reliability Determination

To identify custody and visitation cases, contested custody
and visitation cases, and cases opened after April 1, 2006.

Not sufficiently reliable—For accuracy testing, we randomly selected 29 records and
traced key data elements to the source documentation in the court’s case files. For
two of 29 case files tested, we identified inaccurate entries in the data field that
indicates when a case was filed. We performed completeness testing by haphazardly
selecting 29 case files and verifying that the Sustain database contained these cases.
We identified three court cases that were not recorded in the Sustain database.

To identify those cases that had court‑appointed
minor’s counsel.

Undetermined reliability—For our accuracy testing, we selected 29 case files and
traced key data elements to the source documentation in the court’s case files. The
results identified no errors. We performed completeness testing by haphazardly
selecting 29 case files and verifying that the Sustain database contained these cases.
We identified three court cases that were not recorded in the Sustain database.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A


Description of Data

Agency Purpose of Data

Sacramento Superior Court’s Family and Children
department’s Office of Family Court Services’ database
(FCS database)

To assign cases to mediators, to send notices regarding upcoming mediation
appointments, to track activities performed by mediators related to cases, and to
generate bills for evaluation services.
The FCS database contained records of nearly 106,000 scheduled appointments for
family court services.

Purpose of Testing

Data Reliability Determination

To identify contested custody and visitation cases, to
identify cases that remained open as of March 31, 2010,
and to determine if the court ordered mediations or
evaluations.

Not sufficiently reliable—For our accuracy testing, we randomly selected 29 records
and traced key data elements to the source documentation in the court’s case files.
We identified inaccurate entries in numerous data elements. Specifically, the accuracy
testing identified five errors in the data element containing the case number, three
errors in the field that identifies the type of court action, three errors in the child’s
birth‑date data element, six errors in the field that identifies whether a mediation
session was held, and three errors in the field that identifies the type of mediation
session. We also performed completeness testing by haphazardly selecting 29 case
files and verifying that the FCS database contained these cases. We found that the FCS
database did not contain two of the 29 cases tested.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A
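
The completeness testing described for both Sacramento databases runs in the opposite direction from accuracy
testing: rather than tracing database records out to case files, the auditor selects physical case files and verifies
that each appears in the data extract. A minimal sketch in Python, with hypothetical case numbers:

    # Case numbers taken from the physically selected court case files
    # (hypothetical values for illustration).
    selected_case_files = {"06FL01234", "07FL05678", "09FL09012"}

    # Case numbers present in the database extract under review.
    extract_case_numbers = {"06FL01234", "09FL09012", "08FL00001"}

    missing = selected_case_files - extract_case_numbers
    if missing:
        print(f"{len(missing)} selected case(s) not found in the extract:")
        for case in sorted(missing):
            print(f"  {case}")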


ADMINISTRATIVE OFFICE OF THE COURTS
The Statewide Case Management Project Faces Significant Challenges Due to Poor Project Management
Date: February 8, 2011	

Report: 2010‑102

BACKGROUND

With as many as 10 million case filings in a year, the California court system has 400 locations statewide—each
county’s superior court has between one and 55 courthouse branches. In 2003, after finding over 200 varieties of case
management systems in use by superior courts, and wishing to improve access, quality, and timeliness of the judicial
system, the Administrative Office of the Courts (AOC)—staff agency to the Judicial Council of California, the policy‑
and rule‑making body over California’s judicial branch—was directed to continue developing a statewide case
management system for the superior courts. That same year, the AOC entered into contracts to develop two interim
systems—currently used by seven courts—and later decided to develop one comprehensive system—the California
Court Case Management System (CCMS).
KEY FINDINGS

Our review of the AOC’s management and oversight of the statewide case management project revealed the following:
•	 The AOC has inadequately planned the project since 2003. Specifically, the AOC:
»» Did not conduct a business needs assessment at the outset of the project, nor has it performed a cost‑benefit
analysis to ensure that the CCMS is the most cost‑effective technology solution for the courts’ needs.
»» Did not structure its contract with the development vendor to adequately control the project costs and
scope—over the course of seven years, the AOC entered into 102 contract amendments and increased the cost
of the contract from $33 million to $310 million.
•	 The AOC has consistently failed to develop accurate cost estimates or timelines for the project.
»» Cost estimates have gone from $260 million in 2004 to nearly $1.9 billion in 2010. Moreover, this estimate
excludes other significant costs such as those that the superior courts and justice partners are likely to incur in
deploying CCMS.
»» Annual reports to the Legislature did not provide complete cost information.
»» The estimated date for complete deployment has been pushed back by seven years.
•	 The majority of the courts believe their current case management systems will serve them for the foreseeable
future and users of interim systems expressed reservations about using CCMS. Some of these users say they
will not adopt CCMS until the AOC makes significant improvements in the areas of performance, stability, and
product management.
•	 The AOC’s attempt at independent oversight came late in the life of the project, and the scope of services it
contracted for fell short of best practices for a project of this size and scope. Even so, the AOC did not
adequately address significant concerns raised by the consultant providing the oversight, and thus the project may
have future quality issues.
KEY RECOMMENDATIONS

We made numerous recommendations to the AOC, including the following:
•	 Conduct a thorough analysis of the costs and benefits of the CCMS to determine the course of action to take.
•	 Update cost information and estimates on a regular basis and report true costs to the Legislature and others.
•	 Develop a realistic overall funding strategy for the CCMS in light of the current fiscal crisis facing the State.
•	 Take steps to fully understand and address the courts’ concerns as implementation moves forward.
•	 Retain an independent consultant to review CCMS before deployment to determine if there are quality issues
and problems.


Administrative Office of the Courts (2010-102, February 8, 2011)
Description of Data

Agency Purpose of Data

An excerpt of financial data from the Administrative Office
of the Courts’ (AOC) Oracle financial system (financial
system) for fiscal years 2000–01 through 2009–10

To record, process, and store AOC’s financial transactions.

The AOC’s financial system contained information related to more than 6.6 million
accounting records.
Purpose of Testing

Data Reliability Determination

To determine total expenditures associated with the
development of the statewide case management project.

Undetermined reliability—We could not assess the reliability of the data in the AOC
financial system for fiscal years 2000–01 through 2005–06 because the AOC had
previously destroyed the hard‑copy source documents in accordance with its record
retention policy. Further, in assessing the accuracy of the AOC financial system data for
fiscal years 2006–07 through 2009–10, we found that certain key data fields included
in our sample were generated by the AOC’s financial system. Due to the nature of
system‑generated fields, there is no corroborating evidence available for our review.
Therefore, we were unable to determine the accuracy of those key data fields for the
purposes of this audit.
Finally, the AOC’s financial system does not fully account for payroll costs associated
with staff that performed a role in the most recent version of the statewide case
management project—the California Court Case Management System (CCMS).
Specifically, in our testing of 13 employees associated with the CCMS project, we
noted that the AOC did not properly charge payroll costs for five information system
division employees and one regional administrative director who spent a portion
of their time working on the CCMS project during the period July 2002 through
June 2010. From our analysis of the State Controller’s Office’s (SCO) Uniform State
Payroll System (payroll system) data for this time period, we estimated the total gross
salary for these six employees—excluding certain payroll expenses, such as employer
contributions for retirement and Medicare—exceeded $5.5 million. According to the
AOC, except for a select group of employees working on grants, AOC employees do
not complete timesheets that detail the projects they are working on. Consequently,
because these six employees spent their time working on various functions, the AOC
was not able to determine what portion of their time was spent on the CCMS project.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A
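
The $5.5 million figure above reflects a straightforward aggregation: sum gross pay across the payroll system’s
transactions for the six identified employees over the July 2002 through June 2010 period. A simplified sketch of
that aggregation in Python—the record layout and values are hypothetical:

    from collections import defaultdict
    from datetime import date

    # Hypothetical payroll transactions: (employee_id, pay_date, gross_pay).
    transactions = [
        ("E01", date(2003, 7, 31), 6250.00),
        ("E01", date(2003, 8, 31), 6250.00),
        ("E02", date(2004, 1, 31), 7100.00),
    ]

    ccms_employees = {"E01", "E02"}  # stand-ins for the six identified employees
    start, end = date(2002, 7, 1), date(2010, 6, 30)

    totals = defaultdict(float)
    for emp, pay_date, gross in transactions:
        if emp in ccms_employees and start <= pay_date <= end:
            totals[emp] += gross

    print(f"Total gross salary for the period: ${sum(totals.values()):,.2f}")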

Description of Data

Agency Purpose of Data

AOC’s payroll data maintained by the SCO’s Uniform State
Payroll System

To process the State’s payroll and personnel transaction documents.
The payroll system data contained information related to nearly 60 million state
payroll transactions for the period July 2002 through June 2010.

Purpose of Testing

Data Reliability Determination

To determine total gross salary for a sample of employees
associated with the statewide case management project.

Sufficiently reliable.


COMMISSION ON TEACHER CREDENTIALING
Despite Delays in Discipline of Teacher Misconduct, the Division of Professional Practices Has Not Developed an
Adequate Strategy or Implemented Processes That Will Safeguard Against Future Backlogs
Date: April 7, 2011	

Report: 2010‑119

BACKGROUND

Receiving over 250,000 applications for teaching credentials each year, the 19‑member Commission on Teacher Credentialing
(commission) establishes high standards for the preparation and licensing of public school educators. The Division
of Professional Practices (division) conducts investigations of misconduct on behalf of the Committee of Credentials
(committee)—a commission‑appointed seven‑member body. The committee meets monthly to review allegations of misconduct
and, when appropriate, recommends that the commission discipline credential holders or applicants, including revoking or
denying credentials when the committee determines holders or applicants are unfit for the duties authorized by the credential.
KEY FINDINGS

During our audit of the commission’s educator discipline process, we noted the following:
•	 As of the summer of 2009, the division had accumulated a backlog of about 12,600 unprocessed reports of arrest and
prosecution—nearly three times the number of educator misconduct reports the division typically processes each year.
The backlog was a result of an insufficient number of trained staff, ineffective and inefficient processes, and a lack of an
automated system for tracking the division’s workload.
•	 These conditions appear to have significantly delayed processing of alleged misconduct and potentially allowed
educators of questionable character to retain a credential.
»» In 11 of the 29 cases we reviewed, the division took more than 80 days to open a case after it received a report of
misconduct, with one taking nearly two years to open and another taking nearly three years.
»» The division does not always effectively track cases that could result in mandatory revocation of a credential—for
two of the 23 such cases we reviewed, the division took one and a half months and six months, respectively, to revoke
the credentials after the court notified it that the holder was convicted of the crime charged.
»» Because it relies on the prosecution of criminal charges rather than contemporaneously pursuing all available
sources of information regarding its cases, when an individual is not convicted the division may not be able
to get the information necessary to effectively investigate because some witnesses—students, teachers, and
administrators—may no longer be accessible.
•	 The division has not effectively processed all the reports of arrest and prosecution that it receives—we could not locate
in the commission’s database more than half of the 30 reports we randomly selected. Further, it processes reports it no
longer needs because it does not always notify the appropriate entity that the reports are unneeded.
•	 To streamline the committee’s review of reports of misconduct, the commission allows division staff to use their
discretion to decide which reports to forward to the committee for its review and which require no disciplinary
action—a practice we believe constitutes an unlawful delegation of discretionary authority.
•	 The division lacks comprehensive written procedures for reviewing reported misconduct and the database it uses for
tracking cases of reported misconduct does not always contain complete and accurate information.
•	 Familial relationships among commission employees may have a negative impact on employees’ perceptions, and,
without a complete set of approved and consistently applied hiring procedures, the commission is vulnerable to
allegations of unfair hiring and employment practices.
KEY RECOMMENDATIONS

We make numerous recommendations to the commission including that it develop and formalize comprehensive procedures
for reviews of reported misconduct and for hiring and employment practices to ensure consistency. We also recommend that it
provide training and oversight to ensure that case information in its database is complete, accurate, and consistent. Moreover,
we provide specific recommendations for the commission to revisit its processes for overseeing investigations to adequately
address the weaknesses in its processing of reports of misconduct and reduce the time elapsed to perform critical steps in the
review process.


Commission on Teacher Credentialing (2010-119, April 7, 2011)
Description of Data

Agency Purpose of Data

Commission on Teacher Credentialing’s (commission)
Credentialing Automation System Enterprise data (database)

To track teacher data, applications, documents, exams, case information, and
organization information.
The database contained information pertaining to nearly 1.2 million people,
76,639 applications, and 17,206 cases.

Purpose of Testing

Data Reliability Determination

To identify the number of affidavits, school reports,
testing agency misconduct reports, actions taken by the
Committee of Credentials (committee), recommendations
for adverse action, and the number of days between
the date that the commission’s Division of Professional
Practices (division) staff opened and closed a case for cases
the committee did not review that were opened during the
period of January 2007 through June 2010.

Not sufficiently reliable—For accuracy testing, we randomly selected a sample of
28 records of case activities and found several errors in key fields. Specifically, we found
three errors in the data field that tracks the date an activity begins and in the field that
describes which activity is being performed, such as a request for court documents or
Department of Motor Vehicles’ records, and two errors in the field that describes the
action that needs to be taken, such as opening a case.

Agency Response Date

August 2012

Corrective Action Recommended

Status of Corrective Action

The division should provide the training and oversight,
and should take any other steps needed, to ensure that the
case information in its database is complete, accurate, and
consistently entered to allow for the retrieval of reliable
case management information.

Fully implemented—As indicated in its six‑month response, the commission provided
training to its staff to ensure that they consistently and accurately enter information
into the database. Additionally, in its one‑year response, the commission stated that
most of the management and supervisory team in the division were replaced and it is
in the process of recruiting a new management team. According to the commission,
management duties will include routine or scheduled reviews of data.
In an August 2012 update, the commission provided its newly developed policy and
procedures for reviewing data to ensure its accuracy. The commission also stated that
it selected a random sample of 60 case files and reviewed 23 key data points for each
file, creating a possibility of 1,380 errors. According to the commission, it developed,
completed, and saved documentation of this review, during which it found a very low
rate of error—only seven errors in total, or roughly 0.5 percent of the data points
reviewed. Finally, in keeping with the procedures that
the division developed, the commission plans to complete this data audit annually.


STATE LANDS COMMISSION
Because It Has Not Managed Public Lands Effectively, the State Has Lost Millions in Revenue
for the General Fund
Date: August 23, 2011	

Report: 2010‑125

BACKGROUND

Responsible for managing millions of acres of tidelands and submerged lands, and other lands that must be used
to benefit public education, the three‑member State Lands Commission (commission) meets periodically to make
decisions regarding leases and other matters. An executive officer—appointed by the commissioners—manages the
commission’s daily operations and its employees. Its divisions oversee and manage more than 4,000 leases, including
approximately 900 agricultural, commercial, industrial, right‑of‑way, and recreational leases; 85 revenue‑generating oil
and gas, geothermal, and mineral leases; and 3,200 rent‑free leases.
KEY FINDINGS

Our audit of the commission’s management of leases revealed that it did not effectively manage or monitor leases:
•	 Some lessees remained on state land for years, sometimes decades, without paying rent. Of the 10 delinquent leases
we reviewed, nearly half were more than 17 years past due. The commission generally does not evict or pursue other
remedies against lessees who do not pay rent—in total, the State lost $1.6 million from those 10 delinquent leases
we reviewed.
•	 About 140 of its nearly 1,000 revenue‑generating leases are in holdover, meaning the leases have expired but the
lessees continue to pay the rental amounts stipulated in the expired leases. The commission could have collected
an additional $269,000 in rent for 10 expired leases we reviewed had it merely adjusted the leases to reflect the
Consumer Price Index, as illustrated in the sketch following this list.
•	 The commission failed to perform timely rent reviews even though many of its lease agreements allow it to review
and modify the rental amount every five years. For 18 of the 35 leases we reviewed, it could have collected an
additional $6.3 million had it conducted timely rent reviews.
•	 Properties are not appraised regularly—of the 35 leases we reviewed, four had not been appraised in 20 years and
another nine had not been appraised for at least 10 years.
•	 The commission may be undervaluing certain types of leases because the rate it uses to establish rent for pipelines
on state property has not been adjusted in more than 30 years.
•	 It lost track of one of its leases and failed to bill a lessee for 12 years while the lessee remained on state property.
This was likely due to an incorrect entry in the commission’s database.
•	 Even though audits of oil and gas leases can result in millions of dollars in revenue to the State, the commission
does not consistently conduct these audits, nor does it audit the 85 properties granted to local governments
to ensure that they spend the funds generated from those lands as permitted.
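
The $269,000 estimate for holdover leases rests on a simple Consumer Price Index adjustment: scale the expired
lease’s rent by the ratio of the current index to the index in effect when the rent was last set. A sketch of that
calculation in Python, with hypothetical index values:

    def cpi_adjusted_rent(base_rent, cpi_at_lease, cpi_now):
        # Scale rent by the change in the Consumer Price Index.
        return base_rent * (cpi_now / cpi_at_lease)

    # Hypothetical example: annual rent set when the index stood at 180.0,
    # adjusted to a current index of 225.0.
    old_rent = 10_000.00
    new_rent = cpi_adjusted_rent(old_rent, cpi_at_lease=180.0, cpi_now=225.0)
    print(f"Adjusted annual rent: ${new_rent:,.2f}")  # $12,500.00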
KEY RECOMMENDATIONS

We make numerous recommendations to the commission including that it manage delinquent leases in a timely
manner and that it ensure that as few leases as possible are in holdover by implementing its newly established
procedures and periodically evaluating their effectiveness. Further, we recommend it conduct rent reviews promptly,
appraise its properties as frequently as permissible to obtain a fair rental value for its leases, and amend outdated
regulations for establishing pipeline rents. Additionally, to improve its monitoring of leases, we recommend the
commission ensure its database is complete and accurate for the retrieval of reliable lease information and require
all divisions to use it. Moreover, it should develop an audit schedule that focuses on leases that historically generate the
most revenue and recoveries to the State.


State Lands Commission (2010-125, August 23, 2011)
Description of Data

Agency Purpose of Data

State Lands Commission’s (commission) Application Lease
Information Database (ALID)

To store the commission’s lease information, including the lessee name, lease term
and type, lease location, rental amount, lease history, and bond and insurance
information. The commission also uses tickler dates within ALID to remind staff when
leases are eligible for a five‑year rent review.
ALID contained records for more than 4,000 leases, including approximately 85 oil
and gas, geothermal, and mineral leases; 900 agricultural, commercial, industrial,
right‑of‑way, and recreational leases; and 3,200 rent‑free leases.

Purpose of Testing

Data Reliability Determination

It was our intent to use the ALID data to determine how
frequently the commission appraises the value of all of its
lease properties, how much time it spends in each step
of the rent review process, the total number of leases in
holdover, and the total number of leases based on the price
per diameter inch per linear foot of pipeline.

Not sufficiently reliable—Based on our initial review of the data included in ALID—
which found significant errors—we determined that we would not be able to make
conclusions based on these data.

Agency Response Date

October 2011

Corrective Action Recommended

Status of Corrective Action

To improve its monitoring of leases, the commission should
do the following:

•	 Create and implement a policy, including provisions
for supervisory review, to ensure that the information
in ALID is complete, accurate, and consistently entered
to allow for the retrieval of reliable lease information.
To do so, the commission should consult another
public lands leasing entity, such as the Department
of General Services, to obtain best practices for a
lease‑tracking database.

•	 Partially implemented—The commission asserts that all income‑producing leases
have been verified for data elements related to rent review dates, lease terms, and
expiration dates. Further, commission staff is developing management reports
that, according to the commission, will allow access to data in a format that will
be useful for decision making. Finally, the commission is pursuing an off‑the‑shelf
software program that could potentially replace ALID. However, the commission
has not implemented a policy that includes provisions for supervisory review
of the data entered into ALID. Further, the commission has not yet consulted
with other public lands leasing agencies to obtain best practices for a
lease‑tracking database.

•	 Require all of its divisions to use ALID as its one
centralized lease‑tracking database.

•	 Partially implemented—The commission stated that the steps it has taken should
reduce the need for staff to use multiple data sources.


CHILD WELFARE SERVICES
California Can and Must Provide Better Protection and Support for Abused and Neglected Children
Date: October 27, 2011	

Report: 2011‑101.1

BACKGROUND

Child welfare services (CWS) agencies—which include programs for child protective services—across California’s
58 counties received 480,000 allegations of child abuse or neglect in 2010; along with local law enforcement, these
agencies make immediate decisions about whether to remove a child from his or her home. While each county establishes
and maintains its own program for CWS, the Department of Social Services (Social Services) monitors and provides
support through oversight, administrative services, and development of program policies and regulations. Among
other duties, Social Services provides oversight from early intervention activities to permanent placement services,
and provides oversight and regulatory enforcement for more than 85,000 licensed community care facilities statewide,
including licensing foster and group homes that house children removed from unsafe homes.
KEY FINDINGS

During our review of the child protective services programs in Sacramento, Alameda, and Fresno counties and Social
Services, we noted the following:
•	 Despite a recommendation we made in 2008, Social Services does not use the Department of Justice’s (Justice) sex
offender registry to identify sex offenders who may be inappropriately living or working in its licensed facilities and
foster homes. In July 2011 we found over 1,000 address matches and alerted Social Services. Social Services and
county CWS agencies investigated the matches and are taking action as needed.
•	 Social Services is struggling to visit community care facilities at least once every five years as required. The number
of overdue inspections has been increasing since the beginning of 2010.
•	 Although the three counties generally performed required background checks before placing children in foster
homes and appeared to remove children quickly when needed, they did not always notify Social Services promptly
of allegations involving its licensees or forward required information regarding instances of abuse or neglect
to Justice.
•	 The percentage of children placed with private foster family agencies—agencies that recruit and certify foster homes
and are compensated at a higher rate than state‑ or county‑licensed foster homes—has dramatically increased over
the last 10 years and resulted in an additional $327 million in foster care payments during that time. The counties we
visited admit to placing children with these agencies out of convenience rather than for elevated treatment needs as
originally intended.
•	 Although the county CWS agencies we visited generally complied with state regulations and county policies when
investigating and managing cases, they need to improve the timeliness of investigations and consistency of ongoing
case management visits.
KEY RECOMMENDATIONS

We make numerous recommendations to Social Services including that it conduct regular address comparisons using
Justice’s sex offender registry and its licensing database, and that it complete follow‑up on any remaining address
matches we provided. We further recommend that Social Services complete comprehensive reviews of agencies’
licensing activities more promptly, as well as on‑site reviews of state‑licensed foster homes, foster family agencies, and
group homes. Moreover, Social Services should ensure that rates paid to private foster family agencies are appropriate
and should monitor placements with these agencies. To the county CWS agencies, we recommend that all agencies
perform child death reviews for children with CWS histories to improve their practices.


Justice, Department of (2011-101.1, October 27, 2011)
Description of Data

Agency Purpose of Data

Department of Justice’s (Justice) Sex and Arson Registry
(sex offender registry)

To serve as the statewide repository for all sex and arson offender
registration information.
The sex offender registry contained more than 1.4 million address records for nearly
125,000 registered sex and arson offenders.

Purpose of Testing

Data Reliability Determination

To identify possible matches between addresses
of registered sex offenders and the addresses of
state‑ and county‑licensed facilities, such as foster family
homes, family day care homes, and adult residential
facilities in the Department of Social Services’ Licensing
Information System.

Undetermined reliability—During electronic logic testing of key data elements, we
noted that some address data fields were blank nearly 42 percent of the time. Justice
informed us that these blanks likely occur because the registry is populated with data
entered by over 500 agencies. Nevertheless, we decided to conduct an
analysis using the available address data since it is the best available source of this
information. Further, we determined that conducting accuracy and completeness
testing for the sex offender registry was not feasible because the documentation
supporting this data is located at over 500 agencies throughout the State.

Corrective Action Recommended

Status of Corrective Action

We did not recommend corrective action.

N/A
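
The address comparison described above is, in essence, a join between two datasets on a normalized address key,
with the blank registry addresses excluded before matching. A minimal sketch in Python, assuming simple string
addresses—real matching would require far more careful normalization:

    def normalize(addr):
        # Crude normalization: collapse case and internal whitespace.
        return " ".join(addr.upper().split())

    # Hypothetical extracts from the two systems.
    offender_addresses = ["123 Main St, Sacramento CA", "", "456 Oak Ave, Fresno CA"]
    facility_addresses = ["123 MAIN ST, SACRAMENTO CA", "789 Pine Rd, Marin CA"]

    # Blank registry addresses (about 42 percent of fields in the audit)
    # cannot match and are dropped before the comparison.
    offenders = {normalize(a) for a in offender_addresses if a.strip()}
    facilities = {normalize(a) for a in facility_addresses if a.strip()}

    matches = offenders & facilities
    print(f"{len(matches)} possible address match(es) to investigate")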


cc:	

Members of the Legislature
Office of the Lieutenant Governor
Little Hoover Commission
Department of Finance
Attorney General
State Controller
State Treasurer
Legislative Analyst
Senate Office of Research
California Research Bureau
Capitol Press
