As the CELDT is replaced as the standardized test of English language proficiency in California, an expert gives the inside scoop on creating and maintaining a statewide assessment.
By Caroline Fahmy
Since 2001, the California English Language Development Test (CELDT) has been used to assess 1.5 million K–12 students every year. The CELDT has identified students with limited English proficiency, determined those students' level of proficiency, and assessed students' progress in acquiring listening, speaking, reading, and writing skills in English.
In November 2012, the California State Board of Education adopted a new set of English Language Development (ELD) standards that closely align with the California Common Core State Standards: English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects. The ELD standards clarify the knowledge and skills that English learners (ELs) need to master the new academic standards.
The new standards necessitated a new assessment, so this spring, students whose primary language is not English are being tested with the English Language Proficiency Assessment for California (ELPAC). The ELPAC covers the same four domains as the CELDT—listening, speaking, reading, and writing—but comprises two separate tests: an initial assessment to determine the language proficiency of new students whose home language is not English, and a summative assessment to measure the progress of students previously identified as ELs. Educational Data Systems has been the CELDT contractor since 2009, and the end of the CELDT era provides an opportunity to delve into what it takes to develop, test, and implement a large-scale assessment.
When content standards—or the intended use of the results—change, tests must be evaluated with respect to whether they meet the new goals. For large-scale assessments, the test development cycle can take years to complete. The basic process for developing and maintaining valid assessment programs can be illustrated as a life cycle with the following stages:
Test Purpose: Specify who is to be tested, to what content standards the assessment will be aligned, and how the results will be used to make inferences.
Test Blueprint: Convert the test purpose into a plan for test items that cover the specified content.
Item Writing and Review: Develop test questions as indicators of how well students have met the content standards; review items for content coverage and potential sources of bias.
Field Test: Administer the items to a sample of students; analyze the results to determine whether the items meet their intended objectives.
Standard Setting: Identify the score cut point(s), the point(s) at which students meet “passing” or “proficiency” criteria. Write descriptors of performance levels that assist in interpreting the students’ abilities at each of the levels.
Operational Test: Administer the test to all students in the target group(s).
Report: Provide information about what students know and can do relative to the standards.
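The standard-setting stage above fixes cut points that later drive reporting: a student's scale score is compared against the ordered cut scores to assign a performance level. As a minimal sketch—using hypothetical cut scores and level names, not actual CELDT or ELPAC values—the classification reduces to a lookup against the sorted cut points:

```python
from bisect import bisect_right

# Hypothetical cut scores and performance-level names, for illustration only;
# real programs set these per grade span and domain during standard setting.
CUT_SCORES = [380, 440, 500]  # a score at or above each cut enters the next level
LEVELS = ["Beginning", "Intermediate", "Early Advanced", "Advanced"]

def performance_level(scale_score: int) -> str:
    """Map a scale score to a performance level via the ordered cut points."""
    return LEVELS[bisect_right(CUT_SCORES, scale_score)]

print(performance_level(375))  # Beginning
print(performance_level(440))  # Early Advanced (cut scores are inclusive lower bounds)
```

The performance-level descriptors written during standard setting then give educators language for interpreting what students at each level know and can do.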
Producing and maintaining a large-scale assessment program involves many other steps, including developing state regulations that guide activities, establishing standardized test-administration procedures, training educators on those procedures, developing test materials, maintaining an item bank, and establishing a score scale. For an ongoing assessment program, item writing, review, and field testing happen in a continuous cycle to feed new content into operational tests.
I equate running an assessment program to building a skyscraper: there are the purpose and goals, designs, blueprints, contractor and subcontractors, foundation, framing, plumbing, and so on. Once it's built, there is the maintenance, property management, and all of the activities it takes to keep the operation running smoothly. Building an assessment program is a huge endeavor that takes expertise, careful planning, flawless execution, management, and a lot of hard work. Building a skyscraper might even be easier!
One of our biggest takeaways from managing the CELDT for nearly a decade is that the tens of thousands of educators involved with this program have worked diligently and faithfully to implement it. They all—from the CDE staff to district testing coordinators, English language development specialists, information technology staff, test examiners, and trainers—have put forth a monumental effort on behalf of English learners in our state. And they have done so to help our students succeed. A new assessment will not change this commitment to English learners’ success in California.
Caroline Fahmy is the President and CEO of Educational Data Systems.