Florida: Where Essays are Both Written AND Graded By AI
One More Good Reason to Eliminate Florida’s State-Mandated Writing Assessments
Last year I wrote “Why Does Florida Continue To Administer State-Mandated Writing Assessments?” and encouraged parents to ask their child’s teachers and lawmakers why Florida continues to administer state-mandated writing assessments that have no bearing on the state’s accountability system. Every spring, Florida’s 4th-10th grade students take a 120-minute-plus, computer-based writing assessment…
The question everyone should be asking is “Why?”
- Writing assessments are not required for federal government accountability measures.
- As of 2022, Writing scores are no longer part of Florida’s School Grade calculation.
- B.E.S.T. Writing assessments are no longer combined with FAST ELA Reading assessments to create an overall ELA score.
- B.E.S.T. Writing scores do not factor into student assessment graduation requirements.
- B.E.S.T. Writing scores are not used to determine statutory promotion or retention decisions.
On the other hand, there are plenty of reasons to argue for eliminating Florida’s state writing assessment:
- The B.E.S.T. Writing assessments are computer-based, meaning students as young as 9 years old are typing essays on computers… forget cursive instruction, even printing is, apparently, no longer necessary.
- B.E.S.T. Writing assessments are graded by artificial intelligence/“hybrid scoring” – using “a large representative sample of Florida student responses scored by highly trained and qualified human scorers to train an automated scoring (AS) engine, which then scores the majority of student responses” (a rough sketch of how this kind of machine scoring works follows this list). Why do parents need an extra AI-graded writing assessment when their child’s teacher provides regular assessments of their writing ability?
- Rather than encouraging creative writing, the state writing assessment encourages teachers to train students to write formulaically.
- These assessments are time consuming – at 120-plus minutes per year across grades 4-10, a student sits through at least 14 hours of standardized writing assessments that serve no purpose in the state accountability system and provide no real feedback to the student.
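For anyone wondering what “training an automated scoring engine” on human-scored samples means in practice, here is a minimal, purely illustrative sketch in Python. The essays, scores, and model below are invented for illustration – this is not the FDOE/DRC/Cambium engine, which is far more sophisticated and not public – but it shows the basic pattern: humans score a sample of responses, a model learns from that sample, and the machine then scores everything else.

```python
# Toy illustration of "hybrid scoring" (NOT the actual FDOE/Cambium engine):
# train a simple model on human-scored essays, then let it score new responses.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

# Hypothetical training data: essays already scored by human scorers (0-10 scale).
human_scored_essays = [
    "The author uses vivid details to support the central idea of the passage...",
    "my favrite part was the dog it was funny",
    "In conclusion, the evidence clearly demonstrates the claim because...",
]
human_scores = [8, 3, 9]

# Turn essays into word-frequency features and fit a regression model on them.
vectorizer = TfidfVectorizer()
X_train = vectorizer.fit_transform(human_scored_essays)
model = Ridge().fit(X_train, human_scores)

# The "engine" now scores a new, unseen student response automatically.
new_response = ["The text states that the main character changed because..."]
predicted_score = model.predict(vectorizer.transform(new_response))
print(f"Machine-predicted score: {predicted_score[0]:.1f}")
```

Whatever the real engine’s sophistication, the principle is the same: for the majority of Florida students, the “grader” reading their essay is a statistical model, not a person.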
Yesterday I learned one more reason to eliminate this time-waster… cheating with artificial intelligence.
According to a December 2024 presentation at the Florida Organization of Instructional Leaders (FOIL) conference, students are using ChatGPT to write their essays.
For kicks, I asked ChatGPT to provide a summary of the FOIL presentation (you can view the slide deck here):
The Florida Department of Education (FLDOE) has observed an increase in AI-assisted responses in statewide writing assessments. In spring 2024, both human scorers and automated systems identified unusual patterns in certain student submissions, suggesting the use of AI tools like ChatGPT. Indicators of such assistance include inconsistent writing quality within a response, language atypical for the student’s grade level, and content resembling AI-generated material. When automated scoring systems flag these anomalies, the responses undergo further human review. Confirmed cases are reported to the Test Administration team, which collaborates with districts to investigate, adhering to Rule 6A-10.042 of the Florida Administrative Code.
Students whose essays were flagged, in some instances, openly admitted to using ChatGPT during the assessment. If a teacher had assisted a student during a state assessment, they could be found guilty of a misdemeanor of the first degree and subject to up to a year in prison and a $1,000 fine. For students using ChatGPT, it appears their test scores were simply invalidated.
The FOIL presentation highlighted “Future Solutions” to combat AI-assisted essay submissions (again, I have asked ChatGPT to summarize, and a rough sketch of what the “automated detection” idea might look like follows the list):
Summary: AI-Assisted Responses – Future Solutions
To combat AI-assisted responses in statewide writing assessments, the Florida Department of Education (FDOE), in collaboration with DRC and Cambium, will implement several measures:
- AI-Generated Benchmarks: Before each test window, FDOE will create AI-generated essays using popular platforms (e.g., ChatGPT, Gemini, Bing, Claude) for comparison.
- Automated Detection: Cambium will enhance the automated scoring (AS) system to compare student responses against AI-generated content during and after testing.
- Human Review: Responses flagged by the AS system will be routed to expert human scorers for further evaluation.
- Exploring Additional Software: FDOE is investigating third-party tools designed to detect AI-assisted writing, though their reliability is still being assessed.
- Enhanced Test Security: Training for test administrators will be reinforced to prevent unauthorized device use during testing. The testing platform (TDS) already blocks internet access and copy/paste functions, meaning AI-assisted responses likely involve external device use.
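To make the “automated detection” idea concrete, here is a minimal sketch of what comparing student responses against AI-generated benchmark essays could look like. To be clear, the essays, the similarity method, and the 0.8 threshold below are all invented for illustration; Cambium’s actual detection system has not been published.

```python
# Rough sketch (assumptions, not Cambium's actual system): compare student
# responses to AI-generated benchmark essays and flag suspiciously similar ones.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical benchmark essays generated before the test window by popular
# AI platforms (ChatGPT, Gemini, etc.) responding to the same prompt.
ai_benchmark_essays = [
    "Technology has fundamentally transformed the way students learn today...",
    "In today's rapidly evolving world, education plays a pivotal role...",
]

student_responses = [
    "i think school computers are good becuse we can look stuff up",
    "In today's rapidly evolving world, education plays a pivotal role in our lives...",
]

# Vectorize everything together so the comparison uses a shared vocabulary.
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(ai_benchmark_essays + student_responses)
ai_vectors = matrix[: len(ai_benchmark_essays)]
student_vectors = matrix[len(ai_benchmark_essays):]

# Flag any response that looks too much like the AI benchmarks.
FLAG_THRESHOLD = 0.8  # arbitrary cutoff chosen for this sketch
similarities = cosine_similarity(student_vectors, ai_vectors).max(axis=1)
for i, similarity in enumerate(similarities):
    status = "FLAG for human review" if similarity >= FLAG_THRESHOLD else "ok"
    print(f"Response {i + 1}: max similarity to AI benchmarks = {similarity:.2f} -> {status}")
```

Per the presentation, responses flagged this way would then be routed to expert human scorers – yet another layer of machinery and cost layered onto an assessment that serves no accountability purpose.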
I have ONE more solution to suggest: ELIMINATE STATE WRITING ASSESSMENTS ALTOGETHER! Seriously, why are we jumping through so many hoops here? How much does this cost taxpayers? I recommend we have teachers assess their students’ writing abilities – say, through writing assignments completed in the classroom with pencil and paper. Let students write about subjects THEY are interested in rather than generic prompts. Give students real feedback, teach them how to edit and revise their work, and give them a reason not to use AI – a reason to actually express their own ideas through writing.
This year, the grade 4-10 Writing assessment will be given March 31–April 11, 2025. It starts on Monday. Parents should be questioning why their child’s school day is being disrupted by this assessment.