Will Georgia districts hold teachers accountable for student scores on new tests?

Teachers are telling me student performance on this year’s Georgia Milestones tests and Student Learning Objectives will be factored into evaluations next year.

That surprised me as I understood the state was treating this inaugural administration of the brand new tests as a sort of practice round.

So, I sent the state Department of Education some reader comments, including this one:

     The way the evaluations system works, the teacher evaluations for this academic year are based on this year’s observations and last year’s scores. Teachers are being told that this is still true and next year’s teacher evaluations will be based on this year’s test scores and observations that will take place next year. I’ve heard from teachers in three different metro counties, who have been told as recently as last week that the student growth models that make up roughly half of their teacher evaluation will not be changed. Is there any chance that you can get some clarification on that? Several teachers I know are interested to know.

DOE spokesman Matt Cardoza responded:

  The department has applied for a waiver to delay the use of this year’s growth data being used in next year’s evaluation for high stakes decisions. We have asked for a waiver for the use of the 2014-15 growth in the evaluation system. We expect we will get that. However, growth will still be calculated, but it will not be used for the teacher evaluation.

I responded that I was hearing from teachers that the student scores would be used in their 2015-2016 evaluations. Can systems do that anyway? Cardoza responded:

Because of local control, districts could always use evaluation results as they choose. By GaDOE asking for the waiver and with the PSC certification waiver, we are communicating to districts that we recommend that growth be used as information, not for high-stakes decisions.

Reader Comments
OldPhysicsTeacher

I just saw this..."The University of Georgia saw a 28% drop in teacher-prep enrollment just last year!"  Keep it up, Republicans; keep it up and you'll be teaching your kids yourself... <snark>home-schooled kids do *so* well in college</snark>.  You guys wanted a scapegoat for your failures to govern adequately at even the local level, poorly at the state level and God-awful at the national level, and now you're starting to reap what you sow.

MaryElizabethSings

@OldPhysicsTeacher


We probably will see more home schooling emerge as fewer certified teachers enter the teaching profession.  Look for those home-schooled families to ask for, and receive, taxpayer dollars as they venture forth with the education of their children at Sea World.

Teach2Learn

So is it fair to judge a teacher on test scores when students take less than 15 minutes to take a 60-70 minute test? One student spent 3-4 minutes on each section. Many students do not take the test seriously. Teachers work all year to teach the standards, teach important skills, then see their students simply bubble answers without even reading the questions. Heartbreaking.

OldPhysicsTeacher

Hahahaha.  Let's see how many college students go into teaching next year... and the year after, etc.  You get the government (and teachers) you deserve. Hahahaha. 

sandsage

FWIW, Savannah's director of testing told our PTA a few months ago that this year's scores would be used to evaluate teachers, and that this was possible because last year's CRCT included questions aligned with the Common Core/GMAS testing that would allow it. In other words, they prepared for this at least a year in advance by baselining the test with questions inserted into last year's CRCT.

jerryeads

dg417s - BRAVO!  Not many folks understand that the CRCT and, I presume, the Millstones, are not developed and scaled to measure growth across years. The department 'faked' it with what some of us with measurement and statistics training consider egregious misuse of analysis to generate change data, but unless the tests were developed to reflect a curriculum with proper scope and sequence it would be difficult to develop a properly scaled set of tests that would fairly measure student growth. Even with decent instrumentation, the research is starkly clear that it's incredibly difficult to accurately infer teacher quality.

In an earlier blog, several folks wondered what consequences "opting out" would have. Those folks might enjoy a recent Washington Post Answer Sheet piece reporting that in some districts, for some tests, WELL over half the students refused to take the New York Common Core tests:

http://www.washingtonpost.com/blogs/answer-sheet/wp/2015/05/03/why-the-movement-to-opt-out-of-common-core-tests-is-a-big-deal/

Falcaints

The EOCs or Milestones don't count for the students this year, yet school districts can, if they want, measure teachers by those scores.  We won't even have the results back until next fall.

dg417s

Here's the problem that I see with Milestones in general versus using this test to measure student growth as part of teacher evaluations. I feel comfortable sharing this because you can download the examiner manual straight from the DOE website. The Georgia Milestones is designed first and foremost to measure student achievement in the tested courses - not student growth. These are two very different things. When you look at how teachers are measured for their evaluation, however, it is by student growth. The test, according to the state, isn't designed to measure that.

Independent ED

@dg417s The worst part about trying to measure growth using these tests in many subject areas, and the CRCT had the same issue, is that you can't measure "growth" when you're talking about two different courses.  4th grade science and social studies are nothing like 5th grade science and social studies.  Aside from the normal issues of using standardized tests in this manner, performance in 4th grade has nothing to do with performance in 5th grade. It would be like saying you could measure growth starting from an Algebra I EOC and moving to a Statistics EOC.  But, as usual, somewhere, someone cooks up some goofy formula to calculate a number to label students and teachers with.


And don't even get me started on the SLOs.  Those are the most ridiculous tests I've ever seen, and how our state will calculate anything from those is beyond me.  Why don't we just throw numbered ping pong balls in the air if we want to calculate data using the numbers produced by those.  

Wascatlady

@Independent ED @dg417s My statistics prof used, as an example of a crazy way to decide things, a person's car tag number.  Yeah, you can see if there is a correlation with a person's IQ, or ability to teach, or likelihood of committing a crime, but what would it mean?


This is THAT KIND of STUPID.

JaneInCobb

All occupations have their malcontents but this blog specializes in making them seem greater in number than they actually are.

May the rest enjoy their summer vacation!

Wascatlady

@JaneInCobb Taking classes, working on lesson plans, "volunteering" for committees whose work will be ignored...

HappyTeacher

Our assessment (the SLO) was given 5 weeks before the end of the school year, and included content that we had yet to cover. In addition to that, the scores did not matter to the students' average, which removed A LOT of motivation on the part of the student. I truly hope that the scores are not used against me as I feel that they do not accurately reflect their learning. From what I understand, many school systems are doing the same thing. I'm all for holding teachers accountable for student test scores but the deck shouldn't be stacked against us. Honest evaluation is valuable, what is taking place this year is not. It's an easy fix. Let's see if they do it...

Wascatlady

@HappyTeacher I don't agree. Evaluation will never be "honest," no matter how good the test, until students have CONSIDERABLE skin in the game.


I've spent decades watching students who could do well "finish" in 10 minutes.

heyteacher

@Wascatlady @HappyTeacher The same could be said for the so-called student evaluations. Mine were fine, but my students "finished" them in 2.5 seconds flat, which means they didn't really read the questions. I don't think either is very valid, and neither says anything about my teaching ability.

HappyTeacher

@Wascatlady That was what I was saying about the deck being stacked against us. When they made the test not impact the students' grades, it took away their motivation. It won't be fair until that is fixed.

MiltonMan

Teachers complaining about being evaluated are the same ones who evaluate students by tests????

redweather

@MiltonMan They are not complaining about being evaluated, they're complaining about how they're being evaluated.  If students were evaluated based on one test all year, you can be sure there'd be a hue and cry.

Finan

While it's clear to any reader that you will remain opposed to so-called high-stakes testing and its use in teacher evaluations, Maureen, the world has long since moved on.

Even with the claimed "bugs," the test provides valid information about learning, which should factor into current teacher evaluations.

MiltonMan

@Wascatlady


Hopefully, if you were a jockey, you would not be riding such a horse, but we are talking about you, so that is indeed asking a lot.

Wascatlady

@Finan If I am a jockey, should I be evaluated on riding a broken down, wooden-legged horse?

redweather

@Finan And some people were convinced that this country had "moved on" from racism with the election of Barack Obama.  In my experience, the people who are quickest to "move on" are the ones who really don't want to address a problem or issue.  So much easier to simply put it behind them. 

MD3

@Finan Good grief Eduktr... how many alternate screen names are you going to invent?!? As has been documented repeatedly, the information gleaned from these types of standardized tests is very limited, and is not valuable for improving student outcomes. It is a money grab for Pearson and the other publishers, plain and simple. You have been an outspoken cheerleader for those seeking to make money off my child and others. But your volume doesn't make you right. Just makes you loud.

EdJohnson

Be careful with the norm-referenced ITBS.  Being norm-referenced, the tests are designed and continually redesigned to maintain a bell curve and to keep approximately 50% of kids above average and approximately 50% below average.  Alfie Kohn once explained the matter during his lecture at Georgia State University School of Music some years ago (paraphrasing): The ITBS tends to include test items big-house kids are likely to get right and other kids are likely to get wrong, and tends to exclude test items big-house kids are likely to get wrong and other kids are likely to get right.  See the problem?

As for the SAT, well, watch this 26-minute video from last week’s Network for Public Education (NPE) Conference…

https://vimeo.com/126615242

OriginalProf

@booful98 @Wascatlady @EdJohnson 

I thought that "big house kids" referred to those living in what are called "McMansions," or huge pretentious houses built on small lots.

Semantics!  Funny to read all the possible readings here.

class80olddog

I agree that the only test that is needed is the ITBS - given on the last day of the school year - and carefully proctored by outside people (to minimize cheating).


But you have to have some sort of standardized testing - you cannot rely on teachers' grades.  (See the example of the young woman who had a 3.6 GPA but could not pass the GHSGT.)  Grade inflation is rampant.

MaryElizabethSings

@class80olddog


I agree except for the fact that it is not a good idea to give any standardized test on the last day of the school year because the students clearly do not have their minds on academic advancement at that time.  And, if memory serves me well, I believe the ITBS took several days to complete, not one day.

redweather

Three observations:


1.  Teacher complaints about testing will be shouted down by those convinced our public schools are "failing."  


2.  Big Data and big textbook publishers are winning.  


3.  It won't be long before most classes are "taught" by computer, with teacher sitting somewhere in the room merely monitoring student "progress."      


It's a brave new world.

ScienceTeacher671

AJCkrtk, the SLOs in most districts have all the problems you cite for the APC tests, but they are still being used in most, if not all, districts, by state mandate.

BKendall

Maureen, would you consider asking GADOE to explain in detail how the CRCT, EOCT, and the new Georgia Milestones measure student academic growth for students in each teacher's classroom?

I cannot find evidence to prove these standardized tests measure academic growth from the first day of school in the Fall to the date of the standardized assessment in the spring.

I would really like to see their evidence of how this all works.

MaryElizabethSings

@BKendall


This is how the individual academic measurement of each student's progress or regression could be accomplished yearly. Simply use the ITBS, given once a year, following the steps below for all students (steps I used as an ILT, 30 years ago, in charting the continuous academic development of certain students).


(1) List all of the present students of a given teacher by name on one, long data sheet.


(2) Place last year's end-of-year (April) standardized scores for each of the teacher's students beside each student's name on that data sheet.  Consider that score a pretest for that student for the current year (August).


(3) Test those particular students on the same standardized test (or a different version of it) near the end of the present school year (April), considering that test result to be the student's post-test score for the school year.


(4) Place those post-test scores for each of the teacher's students beside each student's pretest score on the same data sheet. Then determine how many months and years the student either progressed or regressed.  Chart the difference next to the student's post-test score.


(5) Check the IQ scores, discipline records, and social workers' or counselors' comments about family problems for the current year for all students who did not make sufficient growth for the school year.  Make notes about each targeted student in a "Comments" section placed beside the student's net gain or loss number, in both reading and mathematics, and any other curriculum area tested. (Separate data sheets should be used for each student, for each curriculum area tested and analyzed.)


(6)  All the above information can be charted on one sheet of paper so that the net gains and losses for the students of a particular teacher can be visualized together.  


(7)  Repeat the same procedural testing cycle for the next year's teachers and students, continuing to use the previous year's end of school year's standardized test results (given in April) for the pretest scores for the new students who begin the new school year in August.
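The procedure above amounts to a simple pretest/posttest gain calculation per student. As a minimal sketch of that arithmetic, here is a hypothetical Python version; all student names, grade-equivalent scores, and the one-year expected-gain threshold are invented for illustration, not taken from any actual data sheet:

```python
# Sketch of the gain-charting steps above: pair each student's prior-April
# score (pretest) with the current-April score (posttest), compute the gain
# in grade-equivalent years, and flag students who fall short of the
# expected one year of growth.  All data here is hypothetical.

def chart_growth(pretest, posttest, expected_gain=1.0):
    """Return one data-sheet row per student: (name, pre, post, gain, flag)."""
    rows = []
    for name, pre in pretest.items():
        post = posttest.get(name)
        if post is None:
            continue  # student transferred out or was not retested
        gain = round(post - pre, 1)
        flag = "review" if gain < expected_gain else ""
        rows.append((name, pre, post, gain, flag))
    return rows

# One teacher's reading data sheet (scores are grade equivalents, e.g. 4.2
# means fourth grade, second month)
pretest = {"Avery": 4.2, "Blake": 3.8, "Casey": 4.5}
posttest = {"Avery": 5.3, "Blake": 4.1, "Casey": 5.5}
for row in chart_growth(pretest, posttest):
    print(row)
```

A student like "Blake," gaining only 0.3 of a year, would be flagged for the step-(5) review of IQ scores, discipline records, and counselor comments.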



popacorn

@MaryElizabethSings

Nice recipe, but IQ scores? From where? By whom? LOL. Can of worms. No way, won't work. Not PC in this century, sorry. Any other ideas?

MaryElizabethSings

@popacorn 


As I recall, the ITBS contains one section for a group IQ test, given, of course, to the individual students who are taking the ITBS.  Those individual IQ scores were reported on the student's overall record for the ITBS, along with his/her reading, math, science, and social studies scores.  In my day, those scores were pasted into the student's permanent school records in the main office, where I had retrieved them for each student for whom I needed to chart an academic analysis over several years, as an ILT for a decade in the DCSS. 

popacorn

@MaryElizabethSings

No time to look now, but I would be shocked if ITBS purported to accurately determine IQ. This is a whole 'nother battery of tests. Sounds very fishy, and again, would never work. PC. Any teachers routinely see 'IQ scores' of students? Ever remember when true IQ tests are given? Watch your memories!

MaryElizabethSings

@popacorn 


The longer "true" IQ tests were always given in schools through the SST, along with the group IQ test administered on the ITBS.

Wascatlady

@MaryElizabethSings @popacorn I gave the ITBS for many years, and recall no such IQ section.  This was in the elementary grades.


As far as IQ testing, isn't that done to all fourth graders to look for kids who might be gifted?  I know a large group takes it.

MaryElizabethSings

@Wascatlady


As I had stated, the group IQ may have been on the Iowa Test.  During my long career I worked with students' standardized test scores as recorded in the permanent records in the school's vault or counselor's office.  I know for certain that a group IQ score was recorded in students' records, along with reading, math, etc., standardized test scores.  I am not absolutely sure which standardized test it was within.  Or perhaps, since those days of 30 years ago for me, the ITBS simply stopped giving group IQ tests to students.

MaryElizabethSings

@Wascatlady


The long individual IQ test is given as a part of SST to any student who has special needs (gifted, learning disabled, etc.) and who has been referred.  It is, of course, a more valid test than the group standardized IQ test, but the group IQ score was a quick starting point for ascertaining a student's potential, especially if the same group IQ scores had shown up for several years for a given child who was not keeping up with his or her peers. Another quick way of determining potential was to administer the Listening Test of the Nelson Reading Test to students individually.

popacorn

@MaryElizabethSings @Wascatlady  So, to sum up, no one knows anything about IQ scores and public school testing. I would think it would be relevant to any conversation here, but it is avoided for some reason. And especially you Mary, if there is no IQ testing, and no real IQ scores to see anywhere, your entire philosophy/ideas and on and on crumbles like a cookie, and every word posted to this point becomes irrelevant. Sorry. 

popacorn

@MaryElizabethSings @popacorn I know it hurts. Delusion awakened by reality often does. Can you enlighten us on where and when and what type of IQ tests are given in public schools?

Wascatlady

@MaryElizabethSings @Wascatlady My time overlaps yours 1973-2014.  I think there may have been an IQ portion with some other test, but not the ITBS.  


And yes, the group IQ tests, as given for screening for admission to gifted programs, are a pretty gross evaluation method.


As a teacher, when we had a student who had been years behind for YEARS, the RTI process is nuts.  There is no time to waste doing 3-week plans followed by 3-week plans, ad nauseam.  This child needs help!  Let's find out what we have going on, starting with health and our measure of ability.  If all is normal, THEN go on to those 3-week plans!

MaryElizabethSings

@teachermom4 


Exactly right, teachermom.  Thank you, very much, for filling in the details of what I knew I had practiced when I was an Instructional Lead Teacher in DCSS from 1975 - 1984, grades 1 - 7. I was, also, the SST Chair in that same elementary/middle school from 1981 - 1984.


From 1984 - 1998, I was the SST Chair in a south DeKalb County high school where I functioned as the Reading Dept. Chair for the entire school, from 1984 - 2000, when I retired.


I was innovative in how I used those CogAT group IQ scores for students who were falling behind.  I had created an Academic Developmental Form (on one page) which charted a student's progress from 1 - 7 years in school, using reading level advancement, math level advancement, and IQ scores taken from the CogAT in order to help me better pinpoint the heart of a student's problems.  I also talked with parents in conference to chart the developmental history on that same one page form from birth to 1st year in school.

MaryElizabethSings

@popacorn 


I would hate to live inside your mind.  Sad.  IQ tests are still being given.  Read with greater depth and comprehension.  Or, at the very least, stop the petty put-downs which are adolescent. Grow up.

MaryElizabethSings

@popacorn 


"CogAT" is an acronym which stands for "Cognitive Aptitude Test," as I recall. Give it up, Popacorn, your silly, one-sided battle with me is juvenile.  Again, grow up.

popacorn

@MaryElizabethSings

One-sided yes, silly no. Again, the CogAT is not an IQ test. You are revealing volumes about your true grasp of educational issues. As far as your endless saying of  'As I recall..', I wouldn't. You don't remember much and you don't know what you're talking about. 

teachermom4

@popacorn @MaryElizabethSings @teachermom4 It is a screening test that places children in percentiles based on their cognitive abilities as measured by the test. The result numbers given are comparable to IQ numbers. Average is 85-115. Gifted kids typically must score in the high 120s-high 130s.  It is not knowledge based; it is ability based. There are more thorough tests out there, but this one provides a great starting point.

popacorn

@Wascatlady @teachermom4 @MaryElizabethSings

As refreshing as it is to see the girls club rally round a fellow member in distress, I would suggest you all learn the difference between a 'screening test' and a legitimate IQ test. They are apples and oranges. Surely they mentioned this in Teacher School!? Continuing to call it an IQ test further demonstrates ignorance. 

teachermom4

@MaryElizabethSings @popacorn The CogAT test is a separate section on the ITBS form. It gives an IQ type of measurement in verbal, quantitative, and non-verbal abilities. When I taught in DeKalb it was given in 1st, 3rd, 5th, and 8th grades. In my current district it is given in similar fashion. The scores are used to determine gifted screening, but in the past they were also used to red flag possible LD issues (a large discrepancy between one section and another might indicate a learning disability that should be investigated, but that was before the whole RTI process started). The scores are determined by birthdate rather than grade level to account for the fact that kids will have at least a one year spread across the grade level.

MaryElizabethSings

@teachermom4 


Thanks once again for your detailed knowledge, teachermom.  The CogAT served my purposes in analyzing all the factors involved in determining the reasons a given student might have academic problems.  It certainly added to the overall understanding of a student's mental processes.  (As a doctor analyzes body/mind processes.)


You are truly sharp, teachermom.  Much appreciated, as the years have gone by: I have not done what I described to readers for 15 years, and it has been 30 years since I was an ILT charged with analyzing the academic growth of all 700 students in my school, instead of teaching students directly.  Well done by you!

MaryElizabethSings

@popacorn 


Sorry, popacorn, but for you to perceive that I am in "distress" is truly comical.  You need to bone up on your knowledge and stop running your jaw so much.

popacorn

@MaryElizabethSings

You have never given an IQ test and you don't know what one is. Yet you babble on about them. Wow! What little credibility you had has morphed into caricature. Pity your poor, distressed students. 

MaryElizabethSings

@popacorn 


Wrong again, Popacorn.  You are batting 0 today.  In graduate school, I took a graduate level course just on the administration of IQ tests.  In order to pass that course, I had to administer the long form of an IQ test to two different people I knew who would agree (no names given) and turn in my analysis to the professor for his validation of my work.   I "aced" that course.  That should really upset you! ;-)


Also, as the SST Chair, I interpreted the results of the long form IQ tests given by the psychologist on our team to certain classroom teachers, with discretion, to help in their understanding of how to reach certain students.


Oh, what you do not know about me is enormous.  But keep running your mouth.

popacorn

@MaryElizabethSings

If you remember correctly, that is.

She doth protest too much, me thinks. 

I'm over this one. Have a nice trip? See you next fall!

Wascatlady

@MaryElizabethSings @popacorn I think, instead of the months-long RTI process, students encountering significant difficulty should be immediately tested for ability (IQ) and then learning problems (after vision and hearing tests).  If none are found, THEN proceed with RTI.


MaryElizabeth, like you, I was SST lead for years at the elementary level.

taylor48

@popacorn @Wascatlady @MaryElizabethSings In my system, students take the CogAT, which is a cognitive abilities test.  While it's not a true IQ test, it tests students' "readiness for school," and the scores can be correlated somewhat to IQ (i.e., a student who scores in the 70s on the CogAT will, most likely, have a low IQ score).  Since it's given in a group setting, it's not as accurate as one given one on one, but it does give us a good idea of who might be gifted and who might need some additional intervention.  We also give the ITBS, which is an achievement test.


Easy way to remember - CogAT tests what they have innately, ITBS tests what they've learned.