I read a post from Eduwonk this morning that got me all fired up. My dedicated readers, colleagues, and friends have seen this happen before when journalists don’t do their homework. A quick disclaimer: I am not, nor have I ever been, a journalist. I wish I had more friends who were journalists so they could yell at me and tell me why I am wrong about this. I become upset when paragraphs like this one are dropped into an article:
> Research into Teach for America’s effectiveness has been inconclusive, but at least three major studies in the past several years indicate that students taught by its teachers score significantly lower on standardized tests than do their peers. A small handful of other studies, and the organization’s own research, contradict that claim.
There are many things wrong with this paragraph, and Eduwonk makes several good points about the article as a whole. I am going to stick to this paragraph for now and then close with a comment about TFA.
- What is the difference between the number 3 (the major studies) and “a small handful”? I don’t know about you, but the latter sounds like more than 3.
- Obviously, we should not judge the strength of research on volume alone. I doubt the writer spent any time examining the studies themselves: their methodology, their authors, and, most importantly, their funding sources. In my opinion, that legwork is imperative for a journalist who wants to give readers a clear picture. As readers, we love hearing about studies and their praising or damning “conclusions”. I am guilty of this as well, but I have spent a fair amount of time investigating the so-called “evidence”. I would rather not have to, because I believe the journalist should be responsible for this, at least to some degree.
- I have not heard of these 3 studies with negative findings, and I pay a fair amount of attention to the education policy world. I remember one such study from 2005; its author has been a vocal critic of TFA for many years. Here is TFA’s response, along with a study that found different results. Note that Mathematica is an independent research group and that TFA did not pay for the study.
As you may have guessed by now, I am a fan of Teach for America. Its impact on hundreds of thousands of students is no doubt tremendous. Principals and administrators sing the praises of the organization and its corps members. I would argue that the research does not have to be overwhelmingly positive, since corps members are filling difficult-to-staff positions. Nevertheless, it has been mostly positive with regard to their impact on student achievement.
An argument frequently advanced by TFA’s critics is that only 30-40% of corps members stay on past the initial two-year commitment. This is true, but many stay in the world of education in some fashion. I have had the pleasure of working with many of these people, and their contribution to ABCTE has been significant.
There is no silver bullet for tackling the issues many of us see in K-12 education in the U.S. TFA will only scale so far, but its positive impacts are difficult to argue against. The writer of this article was probably trying to “show both sides of the issue”. I applaud the attempt, but I believe there is a responsibility to do so with much more depth.