Nobody Knows How Well Homework Works
13th April 2024
Most studies cited by both sides use “time spent doing homework” as the independent variable, then correlate it with test scores or grades. If students who spend more time on homework get better test scores, they conclude that homework works; otherwise, that it doesn’t.
One minor complaint about this methodology is that we don’t really know if anyone is reporting time spent on homework accurately. Cooper cites some studies showing that student-reported time-spent-on-homework correlates with test scores at a respectable r = 0.25. But in the same sample, parent-reported time-spent-on-homework correlates at close to zero. Cooper speculates that the students’ estimates are better than the parents’, and I think this makes sense – it’s easier to reduce a correlation by adding noise than to increase it – but in the end we don’t know. According to a Washington Post article, students in two very similar datasets reported very different amounts of time spent on homework – maybe because of how the question was asked? I don’t know, self-report from schoolchildren seems fraught.
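You can see the noise-attenuation point in a quick simulation. This is a toy sketch with made-up numbers, not Cooper’s data: scores depend on true time spent, and the only difference between the “student” and “parent” reports is how much noise I add to them.

```python
import random

random.seed(0)


def pearson(xs, ys):
    """Plain Pearson correlation, no dependencies."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


n = 10_000
true_time = [random.gauss(0, 1) for _ in range(n)]
# Scores depend (weakly) on true time spent.
score = [t + random.gauss(0, 2) for t in true_time]

# Two reports of the SAME quantity, differing only in noise level
# (the noise levels are my assumption, purely for illustration).
student_report = [t + random.gauss(0, 0.5) for t in true_time]
parent_report = [t + random.gauss(0, 5) for t in true_time]

print(pearson(student_report, score))  # close to the true correlation
print(pearson(parent_report, score))   # dragged toward zero by noise
```

The noisier report correlates much more weakly with scores even though both measure the same underlying thing – which is why a near-zero parent-reported correlation is at least consistent with parents being worse estimators, rather than homework doing nothing.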
But this is the least of our problems. This methodology assumes that time spent on homework is a safe proxy for amount of homework. It isn’t. Students may spend less time on homework because they’re smart, find it easy, and can finish it very quickly. Or they might spend more time on homework because they love learning and care about the subject matter a lot. Or they might spend more time because they’re second-generation Asian immigrants with taskmaster parents who insist on it being perfect. Or they might spend less time because they’re in some kind of horrible living environment not conducive to sitting at a desk quietly. All of these make “time spent doing homework” a poor proxy for “amount of homework the teacher assigned”, in a way that directly confounds a homework–test-score correlation.
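The first of those stories – smart students finishing quickly – is enough to flip the sign of the correlation. Here’s a toy simulation (my own numbers, not from any study) in which homework genuinely helps, yet “time spent” still correlates negatively with scores, because ability drives time down and scores up:

```python
import random

random.seed(1)


def pearson(xs, ys):
    """Plain Pearson correlation, no dependencies."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


n = 10_000
times, scores = [], []
for _ in range(n):
    ability = random.gauss(0, 1)
    # Able students finish the same assignment faster...
    time_spent = 2 - ability + random.gauss(0, 0.5)
    # ...and homework has a real POSITIVE effect (+0.3 per unit time),
    # but ability affects scores more strongly.
    score = 0.3 * time_spent + 2 * ability + random.gauss(0, 1)
    times.append(time_spent)
    scores.append(score)

print(pearson(times, scores))  # negative, despite homework helping
```

The effect coefficients here are arbitrary; the point is only that a naive time–score correlation can come out negative even when the causal effect of homework is positive, so the correlation on its own tells you very little.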