Title: Experiencing School Mathematics: Traditional and Reform Approaches to Teaching and Their Impact on Student Learning

Author(s): Jo Boaler

Publication type: book

Online: no, but see the author's website for related articles, especially 'Open and Closed Mathematics'. Also see a very short article by Boaler in Education Week.

Jo Boaler spent three years following students at two UK schools with radically different approaches to mathematics education. The school she calls 'Phoenix Park' uses discovery learning in small groups (a 'reform' approach); the school she calls 'Amber Hill' uses a traditional approach, with lectures by the teacher and students working individually on textbook exercises. She later replicated this study in California with schools she calls 'Greendale', 'Hilltop' and 'Railside'.

The important question, of course, is which approach produces the better results. A nationally normed test (NFER) was administered to the students at the beginning of the study, and a national examination (GCSE) at the end. The GCSE is a mandatory national examination and is important for university entrance. The distribution of standardized NFER scores (national mean 100, standard deviation 15) for the two schools is given below; percentages are used for easy comparison between the schools (160 students were tested at Amber Hill, 109 at Phoenix Park):

School | 73 to 82 | 82 to 91 | 91 to 100 | 100 to 109 | 109 to 118 | 118+
AH | 25 | 25 | 25 | 16 | 8 | 2
PP | 17 | 35 | 25 | 17 | 6 | 2
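As a rough check, one can estimate each school's mean standardized NFER score from the binned percentages above. This is only a sketch under stated assumptions: each bin is represented by its midpoint, with 122.5 as a nominal midpoint for the open-ended 118+ bin, and the percentages are normalized because rounding keeps them from summing to exactly 100.

```python
# Estimate mean standardized NFER scores from the binned percentages.
# Bin midpoints are an assumption; 122.5 is a nominal midpoint for "118+".
midpoints = [77.5, 86.5, 95.5, 104.5, 113.5, 122.5]
amber_hill = [25, 25, 25, 16, 8, 2]     # percentages per bin
phoenix_park = [17, 35, 25, 17, 6, 2]

def binned_mean(percentages, midpoints):
    total = sum(percentages)  # not exactly 100 because of rounding
    return sum(p * m for p, m in zip(percentages, midpoints)) / total

print(round(binned_mean(amber_hill, midpoints), 1))    # 92.2
print(round(binned_mean(phoenix_park, midpoints), 1))  # 92.5
```

Both estimates come out around 92, well below the national mean of 100, and within a fraction of a point of each other, consistent with the observation that the intake of the two schools was both similar and below average.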

Performance seems to be about the same at both schools. We can do the same for the GCSE scores. The British grading system is a bit odd; it will suffice to know the following: A* is the highest grade, G is the lowest, and U, X and Y are different kinds of failing grades. Crucial for university entrance is a grade in the A*-C range. Percentages are again given; at Amber Hill 182 students took the examination, at Phoenix Park 108.

School | A* | A | B | C | D | E | F | G | U,X,Y
AH | 0 | 0.5 | 2.2 | 10.9 | 13.7 | 22.0 | 20.3 | 14.3 | 15.9
PP | 1.0 | 1.9 | 1.0 | 8.3 | 12.0 | 25.9 | 25.0 | 18.5 | 7.4

These scores, too, are very similar. A notable difference is that rather a lot of students at Amber Hill fail outright (15.9 percent versus 7.4), whereas more students at Phoenix Park get the very low grades E, F and G. Boaler sees this as a point in Phoenix Park's favor. A possible explanation (which Boaler does not give) is that the GCSE is actually not one exam but three: a higher exam (grades A*-C), an intermediate exam (grades B-E) and a basic exam (grades D-G). It is, for example, not possible to obtain a D on the higher exam: the only possibilities are A*, A, B, C or fail. Boaler unfortunately does not indicate what percentage of the students at each school took which exam, but it is perfectly conceivable that at Amber Hill many students aimed higher than they could achieve and failed. Note that at least a C is essential for further education, so taking the basic exam is virtually useless. The figures show that at Phoenix Park at least 43.5 percent of the students (the Fs and Gs) nonetheless took this exam, thereby giving up their chance at higher education without even trying.
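The tier arithmetic above is easy to verify from the table. A minimal sketch: since F and G can only be awarded on the basic exam, the F+G share is a lower bound on basic-tier entries at each school.

```python
# GCSE grade percentages per school, taken from the table above.
grades = ["A*", "A", "B", "C", "D", "E", "F", "G", "UXY"]
amber_hill   = [0.0, 0.5, 2.2, 10.9, 13.7, 22.0, 20.3, 14.3, 15.9]
phoenix_park = [1.0, 1.9, 1.0, 8.3, 12.0, 25.9, 25.0, 18.5, 7.4]

def share(pcts, wanted):
    """Total percentage of students in the given set of grades."""
    return round(sum(p for g, p in zip(grades, pcts) if g in wanted), 1)

# F and G can only come from the basic exam, so these are lower
# bounds on the fraction of each cohort entered for that tier.
print(share(phoenix_park, {"F", "G"}))  # 43.5
print(share(amber_hill, {"F", "G"}))    # 34.6

# Outright failures (U, X, Y):
print(share(amber_hill, {"UXY"}))       # 15.9
print(share(phoenix_park, {"UXY"}))     # 7.4
```

This reproduces the 43.5 percent figure for Phoenix Park, and shows that even at Amber Hill at least a third of the cohort must have taken the basic exam.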

Boaler picked Phoenix Park first and then picked Amber Hill as a comparison school, choosing it because it used a content-based approach to mathematics and had a student body almost identical to Phoenix Park's. These conditions are probably not unique to Amber Hill, so there is some room for biased selection here (from her writings it is quite obvious that Boaler prefers the Phoenix Park approach to mathematics education). What we can do instead is compare Phoenix Park with the nation rather than with the school Boaler picked. Since the GCSE is a national examination, the national mathematics results for 1995 (the year the students in Boaler's study took the GCSE) can be found on the web. They are as follows, with the Phoenix Park scores beneath them for easy comparison:

Cohort | A* | A | B | C | D | E | F | G | U,X,Y
UK | 1.9 | 6.5 | 13.4 | 23.1 | 17.1 | 16.1 | 12.8 | 6.7 | 2.4
PP | 1.0 | 1.9 | 1.0 | 8.3 | 12.0 | 25.9 | 25.0 | 18.5 | 7.4

Of course this is an unfair comparison, since the NFER test showed that the students at Phoenix Park were below the national average when they entered the school. We can compensate for this by using the NFER scores. I must do this rather crudely, since I do not have the scores for individual students, so a proper statistical analysis is out of the question.

What I have done is the following (the results are in the accompanying picture). I plot the cumulative GCSE percentages for Phoenix Park against those for the UK: one point (Phoenix Park, UK) per grade. To make the picture easier to read, I plot not (for example) the percentage of students with a C, but the percentage with a C or lower. This gives the purple crosses, which I connect by straight line segments. The purple line lying below the blue diagonal means that Phoenix Park did worse than the national average on the GCSE.

We can do the same for the NFER scores. Since the national scores are not known here, I plotted them against a normal distribution (according to NFER, the standardized national scores are close to normally distributed). This gives the black circles, connected by the black line. The black line also lies below the blue diagonal, indicating that Phoenix Park scored below the national average on the NFER as well.

The interesting thing is that the purple (GCSE) line lies below the black (NFER) line, except at the very top scores. This indicates that, relative to the nation, the students at Phoenix Park did worse on the GCSE than on the NFER. So Phoenix Park seems not to have done its students much good. The same is of course true for Amber Hill, which performed very similarly. I also looked on the internet at typical average GCSE scores of schools; Phoenix Park and Amber Hill appear to be just about the schools with the worst GCSE scores in the UK. I cannot help but think that Amber Hill was specifically chosen for this fact.
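The construction behind the picture can be sketched numerically (a sketch, not the original figure: the NFER bin edges and grade ordering are taken from the tables above, and the national NFER distribution is approximated by a normal curve with mean 100 and standard deviation 15). Each point pairs Phoenix Park's cumulative percentage with the national one; a point below the diagonal means Phoenix Park has more mass at the low end.

```python
import math

def normal_cdf(x, mean=100.0, sd=15.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

# NFER: Phoenix Park cumulative percentage at each upper bin edge,
# paired with the national percentage from the normal approximation.
# The open-ended 118+ bin (2%) has no finite edge and is omitted.
pp_nfer = [17, 35, 25, 17, 6]
edges = [82, 91, 100, 109, 118]
nfer_points, cum = [], 0.0
for pct, edge in zip(pp_nfer, edges):
    cum += pct
    nfer_points.append((cum, 100.0 * normal_cdf(edge)))

# GCSE: cumulative "this grade or lower" for Phoenix Park vs the nation,
# ordered from the failing grades (U/X/Y) up to grade A.
pp_gcse = [7.4, 18.5, 25.0, 25.9, 12.0, 8.3, 1.0, 1.9]
uk_gcse = [2.4, 6.7, 12.8, 16.1, 17.1, 23.1, 13.4, 6.5]
gcse_points, pp_cum, uk_cum = [], 0.0, 0.0
for pp, uk in zip(pp_gcse, uk_gcse):
    pp_cum += pp
    uk_cum += uk
    gcse_points.append((pp_cum, uk_cum))

for x, y in nfer_points:
    print(f"NFER  school {x:5.1f}%  national {y:5.1f}%")
for x, y in gcse_points:
    print(f"GCSE  school {x:5.1f}%  national {y:5.1f}%")
```

In this sketch every point, for both tests, has the national cumulative percentage below the school's, i.e. both lines lie below the diagonal, matching the description of the picture.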

The following enlightening footnote appears in the article 'Open and Closed Mathematics' by Boaler:

When Phoenix Park first adopted a process-based approach, they were involved in a small-scale pilot of a new GCSE examination that assessed process as well as content. In 1994 the School Curriculum and Assessment Authority (SCAA) withdrew this examination, and the school was forced to enter students for a traditional, content-based examination. The proportion of students attaining grades A-C and A-G dropped from 32% and 97%, respectively, in 1993 to 12% and 84% in 1994. The school has now reintroduced textbook work in an attempt to raise examination performance.

Boaler doesn't say what Amber Hill's GCSE scores were at the moment she decided to include the school in her study, but there is no reason to believe they were markedly different from the Amber Hill scores given above. If so, Boaler seems to have been stacking the deck in favor of Phoenix Park and its discovery-learning approach to mathematics teaching. But this didn't quite pan out the way she probably wanted, because the SCAA withdrew the process-based examination.

In the Education Week article Boaler mentions something rather funny, if you know the facts. She writes:

On the national examination, three times as many students from the heterogeneous groups in the project school as those in the tracked groups in the textbook school attained the highest possible grade.

Taken literally, this proportion is even more impressive: it is infinite! The highest possible grade is A*, and one student at Phoenix Park got that grade versus none at Amber Hill. What Boaler presumably means is a grade of at least A; then she is right that the proportion is 3 to 1. These are, however, also the absolute numbers: 3 students at Phoenix Park versus 1 at Amber Hill. Talk about statistics with small numbers... She also writes in the Education Week article:

One of the results of these differences was that students at the second school--what I will call the project school, as opposed to the textbook school--attained significantly higher grades on the national exam.

But as we've seen, this is not exactly true. The percentage of students at Phoenix Park with an A*-C grade is actually slightly lower than at Amber Hill, and that is the grade range that counts: a GCSE 'pass' lower than a C is basically worthless. Boaler also doesn't mention that the GCSE grades at both schools are lower than one would expect given the NFER scores. She seems determined to interpret everything in favor of Phoenix Park.
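Both points can be checked from the tables above. A quick sketch, using the student totals stated earlier (182 at Amber Hill, 108 at Phoenix Park) to convert the top-grade percentages back into approximate head counts, and summing the A*-C shares:

```python
# GCSE grade percentages per school, from the tables above.
grades = ["A*", "A", "B", "C", "D", "E", "F", "G", "UXY"]
amber_hill   = [0.0, 0.5, 2.2, 10.9, 13.7, 22.0, 20.3, 14.3, 15.9]  # n = 182
phoenix_park = [1.0, 1.9, 1.0, 8.3, 12.0, 25.9, 25.0, 18.5, 7.4]    # n = 108

def pct(school, wanted):
    return sum(p for g, p in zip(grades, school) if g in wanted)

# Head counts behind "three times as many ... attained the highest grade":
print(round(pct(amber_hill, {"A*", "A"}) / 100 * 182))    # 1 student
print(round(pct(phoenix_park, {"A*", "A"}) / 100 * 108))  # 3 students

# The A*-C range that matters for university entrance:
print(round(pct(amber_hill, {"A*", "A", "B", "C"}), 1))    # 13.6
print(round(pct(phoenix_park, {"A*", "A", "B", "C"}), 1))  # 12.2
```

The percentages translate back into 3 students versus 1 for grades of at least A, while in the A*-C range Amber Hill (13.6 percent) edges out Phoenix Park (12.2 percent).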