A recently published research paper, ‘A holistic, multi-level analysis identifying the impact of classroom design on pupils’ learning’, written by Professor Peter Barrett and his team at the University of Salford, has created quite a stir in mainstream media and received lots of attention. Among other outlets, it featured prominently in the Guardian, on Dezeen, Wired Science and GOOD, and was talked about a lot on Twitter, too.
The paper claims that various built environment factors have a significant impact on the learning progress of a pupil. Comparing the worst and best designed classrooms in their sample of 34 classrooms across seven UK schools, the authors find differences in learning progression of 25% on average. In short: well-designed classrooms can increase a pupil’s learning progression by 25%.
This coincides with a heated debate between architects and the UK government over school design. The Department for Education published so-called ‘Baseline Designs’ in October 2012, specifying in quite some detail what a cost-effective standard school is supposed to look like, including sample layouts for a 420-place primary and a 1,200-pupil secondary school. The guidelines stipulate rectilinear forms, a specified number of floors, basic layout rules (e.g. classrooms on one side of circulation space), ideal corridor widths and ceiling heights, as well as countless other standards. The Royal Institute of British Architects has condemned this ‘flat-pack approach’ as inflexible, too restrictive and depriving. Well-known architects have chipped in, too – for instance Richard Rogers, who criticised the government for “making it unnecessarily difficult to design good schools” (as quoted in the Guardian).
While the Salford research could shed new light on this debate, Secretary of State for Education Michael Gove counters the findings, claiming that “there is no convincing evidence that spending enormous sums of money on school buildings leads to increased attainment. An excellent curriculum, great leadership and inspirational teaching are the keys to driving up standards.”
So what is going on here? Does the government simply not get it, or is it wilfully ignoring state-of-the-art research (that it has co-funded, by the way…)? Or do architects stubbornly claim that there is intrinsic value to their work without offering proof?
This leads me to the question of how solid and conclusive the research findings from Barrett and his team are. In the following, I will first summarise the research approach and methods (many will not have had a chance to read the full paper, since it is behind a paywall…) and then provide a personal viewpoint on the quality and conclusiveness of the research.
The paper is based on data from Blackpool Council on seven primary schools. In total, 34 classrooms and a population of 751 pupils were studied using elaborate statistical analysis and modelling. It was hypothesized that environmental factors have an impact on pupils’ progress in learning. In essence, the paper aimed at explaining variance in learning progress (the outcome or dependent variable) by environmental factors (input, independent or explanatory variables) while controlling for intervening factors such as age, gender and previous performance of pupils (also input variables).
Accordingly, three sets of variables were used: 1) the outcome variable, i.e. the learning progress of pupils, measured by teacher assessment (TA) levels as commonly used by the UK Government, where teachers assess pupils’ attainment and their transition between key stages; 2) the first set of input variables, i.e. pupil-based control factors such as age, gender and starting level of attainment; 3) the second set of input variables, i.e. environmental factors of classroom design. The paper identified 37 built environment factors for each classroom (for instance heating control, opening size of windows, size and shape of room, quality of chairs and desks, corridor usage, etc.) and grouped them into ten distinct design parameters: light, sound, temperature, air quality, choice, flexibility, connection, complexity, colour, texture. These environmental factors were judged qualitatively by the research team on a five-point scale for each classroom.
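To make the modelling approach concrete, here is a minimal sketch of how such a regression could be set up. This is a plain ordinary-least-squares fit on synthetic data – not the authors’ actual multilevel model – using only two of the ten design parameters and invented effect sizes, purely to illustrate the structure of outcome, control and explanatory variables:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 751  # pupils, matching the study's sample size

# Pupil-level control variables (synthetic, for illustration only)
prior = rng.normal(0, 1, n)   # standardised start-of-year attainment
age = rng.uniform(4, 11, n)   # age in years

# Classroom-level design parameters, rated 1-5 (two of the ten, for brevity)
light = rng.integers(1, 6, n).astype(float)
choice = rng.integers(1, 6, n).astype(float)

# Simulate learning progress with assumed effect sizes (invented values)
progress = (0.6 * prior + 0.02 * age + 0.10 * light + 0.08 * choice
            + rng.normal(0, 0.3, n))

# Ordinary least squares: progress ~ controls + design parameters
X = np.column_stack([np.ones(n), prior, age, light, choice])
coefs, *_ = np.linalg.lstsq(X, progress, rcond=None)
print(dict(zip(["intercept", "prior", "age", "light", "choice"],
               coefs.round(2))))
```

With enough data, the fit recovers the assumed effect sizes; the real study additionally nests pupils within classrooms and schools (a multilevel model), which plain OLS does not capture.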
In summary, the study found that six environmental factors had a significant effect on the learning progression of pupils: colour, choice, complexity, flexibility and light had a positive effect on attainment, while connection had a negative effect. The authors concede that the mechanisms behind connection are not fully explained or understood by their study and that this needs further research. Of the control variables, previous attainment as well as age-weighted previous attainment both had a significant effect on learning progression at the pupil level. The statistical model developed by the authors allows the level of impact of the environmental factors to be scaled by comparing the learning progress of an average pupil in the best and worst designed classrooms. This difference is 25%, which is why the authors conclude that environmental factors in classrooms account for 25% of the learning progression of pupils.
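The logic behind scaling the model into that headline figure can be shown with some back-of-the-envelope arithmetic. The coefficients and baseline below are invented purely so the illustration lands near 25% – they are not the paper’s estimates:

```python
# Invented coefficients for the six significant design parameters
# (chosen only to illustrate the scaling logic, NOT the paper's estimates)
coef = {"colour": 0.02, "choice": 0.02, "complexity": 0.02,
        "flexibility": 0.03, "light": 0.03, "connection": -0.01}

baseline = 2.0  # assumed average yearly progress for a mid-rated classroom

def predicted_progress(ratings):
    """Baseline progress plus each parameter's rating times its coefficient."""
    return baseline + sum(coef[k] * r for k, r in ratings.items())

# Best classroom: 5 on every positive factor, 1 on the negative one;
# worst classroom is the mirror image.
best = predicted_progress({k: 5 if c > 0 else 1 for k, c in coef.items()})
worst = predicted_progress({k: 1 if c > 0 else 5 for k, c in coef.items()})
print(f"best {best:.2f}, worst {worst:.2f}, "
      f"difference {(best - worst) / worst:.0%}")
```

The point is simply that the 25% is a model-based comparison of two extreme hypothetical classrooms, not a directly observed difference between two real ones.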
To me the paper appears well-founded and based on solid and rigorous research. The statistical analysis seems very elaborate (but then I’m no expert in statistical modelling). However, I have a few points of criticism based on the general methodology of the research.
These are the issues I find particularly problematic:
- The study does not control for the socioeconomic status of pupils or the catchment area of the schools. The schools studied seem to vary significantly if learning improvement is compared class by class and school by school (as shown by the authors in table 13). There appear to be significant differences in learning progression between the schools: schools 3 and 8 show lower-than-average improvements, schools 5 and 9 show higher-than-average improvements, while schools 4, 6 and 7 show average improvement (the authors don’t discuss this, by the way). This seems worth exploring. It could be hypothesized that pupils with a higher socioeconomic status progress more quickly, or that schools in well-off areas perform better. It would also be interesting to take management, leadership and pedagogic styles of teaching into account in future studies. The authors themselves highlight that their next steps may include analysis of school-level effects and teacher performance.
- Environmental factors were judged qualitatively by the research team rather than measured quantitatively. Quantitative measurement would be relatively easy in some cases: air quality and temperature, for instance, are obvious parameters that can be measured exactly. I find it slightly disturbing that all parameters were judged on the basis of a single visit to each classroom. It is also unclear whether school-related factors were treated equally across different classrooms of the same school. What is more, environmental factors vary day by day (imagine visiting a school on a bright and sunny day versus a gloomy and rainy day).
- The definition of some variables is controversial and no further explanation or evidence is given for the choices made. For instance, three factors are based on “carefully considered colours” (Barrett et al 2012: 5), where the study authors suggest that warmer colours are favourable for younger pupils while cooler colours are preferable later on. However, evidence on the impact of colours is far from clear. In a research paper reviewing the literature on known environmental impact factors in schools, Woolner et al (2007) conclude that the use of colours in the built environment and its effect is an “area of debate” presenting “equivocal evidence” (pp. 53/57). To give another example, the parameter connection is defined by four factors, among them ‘corridor usage’. Corridor usage is rated higher if “the corridor is not used for storage or breakout purposes”. I simply cannot see how this is a positive factor. Corridors are the main socialisation space of a school, where pupils interact among themselves away from the control of teachers, and as such are crucial social spaces. Using corridors for breakout purposes could be highly beneficial to pupils’ social experience, which might lead to better learning and peer instruction. In a comprehensive study of student interaction patterns, Celen Pasalar (2003) found that diverse activities take place in front of classrooms, that corridors have the highest occupancy rates, and that in schools with highly accessible corridors, students have more friends across different classes and grades. Another factor in the Barrett et al study is ‘safe and quick access’ to facilities through the corridors, and a classroom is rated higher if it is located near the main entrance or other specialist rooms such as the library or the cafe. Again, I can’t see how proximity to the cafe should per se have a positive effect.
And again, if proximity is hypothesized to be a potentially important factor, its effect could be measured exactly using methods like Space Syntax, which work with connections between rooms, connectivity, accessibility and proximity, thus taking spatial layout fully into account.
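As a rough illustration of what such a measure could look like, here is a sketch that computes a simple Space Syntax-style ‘mean depth’ over a made-up room adjacency graph. The layout and room names are hypothetical, and real Space Syntax analyses use far richer representations (axial maps, isovists) than this toy graph:

```python
from collections import deque

# Hypothetical school layout: rooms are linked if directly accessible
# (this adjacency is invented for illustration, not taken from any real plan)
layout = {
    "entrance": ["corridor"],
    "corridor": ["entrance", "class_a", "class_b", "library", "cafe"],
    "class_a":  ["corridor"],
    "class_b":  ["corridor"],
    "library":  ["corridor"],
    "cafe":     ["corridor"],
}

def mean_depth(graph, start):
    """Average number of steps from `start` to every other room (BFS).
    A basic integration-style measure: lower = better connected."""
    depth, queue = {start: 0}, deque([start])
    while queue:
        room = queue.popleft()
        for nxt in graph[room]:
            if nxt not in depth:
                depth[nxt] = depth[room] + 1
                queue.append(nxt)
    return sum(depth.values()) / (len(graph) - 1)

for room in layout:
    print(room, round(mean_depth(layout, room), 2))
```

In this toy layout the corridor is the most integrated space (mean depth 1.0), which is exactly why its role as a social and circulation space deserves measurement rather than a one-off qualitative rating.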
To conclude (a slightly lengthy post), I believe four aspects are worth noting. Firstly, Barrett et al have provided an interesting contribution to a growing base of evidence on school design and the impact of the built environment on learning, and their findings should be taken into account in the debate on school design. Secondly, a lot more research needs to be done in the field, and the various shortcomings highlighted above need to be addressed. Thirdly, it seems that the government is indeed taking a political position in ignoring important evidence where that suits its agenda. Last but not least, part of the problem lies in the inaccessibility of research results to the wider public. Not only is the original research paper pay-walled and thus not easily available for anyone to read and appreciate, but the way scientists communicate their research and the details of their studies does not lend itself to wider public engagement with the topic. The paper by Barrett et al is not an easy read (even for a built environment researcher), is full of jargon and lacks explanations of certain assumptions made. Summaries given by the press office of the University of Salford or the articles in the mainstream media do not go into the detail necessary to allow a critical appreciation of the importance of the original research. I don’t know how this could be solved, but closed-access publishing certainly won’t help one bit.