23rd February 2021 by Timo Hannay
With schools in England due to reopen for face-to-face teaching on 8th March, some obvious questions arise: How much has pupils' learning fallen behind during the latest lockdown? How quickly will they catch up? And what disparities will this whole process leave in its wake?
Following our previous study of tests taken at the beginning of the 2020 autumn term, we're proud to be working again with RS Assessment from Hodder Education, this time to analyse scores from tests sat at the end of 2020, after most children had been back at school for a term (albeit with many ongoing disruptions). These results, derived from hundreds of thousands of tests, provide important evidence on the ability of pupils and schools to close attainment gaps relative to previous cohorts once in-person teaching is restored. They also highlight areas for interventions to close widening educational disparities.
We find cause for both optimism and concern:
For full details, see our joint report:
As before, we hope these insights will help educators, policymakers, parents and others to better understand the current state of children's learning, anticipate the effects of forthcoming school reopenings and design appropriate interventions to minimise disparities. We welcome your feedback and suggestions: [email protected]
12th February 2021 by Timo Hannay
What makes a good school? It is a deceptively simple question that too often elicits a simplistic response: an Ofsted rating, a league-table ranking, perhaps even a 'GCSE grades 9-4' score. This is not to say that none of these things are important, only that they hardly scratch the surface of the diverse ways in which schools can succeed (or not) in their educational and societal missions.
In a more normal year, this would be the season of school league tables – an annual appreciation of absolute attainment, which feeds the impression that top grades are the essence of a sound education. Grades are undeniably important: educational opportunities and future life chances can hinge critically on getting good exam scores. But as the sole measure of an effective school they are surely too narrow.
That's why we're proud to be working with the Guardian on a new pilot project, The Guardian Schools Guide (though the views expressed here are those of SchoolDash, not the Guardian). This new Guide aims to provide a more multifaceted, personalised approach to understanding school success. For now, it covers mainstream state secondary schools in England (because these are the schools for which the widest variety of data are readily available), but we plan to expand the coverage in due course.
Let us count the ways
On the basis that no single measure can hope to capture the different dimensions and inevitable tradeoffs of successful schooling, what should we use? We reflected on three essential characteristics of effective indicators, which need to be:
In addition, indicators shouldn't overlap too much. To put it another way, they ought to represent distinct attributes rather than measuring more or less the same thing in different ways. To some extent, we have tried to achieve this, but there are practical limitations. For one thing, almost every measure in education tends to correlate to some extent with almost every other measure. For another, we are influenced mainly by what people are keen to know, and sometimes these things are not completely distinct.
With these criteria in mind, we have begun with the following ten indicators:
1. Admissions: How readily are places available? We use school occupancy rates and the ratios of first-choice applications to offers as a way to gauge the level of competition for places. Of course, being oversubscribed is often a symptom of success, so it need not reflect badly on the school (quite the reverse). But most families considering a school would, all other things being equal, surely prefer one at which it is straightforward to secure a place to one at which the process is difficult or uncertain. Furthermore, as we'll see below, competition for places is not necessarily a reliable guide to quality.
2. Attainment: How good are the grades? This is our nod to traditional school league tables. We look at Attainment 8 and (where there's a sixth form) A-level average point scores. Of course, these don't take into account pupils' prior attainment, so arguably they tell you more about the school's intake than its educational effectiveness. But people care about attainment all the same and, as we acknowledged above, it has real consequences for post-school opportunities. So while imperfect, it surely belongs in any collection of top-level school indicators.
3. Attendance: How reliably do pupils show up at school? This is a somewhat controversial area. Traditionalists, including the present UK government, stress that pupils learn best at school. That's largely true, but there can be exceptions and some parents and campaigning groups argue for greater flexibility. We use a combination of overall absence rates together with the incidence of persistently absent pupils (those missing at least 10% of sessions); a sketch of how such measures might be blended appears after this list.
4. Destinations: Do pupils go on to sustained education or employment destinations after they leave the school? This aspect is arguably even more important than academic achievement and has received increasing attention in recent years. We analyse the proportions of pupils going to identified, sustained destinations at age 16 and (where there's a sixth form) age 18. A potential future elaboration would be to show the balance between academic and vocational destinations – though not in the form of a score, since neither is intrinsically better or worse.
5. Disadvantaged pupils: What are the outcomes for poorer children? Some otherwise strong schools underperform in this area, while others with mediocre overall performance excel. As a socially, politically and educationally important attribute, we feel it is worth highlighting. This uses Progress 8 along with the destinations metric (see above) for students eligible for the Pupil Premium. Where there is a sixth form, we also use A-level value-added scores and the post-18 destinations metric. Note that to protect personal privacy, these numbers are sometimes excluded from the publicly available data where a school has very low numbers of such pupils.
6. Environment: How safe and healthy is the neighbourhood of the school? While this is clearly not under the school's control, it is of interest to many families, particularly if they are considering relocating. We use data on local crime and the living environment (which includes air quality, road traffic accidents and housing). This is provided at the level of Lower Layer Super Output Areas (LSOAs), of which there are about 33,000 in England. We currently use the school's own LSOA based on its postcode, but it may be more informative to include other nearby areas too, especially if the school sits near a border, since sometimes neighbouring areas can have very different characteristics.
7. Finances: Does the school appear to be on a sustainable financial footing? We look in particular at annual revenue balances (in percentage rather than absolute terms). Schools with surpluses score higher than those with deficits. Arguably, schools with large surpluses ought to be penalised for not spending more readily on their current pupils. However, we see few such cases, at least in the state schools that currently make up the Guide. Furthermore, present surpluses seem to be a reasonable indicator of future ability to invest, which we consider a potentially important factor in choosing a school.
8. Progress: How much academic progress do pupils make while at the school? Unlike Attainment, this takes into account pupils' prior academic performance, so provides a better measure of school effectiveness. Even so, it's not perfect: there are systematic national biases by deprivation level, gender, ethnicity and other factors. We use a combination of Progress 8 and (where there's a sixth form) A-level value-added scores.
9. Representation: How reflective is the school of its local community? A few years ago, together with Ted Cantle of the iCoCo Foundation and another social-integration charity, The Challenge, we conducted an analysis of socioeconomic and ethnic segregation patterns in schools. As part of this, we developed a method to compare each school's pupil population with that of other schools nearby. Where there are disparities, these are not necessarily the 'fault' of the school: they could be the result of existing residential segregation or an inevitable consequence of skewed intakes at other nearby schools. Furthermore, for some very geographically isolated schools we omit this indicator altogether. However, we believe that it is potentially useful for those who consider social integration to be a desirable characteristic in schools.
10. Sixth form: If there's a sixth form, how well does it perform? We use a combination of A-level average point score, value-added score and post-18 destinations. These all appear as parts of other indicators, but we felt it useful to separate out sixth form performance, especially for those considering options at age 16.
(We also provide some staff indicators, but those will be the subject of a separate post.)
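By way of illustration, here is a minimal Python sketch of how two related measures might be blended into a single raw indicator value, using Attendance (item 3 above) as the example. The equal weighting and the scale are our illustrative assumptions, not the Guide's published formula:

```python
# An illustrative blend of the two attendance measures mentioned in
# item 3 above. The 50/50 weighting is an assumption for illustration,
# not the Guide's published method.

def attendance_raw(absence_rate_pct, persistent_absence_pct):
    """Combine overall absence with the share of persistently absent
    pupils; lower values indicate better attendance."""
    return 0.5 * absence_rate_pct + 0.5 * persistent_absence_pct

# Hypothetical school: 4.5% of sessions missed, 11% of pupils persistently absent
print(attendance_raw(4.5, 11.0))  # 7.75, later converted to a percentile
```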
Except for the Environment indicator – for which data come from the Office for National Statistics (ONS) and refer to 2019 – all of the above use data from the Department for Education (DfE) over a period of the three most recent years available (either 2017-19 or 2018-20, depending on the data in question). This helps to smooth out annual statistical fluctuations. Averages are weighted towards later years, which emphasises more recent performance and rewards positive trends.
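To illustrate how a recency-weighted average rewards positive trends, here is a small sketch; the weights shown are illustrative assumptions rather than the exact values used in the Guide:

```python
# A minimal sketch of a recency-weighted three-year average. The weights
# below are illustrative assumptions, not the values used in the Guide.

def weighted_average(yearly_scores, weights=(0.2, 0.3, 0.5)):
    """Average three annual scores (earliest first), with later years counting for more."""
    assert len(yearly_scores) == len(weights)
    return sum(s * w for s, w in zip(yearly_scores, weights)) / sum(weights)

# An improving school outscores a declining one with the same simple mean:
print(weighted_average([40, 50, 60]))  # 53.0
print(weighted_average([60, 50, 40]))  # 47.0
```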
In order to allow different indicators to be compared and combined (see below), we convert them into the common currency of percentiles. In other words, we put all of the schools in the Guide into an ordered list and assign a number from 0 to 100 based on how far up the list each school appears.
This has a couple of consequences that are important to appreciate. First, it's a relative measure, so if all schools improve by the same amount then none of their scores will change. To consider a particular indicator important, then, is not only to say that the attribute itself is significant, but also that there is a meaningful difference between the highest- and lowest-scoring schools. If you don't think that's true then you should probably ignore the indicator. Second, it's very unlikely that any given school will get universally high scores. So when we generate an average score across all indicators, most schools are likely to come out in the region of 50 (give or take), which by definition is the median for each indicator.
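For the curious, the percentile conversion itself is straightforward. The sketch below shows one way to do it; the Guide's exact handling of ties and missing data may differ, and the raw scores are made up:

```python
# One way to implement the percentile conversion described above.
# The Guide's exact handling of ties and missing data may differ.

def to_percentiles(values):
    """Map each raw value to a 0-100 score based on its rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    scores = [0.0] * len(values)
    for rank, i in enumerate(order):
        scores[i] = 100 * rank / (len(values) - 1)
    return scores

attainment = [48.2, 51.7, 44.9, 60.3, 55.0]  # hypothetical raw Attainment 8 scores
print(to_percentiles(attainment))            # [25.0, 50.0, 0.0, 100.0, 75.0]
```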
This time it's personal
However, we don't want to simply use an average of all ten indicators. No two people are ever likely to fully agree on what 'good' looks like in a school and the components above are no more than a list of potential ingredients that can be combined in various ways to suit different tastes. Some will care more about attendance than admissions; the relative importance of, say, the sixth form or the local environment will also be influenced by personal perspectives. For this reason, the Guide allows users to personalise the overall score by up-weighting, down-weighting or omitting each indicator as they see fit. (Those fond of traditional league tables can even omit everything except Attainment, though we obviously wouldn't recommend such a move.)
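Conceptually, the personalised overall score is just a weighted mean of percentile scores in which a zero weight drops an indicator entirely. Here is an illustrative sketch; both the school's scores and the user's weights are made up:

```python
# An illustrative sketch of the personalised overall score: a weighted
# mean of percentile scores, where a zero weight omits an indicator.
# The school's scores and the user's weights below are made up.

def personal_score(percentiles, weights):
    """Weighted mean of a school's percentile scores; indicators with
    zero (or missing) weight are ignored entirely."""
    used = {k: w for k, w in weights.items() if w > 0 and k in percentiles}
    return sum(percentiles[k] * w for k, w in used.items()) / sum(used.values())

school = {'Admissions': 35, 'Attainment': 80, 'Attendance': 60, 'Progress': 75}
# A user who cares most about Progress and not at all about Admissions:
weights = {'Admissions': 0, 'Attainment': 1, 'Attendance': 1, 'Progress': 3}
print(personal_score(school, weights))  # 73.0
```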
There's much to be said about the properties of each indicator and how they relate to one another, but for this introductory blog post we'll look at just a few top-level characteristics, starting with the way in which indicators combine to produce overall scores.
Figure 1 shows the distributions of aggregate scores as we add indicators from the list above. Applying only Indicator 1 (Admissions), we of course see 10% of schools in each 10-point bucket. But after we add Indicator 2 (Attainment), the distribution transforms into a bell curve, with more schools in the middle and fewer at the extremes. This is because Indicators 1 and 2 are showing different things (indeed, as we shall see below, they are somewhat negatively correlated), so schools are unlikely to score very high or very low in both of them. We can continue on this path by combining any number of indicators: five, eight or all the way up to ten. As we add each indicator, the distribution shifts depending on whether the new data correlate positively, negatively or not at all with the data already included.
(Use the menu below to explore the distributions created by different numbers of indicators. Naturally, the personalisation feature in the Guide itself permits any combination of indicators, not just the sequential ones explored here. It also allows different weights to be applied, so some indicators can have more influence than others.)
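For readers who like to check such things, the tendency towards a bell curve can be reproduced with a quick simulation. The sketch below averages independent uniform scores; real indicators are correlated, so the actual distributions in Figure 1 differ somewhat:

```python
# A quick simulation of the effect described above: averaging several
# independent, uniformly distributed percentile scores pushes schools
# towards the middle of the range.
import random
from collections import Counter

random.seed(42)  # for reproducibility

def bucket_counts(n_indicators, n_schools=10_000):
    """Count simulated schools falling in each 10-point bucket of the mean score."""
    buckets = Counter()
    for _ in range(n_schools):
        mean = sum(random.uniform(0, 100) for _ in range(n_indicators)) / n_indicators
        buckets[min(int(mean // 10), 9)] += 1  # a score of 100 joins the top bucket
    return [buckets[b] for b in range(10)]

print(bucket_counts(1))   # roughly flat: about 1,000 schools per bucket
print(bucket_counts(10))  # strongly peaked around the middle buckets
```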
Relationship studies
To show in more detail how the indicators relate to each other, at least statistically speaking, Figure 2 provides a correlation matrix. Positive correlations are shown in blue and negative ones in red, while the size and intensity of the dots correspond to their strength. (The blue diagonal line is simply a consequence of the fact that each indicator correlates perfectly with itself.)
The first thing to note is that many of the correlations are very weak, especially for the Environment, Finances and Representation indicators, which essentially stand on their own, independent of each other and all the other indicators. The strongest correlation is between Attainment and Progress (correlation coefficient 0.81), which makes sense: while the former doesn't take into account pupils' starting points, more progress tends to mean higher attainment. The indicators for Disadvantaged pupils and Sixth form also correlate with Attainment, which is to be expected because both of them incorporate certain measures of attainment. Beyond that, Attendance correlates with Attainment (0.71) and, more weakly, with Progress (0.61). Destinations follows a similar pattern, correlating with Attainment (0.62) more strongly than with Progress (0.47). But these are relatively weak relationships: insofar as they represent broad educational trends, plenty of schools seem to buck them.
(Hover over the dots in Figure 2 to see the corresponding correlation coefficients.)
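For anyone wishing to reproduce this kind of analysis, a matrix like the one in Figure 2 takes only a few lines of Python. The file name and column layout below are assumptions for illustration, not a published dataset:

```python
# A sketch of how a correlation matrix like Figure 2 can be computed.
# 'scores.csv' is assumed to hold one row per school and one percentile
# column per indicator; both the file and its layout are hypothetical.
import pandas as pd

scores = pd.read_csv('scores.csv')
indicators = ['Admissions', 'Attainment', 'Attendance', 'Destinations', 'Progress']
corr = scores[indicators].corr(method='pearson')
print(corr.round(2))  # the text above reports 0.81 for Attainment vs Progress
```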
To see this more clearly, consider Figure 3, below, which shows how individual schools are distributed with respect to the different indicators. Looking at Progress against Attendance for the 'National sample' (a random selection of about 20% of schools, just for ease of display), it's clear that these two factors do correlate, but the points are scattered far and wide, which means that at the level of an individual school it's not reliable to assume that good Attendance necessarily accompanies good Progress, or vice versa. Progress and Destinations correlate even less. It is partly to reveal such exceptions to these general rules of thumb that we created the Guide.
Admissions is a particularly interesting case in point because it correlates negatively with most of the other indicators, including Attainment and Progress. This also makes sense: successful schools are usually more competitive to enter. Indeed, the correlation coefficients for Admissions give some idea of what attributes currently make for a popular school. (Short answer: mostly attainment, which may be in part a legacy of traditional league tables.)
That might make our choice of an Admissions indicator seem perverse since a low score is arguably just a symptom of success. Nevertheless, we believe it worth including, for two reasons. First, a competitive admissions process is still a barrier to many (a bit like the high price of a popular consumer good) and should be recognised as such. Second, the correlations with other indicators are generally weak, which means there are plenty of schools with impressive indicators at which it's also relatively easy to secure a place. We believe that these are worth highlighting.
(Use the menus below to explore other correlations and hover over the dots to see corresponding values. You can also click on the dots in the correlation matrix in Figure 2 above to view the corresponding plot in Figure 3 below.)
Figure 3 also allows schools in the different regions of England to be viewed separately. For example, we can see that Attainment and Environment don't correlate nationally, but Environment scores in London tend to be much lower than those in the neighbouring South East. We hope to examine such geographical trends further in future blog posts. In the meantime, you can explore regional differences using the legend above Figure 3: click on a region to turn it on or off; double-click to show a region on its own.
Test pilot
Data geeks though we are, we would never recommend choosing a school based on numbers alone. The indicators offered in The Guardian Schools Guide should be no more than a starting point for further discussions and (when possible) school visits. But we hope they offer a richer, more nuanced starting point than has previously been available.
As already mentioned, the Guide is currently a pilot project that will evolve in response to feedback and suggestions – from families, educators, policymakers and anyone else who's interested. So as ever, we welcome your thoughts to [email protected]. In the meantime, stay safe.