I find myself searching for, losing, and re-finding particular data releases all the time. To save myself a bit of time, I’m compiling a list of significant releases of statistical information about education. To start with these are all governmental, but I’ll add others where they come with a decent reputation. I’ll try to keep this up to date, as I use these myself, and I hope they are of use to you. If you find any decayed links, please let me know. Continue reading
The short answer is I can’t tell you, but to become a Pupil Premium reviewer, you have to pass a quality-assurance process. School leaders who wish to become pupil premium reviewers must provide evidence of having improved the achievement of disadvantaged pupils in schools they have led or supported closely ( https://www.gov.uk/guidance/pupil-premium-reviews-a-guide-for-nles#who-can-apply ). Examples of the evidence can include: Continue reading
The performance of vulnerable groups in schools has been a concern for some time, but particularly since the introduction of the pupil premium. Gaps in attainment and progress between disadvantaged pupils and others even form part of the inspection framework, so in my school we’ve focused a lot on closing these gaps. Now is the time of year when we are looking forward to what we want our Year 11 and Year 10 students to achieve, so I’ve been looking at the national picture of attainment and progress gaps. How does your school measure up? Continue reading
Secondary school leaders would like to think that they are judged on the difference they make, not on the raw outcomes of their school irrespective of context. The whole language of the most recent framework is about progress made, taking into account pupils’ and schools’ various starting points. While national average attainment still has to be considered, there is a general sense that Ofsted inspectors try to take starting points into account when judging a school.
But are they succeeding? For example, do those schools which have low prior-attainment intakes have that taken into account properly? Do those with very able intakes get the appropriate challenge from Ofsted? Continue reading
Sometimes even I think statistics aren’t that helpful, particularly if they are reported thoughtlessly. Last Friday afternoon brought a prompt report from the Telegraph about some education statistics released that day. The report title “Worst towns for GCSEs named in new league tables” wasn’t that accurate, but it sounded sexier than “Neighbourhood statistics in England: academic year 2012 to 2013”. In fact, Continue reading
What the data says about selective schools and Ofsted
A few weeks ago, Ian Widdows replied to my Swarm blogpost asking if the Swarm could highlight Secondary Modern schools as a group, and Grammar Schools as a group. Ian is part of an association of Secondary Modern schools, and he feels there is some bias in Ofsted judgements against Secondary Modern schools. At the time, the Swarm didn’t include data on admission policies. I can’t include all the DfE data or the Swarm would be slow, but I have now added admission policy. Both Secondary Moderns and Grammars appear as a “virtual LA” in the LA dropdown list.
Here is a picture of the Secondary Moderns: Continue reading
Of course, like most data geeks, I love the performance tables. In 1994, the publication of data about school performance was a revolution, and like most revolutions it was greeted with horror or satisfaction depending on your prejudices. In the decades since, there has been much debate about the performance table data, but usually with more heat than light being shed. We’ve been through Continue reading