ON school rankings confirm skill development

To all Ontario taxpayers — which of course includes parents — the school rankings that came out yesterday are a good thing. They not only indicate how some schools are struggling, they also show how some are gradually improving year over year across a five-year period. And, of course, they identify those schools that are doing exceptionally well.

So the rankings not only give parents and educators a snapshot of what is going on across the publicly funded system, they also give parents a choice. For example, some boards of education have open boundaries, and IF — and that is a big IF — there is room in a highly ranked school, children from other neighbourhoods can attend. Similarly, parents who are about to relocate have an idea where they might want to move.

Of course, for the same reasons, many within the education system itself don’t like the rankings. First and foremost, the criticism is that all teachers end up doing is “teaching to the test.” Well, what is wrong with that if the students are all learning the same knowledge and skills?

Another criticism is that the testing is arbitrary and can seem unfair when two schools in completely different communities are compared with one another. For example, one school community might be full speed ahead with block scheduling and the balanced school day approach to curriculum, with literacy and numeracy tasks taking up half the day and plenty of parent volunteers on hand. In another school community, meanwhile, the teaching staff may be doing their best with a high population of ESL and special needs students, and with few parent volunteers because the majority of parents work outside the home.

Nevertheless, while the rankings have their limitations, as Moira MacDonald writes in today’s Toronto Sun, they are really the only comparison and accountability tools taxpayers have. So, if for no other reason than that, they are a good thing.

However, let’s not lose track of the rationale behind the testing in the first place — that students learn, and are tested on, the SAME literacy and numeracy skills they all MUST have, no matter where they attend school.

That is the bottom line.

[…]

2 thoughts on “ON school rankings confirm skill development”

  1. The C.D. Howe Institute is doing a similar report card, but in its study it does look at schools in a socio-economic context.

  2. The C.D. Howe Institute studies (the first done in Ontario last year, with another just completed in BC) are a big improvement on the Fraser Institute rankings. There is little value in ranking schools against each other without taking into account the socio-economic background of the student body, given that research has shown again and again that SES greatly affects student achievement.

    An equally important strength of the C.D. Howe study (at least the most recent one) is that the author looked into the impacts and culture around ranking based on achievement, and found strong evidence that certain schools were manipulating the student body writing the test, excluding students who might drag their Fraser Institute standing down. This sort of behaviour has also been uncovered in other studies of the effects of ranking and high-stakes testing in the USA.

    This is an entirely perverse effect of the ranking, and it undermines the real value of BC’s FSA — the value is not in the ranking itself, but in developing good, accurate and reliable measures to be used responsibly and with full recognition of the complexity of the situation. The Fraser Institute rankings cast all that complexity aside, ignore any possible negative effects of their work, and damage the validity of the FSA and EQAO results.

