
Sunday 8th September 2013

ISAs ('Investigative Skills Assignments') are AQA's new (since 2009) method of assessing the 'practical' component of A-level biology. It is relatively common for me to have students who get grade As in all four of their theory papers for AQA A-level biology but are then dragged down to a grade B by poor (and often very poor) ISAs. If you are reading this there is a pretty good chance that you or your child will also have suffered from this phenomenon. It is very distressing for all concerned and is happening so often now that I feel the need to speak out in the forlorn hope that somebody, somewhere in a position of influence might be moved to do something about it.

To try to understand what is going on we need to have a little background to the AQA ISAs. I've been in this business for 24 years now, working with candidates for all of the UK examination boards, so I am uniquely placed to give a pretty thorough historical perspective.

Prior to 2009 the equivalent of AQA's ISAs was the 'assessed practicals'. These were done in the laboratory in the students' own time. As a result they were relatively 'easy' (and relatively easy to 'cheat' at) and hence raw marks tended to be extremely high.

To understand the relevance of this to today's ISAs you need to have a bit of background in understanding what 'UMS' marks are. This was the subject of a previous blog which I shall summarise here.

UMS ('Uniform Mark Scheme') marks are the marks that are reported to you by the examination board on your results sheet. Just to confuse everybody, for AQA biology units 1 and 4 they are given out of 100; for units 2 and 5 they are given out of 140 and for units 3 and 6 (ISAs) they are given out of 60 - a total of 600 UMS marks (300 for AS and 300 for A2). Other boards and other subjects may allocate UMS marks for each unit differently. Unless you ask for the paper to be sent back to you, you will never see the RAW marks that were used to calculate this UMS mark.

The UMS marks bear no direct relationship to the raw marks actually given for the paper. There is a conversion factor that varies from year to year and (to a certain extent) grade to grade.

In theory the conversion factor should take account of how well a candidate does with respect to all the other candidates in the cohort. Under the old 'normative' marking system used prior to 1987 the allocation of grades was totally transparent - the top 10% of all raw marks (irrespective of what they were) got a grade A; the next 15% got a grade B; the next 10% got a grade C and so forth. To get a grade 'A' you would need 80% UMS - ie 80/100 UMS (or 112/140 or 48/60, depending on unit), but the RAW marks needed to be given this UMS mark would vary from year to year as the difficulty of the examination varied and hence the conversion factor varied.

After 1987 the UK foolishly shifted to 'grade reference marking' whereby the examiners (bafflingly) simply decided 'on the basis of their professional experience' what raw mark was needed to get a given grade. So if, for instance, they decided that you needed 73 raw marks out of a 100 to get a grade 'A' on a given paper then that was converted to 80/100 UMS (or 112/140 or 48/60, depending on unit) - and so forth for the other grade boundaries. This has resulted in the famous 'grade inflation' we all read so much about (the subject of yet another blog) - prior to 1987 only 10% of candidates could get a grade A whereas nowadays it is nearer 25%.

It is possible to plot out histograms from the board's own figures that show the proportion of candidates achieving a given UMS mark. I did this routinely for AQA prior to the new AS/A2 system in 2009 and it soon became apparent that the histograms for the 'assessed practicals' were massively piled up at the top end of the scale compared to theory papers. The only possible interpretation of this was that candidates were (unsurprisingly) getting incredibly high marks on the assessed practicals as, to cut to the chase, they were all being helped and/or doing them repeatedly until they got these high marks.

This meant, back in the days that I ran assessed practicals, that in order for my candidates to get a high UMS mark (in comparison with all the other candidates in their cohort) they had to get an incredibly high RAW mark as a lot of other candidates were also getting very high raw marks - often 59/60 raw would be required to convert to the 80% UMS (48/60 on these units) needed to get a grade A. This also meant that for every raw mark lost on assessed practicals at the A-grade end of the scale the candidate lost up to nine UMS marks. I therefore made sure that I did not put any candidates in for AQA with less than 58/60 for their assessed practicals (when they were under my personal control), as to do otherwise was academic suicide in competition with others getting ludicrously high raw marks.
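For readers who like to see the arithmetic, the conversion described above can be sketched as a piecewise-linear interpolation between grade boundaries. The boundary figures below are hypothetical, chosen purely to mimic an inflated practical unit out of 60 raw and 60 UMS marks (A boundary at 58/60 raw = 48/60 UMS); they are not AQA's actual published boundaries.

```python
# Sketch of raw-mark -> UMS conversion by linear interpolation between
# grade boundaries. All boundary values here are HYPOTHETICAL examples,
# not AQA's real figures.

def raw_to_ums(raw, boundaries):
    """Convert a raw mark to a UMS mark.

    `boundaries` maps raw marks to UMS marks at known boundary points;
    marks in between are interpolated linearly.
    """
    points = sorted(boundaries.items())
    for (r1, u1), (r2, u2) in zip(points, points[1:]):
        if r1 <= raw <= r2:
            # Linear interpolation within this boundary band
            return round(u1 + (raw - r1) * (u2 - u1) / (r2 - r1))
    raise ValueError("raw mark outside boundary range")

# Hypothetical inflated practical unit (out of 60 raw, 60 UMS):
# B boundary 55 raw = 42 UMS, A boundary 58 raw = 48 UMS (80%).
practical_boundaries = {0: 0, 55: 42, 58: 48, 60: 60}

print(raw_to_ums(58, practical_boundaries))  # A boundary: 48 UMS
print(raw_to_ums(59, practical_boundaries))  # one raw mark above: 54 UMS
```

Note the gradient: above the A boundary each raw mark is worth 6 UMS out of 60 (the equivalent of roughly 10 on a 100-UMS scale), so dropping just one or two raw marks near the top costs a disproportionate number of UMS marks - exactly the squeeze described above.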

When the new ISAs (allegedly taken under controlled conditions) came in in 2009 we all breathed a (premature) sigh of relief as it should in theory now have been an 'unseen' paper, like a normal theory paper, and therefore should have had a very similar histogram to theory papers. Therefore there should have been no more of this ludicrous system of having to get ridiculously high raw marks to get a high UMS mark. (By the way, it used to be possible to get a grade A with a raw mark of 63 to 66% on most AQA theory papers, but this is now edging up to 80% in an attempt to curb the dreaded 'grade inflation'. Of course, to get a grade A in the 'assessed practicals' you needed to get close to 100% raw marks for the reasons discussed).

To my astonishment when I looked at the histogram for the new ISA (something which I only did for the first year admittedly - I need to repeat the exercise to be certain) the histogram was almost identical to that for the old 'assessed practicals' and NOT like the ones for theory papers! So you still needed to get ludicrously high raw marks in order to get high UMS marks.

There are only three possible explanations, to wit: 1) the examiners are making the ISAs artificially 'hard' in order to try to curb 'grade inflation' - regrettably they have extensive 'previous' for using adjustment of practical papers to fiddle their figures and it is pointless their denying this; 2) a lot of schools are essentially 'cheating' by preparing their students intensively for this examination; 3) students are swapping information extensively among themselves. As the ISA paper is released to teachers in September/October and can be taken at any time up to March, a lot of students take the paper early and can hence pass on what they remember to colleagues...a truly ludicrous situation.

I have heard a lot of anecdotal evidence that the last of these happens and I have also heard that some schools are essentially 'cheating' by giving their students a 'mock' the day before with just a few words changed. The Daily Telegraph reported one such case of a school cheating in an ISA a few weeks ago and I was about as astonished by it as I was by the 'revelation' that some priests are gay - it's one of those 'big secrets' that frankly isn't!

It may seem that there is nothing that can be done about much of this without resorting to 'cheating' yourself, but actually I believe that a savvy school that cares about its students and is prepared to go the extra mile can do a lot and still maintain its integrity. Although my sample size is small and therefore doubtless skewed, I can fairly confidently predict which of my students will get low ISAs depending on where they are at school, for example. Lazy and incompetent schools (of which there are several in the Oxford area) tend to give their students zero practice and training for ISAs, herd them into the ISA exam (where, predictably, they stuff it up) and then refuse their candidates permission to do the second ISA on the grounds that it is "not allowed" or (a common bleat) "the candidates tend to do as badly in the second attempt as the first". Actually they just can't be bothered and don't care.

In reality the second ISA IS allowed by the board (and the better of the two is put forward to make up the aggregate mark) and the candidates that I and my colleagues drill intensively for ISA retakes seem to do a lot better second time around. I therefore regard the idea that 'candidates seem to get the same in retakes' as an excuse for not re-doing ISAs (a common response from the less able or willing schools) as straight defeatism! If the candidate does badly in their first AS ISA they should be allowed to do the second one - and if both are poor then they should be allowed to retake in their A2 year. To do otherwise is idleness and negligence, in my opinion.

As well as being a private tutor I am also associated with a private college in Oxford which I helped found and am still a shareholder in. Although I deliberately DO NOT ever see the biology ISA papers in advance, to avoid accusations of collusion with my private candidates, I know that their students do well in their biology ISAs and that full integrity is maintained. The secret is very extensive drilling in the ISA past papers and doing all of the past papers (which we have going back to 2010 now) with the mark schemes. It is massively labour intensive (the college devotes a full-time three-day course to it), but ISAs account for 20% of the total marks and are therefore vital to success. My clear impression from these anecdotal observations is that the schools that do the most preparation get the best marks. Simples! Maybe if the schools stopped fannying around doing 'posters' and playing with plasticine models they would have more time to do this...

There is one further 'insider's tip' that some more sensible schools have finally latched onto to improve their grades in ISAs. In addition to the 'ISAs' (papers 3T and 6T), which are marked by teachers and sent off to the board for moderation, AQA also offers alternative 'EMPAs' (papers 3X and 6X) which are simply done and sent off to the board with no teacher input. These appear to be more secure than the ISAs - and the histograms that I have seen suggest that they are (therefore?) rather 'easier' to get high UMS marks in than the ISAs. This is presumably because the possibility of cheating and collusion at least is somewhat reduced.

Although the sample size is small, in the cases where I have recommended doing the EMPA rather than the ISA the outcome has been more favourable. One school in Oxford has recently shifted to EMPAs and my own candidate from there certainly seemed to do better in that than in his ISA - although he also drilled extensively in past papers. I also had a very modest online student in the north of the country who trod the same path and got a grade A in his EMPA after failure in an earlier ISA.

You would think that schools would embrace EMPAs as being less work (no marking needed) but most simply seem to be stuck in a 'groove' and do not even appear to know about the existence of EMPAs. Yet more negligence and stupidity.

In short, I believe that it IS possible for a school to improve its ISA results (or, even better, shift to EMPAs). The trick is simply a lot of preparation and a lot of teachers do not have the time or will to do this. The only way that parents are likely to see any improvement at the poorly-performing schools is to complain - and to complain hard. For those who are interested I hope that this blog will give valuable information to add to their case for improvement.

One obvious thing that schools who are failing in ISAs could do would be to approach some of the heads of biology at the schools who DO get good ISAs/EMPAs to see how they go about doing it...

There is really no excuse for this scandal to continue. The examination boards have a lot of awkward questions to answer with regard to their role in this fiasco, and schools that merely roll over meekly and accept this scandalous situation are also massively culpable, in my opinion. A couple of years ago I had a student (now at Oxford University) who achieved four A* grades in his A-levels - and got a grade 'D' in his A2-level biology ISA. I have had many others in similar (or worse) situations. To dismiss this as some sort of accident, freak result or student inability is frankly laughable. As always the examination boards are hiding behind the artificial complexity of a situation that is of their own creation and the schools are too gutless to challenge it. It is time this situation was addressed before more young lives are wrecked.