The next item on our agenda—this morning’s substantive item of business—is an evidence session with the Scottish Qualifications Authority. I welcome Fiona Robertson, the chief executive of the SQA and Scotland’s chief examining officer; and Dr Gill Stewart, the SQA’s director of qualifications development. Thank you both for joining us today. We will begin with a short opening statement from Fiona Robertson. You have around two minutes.
Thank you, convener; I will try to keep my opening remarks brief. First of all, I thank the committee for the opportunity to appear today to reflect on national qualifications in 2023, to look ahead to next year and to discuss education reform.
On the results for 2023, I will begin by paying tribute to the 141,089 learners who received their SQA certificates on Tuesday 8 August this year. Those learners, including thousands in your constituencies and communities, can feel proud of their achievements across a wide range of national and vocational qualifications. In celebrating the remarkable resilience and commitment of learners, I also pay tribute to all the teachers and lecturers who have supported those learners and to our partners across the education and skills system who have helped to shape and agree our approach to awarding.
This year represents a further positive step on the path back to normal awarding, but the impact of pandemic disruption to learning and teaching continued to be felt. Recognising that, we put in place a wide-ranging package of support, including a sensitive approach to grading to help learners perform to the best of their abilities while maintaining the credibility of our qualifications.
For awarding in 2024, following extensive consultation with the education community, it was agreed that we will return to full course assessment for most courses in the 2023-24 session. Our wider approach to assessment will draw carefully upon the experience and evidence from this year and the views of learners and partners.
Finally, on education reform, SQA has engaged positively with the Scottish Government’s reform programme to replace SQA with a new qualifications body. We have also contributed to a range of reviews over the past few years, including, most recently, the independent review of qualifications and assessment and the review of the skills delivery landscape. As a number of interconnected reviews have now come to a conclusion, great care will be needed to ensure that any change and reform that follows is coherent, well understood, aligned and deliverable. The timeline for the replacement of SQA has changed, but we remain committed to delivering high-quality, credible qualifications that help learners to achieve their ambitions and deliver the skills for the future. I am happy to answer any questions.
Thank you. We now move to questions, and I will take convener’s privilege and ask the first. We have had a turbulent past few years, with the pandemic impacting on results. One key benchmark is to compare the 2023 results with those from 2019. At the national level, how comparable are those results?
I made it clear on results day, when we published information relating to the aggregate position for national courses, that, over the past few years, particularly given the pandemic and the years since, we have had to adopt a slightly different approach to awarding each year, in common with other awarding bodies across the United Kingdom. On that basis, we need to be cautious about drawing comparisons, particularly in relation to any judgments about educational performance. Given that caveat, it is fair to say that we have seen some recovery, and that, combined with a sensitive approach to grading, has resulted in a strong set of results for this year. Learners have done well and have shown great resilience. They have worked hard, and that is evident in the results that we have seen. The results sit somewhere between those in 2019 and those in 2022 overall.
Some of the attainment data seem to indicate that there has not been much noticeable improvement overall since 2017, with attainment at A to C marginally down at all levels compared with that year. We can also see that, in 2019, before the pandemic, attainment at A to C was starting to decline slightly. I want to see attainment across the board improving. I am concerned that, since 2017, before the pandemic, there has not been that overall improvement in attainment. It seems that attainment is levelling out or tapering downwards. What are your comments on that?
As I highlighted, we need to be cautious about drawing conclusions, given the changes to awarding over the past few years. At a more granular level, you see variability in results in individual subjects at individual levels for a variety of reasons. It is always important to look below the headline national 5, higher and advanced higher results and consider the issues. We are in the midst of publishing course reports for every subject at national 5, higher and advanced higher, and those provide more information and reflection on performance in individual subjects that can be helpful to practitioners and learners as they approach further learning and teaching this year.
I have to advise caution around comparisons, as there can be a variety of reasons for the variations that we see. As I said, the 2023 results are pretty strong overall, particularly given the challenges of the past few years, but we need to be cautious about drawing too many conclusions from the results.
I am sure that we will pick up on that thread more as the session goes on.
Good morning. Thank you for the information that you submitted in advance and for your opening statement.
I have just a quick point on attainment. How would you describe the attainment gap between 2016 and 2019?
Only since 2020 have we published an equalities impact analysis on results day. In the 2016 to 2019 period, we saw a persistent attainment gap. As with overall results, you can see some slight variability over time. When we published our equalities analysis this year, we saw some slight changes to the attainment gap in 2023 compared with 2022, although, again, I would caution against drawing too many conclusions from that. It is, however, an accepted fact that there is a persistent education attainment gap in Scotland. I know that the Government has made commitments to address it and to invest in addressing it.
I want to move on to the sensitive approach, as you described it, that was taken this year. Can you tell us the impact of that sensitive approach and how you know what it has been?
I set out in more detail what we sought to do in the methodology report, which I appended to the papers that I sent to the committee in advance of the session. The sensitive approach is a step-by-step approach that we take in individual awarding meetings for individual subjects. Of course, colleagues, including members of the committee, are interested in the aggregate position, but Gill Stewart and I, with another colleague, chaired a series of meetings this year with teachers who are senior markers and senior appointees to consider a range of evidence that would inform our grading decisions. That was the final stage of the approach that we have taken.
In my opening statement, I mentioned a package of support that included modifications to assessment, for example, which continued last year. Those modifications were made to a range of assessment instruments to free up time for learning and teaching. In our grading meetings, we considered a range of evidence, as we would normally do, about the performance of individual assessments: how the exam performed; how young people performed in the exam; and whether the exam and the assessment instruments performed as intended by the teachers who develop and design our qualifications and assessments. We also considered any impacts of the modifications, the removal of revision support—we had revision support in 2022—and, if necessary, whether any further adjustments might be required or appropriate as part of that sensitivity.
It is important to highlight that the sensitive grading was designed to benefit learners. It was designed to be more generous if needed and if the evidence supported it. It is difficult for me to say that the impact was X, because we held, I think, 129 grading meetings across national 5, higher and advanced higher, and we took individualised decisions on that basis. We published our grade boundary adjustments on results day, and I published the median adjustment of grade boundaries as part of my chief examiner’s report.
I appreciate that. I found the report helpful, including the detail on the sensitive approach, as you have set out. Are you able to set out what the results would have been had you not applied the sensitive approach?
No.
How do you know which parts of the sensitive approach helped and which were unhelpful? How do you know that they will not be needed in the following year?
What I have set out in summary and what the methodology report sets out is what we did.
I appreciate that.
For all the right reasons, we did not say, “This percentage is the sensitive approach”, because we took an evidence-based approach to grading and, with our senior appointees, considered whether adjustments were needed, bearing in mind the performance of assessments and the impact of modifications, the impact of the removal of revision support and any further considerations. Effectively, a holistic judgment is made on the basis of an individual subject. We have not quantified what that sensitive approach would be. However, as my chief examiner’s report highlights, in a normal year, the median grade boundary adjustment is zero. If you go back to 2019 and the period before, you will see that the median grade boundary adjustment was zero. Last year, in 2022, it was -4 at grade C and -2 at grade A. This year, it was around -2 at C and -1 at A. That highlights, in broad terms, the fact that, for a variety of reasons, we took steps beyond those that we would take in a normal year. That demonstrates, again in broad terms, the quantum of the adjustments that we made, which were larger than usual. We need to bear in mind that “minus” means that we lowered the boundary at grades C and A and, therefore, more learners achieved A to C.
09:45
Thank you. Willie Rennie, do you have a supplementary question?
Yes, it is just a quick one. Fiona Robertson, you said that we should not draw comparisons between now and the period before the pandemic, but that you are confident that things have got better. Is that not a contradiction?
No. It goes back to the discussions in June and early July that Gill Stewart and I were part of, whereby we saw evidence of strong performance and some improvements in performance. Actually, the fact that the grade boundary adjustments were less significant this year than last highlights evidence of some recovery. There is that balance between recognising that we are on a path to recovery and continuing to take a sensitive approach to grading in order to benefit learners.
I have to stress that, in every year since 2019, circumstances have meant that we have had to take a slightly different approach to awarding. That has led to different sets of results in different years. Therefore, we need to be cautious about saying that there is educational improvement overall or not, as the case may be. We have had to make some adjustments, but it is important to highlight that we saw some evidence of strong performance and, compared with 2022, a degree of recovery.
Liam Kerr wants to come in on this topic. We will jump ahead to him before we come back.
I am grateful, convener.
I would like to continue the line of questioning on grade boundaries and the sensitive approach. I appreciate that people who are watching will have heard you using terms such as “grade boundaries”, “-2” and so on. In your submission, you say that grade boundaries are “not pre-determined” and are “based on evidence”, and you have said that the median grade boundary adjustment is zero. Are you able to set out concisely what you mean by “grade boundaries”, what is being adjusted and what evidence you use to do that?
Yes, I will attempt to set that out clearly. I accept and recognise that it is a technical process—I hope that the committee appreciates that, too—but we are very conscious that there are learners behind all of this. We are very cognisant of that.
We have a published policy on grade boundaries, so there is transparency around it. Basically, the grade boundary relates to the marks at which you achieve a particular grade. A grade boundary is set for grade C, and we also set a grade boundary for A and for upper A.
Our assessments are set with notional grade boundaries in mind. By “notional”, we mean that, on average, you would expect to get a grade C if you got more than 50 per cent of the marks and a grade A if you got more than 70 per cent. That is the expectation, and the assessment is set with that in mind.
Judgments are made when setting assessments. My colleague Gill Stewart can talk about that in more detail, as she has many years of experience in that area. Assessments often perform as expected, but sometimes they do not, and there can be modest grade boundary adjustments to allow for that. In very simple terms, sometimes an assessment is easier than expected, and sometimes an assessment is harder than expected. We can make modest adjustments to account for that. At the heart of this is the idea that a grade A in one year will, broadly speaking, be the same as a grade A in another year, so we make adjustments to deal with that. That is what grade boundary adjustments are.
That is really helpful. I have a question about the adjustments. In your submission, you say that, when you looked at setting grade boundaries, the SQA took into account
“the legacy impact of the pandemic upon learning and teaching.”
Yes.
How did the SQA evaluate and quantify the impact of the pandemic before setting grade boundaries?
The “National Qualifications 2023 Awarding—Methodology report” sets out the staged approach to that in a bit more detail. The normal awarding approach is to look at a range of evidence on how assessments performed. In the past couple of years, with the slight adjustments to our grading, we have also looked at other evidence and made a judgment on whether we should make any further adjustments—ones that do not undermine the integrity of the qualification but which recognise that there have been impacts.
Gill Stewart might be able to exemplify that by giving a couple of examples relating to the sciences and languages. That might bring the approach to life a little bit—I understand that it might feel rather dry, but it is an important part of our process. We use an evidence-based process that seeks to ensure that, year on year, there is fairness in our approach. It is really important that I highlight that.
I will bring in Dr Stewart to give us some helpful live examples.
The best example that I can think of relates to modern languages. As you would expect, we assess the skills of reading, writing, talking and listening. We assess those skills separately, so we can see how learners have done in each of those assessments.
It is important to understand the breadth of evidence that we consider. We look at statistical information, which helps us to determine whether the assessment performed as intended, but we also ask all the markers—more than 8,000 teachers and lecturers are involved in marking—to complete a marking report when they have finished their marking. We asked them how they thought learners did relative to 2019 and 2022. We also use teacher estimates as part of our panoply of information, and the principal assessor and the depute principal assessor, who oversee the setting and marking of the assessments, look at all the markers’ reports. We use all that evidence to make judgments.
I turn to the modern languages example. First, we compared the performance in the reading, talking, listening and writing assessments with the performance in 2019. That is the original standard that we are trying to get back to when it comes to the skills that we expect our learners to have at national 5 or higher level. That told us where we were relative to our original standards, but we then compared performance this year across those four assessment areas with performance last year. That told us that there had been significant recovery in 2023 compared with 2022.
We used the comparison between 2019 and 2022 to see what the impact of the pandemic had been on learning, and we made a judgment about the previous year. However, this year, we were able to compare how learners had performed in reading, writing, listening and talking in 2023 with how they had performed last year. That told us that there had been another stepwise recovery in learners’ skills, but they were still a little bit off the 2019 standards. Does that help?
I understand.
In relation to languages, we saw the impact that the disruption to learning had particularly on listening and talking. Performance in those elements was weaker last year, and it was still slightly weaker this year, but there was a bit of an improvement on last year. Last year, we made a significant adjustment to address that, because we felt that that was the right thing to do. That is part of the generous grading. This year, we still made an adjustment, but it was not as significant as the one that we made last year. The methodology report sets out the stepwise approach, which is very much in line with—
That makes sense. I am grateful for—
Crucially, it consistently involves principal assessors and depute principal assessors, who are teachers, and we take account of their feedback when making our judgments.
I am very grateful. That makes it clear. It prompts one final question. Is it your intention to take the sensitive approach next year? If not, is that because you have concluded that there is no longer a legacy impact from the pandemic?
As I set out in my opening statement and in all the documentation that I published on 8 August, we have a balance to strike. It is not just us who need to consider that balance; other awarding bodies and regulators across the UK are considering and have considered it. We need to make sure that we recognise the challenges of the past few years and the impact of those on learning, teaching and learners. We need to balance that with the on-going credibility of the qualifications and with the path back to normality.
This year, England substantially returned to normal awarding, with some protections in place. There is an expectation that, this coming year, Wales and Northern Ireland will follow. We have not made a final decision on our grading approach for this year. However, as I outlined in my opening statement, we have made a decision on full course assessment for this year. Elements such as practical work in the sciences and assignments, which a range of our courses are designed to include, will be brought back.
We will engage and consult further on final awarding and grading approaches. I hope that we can substantially get back to normal. However, given that we have returned to full course assessment, we need to be mindful of the impact that that might have—as we would normally be with any changes to the assessment approach. I am very aware of the fact that some teachers have expressed concern about the return to full course assessment, particularly for learners who have gone through secondary 4 and secondary 5 without having done some of that work. We need to be very mindful of that in our grading approach and of where the evidence takes us on that.
Thank you very much.
Thank you, panel. I have a couple of quick questions—I do not think that they go very deep—on national 4 and advanced highers in local authority centres. Can the panel explain the rise in the number of entries at national 4 level?
10:00
Centres make decisions in the best interests of their learners, and that includes decisions on entries. It is quite difficult for me to give a granular explanation for any changes. Over the past couple of years, what we have certainly seen, compared with pre-pandemic levels, is an increase in dual entries at national 4 and national 5, and there has probably been a bit of a shift from national 3 to national 4. Similarly, we have seen a greater degree of stability at higher and advanced higher levels.
We have seen the number of national 5 entries go up, however, and a big part of that is the dual entries in national 4 and national 5. We have also seen a bit of a shift from national 3 to national 4. Modifications might form part of that story, because we removed the added value unit from the national 4 qualification. Again, that was done in response to the pandemic and the disruption to learning; it freed up time for learning and teaching. That could have had an impact on entry patterns and entry levels.
Entry decisions are, rightly, for centres to take in the best interests of their learners. The focus has fairly consistently been on national 5 qualifications, highers and advanced highers, and I am sure that that will continue to be so in this discussion, but we have also seen increases in our other awards, including personal development awards and skills for work, in which we are seeing quite healthy numbers.
We also offer a wide range of other qualifications, and we are seeing those coming through the sectors, including the schools sector. Those are vocational qualifications that include higher national certificates, as well as modern apprenticeships, which substantially include SQA qualifications. We are also seeing a bit of diversification in entry patterns. That should be seen as a positive thing, because it is recognition that there are choices for learners that suit their interests and pathways.
On the back of that, to what degree is the SQA seeking a return to the achievement patterns that existed before the pandemic? Is the SQA seeking to do that, or is it looking to see new patterns emerge?
There is no predetermination in our resulting. Centres will have a range of choices for the entry decisions that they might make, and, as I have highlighted to your colleagues, we take an evidence-based approach to resulting and the outcomes of our qualifications. The SQA’s responsibility is to make sure that we can judge the competence, skills, knowledge and understanding that are required to achieve a qualification. It is on that basis that we make our judgments. There is no predetermination around final outcomes.
Would the SQA be happy, at least initially, to return to the achievement patterns that were seen before the pandemic? I know that you are saying that you are not setting exact limits or rules, but were the patterns that existed before the pandemic good enough to want to go back to them?
My focus is not really on achieving a particular pattern of achievement. My job is to ensure that we are awarding on merit and on the evidence that is in front of us. I therefore do not have a predetermined view about that. Everyone who is involved in Scottish education wants improved outcomes and good pathways for young people—that goes without saying—but our role is clear: it is to ensure that we are making awards on the basis of demonstrated attainment and achievement. I hope that that answers your question.
It does. Thank you.
I will bring in Ross Greer for a supplementary question on this line of questioning.
Is it not the case that, ultimately, the issue is that grading is relative and that, owing to how the system operates, there can be only so many As, Bs and Cs each year? For example, if the number of A grades in the first instance looks to have increased significantly, that is interpreted as there being a question about the integrity of the data. Ultimately, the approach to grade boundaries sets a cap on the number of A grades that there can be each year.
No. In the external assessments that we set, we expect to see a degree of differentiation in performance. Questions are, in part, set with that in mind—for some subjects more than others. Maths is an obvious example: some questions are set as A-type questions, B-type questions and so on. As Gill Stewart highlighted, our markers will provide feedback on the performance of those assessments and how the particular assessment instruments have performed. If there was a cap or predetermination, we would see much less variability in our awarding.
On individual subjects, there are around 140 national 5, higher and advanced higher courses, some with very small numbers of entries. In fact, there can be quite high degrees of variability year on year because of the impact of a particular cohort. For subjects with larger numbers, there is obviously a spread and, therefore, perhaps a bit more stability in outcomes overall, but every year will see differences and changes. On the one hand, the assessments are absolutely set with a degree of differentiation in mind, but on the other hand we see variation in performance each year and we take a considered and evidence-based approach before we make a judgment about grading.
I do not want to overemphasise or, indeed, to underemphasise the importance of the grade boundary meetings. As I highlighted, in a normal year, the median adjustment is zero, so for the most part assessments work; we make no adjustments to grade boundaries, therefore the results fall where they fall. Over the past couple of years, we have made more adjustments to grade boundaries as part of a more generous approach to grading, which has benefited learners. We have provided the grade boundary information: as I said, we publish it every year. That highlights that there were a number of courses for which we made quite significant grade boundary adjustments, but there were a large number of courses for which we did not. It was done very much on a case-by-case basis.
I hope that that explains that there is no predetermination or cap, because we see variability year-on-year, which one would expect. In courses that have a small number of entries, there will be years in which there is really strong performance and there might be years in which there is less strong performance. It is really important that we reflect that in our results.
Thank you. I am going to carry on with this theme, if that is okay. You spoke about median adjustments and results falling where they fall. You spoke specifically about larger subjects, using maths as an example. Despite the sensitive approach to boundaries that we have been discussing at length, the results in national 5 maths and English were, respectively, worse than or the same as they were in 2019. Does the SQA have particular concerns around those key qualifications, and, if so, how are you feeding back to local authorities, schools and teachers to address some of the gaps in learning?
Dr Stewart, do you want to come in on that?
To add to what Fiona Robertson said about grade boundaries, I point out that our job is to maintain standards from year to year in individual subjects. That is done through setting the assessments in a standardised way each year, then using the grade boundaries process to assess whether we have maintained those standards or whether things were slightly more challenging or easier than intended. That is very much the purpose of grade boundaries. As Fiona said, we made other adjustments to do with the pandemic in our generous and sensitive approach to grading to take account of the impact of the pandemic on learning.
National 5 mathematics is an interesting area, because there is a lot going on there. There are various factors at play, and we are carrying out further analysis of all the data to help us to understand and so that we can have discussions with local authorities, teachers and so on. One of the things that we see happening is a larger proportion of dual entries for national 4 and national 5. Also, we do not have just one course in mathematics; we have a course called national 5 applications of mathematics, and we have seen significant increases in uptake of that qualification. From memory, I think that we are sitting at something like 19,000 this year, which is up from 14,000 last year. We are seeing a shift in uptake of the various mathematics courses, as well as those patterns of dual entry.
The other factor that we have to take into account is that national 5 mathematics is often required for learners to go on to further courses of study, whether at college or university, so we see significant numbers of resits for national 5. If you take out the resit population and just look at students in secondary 4 who are sitting national 5 mathematics, the attainment rate is sitting at just over 70 per cent. If you then take out the dual entries for national 4 and national 5, the attainment rate goes up to 80-odd per cent. There is a lot going on, and there are different patterns happening across local authorities and schools, but we are doing a lot more analysis to understand that more fully, because we need to have dialogue with Education Scotland and local authorities if we think that there are—
What you have explained is the number of extra entries, not the results that are coming from them, so could you address that? I asked about English as well, in which I do not know whether there are quite so many.
I think—
If you do not mind, I am directing that question to Dr Stewart again.
The existence of another course in mathematics is significant. If there is a variable cohort, as there is, and decisions are being made locally about which learners will do national 5 applications of maths or national 5 mathematics, it is not known whether there is a uniform spread of learners across the ability range. For instance, it might be that the better learners are going to national 5 applications of maths. We do not know, because the cohort who previously all did national 5 maths is now split across two mathematics courses, and there are significant numbers doing both courses.
I am not providing answers; I am just saying that a lot of new variables have come into play that mean that we really need to do a higher degree of analysis of what is happening at the local level, what is happening across different year groups and what is happening in national 5 applications of maths.
I will bring you in in a second, Fiona.
When you have done that, which sounds like it will be a substantive piece of work, will there be an opportunity for you to feed back to us on the findings from that?
Yes.
Super.
Fiona Robertson, can you respond briefly to my question?
To add to what Gill Stewart has said, I point out that, in any year, local authorities and schools will look at their results, undertake analysis and reflect on learning, teaching and entry patterns. There is a focus on percentages, but percentages are in part defined by the entry decisions that local centres make. There is only so much that we can say about that. Those are local decisions, and local authorities have statutory responsibilities in relation to improving education and considering the issues more generally. We will play our part in providing data and analysis to aid their thinking.
My chief examiner’s report highlighted weaker performance in maths; that is what we saw. As Gill Stewart highlighted, underneath all that there are some unusual patterns on entries, and the point about resits is important. There are not many courses in which there is a substantial number of resitting candidates. Maths is an exception to that and has been previously; that is not new. This is an opportunity for the results that we publish and the data that we produce to aid further conversation in the system, including in local authorities, schools and Government, about some wider issues in the education system. That is part of our role.
10:15
Entries into the Gaelic-medium qualifications remain relatively low. Have you undertaken any work to understand the reasons for that?
We can come back to you with more information about that. We have our Gaelic language plan, and we support the use of the medium of Gaelic in education, but we are limited by provision at local level. Schools make the entries and so on; we do our bit to support Gaelic-medium education and we provide Gaelic-medium assessments in a small number of subject areas. We also have our Gaelic learners’ courses and our courses for native speakers of Gaelic.
It would be helpful if you could provide that information once you have it.
My question is in a similar vein; it is about advanced highers in disadvantaged areas. Do we understand why there is less of an offer in schools in those areas? What can be done to improve the situation, given that it has not improved in recent years?
It is difficult for us to comment on that area in any detail. The availability of courses and the provision in local authorities and centres more generally are ultimately a matter for them. There has certainly been some movement in advanced higher provision, with, for example, the hub at Glasgow Caledonian University and the Scholar resources for particular courses at advanced higher as well. We offer a very wide range of courses, including at advanced higher level. Some of the advanced highers have quite small entries, but we find that learners make a variety of choices, particularly in S6. That sometimes includes advanced highers, and sometimes it includes other qualifications; it can include a vocational offering. We see a bit of diversification, but those decisions are made at a local level.
You have a strategic overview across the country and will have developed an expertise and understanding about what works in some areas and what does not. Do you have regular discussions with local authorities about how to improve the offer in certain localities?
We engage with centres, local authorities and the Association of Directors of Education in Scotland regularly on those issues.
Has this come up?
Actually, advanced higher provision is not something that comes up all the time.
Do you not think that it should? There is a big gap between the wealthier areas and the poorer areas. Should that not be on your red risk register?
As I say, our role is to offer qualifications. I think that we would all want to see the same choices being made available to learners, wherever they live. Ultimately, however, that decision does not rest with the SQA; it rests with local authorities and individual centres. As I said, there has been some movement in schools working together and at Glasgow Caledonian University to make sure that there is the volume of students needed to make some courses more viable, but decisions on the availability of courses and the curriculum are made locally. Gill Stewart may have something to add to that.
This is not specifically on advanced higher, but I know that Education Scotland is doing some work to showcase best practice, particularly in the diversification of the curriculum. It is looking at different models that are being used across different local authorities so that we can spread the word about some of that. We are working with Education Scotland in that space.
That is helpful. Thank you.
We have also done some work with the Scottish Credit and Qualifications Framework Partnership on pathways. There are things that we can do but, ultimately, the curriculum and the availability of courses are a complex issue, and local decision makers, namely local authorities and individual schools, make those decisions on the basis of the needs of their local school community.
Thank you. You have alluded to some of the other courses in response to Mr Rennie’s questioning. Continuing on this theme, I bring in Pam Duncan-Glancy.
Can you explain the rise in entries for skills for work courses and national progression awards?
I was at Leith academy last week, where I witnessed the value of the diversification of choices that learners are making. While learners absolutely continue to make choices about national 5, higher and advanced higher provision, there is a range of other qualifications that they can take throughout the year. Indeed, they can be certificated throughout the year, both in the senior phase and, in some cases, before the senior phase commences. That is good to see. There are, for example, national progression awards and PDAs. We offer lots of different awards, including mental health awards and other things. We have seen healthy growth. Again, it boils down to individual decisions that schools take about the diversification of the curriculum and the opportunities that are made available to learners. There is still some variability across the country in that context.
It is absolutely to be celebrated that individual schools are thinking about these things and about what will best suit the needs of learners. However, there is also a debate to be had about whether there are some things that should be entitlements of learners. Mr Rennie just asked about advanced highers. The question of whether there should be more consistency came up in some of the discussions that were part of the Hayward review. There is a balance to be struck there, and that is a debate that the education system needs to have. As far as we are concerned in the SQA, the catalogue of the qualifications that we offer is wide and it is deep. Lots of choices can be made. The offer is there; the issue is the availability of courses in schools, colleges and other settings.
Do we know anything about the demographics of the people who are going forward in those circumstances?
We have not undertaken that analysis.
Could you undertake that analysis? It feels like it would be useful for the committee to understand some of the demographics of that increase.
We could maybe look at that. As I mentioned, we have published the equalities impact report, but we do not hold that information; we have to engage with the Scottish Government, which holds the information on the Scottish index of multiple deprivation. That applies not just to SIMD but to other characteristics. We can certainly look at that as part of our wider equalities work.
It is really positive to see schools diversify in their curriculums. A report from the Organisation for Economic Co-operation and Development said that, although young people can be motivated by wanting to go to university, some young people are not motivated by that and are looking for something that they are good at. Sometimes, a skills for work course in a vocational area, or a national progression award in cybersecurity, or whatever, might engage that young person and get them interested in learning, and they can find out that they are good at something. That is really important to encourage them in their learning.
It is positive that we are seeing increases in the uptake of vocational qualifications in schools and trying to engage young people in what would help them to take the next steps in their careers. The work that Education Scotland is doing is trying to spread good practice in some areas and help schools to understand how they could do some of that as well, which is good. Fiona Robertson mentioned the work that we have done with the SCQF. As she said, we have a broad catalogue but, sometimes, it is about trying to identify which bits of that broad catalogue might be appropriate for use in schools and highlighting some of that.
For example, in computing, we know that some young people are not particularly good at programming, which you need to be for national 5 computing, but they are interested in cybersecurity, gaming, data analysis, software development and all those different things, so we have lots of small qualifications in those areas that some schools are using, with good engagement from those learners. National 5 computing is not for them; it is suitable for some young people who want to go on and do programming, but it is not for others. Sorry—I just used that as a wee example.
We like examples on this committee.
We do. That is a really good example, so thank you for sharing it. We are seeing that those particular awards are helpful. That diversification is important, not only because the OECD picked it out but because we understand that that is what young people want. It is also important for us to know who is going forward and being presented for those awards, as opposed to those who are being presented elsewhere, to check whether there are any patterns that may need to be looked at further.
Thank you very much. It is Stephanie Callaghan now. Thank you for your patience this morning.
Thank you, convener. Thank you, panel. I am just looking at the variation in results that are associated with SIMD areas. I wonder why different approaches to certification can lead to significantly different attainment gaps. Can you explain what is behind that? Maybe that question is more for Dr Stewart.
It is difficult to say definitively what might lie behind that. What I can say is that wider research evidence says that, if you have teacher judgment, teachers are more likely to make more generous judgments on the performance of learners, particularly those from lower SIMD areas and less advantaged backgrounds, if you see what I mean. Research also shows differences between learners with different protected characteristics. I am talking about broader research on teacher judgments, not specifically about research to do with Scotland, national courses, and so on. There might be some natural bias. There could be all sorts of things going on. I do not think that you can say uniformly that one form of assessment, either teacher assessment or external assessment, will suit less advantaged learners or advantaged learners, because learners are a mixture. Some will respond better to teacher assessment. Some will respond better to examinations and coursework. It is not a uniform population.
Do you see what I mean, Stephanie? Maybe I did not explain that very well. We all have our preferences in how we like to learn. Similarly, we all have our preferences in how we would like to be assessed. Often, boys prefer to be assessed via an exam rather than continuous assessment, but there will be some boys who prefer continuous assessment. There is a lot of variation in the population, and I do not think that you can say that less-advantaged people prefer teacher assessment and more-advantaged people prefer examinations—I know that you are not saying that.
10:30
It would be interesting to look at that in more detail if any further work on this were to be carried out.
The attainment gap seemed to be narrower, generally speaking. Over the past five years, the higher the level of qualification, the smaller the attainment gap seemed to be. What is the thinking behind that? Is there any reason or explanation for it?
There is no doubt that, with alternative certification in 2020 and 2021, we saw a different pattern of attainment. We saw teachers awarding higher grades overall. When you look at the composition of the attainment gap, you see that, if you move from the attainment pattern that we had in 2018 or 2019 to the more generous attainment pattern that was achieved in 2020 and 2021, a result of that is a narrowing of the gap. That was seen not just in Scotland but in the rest of the UK, so it was a common feature of alternative certification.
Interestingly, following the return to exams last year, we published research that looked at teacher estimates compared with the results that were achieved through external assessment. I recommend that you look at that report, because we saw teacher estimates return, broadly speaking, to pre-pandemic levels. We will produce further analysis on the position this year. This year and last year, we saw that the pattern of results from awarding through external assessment—the return to exams—was very similar to the pattern of teacher estimates as they came in. That is interesting, because, over the past couple of years, we have also seen a changing pattern in teacher estimates compared with 2020 and 2021.
You mentioned research. We have undertaken significant research over the past couple of years, and we will do another evaluation of 2023 awarding. That has given rise to some interesting observations about how people felt about alternative certification, continuous assessment and, of course, the return to exams. We surveyed learners, practitioners, parents, carers and those who worked most closely with us, particularly appointees. There were some really interesting conclusions. For example, learners absolutely trust the judgment of their teachers, but, when we looked at the evaluation of the 2021 approach to alternative certification, although they trusted the judgment of their teachers, there was a concern that perhaps the judgment of those who were not their teachers or who were in another centre was questionable. It is about the judgment of the individual versus the relative judgment of others. That is why we have a national awarding system rather than a local awarding system. There are some really interesting observations and findings that should feed into what happens next.
We have a pretty balanced assessment system. Yes, we have an exam system at present, but the courses have been designed to be balanced, and many courses have a significant amount of coursework. Some have teacher assessment and continuous assessment of some kind or have different components that are assessed during the year. There is balance across our courses. That continues to be the case, and I advocate that balance.
On my second question, are you able to explain why, the higher the level of qualification, the smaller the attainment gap is? For example, at advanced higher level, the gap is smaller than it is at national 5 and higher levels.
We would have to do more analysis of that. If you are asking for my thoughts, my view is that the numbers doing the qualifications significantly reduce. It is a much smaller population that is doing advanced higher, and it will be, by its very nature, a much more select group of young people. We would have to do further analysis to interrogate that and see the patterns and so on, and we would have to speak to others. We can identify the patterns, but we cannot necessarily explain why those patterns are thus.
Up until this point, why has there not been some curiosity to look into that aspect? Has it not stood out as something that perhaps needs to be looked at—in, of course, a positive way?
We provided the equalities impact analysis. You are absolutely right. In broad terms, albeit with caveats around comparisons—that is important, because there are different awarding approaches—we saw a slight widening of the gap between 2022 and 2023 and a narrowing of the gap between 2019 and 2023. Advanced higher looked a little different, however, and we saw a slight narrowing of the gap this year.
To be clear, the trend has been evident over the past five years or so; it has not just been the case during the Covid period.
The contributors to any changes in the attainment gap absolutely go back to Mr Rennie’s question about entry levels and from where those entries are coming. That is the first thing to say. With individual learners’ achievement, and for some advanced higher courses, we are seeing greater variability in performance year on year, because it is a smaller cohort, and that can impact on the attainment gap. A variety of factors can therefore contribute to that.
It is important to highlight the point that the analysis that we are undertaking is a national analysis. We are not looking at individual local authorities or, indeed, at individual schools. There is a government programme—the Scottish attainment challenge—and that is looking at and interrogating the data. Education Scotland is, of course, very heavily involved in that and is seeking to understand and inform what further approaches should be taken on learning and teaching, on particular support or, indeed, on entries. Our analysis is prompting some of those questions, and that can be a good thing.
Thank you.
I will now move on to questions from Ross Greer.
I would like to ask a few questions about the appeals system. Over the past few years, it has changed quite a bit for a variety of reasons—most obviously, but not entirely, because of the pandemic. The 2022 appeals system probably received the most positive welcome from young people and from organisations that represent them and their rights. We had an appeals system that allowed direct access for young people, that was free and that considered evidence in the round. It was not just a script remarking service.
Perhaps this is subjective, but we have gone backwards from that. We have moved away from that approach for this year, and the system has gone back to script remarking again. Can you explain the rationale behind that decision? Specifically, what were the issues with last year’s appeals service, which was based on wider evidence of young people’s work throughout the year?
I will try to be succinct, but we looked at a number of things. First, when the 2022 appeals service was introduced, there had been two years in which there had been no exams, and we therefore felt that, as part of the package of support, we would, for one year only, put an appeals process in place.
The process was well considered, and we discussed, including with learners and the wider community, how we would put in place an appeals service that, unusually, could be based on alternative evidence. The appeals service was not based on whether we had marked an exam incorrectly and on remarking the exam script, as a typical appeals service is, not just in Scotland but everywhere else. It was one that could look at alternative evidence that was gathered during the course of the year in terms of learning, teaching and assessment. As you have highlighted, that was unusual, because such a service is not available elsewhere, and certainly not in the rest of the UK.
It was a direct appeals service that was also free. You are absolutely right that it was welcomed. We continue to have a free and direct appeals service this year.
Although we had announced that the appeals service in 2022, which was based on alternative evidence, was for one year only, it was absolutely right and proper that we looked at that as part of our evaluation. We had a lot of discussion about the appeals service, how people felt about it and what the evidence was. We got feedback from learners, practitioners and appointees who were involved in looking at the evidence. There were almost 60,000 appeals, which involved collecting evidence from every centre that submitted evidence, and that was looked at by our appointees, who are also teachers.
The biggest issue was fairness. The appeals process based on alternative evidence was perceived to be fair and generous in the sense that, if someone did not perform well in their exam on the day, their alternative work could be looked at. That worked for some learners. About three out of 10 learners who appealed got a higher grade as a result, but seven out of 10 did not. Keep in mind that someone could make an appeal only if their teacher estimate was above their exam grade, by which I mean if the evidence and the judgment of their teacher gave the expectation that they would have done better than they did in the exam.
How do those numbers—the three and seven out of 10—compare with what would usually be the case with a script remarking service?
They were a little higher, but not much. It is apples and oranges, not apples and apples, in the sense that—
Presumably, you get far fewer appeals with script remarking than you did last year with a different system.
Yes, and they are made on a different basis. We have to be careful about drawing comparisons between rates of success or otherwise.
The consistent feedback that we got from markers was that there were issues around the sufficiency of evidence from schools, and there were, in some cases, issues around the judgments that schools had reached on the estimates. That says something about fairness. I dealt with a number of individual cases in which learners, through no fault of their own, had not been assessed properly or in which there had been inappropriate judgment about the standard that they were expected to achieve. Sufficiency of evidence means a breadth of evidence that we could look at and say, “Yes, on balance, this learner could have got another award.” The standard by which that evidence was judged was also important. The issues were really about variability and the breadth of evidence.
There was also emerging evidence that, if you have an appeals process of the type that I am explaining—one that is based on alternative evidence—that in itself can present problems, particularly over time. We had a similar appeals process in the pre-2014 period, which was before my time. Over time, for all the understandable reasons, schools start to collect evidence on the off chance that they will need to appeal, and that potentially promotes overassessment and creates workload for learners and teachers. We got some practitioner feedback on concerns about workload for learners and teachers. We do not want learners to be overassessed on the basis that they may need an appeal further down the line.
10:45
I will just come in on that point. I completely understand the concern about overassessment, particularly in relation to the challenges in 2021 with managing the lack of exams. However, in the period between 2014 and the pandemic, the script remarking service that we moved towards, rather than the previous evidence-based system, was, partly because of cost, disproportionately used by independent schools. I get the concerns about fairness, but the script remarking system that we used and to which we have now returned has its own evidenced issues with fairness as well.
I go back to my point about comparing apples to apples. The pre-pandemic service is not the service that we have now. You are right, and I understand that there were issues—they predate me in this role—around the fact that the decision to use the service was determined by the school rather than by the learner. It was also done in the knowledge that, if the appeal was unsuccessful, the local authority—it was usually the local authority—would be charged. The charging system created a perception that cost could influence behaviour, including judgments about independent schools and their ability to pay. The cost became part of the perception of that service.
We now have a free, open service. Anyone can appeal, on any subject. A learner can appeal directly with the click of a button. The appeal is on the basis of the assessment instrument that has been used. It is an appeal on the basis on which the learner’s grade has been determined, and—
Sorry to cut in—I am conscious of time. That is the core issue, because it comes back to the debate that we have had over the past couple of years and discussions that I have had with you on those exceptional circumstances: the young people who had a family bereavement immediately before their exam or a panic attack during their exam or whatever. I have brought some of those cases to you as casework, and we have had wider policy discussions about them. How do we make sure that the young people in those exceptional circumstances, of which there are a wide variety, get a fair opportunity?
They still do. Perhaps I should have said at the start that we have retained an exceptional circumstances service—it is not an appeal—that is precisely for individuals in those circumstances. Those learners get their results on results day, so it is not really part of a post-results service such as an appeal.
In effect, we have retained the alternative evidence approach for those who need it most. I do not think that we have published figures yet, but, pre-pandemic and in the normal course of things, there would usually be around 5,000 entries that take up the exceptional circumstances service. Where learners are unable to take an exam for, as you said, personal reasons or reasons of illness, bereavement or other things, or where there has been disruption on the day of the exam, which, again, can be for a variety of reasons, the centre can make an exceptional circumstances request.
Ross, can you draw this to a close, please?
Yes. Sorry.
That covers some young people but not all of them. For example—I have dealt with casework like this—there is the young person whose parent died the day before the exam but who really felt that they wanted to go in and take the exam. They are having to make a choice: “Do I think that I can perform well enough in the exam, or do I make a choice before that to take up the exceptional circumstances service?”
They could still be covered by exceptional circumstances.
Right. Can I just finally—
Finally, please, Ross. Lots of members want to come in on this.
That is really important. It does not include those who simply did not take the exam.
Did the young people on the learner panel support the change? Did organisations that represent young people’s rights support the change to the script remarking service this year? Did the Children and Young People’s Commissioner Scotland support that?
I do not have a list of every stakeholder who agreed or disagreed, but we undertook the evaluation, which highlighted the issue of fairness and—
Sorry—I am conscious that I am taking up other members’ time. Did the young people on the learner panel support that?
My recollection is that the learner panel had some concerns about the return to the review of scripts. However, we engaged with the breadth of stakeholder interests. We undertook survey work of 2,000 learners, 1,000 practitioners, 500 parents and carers, and appointees. I have a responsibility not to ignore those issues of fairness.
I have dealt with a number of difficult cases this year. You have highlighted your constituency cases. I get representations from a number of MSPs and others during the year. I found it quite difficult to deal with cases in which a learner had an expectation of an award from their school but the evidence did not support that expectation. That was through no fault of their own; they had either not been assessed appropriately or the judgment had been made incorrectly. I realise that there may be some stakeholders who agree or disagree, but I also have a responsibility around fairness.
I am not aware of any other country that has an appeals service based on alternative evidence. Most exam boards and regulators operate an appeals service that is based on the assessment approach that was originally used.
I understand the concern. We have done as much as we can to ensure that those learners who need it most still have access to a service that can utilise some alternative evidence and, actually, a wider range of alternative evidence, while maintaining fairness in an appeals process.
Thank you.
Thank you very much, Fiona, for making that quite clear.
My colleague Ross Greer has covered the topic in some detail. I just want to check something. I think that you said that three in 10 appeals are successful through the alternative evidence approach and that that is higher than the number of appeals that are successful through the script approach. Is that what you said?
I think that what I said was that it was not appropriate to compare the two, because of the differences in approach.
Okay.
I do not have the outcomes for this year’s appeals service, because we are still working on it. We are working through nearly 40,000 appeals at this time, and we will publish information on the outcomes of those appeals. There is no predetermination about an appeals outcome, or about one appeals service being better than another; the outcomes are based on evidence, and they will be what they will be.
The fact is that last year’s appeals service was based on an estimate being higher than the resulting grade. I would have expected, other things being equal, the success rate for appeals last year to have been higher than it was, because it was based purely on the estimate. Therefore, if there was integrity to the estimate, the appeals should have been successful. The point is that, when we looked at the evidence to support the estimate and at the assessment evidence, both in terms of sufficiency and standard, only three in 10 appeals were successful. The reason that we did not continue with that appeals process, albeit that we gave no indication that we would continue, was fairness.
On that basis, you said earlier that the assessment was based on a balance of exams and coursework. Can you explain the variability of evidence that you have said has caused some of the concern around the alternative evidence approach?
The alternative evidence approach for appeals was based on the assessment evidence that was held by the centre. Our appointees, who are working teachers, were involved in making the determination on whether an appeal was successful or not. There were issues around the breadth of that evidence relating to whether the evidence covered an appropriate amount of course material and whether the judgment on that course material was appropriate. There are two elements: the breadth and the standard.
Okay.
I am going to move on to let somebody else in.
Schools, colleges and centres will have utilised a number of different forms of evidence in making that determination.
I will move on to Michelle Thomson and then Liam Kerr.
I will stay on this theme. An article from Tes Scotland on 7 July noted that attendance at school has historically been a problem but that Covid has exacerbated it. Covid still casts a long shadow, particularly over certain socioeconomic groupings. In your decision making around removing alternative assessment evidence, how did you reflect on there still being significant pockets of children for whom attendance has fundamentally shifted?
Attendance per se did not feature in the decision around moving from one appeals process to another. In fact, attendance issues in isolation can impact on achievement and alternative evidence, so—
[Inaudible.]—is what I am saying.
No. In the first part of the evidence that we provided at this session, we talked about our grading approach. Both this year and last year, we have been very mindful of playing our part in providing support to learners and doing what we as a national awarding body can. We recognised that we needed to take that generous approach last year and to continue that generosity through the sensitive approach this year. We recognise that learners have faced very significant challenges over the past few years. It has been important that we have played our part in assisting and supporting that, and that is why we have done what we have done. School attendance absolutely plays into the impacts and the legacy of the pandemic.
One of the key fairness points for us was this: you might have somebody in one school and somebody in another school who both made an appeal based on alternative evidence, and whether it was successful would depend on how well the teacher understood the requirements of the course and on the standards and judgments that they made. The appointees—who are teachers and lecturers—looked at all the evidence in the alternative appeals service, and they were really quite disheartened at some packages of evidence that they saw and how that evidence did not back up the estimates that had been made.
That is not in the hands of the young person. If I am at school, I am dependent on my teacher making an estimate and having good evidence to back it up. My teacher may be a great teacher but not good at that particular aspect, and I have no choice about that. That is where the lack of fairness for the learner comes in for us: it was dependent on the school and its understanding.
That is where I am a wee bit confused. That concept of fairness has a multitude of variables, some of which you have set out. I am reflecting on the fact that another element of that fairness is that, in pockets of society, there are still significant longitudinal effects of the pandemic. Attendance, which Tes Scotland—I should have quoted this figure earlier—puts at 90.9 per cent, is lower than it has been historically. I am merely noting that. I will let other people come in.
Do you want to address that point?
The only thing that I will add is that learning and teaching come first. The support that individual learners get, whether that is encouraging them to come to school or in relation to other things, is, obviously, in the hands of the school and, where appropriate, the local authority.
11:00
None of us underestimates the challenges that the education system has experienced over the past few years. Therefore, there is a system piece around the support that is put in place. It is difficult for us as a national awarding body to address differential disruption to learning. We had this conversation earlier in the year when there was industrial action by teachers in schools. In some cases, targeted industrial action meant that some schools were closed and others were open. It is difficult for us to make adjustments to awarding to allow for that. I hope that there is an acceptance by the committee that that is difficult for us.
However, we can and do deploy flexibilities. For example, in that case in the spring, we made adjustments to visiting verification, where we go out and undertake assessments in schools. We were able to offer some flexibilities to schools to help address some of that. I have to be honest and say that there are limits to what we, as a national awarding body, can do to address that difference across Scotland. Learning and teaching come first. The support that learners get in their school and their classroom comes first.
Thank you for that.
I will bring Liam Kerr back in. Thank you for your patience, Liam.
I want to close out the issue of the appeals process. You have pointed out that there was a pre-pandemic system and a during-pandemic system and that we now have—if you like—a post-pandemic system. What is your early thinking on what the future system will be? Is this year’s system now the standard for the appeals structure, or will there be further revisions?
I have highlighted my commitment in that respect. We undertook full evaluations of our awarding approach last year and the year before, and we will do so again this year.
However, we have made adjustments to our appeals process. We now have a direct appeals process in which learners can make that decision for themselves, and it is a free service. That is different from the pre-pandemic system, but it is still based on the assessments that learners have undertaken. As part of our evaluation, we will consider whether any further change is needed but, in short, I do not expect significant changes to our appeals process this year. We will, if we need to, reflect on the basis of the evidence that comes through from this year’s resulting.
As I have said, we have already delivered the priority appeals for learners who are going on to university or whose results are important for their next steps, and we are now going through the standard appeals and hope to result them before the end of next month. We will publish the outcomes and then see what the evidence tells us.
Thank you very much. We will now move away from appeals and on to the topic of reform.
I want to give you the opportunity to get some stuff on the record. As you will be aware, I am new to the committee, but I have had pretty extensive experience of large so-called transformation programmes in corporate life, and they are invariably difficult, time consuming and expensive. I just want to reflect on where we are here. For a start, the decision to abolish the SQA must have had a resultant impact on your staff’s morale, so I want to get your reflections on that and hear more about what you are doing, from a leadership perspective, to maintain morale in the organisation.
Thank you for the question. Organisational reform can be, as you have highlighted, difficult, time consuming and expensive. In June 2021—27 months ago—the then Cabinet Secretary for Education and Skills announced that the SQA would be replaced. That has created uncertainty for staff. I am on record as saying, when I was asked about this last year and the year before, that the SQA is full of colleagues who have great expertise and who operate with professionalism and integrity in all that they do. Today, we have talked substantially about the work that the SQA has been doing over this period to deliver for learners every year—both before and since the announcement—and that is what we will continue to do.
It is critical that we maintain continuity of delivery throughout the organisational reform; we cannot just stop what we are doing to allow it to take place. Instead, we have to manage that process at the same time as we continue to deliver and improve, including improving our services. I have already highlighted our appeals service, which is now direct for learners and is free. That is an example of a service that we have delivered in short order over the period.
The uncertainty that the reform process has created has been difficult, and I will ask my colleague to comment on that. As a leadership team and in discussions with staff, we have sought to keep close to our colleagues at this time; to be honest with them about what we know and do not know about what is happening next—which, sometimes, has been difficult; and to keep a resolute focus on the job at hand, which is to deliver with integrity and professionalism. That is what we have done over the period, and we have sought to keep close to our colleagues in doing that.
I am sorry to jump in, but, on your comment that you have “sought to keep close” to your staff, do you have specific, regular communication sessions with them? If so, what and how frequent are they? It would be helpful to hear about them.
For the past couple of years now, we have been having regular executive team sessions with all staff.
How often do you have them?
At least monthly. We have a series of directorates, and there is a lot of engagement within them. During the pandemic, we had regular pulse surveys and annual people surveys—all of those things. There is also our performance framework, through which we look at not only the performance of our qualifications delivery but a range of other issues; for example, we keep an eye on the likes of employee turnover and retention et cetera. We have kept a close eye on all of those things.
As part of our audit and risk responsibilities, we have been keeping a close eye on ensuring that the balance is right with regard to the risk appetite in the whole organisation. We have sought to do as much of that as possible. I will acknowledge, though, that it has been difficult. At times, it has not been possible to provide answers to staff about what is happening or what is happening next.
The fact is that the reform programme is a Scottish Government programme; it was an announcement that the cabinet secretary made in relation to the SQA. The SQA is a public body and a creature of statute, so legislation will be necessary to create a new qualifications body. The new cabinet secretary made it clear in the summer that the legislation would be delayed, so there will be no new qualifications body before 2025, and that will mean a considerable period of uncertainty.
However, with all the discussion and debate about the organisation, what we do and how we do it, it is important that we continue to improve our services. We have placed a big focus on our communications, including with learners and on engagement with them. Indeed, the executive team will be meeting a learner panel next week to discuss how learners are feeling. As well as that much more direct engagement that we are having with learners, my team and I are engaging with the wider system very proactively, including closer engagement with practitioners. We have continued to do that.
If you were in charge of legislation and the operating framework, would there be one particular change that you would like to make?
With regard to legislation, the SQA has been consistent in its wish to move from a voluntary accreditation and regulation framework to a system in which the expectation would be that all publicly funded qualifications in Scotland would be regulated. That would benefit learners, because it would provide assurance that all qualifications, be they SQA qualifications, new qualifications, awarding body qualifications or qualifications provided by any other provider in Scotland, were of a high quality.
At the moment, we have a voluntary system, in which qualifications are regulated only if the awarding body chooses to have them regulated. That sounds like quite a technical issue, and it is, but this is about giving assurance—
Yes—that is fundamental.
It is all about giving assurance that any qualification offered in our schools and colleges is of a high quality. I would say, with my other hat on as chief regulator of qualifications in Scotland, that that is really important.
I was heartened to hear the Minister for Higher and Further Education say in his statement on purpose and principles that, as a result of the Withers review—that is, the independent review of the skills delivery landscape—his view was that the new qualifications body should have oversight of all publicly funded qualifications below degree level. It is really important to give schools, colleges and any other centres surety about the quality of the qualifications that are being offered and delivered in Scotland, and it would bring us into line with the rest of the UK.
I know that other colleagues want to come in, so I will leave it there.
Thank you, Michelle. That is very kind of you.
Liam Kerr has a very brief supplementary question, and then we will move on to Willie Rennie.
I am very grateful, convener.
Do you believe that the new body will be in place by the time of the exam diet in 2026?
The cabinet secretary spoke in June about her expectation that legislation would be forthcoming this parliamentary year, and it was set out in the programme for government. She also set out an expectation that there will be a new public body in place by the autumn of 2025. On that basis, there is an expectation that a new qualifications body will be overseeing qualifications in 2026.
The truthful answer to your question is that it all depends on parliamentary process, does it not? It depends on the passage of the legislation through Parliament and the implementation of any legislation that follows. If all goes to plan, as set out by the cabinet secretary, there is no reason for me to believe anything other than that a new public body will be in place in the autumn of 2025.
It was a brief supplementary, and I had been hoping for a brief response, too. I call Willie Rennie.
Michelle Thomson has already covered the impact of structural reform and the delay, but can you tell me whether you have lost good people from the organisation as a result of that further delay?
I think that we have. We certainly saw that happening after the announcement by ministers and the uncertainty that it created. It is important to highlight that it unfortunately took a number of months for ministers to confirm that there would be no redundancies and that jobs were safe. That had an impact: we saw an increase in staff turnover and lost people whom we would rather not have lost from the organisation.
Not only that, but it can also be harder to recruit to an organisation that is not going to exist. Indeed, that was the context for a period of time—albeit that the situation was resolved and ministers were able to confirm by the end of 2021 that there would be no compulsory redundancies. I, as chief executive, and Gill Stewart, as senior director in the organisation, want to ensure that our qualifications body in Scotland has the best people possible, so it has been a difficult situation. We have lost some people, and, in some circumstances, we have found it harder to recruit.
That said, many fantastic colleagues remain; they are committed and continue to work hard to benefit learners. We have many exceptional colleagues in the organisation who are committed to the SQA and to the period ahead.
11:15
That is really interesting—and quite disappointing in many ways. Did ministers respond to your pleas for some certainty about redundancies?
Yes, they did.
How long did that take?
It took a period of time.
How long?
Five or six months, I think.
So, there were five or six months in which people were potentially leaving the organisation. Are you able to quantify how many?
I do not want to overplay or underplay things. In an organisation the size of the SQA, you will see people leaving for lots of reasons, and you will also see people coming in, and that has continued to be the case. However, there is no doubt that we found recruitment and retention in that environment more challenging than it might otherwise have been—and, to an extent, we continue to do so.
Can you quantify that for us?
It is hard to quantify the effect, but we certainly saw it.
Have you seen any effect on recruitment with the further delay and the decision in June not to bring in the legislation for a period, or has that just compounded the same issue?
I am not aware of any particular effect.
Staff morale is being affected by the prolonged period of uncertainty about the shape of the new organisation. Everybody is asking, “Am I going to be part of it or not?”, or is wondering how their role will change. We have had the publication of the Hayward review and the Withers review, but we still do not know the Government’s response to them, and there is some frustration about when we will be able to make changes to qualifications. That frustration is coming particularly from staff in my directorate, because they would like to make changes and improvements to the qualifications. Those are the frustrations that are coming through.
It is a really interesting topic, but I am sorry to say that, in the interests of time, we will have to move on from it, if you do not mind. We still have a lot of content to cover, and I have my eye on the clock. Do you have anything further to ask on this, Mr Rennie?
No.
Okay—thank you. We move to questions from Ben Macpherson.
Just briefly, on that last point of consideration—
I wanted us to move on.
—has there been an impact on your international work, which I know is significant? There has been a focus on the domestic work. Please be succinct.
It is an important point. There is a broad point in relation to reform: the SQA brand has quite a lot of recognition overseas. The work of the SQA and Scottish education are seen positively abroad, so the move to a new organisational name will potentially mean that there are some things that we need to manage.
Over the pandemic and the period since, our international work has been subject to some fluctuation, as you might expect. We had some assurance from the Government, should our international partners require it, that the organisation will continue as a going concern, so that has been dealt with in the normal way, and we have continued to see growth in some of our markets. In some cases, there has been a bit of a pandemic effect, but, overall, we still—
I am conscious of the convener, so if there is anything further on that that you want to add, please follow up in writing.
Yes, I am happy to provide further information in writing.
Thank you.
On the delivery board, I would be grateful if you could set out what the target operating model for the new qualifications body will be.
As I highlighted in my answer to Michelle Thomson in relation to the reform programme, we have been part of a Scottish Government reform programme and a number of pieces of work. A strategic group was set up, as was a delivery board for the SQA, which includes a range of stakeholders, to take forward work on the new national qualifications body. There is a similar board for the setting up of the new inspectorate and the new national agency. Those are the governance arrangements that have been set up in relation to reform.
As part of that, we were asked to develop a target operating model aligned with the design principles, which were broadly based on those brought forward by Ken Muir in his report as part of the reform programme. At the Government’s request, the delivery board oversaw the production of a target operating model, otherwise known as a TOM, which was to be prepared and submitted to the Government in June of this year, and we did that.
A point that I was going to mention in response to Willie Rennie’s question about reform more generally, and which is really important in any organisational reform, is that form should follow function. One of the uncertainties in relation to the new qualifications body has been about function, not least given the independent reviews that have been undertaken by James Withers and Louise Hayward. Put bluntly, you need to know what a new organisation is going to do before you can fully understand how it is going to operate, how it is going to be structured and so on.
The target operating model has been submitted to the Government on the basis of what we currently know. Therefore, I would consider the target operating model for the new qualifications body, as submitted, to be a work in progress that was developed on the basis of what we knew at the time. We have aligned it with the design principles for reform, which are about being user centred, data focused and flexible in response to change in the operating environment. The new body will be a learning organisation, as well as operating in the functional space around learning, and it will be digital by default, collaborative and sustainable in how it operates.
Through work that has included looking at route maps for our customers and the products that we are currently responsible for, we have set out some thinking on what a target operating model might look like. However, at the point at which it was submitted, there were some missing pieces. The Hayward review had only just reported, and the Government has not yet concluded its consideration of that report or of James Withers’s report. Some information about purpose and principles for the tertiary sector came out at the end of June. Obviously, we have also had the national discussion, so a number of things need to be considered before the model can be concluded.
If the new organisations are truly to adhere to and align with those design principles, there will need to be investment in our systems and processes. I am sure that the same will be true for the new national education body and the new inspectorate. There is a considerable amount of work still to do there. The Government has also set in train some thematic reviews, which are the connector pieces for the reform programme; they cover things such as digital and culture, and they feed into the target operating model. Those reviews are still at a very early stage. I noticed in the Scottish Parliament information centre report—
I am sorry, Fiona, but I must ask you to cut down your response times. I understand the importance of this, but—
I absolutely appreciate that, but I also have to highlight the fact that these are complex issues.
I understand that.
They are complex issues.
I have no further questions, convener, although I saw that Michelle Thomson was looking to come in on this point.
Is it a brief supplementary?
I will leave it just now in the interests of time.
Thank you very much.
This is probably the most critical that I have heard you be of the Government. I can tell that there is frustration that there is a lack of clarity, which you talked about in terms of employment but also in terms of clarity on the function. I would like to hear your view on whether you think the Hayward review is headed in the right direction. I noticed that your chairman at the time said:
“we must ensure change can be delivered successfully”.
There was a hint that perhaps the current direction of travel was in danger of not pursuing that successfully. I would be interested in hearing your view on that.
First, I do not think that I have been critical of the Government, but I have a responsibility to be honest with the committee about where things are. To be fair to the Government and the cabinet secretary, on the basis of the reviews that have been completed, and as I have been advocating since the announcements were made in 2021, it is important to consider function first, and it is important that that consideration takes place.
The Hayward review is a substantive report. It considers some potentially considerable changes to our education and skills system that need to be carefully thought through. Organisational reform, and the order in which it happens, needs to be considered carefully, as I have highlighted to the committee, to ensure that we can continue to deliver successfully for today’s learners at the same time as preparing the ground for tomorrow’s learners and making any changes that may come from Hayward.
For me, there are a number of important considerations that need to be taken forward in relation to Hayward, and we have made some points consistently across a number of OECD reviews and a range of other reviews over the past few years.
First, any changes to assessment and qualifications should be seen alongside consideration of curriculum design and pedagogy. Louise Hayward’s report, if implemented, would significantly change the curriculum models in place in our schools, so it would give rise to quite significant consideration of what learners experience and, indeed, of issues that this committee and its predecessors have spent quite a lot of time on, such as subject choice and the number of subjects. If the Scottish diploma of achievement were accepted as something that Scotland wanted to pursue—Louise Hayward has been careful about looking at the conditions for success and the investment in the system that would be needed—it would be important that the SDA could benefit all learners, whatever their pathway, and that we considered carefully any unintended consequences, particularly around equity and the personal pathway element.
11:30
There is a need to promote further integration and choice to ensure that every school offers a rich curriculum. That is at the heart of the breadth that is evident in the vision for assessment and qualifications that Louise Hayward sets out.
As I have highlighted to the committee, it is really important that there are clear models of change across the education system and clarity of roles and responsibilities. Those conditions for success, along with sufficient investment and clarity about what is required, will be absolutely critical in ensuring that there is capacity. Concerns are already emerging about the workload implications of some of the recommendations; those concerns relate not only to capacity but to the capabilities, systems and technology that would be required to truly make the reform process a success.
From my perspective as chief examiner, I feel strongly that important issues around the principles of assessment—validity, reliability, practicability, equity and fairness—need to be at the heart of any assessment and qualification system. That is really important.
My view is—
I am sorry, but I have my eye on the clock, and you have had four minutes to respond. I have one further—
I will not ask any more questions. The fact that you have given a long list is an indication of your anxiety. That sends a clear message to the Government about the process of reform.
Thank you. Can we move, finally, to questions from the new deputy convener, Ruth Maguire, please?
I thank the panel for its evidence so far.
I want to ask about the SQA’s communication with the profession and young people. There has been a bit of a theme of criticism about that. In the past, the committee has heard the SQA described as an “unlistening and distant organisation.” I see that the SQA’s position is that it involves teachers and that teachers are integral to its work. I note from your submission that you
“are committed to incorporating the perspectives and experiences of teaching professionals and learners in our decision-making process.”
You also say that you have “a refreshed engagement programme”. Can you give us a bit more detail about that? Specifically, if I were a teacher or a learner involved in that process, how would I know that the SQA is listening, responding and reflecting back the diversity of views that are out there?
There are two aspects of that: communication, in that we impart messages about what is happening; and engagement, which includes discussion and debate about decisions that we need to take. I will start with communication.
I acknowledge that, when I started in this role, a lot of our communication with learners was through centres. We focused on communication with schools, with the expectation that those messages would be passed on. In many cases, they were, but there was very little direct communication between the SQA as an awarding body and individual learners.
We have sought to substantially address that while, obviously, maintaining the very important relationship that learners have with their school. We now have much more direct communication with learners. That includes, for example, booklets this year on everything that people need to know about national qualifications. I also wrote directly to every learner in Scotland with information on our new appeals service; key dates and a full explanation of the service were included, so there was no ambiguity or uncertainty about what it was.
We have also increased our communication on a range of social media channels to ensure that we reach learners directly. We have had some good feedback and, indeed, we have engaged with learners themselves on what good might look like on those channels, to ensure that they hit the mark. We have some really good data analytics on the reach of our communications with learners this year, which is, in some cases, up by 200 or 300 per cent.
Similarly, we have taken a range of approaches with practitioners. There is weekly centre news that goes to every centre in Scotland. That includes core messages—
Forgive me, Fiona: I tried to catch your eye, but I did not manage to. That sounds really positive, but, as I can feel the convener’s eyeballs on me, can we move to engagement? You have given some good examples of direct communication. Let us talk about engagement.
Yes—absolutely.
We have a group of around 4,000 practitioners, learners and parents and carers whom we can survey at any time, and we do. We have used a variety of tools as part of our research to elicit views on a range of issues.
Over the past three years, I have chaired the national qualifications and higher national and vocational qualifications groups, which include a group of national stakeholders. In 2021, for example, the national qualifications group met every week to talk about and provide advice on the decisions that we needed to take. That is combined with a communications group so that we engage on how we communicate with learners.
At a more granular level, Gill Stewart’s directorate has a range of national qualifications subject teams, which are groups of subject practitioners. We are looking at that approach to see whether we can widen it so that we are—
Let me jump in again. I am sorry—I am at risk of being rude. I am reflecting on a response that was given to Ross Greer about a decision that one section—I think that you said that it was the young learners panel—was not content with. How organisations feed back when they are not going to do what their surveyed people would like them to do is interesting. Could you speak about that decision to do something different? Apologies—I cannot remember what the decision was.
Engagement is important, as are its quality and integrity, but I am sure that the committee is aware that, in education, there is usually a variety of views on most things and that we have to make balanced decisions on the basis of the responsibilities that we have and the feedback that we hear. Most organisations—
We absolutely get that. How did you do that?
We have to do that. We seek to collect views and we undertake evaluations. I appreciate that my answers are long. That is because there is a multilayered approach, but—
I am asking about a specific example so that you can give me quite a short answer. It is not about the wider question; I am asking about an example of how the organisation might do that.
We can give you an example and follow up afterwards on how we have conducted the evaluations of 2022 and how we will do that for 2023. There is a range of survey work, focus groups and engagement with a range of—
I am sorry, but I am going to interrupt.
I think that I understand what you are saying. You are talking about how we feed back the decision. Is that right?
Yes. I completely understand about the range of views and the range of people you speak to. We had the example in which one of your stakeholders’ views was not taken forward for, I presume, completely understandable and legitimate reasons. As a reflection of how well you are communicating with your stakeholders, how did you manage telling them why, and how did you tell them?
We try quite hard to continue the conversation about the decisions that have been taken. Of course, a decision such as one on the appeals approach needs to be taken through our advisory council and our qualifications committee and then, finally, it is a decision for our board. There will be formal communication in relation to that decision, but there will be on-going engagement with those groups. For example, there is on-going engagement with the learner panel about the feedback that is provided and the decisions that we have had to take. That is an on-going process.
I accept that it may be because of the way that I am asking the question, but I do not feel that I am getting an answer. I do not know whether that is reflective of the communication style of the organisation.
I am not sure that I understand your question—apologies for that. I am trying to say that we seek to make evidence-informed decisions—
The committee understands that.
—that are the culmination of engagement and discussion, and that that feeds into a formal process. As a public body, we are obliged to do that, including through our board. We feed back the decisions that we have taken in a variety of ways—formally and informally—as part of our on-going engagement with different groups.
That is helpful. Thank you.
Thank you very much, deputy convener, for drilling down into that.
I thank the panel members for their time today. The public part of our meeting has now concluded, and we will consider our final three agenda items in private.
11:41 Meeting continued in private until 12:53.