
Reviewing the performance of approved education providers and programmes

Our approach to performance review assessments

Through performance review assessments, we undertake periodic, proportionate engagement with education providers to understand their performance and the quality of their provision. We seek assurance about the education provider’s continued alignment with our education standards. Through each assessment, we decide when we next need to engage with the education provider and set a review period of between one and five years, based on risks, potential issues, and when those might need exploring. We can also consider significant issues and, where education providers do not meet our standards, withdraw approval.

Education providers complete a portfolio covering a set of themes we consider important in demonstrating the ongoing quality of their education provision for the programmes we approve. These themes are linked to our standards, and to sector developments and initiatives which may affect the quality of education provision. Where available, we also ask education providers to reflect on performance data points linked to learner numbers, learner non-continuation, outcomes for those who complete programmes, and learner satisfaction. These data points give us metrics-based information about how education providers are performing in these areas (normally in comparison to a benchmark) and, over time, whether that performance is changing. We explore our use of data in assessments in more detail later in this report.
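
As a simplified illustration of how a single data point can be read against a benchmark and tracked over time, the sketch below flags a metric that falls more than a chosen margin below its benchmark and reports the year-on-year trend. The metric, benchmark value, tolerance and figures shown are hypothetical assumptions for illustration, not actual HCPC data or methodology.

```python
# Illustrative sketch only: the metric, benchmark, tolerance and figures are
# hypothetical assumptions, not actual HCPC data or methodology.

# Learner continuation rate (%) for one provider over three academic years,
# alongside a sector benchmark for the same metric.
provider_continuation = {"2020-21": 91.0, "2021-22": 88.5, "2022-23": 86.0}
sector_benchmark = 90.0
tolerance = 2.0  # flag if more than 2 percentage points below the benchmark

latest_year = max(provider_continuation)
latest_value = provider_continuation[latest_year]

if latest_value < sector_benchmark - tolerance:
    print(f"{latest_year}: {latest_value}% is below the {sector_benchmark}% benchmark - explore further")
else:
    print(f"{latest_year}: {latest_value}% is in line with the benchmark")

# Year-on-year change shows whether performance is improving or declining.
years = sorted(provider_continuation)
changes = [provider_continuation[b] - provider_continuation[a] for a, b in zip(years, years[1:])]
print("Year-on-year change:", changes)
```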

The portfolio and data points enable us to form a risk-based view of education provider performance, and to identify and support education providers who may not be performing as they need to. Ultimately, we can trigger regulatory interventions if there are risks to learners not meeting our standards on programme completion. Education providers need to share their challenges and how they have overcome them, as well as their successes, to fully inform our view of their performance.

Compared with the monitoring processes in our previous education quality assurance model, assessments through performance review are much more robust. Our previous model focused on change, so where education providers had not made changes there was little for us to review. Assessments were also at programme level, which meant institution-wide changes were not always reported or picked up when undertaking individual programme-level assessments at the same education provider. This risked under-reporting of challenges and successes, and inconsistency in assessments, giving a partial view of quality.

 

Key findings

We assessed 93 of the 138 HCPC-approved education providers via performance review in the two years assessed, which gives us a good indication of how HCPC-approved education providers are performing across the board*. We identified common themes across assessments, linked to developments and how the sector has responded to challenges.

  • Quality assurance focus – education providers were transparent throughout the process, openly discussing the problems and challenges they had identified and what they were doing to resolve them. This showed a strong continuous improvement mindset, which is integral to quality assurance and enhancement. Consideration of programme quality was also seen as integral to change and innovation. Education providers with strong centrally managed policies, and common approaches across their provision, were able to reflect more easily as an institution against the thematic portfolio areas.
  • Recognising and understanding challenges – the sector is outward facing, and aware of challenges from within and outside the sector, such as the cost of living, industrial action, emerging technology, and an ageing population. Challenges that directly or indirectly affect the delivery of programmes were often well thought through, and considered flexibly in line with established standards and frameworks (such as our education standards). Obligations to external organisations (such as other regulators and professional bodies) are also a key consideration for education providers.
  • Types of education providers, and UK nations – there was a clear split between the approaches of higher education institutions (HEIs) and non-HEI education providers. HEIs normally have clear, well utilised structures (with a level of commonality across education providers), whereas non-HEIs lack similar structures, or have less rigid structures with less commonality between providers. HEIs also have external mechanisms, frameworks, and standards to adhere to, which non-HEIs may not have as standard. This meant non-HEIs often needed to work harder to show good performance. There are also differences in influences and approaches across the UK nations, with education, health and social care being devolved matters.
  • Partnership working – strong partnerships are integral to the sustainability and quality of programmes. Good partnership working is best underpinned by formal arrangements which clearly define objectives, expectations, and responsibilities, supported by formal engagement procedures.
  • Programme capacity – education providers considered growth in the overall capacity of programmes, and the impact of this growth on practice-based learning and education provider resources (including staffing). This links to the challenge noted in the approval section, and similar challenges were faced by existing education providers. Through performance review assessments, we were able to consider how education provider intentions worked in practice, and identify where challenges needed more thought and attention. From our assessments, we were confident that education providers were growing their capacity in a reasonable way, considering the broader sector and external constraints, such as the capacity of practice-based learning.
  • Education provider use of data – all education providers use data in some way to inform their operations, whether that be learner data to inform learner support, financial data to plan, or other data sources and uses. However, linked to this area, there were problems with feedback fatigue, which impacted internal education provider feedback mechanisms (such as module feedback), and external mechanisms (such as the National Education and Training Survey).
  • COVID-19 – the COVID pandemic was both a challenge to manage, and a catalyst for change and innovation. This theme cut across many of the portfolio areas, and we saw innovation in areas such as delivery of teaching, practice-learning environments, simulation, and learner support.
  • Alignment with our revised standards of proficiency – all relevant education providers demonstrated alignment with the revised standards of proficiency (SOPs) through reflections on thematic changes to the standards, and showed us how they reviewed their programmes to align with them from September 2023. This only applied to education providers assessed in the 2022-23 academic year, when we added this requirement to portfolios. We will continue to monitor education provider adherence to the revised SOPs through future performance review assessments.
  • Shortfalls in education provider approaches – in some areas, such as interprofessional education and service user and carer involvement, some education providers were less developed than we would expect. We picked up specific issues through our assessments and, from these, are confident that all education providers meet our standards in these areas.

We have provided detailed findings in appendix 2 on how education providers have performed in maintaining alignment with our standards, and the challenges they have experienced.

*Welsh Higher Education Institutions (HEIs) were not included in either of the two years reviewed, due to our decision to review all Welsh HEIs in the third year of our review programme (the 2023-24 academic year). We made this decision as all Welsh allied health professional training was recommissioned, and we reviewed provision in the 2021-22 academic year through the approval or focused review process.
 
Quality activities and referrals

During performance review assessments, we sometimes need to explore areas in more detail to consider education provider performance. This may be where there are gaps, or where we want to identify best practice that we can then share with the sector. We call these explorations ‘quality activities’. We can undertake a range of quality activities, from clarification via email and documentary submissions, to virtual or face-to-face meetings with various stakeholder groups.

Performance review considers the performance of the education provider within a set review period. When concluding assessments, we make a judgement about when the next performance review assessment will take place (a period of one to five years). Sometimes there are areas which require follow-up at a later time, such as a specific planned development or change, or where we are seeking reassurance that an education provider’s approach works in practice. We capture information about these areas, and have tools which enable us to pick them up through future assessment processes. We describe these as ‘referrals’, and when referring we are clear about what we will be looking for at the next review. This helps education providers to consider and plan continued alignment with our standards. When referring, we are confident that education providers continue to align with our standards at this time, but consider there is a specific area of risk that we need to examine through future assessment.

The following chart presents the number of quality activities and referrals linked to each portfolio area, summarising where we most often needed to explore further with education providers through our assessments.

[Chart: number of quality activities and referrals by portfolio area]

 

The areas most often referred to other processes were:

  • service user and carer involvement in education programmes (15 referrals) – we require that service users and carers are involved in programmes in some way, and usually referred this area when involvement was under development or changing;
  • academic and placement quality (9 referrals) – we require that education providers have mechanisms to ensure the quality of academic and practice-based learning, and usually referred this area when there were concerns in these areas, or if changes were recently made;
  • resourcing, including financial stability (8 referrals) – we require that programmes are sustainable and fit for purpose, to enable all learners on programmes to complete their education and training, and usually referred this area when there were changes in resource modelling or increases in learner numbers; and
  • interprofessional education (5 referrals) – we require that learners are able to learn with, and from, learners and professionals in other relevant professions, and normally referred this area when approaches were underdeveloped or changing.

Referrals in these cases usually enabled us to set requirements for education providers to ensure they developed as needed in specific areas, to consider how successful changes have been, and to see how initiatives have worked in practice.

 

Assessment outcomes – review periods

When defining the review period of between one and five years, we consider the following:

  • Stakeholder engagement – how the education provider engages with their stakeholders with quality assurance and enhancement in mind.
  • External input into quality assurance and enhancement – how the education provider engages with professional bodies, and other relevant organisations, and how they consider sector and professional development in a structured way.
  • Data supply – whether data for the education provider is available through external sources, or if they have established a regular data supply.
  • What the data is telling us, and how the education provider considers data in their quality assurance processes.
  • Whether there are any specific developments or risks that will have an impact at a specific time.

[Chart: review periods set through performance review assessments in 2021-22 and 2022-23]

 

In 2022-23, we set a two-year review period for fewer education providers than in 2021-22. This is likely linked to the prioritisation exercise we undertook when implementing the current education quality assurance model.

When adopting the model, we decided to assess all education providers against our performance review requirements across a three-year programme of assessment. This period was chosen to balance relative risk (see below for an explainer of how we considered risk), and to deliver assessments within our team resources.

We prioritised education providers based on a number of factors, to consider where there could be higher risks in assessing education providers later in the programme. These factors (illustrated in the sketch after this list) were:

  • The total number of learners;
  • When the last HCPC annual monitoring audit was undertaken through our previous education quality assurance model;
  • The number of available externally sourced data points*; and
  • HCPC ‘performance score'**.
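
As an illustration only, the sketch below shows one way these four factors could be combined into a single priority ranking for scheduling assessments across the three-year programme. The factor scales, weights, and example providers are hypothetical assumptions added for illustration; they do not represent the weighting we actually used.

```python
# Illustrative sketch only: factor scales, weights and example values are
# hypothetical assumptions, not the HCPC's actual prioritisation model.
from dataclasses import dataclass

@dataclass
class ProviderFactors:
    name: str
    total_learners: int           # total number of learners
    years_since_last_audit: int   # time since the last HCPC annual monitoring audit
    external_data_points: int     # number of available externally sourced data points
    performance_score: float      # internal 'performance score' (0-100, higher is better)

def priority_score(p: ProviderFactors) -> float:
    """Higher values suggest assessing the provider earlier in the programme."""
    return (
        0.3 * min(p.total_learners / 1000, 1.0)               # larger cohorts carry more risk
        + 0.3 * min(p.years_since_last_audit / 5, 1.0)        # longer gaps since the last audit
        + 0.2 * (1 - min(p.external_data_points / 10, 1.0))   # less external data available
        + 0.2 * (1 - p.performance_score / 100)               # weaker performance score
    )

# Hypothetical example providers, ranked from highest to lowest priority.
providers = [
    ProviderFactors("Provider A", 1200, 4, 2, 55.0),
    ProviderFactors("Provider B", 300, 1, 9, 82.0),
]
for p in sorted(providers, key=priority_score, reverse=True):
    print(f"{p.name}: priority {priority_score(p):.2f}")
```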

We included education providers who did not appear in externally sourced data returns in the first year of our reviews. In the 2021-22 assessments, the main reason for setting two-year review periods was that education providers were not included in external data returns and had not established direct data returns through their assessments.

For education providers included in external data returns, we set a five-year review period in 60% of cases. We set this review period when:

  • The education provider was high performing, based on data, intelligence, and the findings from our review;
  • Any immediate issues raised through assessments were dealt with by the education provider; and
  • Any remaining issues did not need to be addressed before a five-year review period.

Reasons for setting shorter review periods were normally due to:

  • A significant change planned by the education provider which might impact on a range of our standards, and which we considered needed reviewing within a shorter period to ensure any risks associated with the change were properly managed; and / or
  • Low data scores, to ensure actions defined by education providers were progressed to manage risks.
*We use several external data supplies to consider education provider performance. Further information about our approach to data, including the ‘ceiling’ for review periods when data is not available, is included in the data and intelligence section of this report.
**We produce an overall performance score for each education provider, based on externally and internally available data metrics. We only use this score internally to inform high level resourcing decisions, as we decided that external use of this score was reductive, hiding nuances which could be drawn out through full data and education provider / programme assessment.