Quality assurance

5.148     Sitting behind all of these processes must be some element of quality assurance. Quality assurance (QA) of LSET occurs in two distinct settings in the legal services sector: as ‘institutional’ QA through the formal stages of classroom-based education and training, and as ‘workplace’ systems for assuring continuing competence or assurance of specialisation. The systems in place are described, and the critical issues for each setting are identified and discussed.

Institutional QA

5.149     This section addresses degree level qualifications, vocational qualifications, and other legal professional qualifications in turn. A more detailed analysis of existing regulation is provided in the Literature Review, Chapter 6.


5.150     QA for the QLD and GDL relies primarily on a dual system of regulation by the state agency – the QAA – and the relevant approved regulators (the SRA and the BSB). Additional requirements may be made for institutions in Wales (for example, facilitating study in the Welsh language).

5.151     Programme accreditation in this context is different from on-going quality assurance. Accreditation is itself a quality assessment. All programmes require initial approval and some element of on-going (usually quinquennial[1]) review. Approval is dependent on programmes satisfying appropriate quality indicators (eg, satisfaction of basic regulatory requirements regarding the Foundation subjects, assessment and credit, sufficiency of faculty, learning resources and facilities) and satisfactory engagement with broader indicators of quality regarding curriculum design, teaching and learning and assessment strategies, and staff development. These issues are relevant at programme review and re-validation, where student progression and performance data, external examiners’ reports, and employability data may also add to the range of indicators.

5.152     Within this framework continuing QA and quality enhancement are a matter for internal, institutional processes. Programmes are expected to produce annual reports incorporating analysis and reporting at module and programme level. Student input into these processes is the norm (eg, through the operation of staff-student liaison groups or committees, and student representation on programme committees). Progression and performance data, external examiners’ reports, and employability data are also used as indicators in the QA conducted within institutions.

5.153     In the public sector these internal processes are themselves subject to quality audit by the QAA. A brief summary of the system of QAA audit is contained in Chapter 6 of the Literature Review. QAA oversight will also apply to the non-law degrees of GDL graduates and to the non-law degrees of IP attorneys and some notaries.

5.154     Standards (which form the basis for quality assessments) are set at subject level. For law these standards are an amalgam of QAA requirements for all honours degrees in law (qualifying and non-qualifying) and professional requirements set down by the relevant approved regulators (the BSB and the SRA). As noted in Chapter 4, the professions’ standards and guidance are consolidated in the JASB Handbook.

Vocational education for barristers, CILEx members and solicitors

5.155     QA for vocational education is a function of each approved regulator, and QA procedures for the LPC and BPTC have been recently reviewed. The underlying QA processes are similar. Institutions intending to offer the LPC or BPTC must be validated and must satisfy the regulator that rigorous quality assurance systems are in place. Both courses also require the appointment of external examiners, annual reporting and monitoring.

5.156     The LPC has, since 2010, moved to a new system of authorisation and validation based on paper exercises. Except where a provider has not previously been validated, there is no requirement for an institutional visit.[2] Continuing quality assurance on the LPC is achieved primarily via an ‘enhanced’ external examiner system. External examiners are appointed as independent assessors by the SRA, specifically to advise the SRA on the standards set by the provider. A lead external examiner is also appointed to have oversight of processes at each institutional provider. This is intended to strengthen the independence of oversight.

5.157     External examiners are also appointed on a subject-basis at each BPTC provider. They have responsibility for verifying standards of assessment and ensuring consistency between providers, and play a part in monitoring the quality of courses and provision of resources. An additional layer of external moderators was introduced in 2007-08. Moderators scrutinise subject assessment across all providers, and thus also support the objective of assuring consistency. Each examiner visits the institution at least twice during the academic year, and is required to produce both an interim and a final report.

5.158     Periodic monitoring is required for the BPTC, with provision in the regulations for both ‘regular’ monitoring visits and ‘triggered’ visits where the BSB has identified a cause for concern such as declining standards or over-recruitment. The Wood Report (BSB, 2008) noted the frequency with which monitoring visits mentioned the general good quality of teaching, student support and facilities. The continuing utility of the visits themselves seems largely to have been assumed from the value of the evidence produced.[3] Similar ‘triggered’ visits are operated by the SRA who may institute a monitoring visit if significant concerns are raised about a provider, but routine monitoring visits are no longer conducted. It is too soon properly to assess the impact on quality of this change, if any.

5.159     CILEx qualifications are offered in 72 colleges across England and Wales, with 69 offering face-to-face tuition and three by distance learning, with the CILEx Law School as the largest distance learning provider. They have also been embedded in a number of degrees. QA of those centres is the responsibility of CILEx, and the QA process used is designed to comply with Ofqual’s General Conditions of Recognition (2012) as well as being subject to IPS oversight. Accreditation of a centre for CILEx, described in more detail in the Literature Review, Chapter 6, includes requirements as to examinations, professional skills, prevention and investigation of malpractice and maladministration by the centre and withdrawal of centre accreditation. Accreditation is normally for a five-year period.

QA and the smaller regulators

5.160     The size of the training communities for the smaller regulators allows a closer degree of control and supervision over the training process. Some deliver and/or assess and quality assure their own qualifications. In other cases the provision may be through a single accredited institution, so that QA procedures for courses have been individually negotiated. Insofar as information is publicly available, it is summarised in the table following and described more fully in the Literature Review, Chapter 6.

5.161     Licensed conveyancer, notary and registered trade mark attorney qualifications have some element of dual assurance by virtue of delivery by universities and colleges. It is notable that no LSA 2007-approved regulator has adopted an approach that accredits only the assessment process, not the training.[4]

5.162     Given the scale of the smaller operations, requiring providers to be, for example, Ofqual assured could be disproportionate at the present time, though such a move would facilitate transfer and exemption. Where Ofqual or similar external assurance is not feasible, the regulators themselves should take steps to increase public information about internal quality assurance processes and their outcomes.


Table 5.4: QA processes of the smaller regulators


Occupation / Courses and qualifications

Costs lawyers: Modular programme delivered by ACL Training Ltd, and authorised by the CLSB.

Licensed conveyancers: Centralised assessments. CLC literature suggests that Bradford College, the Manchester College and the Manchester College of Higher Education and Media Technology are currently accredited to provide at least some of the CLC courses. Distance learning provision is through the CLC.

Notaries: The applicable rules provide:

'8. Practical Qualifications

8.1 Any person wishing to be admitted as a general notary under rule 5 shall have followed and attained a satisfactory standard in a course or courses of studies covering all of the subjects listed in schedule 2.

8.2 Whether a particular course of studies satisfies the requirements of these rules and whether a person has obtained a satisfactory standard in that course shall be determined by the Master after seeking the advice of the Board.

8.3 The Master after seeking the advice of the Board may by order direct that the award of a particular qualification meets the requirements of these rules as to some or all of the subjects listed in schedule 2.

8.4 The Master may as a condition of making a direction under rule 8.3 require the body by which the qualification is awarded to issue those pursuing a course of studies leading to that qualification with such information about the notarial profession, these rules and other rules made by the Master and the Company as the Master may specify.

8.5 The Master may by Order add any subjects to the list in schedule 2 or remove any subjects from that list or alter any of the provisions of that schedule but before doing so he shall consult the Board.'

The UCL Notarial Practice course is currently accredited for this purpose.

Patent attorneys: The European Qualifying Examination is organised and conducted by a Supervisory Board, an Examination Board, Examination Committees and an Examination Secretariat of the European Patent Office. There is no required preliminary training, but courses are offered by CEIPI (Centre d'Etudes Internationales de la Propriété Intellectuelle) and the European Patent Institute (EPI), and reference is made to HEI and other pre-existing courses in European patent law. The assessment itself is closely prescribed by a European Patent Office regulation (EPO, 2011). Domestic qualifications at foundation and final level are organised and assessed by the Joint Examination Board of CIPA and ITMA. The Bournemouth, Brunel, Queen Mary and Manchester Universities are accredited examination agencies for the foundation level.

Patent administrators: CIPA Certificate in Patent Administration, delivered and administered by the professional body.

Registered trade mark attorneys (new model): Bournemouth, Brunel, Queen Mary, and Manchester universities are accredited examination agencies for foundation level activity in the relevant regulations. Nottingham Trent University is an examination agency at both foundation and final level. In practice, the diet of courses is delivered through the Queen Mary Certificate in Trade Mark Law and Practice, followed by the NTU Professional Certificate in Trade Mark Law and Practice.

Trade mark administrators: ITMA Trade Mark Administrators Course, delivered and administered by the professional body.

Legal services apprenticeships: Competencies set by reference to NOS. Formal education and activity and supervised practice are blended in accordance with the relevant apprenticeship framework. An Apprenticeship Quality Statement (2012) sets out the standards the National Apprenticeship Service expects for the delivery of apprenticeships by employers and training providers. Award-bearing knowledge- and competence-based units are formally accredited by CILEx.


Critical issues

5.163     It should be noted that relatively little evidence was received regarding QA systems and processes as such during the research phase. As noted in Chapter 2, a number of respondents raised concerns about the variability of standards on undergraduate programmes and, to a lesser extent, the LPC. Little by way of comment was received in respect of other courses. It is not always clear on what basis inconsistency is being alleged.

5.164     Considerable resources are allocated across institutions to achieving consistency within and between programmes and institutions. Assessment criteria are widely used, and papers are double-marked and moderated by trained external examiners. Some evidence exists of a broad consensus between externals as to grading standards; nevertheless, cross-institutional variations may occur, and these may be accounted for by various factors. For example, differences in programme design have been highlighted during the research, specifically in relation to the LLB. Other factors accounting for divergence may include:[5]

  • the ‘fuzziness’ and complexity of assessment judgments, which may inhibit standardisation;
  • use of a different range of assessment tools between courses and institutions, and variations in the range of outcomes assessed;
  • permitted differences in assessment regulations determining how classifications are awarded (averaging, weighted averaging, etc);
  • inevitability, given current assessment models, of some relative marking within institutional cohorts;
  • increased numbers of students and institutions.
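As a purely hypothetical illustration of the third point above, the same set of module marks can fall either side of a classification boundary depending on the averaging rule applied. All marks, credit weightings and band boundaries below are invented for demonstration and do not reflect any institution's actual regulations:

```python
# Hypothetical illustration only: marks, credit weightings and classification
# boundaries are invented, not any institution's actual rules.
marks = [70, 52, 55, 58]      # four module marks
credits = [60, 20, 20, 20]    # credit value of each module

simple_avg = sum(marks) / len(marks)                                      # 58.75
weighted_avg = sum(m * c for m, c in zip(marks, credits)) / sum(credits)  # 62.5

def classify(avg):
    """Map an average mark to an (illustrative) honours classification."""
    if avg >= 70:
        return "First"
    if avg >= 60:
        return "Upper second"
    if avg >= 50:
        return "Lower second"
    return "Third"

print(classify(simple_avg))    # Lower second
print(classify(weighted_avg))  # Upper second
```

On this invented profile a simple average yields a lower second, while a credit-weighted average of the same marks yields an upper second: two ‘correct’ outcomes from identical performance.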

5.165     These would ‘legitimately’ explain some of the variance, but less excusable factors could also be involved, such as inadequate training in assessment, the allocation of insufficient time and resources to marking and moderation, and breaches of assessment practices. Even those variances that are regarded as more or less legitimate might be reduced if greater use were made of statistical tools to check for error variation, and if assessors were more aware of the ‘principles of measurement’ on which gradings should be based.[6] The extent of these problems is, inevitably, hidden, though they are discussed in the general educational literature.
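As a loose sketch of the kind of statistical screening alluded to above (all markers, marks and the one-standard-deviation threshold are entirely hypothetical), one might compare each marker's average against the cohort and flag those sitting unusually far from it as a prompt for moderation:

```python
import statistics

# Hypothetical marks awarded by three markers on comparable scripts;
# all figures are invented for illustration.
marks_by_marker = {
    "A": [62, 65, 58, 70, 64],
    "B": [55, 52, 58, 50, 54],
    "C": [63, 66, 61, 68, 65],
}

all_marks = [m for ms in marks_by_marker.values() for m in ms]
cohort_mean = statistics.mean(all_marks)
cohort_sd = statistics.stdev(all_marks)

# Crude screening rule, chosen purely for illustration: flag any marker whose
# mean mark lies more than one cohort standard deviation from the overall
# mean. A flag is a prompt for moderation, not a verdict on the marker.
flags = {}
for marker, ms in marks_by_marker.items():
    deviation = statistics.mean(ms) - cohort_mean
    flags[marker] = abs(deviation) > cohort_sd

print(flags)  # on this data, only marker B is flagged
```

Real moderation exercises would of course use more defensible methods (and far more data) than this screening rule, but the sketch shows how error variation can at least be surfaced systematically rather than left to impression.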

5.166     These problems are not localised to law, but reflect, in part, the wider challenge of delivering, assessing and grading the complex learning achievements required, as well as the impact of a culture of institutional autonomy in higher education in the UK and in similar systems, such as the USA and Australia (Yorke et al, 2008).

5.167     The law degree exhibits three trends which complicate these otherwise more general issues. Numbers on law degrees have grown faster than the traditional legal professions; consequently, the proportion of law graduates able to enter those professions has shrunk – as noted in Chapter 2, law graduates now constitute less than 50% of newly admitted solicitors.[7] At the same time, recruitment trends, aside from the currently low level of activity, highlight a long-term change in the pattern of recruitment, as numbers of GDL students and of direct entrants from other occupations or jurisdictions increase.[8] Finally, market and regulatory changes accelerated by the LSA 2007 could see recruitment, and students, moving away from the traditional professions. In combination, these changes help to explain why a declining proportion of law graduates progress into the traditional professions.

5.168     As the Legal Services Institute argues (2010, 2012) this situation certainly diminishes the authority (de facto if not de jure) of the professional bodies over the academic law schools and places in doubt the extent to which it is appropriate to impose additional requirements.

5.169     It can also be argued that a combination of other strategies could be harnessed to deliver change:

  • a more rigorous approach to standard-setting;
  • market competition;
  • information;
  • re-focusing existing regulation.

5.170     A more rigorous approach to standard-setting: the disjunction between standards within individual institutions and the standards associated with the discipline or profession is a widely acknowledged problem, and one which the external examiner system struggles to address. A solution, as proposed in Chapter 4, is to adopt a significantly different and more formalised approach to setting and using standards. This is seen as a key means of enhancing QA at the institutional level. It may be supported, to some degree, by the following.

5.171     Market competition: as the sector develops we are seeing greater variation in courses and approaches. Examples include the professional practice-oriented law degrees launched by BPP and the University of Law; the various exempting degrees offered by Glamorgan, Huddersfield, Northumbria, Nottingham Trent, and Westminster Universities; and the development of a problem-based curriculum at York Law School. Market responses to innovation, informed by more transparent information about providers, can be used to motivate change, though too great a reliance on competition may also undermine attempts to build greater collaboration in standard-setting.

5.172     Information: there is a growing requirement for the university sector to publish output measures relevant to student choice and QA (broadly defined), including degree classifications awarded, class contact, employability, student satisfaction and other data. A number of such indicators are now published officially as ‘Key Information Sets’ for each degree, including all QLDs. This moves university education closer to market-based incentives for maintaining and enhancing quality. However, the relationship between factors such as contact hours and quality can be contested, and such innovations may create perverse incentives.

5.173     Re-purposing and refining existing regulation and required processes: Other approaches should be explored, such as ensuring institutions are putting a proper level of resources into the student learning experience; enhancing assessment – discussed in Chapter 4; encouraging collaboration rather than imposing ‘command and control’ regulation; and the provision of guidance as to best assessment practices (eg, on the use of statistical tools) (see also Brown, 2009).

5.174     Within the vocational context too, greater consistency might be facilitated by external assessment (as on the BPTC), though large centralised assessments can be as problematic as localised ones unless appropriate resources are put into training question-setters and assessors, and into moderation. Separating teaching from assessment may also create additional challenges in assuring fairness, consistency and authenticity of assessment. A more lateral approach to dealing with inconsistencies of institutional coverage might involve, for example, providing e-learning resources to refresh or fill gaps in substantive knowledge.

QA for learning in the workplace[9]

5.175     There are examples of existing good practice in assuring quality in workplace learning in the sector, and also considerable variation in practices.[10] Entity regulation will increase the responsibility of firms and other entities to quality assure their training. This includes periods of required supervised practice such as the training contract. Additional standards and accreditations may extend to specific training obligations, such as those requiring chambers’ selection panels to have training[11] in ‘fair recruitment and selection processes’ (BSB, 2012c). Other entity-related accreditations, such as the Law Society’s LEXCEL standard,[12] impose positive training and audit or monitoring obligations, but they do not necessarily link clearly to maintenance of competence:

And LEXCEL doesn’t do it. You know we’re LEXCEL accredited but LEXCEL makes sure that you’ve got great systems in place but actually when it comes down to the nitty-gritty of the quality of advice I don’t think LEXCEL really gets to it.


Regulatory monitoring of workplace-based learning

5.176     Regulatory monitoring varies between the different professions. It may be of performance at entity level, as part of an entity-based scheme, inherent in individual accreditation, or attached to the provision of periods of supervised practice. Requirements for the periods of supervised practice in all professions are described in the Literature Review, Chapter 6.

5.177     Monitoring of initial training is seen as critical to assuring that it is converting technical knowledge into practical know-how. Supervision, and potentially the monitoring of supervision, may be considered critical to that transition. Monitoring of pupillage is in the course of further development but is likely to involve random or triggered visits, confidential questionnaires and interviews for pupils (BSB, 2012b:61). There is similarly provision for training establishments with trainee solicitors to be visited for monitoring either by random selection or following a trigger (SRA, 2006) although there was some doubt about the extent and effectiveness of such monitoring (see below).

5.178     Variation in the quality of learning and supervision in training contracts has been a long-standing issue (see, eg Goriely and Williams, 1996). Within the LETR research data, although some respondents (68% of the LawNet respondents, for example) felt the training contract in particular was sufficiently regulated to assure quality, others were more doubtful:

I would change the training contract, having received virtually no training from a firm which believed in trainees ‘learning by their mistakes’. I would make it more prescriptive about the level of training and supervision received. Some firms are excellent and others are appalling. The appalling ones then complete the assessment forms for the qualifying solicitor so that it appears that all the steps have been taken.

Solicitor (online survey)

Whilst I had a very broad training contract not once did the [SRA] knock actually on our door and say ‘Let’s have a look at the training [logs]’. And I did have that and I did keep my training logs. I remember frantically trying to get them all up to scratch for qualifying. They never once even looked.

Solicitor (recently qualified)

5.179     Whether this is actually problematic is difficult to assess. Firms need their trainees and qualifiers to perform well for the firm, so the incentive for proper training and supervision tends to be assumed. But there is little public information on the scale of monitoring activities. While it is clear there is very high quality training that is valued by both trainees and employers, there is also a risk that, without effective internal or external oversight, the value of workplace learning may be seriously undermined. This has even been seen as a potential justification for abolishing the training contract (see, eg, LSI, 2010).

5.180     The relative absence of monitoring of continuing workplace learning (including CPD) has been noted. This is also an area where entity-based regulation could play a significant role in providing a proportionate quality assurance system. New South Wales and Queensland in particular have substantial experience of using ‘practice reviews’ or ‘self-assessment audits’ to require incorporated legal practices to self-assess and report on their implementation of appropriate management systems (Briton and McLean, 2008; Mark and Gordon, 2009; Briton, 2011).[13] These are distinct from compliance audits, which may have formal regulatory consequences. Rather, they form part of what in New South Wales has been called an ‘education towards compliance’ strategy (Parker et al, 2010), since they are designed to help firms identify where their systems are non-compliant or only partially compliant with regulation, and the steps that will enable them to develop fuller compliance. This would seem to offer a proportionate approach to monitoring, since it places the onus on the entity to manage and report rather than imposing a high level of external supervision and monitoring on the process.



[1] The period of review may be varied, eg, on first validation a degree may be given approval for three or four years to enable an early re-assessment where the validation panel has reservations about some aspect of provision or delivery, but which are not sufficiently serious to refuse approval.

[2] Though internal validation and review processes will continue.

[3] ‘We are sure that the BSB will wish to continue the monitoring system which seems to have been successful to date’ (BSB, 2008:57).

[4] Though this is the approach adopted by the Office of the Immigration Service Commissioner in accrediting regulated immigration advisers. The qualification framework for patent attorneys also adds a supervised practice component to its assessment structure.

[5] See generally Hanlon et al, 2004; Yorke, 2008; Yorke et al, 2008.

[6] On this last point, particularly, note Elton and Johnston (2002:29):


[It is] important to remember that class boundaries were settled a long time ago when virtually all assessment was based on finals papers. To take them over into schemes where a substantial part of the assessment is by other means is quite indefensible (rather like keeping the marks on a thermometer the same, but change from mercury to alcohol). In particular, it is well known that on average course work assessment leads to higher average marks and smaller spreads of marks. It is very likely that the apparent grade inflation over the past twenty years is due largely to examiners’ ignorance of simple principles of measurement and is not a reflection of either improved learning or greater lenience in marking.


[7] Though there is, as noted, some concomitant evidence of greater graduate recruitment into the newer professions, some of which may not require the same breadth of prior learning.

[8] This is marked in the solicitors’ profession – see the analysis by the LSI (2010:22-24).

[9] The quality assurance requirements demanded of approved training providers have been explored at some length in Chapter 6 of the Literature Review. An overview of periods of supervised practice including supervisor requirements, their prescribed content and method of sign off or assessment appears in Chapter 2, Annex II.

[10] This includes, for example, in-house academies; effective appraisal and mentoring schemes and support for CPD and qualifications such as the Oxford IP Diploma.

[11] Which may involve classroom or online sessions, private study or formal CPD on the topic.

[12] Eg, for LEXCEL requirements see Law Society (n.d. a:8). Note that insurers may also have an interest in quality and risk management of performance. See for example, the Aon ‘Quality Assurance Risk Management portal’ at http://www.aonrm.com/

[13] The two jurisdictions between them have conducted over 1,000 such reviews – see Parker et al, 2010; Briton, 2011.