Minimum average grade: 75%
Group 2 universities
Grade requirement
Minimum average grade: 80%
Universities ranked in the top 100 of the ShanghaiRanking (Ruanke) Chinese University Ranking 2022 (overall) or 2023 (overall)
Institutions other than 'Project 985' universities
As well as the following two universities:
University of Chinese Academy of Sciences
University of Chinese Academy of Social Sciences
Group 3 universities
Grade requirement
Minimum average grade: 85%
Universities ranked 101-200 in the ShanghaiRanking (Ruanke) Chinese University Ranking 2022 (overall) or 2023 (overall)
School of Computer Science – entry requirements for all MSc programmes
Group 1 universities | Grade requirement | Institutions |
Group 2 universities | Grade requirement | Institutions |
Group 3 universities | Grade requirement |
College of Social Sciences – entry requirements for the Masters courses listed below: MA Education (including all pathways); MSc TESOL Education; MSc Public Management; MA Global Public Policy; MA Social Policy; MA Sociology; all Masters programmes in the Department of Political Science and International Studies; all Masters programmes in the International Development Department
Group 1 universities | Grade requirement | Institutions |
Group 2 universities | Grade requirement | Institutions |
Group 3 universities | Grade requirement |
All other Masters programmes (including MBA) – entry requirements
Group 1 universities | Institutions |
Group 2 universities | Grade requirement | Institutions |
Group 3 universities | |
Group 4 universities: applicants from Group 4 universities require a minimum average of at least 85%, together with an outstanding academic background, excellent results in their major subjects, and/or relevant work experience; they will be considered on a case-by-case basis.
Please note:
Holders of the Licenciado/Professional Title from a recognised Colombian university will be considered for our Postgraduate Diploma and Masters degrees. Applicants for PhD degrees will normally have a Maestria or equivalent.
Holders of a good bachelor degree with honours (4 to 6 years) from a recognised university with an upper second class grade or higher will be considered for entry to taught postgraduate programmes. Holders of a good Masters degree from a recognised university will be considered for entry to postgraduate research programmes.
Holders of a good Diploma Visoko Obrazovanje (Advanced Diploma of Education) or Baccalaureus (Bachelors) from a recognised Croatian Higher Education institution with a minimum overall grade of 4.0 out of 5.0, vrlo dobar ‘very good’, for 2:1 equivalence, or 3.0 out of 5.0, dobar ‘good’, for 2:2 equivalence, will be considered for entry to taught postgraduate programmes. Holders of a good Baccalaureus (Bachelors) from a recognised Croatian Higher Education institution with a minimum overall grade of 4.0 out of 5.0, vrlo dobar ‘very good’, or a Masters degree, will be considered for entry to postgraduate research programmes.
Holders of a Bachelors degree (from the University of the West Indies or the University of Technology) may be considered for entry to a postgraduate degree programme. A Class II Upper Division degree is usually equivalent to a UK 2.1. For further details on particular institutions please refer to the list below. Applicants for PhD level study will preferably hold a Masters degree or MPhil from the University of the West Indies.
Holders of a good four-year government-accredited Bachelors degree from a recognised Higher Education college with a minimum overall GPA of 3 out of 4 for 2:1 equivalency, or a GPA of 2.75 out of 4 for 2:2 equivalency; or a good four-year Bachelors degree (Ptychio) from a recognised University, with a minimum overall grade of 6.5 out of 10 for 2:1 equivalency, or 5.5 for 2:2 equivalency; will be considered for entry to taught postgraduate programmes.
Holders of a good Bakalár, or a good pre-2002 Magistr, from a recognised Czech Higher Education institution with a minimum overall grade of 1.5, B, velmi dobre ‘very good’ (post-2004) or 2, velmi dobre ‘good’ (pre-2004), for 2:1 equivalence, or 2.5, C, dobre ‘good’ (post-2004) or 3, dobre ‘pass’ (pre-2004) for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a good Bachelors degree/Candidatus Philosophiae, Professionbachelor or Eksamensbevis from a recognised Danish university, with a minimum overall grade of 7-10 out of 12 (or 8 out of 13) or higher for 2:1 equivalence, or 4-7 out of 12 (or 7 out of 13) for 2:2 equivalence depending on the awarding institution will be considered for entry to taught postgraduate programmes.
Holders of the Licenciado or an equivalent professional title from a recognised Ecuadorian university may be considered for entry to a postgraduate degree programme. Grades of 70% or higher can be considered as UK 2.1 equivalent. Applicants for PhD level study will preferably hold a Magister/Masterado or equivalent qualification, but holders of the Licenciado with excellent grades can be considered.
Holders of a Bachelors degree from a recognised university in Egypt will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4 for 2:1 equivalency or 2.8 for 2:2 equivalency. Applicants holding a Bachelors degree with an alternative grading system will normally be expected to have achieved 75% (Very Good) for 2:1 equivalency or 65% (Good) for 2:2 equivalency. If your grading system differs from those mentioned here, please contact [email protected] for advice on the requirements that will apply to you.
Holders of a good Bakalaurusekraad from a recognised university or Applied Higher Education Institution with a minimum overall grade of 4/5 or B for 2:1 equivalency or 3/5 or C for 2:2 equivalency, or a good Rakenduskõrgharidusõppe Diplom (Professional Higher Education Diploma), will be considered for entry to taught postgraduate programmes.
Students who hold a Masters degree with very good grades (grade B, 3.5/4 GPA or 85%) will be considered for Postgraduate Diplomas and Masters degrees.
Holders of a good Ammattikorkeakoulututkinto (AMK) (new system), an Yrkeshögskoleexamen (YHS) (new system), a Kandidaatti / Kandidat (new system), an Oikeustieteen Notaari or a Rättsnotarie, a good Kandidaatti / Kandidat (old system), a professional title such as Ekonomi, Diplomi-insinööri, Arkkitehti, Lisensiaatti (in Medicine, Dentistry and Veterinary Medicine), or a Maisteri / Magister (new system), Lisensiaatti / Licenciat, Oikeustieteen Kandidaatti / Juris Kandidat (new system) or Proviisori / Provisor from a recognised Finnish Higher Education institution, with a minimum overall grade of 2/3 or 3-4/5 for 2:1 equivalence or 1-2/3 or 2.5-3/5 for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a good three-year Licence, Licence Professionnelle, Diplôme d'Ingénieur / Architecte Diplômé d'État, Diplôme from an École Supérieure de Commerce / Gestion / Politique, or Diplôme d'État Maîtrise of three years duration, or a Maîtrise from a recognised French university or Grande École, will be considered for postgraduate taught study.
Holders of a Bachelors degree will normally be expected to have achieved a minimum overall grade of 13 out of 20, bien, for 2:1 equivalency, or 11 out of 20, assez bien, for 2:2 equivalency, depending on the awarding institution.
Holders of a good three-year Bachelor degree, a Magister Artium, a Diplom or an Erstes Staatsexamen from a recognised university, or a good Fachhochschuldiplom from a Fachhochschule (university of applied sciences), with a minimum overall grade of 2.5 for 2:1 equivalency, or 3.0 for 2:2 equivalency, will be considered for entry to taught postgraduate programmes.
Students from Germany who have completed three years of the Erstes Staatsexamen qualification with a grade point average (GPA) of 10 from the first six semesters of study within the Juristische Universitätsprüfung programme would be considered for entry onto LLM programmes. Students from Germany who have completed the five year Erstes Staatsexamen qualification with a grade point average (GPA) of 6.5 would be considered for entry onto LLM programmes.
Students who hold a Bachelor degree from a recognised institution will be considered for Postgraduate Diplomas and Masters degrees. Most taught Masters programmes require a minimum of an upper second class degree (2.1) with a minimum GPA of at least 3.0/4.0 or 3.5/5.0. Students who have completed a Masters degree from a recognised institution will be considered for PhD study.
Holders of a good four-year Ptychio (Bachelor degree) from a recognised Greek university (AEI) with a minimum overall grade of 6.5 out of 10 for 2:1 equivalency, or 5.5 out of 10 for 2:2 equivalency, or a good four-year Ptychio from a recognised Technical Higher Education institution (TEI) with a minimum overall grade of 7.5 out of 10 for 2:1 equivalency, or 6.5 out of 10 for 2:2 equivalency, will be considered for entry to taught postgraduate programmes.
The 4-year Licenciado is deemed equivalent to a UK Bachelors degree. A score of 75 or higher from Universidad de San Carlos de Guatemala (USAC) can be considered comparable to a UK 2.1, and 60 is comparable to a UK 2.2. Private universities have a higher pass mark, so 80 or higher should be considered comparable to a UK 2.1, and 70 is comparable to a UK 2.2.
The Hong Kong Bachelor degree is considered comparable to British Bachelor degree standard. Students with bachelor degrees awarded by universities in Hong Kong may be considered for entry to one of our postgraduate degree programmes.
Students with Masters degrees may be considered for PhD study.
Holders of a good Alapfokozat / Alapképzés (Bachelors degree) or Egyetemi Oklevel (university diploma) from a recognised Hungarian university, or a Foiskola Oklevel (college diploma) from a recognised college of Higher Education, with a minimum overall grade of 3.5 for 2:1 equivalency, or 3 for 2:2 equivalency, will be considered for entry to taught postgraduate programmes.
Holders of a Bachelors degree of three or four years in duration from a recognised university in India will be considered for postgraduate taught study. Holders of a Bachelors degree will normally be expected to have achieved 55%-60% or higher for 2:1 equivalency, or 50%-55% for 2:2 equivalency, depending on the awarding institution.
Either: A four-year Bachelors degree (first class or very good upper second class)
Or: A three-year Bachelors degree (first class) from recognised institutions in India.
For MSc programmes, the Business School will consider holders of three-year degree programmes (first class or very good upper second class) from recognised institutions in India.
For entry to LLM programmes, Birmingham is happy to accept applications from holders of three- or five-year LLB degrees from prestigious institutions in India.
Holders of the 4-year Sarjana (S1) from a recognised Indonesian institution will be considered for postgraduate study. Entry requirements vary, with a minimum GPA of 2.8.
Holders of a Bachelors degree from a recognised university in Iran with a minimum of 14/20 or 70% will be considered for entry to taught postgraduate programmes.
Holders of a Bachelors degree from a recognised university in Iraq will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4 or 75% for 2:1 equivalency, or 2.8/4 or 70% for 2:2 equivalency.
Holders of a Bachelors degree from a recognised university in Israel will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a score of 80% for 2:1 equivalency or 65% for 2:2 equivalency.
Holders of a good Diploma di Laurea, Licenza di Accademia di Belle Arti, Diploma di Mediatore Linguistico or Diploma Accademico di Primo Livello from a recognised Italian university with a minimum overall grade of 100 out of 110 for 2:1 equivalence, or 92 out of 110 for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Students who hold the Maitrise, Diplome d'Etude Approfondies, Diplome d'Etude Superieures or Diplome d'Etude Superieures Specialisees will be considered for Postgraduate Diplomas and Masters degrees (14-15/20 or Bien from a well ranked institution is considered comparable to a UK 2.1, while a score of 12-13/20 or Assez Bien is considered comparable to a UK 2.2).
Students with a Bachelor degree from a recognised university in Japan will be considered for entry to a postgraduate Masters degree provided they achieve a sufficiently high overall score in their first (Bachelor) degree. A GPA of 3.0/4.0 or a B average from a good Japanese university is usually considered equivalent to a UK 2:1.
Students with a Masters degree from a recognised university in Japan will be considered for PhD study. A high overall grade will be necessary to be considered.
Holders of a Bachelors degree of four years duration from a recognised university in Jordan will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4, 3.75/5 or 75% for 2:1 equivalency or 2.8/4, 3.5/5 or 70% for 2:2 equivalency.
Students who have completed their Specialist Diploma (Маман дипломы/Диплом специалиста) or "Magistr" (Магистр дипломы/Диплом магистра) degree (completed after 1991) from a recognised higher education institution, with a minimum GPA of 2.67/4.00 for courses requiring a UK lower second and 3.00/4.00 for courses requiring a UK upper second class degree, will be considered for entry to postgraduate Masters degrees and, occasionally, directly for PhD degrees. Holders of a Bachelor "Bakalavr" degree (Бакалавр дипломы/Диплом бакалавра) from a recognised higher education institution, with a minimum GPA of 2.67/4.00 for courses requiring a UK lower second and 3.00/4.00 for courses requiring a UK upper second class degree, may also be considered for entry to taught postgraduate programmes.
Students who hold a Bachelor degree from a recognised institution will be considered for Postgraduate Diplomas and Masters degrees. Most taught Masters programmes require a minimum of an upper second class degree (2.1) with a minimum GPA of at least 3.0/4.0 or 3.5/5.0.
Holders of a Bachelors degree of four years duration from a recognised university in Kuwait will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4, 3.75/5 or 75% for 2:1 equivalency or 2.8/4, 3.5/5 or 70% for 2:2 equivalency.
Holders of a good pre-2000 Magistrs or post-2000 Bakalaurs from a recognised university, or a good Postgraduate Diploma (professional programme) from a recognised university or institution of Higher Education, with a minimum overall grade of 7.5 out of 10 for 2:1 equivalency, or 6.5 out of 10 for 2:2 equivalency, will be considered for entry to taught postgraduate programmes.
Holders of a Bachelors degree from a recognised university in Lebanon will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a score of 16/20 or 80% for 2:1 equivalency, or 14/20 or 70% for 2:2 equivalency.
Holders of a Bachelors degree from a recognised university in Libya will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a score of 70% for 2:1 equivalency or 65% for 2:2 equivalency. Alternatively, students will require a minimum of 3.0/4.0 or BB to be considered.
Holders of a good Bakalauras (post 2001), Profesinis Bakalauras (post 2001) or pre-2001 Magistras from a recognised university with a minimum overall grade of 8 out of 10 for 2:1 equivalency, or 7 out of 10 for 2:2 equivalency, will be considered for entry to taught postgraduate programmes.
Holders of a good Bachelors degree or Diplôme d'Ingénieur Industriel from a recognised Luxembourgish Higher Education institution with a minimum overall grade of 16 out of 20 for 2:1 equivalence, or 14 out of 20 for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Students who hold a Masters degree will be considered for Postgraduate Diplomas and Masters degrees (70-74% or A or Marginal Distinction from a well ranked institution is considered comparable to a UK 2.1, while a score of 60-69% or B or Bare Distinction/Credit is considered comparable to a UK 2.2).
Holders of a Bachelors degree from a recognised Malaysian institution (usually achieved with the equivalent of a second class upper or a grade point average minimum of 3.0) will be considered for postgraduate study at Diploma or Masters level.
Holders of a good Bachelors degree from a recognised Higher Education Institution with a minimum grade of 2:1 (Hons) for UK 2:1 equivalency, or 2:2 (Hons) for UK 2:2 equivalency, will be considered for entry to taught postgraduate programmes.
Students who hold a Bachelor degree (Honours) from a recognised institution (including the University of Mauritius) will be considered for Postgraduate Diplomas and Masters degrees. Most taught Masters programmes require a minimum of an upper second class degree (2:1).
Students who hold the Licenciado/Professional Titulo from a recognised Mexican university with a promedio of at least 8 will be considered for Postgraduate Diplomas and Masters degrees.
Students who have completed a Maestria from a recognised institution will be considered for PhD study.
Holders of a Bachelors degree, licence or Maîtrise from a recognised university in Morocco will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a score of 15/20 or 75% for 2:1 equivalency, or 13/20 for 2:2 equivalency.
Students with a good four year honours degree from a recognised university will be considered for postgraduate study at the University of Birmingham. PhD applications will be considered on an individual basis.
Holders of a Bachelors (Honours) degree of four years duration from a recognised university in Nepal will be considered for postgraduate taught study. Students with a Bachelors degree of at least three years duration plus a Masters degree may also be considered for postgraduate study. Degrees must be from a recognised institution in Nepal.
Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.2/4.0 or a 65%-79% average or higher for 2:1 equivalency, or a GPA of 3.0/4.0 or 60%-65% for 2:2 equivalency, depending on the awarding institution.
Holders of a Bachelors degree from a recognised Dutch university, or Bachelors degree from a recognised Hogeschool (University of Professional Education), or a good Doctoraal from a recognised Dutch university, with a minimum overall grade of 7 out of 10 for 2:1 equivalence, or 6 out of 10 for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Students who hold a Bachelor degree (minimum 4 years and/or level 400) from a recognised institution will be considered for Postgraduate Diplomas and Masters degrees. Most taught Masters programmes require a minimum of an upper second class degree (2.1) with a minimum GPA of at least 3.0/4.0 or 3.5/5.0.
Holders of a good three- to six-year Bachelorgrad, Candidatus Magisterii, Sivilingeniør (siv.ing., Engineering) or Siviløkonom (siv.øk., Economics) degree from a recognised Norwegian education institution with a minimum GPA of B/Very Good or 1.6-2.5 for 2:1 equivalency, or a GPA of C/Good or 2.6-3.2 for 2:2 equivalency, will be considered for entry to taught postgraduate programmes.
Holders of a Bachelors degree of four years duration from a recognised university in Oman will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4, 3.75/5 or 75% for 2:1 equivalency or 2.8/4, 3.5/5 or 70% for 2:2 equivalency.
Holders of a Bachelors degree of four years in duration from a recognised university in Pakistan will be considered for postgraduate taught study. Students with a Bachelors degree of at least three years duration followed by a Masters degree of one or two years duration, or holders of a two year Bachelors degree and a two year Masters degree in the same subject, may also be considered for postgraduate study.
Holders of a Bachelors degree will normally be expected to have achieved a GPA of 2.8-3.0/4.0 or 65% or above for 2:1 equivalency, or a GPA of 2.6/4.0 or 60% or above for 2:2 equivalency, depending on the awarding institution.
A two-year degree followed by a three-year LLB will count as a full Bachelors degree.
All qualifications must be from recognised institutions. For further details on recognised institutions, please refer to Pakistan’s Higher Education Commission.
Holders of a Bachelors degree from a recognised university in the Palestinian Territories will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3/4 or 80% for 2:1 equivalency or a GPA of 2.5/4 or 70% for 2:2 equivalency.
Holders of the Título de Licenciado /Título de (4-6 years) or an equivalent professional title from a recognised Paraguayan university may be considered for entry to a postgraduate degree programme. Grades of 4/5 or higher can be considered as UK 2.1 equivalent. The Título Intermedio is a 2-3 year degree and is equivalent to an HNC; it is not suitable for postgraduate entry, but holders of this award could be considered for second-year undergraduate entry or a pre-Masters programme. Applicants for PhD level study will preferably hold a Título de Maestría / Magister or equivalent qualification, but holders of the Título/Grado de Licenciado/a with excellent grades can be considered.
Holders of the Bachiller, Licenciado, or Título Profesional with at least 13/20 may be considered as UK 2.1 equivalent. Applicants for PhD level study will preferably hold a Título de Maestría or equivalent qualification.
Holders of a good post-2001 Licencjat / Inzynier (Bachelors degree), or a pre-2001 Magister, from a recognised Polish university, with a minimum overall grade of 4.5/4+ out of 5, dobry plus ‘better than good’ for 2:1 equivalence, or 4 out of 5, dobry 'good' for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a good Licenciado from a recognised university, or a Diploma de Estudos Superiores Especializados (DESE) from a recognised Polytechnic Institution, with a minimum overall grade of 16 out of 20, bom com distinção ‘good with distinction’, for 2:1 equivalence, or 14 out of 20, bom ‘good’, for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a Bachelors degree of four years duration from a recognised university in Qatar will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4, 3.75/5 or 75% for 2:1 equivalency or 2.8/4, 3.5/5 or 70% for 2:2 equivalency.
Holders of a good Diplomă de Licenţă, Diplomă de Inginer, Diplomă de Urbanist Diplomat, Diplomă de Arhitect, Diplomă de Farmacist or Diplomã de Doctor-Medic Arhitect (Bachelors degree) from a recognised Romanian Higher Education institution with a minimum overall grade of 8 out of 10 for 2:1 equivalence, or 7 out of 10 for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a good Диплом Бакалавра (Bakalavr) degree with a minimum grade point average (GPA) of 4.0 from recognised universities in Russia may be considered for entry to taught postgraduate programmes/MPhil degrees.
Students who hold a 4-year Bachelor degree with at least 16/20 or 70% will be considered for Postgraduate Diplomas and Masters degrees.
Holders of a Bachelors degree of four years duration from a recognised university in Saudi Arabia will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4, 3.75/5 or 75% for 2:1 equivalency or 2.8/4, 3.5/5 or 70% for 2:2 equivalency.
Students who hold a Maitrise, Diplome d'Etude Approfondies, Diplome d'Etude Superieures or Diplome d'Etude Superieures Specialisees will be considered for Postgraduate Diplomas and Masters degrees. A score of 14-15/20 or Bien from a well ranked institution is considered comparable to a UK 2.1, while a score of 12-13/20 or Assez Bien is considered comparable to a UK 2.2.
Students who hold a Bachelor (Honours) degree from a recognised institution with a minimum GPA of 3.0/4.0 or 3.5/5.0 (or a score of 60-69% or B+) from a well ranked institution will be considered for most of our Postgraduate Diplomas and Masters degrees with a 2:1 requirement.
Students holding a good Bachelors Honours degree will be considered for postgraduate study at Diploma or Masters level.
Holders of a good three-year Bakalár or pre-2002 Magister from a recognised Slovakian Higher Education institution with a minimum overall grade of 1.5, B, Vel’mi dobrý ‘very good’ for 2:1 equivalence, or 2, C, Dobrý ‘good’ for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a good Diploma o pridobljeni univerzitetni izobrazbi (Bachelors degree), Diplomant (Professionally oriented first degree), Univerzitetni diplomant (Academically oriented first degree) or Visoko Obrazovanja (until 1999) from a recognised Slovenian Higher Education institution with a minimum overall grade of 8.0 out of 10 for 2:1 equivalence, or 7.0 out of 10 for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Students who hold a Bachelor Honours degree (also known as Baccalaureus Honores / Baccalaureus Cum Honoribus) from a recognised institution will be considered for Postgraduate Diplomas and Masters degrees. Most Masters programmes will require a second class upper (70%) or a distinction (75%).
Holders of a Masters degree will be considered for entry to postgraduate research programmes.
Holders of a Bachelor degree from a recognised South Korean institution (usually with the equivalent of a second class upper or a grade point average of 3.0/4.0 or 3.2/4.5) will be considered for Masters programmes.
Holders of a good Masters degree from a recognised institution will be considered for PhD study on an individual basis.
Holders of a good Título de Licenciado / Título Universitario Oficial de Graduado (Grado) /Título de Ingeniero / Título de Arquitecto from a recognised Spanish university with a minimum overall grade of 7 out of 10 for 2:1 equivalence, or 6 out of 10 for 2:2 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a Special or Professional Bachelors degree of four years duration from a recognised university in Sri Lanka will be considered for postgraduate taught study.
Holders of a Bachelors degree will normally be expected to have achieved 60-74% or a CGPA of 3.30/4.0 or B+ for 2:1 equivalency, or 55-59% or a CGPA of 3.0/4.0 or B for 2:2 equivalency, depending on the awarding institution.
Holders of a good Kandidatexamen (Bachelors degree) or Yrkesexamen (Professional Bachelors degree) from a recognised Swedish Higher Education institution with the majority of subjects with a grade of VG (Väl godkänd) for 2:1 equivalency, or G (Godkänd) for 2:2 equivalency, will be considered for entry to taught postgraduate programmes. Holders of a good Kandidatexamen (Bachelors degree) or Yrkesexamen (Professional Bachelors degree) from a recognised Swedish Higher Education institution with the majority of subjects with a grade of VG (Väl godkänd), and/or a good Magisterexamen (Masters degree), International Masters degree or Licentiatexamen (comparable to a UK MPhil), will be considered for entry to postgraduate research programmes.
Holders of a good Baccalauréat universitaire / Diplom / Diplôme, Lizentiat / Licence, or Staatsdiplom / Diplôme d'État from a recognised Swiss Higher Education institution, with a minimum GPA of 5/6, 8/10 or 2/5 (gut / bien / bene, 'good') for 2:1 equivalence, will be considered for entry to taught postgraduate programmes.
Holders of a Bachelors degree from a recognised university in Syria will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a score of 70%, or ‘very good’, for 2:1 equivalency, or 60%, or ‘good’, for 2:2 equivalency.
Holders of a good Bachelor degree (from 75% to 85% depending upon the university in Taiwan) from a recognised institution will be considered for postgraduate Masters study. Holders of a good Masters degree from a recognised institution will be considered for PhD study.
Students who hold a Bachelor degree from a recognised institution will be considered for Postgraduate Diplomas and Masters degrees. Most taught Masters programmes require a minimum of an upper second class degree (2.1). Students who have completed a Masters degree from a recognised institution will be considered for PhD study.
Holders of a good Bachelors degree from a recognised institution will be considered for postgraduate study at Diploma or Masters level. Holders of a Bachelors degree from a prestigious institution (see list below) will normally be expected to have achieved a GPA of 3.0/4.0 for 2:1 equivalency or 2.7 for 2:2 equivalency. Applicants with grades slightly below these requirements may also be considered for an offer if they have a relevant Bachelors degree, good scores in relevant modules, or relevant work experience.
Holders of a Bachelors degree from all other institutions will normally be expected to have achieved a GPA of 3.2/4.0 for 2:1 equivalency, or 2.8 for 2:2 equivalency.
Prestigious institutions:
Assumption University
Chiang Mai University
Chulalongkorn University
Kasetsart University
Khon Kaen University
King Mongkut University of Technology - Thonburi (known as KMUTT or KMUT)
Mahidol University
Prince of Songkla University
Srinakharinwirot University
Thammasat University
Holders of a bachelor degree with honours from a recognised Caribbean and West Indies university may be considered for entry to a postgraduate degree programme.
First (1st) | 3.5 GPA, B+, 1st, First Class Honours degree |
Upper Second (2:1) | 3.0 GPA, B, 2.1, Class II Upper Division Honours degree |
Lower Second (2:2) | 2.5 GPA, B-, 2.2, Class II Lower Division Honours degree |
Students with a Bachelors degree from the following universities may be considered for entry to postgraduate programmes:
Students from all other institutions with a Bachelors and a Masters degree or relevant work experience may be considered for postgraduate programmes.
Grading Schemes
1-5 scale, where 1 is the highest: 2.1 = 1.75; 2.2 = 2.25
Out of 4.0, where 4 is the highest: 2.1 = 3.0; 2.2 = 2.5
Letter grades and percentages: 2.1 = B / 3.00 / 83%; 2.2 = C+ / 2.5 / 77%
Holders of a postdoctoral qualification from a recognised institution will be considered for PhD study. Students may be considered for PhD study if they have a Masters from one of the above listed universities.
Holders of a Lisans Diplomasi with a minimum grade point average (GPA) of 3.0/4.0 from a recognised university will be considered for postgraduate study at Diploma or Masters level.
Holders of a Yuksek Diplomasi from a recognised university will be considered for PhD study.
Holders of a Bachelors degree of four years duration from a recognised university in the UAE will be considered for postgraduate study. Holders of a Bachelors degree will normally be expected to have achieved a GPA of 3.0/4, 3.75/5 or 75% for 2:1 equivalency or 2.8/4, 3.5/5 or 70% for 2:2 equivalency.
Students who hold a Bachelor degree from a recognised institution will be considered for Postgraduate Diplomas and Masters degrees. Most Masters programmes will require a second class upper (2.1) or a GPA of 3.5/5.0.
Holders of a good four-year Bachelors degree/ Диплом бакалавра (Dyplom Bakalavra), Диплом спеціаліста (Specialist Diploma) or a Dyplom Magistra from a recognised institution, with a minimum GPA of 4.0/5.0, 3.5/4, 8/12 or 80% or higher for 2:1 equivalence, or a GPA of 3.5/5.0, 3.0/4, 6/12 or 70% for 2:2 equivalence, depending on the awarding institution, will be considered for entry to taught postgraduate programmes.
The University will consider students who hold an Honours degree from a recognised institution in the USA with a GPA of:
Please note that some subjects which are studied at postgraduate level in the USA, e.g. Medicine and Law, are traditionally studied at undergraduate level in the UK.
Holders of the Magistr Diplomi (Master's degree) or Diplomi (Specialist Diploma), awarded by prestigious universities, who have attained high grades in their studies will be considered for postgraduate study. Holders of the Fanlari Nomzodi (Candidate of Science), where appropriate, will be considered for PhD study.
Holders of the Licenciatura/Título or an equivalent professional title from a recognised Venezuelan university may be considered for entry to a postgraduate degree programme. Scales of 1-5, 1-10 and 1-20 are used; an overall score of 70% or equivalent can be considered equivalent to a UK 2:1. Applicants for PhD-level study will preferably hold a Maestria or equivalent qualification.
Holders of a Bachelors degree from a recognised Vietnamese institution (usually achieved with the equivalent of a second class upper or a minimum GPA of 7.0) will be considered for postgraduate study at Diploma or Masters level. Holders of a Masters degree (thac si) will be considered for entry to PhD programmes.
Students who hold a Masters degree with a minimum GPA of 3.5/5.0 or a mark of 2.0/2.5 (A) will be considered for Postgraduate Diplomas and Masters degrees.
Students who hold a good Bachelor Honours degree will be considered for Postgraduate Diplomas and Masters degrees.
Most modules include a substantial workshop element, directly focussing on student work.
We have three terms per year: the autumn, spring and summer terms. Term dates can be found on our website.
The programme is made up of two 40-credit modules (Writer's Workshop, Creative Writing Masterclass) and two 20-credit modules (Intertextuality, Poem as Story). As a full-time student, you will take one 20-credit module and one 40-credit module in each of the first two terms, followed by your dissertation. You can typically expect six hours of classroom time per week: two for a 20-credit module and four for a 40-credit module. If you are a part-time student, we advise that you complete the 40-credit modules in your first year and the 20-credit modules in your second year, allowing you more time to focus on your dissertation in year two.
Each module represents a total of 200 hours of study time, including preparatory reading, homework and assignment preparation.
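The contact-hour figures above can be sanity-checked with a short sketch. The two-hours-per-20-credit-module rule is taken from the programme description; the function name is illustrative only:

```python
# Weekly classroom (contact) hours by module size, as stated in the
# programme description: 2 hours for a 20-credit module, 4 hours for
# a 40-credit module.
CONTACT_HOURS_PER_WEEK = {20: 2, 40: 4}

def weekly_contact_hours(module_credits):
    """Total weekly classroom hours for the modules taken in one term."""
    return sum(CONTACT_HOURS_PER_WEEK[c] for c in module_credits)

# A full-time term combines one 20-credit and one 40-credit module.
print(weekly_contact_hours([20, 40]))  # 6 hours per week
```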
As a postgraduate student in the College of Arts and Law, you have access to the Academic Writing Advisory Service (AWAS) which aims to help your transition from undergraduate to taught Masters level, or back into academia after time away. The service offers guidance on writing assignments and dissertations for your MA/MSc programme with individual support from an academic writing advisor via tutorials, email and the provision of online materials.
International students can access support for English Language development and skills through the Birmingham International Academy (BIA).
We have three teaching terms per year: the autumn, spring and summer terms. Term dates can be found on our website.
As a full-time student, you will typically take three modules in each of the first two terms, followed by your dissertation. If you are a part-time student, you will typically take three modules across each year, followed by your dissertation.
The University of Birmingham is the top choice for the UK's major employers searching for graduate recruits, according to The Graduate Market 2024 report.
Your degree will provide excellent preparation for your future career, but this can also be enhanced by a range of employability support services offered by the University and the College of Arts and Law.
The University's Careers Network provides expert guidance and activities especially for postgraduates, which will help you achieve your career goals. The College of Arts and Law also has a dedicated careers and employability team who offer tailored advice and a programme of College-specific careers events.
You will be encouraged to make the most of your postgraduate experience and will have the opportunity to:
What’s more, you will be able to access our full range of careers support for up to 2 years after graduation.
Postgraduates in the Department of Film and Creative Writing develop a range of skills, including the ability to lead and participate in discussions; critical thinking and an appreciation of different theoretical contexts; the ability to develop opinions and new ideas; and an aptitude for thinking and working creatively with others. While some graduates go on to careers in related industries, such as writing, media and television, others have used their transferable skills to pursue roles in fields such as advertising, teaching, and the heritage and cultural sectors.
Published 28 August 2024
© Crown copyright 2024
This publication is licensed under the terms of the Open Government Licence v3.0 except where otherwise stated. To view this licence, visit nationalarchives.gov.uk/doc/open-government-licence/version/3 or write to the Information Policy Team, The National Archives, Kew, London TW9 4DU, or email: [email protected].
Where we have identified any third party copyright information you will need to obtain permission from the copyright holders concerned.
This publication is available at https://www.gov.uk/government/publications/research-on-parent-and-pupil-attitudes-towards-the-use-of-ai-in-education/research-on-public-attitudes-towards-the-use-of-ai-in-education
1.1 Background
The Responsible Technology Adoption Unit (RTA) within the Department for Science, Innovation and Technology (DSIT) commissioned this research in partnership with the Department for Education (DfE) to understand how parents and pupils feel about the use of AI tools in education.
As AI tools such as large language models (LLMs) become more advanced, there are opportunities for such tools to support both teachers and pupils by creating tailored content and support, as well as streamlining processes. However, there are many questions that need to be answered before AI is implemented widely.
The project sought to answer the following research questions:
Under what circumstances, if any, are parents and pupils comfortable with AI tools being used in education?
Under what circumstances, if any, are parents and pupils comfortable with pupils’ work being used to optimise the performance of AI tools?
Through deliberative dialogue with parents and pupils, Thinks Insight and Strategy (Thinks) explored their concerns, hopes, and expectations, as well as the conditions for use of AI in this context.
Thinks engaged a total of 108 parents and pupils across three locations in England in a mix of face-to-face and online activities. Each participant took part in four to six hours of engagement, following the structure below:
Inform: Participants were provided with information about the purpose of the research, as well as key principles such as machine learning, data protection, intellectual property, and current and potential AI applications in education.
Debate: Participants explored the potential social, ethical, legal, financial, and cultural issues associated with use of AI in education, and were provided with a range of views from experts and officials.
Decide: At the end of each session, each group of participants articulated their preferred conditions for use and explored areas of consensus and difference.
1. Parents and pupils frequently share personal information online, often without considering the implications. The benefits and convenience of using online services, especially those that provide a tailored experience, tend to outweigh any privacy concerns.
2. While awareness of AI among both parents and pupils was high, understanding did not run deep. AI is often associated with robots or machines, and fictional dystopian futures. Only some – those with more knowledge of or exposure to AI – thought of specific applications such as LLM-powered tools.
3. As a result, views on the use of AI in education were initially sceptical – but there was openness to learning more. Initial concerns about AI in education were often based on a lack of understanding or imagination of how it could be used.
4. Parents and pupils agreed that there are clear opportunities for teachers to use AI to support them in their jobs. They were largely comfortable with AI being used by teachers, though more hesitant about pupils interacting with it directly.
5. By the end of the sessions, participants understood that pupil work and data are needed to optimise AI tools. They were more comfortable with this when data is anonymised or pseudonymised, and when there are clear rules for data sharing both with private companies and across government.
6. The main concerns regarding AI use centred on overreliance – both by teachers and pupils. Participants were worried about the loss of key social and technical skills and reduced human contact-time leading to unintended adverse outcomes.
7. The research showed that opinions on AI tools are not yet fixed. Parents’ and pupils’ views of and trust in AI tools fluctuated throughout the sessions, as they reacted to new information and diverging opinions. This suggests that it will be important to build trust and continue engagement with different audiences as the technology becomes more established.
Participants agreed on some key conditions for the use of AI in education and the use of pupil work and data to optimise AI tools:
Human oversight: Human involvement in AI use to ensure teacher roles are not displaced, to correct for error and unfair bias, and to provide safeguarding.
Parent and pupil permissions: Providing parents and pupils with the necessary information to make informed decisions about the use of their data.
Standardisation and regulation: Ensuring that tools introduced at schools are of a uniform standard to avoid exacerbation of inequalities, with strict oversight of tech companies providing the tools.
Age and subject restrictions: Using AI tools only where appropriate and where they add value. Strict age restrictions on direct interaction with AI.
Profit sharing: Ensuring that tech companies share some of their profits so that these can be reinvested into the education system and benefit schools and pupils – while recognising that private companies will need to be incentivised to develop better tools.
While this report describes the views of the parents and pupils who participated in the research, the suggestions contained within would require further research, discussion and consultation (and use of other types of evidence) prior to translation into policy and practice.
2.1 Project background
The use of AI in education has the potential to support pupils’ learning and help reduce teacher workload. But as with any new or emerging technology, there are a range of issues which need to be considered before this is implemented widely.
The Department for Education (DfE) and the Responsible Technology Adoption Unit (RTA) within the Department for Science, Innovation and Technology (DSIT) wanted to understand how parents and pupils feel about AI tools being used in education, as well as what they think about pupils’ work (e.g. schoolwork, homework, exam scripts) being used to improve AI tools.
This research aimed to create a space for pupils and parents to learn about and discuss the issues, consider their preferences for the use of AI in education, and inform DfE’s approach to implementing AI within the education system.
The overall objectives of this project were to understand:
In which circumstances, if any, are parents and pupils comfortable with AI tools being used in education?
a. Which kinds of use cases are acceptable?
b. How much human oversight do parents and pupils want to see?
c. What concerns need to be addressed?
d. What wider factors affect acceptability?
In which circumstances, if any, are parents and pupils comfortable with pupils’ work being used to optimise the performance of AI tools?
a. Should parental agreement be required? If so, would parents give permission, and under which conditions?
b. Who should control how the work produced by pupils is used and accessed?
c. Who, if anyone, should profit from AI which is optimised with pupils’ work?
Thinks Insight & Strategy (Thinks) recruited six cohorts of parents across three locations in England. Three cohorts took part in an in-person workshop, while the other three took part in online workshops:
Parents of children with special educational needs and/or disabilities (SEND)
Parents of children of pre-school age
Parents of primary school pupils
Parents of pre-GCSE pupils
Parents of GCSE pupils
Parents of post-GCSE pupils (aged 17-18)
We also recruited three cohorts of pupils across the three locations for face-to-face workshops, all attending state-funded schools:
Pre-GCSE pupils
GCSE pupils
Post-GCSE pupils (aged 17-18)
Table 1 below shows the breakdown of parent and pupil cohorts across the three fieldwork locations, by mode (in-person or online). A demographic sample breakdown can be found in the Appendix.
In-person fieldwork

| Cohort | Location 1 | Location 2 | Location 3 | Total |
|---|---|---|---|---|
| Parents of pre-GCSE pupils | 6 | 6 | | 12 |
| Parents of GCSE pupils | 6 | 6 | | 12 |
| Parents of post-GCSE pupils | 6 | 6 | | 12 |
| Total parents | 12 | 12 | 12 | 36 |
| Pre-GCSE pupils | 6 | 6 | | 12 |
| GCSE pupils | 6 | 6 | | 12 |
| Post-GCSE pupils | 6 | 6 | | 12 |
| Total children | 12 | 12 | 12 | 36 |

Online fieldwork

| Cohort | Location 1 | Location 2 | Location 3 | Total |
|---|---|---|---|---|
| Parents of children of pre-school age | 6 | 6 | | 12 |
| Parents of primary school pupils | 6 | 6 | | 12 |
| Parents of pupils with SEND | 6 | 6 | | 12 |
| Total parents | 12 | 12 | 12 | 36 |
In-person workshops
We engaged a total of 36 parents/carers (referred to as “parents” throughout) and their children aged 11-18 (36 in total) in six-hour workshops. Workshops took place in three locations across England on 24 February, 25 February and 2 March 2024. In these workshops, we used the following structure:
Inform: First, we established the purpose of the dialogue and the reason for involving parents and pupils, providing contextual information about data, foundation models and potential applications. This included showing videos from those involved in the development of AI educational tools and a participant-led demo of some educational AI products.
Debate: After a short break, we explored the potential social, ethical, legal, financial, and cultural issues associated with use of AI in education. This included watching videos from government ministers, officials and specialists in education who explained some of the potential benefits and risks of AI in education.
Decide: After lunch, we brought together participants in their groups to compare views and explore areas of consensus, conditions for use and preferences. This involved the groups discussing different AI use case suggestions and constructing their ideal future scenario.
Online workshops
We engaged a further 36 parents in two online workshops, on 21 February and 28 February 2024, each lasting two hours. We followed the same deliberative research process structure divided over the two sessions.
Inform: In the first workshop, we focused primarily on informing the participants and providing contextual information. We showed videos from those involved in the development of AI educational tools and used voting tools to interact with participants. This workshop ended by asking participants to reflect on any concerns or needs for reassurance they might have.
Debate and Decide: In the second workshop, participants were shown videos from government ministers, officials and specialists in education who explained the benefits and risks of AI in education. Following discussion on these topics, participants ranked potential uses of data and pupil work according to levels of comfort, before offering their thoughts and recommendations.
3.1 Summary
While awareness of AI is relatively high, understanding does not run deep. Most participants had heard of and used AI-powered tools, although not necessarily on purpose.
With increasing use of AI, many accept it as part of modern life, but remain uneasy about the perceived invasiveness of the technology.
However, this generally did not stop participants from using and sharing their data with services that offer an improved experience based on machine learning, such as tailored recommendations or GPS. Expressed concerns about privacy were therefore at odds with actual behaviour.
Most parents had not previously considered the application of AI tools in education beyond the risk of pupils using LLMs to plagiarise. However, for many children, the use of technology is already a big part of their everyday lives at school, meaning they viewed this as a natural extension, or a continuation, of what is already happening.
At the start of each workshop, participants were asked to list their first associations with the terms “AI” or “Artificial Intelligence”. Although awareness of AI as a “hot topic” was high, understanding of the technology did not run deep. Both pupils and parents were likely to associate AI with robots or machines, but also with social media, streaming and shopping platforms, apps, and websites. In particular, they thought of chatbots, targeted advertising, and algorithms recommending products. Despite some awareness, only a handful of participants across the parent and pupil samples had purposely interacted with LLM-powered tools or proactively used them regularly. When prompted with some other less obvious examples (such as GPS, AutoCorrect and predictive text) however, most discovered that they had much more exposure to AI than they had originally thought.
Parent of primary school pupil, Newcastle:
[An online video streaming platform] has tracked who I view and what kinds of people I have viewed and followed and brings up related videos.
Most participants accepted the use of AI in various settings, products, or services, as an inevitability of modern life. However, many expressed unease about the technology, due to its perceived invasiveness both in terms of its increasing ubiquity and its reliance on personal data-sharing. Generally, participants found it easier to think of the risks of AI than benefits, even where they acknowledged that it could improve efficiency or convenience. These concerns often centred on humans being replaced by machines resulting in job displacement, but also machines not being an adequate replacement for humans because they are perceived to lack more nuanced understanding – for example, in customer service settings.
Younger children were generally the least worried about AI, often because they had not given much thought to it, were less concerned about data security, and more used to technology playing a role in their lives. Older children, and particularly those aged 17-18, were more likely to have used AI as well as to have a general awareness of its use. Some had used LLMs and found them useful, though only to an extent, as they had quickly found limitations of the technology. Even among children and young people, some aspects of AI were seen as “creepy” or going too far, particularly AI features used by social media platforms that mimicked human conversation too closely or felt overly friendly in tone to users.
Post-GCSE Pupil, Birmingham:
I use [LLM-powered tool] to help with my essays; it’s quicker.
Post-GCSE Pupil, Birmingham :
Sometimes it asks really random questions and you think do you need to know that?
The use of personal data in relation to AI was also a concern for both parents and children. In particular, concerns involved the sale of data to third parties by companies developing AI tools and misuse of data by other humans (for example, in the creation of deepfakes). Despite these concerns, parents and pupils reported frequently sharing their personal data online. They noted that personal information is shared to create accounts, verify their identity, and receive relevant and tailored information or experiences. They also acknowledged that the benefit and convenience of sharing this data largely outweighed their concerns. Participants noted that they had little understanding of, or gave little consideration to, what happens to their data once it has been shared, beyond a general assumption that companies store and sell it to third parties to make a profit. In part, this may be because the benefits of sharing personal data were felt to be more immediate and tangible than the risks (such as a hypothetical data breach), which can feel more abstract and far-removed as a possibility.
I’m not actually sure what [the app] does with my data, other than checking that I’m old enough to view the videos and the content is suitable.
Post-GCSE pupil, Newcastle:
[What does [a video streaming service] do with your information?] Stores it, recommends you shows, brings new things in, sells your information.
Compared with their children, parents demonstrated higher awareness of the risks of data sharing, both in relation to their own data, and that of their children. They were concerned about the information that was being put “out there”, but also felt resigned to it. A handful of parents with higher levels of knowledge of technology were excited about the opportunities offered by AI, though still wary.
Parent of a child with SEND, Bristol:
Helping overcome barriers is good, but thinking about, for example, language, research, literature, it might take away from that. Create an overreliance on tech and developing social skills. What would data mining implications be? Would teachers lose jobs?
Stimulus provided:
Before exploring views of AI in an education context, participants were shown a video explaining what AI is and why it is important to understand and engage with it.
In the context of limited understanding of AI, initial views of the use of AI in education were mostly sceptical. Most had not considered the use of AI in education before and found it difficult to imagine how it might be used within schools. Initially, participants were more likely to imagine pupils interacting directly with AI, rather than teachers using it to support them in their roles. Many participants immediately thought of the replacement of teachers with machines, in line with their initial concerns about human job displacement. This was rejected by participants, as they felt it was important for pupils to interact with human teachers. In addition, underlying assumptions about AI (and technology in general) making people lazy, particularly held by parents, also coloured spontaneous perceptions.
Parent of pre-GCSE pupil, Newcastle:
As long as the humans are not replaced, if it streamlines and allows for more personal time [with teachers], that’s got to be a benefit.
As a result of this relatively limited prior knowledge and understanding of AI, it was initially unclear to both parents and children what the potential benefits of AI might be for teaching quality or pupil attainment. There was also uncertainty about what the use(s) of AI in various educational settings could be in practice. However, with scepticism largely grounded in a lack of experience or understanding, participants expressed an openness to hearing more. This was particularly true of pupils, many of whom felt more comfortable sharing their data and using technology relative to parents. Some pupils had already used AI in an educational context or knew that their teachers did. Even those who had not actively used AI in a school setting were familiar with the idea of existing software supporting pupils and teachers. As a result, most pupils felt that AI use in schools was already becoming the norm and further use would be a natural progression of technology application, even if they had not fully considered the implications.
4.1 Summary
Both parents and pupils thought the main advantage of AI use in education was its potential to support teachers and, by extension, improve pupils’ learning experience.
Parents, and to a lesser extent pupils, were much less certain about pupils interacting directly with AI, especially unsupervised – even though they could see benefits in AI providing highly tailored support.
Both parents’ and pupils’ main concerns revolved around the quality of teaching, overreliance on AI resulting in loss of key social and technical skills, as well as the suitability of AI to address certain subjects and pupil needs.
Across the board, participants were more comfortable with use cases where AI supports teachers (for example, preparing a lesson) or is used for “lower stakes” tasks (for example, marking a class test, rather than an exam).
There was a sense that AI use should always be optional, both for teachers and pupils, and that parents should have a say in whether and how AI is used – though there was little acknowledgement of the practical issues that could arise in introducing AI on an opt-in/out basis.
Before exploring more detailed uses of AI in education, participants were provided with stimulus in the form of demonstrations of AI tools currently available to support with learning or in development, and videos explaining:
Machine learning and its potential uses in education
Current and potential benefits of AI for teachers and pupils
Current and potential risks of AI use, including data protection, privacy, intellectual property, and bias
The strategic benefits and risks of AI use in education from a policy perspective, and how they can be managed
Supporting teachers
The biggest perceived opportunity for AI use in education was to support teachers in generating classroom materials and managing feedback in more efficient ways. The perception was that this could reduce administration tasks and increase the attractiveness of teaching as a profession.
Across the workshops, parents and pupils felt most comfortable with teachers using AI as a tool to support lesson delivery (for example, by helping to plan lessons). They were less comfortable with the idea of pupils engaging directly with AI tools, as they wanted to ensure some level of human oversight and pupil-teacher interaction.
Pre-GCSE Pupil, Bristol:
It can help teachers making slides, like information slides, and answer questions about stuff.
Parent of post-GCSE Pupil, Birmingham:
It’s less stressful for teachers to sort the homework, lesson plans… and [gives them] more time to be present and support the kids.
Using AI as a support to teachers was felt to enable better learning experiences.
There was a higher level of comfort with AI when it was seen as enabling teachers to redirect their time and energy into delivering high quality education. For parents in particular, the terms “helping” and “assisting” the teacher reassured them AI would play a supporting role, rather than taking over the teacher’s role, and alleviated parents’ concerns about the risks of potential overreliance on AI (see section 4.2.1 Lower quality of learning). Interestingly, some parents and pupils assumed that the introduction of AI tools would lead to more contact time between teachers and pupils – though they were not clear on whether they would expect pupils to spend more time in school.
I think it’s great. I’m impressed by it. I think if teachers have got that kind of tool to help with the administrative side, they have more time in the classroom for actual teaching rather than having to go home and mark and make lesson plans.
The potential for AI tools to support teachers to provide detailed, timely, high-quality feedback was considered to be a key benefit. Parents felt that better quality feedback would help them understand their child’s progress, and identify areas where they need more support. As a result, parents were supportive of the benefits of using AI tools to help teachers to provide more frequent and personalised feedback.
Parent of pre-school pupil, Bristol:
It would be more targeted to my child; it would collect so much information on my child that it would support and help their learning. To show [what their] focus area [is], what subjects, might show me what might be good extra learning.
Both parents and pupils recognised that some AI tools could make learning more fun and engaging for pupils by generating visually engaging and creative resources that teachers might not currently have the time to create. During the in-person workshops, participants were encouraged to explore an LLM-powered tool using tablets and some suggested prompts. Many were impressed by the ways in which the tool could help quickly bring topics to life in the classroom, such as when assuming the character of a historical or literary figure and answering questions asked by pupils from the perspective of that character.
Some pupils saw an opportunity for LLM-powered tools to inspire them to be more creative in their work, either by expanding on pupils’ own ideas, or by providing a starting point on which pupils could then layer their own thinking and creativity, such as when writing an essay or story.
Using AI in these ways was felt to be exciting and engaging, bringing topics to life and helping pupils develop their own ideas. Participants, particularly pupils, expressed a more positive sentiment about AI tools creating a more interactive learning environment where they could input ideas and get interesting new feedback generated by the AI. This use of AI in education was seen by some as more acceptable than auto-correcting pupils’ work, or supplying ready-made answers to assignment questions for pupils to copy and paste. Some pupils felt more positive about AI being used interactively to gain ideas and enrich understanding, rather than simply inputting questions and extracting answers.
[Future vision of AI] To generate interactive lesson plans and deliver lessons that are more engaging.
While there was some interest in the opportunities for AI to provide personalised learning, most parents – and pupils – had concerns about the quality AI could achieve as a personal tutor.
Across the workshops, most participants emphasised the value of one-to-one support and feedback in education but acknowledged that it can be hard to attain for some, and is dependent on teachers’ availability. AI potentially providing the same support as a one-to-one personal tutor, immediately available and tailored to pupils’ needs, was seen as a clear opportunity to improve the quality of pupils’ education. We also heard from pupils that some felt personalised AI tools could “make learning more interactive” and be able to assess and identify areas pupils might need support in.
Parent of post-GCSE pupil, Birmingham:
It [AI tutor] might challenge them [the child] when the class isn’t ready to go on, but they could.
Participants recognised the potential for AI to offer more tailored and targeted support calibrated to the specific needs of individual pupils. Some pupils felt that personalised AI tools could help them improve by providing support with subjects they struggle with (such as via extension activities or summary sheets). Some parents of children with SEND saw an opportunity for AI tools to provide individualised support for their child, ranging from supporting speech or writing, tracking their progress, or even using AI as a tool for early identification of potential SEND.
Upon closer consideration of AI providing personalised learning experiences, parents and pupils raised concerns about the amount of data the AI would need in order to provide personalised experiences. Parents were also concerned about pupils using AI unsupervised – which they perceived would be the case if AI was used in this way. One barrier to using AI in this way was the association that some made with unsophisticated customer service “chatbots”, which most had found lacked nuance and understanding of individual situations. Despite some perceived benefits, parents of children with SEND in particular were hesitant about their child using these tools unsupervised due to concerns about unfair bias, lack of sensitivity, or access to harmful content.
As a result, whilst many saw an opportunity for AI to fill a gap in personalised learning, parents and pupils were unconvinced that the quality of the personalised learning that AI could deliver would be sufficient.
Parent of GCSE pupil, Birmingham:
The potential is phenomenal, it’s like the child would have its own teaching assistant, there has to be a big buy-in from the kids, parents and teachers themselves. Thinking about the implementation though, you’re looking at farm size data storage, how is that funded, and the upkeep of that as well, that’s a big cost.
Parent of post-GCSE pupil, Newcastle:
It would need a lot of data about your child to support your child in each area that they’re struggling in.
Lower quality of learning
Concerns about overreliance on AI were prevalent among participants, particularly the perception that AI could reduce the quality of education and socialisation through decreased human contact hours.
Human-to-human learning was seen as critical to providing children with a good education. We heard that there would need to be clear boundaries for the use of AI to ensure pupils benefit from social interaction with their teachers. This concern was particularly pronounced among parents of children with SEND.
This worry also compounded an overall concern about the amount of time children spend on screens. Some parents associated AI use in education with yet another chunk of their child’s time being spent on a screen rather than having human contact. There was uncertainty about what the long-term effects of prolonged screen time might be on a child’s physical and mental wellbeing. Some parents suggested placing a time limit on the use of AI in the classroom and at home. Without this, there was felt to be a risk that, when combined with use of personal devices during their leisure time, children would never have a break from screens.
Following the experience of the pandemic, participating pupils were particularly keen to maximise face-to-face learning experiences and were consequently less positive regarding uses of AI which could result in increased screen time to the detriment of face-to-face activities.
GCSE pupil, Birmingham:
I missed the social interaction of being in school [during the lockdowns implemented in response to Covid-19].
I feel restricted [when learning] online.
Parent of primary school pupil, Birmingham:
Too much screen time isn’t good for their head, it affects their sleep.
Related to their spontaneous concerns about AI’s potential impact on the labour market, participants worried about job losses caused by the displacement of teachers by AI.
In participants’ initial reactions prior to guided discussion, we heard concerns about AI being used to make up for teacher shortages, effectively making human teachers redundant in the process. Participants balanced this concern against what they saw as the key opportunity: AI freeing up teachers’ time to do what they do best.
Parent of pre-GCSE pupil, Bristol:
What will the teacher be doing with the saved time? And how do you know the tasks being given will be relevant?
Both parents and pupils were concerned that the use of AI in education could result in pupils failing to develop key skills.
In the context of overreliance on AI, there was concern that pupils could use AI to complete tasks such as maths problems or creative writing with little of their own input. There was also unanimous concern about AI leading to plagiarism. This overreliance could lead pupils to become unable to perform key skills without AI.
GCSE pupil, Bristol:
It feels really detrimental to use a lot of AI, because in the long-term you won’t know anything. You wouldn’t want to go to the dentist and they’ve done their homework with [LLM-powered tool] and they know nothing.
Pre-GCSE pupil, Bristol:
You need to be able to do it yourself and then get the feedback.
Older kids might use it to write assignments so they’re not actually learning. Instead of researching and learning about it, they just put it into [LLM-powered tool] to get the answer.
Some parents of children with SEND were concerned that their child could become overreliant on AI tools, particularly AI that personalised learning to their specific needs. Whilst this was seen to support them to some degree (as mentioned in section 4.1.4), it was also felt to risk a loss of key skills.
As a parent, my son has dyslexia, so he has to programme in text, and the computer processes it and helps him type it. So it’d be useful for that […] But you don’t want him to rely on that.
Data quality – specifically whether AI could misinform pupils – was a concern for many. Some felt that AI had the potential to reinforce unfair biases.
Throughout the workshops, many participants expressed uncertainty over whether, at its current stage of development, AI was good enough to be used in an educational context. As participants became more informed about machine learning and how it works, more participants questioned the quality of the data being used to train AI and whether there would be sufficient human oversight to quality check AI outputs.
Expectations of where and how interactive AI tools would use data, such as marking class tests or providing feedback, were not consistent among parents and pupils. Some were concerned about AI processing and learning from incorrect answers. This was seen to be potentially damaging to the educational process if it led to pupils receiving incorrect feedback from AI. Uncertainty about how AI learns and generates information for different uses was a driver of concern about AI being used in education, where it feels more important that data is accurate than in other settings. As a result, parents and pupils felt it was imperative that AI tools were carefully assured, and that appropriate training was provided, before AI is used widely in schools.
Inaccurate information being fed through the software could be really concerning.
After showing participants a video about machine learning and an animation about bias (see Appendix), some expressed concern about the potential for AI to reinforce harmful biases and reproduce inaccurate information. This raised questions about how quickly AI can “unlearn” biases and how these unfair biases would be picked up. Unfair bias in AI was perceived as a potential risk; however, many parents acknowledged that this risk already exists in humans. The majority of participants wanted reassurance that AI was going to be monitored by a human to ensure the information given to pupils was accurate.
The fact that AI is always learning, and it learns from the data the kids are putting in, so if they aren’t getting it right, it could take it off course.
Lack of safeguarding and the risk of encountering harmful content when pupils interact directly with AI were concerns for parents.
We heard concerns, particularly from parents of younger pupils, about children being exposed to harmful content at school when using AI, as it didn’t feel clear whether there would be robust safeguards in place. This built on an existing worry about how children interact with technology and what they are exposed to online. Some parents therefore suggested they would want to limit this risk where possible by reducing unsupervised technology use, rather than introducing a further opportunity for their child to encounter harmful content. At the same time, many parents felt that they already had little control over their child’s consumption of online content, and educational tools were likely to be safer than unregulated access to the internet.
Like on [social media platform], and it learns from what you’re watching, if you’re watching suicidal content it’ll keep showing you suicidal content.
Parent of primary school pupil with SEND, Birmingham:
She’s already talking to [voice assistant] all the time, it’s a different world for them.
Clarification on whether pupils could be exposed to harmful content through their use of AI, and the steps to prevent this, was essential for all participants – but particularly parents. We consistently heard that parents would like a clear understanding of how AI will be used by their child and reassurance that steps are in place to protect their child from any harms. Additionally, both parents and pupils mentioned that they would expect there to be systems in place that would flag if a pupil was trying to access harmful content, or asked questions or mentioned real events in their personal or school lives that suggest a safeguarding issue.
Overall, most parents felt more comfortable with their child using AI in schools with supervision from a teacher or member of staff. If it was to be used at home, some said they would want to oversee use. This was particularly important for parents of pre-school and primary school pupils, who were at times worried about whether there would be any security controls to prevent pupils accidentally seeing harmful content.
Parents and pupils were concerned that AI use would exacerbate existing inequalities in society.
Almost all participants felt that if AI could indeed support children’s learning and potentially give them a head start, there should be equal access to it for all schools. Within the current education system, they assumed that the best AI tools would only be accessible to the schools that could afford it. They felt this would exacerbate existing inequalities, add to the unfair advantage of those who are better off, and lead to further stratification – of the education system, but also of the labour market and society as a whole. Parents of pupils who attend schools that are struggling or in disadvantaged areas felt resigned to inequality getting worse, with AI tools just another resource their child could miss out on.
There was also some concern about variation in teachers’ abilities to use AI to its full potential, at least at first. Both parents and pupils worried that if training and support were not provided to ensure all teachers met a minimum level of proficiency with AI tools, some pupils would benefit less from AI use than others.
As a result, many felt that the introduction of AI tools in schools should be centrally coordinated and funded, with tools standardised and quality assured, and profits from selling pupils’ work and data reinvested into the school system.
It will just make the wealth divide worse.
Poor and working class [areas] might not have access to computers, affluent areas will have the best access.
In order to give permission for their child’s data to be used, parents need more clarity and reassurance about how data will be collected, stored, used and shared.
Concerns about privacy and data breaches were prevalent among parents, many of whom had questions about how and where their child’s data will be stored and shared. They were also concerned about the potential longevity of data, and the extent to which it could “follow their child through life” and affect their employment and further education opportunities. There were also concerns about potential data sharing between government departments. Parents of pupils with SEND in particular were concerned that the data could affect their child’s eligibility for state-funded benefits.
Where does it go, where does it stop? Will it always be tagged to you? What about applying to university?
Given these concerns, the majority of participants wanted to see data protection rules adhered to, and reassurances that data generated from pupils’ interactions with AI would not be used for wider, non-education related purposes. Alongside this, they needed clear information about why data is being collected, who will have access and how long it will be stored. For any use of AI in education, pupils’ personal data being accessed or hacked was a key concern which led to some participants feeling uncomfortable with pupil data being used to train educational AI tools.
There is a sense of big brother about it all. Infant school, they’ve got your whole life in a data bank, how is that information going to be utilised.
Table of AI use case acceptability
Acceptable uses of AI were felt to be those that help rather than replace teachers:
Creating a lesson plan
Generating class tests
Generating class materials
AI was also felt to be acceptable if being used by teachers as a tool to provide additional academic support:
Generating feedback on pupils’ work
Marking classwork
Marking class tests
Participants, especially parents, were hesitant to say AI was acceptable to personalise learning:
Helping teachers decide what support a pupil needs
Personal tutor chatbot for pupils
There was some positive sentiment towards personalised learning and the potential benefits to the quality of education. When it was considered acceptable, specific conditions were required:
The personalised AI tool is monitored and ‘signed off’ by a teacher
Clear information is provided about what pupil data will be used and how it will be stored
Parents’ permission is obtained before personalised AI tools are used
Pseudonymised or anonymised data to be used, with robust data protection.
Use cases felt less acceptable where AI error could negatively impact educational outcomes (and therefore children’s future prospects), for example by getting an exam mark wrong.
5.1 Summary
Parents and pupils were generally comfortable with pupil work being used to optimise AI tools, with very few concerns about intellectual property.
However, there was much more uncertainty about work being personally identifiable and personal data being shared outside of schools and DfE.
Both parents and pupils needed reassurance about the de-identification or anonymisation of data, especially special category data, which was seen as requiring more protection, or links to other information, such as patient records (for example, for children with SEND).
Although neither parents nor pupils thought that they should be directly compensated for providing their work or data to tech companies, they strongly felt that private companies should be required to share at least some of the profit with schools (via DfE).
After receiving an explanation of machine learning, participants were provided with examples of different forms of pupil work (such as homework, class work, mock exams, exams) and data (such as name, age, SEND status) that could be used to optimise AI tools.
Pupil work that can be used to optimise AI tools
Parents and pupils were comfortable with pupil work being used for AI tool development in the vast majority of cases.
Most participants understood that greater breadth and volume of data provided to optimise AI tools results in AI tools having a greater understanding of what constitutes ‘good’ and ‘bad’ work, and being able to provide constructive feedback. Most grasped the need for AI tools to be optimised with work spanning higher to lower grades, and some specifically pointed out that without examples of ‘bad’ work and the ability to identify what makes work stronger or poorer, AI tools would not be able to assess work as needed.
In particular, participants felt that AI tools would need to be optimised with as many different styles of work as possible, in order to fairly and accurately assess and support pupils with differing abilities and needs, especially children with SEND. They noted the particular importance of this in more subjective cases, such as in creative writing.
For me it would be that what is put into the system is enough to get a positive outcome for the children.
Although there was confusion about how exactly AI tools would learn from pupils’ work, parents and pupils still felt pupil work was fine to share. By the end of the engagement, both parents and pupils understood that providing a wide range and quality of work would improve AI outcomes in the long run. As a result, they accepted data sharing as a necessity.
While most types of work are fine to be used, usage needs to be clearly communicated to avoid concerns about plagiarism or penalisation.
The foremost concern about sharing work with AI tools was that more substantial pieces of work (such as coursework) would be plagiarised by other pupils. The first assumption of parents, and particularly pupils, was that AI tools could be used by other pupils to generate work that draws heavily on their own, leading to their efforts being co-opted. Some understood AI ‘learning’ from pupils’ work to mean that AI would then use it to create new pieces of work for other pupils.
Post-GCSE pupil, Birmingham:
Not okay to share [Homework] – because your schoolwork is your intellectual property, it’s you and you did that. If the AI takes that then you can’t copyright it.
It can’t use everyone’s homework so it can be copied and plagiarised.
In practice, this concern applied only to larger pieces of work that pupils had spent considerable time on, with little concern about more routine work produced by pupils (such as class test answers).
There was also concern from some about pupil work being shared more widely by AI tools, with pupils in particular worrying that this would mean that examples of ‘bad’ work they produced would be circulated among or accessible to other people and cause embarrassment or judgement.
Further explanation of how work would be used to optimise AI tools, rather than being regurgitated or circulated, provided reassurance to uncertain pupils and parents. Emphasis on the volume of data required to optimise AI tools, and reiterating that an individual piece of work would be one among millions of pieces of pupil work, also reassured some parents and pupils.
Additionally, some parents noted that examples of high-scoring essays or exam answers were already shared more widely, and did not feel sharing work with AI tools would be cause for more concern.
However, pupils and parents maintained some doubts about the limitations of AI optimisation, especially in relation to more creative or subjective pieces of work.
Some parents and pupils were unconvinced by the ability of AI tools to assess work for subjective subjects requiring more nuanced interpretation such as PSHE, or creative subjects like Art and English. They did not feel that pupil work would optimise tools to the extent needed for them to achieve a human level of expertise and understanding, making the use of pupil work feel futile.
I think it makes sense with the factual subjects, because with science and maths most of the time there is a definitive answer. But like English there is a main answer but there are other right answers too.
Concerns about plagiarism were also heightened for creative work such as artwork or longer essays, which pupils felt was more obviously valuable intellectual property and could hold more personal significance than written work. As above, they struggled to understand how AI tools could be optimised using this work or to believe that a sufficient level of optimisation could be achieved.
It wasn’t very clear about the copyright situation, I think that’s a huge thing to know, for all children, some children have been designing logos and stuff from like 13/14.
Acceptable pieces of work were those felt to be less ‘valuable’, with fewer concerns about them being plagiarised or misinterpreted by the AI:
Participants were less sure about the use of work that more effort had gone into or that felt more subjective or creative:
Coursework
There was more reluctance about the use of more ‘serious’ pieces of work with higher stakes, and more reassurance needed for their use:
Mock exams
Exam answers
Types of data
Parents and pupils were most comfortable with anonymised demographic data being used and shared.
In almost all cases, participants were comfortable with anonymous demographic data being used to optimise AI tools. They particularly recognised the importance of providing AI tools with information on pupils’ ages or year groups, in order to accurately gauge the progress and performance of pupils at this level.
While there was some confusion about the need for data like gender, most participants were nevertheless fine with it being provided as it was not a threat to pupils’ anonymity. A few parents expressed concern that this data could contribute to unfair bias or discrimination, and some parents and pupils stressed the need for data about gender in particular to be inclusive, reflecting pupils’ own gender identities rather than erasing them.
You’ve got bias in AI but it’s already there, probably easier to correct than it is in a person.
Despite recognition of its necessity and openness to its use, more conditions were attached to the use of pseudonymised and special category data, which was seen as requiring more protection.
[On including gender] It depends what it’s being used to train it for. It doesn’t really bother me but bias can happen.
Parents and pupils understood that in order for AI tools to provide personalised, lifelong support for pupils that is tailored to their educational needs and learning styles, data linkage is necessary via pupil identifiers. There was openness to this due to the potential benefits for pupils and the perception that this tailored support would lead to better outcomes than generic AI use.
However, participants were deeply concerned about the security of this data, especially special category data, fearing that any breaches would result in comprehensive datasets about individual pupils’ demographics, abilities, and weaknesses being shared more widely and exploited. This was a particular worry for parents of children with SEND, for whom concerns centred around their children’s future opportunities. They were particularly concerned that their child’s SEND status could be shared between government departments which could impact the benefits their child might be entitled to, or about future employers accessing their child’s data via the companies developing AI, impacting their child’s future.
Both parents and pupils strongly felt if data is pseudonymised, identifiers should be held at a school level and ought not to be shared with tech companies or the government. There should also be stringent restrictions and safeguards in place to ensure the security of this data, with assurances communicated to parents and pupils of how the data is stored, who has access to it, and when and where it will be used.
Data should only be shared with schools, parents and education department.
Parents and pupils felt strongly that personally identifiable data should not be used in any circumstance.
Participants emphasised that data that allows individual pupils to be identified, such as name or date of birth, should not be used. This data was seen as unnecessary for AI optimisation in an educational context, and was deemed to carry too many risks for pupils when linked with the other data being collected, particularly special category data. While parents were more resistant to the use of this data, citing the concerns about future opportunities covered above, pupils also strongly preferred the use of their data in an anonymised or pseudonymised form.
Use of data that could easily be anonymised and was felt to be relevant to AI understanding of pupils’ work was widely accepted.
Assurances were needed about data perceived as more sensitive or pseudonymised, particularly to address concerns about data security and storage:
Pupil identifier
Information about SEND (or any health conditions)
Data identifying pupils was unacceptable and felt to be unnecessary:
Date of birth
Parents and pupils
All participants expected to be involved in decisions made around the use of pupils’ work and data, with parents and pupils having final say.
While parents and pupils didn’t expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.
Pupils also felt that knowing how their work and data would be used would be important, and that they should have a say alongside their parents, especially if they were old enough (see 7.2 Parent and Pupil permission for further discussion of age at which pupils should have a say). However, they were less likely to require extensive detail about its intended use, reflecting their higher level of comfort with data sharing and acceptance of its necessity in order to benefit from the tools using it. With the understanding that pupil work is their intellectual property, some pupils were more concerned about the use of their work than their data (see 6.1.2 for concerns about work use).
If child’s work is going to be used/processed in AI the parents should be advised.
Schools were most trusted to make decisions about the use of pupils’ work and data, as well as to hold data that was seen as more sensitive (such as SEND data or pupil identifiers). Where concerns about school involvement existed, they centred on inequitable AI use and access.
Parents and pupils felt that schools could be relied on to make decisions in the best interest of pupils and to prioritise educational outcomes and safety over other considerations like AI development and profit. Central to this trust was the widely held perception that schools are not primarily profit-motivated and are already trusted to safeguard pupils, which led to the assumption that schools can be relied upon to continue doing this when it comes to AI. As a result, participants wanted schools to have the final say in how pupil work and data is used, with the ability to approve or reject uses suggested by the government or tech companies if they are felt to harm pupils or jeopardise their privacy and safety.
Schools were also trusted to hold pupil data, with many who were uncertain about special category data being shared and used feeling reassured about this data being collected if schools could control its use and guarantee that it would not be shared beyond the school.
The ID number has to stay within the school and be really safe.
I would want to feel the school (teachers especially) have all the info and are confident the AI is safe.
A few parents noted that schools may not all choose to use AI, or that there could be disparities within schools if it were left up to teachers’ discretion and some refused to integrate AI into their teaching. Some worried that schools with fewer resources would be left behind as other schools (such as private schools) adopted AI use to their advantage. There was also a minor question about the impact schools’ teaching philosophies might have on the decision to use AI or not, for example whether religious schools might choose not to use a standardised AI tool in order to have control over what exactly students learn.
However, there was little real concern about schools’ oversight of AI tools and pupil work or data, with most participants feeling the more control schools have, the better.
Parent of GCSE pupil, Bristol:
What about schools that don’t have the facilities? It was hard enough before all this.
Access is a concern, ensure there’s a level playing field across the board.
Parents and pupils saw a role for DfE in setting rules around AI use and (to a lesser extent) pupil work and data, recognising the need for a central authority. However, many participants were worried about potential negative impacts from the use of AI tools and pupil work and data by government.
Most felt there was a role for DfE having a say in how AI is used in schools, feeling that central guidelines would make AI use more consistent. DfE was also generally trusted to make decisions in the best interest of pupils and with education rather than profit in mind. This was seen to necessitate its involvement in any decisions made by tech companies.
However, trust in DfE to set rules was predicated on school involvement in decisions made, particularly those around the use of pupil work and data to optimise AI tools. While there was a need for DfE to provide central oversight, parents and pupils were still hesitant to hand over complete control of pupil data. In this, participants’ preferences reflected pre-established views that schools, being closer to pupils and in close communication with them and their parents, were more familiar with pupils’ needs and parents’ concerns, and were therefore more likely to make decisions accordingly.
Pupil-centric at every stage, profits should be distributed to the schools and [for] development not just led by tech companies, with the education [sector] as well.
There was a notable tension between the desire and perceived need for robust government oversight, and concern around government involvement. Many parents and pupils worried that other government departments might not make decisions in the best interest of pupils, or might not have the ability to direct efficient, effective, and beneficial use of AI.
My initial thought is an independent regulatory body so they’re a step away from it but I don’t know what that looks like.
Parents also worried about how pupils’ performance and special category data (such as SEND status) could be used by government if held in a central database accessible beyond DfE. There were also concerns around how particular agendas might determine the content used to optimise AI and therefore how and what AI tools teach pupils.
This was a particular concern for parents of children with SEND, who worried that their children’s future could be affected if pseudonymised or personally identifiable data is held and accessed by government beyond their time at school. They required reassurance that data showing their children’s level of ability and any SEND would not be used in future, for example to affect their entitlement to government assistance.
Many parents also worried about increased surveillance if government was provided with data on children throughout their formative years, particularly if AI use becomes standard and most or all of the population’s data in this context is held and used by a limited number of central organisations.
Thinking about the work…How long will it be kept there - who will it be shared with and how much of my child’s personal info is attached to it?
Participants feared that particular viewpoints or biases, including those within the curriculum, could become more entrenched in AI and harder to correct. For these participants, involvement of independent experts within the field of AI and education could mitigate some of this risk by providing a check for decisions and ensuring a balance of views.
I feel like they’re trying to push the kids in a certain direction, and then the government gets to know everything [decision] they make.
Trust in tech companies was extremely limited and there was little to no support for them to be granted control over AI and pupil work and data use.
Profit was almost universally assumed to be the primary or sole motivation of tech companies, rather than the desire to improve education and pupil outcomes. Reflecting starting views of tech companies as non-transparent and assumptions that data is sold on to third parties, participants did not trust them to protect or use data responsibly. Parents and pupils assumed that given free rein and with no oversight, tech companies would choose to sell data on to other companies with little concern for pupil privacy or wellbeing.
I think yes, the company is going to benefit, that’s economics, but I think it would be good to give it back to schools.
Yeah, you kind of want to know what type of people are developing [it], if the people running it are doing it for the wrong reasons, it could get out of hand, you want to know they’re doing it for the right reasons.
Participants did note that tech companies working in close partnership with schools or DfE, under clear oversight and regulation, would provide some assurance that they would use pupil work and data responsibly and to pupils' benefit.
6.1 Summary
Participants identified the following conditions for the use of AI in education and the use of pupils’ work and data to optimise AI tools:
Human oversight: Human involvement in AI use to correct for error and unfair bias, as well as providing safeguarding.
Parent and pupil permissions: Providing parents and pupils with the necessary information and the opportunity to make informed decisions about the use of their data.
Standardisation and regulation: Ensuring that AI tools used within schools are of a uniform standard to avoid exacerbation of inequalities, with strict oversight of any tech companies providing the tools.
Age and subject restrictions: Using AI tools only where appropriate and where they add value. Strict age restrictions on direct interaction with AI.
Profit sharing: Ensuring that tech companies that benefit from accessing data share some of their profits so that this can be reinvested into the education system and benefit schools and pupils – while recognising that private companies will need to be incentivised to develop better tools.
Participants stressed the importance of human involvement in AI use at every step of the process.
Given the recent developments in AI, and the need to continue to optimise it, the use of any tools in the classroom or at home was seen as risky if not overseen by humans, at least to begin with. This concern was particularly pronounced after participants heard about the risks of bias and about AI only being as good as the data it learns from. Many noted that AI can make mistakes or ‘hallucinate’ inaccurate responses, and would need humans to ensure nothing was being taught or assessed incorrectly. There was also an assumption that errors made by AI would be harder to correct than those made by a teacher, which can often be addressed directly by parents or pupils in conversation. This means AI tools should always be checked, with any resources created looked over by teachers, any marking or feedback generated by AI tools reviewed by teachers, and any tests or exams marked by AI being assessed by teachers or external markers.
Parents were particularly keen that pupils’ AI use is supervised or at least controlled, and that AI tools are never used as a substitute for a teacher. Pupils similarly stressed that learning should not be solely delivered by an AI tool operating independently, as teacher-pupil interaction is highly valued and most felt some level of human subjectivity is always needed. Pupils also worried that AI use without human oversight might mean errors made by AI are overlooked, leading to them not learning the skills they need or being taught incorrectly. Any potential errors should and could be picked up by earlier human assessment of AI outputs.
Parent of Pre-GCSE pupil, Newcastle:
The [AI] tool should supplement the teacher, not replace or undermine [the teacher]. A pupil-teacher relationship is still very important for [the pupil’s] development.
Both parents and pupils felt they should be enabled to make free and informed decisions about how pupil work and data is used.
This means having an understanding of when AI tools will be used and why, and how pupil work and data will be used to optimise them and why. Almost all participants felt that agreement should be a pre-condition of AI use.
Despite consensus that agreement should be required, views around the details of agreement differed:
Parents emphasised their responsibility to make informed decisions for their children’s wellbeing. They therefore felt their permission ought to be required, particularly for younger pupils (generally those aged under 16). Many were resistant to the idea that their children could make these decisions for themselves, wanting to have a say in all aspects of their children’s education.
Pupils tended to attach more importance to their own comfort with AI and work and data use, particularly with the understanding that the work they create is their intellectual property. Most pupils we spoke to had experience of permitting data sharing for themselves when signing up to and/or using apps and websites, and most did not view agreeing to work and data use for AI optimisation purposes any differently. While many were happy for their parents to also have a say, some felt this should not supersede their own wishes, and that pupils should have final say over the use of their work and data above a certain age (13 or 16).
Parent of GCSE pupil, Birmingham:
Up to 16, it’s definitely a parental choice, but as they start to make their own choices this would be included.
Might be good to trial with older kids, because we can already consent ourselves and then you could show the parents the positive data.
Expectations for how permission would be provided varied, but most parents described an “opt-in” model and expected to be given the chance to understand and agree to all potential uses of their child’s data and work. Parents suggested that this agreement could be “staggered” as understanding of AI tools and comfort with its use grows, and that schools and DfE could make decisions about AI use within the parameters of permission provided. Generally, the expectation was that even completely anonymised data and work would require some level of permission to be shared and/or used, though most participants indicated they would agree to its use. However, there was little consideration of how this would work in practice, especially alongside equitable access to AI for all pupils and schools, which was seen as an important condition for its use.
Generally, pupils expressed higher levels of comfort with sharing their data than parents, many of whom had serious concerns about data privacy, security and storage. A few pupils assumed their parents would lack understanding and would be reluctant to allow them to share their data as a result, in contrast to their own willingness to share it. Many parents noted that widespread AI use and normalisation of data-sharing would make them feel more positively about it and more likely to easily provide permission, assuming that once AI use becomes “tried and tested”, concerns are likely to be alleviated.
AI use in schools should only be through standardised and strictly regulated tools to ensure quality control and equity of access.
Parents and pupils stressed that all schools should have access to the same quality-assured AI tools. Many suggested this could be achieved through certification processes sanctioned by schools and the government, with only AI tools that are officially tested and meet a minimum performance standard being approved for use in education. For many, this would alleviate concerns about some pupils or schools benefitting over others by accessing more developed AI tools.
Concerns about the quality of AI tools also led to worries that pupils could be penalised for, or disadvantaged by, poor teaching or support provided by low-performing AI tools. Pupils worried that they would be held accountable for any errors committed as a result of incorrect AI teaching or support. Parents also wanted guarantees that, in cases where low-performing AI tools led to poor pupil performance, the pupil would not be penalised, and emphasised a need for regulations ensuring clear accountability in case of AI error or misuse. In particular, parents of primary and pre-school children wanted guarantees of accountability in the case of malicious or inappropriate content being propagated by AI tools, along with strong and appropriate content safeguards to ensure they are safe for children to use.
Parent of Post-GCSE pupil, Newcastle:
If used in marking exams, make sure it’s accurate so pupils are not disadvantaged.
While there was no overall consensus on who ultimately could be held accountable for any issues that arise, many suggested DfE and schools both have a responsibility to ensure AI tools are fit for use, and to minimise and rectify any errors or misuse. Others felt that this responsibility should lie with tech companies, and that as the developers of these tools, they should be made to answer if their use harms pupils.
Regulation was also felt to be crucial for ensuring stringent data collection, privacy, and security.
DfE and the wider government were generally seen as responsible for setting, communicating, and maintaining these standards. Parents in particular expected clear rules to be established for:
How pupil data can be collected;
For what purpose it can be collected;
How it will be stored;
How long it will be stored for; and
Who can access it.
Parents emphasised the importance of these regulations being put in place and communicated as a pre-condition for widespread AI use in education.
Parents and pupils were in agreement that the use of AI tools should be restricted, with the most accepted uses involving older pupils and subjects seen as “objective”.
There was a general consensus that AI tools would be best used directly by pupils in secondary education, at which point both parents and pupils felt that pupils would be able to confidently and safely interact with the technology. There was less concern about pupils not developing necessary social skills at this point (due to interacting with AI tools alongside teachers), and less concern about the use of pupils’ data and work. Overall, both parents and pupils felt most comfortable with AI tools being directly used by pupils old enough to understand the tools and agree to their use. Parents’ estimation of this age tended to be higher than pupils’, as pupils were more likely to set the minimum age at 11 or 13, while many parents felt that pupils would only be able to meaningfully agree at age 16.
GCSE pupil, Birmingham:
Maybe it’s not appropriate for young kids, you should have restricted access, and it might not simplify it enough.
Parents of primary and pre-school pupils were least comfortable with the potential use of AI tools, citing concerns around unintentional exposure to harmful content and children not picking up the skills they need to develop. At this age, the importance of play and socialisation was emphasised, and parents worried these elements of young children’s day-to-day education would be lost or minimised through reliance on AI.
Both parents and pupils were most comfortable with AI being used to support learning (and particularly to mark work and/or provide feedback) in subjects seen to have more concrete, and therefore more easily assessed answers, such as Science or Maths. These subjects, which contain simple answers (for example, multiple choice), were seen as less likely to confuse AI tools or to be incorrectly assessed due to bias or a lack of understanding. Participants broadly felt reassured that AI tools could be sufficiently optimised to correctly assess these forms of work and would trust their use when overseen by a teacher.
There was considerably less openness to AI being used to support marking or to assess more creative or subjective subjects like Art, English, Religious Studies or Social Studies. Participants deeply doubted that AI could engage with pupils’ schoolwork on these subjects in the same way as a human, or grasp their nuances as a teacher would. They also broadly felt that these forms of schoolwork are more personal to pupils, or involve more effort to create, making the stakes of any AI error feel higher.
Parent of primary school pupil, Bristol:
You lose being creative, the students being creative, relying on an AI to educate them, and then using AI to do their homework, they’re going to lose that creativity.
6.6 Profit sharing

There was widespread consensus that, if profit were to be generated through the use of pupils’ work to enhance AI in education, schools would be the preferred beneficiaries, and resistance to the idea of tech companies being the sole profiteers.
Generally, parents and pupils acknowledged that pupils profiting individually from the use of their work and data would not be feasible, but almost all strongly believed that any profits derived from this data use should be distributed among schools so that pupils could benefit. This belief was intensified by the understanding of intellectual property and pupils’ ownership of their work and data. Participants suggested that a minimum share of the profits be handed back to schools, but views on how varied: many felt this should be done to maximise equality of access to AI (with profits being used to fund AI tools and resources for schools not able to do this themselves), while others felt profits should be shared equally. Few participants thought profits should correspond to each school’s level of data sharing and AI use, and participants were especially positive about profits being used to level the playing field for schools.
While participants did want schools to profit from AI use, some felt this could happen through profits being used by local authorities or regional bodies to improve education in the area, or by DfE to improve the education sector at a national level, rather than being distributed to individual schools. Most were comfortable with profits being shared between schools and DfE, however, the general assumption was that pupils would benefit most directly if profits were distributed to individual schools.
Participants accepted that tech companies would profit in some way from the use of pupil work and data, but the consensus was that they should not be the sole beneficiaries. Parents of children with SEND were particularly negative about AI tool development becoming a money-making exercise. Understanding of how exactly tech companies could profit was limited, with most assuming that they would make money by selling pupils’ data to third parties. There was a lack of awareness of other ways in which they might benefit from this data use such as by developing other AI tools for commercial use. On prompting, this form of benefit was generally seen as acceptable if used to develop educational tools for use outside the education sector, but unacceptable if used to develop tools for other purposes. This possibility was seen as misusing data for something other than its intended use, reflecting existing discomfort and concerns about data being sold by tech companies without participants’ knowledge or agreement.
7.1 Methodological reflections
Due to time pressures, the in-person fieldwork was carried out as a single six-hour session per location. Sitting still and processing information for this length of time can be challenging for adults’ attention spans and energy, but it was particularly difficult for pupils. We knew we would need to share large volumes of information, and aimed to make the sessions as engaging as possible by:
Using different types of stimulus (including animations, videos from experts, worksheets, hands-on demonstrations of AI tools);
Providing written summaries of all videos; and
Including activities that would require participants to stand up and move around (including voting exercises).
However, in the end, we had to adapt our approach in several ways to counteract participant fatigue:
In the first workshop, we asked participants to compare three different future scenarios, with detailed information about the different use cases of AI in education, the types of data and work that would be used to optimise it, and the conditions in place to regulate its use. This activity took place towards the end of the workshop, and participants found it very challenging to compare such abstract, yet detailed, scenarios. In subsequent workshops, we focussed instead on asking participants to describe the future they would like to see, rather than testing potential scenarios first.
We gave pupils additional break time after lunch. By this point they had understood the basic principles of machine-learning and this meant they were more refreshed for the final activity where we discussed conditions for use.
Some lessons for future engagement workshops:
Including more interactive tools can help to bring concepts to life and keep participants engaged. Participants who had not previously used LLM tools benefited from being able to see how they work in reality. For future engagements, it may be worth thinking carefully about how devices and applications can be used in sessions.
There are some practical implications for running joint sessions for parent and pupil groups, as they have different needs. We adapted discussion guides for parents and pupils and, as much as possible, made all stimulus suitable for the youngest sample members. However, it may be worth considering splitting groups, so their agendas are decoupled from one another, allowing more flexibility and further adaptation to suit participants’ age.
Shorter sessions over several weeks, as well as a mix of in-person and online fieldwork, may be more suitable for complex topics such as this. Online participants, who had a week between workshops, returned to the second session refreshed. In addition, many had used the interim to think about or discuss what they had learnt with friends or family, which meant they brought more nuanced perceptions and opinions to the final session.
The research showed that awareness, understanding, and opinions of AI are all still evolving. As the technology becomes more established, the public will be further exposed to its applications and form opinions based on those experiences. However, we also know how important the commentary and opinion of others - both expert and lay person - are in shaping views and impacting trust. For parents in particular, other parents are powerful influencers, so it will be important to continue engaging with this audience to understand how they feel about the use of AI in education.
There are also a number of specific questions surfaced by the research, which we feel warrant further exploration:
The relationship between private interest and public good: How comfortable are parents and pupils with private companies profiting, and how are they held to account and incentivised to ensure they put public good first?
Oversight and coordination of data sharing: To what extent is there support for the central management and facilitation of data access across government and with researchers and private companies? Would parents and pupils be comfortable with an “EDR UK” organisation, similar to HDR UK, ADR UK, or SDR UK?
Equal access and opting out: What happens if you want to opt out? And how can we ensure nobody is left behind?
8.1 Demographic sample breakdown
| Category | Group | Count |
|---|---|---|
| Location | Bristol | 36 |
| | Birmingham | 36 |
| | Newcastle | 36 |
| Location type | City/Urban | 48 |
| | Suburban/Small town/Large village | 32 |
| | Rural | 26 |
| | Unknown | 2 |
| Gender | Male | 43 |
| | Female | 65 |
| Age | 18 and under | 36 |
| | 19-24 | 1 |
| | 25-39 | 22 |
| | 40-59 | 47 |
| | 60+ | 2 |
| Ethnicity | White | 79 |
| | Black, Black British, Caribbean or African | 16 |
| | Asian or Asian British | 10 |
| | Mixed or Multiple ethnic groups | 2 |
| | Other | 1 |
| Feeling about technological developments and uses of AI (parents only) | Excited | 36 |
| | Sceptical/Worried | 36 |
| Total | | 108 |
| Role | Speaker | Topic |
|---|---|---|
| Head of Government Practice at Faculty | Tom Nixon | What is AI and why is it important? |
| Data Scientist at 10 Downing Street | Andreas Varotsis | What is machine learning? |
| Head of Digital Education at Bourne Educational Trust | Chris Goodall | Potential benefits of using AI for teachers and pupils |
| Head of Digital Learning at Basingstoke College of Technology | Scott Hayden | Potential benefits of using AI for teachers and pupils |
| Digital Strategy at the Department for Education | Fay Skevington | Potential risks of using AI around data protection, privacy, and IP |
| Parliamentary Under-Secretary of State at the Department for Education | Baroness Barran | The bigger picture: wider risks and benefits of AI use and how to manage them |