PART 13: FACULTY ASSESSMENT POLICY

Starting point:

- This is a core task of the Educational Quality Control Unit and the Study Programme Committee and an important element in the quality assurance of the programme: how are the intended competencies actually tested?

- There is a need for an overview (how and what do we assess?). The focus is initially on form (how is this communicated to the students, which criteria apply). In addition, the content is also examined in the context of the horizontal and / or vertical coherence of the programme (learning path).

- Emphasise good practices. These are often not yet visible enough, so the approach is one of reflection rather than control. The faculty wants to think along about the assessment policy, not police it.

- Some concrete issues have further highlighted this topic: low pass rates in some course units (for example, in higher years), discrepancies in grades when assessing papers (research paper / master's dissertation), a lack of transparency for students (for example, the assessment procedure for oral exams), multiple choice with a higher cut-off score, and a lack of insight into the local academic culture among guest and new lecturers.

- An umbrella faculty assessment committee and an assessment committee per study programme are established to monitor and coordinate these matters. Lecturers will be asked to provide information about their assessment policy (including exam questions). Students will also be interviewed in more depth about the assessment practice.

This policy document consists of:

  1. Brief background about the UGent assessment policy
  2. A proposal for the Faculty Assessment Concept
  3. A proposal for establishing the Faculty Assessment Committee + working method
  4. A proposal for establishing Assessment Committees per study programme + working method

Brief background about the UGent assessment policy

The UGent assessment policy is a coherent whole of measures and provisions to monitor and promote assessment quality. UGent opts for an assessment model consisting of three phases (aiming, measuring and guaranteeing) and three levels of involvement (lecturers, study programmes and the university). In a first phase, a vision of quality assessment is made explicit; this is the AIMING phase. Subsequently, in a second phase, the achievement of the proposed quality objectives is verified (MEASURING phase). Finally, in a third phase, assessment quality is guaranteed (GUARANTEEING phase) through agreements and regulations, through the organisation and planning of the evaluations, and by supporting lecturers and study programmes.

UGent assessment concept: the educational translation of 'Dare to think' in the field of assessment. The UGent 'Measuring Integrated Knowledge Competencies' assessment concept indicates what our institution wants to emphasise in further assessment development. With this policy we also want to give assessment at Ghent University a more distinct educational character.

More concretely:

  1. Valid assessment is based on intended competencies
  2. Integration also tests knowledge
  3. Authentic assessment in various contexts promotes transfer
  4. Discipline-crossing assessment encourages critical sense
  5. Feedback stimulates creativity
  6. Action and interaction require the highest reliability
  7. Transparency instils trust in assessment
  8. Self-reflection is the basis for lifelong learning
  9. Varied assessment practice respects diversity

Assessment vision of the faculty of Political and Social Sciences, based on UGent's assessment concept

Assessment quality control takes place at different levels: within the study programme itself, by the assessment committee and the study programme committee, and outside the study programme, by the examination board and the educational quality control unit.

Faculty assessment concept

Valid assessment is based on intended competencies

One of the most crucial assessment quality criteria is validity: evaluation has to be meaningful, usable and valid in order to make a statement about students' achievement of the predetermined competencies. In 2013-14 the faculty rewrote the educational competencies that we aim for in our students. These written competencies form the basis for well-considered choices regarding evaluation. The evaluation form always has to be suitable for measuring the intended competencies at a given moment. The Faculty Assessment Committee will conduct an annual evaluation based on our competence matrices per study programme: are the evaluation methods used (exams and papers) in line with the intended learning outcomes of the Bachelor's and Master's degree programmes? In addition to this global overview of the assessment methods for the full Bachelor and Master programmes, an analysis is also made of the assessment methods for the individual blocks of learning outcomes.

Great diversity of evaluation forms from an educationally innovative perspective

Our programmes use classical evaluation methods (written and oral exams) to verify whether students have acquired sufficient academic knowledge. However, we want to focus even more on varied, innovative evaluation forms with which higher, broader and / or more complex competencies are evaluated. In our assessment policy we wish to emphasise this more strongly by mapping existing good practices in this area and disseminating them more widely in our study programmes. To this end, we also want to make sure that these educationally innovative evaluation forms (such as peer assessment, papers, etc.) are implemented and applied in practice according to the highest quality standards and in a transparent manner.

The educational quality control unit also provides certain tools for this (such as assessment matrices, information afternoons, etc.).

Feedback stimulates creativity

The faculty wants to focus on high-quality feedback. In particular, interim or non-periodic evaluations with immediate feedback are among the most powerful stimuli for learning. These encourage students to focus on their studies regularly, spread over the entire duration of a course unit. The faculty wants to maximise the potential of efficient feedback and focuses on automated or non-automated feedback, individual or collective feedback, peer feedback, etc. Organising classical feedback more collectively and repetitively is an option (classical feedback can be reused, because the same remarks resurface). It is important to coordinate staff in this respect. To guarantee the reliability of the assessment and the quality of the feedback, the use of clear evaluation criteria is crucial.

Transparency and communication

Transparency about and communication of assessment are essential. The faculty assessment committee therefore wants to provide guidelines in this regard. Internal communication about assessment among the lecturers of our faculty is crucial in order to arrive at a high-quality assessment policy. We want to guarantee this by:

  • Annual checks of the course sheets by the faculty assessment committee; these sheets describe the evaluation forms, moments and conditions for passing a course unit in clear terminology. Criteria are not only necessary for the sake of transparency, but also serve as a guide for students' learning (for example, by clarifying what a high level entails). Give students the right tools!
  • Pointing out the importance of the following aspects to the lecturers:
    • Explaining a course sheet in the first lecture
    • Including the course sheet in the course material (for example, at the front of the course notes)
    • Mentioning on the course sheet the papers and similar assignments that count towards the grade, and providing partial results to students via personal communication (see also the new gradebook tool in Minerva)
    • Providing example questions
    • Drawing up a kind of information sheet about the design of the course, handed out during the first lecture: when and where, learning material, lesson preparation, exam, communication (for example, via Minerva), lesson plan and / or reading list
    • Personal feedback after the exam. Efforts can be made to provide systematic feedback for high-stakes course units (for example, the master's dissertation, research paper, papers, etc.)

Self-reflection contributes to personal development in the context of lifelong learning

Although assessment usually serves as a tool to evaluate learning, it can also be used as a means of learning (at the individual level: self-reflection). With a view to lifelong knowledge development, it is important that students reflect on their own learning process during their time at UGent. Students can do this through the educational evaluations, the ombudsperson and their representatives on the study programme committees and the Educational Quality Control Unit.

Good planning and spread of examinations as part of high-quality and fair assessment

It is ensured that assessment is well programmed. We strive for fixed exam schedules for all years (exceptions are, of course, made in the event of programme changes, unavailable auditoria, public holidays, etc.). Students and lecturers are consulted. The use of Centauro should make things clearer for the students.

Assessment committees as sustainable quality assurance

Within the faculty, faculty and programme-specific assessment committees are established, with the following objectives:

  1. Ensuring consistent quality assurance of the assessment forms in the study programme, fitting within the overall assessment vision of the study programme
  2. Stimulating the validity, reliability and transparency of assessment (interim and final evaluations) and clear communication / feedback to the lecturers and students of the study programme.
  3. Achieving a coherent assessment practice within each study programme in which all intended programme competencies are achieved and assessed in a valid manner.

Faculty Assessment Committee: composition & working method

Founded in the academic year 2013/14, as a sub-committee of the Educational Quality Control Unit

  1. Who?
    Director of studies (chairs the committee), chair of the Study Programme Committee, ombudsperson, 1 assistant academic staff member, 1 member of the Educational Quality Control Unit, 1 member of education innovation, 1 student; by invitation: someone from the Department of Educational Policy, the curriculum manager / FSA (regarding exam schedules)
  2. Frequency: annually
  3. Quality assurance check by Faculty Assessment Committee:
    • The faculty assessment committee prepares a template for exam questions (with, among other things, name, study programme, year, session, examiner, duration of the exam, material allowed during the exam, multiple-choice details (with or without negative marking, type of negative marking), student name, designated space to award points per question, and feedback)
    • The faculty assessment committee draws up a quality checklist for lecturers: a list with which the lecturer can check, during and after the preparation of the exam questions, whether the various aspects of the quality assurance of the exam questions have been met (consider the triad of validity, reliability and transparency, plus items related to the exam form such as multiple choice, open questions, open book, etc.).
    • The faculty assessment committee sets up a screening matrix that will be used by the programme-specific Assessment Committees
    • Annual evaluation on the basis of our competence matrices per study programme: are the proposed programme competencies sufficiently achieved and adequately assessed?
    • The faculty assessment committee also acts as a first-line assessment committee (operation specific to the programme) for the common course units of the first Bachelor year
    • Quality control of the Master's dissertation (every four years) by the faculty assessment committee (for example, using an assessment matrix)
    • General problems such as pass rates
    • Exam scheduling (publication deadlines)
    • The Assessment Committees (per study programme) report where necessary to the Faculty Assessment Committee about their operation. On the basis of all reports, a global advice report is drawn up and fed back to the study programme committees (listing good & bad practices in order to make sure the future assessment policy is evidence-based).

Assessment Committees per study programme: composition & operation

  1. Founded in the academic year 2013/14, under the responsibility of the Educational Quality Control Unit (to be confirmed)
  2. Who?
    • Chair of the study programme committee (chair), 2 additional professorial staff members, 1 member from the assistant academic staff, 1 member from the educational quality control unit and 1 student are involved in the operation of the programme-specific assessment committees.
    • On invitation, other employees (possibly external) can attend the meetings.
  3. Quality assurance control by the programme-specific assessment committees:

    A. For exams of individual course units

    • Annual check of the course sheets (evaluation section + final score calculation); see the faculty assessment concept (point 4, 'Transparency and communication')
    • Screening of courses
       - Which courses?
         - New courses
         - Courses for which the educational evaluations show problems, or at the request of students
         - Courses whose pass rates show extreme peaks or troughs (outliers)
       - When?
         - For new courses / new lecturers: lesson week 13 (1st + 2nd semester) (reason: proactive supervision)
         - For other courses: the week after the end of the exam period
       - What?
         - The final exam questions (the questions themselves, who drafts them / intervision about drafting exam questions / whether the questions represent the entire course, etc.)
         - Exam organisation (How much time? Who marks the exams? ...)
         - Assessment criteria and correction key (if applicable)
       - A screening matrix prepared by the programme-specific assessment committee is used, covering:
         - Evaluation form and organisation
         - Assessment vision: how do we test the learning outcomes, and what is the link with the study programme competencies? Crucial components are:
           o Objective (subject competence) and the interpretation of the relationship between the subject competence and the study programme competence
           o Grading method: list the criteria and link those criteria to grade levels
           o The link between the learning outcomes and grade levels (overview table - see the assessment matrix for the Master's dissertation and research paper)
         - Assessment communication (ECTS course sheet, syllabus, feedback, ...)
         - Coordination (instructions to staff regarding assessment, peer review when drafting exams)
         - Actions and conclusions
       - Grade distribution: lecturers receive the grade distribution for the relevant course, a statistical representation of the exam grades per academic year (OASIS).
       - Interested parties can also use the 'assessment matrix' (education tip: http://onderwijstips.ugent.be/tips/stel-een-toetsmatrijs-op-voor-een-valide-beoordeli/). When preparing an exam, it is important that the various objectives and / or components of the course are represented in a balanced manner. The use of an assessment matrix is primarily intended for evaluations in which representativeness is aimed for, but it is certainly also useful for getting a well-considered view of a 'regular' exam.
       - The programme assessment committee discusses the various files and also links them to pass rates. Subject to technical feasibility, the coherence between exam results is also checked (poor coherence with exam results from other courses is not necessarily problematic).
       - Feedback from the study programme assessment committee to the lecturer (possible suggestions)
    • Recommendations for the study programme committee
       - Global advice prepared on the basis of all submitted files
       - Urgent and more specific advice on problem course units

    B. Research paper and papers (at least every four years)

    • Research paper: for example, creating an assessment matrix by analogy with the assessment matrix of the master's dissertation.
    • Papers:
       - distribution of papers within the model trajectory over the Bachelor and Master years (with a view to the construction of various standard learning tracks)
       - continuous assessment criteria

Pro Memoria

Study programme committee: members of the assessment committee are often members of the study programme committee; things that are reported there (for example, by students) can be fed back to the assessment committee.

Examination Board: ensuring completeness and accuracy of, among other things, the grades. Problems can also be identified here.

Educational quality control unit:

  • educational evaluations: assessment is also reviewed here (the educational quality control unit can identify good practices as well as problem course units)
  • can provide score distribution tables (obtained from OASIS): a statistical representation of the exam grades per academic year, covering a period of 5 years
  • pass marks per course

Activities & instruments for study programme committees and / or assessment committees to monitor assessment quality

Implemented in the faculty of Political and Social Sciences

For each activity below, an explanation, directive questions, tools and good practices are given.

1 Ensuring that the faculty or programme-specific assessment vision remains up to date and making adjustments where necessary

An assessment vision is not a static or isolated concept. The faculty assessment committee commits itself to updating the faculty assessment policy annually. We also adjust the instruments where necessary.

2 Assessing the division of tasks between the educational quality control unit, the study programme committee, possibly the assessment committee, the faculty board and the lecturers, and adjusting where necessary

See 'composition and operation of faculty assessment committees and study programme assessment committees'.

3 Drawing up a timeline scheduling recurring and ad hoc activities related to assessment quality in the calendars of the study programme committee and assessment committee

In order to monitor assessment quality in a systematic manner, a timeline with activities for the study programme committee and assessment committee is indispensable: see 'composition and working method of faculty assessment committees and study programme assessment committees' (frequency).
4 Monitoring the alignment between evaluations and study programme competencies (cf. constructive alignment within the study programme)

One of the most important points of attention in an assessment policy is checking whether the evaluation forms are sufficiently consistent with the study programme competencies, whether they are consistent with UGent's assessment policy, and whether they are in line with the vision of the faculty / study programme. This also means checking whether the evaluation throughout the programme is well structured: see the annual analysis of our 'competence matrices' (a minimal sketch of such a coverage check follows the directive questions below).

Directive questions
• Is each study programme competence assessed in at least 2 courses throughout the study programme? (at our faculty, at least 3)
• Is a study programme competency sufficiently covered by the whole of learning outcomes of the various courses?
• Are the evaluation forms appropriate for the intended study programme competencies?
• Are the evaluation forms sufficiently varied to validly assess the intended competencies?
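
As an illustration of how such a coverage check could be supported, the following minimal Python sketch (with hypothetical course units, competencies and threshold) counts, for each study programme competence, in how many course units it is assessed and flags competences that fall below the faculty's threshold of three course units. It is a sketch under stated assumptions, not a description of an existing faculty tool.

# Minimal sketch (hypothetical data): check in how many course units each study
# programme competence is assessed, and flag competences below the faculty threshold.

MIN_COURSES_PER_COMPETENCE = 3  # faculty rule of thumb from the directive questions above

# Hypothetical competence matrix: course unit -> competences assessed in it
competence_matrix = {
    "Introduction to Political Science": {"C1", "C2"},
    "Research Methods I": {"C2", "C3"},
    "Statistics for Social Scientists": {"C3"},
    "Bachelor Paper": {"C1", "C3", "C4"},
}

all_competences = {"C1", "C2", "C3", "C4"}

coverage = {c: 0 for c in all_competences}
for course_unit, competences in competence_matrix.items():
    for c in competences:
        coverage[c] += 1

for competence, count in sorted(coverage.items()):
    status = "OK" if count >= MIN_COURSES_PER_COMPETENCE else "UNDER-COVERED"
    print(f"{competence}: assessed in {count} course unit(s) -> {status}")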

5 Reflecting on the nature, coherence, place in the study programme and variation of the evaluation forms and moments (end-of-term assessment and continuous assessment)

Reflection is possible on the basis of an analysis of the frequencies, coherence and timing of the evaluation forms and evaluation moments in the study programme, using the information in the course sheets: see the annual analysis of our 'competence matrices' plus the biennial 'work and exam forms' exercise.

Directive questions
• Are the evaluation forms and moments balanced in each standard learning track year?
• Are the evaluation forms and moments structured in a logical manner throughout the study programme?
• Is the study programme passable, in other words, feasible for students?

6 Analysing the distribution of exam grades, pass marks and study duration data

The aim is to discuss issues that stand out and to check whether logical explanations can be given for deviating distributions, and whether or not actions are needed. See the score distributions of exam grades and the annual pass mark analysis (a minimal sketch of such an analysis follows the directive questions below).

Directive questions
• Do the pass marks and score distributions of the course units correspond to what can be expected in view of the nature, place in the curriculum, required prior knowledge, etc. of those course units?
• Does the percentage of graduates per year of the standard learning track correspond with the expectations?
• Is there a desired distribution in exam grades?
• Do students pass the study programme within a reasonable study period?
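
By way of illustration, the minimal Python sketch below (with hypothetical grades) computes the pass rate, mean grade and a simple score distribution for one course unit, the kind of figures that the OASIS score distribution tables provide, so that deviating distributions can be spotted and discussed. The data and band width are assumptions for the example only.

# Minimal sketch (hypothetical data): summarise exam grades for one course unit
# so deviating pass rates or score distributions can be spotted and discussed.
from collections import Counter

grades = [4, 7, 9, 10, 10, 11, 12, 12, 13, 14, 15, 16, 18]  # grades out of 20
PASS_MARK = 10

pass_rate = sum(g >= PASS_MARK for g in grades) / len(grades)
mean_grade = sum(grades) / len(grades)

# Score distribution: number of students per band of two grade points
bands = Counter((g // 2) * 2 for g in grades)

print(f"Students: {len(grades)}, mean grade: {mean_grade:.1f}/20, pass rate: {pass_rate:.0%}")
for band in sorted(bands):
    print(f"{band:2d}-{band + 1:2d}: {'#' * bands[band]}")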

7 Monitoring study programme evaluations, educational evaluations and lecturer surveys

To this end, the results of the education and study programme evaluations and the lecturer surveys are analysed on a regular basis with a view to the assessment quality in course units and the study programme. Education evaluations provide information about how students experience the reliability and validity of the assessment within the individual course units, and the programme evaluations provide information about how students assess the variation in evaluation forms, the transparency and validity of assessment, and the degree of feedback in the programme. Lecturers themselves report on their assessment practices in the lecturer surveys. At the faculty of Political and Social Sciences this is done in both the educational quality control unit and the study programme committees.

Directive questions:
• To what extent do the educational evaluations of similar courses correspond?
• Which points of attention with regard to assessment can be derived from the results of educational or study programme evaluations, for the benefit of the study programme or certain course units?
• Can an evolution be identified in the evaluation results over successive academic years?

8 Benchmarking the quality of the final level of the graduates

To guarantee that graduates have acquired the intended competencies, it is necessary that programmes map not only the assessment processes but also the final level achieved, through alumni and work field surveys and analyses of the quality of master's dissertations.

Good practice in the department of Sociology in 2015: screening students who did not pass vs students who barely passed.

9 Analysing and mapping (interim) feedback in the programme

It is necessary to understand the feedback culture, to reflect on it and adapt it when necessary. The answers to the questions in programme evaluations ("You have received sufficient interim feedback on the assignments and papers") and educational evaluations ("The feedback on the evaluations was useful and relevant") provide a first, though limited, picture of the feedback culture.

It may be useful to gather more specific information about some feedback topics.

Examples of this are:
• To what extent is feedback provided throughout the course in addition to the feedback moments prescribed by the Education and Examination Code?
• At what times is interim feedback offered (versus feedback after the evaluation)? Do students receive this feedback in time?
• Are there certain high-stakes course units (in which, for example, complex course competencies are assessed) for which it is desirable to provide interim and / or more in-depth feedback? Can sufficient staff be made available?
• Which forms of feedback are used? (individual feedback, group feedback, automated feedback, peer feedback)?
• Is feedback always referred to as feedback so that students are aware of the feedback culture?
• Are the technological possibilities for maximising feedback sufficiently exploited?

10 Team consultations within the study programme about exams

Mutual coordination between the different lecturers is essential for high-quality assessment and a coherent assessment policy in the study programme. After all, high-quality assessment is based on a shared assessment vision and shared responsibility. Everyone in the study programme has to be involved in the assessment policy within the study programme. Meeting to reflect on assessment and assessment forms is not merely useful; it is absolutely necessary when the issues at stake are important, interrelated or urgent. This may involve, for example, a team meeting of lecturers of course units that come before or after each other, of all lecturers who give writing or presentation assignments, or of all the lecturers of the first bachelor year.

Directive questions
• Which essential issues are assessed in which way and at what level in a previous course unit?
• On what criteria are students assessed for a writing or presentation assignment in the various course units involved? Is it useful to develop a common assessment list?

Good practice:
• It is assessed whether the workload is balanced throughout the course and in each standard learning track.
• This appears to be particularly necessary with regard to the methodological learning path and is done at our faculty, also in view of possible overlap.
• Consultation is also necessary in the first bachelor year.

11 Formulating faculty or study programme-related guidelines on the modalities of evaluations (in addition to central guidelines) where desired.

Faculties or study programmes can formulate additional guidelines when this is desirable or necessary. Depending on the programme-specific accents in the assessment vision, these additional guidelines may focus on validity, reliability, transparency, feedback, fraud prevention, etc., and may or may not be part of the faculty Education and Examination Code.

Good practice at our faculty:
• Feedback is provided within 4 weeks after papers are submitted.
• Students receive at least two sample exam questions. Test exams are organised in the first bachelor year and various test exams are uploaded on Minerva.
• Item analysis is a standard procedure after multiple-choice exams (a minimal sketch of such an analysis follows this list).
• A standardised criteria list is used by the various assessors for the Master's dissertations / internships. See the faculty Education and Examination Code.
• Students can prepare (most) oral exams in writing.
• For each written exam with more than 200 students, a seating plan is drawn up in which students are optimally spread over the auditorium. This is done in practice.
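
To illustrate what such an item analysis could involve, here is a minimal Python sketch (with purely hypothetical answer data) that computes two classic item statistics for a multiple-choice exam: the difficulty index (the proportion of students answering an item correctly) and a simple discrimination index comparing the upper and lower score groups. It is an illustrative sketch, not a prescribed procedure or tool.

# Minimal sketch (hypothetical data): classic item analysis for a multiple-choice exam.
# For each item, compute the difficulty index (proportion correct) and a simple
# discrimination index based on the upper and lower 27% total-score groups.

# Hypothetical answer matrix: one row per student, 1 = correct, 0 = incorrect
answers = [
    [1, 1, 0, 1, 0],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 1, 0, 1],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 0],
]

n_students = len(answers)
n_items = len(answers[0])
totals = [sum(row) for row in answers]

# Upper and lower groups (roughly 27% each) based on total score
order = sorted(range(n_students), key=lambda i: totals[i])
group_size = max(1, round(0.27 * n_students))
lower, upper = order[:group_size], order[-group_size:]

for item in range(n_items):
    difficulty = sum(row[item] for row in answers) / n_students
    p_upper = sum(answers[i][item] for i in upper) / group_size
    p_lower = sum(answers[i][item] for i in lower) / group_size
    discrimination = p_upper - p_lower
    print(f"Item {item + 1}: difficulty {difficulty:.2f}, discrimination {discrimination:+.2f}")

Items with a very high or very low difficulty index, or a low (or negative) discrimination index, are natural candidates for review before grades are finalised.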

Other possible examples (not an exhaustive list)
• An assessment matrix is always drawn up for multiple choice exams.
• Interim feedback is provided at least once in case of large-scale assignments or projects.
• Written exams with multiple choice questions also contain at least one open question.
• At least two assessment moments and / or at least two independent assessors are used for high-stakes evaluations or evaluations of complex competencies.

12 Supervising the application of the four-eyes principle

This means that lecturers are reminded to apply the four-eyes principle prior to the exam. It is ensured that agreements made within the study programme or faculty regarding the application of the four-eyes principle (for example, with regard to the mutual exchange of information about evaluations within courses) are observed.

At UGent, basic confidence in lecturers is of paramount importance. Starting from confidence in the expertise of the lecturers is not at odds with the four-eyes principle.

13 Screening the assessment quality in course units

Faculties / study programmes can themselves determine the composition of the executive body, the selection of course units, the planning and frequency, the focus (which assessment aspects are screened?) and the follow-up of assessment quality screenings of course units.
• At our faculty, this screening is carried out by the faculty assessment committee and the study programme assessment committees (with the director of studies, the study programme chair, the chair of the department, a student and an educational quality control unit employee). The screening may or may not take place in the presence of the responsible lecturer(s) and / or several lecturers of course units with a certain coherence.
• The selection of the course units is based on the 'item' evaluation in the education or course evaluations, on programme-specific points of interest (for example, evaluations in the first bachelor year, evaluations based on specific evaluation forms, course units that assess certain learning outcomes or that belong to a specific learning or assessment path), and ad hoc in the event of problems.
• Planning and frequency: twice a year
• Focus: screening matrix for individual course units
• The follow-up of these screenings takes place in the programme assessment committee / faculty assessment committee, the study programme committee and the educational quality control unit. This can consist of advice and tips to the examiners, annual (anonymous) reports and advice to improve the assessment practice, identification and dissemination of good practice examples, advice regarding the operation of the assessment policy or adjustment of the assessment vision in the study programme / faculty, and reporting on activities and proposals to improve working methods.

Optional activities:

In addition to the above basic activities, it is advisable to work on certain aspects of evaluation depending on the programme or faculty-specific priorities / accents or occasional problems or issues.

Below is a non-exhaustive list of examples of such optional actions, many of which we have already carried out:

For each activity below, an explanation, directive questions, tools and good practices are given.

14 Documenting exams and related tools and making them available for consultation

We do this via the screening matrix for course units that pass through the assessment committees (evaluation form and organisation, assessment vision, assessment communication, coordination) (to be confirmed). Lecturers also provide the latest exam questions and a correction key. The course sheet is also always added.
15 Analysing the organisability of the assessment in the study programme

We always strive for fixed exam schedules (FSA).

It is important not to pass on the assessment of complex competencies solely to the last or last years of a study programme, but to incorporate evaluations earlier in the study programme in which (aspects) of those competencies are given a place and students receive feedback on their performance.

16 Developing learning paths and assessment lines, and organising team meetings concerning examinations within learning and / or assessment paths

Learning and assessment trajectories clarify how students gradually acquire competencies over the course of the programme and where and when these are assessed. It is important to gradually build up the level of education and assessment.

Learning and assessment trajectories make this structure transparent for students and lecturers themselves. This gives students a better view of their learning process and makes it easier for lecturers to situate their own course unit and assessment within the programme.

17 Ensuring adequate assessment professionalisation of the examiners (and those involved in assessment quality assurance)

Professionalisation is essential for the assessment expertise of examiners and other parties involved in assessment quality assurance.

Different possibilities to work on this are:
• Monitor participation rates in central basic training courses (compulsory for new lecturers) and specific evaluation training courses.
• Encourage lecturers, collectively or in a targeted manner, to follow certain basic training courses or specific training courses on evaluation from the central offer (for example, aim for at least one involved lecturer per multiple-choice exam to follow the multiple-choice training).

18 Analysing the master's dissertation and / or internship assessments

This involves screening existing assessment practices with regard to the master's dissertation and internship. Adjusting the assessment procedure and criteria is also part of this. This takes place every year at our faculty.
19 Analysing the nature, amount and distribution of end-of-term and non-periodic evaluations for optimal student feasibility

To this end, it may be interesting to draw up a timeline of all evaluation moments and the estimated preparation time for students. In this process, we can also verify whether or not the study programme is feasible for students. A directive question here is: 'Are the most selective course units correctly located in the years of the model trajectory?'
20 Drawing up, evaluating and adjusting an anti-fraud policy

See the plagiarism regulations in the faculty Education and Examination Code; the Ephorus plagiarism software is used.
21 Making grade agreements in the lecturer team (for example, what do 10/20, 18/20, etc. mean?)

The purpose of this is standardisation, which ensures that equivalent performances get the same score. This also constitutes an opportunity to work on excellence: how to valorise excellent student performance?

Good practice:
Within our faculty, an assessment system for exam grades and for the assessment of master's dissertations has been implemented. This system clarifies what is considered excellent and what a certain assessment or exam grade stands for.

22 Analysing special conditions for success in the different course units of a deliberation set

It is useful to check whether the special conditions that are implemented in each of the individual course units are reasonable and whether they are desirable as a whole within certain parts of the standard learning track. See, for example, the rule in some course units that students must pass both parts (end-of-term and non-periodic evaluation) before they can pass the course unit.

Appendix

Screening matrix