The swift proliferation of digital learning technologies has meant that university educators have had to adapt to the demands of changing patterns of work and student learning, with the enactment of academic practice occurring across a wide range of inter-connected, digital, and physical environments. This paper captures digital change work undertaken by teaching staff on seven different undergraduate programmes at a post-92 university in the North-East of England that has prioritised and supported the design and implementation of flexible assessment arrangements. This strategic work has encouraged teaching staff to rethink how the significant resources devoted to assessment might be reconfigured (even reimagined) to support student learning across different modes of delivery. Focus has necessarily shifted onto the interplay between teaching practices and processes of digital transformation and notions of accessibility, flexibility and students partnering with faculty in real-time to provide authentic assessment solutions.
Drawing on the experiences of academic teaching staff in Chemical Engineering, Interior Design, Business Management, Policing, Social Work, Education Studies, and Paramedics, the paper draws out key insights, challenges, and opportunities for practice innovations, before discussing the pressing considerations for developing and implementing attempts at flexible assessment design. The paper adopts an explicit ‘practice focus’ to counter and challenge the commonplace view of digital tools and technologies as a separate domain in assessment design. The primary challenge, it will be argued, is one of understanding and adapting a wider repertoire of approaches and practices in assessment to encompass new and multiple contexts that are no longer experienced as separate, virtual, or otherwise in students’ learning experiences. In response, a post-digital perspective is adopted as a means of framing how teaching staff might make judgements, not about assessment in general, but about the flexibility in and of the combinations and configurations of diverse practice elements that make up the student assessment experience.
Keywords: flexibility; flexible assessment; assessment; post-digital; digital transformation
Part of the Special Issue Teaching practices in times of digital transformation
As the international higher education sector has responded to the global COVID-19 pandemic, the rapid drive to digitise assessment has raised significant challenges and opportunities for educators. The swift proliferation of digital learning technology has meant that the academy has had to adapt to the demands of changing patterns of work and student learning, with the enactment of academic practice occurring across a range of sprawling, inter-connected, digital, and physical environments. In the face of such complexity, more nuanced, flexible, and responsive assessment design and support has been called for. However, whilst there have been various accounts of digital developments occurring in higher education (HE) assessment (e.g., Sambell and Brown, 2020a, b and c), there has been little accompanying empirical evidence demonstrating what it means to negotiate the opportunities, challenges, and struggles that educators have experienced in envisioning alternative, flexible assessment arrangements, and changing what they do at a practice level.
In response, this paper considers change work undertaken by teaching staff on seven different undergraduate programmes at a post-92 university in the North-East of England that has prioritised and supported the design and implementation of flexible assessment arrangements as part of a broader institutional digital transformation effort. Early assessment change work during the pandemic typically involved instigating an array of viable, short-term, alternative ‘digitised’ assessment arrangements that were loosely framed by more or less elaborate interpretations of digital learning strategy. Despite such arrangements serving as accelerators for the digitalisation of assessment for many HEIs, this early work has not necessarily foregrounded wider efforts at sustained ‘digital’ transformation. The decisive factor here is that digital transformation encompasses more than just the digitisation of information and processes. It is the sum of digital practices and procedures necessary to achieve a change process that enables HEIs to successfully leverage the use of digital technology (Kopp et al., 2019). This paper draws together practice insights from the implementation of one such digital transformation project (see Elkington, 2021) designed to encourage teaching staff to rethink how the significant resources devoted to assessment might be reconfigured (even reimagined) to support student learning across different modes of delivery. Framed in this way, focus necessarily shifts onto understanding and working with the interplay between teaching practices and processes of digital transformation and notions of accessibility, flexibility and students partnering with faculty in real-time to provide authentic assessment arrangements that are sustainable over time.
The onset of the COVID-19 pandemic necessitated rapid changes to established assessment practices and processes for higher education institutions. Such changes were most commonly realised through creating short-term ‘alternative online’ assessment arrangements, such as changing timed examinations to coursework tasks or remote take-home exams. This, in turn, resulted in largely pragmatic strategies to provide emergency support and guidance as practitioners adapted their assessment diets to the demands of hybrid models of learning and teaching. The high-risk nature of many traditional summative assessment methods, such as the unseen, time-constrained examination, was exposed as a feature of normative assessment patterns and practices, further strengthening concerns around the inclusivity of alternative approaches being deployed (Baughan et al., 2020). Assessment in higher education serves several purposes. Firstly, it represents the institutional quality-assured processes that lead to a qualification (assessment-of-learning). Secondly, it provides learning opportunities that feed forward to future improvement (assessment-for-learning). Thirdly, it contributes to and shapes decision-making and helps students to monitor and regulate their own learning (assessment-as-learning). The imperative of securing viable summative assessment alternatives, alongside the need to provide a reliable, verifiable mark for each individual assignment, functioned to limit the use of different assessment methods that have demonstrable value for enabling learning, such as formative (assessment-for-learning) processes (Elkington, 2021; Irons and Elkington, 2021). This narrow view of assessment has been sustained during the recent return to campus through justifiable concerns about staff workloads and consistency and fairness in marking and moderating student work.
The pandemic offered a unique opportunity for universities and practitioners to reimagine assessment, to make it relevant, adaptable, and trustworthy in a rapidly shifting higher education environment (Jisc, 2021).
Calls for greater ‘flexibility’ in assessment arrangements and a concomitant shift in priorities from conventional, formulaic approaches to content learning towards tasks that focus on the ‘process’ of student learning and that are sensitive to students’ related needs predate the COVID-19 pandemic (Boud and Soler, 2016; HEA, 2015; Jones-Devitt, Lawton, and Mayne, 2016; Ryan and Tilbury, 2013). The drive to digitise student assessment during the pandemic raised significant opportunities for embedding flexibility in assessment and devising pedagogic responses consistent with it. This has, in turn, required increasingly ‘blended’ approaches revolving around the combination of teacher and student perspectives, as well as the range of new learning opportunities and approaches offered. These dynamics have blurred the boundaries between distinct contexts of learning and activity, and the often unexpected, interleaved experiences they can engender (Goodyear, 2021; Rapanta et al., 2020). ‘Flexibility’ in assessment from this perspective is about responding to students’ individual learning needs as well as the needs of the curriculum, adopting approaches that go beyond the traditional forms and practical limitations of many established practices. Assessment then becomes relevant to contemporary needs and circumstances and reflective of the learning process, making use of innovative assessment methods and strategies that would be impractical to deliver without digital tools.
When reference is made to ‘flexibility’ in learning and teaching, focus usually falls on realising flexible instruction: that is, educators aiming to implement instruction that supports more flexibility in time, space, pedagogy, and technology (HEA, 2015; Ryan and Tilbury, 2014). Advances in digital technologies have led to significant developments in flexible learning and how the student learning experience is mediated, accessed, and assessed. These developments are partly underpinned by the practical and technological drivers for flexibility in modes of accessing and delivering learning, teaching, and assessment in higher education (Ryan and Tilbury, 2014), as well as a more practical view of flexibility that emphasises the ability of people to think, act, and work differently in complex, uncertain and changeable scenarios (Cope and Kalantzis, 2017). Pedagogically speaking, when facilitated effectively, flexible learning can provide students with the opportunity to take greater responsibility for their learning and to engage in activities and opportunities that reflect their own needs. This expanded view of flexibility can extend to choices students make about assessment methods and formats, as well as the extent to which they have an active role in shaping the assessment approaches and processes they are involved in (Irwin and Hepplestone, 2012).
Flexible assessment, as it is presented here, is intended to move beyond simply providing students with a range of formative and summative tasks and the alignment of assessment tasks with programme learning outcomes (Elkington, 2021). Such a view fails to say much about the intricacies inherent in effectively operationalising assessment and feedback processes in blended learning environments. It also lacks a sense of continuity with the different purposes for assessment and how these might be used to bring together the complex requirements for current, as well as future, learning. More recently, the range and depth of curricular choices available to students have become more comprehensive and widespread, supported by new digital learning tools and technologies as enablers for embedding flexibility in learning, teaching, and assessment by giving students a greater sense of ownership over their studies (Deeley, 2018). Increased student choice and input in assessment processes can also have a positive effect on student engagement and motivation, with the exercise of choice and self-direction leading to greater feelings of autonomy, control, and self-determined action (Boud and Soler, 2016). Irons and Elkington (2021, p. 84) note how utilising educational technology in deploying flexible assessment arrangements has the potential to help shape assessment design through:
supporting the changing and increasingly diverse student population;
providing greater flexibility and choice for students in the timing, content, and location of assessed work;
providing variety and authenticity in assessment design and delivery, i.e., using simulated or blended environments;
supporting and capturing a wider range of skills such as problem-based and inquiry-based approaches;
speeding up the provision of feedback – either through automated responses (e.g., multiple choice questions, MCQs), or through providing generic feedback, or making use of technology for speedier communication (e.g., chat functions);
promoting consistency, accuracy, and fairness in marking assessed work (i.e., through the interactive use of assessment rubrics).
The pandemic prompted academic practitioners to begin factoring many of these features into their assessment planning. However, seeking greater flexibility in assessment in this broader sense has marked a shift from a focus on the way assessment should be done – a task genre – to a focus grounded in the shared practice realities of learners and educators (Bearman et al., 2022). Relatedly, educators have been compelled to reconsider how the significant resources devoted to assessment and feedback might be reconfigured for digital and physical learning environments to better support student learning across different modes of delivery (Elkington, 2021). In this sense, assessment and feedback design represent important variables driving broader environmental changes in what is increasingly recognised as a post-digital reality for HE (Carvalho, Goodyear, and de Laat, 2017; Fawns, 2019; Jandric, 2020). A post-digital sensibility shifts an understanding of digital technology in education away from tangible, ready-to-hand tools and devices, towards a view of entangled relations between physical and digital technologies, spaces, and practices (Fawns, 2022). This emergent post-digital condition signals the need to develop a wider repertoire of approaches and practices that recognise and integrate multiple learning pathways and contexts at the point of assessment design. Advances in digital technology have revealed significant potential for opening up flexible learning, teaching, and assessment processes, extending access, and improving inclusivity (Nieminen, 2022). However, such advances can also bring downgraded pedagogical interactions, raising concerns about the quality of learning experience provided through more flexible patterns of participation.
Where extension of choice and an expansion of delivery models and logistics are the only consideration during the development of more flexible learning pathways and experiences, flexibility as a pedagogical concern can be side-lined or absent amidst a focus on efficiencies, access, and maintaining standards. Instead, appropriate digital tools need to be combined with congruent and flexible pedagogic strategies and the adoption of principles and practices of teaching and assessment that are encompassing of and responsive to new and multiple learning contexts.
Effective strategies for achieving flexibility in assessment require a variety of relevant and accessible approaches, employing a carefully designed and balanced range of summative assessment tasks and associated formative assessment processes that enable all students to demonstrate what they know, understand, and can do. These considerations were core to the institutional approach to embedding and scaling flexible assessment that forms the basis on which this article is developed (for a detailed account of this assessment change work see Elkington, 2021).
The assessment change work undertaken prioritised and supported the design and implementation of flexible assessment arrangements on an institutional scale. When using educational technologies in flexible assessment designs, it was recognised that academic staff needed to consider students’ perception of technology and their skills and abilities in using those technologies (their digital literacy). Educators also needed to be able to make informed decisions about when and how technologies can enhance and transform learning and assessment practices throughout the assessment cycle. To this end, a set of three core principles was developed as institutional guidance to provide academic staff with the basis for thinking proactively about flexible assessment arrangements at module and programme level during the early stages of the pandemic (Elkington, 2021). In year 2 of the pandemic, the emergent flexible assessment principles and guidance were retained to encourage a shift in emphasis away from purely procedural approaches to devising alternative assessment arrangements, in the short-term, towards a broader view of assessment wherein students’ differences could be proactively considered through a sustained model for flexible assessment and not simply the porting of established assessment practices to digital space.
Considering the highly diversified nature of student populations in higher education and the central role assessment plays in driving student learning, inclusive assessment ought to be a primary consideration in the development of assessment strategies. When considering issues of student diversity and inclusivity, academics might naturally turn to particular ‘groups or sub-groups’, such as students with disabilities or specific additional requirements (e.g., dyslexia), and how alternative assessment arrangements might be provided. Recent research illustrates a shift in emphasis away from such procedural approaches to inclusivity and diversity toward an alternative ‘capabilities’ approach (Gun et al., 2015; Hanesworth et al., 2018) that has influenced the development of a broader interpretation of inclusivity in educational practice. Underpinning this broader view are values of equity and fairness, where students’ differences are considered and valued within the mainstream curriculum, pedagogy, and assessment. In contexts affected by the COVID-19 crisis, the source of student differences might not just be cognitive, but also social, economic, and political.
Designing flexible assessment according to this principle means keeping in mind individual differences between students for the purpose of accessibility, employing different combinations of assessment methods, tools, and support to meet the diversity of learning needs for different groups of students to ensure an equitable and relevant learning experience for all students. Academic practitioners, therefore, need to take steps to ensure that across a programme of study, no student is disadvantaged by the nature and pattern of assessment and can engage in diverse forms of assessment practice that maintain academic standards and recognise different backgrounds, learning needs, preferences, and motivations. Students can feel alienated when they find assessment tasks refer to cultures, experiences, gender, and race that are not their own. Assessment tasks, and the digital tools and processes that academics adopt to support them, must be equitably accessible to all students. This is underpinned by an understanding that individual students have strengths, qualities, and skills that will be beneficial for their own learning, as well as that of their peers.
A feature of modern modular course structures is that most assignments have a summative function (assessment-of-learning). In summative assessment the stakes are high for students, which may lead to them taking strategic approaches to their studies, potentially limiting their broader learning. Students often look to summative assessment tasks as an indicator for what it is that they should be learning, especially if there is little in the way of formative assessment to help guide them (Sambell et al., 2013). It should be noted here that both summative and formative assessment have the capacity to promote learning, but it is easy for summative assessment to become the focus of students and academic staff. Assessments need to encompass a simultaneous focus on attainment of standards and on student learning development. An emphasis on assessment-for-learning is relevant here because it means using approaches that help learning to take place as widely and effectively as possible (Sambell et al., 2013). Good assessment-for-learning practice creates low-stakes, formative, opportunities for feedback as students progress. It is important that the role of formative assessment is subjected to careful design and deliberate planning alongside summative tasks, to ensure that key learning outcomes are being addressed and engagement in tasks prompts the kinds of learning desired (Elkington, 2019).
According to this principle, assessment is ‘learning-focused’ when it is designed to actively involve students in assessment processes in ways which develop their ability to self-monitor and regulate their own learning behaviour, and when feedback is appropriately future facing and can be acted upon in timely and meaningful ways. Formative assessment and formative feedback are integral to learning-focused assessment design. Effective feedback is more important than ever in instances where there is need to accommodate flexible study patterns and maintain productive student engagement in blended learning environments. Designing in regular formative opportunities was encouraged as a recognised way of checking for individual understanding; giving students an indication of where they are in relation to achieving learning outcomes or standards, where they need to progress to, and how they will be able to reach the expected level. Students should be gradually introduced to the idea of flexible assessment at a module level, where early low-stakes (formative) assessment tasks are broken down into separate, yet interrelated, assessment components, allowing them to complete tasks in a proficient way and improve skills without feeling overwhelmed.
For students to feel capable of fully engaging in their learning in higher education, it is important that they have a good understanding of the requirements of assessment and how the overall assessment design fits together, including familiarity with the related terminology, standards and criteria, assessment methods, skills, and technologies and tools. Evidence shows that students’ performance is likely improved when they have a better understanding of the purpose of an assessment task and the standard of work expected (Smith et al., 2013). It is important that steps are taken to ensure that assessment is explicitly positioned to develop students’ ‘assessment literacy’ as part of an interactive process in and through which they acquire understanding of assessment practices (Price et al., 2012). This includes making students aware, early on, of the technical and practical requirements of all assessment arrangements and tasks (i.e., specific considerations such as defined start and finish times or restrictions on what can be submitted as assessed work, such as file sizes).
Devising opportunities for students to actively engage with assessment criteria for learning, through activities such as self-evaluation and the analysis of exemplar work using rubrics, can have positive effects on learning, helping students to ‘see’ standards and criteria in concrete ways and develop their capacity to regulate their own work through their ability to judge, select and apply appropriate approaches and techniques to assessment tasks. When facilitated asynchronously in online settings, such activities provide students with accessible means of posting work and receiving feedback over a longer timeframe (Irons and Elkington, 2021). Offering opportunities for student dialogue around commonly shared activities, such as moderated discussion forums or other forms of peer-to-peer dialogue and learning, can be helpful in promoting positive and enduring assessment engagement and support.
A crucial element of effective technology-enabled assessment is the ability of educators to discern the affordances of specific technologies and how these can support assessment ‘for’ learning (Irons and Elkington, 2021). This becomes especially significant given that much of what students typically report as being ‘most valuable’ about digital technologies relates to a ‘strategic academic focus’; completing prescribed academic work and performing tasks ‘well’ (Henderson et al., 2017). Assessment arrangements, therefore, needed to be adaptive enough to capture actual student learning that takes place ‘in practice’. Through institutional guidance and supported by principles for flexible assessment (above), academic teams were encouraged to use a variety of assessment methods, rather than relying on one or two signature (high stakes) assessment methods (e.g., written exams), enabling each student to enhance their strengths and challenge their less-developed learning and skills, helping to develop a broader range of potential learning outcomes. The ability to combine the testing of authentic knowledge, skills, and personal qualities is regarded as an important component of integrated assessment (Crisp, 2012). This is supported by findings from a previous study by Elkington and colleagues (2021), who reported key insights from healthcare course leaders in relation to their experiences of working with the same flexible assessment principles to develop and embed alternative authentic digital assessment practices in response to the pivot online during the pandemic. Working to embed flexibility in alternative online practical healthcare assessments revealed that a learning-focused approach was integral to student success and engagement, operationalised through an intentional shift in emphasis on to more actively engaging students with formative assessment tasks and processes.
Elkington and colleagues found that, for the healthcare-based courses involved, such learning-focused assessment effectively scaffolded authentic interactions and practices through utilising a variety of digital tools and technologies to support students to better understand and regulate their own learning across different modes of study. The present study has aimed to develop a fuller understanding of flexible assessment design and practice by exploring a wider range of university teachers’ recent experiences when creating or significantly modifying assessment arrangements during pandemic-affected periods of delivery. More specifically, the study enquires into the nature of flexible assessment arrangements in higher education through the following research question: ‘to what degree does the process of digital transformation impact the implementation of flexible assessment designs and practices?’.
A qualitative research approach was adopted consisting of in-depth participant interviews as the main data collection mechanism. The study used in-depth semi-structured, online interviews to explore staff perceptions of the digital transition to assessment in response to the COVID-19 pandemic. The aim was to document and understand the lived teaching and assessment experiences of university teaching staff to reveal meaning through a process of interpretation (Annells, 1996; Gadamer, 2013).
An initial analysis of institutional assessment outcomes captured the range of design/policy changes alongside those courses displaying the largest percentage reduction in attainment/awarding gaps (for 2019-20) and improved student continuation rates (for 2020-21), utilised as indicators of high performance. The dataset was predominantly based on the UK Higher Education Statistics Agency student return information, including access and participation, and also the Destinations of Leavers from Higher Education progression indicator. The institution-level summary data is publicly available on the Office for Students website. This initial phase of institutional data analysis identified 10 courses displaying a combination of reduced attainment/awarding gaps and improved student continuation rates. A final list of 7 courses was then taken forward for further study on the basis that staff were available to take part in the subsequent data collection process, namely: Chemical Engineering, Interior Design, Business Management, Policing, Social Work, Education Studies, and Paramedics.
From the institutional-level data, course leaders and teaching faculty who were directly involved in the design and delivery of the identified programmes were approached. Faculty were eligible if they were currently employed on an academic contract at the institution and either delivered teaching content or were involved in the assessment of identified modules. Academic faculty who consented to inclusion were asked to participate in one-to-one interviews with a researcher aligned to the project. In total, 14 teaching staff spanning the 7 identified courses participated in the study. The individual staff interviews were held via an online format (Microsoft Teams), at a time convenient for the respondent, and lasted around 30-40 minutes on average. All interviews were conducted by the two primary authors, both experienced in qualitative interviewing.
An interview question schedule was devised to inform semi-structured interviews with participating teaching staff. All interview questions were piloted amongst a selection of academic staff, resulting in minor amendments to the format of the questions, including the wording, to improve the overall accessibility of the language used in the question schedule. The open-ended questions acted as a prompt for staff to discuss and reflect upon their experiences of working with flexible assessment. Question themes included asking faculty to explain the steps and reasoning for their chosen assessment processes and how these had been made accessible and fair for students. The positives and challenges of such alternative assessment practices were explored. Finally, staff were asked about the sources of information and/or support they had found most useful when devising alternative assessment arrangements. Any additional questions were conversational in nature and were raised following topics/themes generated in respondent answers (Creswell and Poth, 2018). This allowed for a flexible approach to the semi-structured interviews, ensuring that the key themes were explored whilst allowing opportunities for additional data to be generated (Greenhalgh et al., 2020).
Staff interviews were recorded with qualitative responses transcribed verbatim. Inductive thematic analysis was employed to transform the data and identify key themes (Braun et al., 2018). Transcripts were transferred into a document to allow respondents’ comments to be reviewed by the researchers (Braun and Clarke, 2006). Both researchers read the transcripts in full independently of each other. Following this, the researchers coded similar comments manually, enabling the grouping of themes into broad categories. These themes were further reviewed, with additional coding permitting a set of sub-themes and main themes to be created. This was completed by each researcher independently of the other. These themes were redefined and re-grouped where advised, following a process of triangulation and peer debriefing. Critical discussion between researchers occurred to verify, modify, and refine the themes (Gadamer, 2013). For any disagreements (e.g., formation of themes, coding), a third researcher provided an independent review of the disagreement. Hermeneutic revisiting of the data set reduced researcher prejudices or biases which may have de-valued the theme generation.
This section presents findings according to four themes relating to staff experiences of devising and implementing flexible assessment, generated through data analysis. This is supported by drawing on illustrative quotes from participants that both typify common perspectives on, and a continuity with, the institutional view of flexible assessment (presented in preceding sections), alongside accounts of alternative views on, and discontinuity with, recognised flexible assessment principles. Pervading patterns of hybrid or blended learning and teaching during the COVID-19 pandemic, and those accompanying the more recent transition back to campus-based provision, render explicit the diversity of ways in which students perceive and understand assessment information, how they navigate assessment within different learning environments, and the ways in which they are motivated to learn and express what they know in and through different assessment arrangements. In this section, staff experiences are presented as illustrative of the range of flexible assessment designs and practice examples discussed in interviews, alongside the opportunities, issues, and challenges they afford.
With new digital learning tools and technologies, there are multiple opportunities to capture performance and assessment data and to analyse them, through different forms of integrated activity, to understand how students are progressing. The simplest of these activities for teachers in the present study tended to focus on providing early feedback information based on student performance on a specific learning task. For example, shifting lab-based assessments online for Chemical Engineering students prompted one participant to seek out alternative arrangements that simulated the kinds of authentic collaboration and interaction encouraged in face-to-face lab settings:
“Creating small online lab groups in which students were encouraged to work through different case-based lab scenarios as part of the assessment process before presenting their final lab reports via either a recorded presentation or written lab report […] provided opportunities to provide and receive feedback at different stages and still allowed for authentic interactions and discussion around lab-based tasks” (Participant 1, Chemical Engineering).
When it comes to developing knowledge and skills in relatively complex and challenging domains, issues relating to motivation and engagement emerged as important feedback areas. The strategy of including early low-stakes opportunities for assessment feedback provided regular touch points for staff and students to consider progress and performance. This was particularly appealing when dealing with larger classes, where there was pressure to adopt more apparently efficient forms of assessment, as a Business Management lecturer remarked:
“We introduced weekly tasks into our online seminar sessions where students were asked to complete a short reflection on their assessment progress as an individual blog before sharing and discussing key insights, issues, and challenges in small groups in the session and then continuing asynchronously afterwards too […] Working with such a large student group, this really helped with gauging overall student progress from a staff perspective, whilst also allowing for individualised feedback which student seemed to respond positively too” (Participant 6, Business Management).
Here the fundamental design principle is that these tasks provide ‘developmental feedback’ to students on their progress. This might include such activities as lecturer-guided discussions around student understanding of tasks, criteria, and standards; peer-led discussions of exemplar work preceding draft work; and encouragement of self-review and reflection on feedback received on draft work. Where this strategy was deployed, early experiences of formative activities were necessarily lecturer-led, enabling deliberate practice aimed at developing student understanding and confidence relative to different assessment tasks and approaches, as one Policing lecturer describes:
“Building more formative opportunities around summative tasks certainly seemed to provide a good way of focusing students onto the process of developing their assessment work and recognising the different bits of feedback information they receive that can help improve it” (Participant 11, Policing).
Where multiple summative assessments were to be included in a single module, another strategy discussed by participants was to design these tasks so that they were connected, or ‘phased’, as parts of an integrated assessment experience. Here, careful thought needed to be given to combinations of low- and medium-stakes (process-focused) tasks and the role they play in students’ learning development.
“Due to the practical nature of the course, the challenge was how to virtually engage students with practically relevant assessment tasks that still developed the kinds of skills needed. Breaking the assessment process down […] including formative activities that emphasised working through different recorded practice scenarios and then reflecting on the skills being demonstrated seemed to help to inform students’ responses to the summative tasks that followed” (Participant 4, Social Work).
The use of online collaboration tools, blogs, and discussion boards gave students opportunities to offer and consider divergent responses to assessment tasks and encouraged interaction with other learners. Participants reported that using such interactions to encourage regular dialogue around ideas of quality work, and/or to facilitate peer review and peer feedback as integral components of the assessment process, seemed to support student understanding of the standards expected of their work on a particular task. This positioned assessment as part of an interactive process in which students acquire understanding of assessment practices, criteria, and standards through open, active engagement and participation. As an example:
“It was interesting […] we ended up placing quite a lot of emphasis and effort on putting together different ways to encourage students to engage with the assessment brief and criteria. By having students consider the assessment criteria through focused online discussion or in small-group breakout spaces and applying these criteria to exemplar artefacts seemed to prompt useful interactions around understanding assessment work” (Participant 6, Business Management).
Taken together, the practice examples captured here demonstrate how flexible assessment designs can be created or adjusted to model certain types of student behaviour – i.e., to combat a lack of engagement, to encourage more independent student learning, and to mitigate the risks of cheating or possible inequitable access to resources – with varying degrees of success. Participants also highlighted cases where limitations in the tasks themselves, or in the ways they had been set up and implemented, caused challenges: some digital tools did not integrate with one another, and students were sometimes limited in accessing certain resources, as noted by one Law lecturer:
“A certain task might require students to navigate between one content area on the VLE and then a shared collaborative online space elsewhere. Each provided a window of opportunity for students to work with available resources, but reflecting on it now issues with (a lack of) technical integration sometimes meant there often wasn’t much engagement which then defeated the object of running the tasks together” (Participant 8, Law).
These (infra)structural issues meant that some options were not open to staff and students, even though they were regarded as pedagogically and practically desirable. Some practices simply did not happen because teachers and students were not co-located in time and space. Several participants reflected upon the fluidity of the blended and online assessment spaces created through adopting such alternative assessment arrangements, and how such spaces can represent and create, as well as close off, opportunities for learning:
“It’s challenging because you work to put forward different opportunities […] using different combinations of digital tools and arrangements to encourage students to engage, but we very quickly realised that not all students experienced things in the same way. Depending how and where and when they are accessing the work, these arrangements can either bring them in or shut them out” (Participant 3, Interior Design).
Moving beyond providing a range of assessment types predefined by the teacher, one flexible strategy variously deployed by participants was to offer students negotiated and managed choice between an accepted range of ‘alternative’ assessment methods. Encouraging students to work closely with teachers to negotiate and agree equitable assessment arrangements aimed to foster greater student responsibility for learning and to improve engagement – a critical consideration at a time when students are likely to need to shift between blended and fully online delivery models, as discussed by one Education Studies lecturer:
“We tried to set things up so that students worked through a series of early online formative interactive tasks to help familiarise them with the assessment criteria and the different ways they might express these through their assessment work […] before giving them the option of producing either a recorded audio-visual artefact or a written equivalent. In the end we received a mix of artefacts with students saying they valued the opportunity to pick how best to express their learning” (Participant 10, Education Studies).
Wider institutional guidance for designing flexible assessment advised that available alternative assessment arrangements should be equivalent in both their relative weightings and capacity to demonstrate the learning outcomes and assessment criteria of the module, as well as the level of challenge they present students (students should be able to complete either task with the skills being developed). Participants reported that working to embed greater flexibility into their assessment arrangements revealed the importance of giving careful thought to how combinations of different assessment methods and tasks might meet the learning needs and preferences of diverse student groups. For example, on the Paramedic course:
“I was more sensitive than I had previously been to how students might receive and respond to different types of assessment and how important it is to think through how those tasks might be experienced […] to put forward designs that open up rather closed off assessments for students” (Participant 12, Paramedic).
A useful strategy utilised in different forms by participants in this respect was to introduce easily actionable ‘formative’ opportunities designed to enable students to trial new practices and build confidence in using learning tools and technologies. As an example:
“I made a conscious effort to select a certain range of digital tools I wanted my students to use and created regular time and space for focused practise and knowledge checks throughout the (assessment) process […] using online discussion board tasks and shared online collaborative spaces like Microsoft OneNote to provide opportunities to monitor their progress and provide them with timely feedback” (Participant 1, Chemical Engineering).
Collaborative multimedia sharing tools enable staff and students to share products such as videos, presentation slides, and images; to create, share, and comment on digital content; and to give students a platform for peer feedback exchanges. Such tools were regularly used by participants to craft accessible multimedia feedback that is content-specific and timely, enhancing lecturer-student and peer-to-peer dialogue through feedback, as well as enabling students to solicit feedback from multiple sources and prompting them to follow up on the feedback information they receive. This provided the opportunity to shift the emphasis onto more student-led approaches, as one Social Work lecturer commented:
“A mix of feedback information for students casts a wider net in terms how we can engage them with relevant information about their assessments […] it also means not all of the emphasis is on me as the module lead to give it to them. Having those different digital tools and spaces available sets up different conversations around assessment that have been useful for students to access and work with different information on their progress” (Participant 5, Social Work).
Connecting students through common activities and shared experiences using different forums (e.g., discussion boards) and tools (e.g., shared class blogs) appeared to provide flexible, timely, and accessible opportunities for students to interact online around topics relevant to assessment tasks, in ways that are often more accessible and personally meaningful to them than lecturer-mediated discussion of similar topics. There was a recurrent belief amongst participants that students both expected and valued such low-stakes activities, with most feeling they were successful in the absence of more conventional in-person forms of engagement.
A powerful point of reflection shared by many participants was how an over-reliance on conventional forms of assessment, such as essays and exams, can mean that certain students in a diverse cohort could be unfairly disadvantaged, as they do not have the same opportunities to demonstrate their capabilities (e.g., strengths in oral communication). For example, on the Law course:
“It was clear pretty quickly that just shifting physical exams online wasn’t going to meet students where they were in terms of their circumstances and working conditions at home” (Participant 7, Law).
The rapid expansion and deployment of digital learning tools and technologies has provided opportunities for new forms of representation and different ways of working with information and content to demonstrate knowledge, skills, and achievement – something that was openly supported and encouraged through institutional guidance on designing flexible assessment. As an example:
“Reframing assessment tasks as ways of producing learning artefacts definitely opened things up in terms of viable alternative designs […] focus moves past just covering your bases in terms of reliability of tasks themselves to really needing to think about the different ways students can communicate what they’ve learned” (Participant 4, Social Work).
The affordances of certain digital strategies gave teachers scope to design different forms of assessment that enabled students to document their achievements and progress in a variety of ways and over different timescales. Several participants reported that building a greater level of flexibility into assessment processes also provided a means of enhancing students’ understanding of fairness and transparency. For one Interior Design lecturer, this involved instigating and mediating online dialogues with students with the aim of supporting them to understand the goals of the assessment task set:
“It was important for me to find ways of replicating those interactions students would ordinarily be having in seminars and tutorials around the nature of the assessment task and criteria. Not being able to physically bring us together […] I really wanted to encourage students to explore and discuss assessment tasks together […] setting up a discussion-based task on Teams or a discussion board activity in the VLE that could be facilitated both live and asynchronously just opened things” (Participant 2, Interior Design).
Facilitating such dialogic engagement around expectations for assessment appeared to help teachers work alongside students to unpack and understand the requirements and assessment criteria on which academic judgements would be made and grades assigned. Such flexibility had implications not only for how staff communicated assessment briefs, expectations, and processes, but also for how they operationalised feedback practices. Through the social affordances of digital tools such as online discussion forums, blogs, and e-portfolios, there are ‘real-time’ opportunities for extending decision-making in assessment by sharing assessment challenges and experiences across a wider cohort group. This might come in the form of a peer comment against the criteria of a rubric, a select-response question where the answer can be immediately checked, a reply to a discussion board or blog post, or a review of work in an e-portfolio.
Participant accounts of operationalising alternative assessment designs and practices provide valuable insights into the practice realities of negotiating assessment change, and present evidence as to why a large proportion of the experimentation and change taking place in the area of flexible assessment was necessarily geared toward embracing models of ‘co-creation’:
“There was no real alternative than to bring students into the conversation about assessment design […] thinking flexibly around the designs and experiences you want to model for with students has definitely been a source of creativity and challenge for me but in a positive way” (Participant 13, Chemical Engineering).
“There’s needed to be a real practical emphasis on more discursive assessment designs to set up tasks that are more responsive to students’ individual situations” (Participant 8, Law).
Whilst efficiency was certainly a motivation, such models also functioned to reposition the power and authority of the teacher ‘as expert’ and make space for enhanced contributions from learners. By utilising the affordances of available digital tools to (re)configure the dynamics of learning interactions, and by confronting the predominant ‘teacher-led’ framing of assessment design, participants addressed the challenge of shifting the locus of interaction between teacher and student, making moves to involve students more actively in the assessment process. A Business Management lecturer noted:
“It was a case of connecting students to the assessment conversation and making sure there was enough flex in the designs to allow them to have a say in how things unfolded for them. Some students found this a challenge […] wanting a more directive approach. But other students responded very positively to having a say in things” (Participant 10, Business Management).
The varied blended locations and practice assemblages created through flexible assessment arrangements offered glimpses into affective and discursive encounters that had not previously been available to staff:
“Encouraging students to be more actively involved in assessment and not just in finding solutions to issues that might come up […] but helping shape what they do through joined-up discussions and interactive activities online definitely created different spaces for meaningful interactions” (Participant 9, Education Studies).
Participants also reported the need to overcome certain practical and logistical barriers associated with deploying such digitally integrated assessment designs. This involved the capacity to anticipate challenges when creating a flexible assessment design, which, in turn, often required staff to adapt a design iteratively over several subsequent interactions with students to improve it. For example:
“It was certainly the case for me that what I set out to do with students needed to be checked and adapted as things progressed. There were issues with access and then challenges around certain tools not working which needed immediate attention and responses to course correct” (Participant 3, Interior Design).
“I found it was a case of focusing and refocusing as I went […] through interactions with students it was clear when certain elements of what we were trying to do weren’t working. I guess this is part of being flexible with assessment isn’t it? Working through issues to find a solution that works practically” (Participant 6, Business Management).
Participants described the pressure of needing to find a way to make their alternative assessment arrangements work using available digital tools, which often resulted in adaptations or compromises, with the related risks of such improvisation and experimentation perceived to be a particular point of tension. Practical and logistical challenges associated with effectively adopting flexible assessment designs, the time required and unanticipated costs of operationalising such designs from a resource perspective, as well as the uncertainty around whether final designs would land with students, were recurrent factors that led many participants to either simplify or even abandon designs mid-deployment.
These findings provide insights into how teachers have approached the integration of technology into assessment and how such technologies have influenced their interpretation of flexible assessment designs. The themes captured here highlight the opportunities (continuities), as well as the issues, quandaries, and blockages (discontinuities) that have emerged during the process of negotiating the tension between achieving flexibility in assessment as set out in institutional principles and guidance and having to generate efficiencies in assessment at a practice level.
Capturing lecturers’ experiences of devising and implementing flexible assessment arrangements with technology-supported designs has helped to craft an understanding of why some designs appear more effective than others in practice. New or alternative flexible designs and approaches have been shown to introduce new opportunities, as well as unanticipated inefficiencies, particularly when an approach did not have the desired impact or shape student behaviour as intended. Indeed, such issues were far more prevalent in how participants described their experiences of leading assessment change than were technical failures during the implementation of such arrangements. Fundamentally, flexible assessment designs need to be inclusive and adaptable, ideally offering an element of choice to students and staff in how they navigate and satisfy the expectation to demonstrate required learning outcomes.
Seeking greater flexibility in assessment, from this perspective, means developing assessments that allow the widest range of students to participate whilst also resulting in valid inferences about their performances and learning. Though the form and scope of flexible assessment designs will inevitably shift and change depending on the purpose of work being undertaken, there are signs that the recent rapid expansion and deployment of digital learning tools and technologies has been a vector for change for many established assessment practices. Crucially, the evidence presented suggests that taking steps to embed greater flexibility in assessment can provide students with an equivalent, rather than identical, opportunity to demonstrate their understanding, offering examples of the kind of inclusive pedagogic reform that Kalantzis and Cope (2016) claim is needed, aligning with the affordances of new media to promote ‘productive diversity’ in higher education.
By harnessing relevant technologies and digital tools, the student experience can be enhanced through better access to assessment information and adopting learner-focused designs that make use of a broader range of tasks, automated or speedier feedback, and timely student-student and student-staff dialogue regarding assessment. Utilising such arrangements also seems to have provided opportunities for new forms of representation and the use of multiple modalities to demonstrate student achievement and progress in a variety of forms and over different timescales – i.e., blogs, e-portfolios, and a variety of alternative audio-visual assessment formats have been effectively implemented, enabling students to create multi-modal artefacts and different forms of documenting their progress and showcasing achievement. Taken together, the use of flexibility in assessment formats in this regard supports core agendas of accessibility and promoting independent, self-directed learning at the point of assessment design (Hanesworth et al., 2019; Bearman et al., 2022).
An emphasis on ‘assessment-as-practice’ acknowledges the everyday activities of assessment as they are actually conducted, rather than framing assessment in terms of what it ‘should do’. Preferencing flexibility in assessment attends to many of the issues that are masked when assessment is framed in terms of setting tasks, testing, and grading as an act of producing objective data about student performance. This has, in turn, revealed the complexity of actions, the multiplicity of demands, and the need for different kinds of representation in how teaching and assessment arrangements are structured for different disciplines and modalities of study (Goodyear, 2020). In this way, flexible assessment also challenges the idea of pre-set assessment designs, as evidenced by the various local and in-the-moment adjustments made by practitioners. The range and variety of technology-supported assessment designs captured and described by participants suggests that flexible assessment is the product of a dynamic, shared relationship between the teacher, the student, the digital tools and technologies deployed, and the broader context – none of which remain static – and this speaks directly to what makes assessment authentic (McArthur, 2023).
Acknowledging the variable influence of digital technology as part of a wider mix of interleaved relations provides a practical reference point with which to navigate the territory between pedagogy-led and technology-led approaches to assessment design. From a post-digital perspective, though digital technology has implications for a multitude of practices, such possibilities are always socially and materially situated, as well as relative to the practice traditions, culture, policy, and infrastructure in which they are embedded (Knox, 2019). Framing assessment in this way helps us better understand how particular assessment designs and practices tend to persist over time, anchored as they are in a web of established structural and cultural conventions, and how being sensitive to wider practice contexts means taking account of information and influences beyond what is in immediate focus through assessment design. Consequently, teachers must also account for how blended environments, with their varied material and social configurations, are shaped not only by the pedagogic designs they deploy but also, simultaneously, by a confluence of institutional policies and centralised configurations of digital technologies (Knox, 2019).
Instead of assuming sameness and homogeneity in student assessment experiences, flexible designs need to calibrate and set up a variety of educational options for students aligned to their needs, patterns of readiness, interests, and circumstances (Haniya and Roberts-Leib, 2017). This has been shown to have important consequences for staff assessment orientations, requiring them to be proficient in flexible assessment design as well as adept at the orchestration of flexible assessment practices, taking account of and being responsive to the unfolding ‘lived’ assessment experiences they share with students. This study has also presented evidence that such orchestration will inevitably involve a combination of workarounds and adaptations ‘in’ practice as students and staff negotiate and (re)interpret formal assessment processes into situated practices.
As we begin to contemplate the longer-term considerations for sustaining flexible assessment, it is clear deeper exploration of the relationship between flexibility and assessment is required. This, in turn, involves critical questions about how best to position university assessment to be relevant both now and in the future. What is clear from the participant accounts considered in this study is that flexible assessment embraces overlapping and contested value systems with multiple and often conflicting outlooks and motivations at the individual, local, and institutional level that continue to shape the nature and extent of assessment change in response to the pandemic. Flexible assessment, therefore, becomes a site of value conflict, with the orchestration of assessment, in turn, becoming a key exercise in the management of conflict between different orientations to the idea of flexibility in assessment at a practice and institutional level. Importantly, this diversity in orientation has revealed that there are multiple ways to be flexible with assessments while still challenging students, maintaining rigour, and continuing to provide required structure and support.
Proactively seeking flexibility in assessment from this perspective requires a practice focus which can help to avoid unproductive assumptions and generalisations about assessment design. A post-digital view of assessment helps to highlight and understand what might be ‘new’ about such practices and our relationships with the digital in these designs whilst also recognising the ways that educational technology is already embedded in, and entangled with, existing assessment practices and wider systems (Fawns, 2019). The developments captured in this study have shown that such practices are not experienced as isolated individual digitisation projects; rather they are illustrative of the complex inter-dependencies that practitioners and organisations share while undergoing digital transformation processes. This distinction is important in highlighting how considering flexibility in relation to teaching and assessment has not been automatic but can prompt significant change in the relational dynamics of assessment design. This requires a shift in practice thought beyond a ‘task genre’ of assessment (Bearman et al., 2022) to instead consider designs for a unique time and place, accounting for challenge, setting, and circumstance(s).
It is recognised that the examples provided in this paper of teaching staff working through flexible assessment designs and practices are both emerging and contingent; that is, they are part of a broader process of digital transformation and, as such, will be experienced in different ways depending on disciplinary influences and on the interplay between other relevant contextual and socio-cultural factors. Recent experiences of responding to the assessment challenges of the pandemic may have accelerated some pre-existing trends in the adoption of technology-supported assessment. However, the practice realities of teaching staff captured in this study have revealed that whilst available digital tools and technologies might be deployed with the intention of ‘enhancing’ existing pedagogical arrangements, more attention still needs to be given to better understanding technology-human associations in the context of flexible assessment design. In response, adopting a practice perspective has been discussed as important not only for designing and understanding meaningful assessment activities and processes, but also for resisting agendas that would have us use technology as a basis for streamlining, standardisation, and efficiency without considering how such changes manifest in the shared practice realities of staff and students.
Sam Elkington, Student Learning and Academic Registry, Teesside University, Middlesbrough, United Kingdom.
Dr. Sam Elkington is Professor of Learning and Teaching at Teesside University where he leads on the University’s learning and teaching enhancement portfolio. Sam is a PFHEA and National Teaching Fellow (NTF, 2021). He has worked in Higher Education for over 15 years and has extensive experience working across teaching, research and academic leadership and policy domains. Most recently Sam worked for Advance HE (formerly the Higher Education Academy) where he was national lead for Assessment and Feedback and Flexible Learning in Higher Education. Sam’s most recent book (with Professor Alastair Irons) explores contemporary themes in formative assessment and feedback in higher education: Irons and Elkington (2021), Enhancing learning through formative assessment and feedback. London: Routledge.
Email: [email protected]
Paul Chesterton, SHLS Allied Health Professions Centre for Rehabilitation, Teesside University, Middlesbrough, United Kingdom.
Dr. Paul Chesterton is a Professor of Learning and Teaching at Teesside University. He is a National Teaching Fellow, Collaborative Award for Teaching Excellence winner and Principal Fellow of Advance Higher Education. His interests include ensuring students are partners within higher education and the development of allied healthcare education.
Paul is a qualified physiotherapist and Trustee of the Chartered Society of Physiotherapy Charitable Trust. Prior to teaching Paul spent a number of years working in professional sport as a physiotherapist, most recently at a Premier League Football Club. Together with his sport experience he has extensive experience working with the National Health Service, Occupational Health, and Private Practice.
Email: [email protected]
Article type: Full paper, double-blind peer review.
Publication history: Received: 30 November 2022. Revised: 03 February 2023. Accepted: 06 February 2023. Published: 29 May 2023.
Cover image: Badly Disguised Bligh via Flickr.
Annells, M. (1996). Hermeneutic phenomenology: Philosophical perspectives and current use in nursing research. Journal of Advanced Nursing, 23(4), 705-713.
Baughan, P., Carless, D., Moody, J., & Stoakes (2020). Re-considering assessment and feedback practices in light of the Covid-19 pandemic. In P. Baughan (Ed.) On Your Marks: Learner-focused Feedback Practices and Feedback Literacy (pp. 179-191). York: Advance HE.
Bearman, M., Nieminen, J. H., & Ajjawi, R. (2022). Designing assessment in a digital world: an organising framework. Assessment & Evaluation in Higher Education, 1-14.
Boud, D., & Soler, R. (2016). Sustainable assessment revisited. Assessment & Evaluation in Higher Education, 41(3), 400-413.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101.
Braun, V., Clarke, V., Hayfield, N., & Terry, G. (2018). Thematic analysis. In P. Liamputtong (Ed.), Handbook of Research Methods in Health Social Sciences. Singapore: Springer.
Carvalho, L., Goodyear, P., & de Laat, M. (Eds.). (2017). Place-based spaces for networked learning. London: Routledge.
Cope, B., & Kalantzis, M. (2017). E-learning Ecologies: principles for New Learning and Assessment. London: Routledge.
Cope, B., & Kalantzis, M. (Eds.). (2016). A pedagogy of multiliteracies: Learning by design. London: Springer.
Creswell, J. W., & Poth, C. N. (2016). Qualitative inquiry and research design: Choosing among five approaches. Sage Publications.
Crisp, G. T. (2012). Integrative assessment: reframing assessment practice for current and future learning. Assessment & Evaluation in Higher Education, 37(1), 33-43.
Deeley, S. J. (2018). Using technology to facilitate effective assessment for learning and feedback in higher education. Assessment & Evaluation in Higher Education, 43(3), 439-448.
Elkington, S. (2019). Assessment: understanding the basics. In S. Marshall (Ed.), A Handbook for Teaching and Learning in Higher Education (pp. 72-80). London: Routledge.
Elkington, S. (2021). Scaling Up Flexible Assessment. In P. Baughan (Ed.), Assessment and Feedback in a Post-Pandemic Era: a time for learning and inclusion (pp. 31-41). Advance HE Pedagogic Innovation Series.
Elkington, S., Chesterton, P., & Cosson, P. (2021). New Directions for Student Engagement in Authentic Healthcare Assessment. Student Engagement in Higher Education Journal, 4(2), 146-164.
Fawns, T. (2019). Postdigital education in design and practice. Postdigital Science and Education, 1(1), 132-145.
Fawns, T. (2022). An Entangled Pedagogy: Looking Beyond the Pedagogy—Technology Dichotomy. Postdigital Science and Education, 4, 711-728.
Gadamer, H.-G. (2013). Truth and method. A&C Black.
Goodyear, P. (2020). Design and co‐configuration for hybrid learning: Theorising the practices of learning space design. British Journal of Educational Technology, 51(4), 1045-1060.
Goodyear, P. (2021). Afterwords: considering the postgraduate, postdigital and postcritical. In T. Fawns, G. Aitken, & D. Jones (Eds.), Online postgraduate education in a postdigital world – beyond technology. Cham: Springer.
Greenhalgh, S., Selfe, J., & Yeowell, G. (2020). A qualitative study to explore the experiences of first contact physiotherapy practitioners in the NHS and their experiences of their first contact role. Musculoskeletal Science and Practice, 50.
Gunn, V., Morrison, J., & Hanesworth, P. (2015). Equality and diversity in learning and teaching in Scottish universities: trends, perspectives and opportunities. Higher Education Academy (HEA), available at: http://radar.gsa.ac.uk/4225/1/equality-diversity-learning-teaching-scottish-universities.pdf
Hanesworth, P., Bracken, S., & Elkington, S. (2019). A typology for a social justice approach to assessment: learning from universal design and culturally sustaining pedagogy. Teaching in Higher Education, 24(1), 98-114.
Haniya, S., & Roberts-Lieb, S. (2017). Differentiated Learning: Diversity Dimensions of e-Learning. In B. Cope and M. Kalantzis (Eds.), E-Learning Ecologies: principles for New Learning and Assessment (pp. 183-206). London: Routledge.
Henderson, M., Selwyn, N., & Aston, R. (2017). What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education, 42(8), 1567-1579.
Higher Education Academy (2015). Framework for flexible learning in higher education. Frameworks for student success series. Available at: Flexible learning in higher education | Advance HE (advance-he.ac.uk) (last accessed 29/11/2022).
Irons, A., & Elkington, S. (2021). Enhancing learning through formative assessment and feedback. London: Routledge.
Irwin, B., & Hepplestone, S. (2012). Examining increased flexibility in assessment formats. Assessment & Evaluation in Higher Education, 37(7), 773-785.
Jandrić, P. (2020). The postdigital challenge of pandemic education. Journal of Contemporary Educational Studies, 71(4), 176-189.
Jisc (2021). Assessment Rebooted: From 2020’s quick fixes to future transformation. Available at: Assessment rebooted | Jisc (last accessed 29/11/2022).
Kopp, R., Dhondt, S., Hirsch-Kreinsen, H., Kohlgrüber, M., & Preenen, P. (2019). Sociotechnical perspectives on digitalisation and Industry 4.0. International Journal of Technology Transfer and Commercialisation, 16(3), 290-309.
Knox, J. (2019). What does the ‘postdigital’ mean for education? Three critical perspectives on the digital, with implications for educational research and practice. Postdigital Science and Education, 1(2), 357-370.
McArthur, J. (2023). Rethinking authentic assessment: work, well-being, and society. Higher Education, 85, 85-101. https://doi.org/10.1007/s10734-022-00822-y
Nieminen, J. H. (2022). Assessment for Inclusion: rethinking inclusive assessment in higher education. Teaching in Higher Education, 1-19.
Nieminen, J. H., Bearman, M., & Ajjawi, R. (2022). Designing the digital in authentic assessment: is it fit for purpose? Assessment & Evaluation in Higher Education, 1-15.
Price, M., Rust, C., O’Donovan, B., & Handley, K. (2012). Assessment Literacy: The Foundation for Improving Student Learning. Oxford: The Oxford Centre for Staff and Learning Development.
Rapanta, C., Botturi, L., Goodyear, P., Guàrdia, L., & Koole, M. (2020). Online university teaching during and after the Covid-19 crisis: Refocusing teacher presence and learning activity. Postdigital Science and Education, 2(3), 923-945.
Ryan, A., & Tilbury, D. (2013). Flexible pedagogies: New pedagogical ideas. York: Higher Education Academy.
Sambell, K., McDowell, L., & Montgomery, C. (2013). Assessment for Learning in Higher Education. Abingdon: Routledge.
Sambell, K., & Brown, S. (2020a). Contingency-planning: exploring rapid alternatives to face-to-face assessment. Available at: Kay Sambell and Sally Brown: Covid-19 Assessment Collection - Sally Brown (sally-brown.net) (last accessed 29/11/2022).
Sambell, K., & Brown, S. (2020b). Fifty tips for replacements for time-constrained, invigilated on-site exams. Available at: Kay Sambell and Sally Brown: Covid-19 Assessment Collection - Sally Brown (sally-brown.net) (last accessed 29/11/2022).
Sambell, K., & Brown, S. (2020c). The changing landscape of assessment: some possible replacements for unseen time-constrained face-to-face invigilated exams. Available at: Kay Sambell and Sally Brown: Covid-19 Assessment Collection - Sally Brown (sally-brown.net) (last accessed 29/11/2022).
Smith, C. D., Worsfold, K., Davies, L., Fisher, R., & McPhail, R. (2013). Assessment literacy and student learning: the case for explicitly developing students’ assessment literacy. Assessment & Evaluation in Higher Education, 38(1), 44-60.