Technology has transformed further education. But at what cost, asks JL Dutaut. The history of edtech is, after all, a story of failures
Paul Feldman, chief executive of Jisc, further education’s main technology body, wrote last month of the sector’s role in addressing the demands of a changing world of work and the importance of technology in meeting the challenges (FE Week, 25/10/19). But neither Jisc nor the Independent Commission on the College of the Future (ICCF) that it supports is the first to consider them.
In November 1982 Margaret Thatcher announced the creation of a technical and vocational education initiative (TVEI), the first major intervention in curriculum by a British government.
Its drive? Amid high youth unemployment and a changing world of work, “to improve our performance in the development of new skills and technology”.
TVEI was in effect for 14 years. It was responsible for pushing forward the success of the BBC Micro computer; some credit it for the development of the UK’s thriving animation industry. More than that, it is cited as an example of positive change management.
TVEI was implemented regionally. Local authorities were responsible for developing curricula that were tailored to local employment sectors, and they did this in partnership with educators and industry. It was a curriculum-led model focused on innovation, rather than an assessment-led model like the GNVQ policy that grew to replace it. It saw the advent of computer rooms in schools and staff training to make effective use of them.
Command and control
When I began my teaching career in an FE college in 2004, the media department was a far cry from one in which students worked on “industry-standard” technologies. The wrong hardware and software were just the start.
The popularity of the media courses secured us some investment, and before long we had two bespoke specialist computer suites. The impact on the quality of the learners’ experience and the work they produced was immediate, but it wasn’t sustained.
The reason? Redundancy.
Given the option of an ongoing investment – an operational expenditure – in the form of a favourable lease agreement (much like a mobile phone contract, and even including set-up and servicing), the college chose a one-off purchase instead.
Failure to grasp the systemic nature of technological progress was endemic
Whether colleges buy or lease their technology is a decision specific to each college and each investment. No regulation prescribes how they spend their core funding in this respect.
Technically, then, this was a leadership decision. It meant that the college could catch up with industry in costly lurches and spurts, but could never keep up at a steady pace. The assets depreciated at an alarming rate from the moment they were purchased. Worse, rather than support learning, they quickly began to hinder it.
A catch-22 arose. The courses’ attractiveness to prospective students depended on repeated injections of large capital expenditure, so they couldn’t run without it. No capital, no students. No students, no capital. The victim, either way, was curriculum.
But it isn’t quite right to place this at the door of college leaders alone. The Sixth Form Colleges Association’s James Kewin notes: “As capital funding from government is limited to bricks and mortar, colleges have been forced to use their dwindling core funding to invest in technology.”
That core funding is based on lagged student numbers, and on constantly fluctuating per-student rates and programme costs. If equipment is non-specialist, or a course is perennially popular, it’s easier to make sustainable investment decisions, but this is much harder for specialist equipment and courses where student demand might fluctuate. Kewin states: “The uncertainty of year-on-year funding coupled with how low it is means it is very hard to be strategic about any of this.”
It is also an approach to technological investment modelled by politicians. The year we got our Mac suites, Charles Clarke, then education and skills secretary, demonstrated exactly what not to do when he announced a now broadly derided £25 million “investment” in interactive whiteboards (IWBs) for schools. Gone was any pretence of change management. Failure to grasp the systemic nature of technological progress was endemic.
All the while, technology had nonetheless transformed leadership in other ways. Since 1997, the policy paradigm of Tony Blair’s government had been reducible to one word: deliverology. Technology had empowered the collection of data for assessment and monitoring purposes on a previously unimaginable scale.
Today, it is dwarfed by the potential of big data, but by 2004 new practices had already emerged that still shape the sector. The summary judgment of classroom practice using tick-box proformas, for example, was already routine.
The monitoring of every aspect of lecturers’ practices is one effect of technology that has been sustained. By 2016, among the top 20 contributory factors to teachers’ workload in a major University and College Union (UCU) survey, five could directly be put down to the impact of technology, including the top-ranking, “increased administrative work”.
The major selling point of the first wave of technological ingress into education had been the streamlining of workflows. Instead, where any time was saved, new tasks had filled the gaps, made possible by a technologically empowered managerialism and evidenced by swelling email inboxes. The second-most cited cause of workload in UCU’s 2016 survey: “Widening of duties considered within my remit.”
There’s an app for that
If the first wave of edtech was characterised by placing terminals in front of teachers and plugging them into the zeitgeist of an industrial revolution that required the sector’s response, the second wave can best be understood as an era of loosely supervised free play. It is a shift with which policymakers are only just getting to grips.
It was at the BETT show in 2004 that Clarke announced his IWB policy. It was with reference to the same event – a buzzing marketplace of solutions looking for problems as much as the other way around – that Damian Hinds wrote in The Daily Telegraph last year: “With around a thousand tech companies selling to schools, it’s by no means easy to separate the genuinely useful products from the fads and the gimmicks.”
A year after Clarke’s announcement, a DfES paper entitled Harnessing technology – transforming learning and children’s services encouraged the use of virtual learning environments (VLEs). Moodle, an early platform, and still the one with the largest market share in the UK, grew exponentially in England’s FE sector.
There is an often unseen investment of time and energy by lecturers
Ofsted’s 2009 review of VLEs concluded that “there was no consistency”. In the colleges surveyed, Ofsted found “substantial duplication of effort”, a “waste of the potential of VLEs”.
Worse was the waste of teachers’ potential. Ofsted found no provider with a quality assurance system for its VLE; such arrangements were left to tutors and heads of department. The common factor in effective VLEs was “the enthusiasm of the subject teacher”.
Today, the VLE market is more diverse and the infrastructure and functionality greatly improved. But the same problems persist, and they do so across a much bigger field than simply VLEs.
The second edtech wave introduced a plethora of other start-ups and apps to simplify and gamify almost all aspects of teaching and learning. Each adoption represents a much greater investment than subscription costs – an often unseen and under-appreciated investment of time and energy by lecturers.
Meanwhile, with all the support and investment of leadership teams, the second wave also brought management information systems such as Capita’s UNIT-e. While technology to support teaching and learning has splintered into a baffling array of consumables of varying quality, tools to monitor every aspect of educational institutions have concentrated and sharpened.
The third wave
Before ICT was cool, the initials didn’t stand for information and communications technology but for information and control technology – the DES referred to it as such in its 1981 pamphlet, The School Curriculum. That pamphlet would ultimately shape Thatcher’s 1982 announcement.
In a sign that the shift hasn’t quite happened, the pamphlet uses the same language of curriculum as Ofsted’s newest framework, some 38 years apart – fundamental values, intent, implementation and impact (or synonyms thereof).
By and large, since 2010, governments have stepped back from the centralised control of edtech. That may be a welcome respite for the sector, but a lack of leadership can be just as problematic, and the market pressures on the profession have continued.
Only a year ago, Hinds published a workload review that, while admonishing leaders to “ditch email culture”, also dangled £10 million before technology companies to urge them to innovate ways to reduce teacher workload.
The third wave is already barrelling over the education sector
As politicians still play in the receding waters of the first wave, with the crash of the second wave still in the distance, the third is already barrelling over the education sector.
With big data and algorithms, eye-tracking goggles and attention-monitoring headsets, facial recognition and body cameras, and exponentially more powerful tools to “personalise learning”, deep ethical concerns should give pause.
Published today, the ICCF’s progress report states that edtech “will require a radical shift for colleges away from course delivery towards a more personalised service”. If it is to avoid the crystal-ball gazing and Silicon-Valley utopianism that have become clichés of policymaking in this area, it must consider these successive waves and the emerging patterns in the sand as they recede.
Dystopias are just as likely as their idealistic opposites, and the unsustainable toll that the first two waves have taken on the profession suggests they may be even more likely. The human-centred education that Paul Feldman is calling for may just have to start with teachers, and rebuild an ethos of change management.