Cash application form for free school meals

The Education Funding Agency (EFA) has issued a new application form for providers to dish out free school meals cash to needy learners.

The government wants disadvantaged 16 to 18-year-olds to be fed either by providers themselves, for which it has earmarked funds to develop kitchen facilities, or through a credit or voucher scheme for outside caterers.

However, it conceded in April that some learners may have to be given cash in “exceptional circumstances”, which it listed at the time, and it also said there could be further situations where learners were handed money.

The EFA’s new document, issued last week, is aimed at providers in this latter situation. It lays out how providers can, from September, give out the minimum of £2.41 per meal for special, but undefined, reasons.

Piran Dhillon, Association of Colleges public affairs officer, said: “The conditions [for cash payments] set by EFA are challenging but allow some flexibility to colleges in how they deliver the meal entitlement.”

She added: “We are pleased EFA has taken on board concerns following visits to colleges, which have resulted in some improvements to the scheme.

“This has included a higher cost per meal (£2.41 rather than the £2 suggested initially) and acceptance there needs to be exceptions to the cashless transaction rule.”

Paul Warner, Association of Employment and Learning Providers director of employment and skills, said the new guidance reflected how “a great deal of training takes place away from institutions with catering facilities”.

Potential fraud cases with SFA rocket

The number of potential fraud cases referred to the Skills Funding Agency (SFA) more than doubled between 2012-13 and 2013-14.

In the SFA’s annual report and accounts for 2013-14, it said that 108 new “allegations of financial irregularity” were considered, compared to 52 the previous financial year.

In total, 132 allegations were made in 2013-14, but the SFA said 25 of them related to a single case, although it declined to identify the provider; counted as one case, those 25 allegations bring the total down to 108.

The report said: “Therefore, the total number of new cases considered during the period was 108.

“There were 40 cases (of which 21 were investigations) brought forward from 2012-13 and a further six allegations brought forward that had not been entered into the vetting and assessment process as at April 1, 2013.

“During 2013-14, 37 investigations were closed and, as at March 31, 2014, there were 18 investigations ongoing and 22 cases at the vetting and assessment stage.”

An SFA spokesperson told FE Week: “We are continually reviewing and strengthening our governance and business processes, particularly in relation to allegations of financial irregularity.

“At the beginning of this year, we changed the way we report on allegations and now also include those that were referred to other agencies.”

The report for 2012-13 had said that 41 allegations were brought forward from the previous year and 52 were made to March 2013.

No case was referred to the police in 2012-13, but one case was passed on to officers in 2013-14, although they decided not to take up a criminal investigation, according to the SFA.

An AoC spokesperson said: “The SFA spent more than £4bn in 2012-13 on more than 2,000 organisations via some 40 programmes.

“We see SFA’s investigation work as being necessary to protect the public purse. The issue here isn’t the number of investigations that have been conducted but what the outcomes were.”

Stewart Segal, Association of Employment and Learning Providers chief executive, said: “We fully support the zero-tolerance approach to potential fraud taken by the SFA.

“There are very few cases of fraud in the sector and as the annual report says most of the errors found in audits are misapplication of funds or where there is insufficient evidence.

“The error rates of SFA audits are around 1 per cent, which shows that providers meet the complex rules of funding and evidence.

“The process for dealing with any funding issues needs to be robust, effective and transparent and it is good news that there are very few cases where there is evidence of fraud.”

Hub system ‘ready’ to calculate funds

The Skills Funding Agency will be hoping that long-standing problems in developing new funding software are at an end as it prepares to use the new Hub data collection system to calculate provider payments for the first time.

The agency’s revamp of its data collections and funding system was due to have been completed 11 months ago.

But ongoing problems have meant the continued use of the old Online Data Collection (OLDC) system since September as a crutch for its replacement, the Hub.

However, the SFA has announced that the Hub will step up to the plate for the R11 data return, due in by July 4, taking on responsibility for calculations.

An agency spokesperson said: “We intend to continue to run the two systems in parallel until we consider it appropriate to formally de-commission the old system.”

A spokesperson for the Association of Employment and Learning Providers said: “Providers seem sanguine about dealing with the issue. We hope however that a switch to a single system can be made soon.”

An Association of Colleges spokesperson said: “It’s imperative accurate information is on record and a back-up system during this transition period offers some reassurance.”

World Class Apprenticeships

Download your free copy of the FE Week 16-page supplement on World Class Apprenticeships, in partnership with OCR.

Click here to download (10MB)

In the 21st Century, ‘apprenticeship’ is truly an international word. Almost every English-speaking country in the developed world has an apprenticeship programme, and Central Europe leads the way on learning that combines on-the-job training with qualifications.

But when you drill down to the differences between countries and their programmes, the gaps could not be wider.

Exploring those differences was the purpose of the first world class apprenticeships study tour, organised by the International Skills Standards Organisation (INSSO), which led delegates from Northern Ireland, South Africa, New Zealand, Canada and the United States on a journey of discovery earlier this month.

I was lucky enough to accompany the group, and in this supplement I aim to report back on the lessons we learned on our tour of America and Canada.

The Federation for Industry Sector Skills and Standards (FISSS) produced a helpful report on apprenticeships in English-speaking countries last year, from which we present some information on page three to set the scene.

Our first destination was Washington DC’s Urban Institute, where speeches from apprenticeships expert Dr Bob Lerman and US Labor Secretary Thomas Perez shed some light on the inside view of apprenticeships, or rather the lack of them, in America. These are covered on pages four and five.

On pages six and seven, we explore the Canadian system, which is governed by the all-powerful Red Seal programme. We also hear from Sarah Watts-Rynard in the first of a series of transatlantic expert pieces aimed at opening up the debate on global apprenticeship policy.

After Canada, our trip took us to South Carolina to investigate one of the US’s real apprenticeship success stories. See pages 10 and 11 for the employers’ view, and another expert piece from Apprenticeship Carolina’s Brad Neese.

Our tour was led by Labour MP John Healey, whose link to the FE and skills sector remains strong a decade after his term as England’s first adult skills minister ended. We feature some of his speech to the Urban Institute, along with an exclusive interview, on pages 12 and 13.

Finally, on pages 14 and 15, we have a debrief with INSSO chief executive Tom Bewick and other delegates on how the tour shaped their views on apprenticeships in the global arena.

In defence of UCU’s graded lesson observation report

Former Ofsted FE and skills inspector Phil Hatton was critical of a report from University of Wolverhampton academic Dr Matt O’Leary that raised “serious questions about the fitness for purpose” of graded lesson observations. This is Dr O’Leary’s response to Mr Hatton.

The University and College Union (UCU) research project into the use and impact of lesson observation in FE recently came in for some criticism from Phil Hatton.

The project is the largest and most extensive of its kind carried out in the English education system and as such marks an important milestone in lesson observation research.

However, Mr Hatton seemed more intent on damning the report than seriously engaging with its key findings. This is disappointing but not surprising given that Mr Hatton seemed to have a particular axe to grind and, as it turns out, has not even read the report.

Mr Hatton describes himself as a ‘scientist’, yet there is a noticeable lack of empirical evidence or systematic argument in his article, much of which is based on personal anecdotes.

The fact that he dismisses the real experiences and views of thousands of UCU FE members displays a high level of contempt towards them. That he should also compare them to ‘turkeys’ voting for Christmas is an insult to the very serious issues raised in this research.

Whether or not he disagrees with the views of UCU members, to belittle them is disrespectful and irresponsible. It is clear Mr Hatton has not read the report in full and thus draws on his pre-established prejudices to support his argument, the antithesis of a ‘scientific’ approach.

Mr Hatton takes issue with the representation of college managers in the study. The research sample included UCU members nationally. The fact that senior college managers comprised a small percentage of that sample is a reflection of the composition of UCU’s membership.

It has nothing to do with excluding a specific population group from the research, as Mr Hatton seems to imply in his comments.

A sample can only be drawn from the population in question. If Mr Hatton were to make the effort to read the report in full, he would indeed find there are numerous instances in which the views and voices of senior managers are included, often conflicting with those of teaching staff.

His comments suggest that he has little understanding of research methodology. If he did, he would know that to reduce threats to the validity and reliability of any research, the methodology should be made explicit and transparent for all to see, so that a judgement can be made on what data was collected, from whom, and how it was collected and analysed.

Once again, had he read the report, he would realise that there is a section which discusses this in detail and is open to the external scrutiny of any reader.

Mr Hatton states: ‘I am very simplistic about my expectations of the FE system’. His simplistic position is not restricted to his expectations of FE, but extends to his conceptualisation of the way in which observation is used as a method and its role in informing judgements about professional competence.

In referring to a system of observation that he introduced at a college where he was responsible for managing quality, he conflates the grading of performance with ‘identifying and spreading good practice’, as though grading were unproblematic and uncontested, to say nothing of the disputed notion of what constitutes ‘good practice’.

However, in his defence, he does state that this was 18 years ago.

Times have certainly changed considerably since then and the failure to acknowledge the increasingly high stakes nature of graded observations in FE is merely one example of how out of touch he appears to be with the current debate.

His claim that ‘if you cannot put on a performance with notice, there has to be something very lacking in your ability’ is very revealing about Mr Hatton’s views of the purpose of observation.

He is right about associating the use of summative observations with ‘performance’. A key theme to emerge from the research data was the inauthenticity of the performance element of isolated, episodic observations.

There were repeated examples of ‘poor’ teachers raising their game for these one-off observations, only to go back to their poor practice for the rest of the year.

In contrast, some consistently effective teachers were so unnerved by these high stakes observations that they seemed to go to pieces during the observed lesson.

Thus the important lesson here is that performance-driven observations are an extremely unreliable means of attempting to assess and measure professional competence.

His final claim that ‘the best way of gauging the quality of the experience of learners is to observe what they are getting in a quantitative way, in a transparent way’ would seem a commendable suggestion, but it is one that belies the complexities of teaching and learning and seeks to measure them in a reductive and ultimately unreliable manner.

Let us continue to use observation to inform judgements about the quality of teaching and learning, along with other sources of evidence.

But let us also acknowledge its limitations and accept that the grading of performance is little more than a pseudo-scientific practice that gives rise to some very serious counterproductive consequences for the well-being of staff.

Dr Matt O’Leary, principal lecturer and research fellow in post-compulsory education at the University of Wolverhampton’s Centre for Research and Development in Lifelong Education (CRADLE), and author of Classroom Observation: A Guide to the Effective Observation of Teaching and Learning

Colleges under fire over Gazelle’s £3.5m

A month-long FE Week investigation into the multi-million pound funding of Gazelle by UK colleges has prompted criticism that public money is being used on “expensive initiatives which have little educational impact”.

The group’s five founding colleges have dished out more than £530,000 each to Gazelle, according to figures obtained under the Freedom of Information Act.

More than 20 current and former member colleges were asked what they had spent on the organisation, which was launched in January 2012 with standard annual membership priced at £35,000.

Gazelle, which raked in around £3.5m from colleges, claims to “develop innovative new learning models and new partnerships with business to deliver an improved outcome for students, their communities and the economy”.

Its chief executive, Fintan Donohue, said the “enrichment of student experiences and outcomes” was its “overriding goal”.

But no independent research has been carried out into whether learners benefit, while of the 11 Gazelle colleges inspected since 2013, six were rated as good, four were told they required improvement and one was branded inadequate. Four of these results were an improvement on the colleges’ previous inspections, one was a decline and the rest were unchanged.

The findings of the FE Week investigation have prompted University and College Union general secretary Sally Hunt to question the sums of cash being handed over by colleges.

“At a time of financial pressures on colleges across the UK, students and staff alike will be dismayed at how much is being paid by some institutions for Gazelle membership which seems to have little impact when it comes to improving learner experience,” she said.

“The amount that some colleges are paying Gazelle seems incredible given the apparent lack of return on investment for the institutions involved. We would seriously question whether this is resulting in a better education for learners.

“Colleges should focus more on ensuring better learning environments for students and working environments for staff, and less on expensive initiatives which have little educational impact.”

The highest-paying Gazelle college was grade three-rated Gateshead, one of the founders, which handed over £642,000.

The payments included £120,000 for “purchase of educational concept” and more than £22,000 for staff development and student activities, but deputy principal John Holt defended the contract.

He said: “As a college we place considerable value on key aspects of the Gazelle membership and activity.”

He said the benefits included the formation of pro-active development groups across key areas of curriculum innovation, engagement of students in national competitions, exposure to business and entrepreneurial expertise and innovation in teaching and learning.

The remaining founder colleges were Warwickshire, City College Norwich, New College Nottingham and North Hertfordshire, whose former chief executive, Mr Donohue, stepped down last year to focus on his role as Gazelle chief executive.

He said: “If our mission was simply to immediately improve Ofsted grades, we would invest our resources quite differently. Nevertheless, in the long term our expectation is that the creation of entrepreneurial learning and leadership will deliver enhanced Ofsted ratings — and among our 23 colleges, 18 are already rated as good or outstanding for leadership and management.

“Gazelle colleges recognise that the current funding challenge faced by the sector requires not just frugality in spending, but the investment of resources into ventures and partnerships that can deliver new revenue streams. That, alongside the enrichment of student experiences and outcomes, will remain our overriding goal, one that is fully supported by a fast-growing membership group.”

Editorial 

Gazelle leap of faith

It would be hard to disagree with the UCU’s view that Gazelle is an expensive initiative, with member colleges paying at least £35k a year in fees.

And that’s just the basic amount. One founder member of the group, for example, has splashed out more than £650k since 2012.

The group may be worth these eyebrow-raising figures, but where is the evidence?

How many more membership fees and other costs will be handed over without good reason to expect some kind of quality return?

Given this is public money that principals are paying out here, it’s only right that independent research be carried out into what effect, if any, Gazelle has on its member colleges.

The sums being given to this organisation make it quite some leap of financial faith by colleges.

But the sector has a self-improvement body in the Education and Training Foundation — perhaps it’s the one to look at whether £3.5m has been well spent.

After all, being business-minded, as Gazelle claims to promote, wouldn’t you want to know what bang you’re getting for your buck?