Governors bring a wealth of disparate expertise to the board table, but can sometimes fall short when it comes to their college knowledge. Lawrence Vincent provides 10 key questions governors should be armed with at their next meeting with the principal.

College governing bodies have a responsibility to ensure that the core function of the college — teaching, learning and assessment — is of the highest standard.

How then can college governors discharge this responsibility effectively? With great difficulty, I would argue, without the help of the expert knowledge that sits within the college’s senior leadership team.

It is easy for a principal to build ‘pseudo trust’ with their board by providing carefully controlled data sets that look impressive but give only a surface-level view of the effectiveness of teaching and learning.

My Bournemouth and Poole College senior leadership team has built enormous trust with our board thanks to a set of questions under the heading ‘The 10 difficult questions no principal wants to be asked.’

This is, I accept, a masochistic approach to building trust and although these questions cause us serious discomfort, they have enhanced my board’s understanding of teaching and learning no end.

Firstly, how many students dropped out before Day 43, from which areas and why?

This question moves the discussion away from a glib set of standard answers and opens up a more fruitful line of enquiry. For example, is there a correlation between late enrolments and drop-outs? Are some areas of the college more aberrant than others and does this suggest sloppy interview and induction practices? Is there a culture of pressure to achieve enrolment targets that is then corrupting the integrity of student recruitment?

Secondly, how many 16 to 18-year-olds enrolled without maths and/or English grade C and what percentage of these are currently in maths/English classes?

The speed with which students who need maths and English are identified and placed in classes reveals crucial information about whole college understanding and support of the maths and English policy, and about the initial assessment process.

Thirdly, what are student attendance rates for maths and English classes?
Whole college attendance figures mask the fact that attendance at maths/English classes is nearly always lower. This question separates out the data and opens the way to establishing what is being done to drive up maths and English attendance.

Fourthly, what is the lesson observation profile for all maths and English classes?

Again, whole college analysis of lesson observation grades often disguises the weaker profile in maths and English classes.

Fifthly, what is the lesson observation profile for all new teachers in their first year?

Support for new teachers is often weak. By concentrating on them as a distinctive group, wider questions can be asked about the whole college approach to supporting new teachers. This includes ensuring poor performers do not drift through their probationary period.

Sixthly, and for the same reasons, how many new teachers are there and how are they supported?

Seventhly, how many teachers at any one time are being formally disciplined/performance managed? How long is each case taking and why?

This question demands hard evidence that poorly performing teachers are being carefully managed, and throws light on the amount of time each case is taking.

Eighthly, which 10 full-time courses have the weakest student progression statistics and what is being done about it?

This moves the scrutiny away from a sole focus on success rates and generates interest in progression rates as a measure of effectiveness. It throws up some very interesting and uncomfortable ‘why are we offering this course’ conversations.

Ninthly, what percentage of full-time 16 to 18 students have undertaken a work placement?

This is rarely looked at but is key within the new study programmes. Asking it creates a whole college view of work experience and often reveals highly inconsistent practice.

And finally, how does the internal lesson observation profile compare with externally-validated lesson observations?

Internal lesson observation profiles are often dangerously inflated. College boards need to know that leadership teams are not fooling themselves. Scrutiny here forces external validation of lesson observations and allows valuable comparison.