Dr David Pendleton, Professor in Leadership

Algorithms are in the news.  An algorithm is a set of rules or a procedure for solving a problem.  Who would have thought that such an esoteric thing as an algorithm would be getting so many people so hot under the collar?  We all use algorithms all the time, following the same steps to achieve a goal.  In a sense, a habit is an algorithm, though the problem we are trying to solve may be rather ill-defined in that case.  A checklist may also meet the definition.  Airlines use stepwise checklists on every flight to land the aircraft safely, for example.

So, what is the problem with algorithms?

What is the problem with algorithms such that resignations are called for and heads have to roll?  Surely the issue is that some algorithms, rather than solving a problem, make matters worse.  The recent examinations furore was created because an algorithm designed to correct for inconsistent marking across the country actually made matters worse, apparently creating significant injustice for many students.

So, what goes wrong with algorithms? 

Several factors can cause them to fail.  First, they may not have been quality checked to ensure the steps are the correct steps.  As I discovered yesterday, a set of instructions (another kind of algorithm) for assembling a desk chair had two different lengths of screw the wrong way around, so the screws did not fit.  The error was easy to spot and easy to correct, but the instructions were wrong.  Incorrect algorithms have little to commend them.  Second, algorithms may not take all factors into account.  An algorithm that states ‘if A happens then do step 1, if B happens then do step 2’ is no good if C happens.  Incomplete algorithms fail some of the time.
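The incompleteness problem can be sketched in a few lines of code.  This is a hypothetical illustration of the ‘if A, do step 1; if B, do step 2’ example above, not anything from an actual exam or assembly algorithm:

```python
def incomplete_algorithm(event):
    # Handles only the cases its designers anticipated.
    if event == "A":
        return "do step 1"
    if event == "B":
        return "do step 2"
    # If C happens, the algorithm silently gives no answer at all.
    return None

def safer_algorithm(event):
    # Same rules, but unanticipated cases are routed to a human
    # rather than falling through unnoticed.
    if event == "A":
        return "do step 1"
    if event == "B":
        return "do step 2"
    return "escalate to a person"
```

The difference is the final line: the safer version admits what it does not know, which is exactly the role of the humans who should review an algorithm’s output in individual cases.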

More fundamentally, there is the generalisability problem.  An algorithm created from population data may work at the population level but should be applied only with care to any individual in that population.  Epidemiologists can make algorithm-based predictions about the course of a disease in a population.  They will tell us, rightly, that smoking leads to lung cancer, but both my grandfathers smoked like chimneys, lived well into their 90s and died of something else.

This was the problem with the algorithms used to ‘correct’ the A-level grades this summer.  I did not see the algorithms used, but it is highly likely that they were designed by experienced, bright and well-intentioned professionals.  It is highly likely that they were checked against previous populations before being applied to the current one.  But, while algorithms can create guidelines to be considered, it will always be fraught with danger to apply them to individuals without humans taking special circumstances into account.  There will frequently be specific, uncommon factors to consider, such as the school, reported in the news, that had changed its admission criteria two years earlier to ensure a particularly bright intake.  The algorithm ‘corrected’ what looked like inflated marking and pulled the school’s grades down because it did not ‘know’ about the special circumstances.

Algorithms are neither bad nor good

They are inevitable in a world that is complex and needs to move at pace.  But they often need to function as guidelines rather than being applied without thought to individual cases.

And the reason I was moved to put all this on paper is that this morning I was sent an automated email from Amazon recommending a book to me, for the second time.  It said, ‘We have a recommendation for you.  Based on your recent activity, we thought you might be interested in this (book): Leadership: all you need to know.’  I am, after all, Professor in Leadership and buy books on the subject.  So what is wrong with this recommendation?  I wrote the book!
