Tuesday, May 09, 2017

The End of Computing My Friends


Computing in schools, has it gone terribly wrong?

It was back in 2013 that I wrote a blog in ComputerworldUK entitled The End of ICT my Friends, which presaged the wholesale deprecation of ICT in the school curriculum by the then Education Secretary, Michael Gove.

The reasons for ICT’s demise were easy to understand. ICT had become little more than training in MS Office applications (Word, Excel, Access, Outlook and PowerPoint). Proficiency with these applications had been identified by the previous Labour Government as an essential modern workplace skill, and through its quango BECTA it had zealously introduced (suitably equipped) computers to all schools.

Then it all went pear-shaped. In 2011, the famously ‘flabbergasted’ Eric Schmidt, Chairman of Google, lambasted the almost complete absence of computing in the curriculum of the country that produced Turing and so many of computing’s pioneers. And, in the tradition of the double whammy, it had by then become painfully obvious that the cost of Microsoft Office licensing to education was, shall we say, poorly controlled.

The rest, as we say, is history: ICT was given the chop, the BCS (British Computer Society) was summoned, MIT’s Scratch programming was endorsed by all, and by 2015 62,000 students (nearly 15,000 of them girls) took GCSE Computing.

And there the story should end: all British mini-Turings being turned out on a regular basis, evil Microsoft consigned to ignominy, with only Geography trying to keep ‘Death by PowerPoint’ alive. But, and there is always a but, all may not be well as we approach the revamped 9-1, ‘demanding’, world-class GCSEs of 2016 (viz. ‘hard’), the GCE A Level having been updated in 2015 in a similarly macho way.

You guessed it: we are struggling with numbers, the number of students wanting to do computing, that is. The new GCSEs and GCEs are hard and getting harder. It looks like every crusty/nerdy/Stack Overflow BCSesque illuminatus has put their pet ‘indispensable’ knowledge into every nook and cranny. This is déjà vu all over again. The same thing happened when the GCSEs were created 30 years ago: ‘experts in their fields’ created a curriculum that took 210% of the available time. Maggie Thatcher never forgave education experts for their idiocy.

So pity the poor ICT teacher trying to convert to Computing at their school; even a Comp Sci graduate will still have to do his prep to teach much of the syllabuses (I say ‘his’ as the number of Comp Sci girls is very small). Except Comp Sci graduates already have jobs that pay better and are less stressful.

Don’t worry, we can fill the gap with folk from Eastern Europe, they have loads of coders looking for jobs (my colleague is from Slovenia); oops, we’ve Brexit to contend with, so that’s banjaxed that idea. Oh, and I forgot to mention: now that we don’t teach ICT, most of our youngsters come out without MS Office skills … as our parents constantly remind us. Good job immigrants do have these skills … rats! I forgot about immigration targets. Maybe we should not be looking to schools for the future of British coders; obviously we will find them in the new UTCs (University Technical Colleges). What? They’re only half full. Oh dear.

Did I mention that the small uptake numbers for computing mean small option groups, which mean big costs for schools (1 teacher to 3 students at GCE? 1 to 7?) during a funding crisis?

So here we go again: ‘The End of Computing my Friends’.

I give it two years.

Tuesday, May 02, 2017

Homo Algorithmicus (AI for Monkeys)


The ascent of Homo Algorithmicus

The modern workplace, be it the operating theatre, the city office, the sports field or even a race track, is a highly competitive environment. Here responses to stimuli are finely tuned, honed to operate at maximum efficiency and gain advantage. Often these environments now respond in accordance with a ‘play-book’: a book of rules which has been refined over many years and which embodies the very best of practice. The concept of ‘marginal gains’, familiar to most of us today, implicitly acknowledges this refinement and at the same time hints at a limiting ceiling of perfection.


I once listened with great attention to an international rugby player lament a loss with the words “at that stage in the game after the second breakdown (in play) we take a scrum ... but we took a kick and the lads didn’t know what to do”. Something jarred in me, and not for the first time: they failed because of a failure to follow the algorithm?


I have worked in a school of 4,000 16-18 year-old students which produced great results through its systems of teaching, learning and operation. Conformity to the model produced the results required at A Level, and the place was duly rated an ‘outstanding’ establishment. Clearly systems work. So what’s my problem?


There are many more examples of success in the vein described above, and it does not take a genius to spot the emergence of ‘algorithms of success’: there is a smart way of doing things in these contexts.


Artificial Intelligence ‘bots’ obviously use smart algorithms; it’s hardly saying anything at all to point this out, and increasingly, unsurprisingly, they are finding a place working alongside us (when not actually taking our jobs). This is to be expected, because we and they are on convergent cognitive paths. I have pointed this out in a previous blog, but to reiterate explicitly: increasingly robots ‘think’ like us and we ‘think’ like them.


I call this AUI, or Acquired Un-Intelligence. From the dawn of time, an organism has had to acquire a set of useful algorithms of behaviour in order to survive in a dynamic (i.e. changing) environment. Shine a light on a woodlouse and it quickly (for a woodlouse) moves into the dark … and so on through a million examples from Nature. We humans have complex brains, and we can move from such simple ‘rules of thumb’ to more elaborate and complex behaviours, thanks to the fact that they can be written down, thought about and refined by a ‘thousand eyes’ distributed globally.
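The woodlouse’s rule of thumb is about as simple as a behavioural algorithm gets; a minimal sketch in Python (the function and its names are purely illustrative, not from any real library) shows just how little machinery it needs:

```python
# Hypothetical sketch of a hard-wired stimulus-response rule,
# as in the woodlouse example: if it's bright, head for the dark.

def step(light_level, position):
    """One tick of the rule: retreat one unit while the light is on."""
    if light_level > 0.5:          # stimulus detected
        return position - 1        # move towards the dark end
    return position                # otherwise, stay put

pos = 10
for light in [0.9, 0.8, 0.7, 0.2]:  # light fades as the creature retreats
    pos = step(light, pos)
print(pos)  # 7: it moved three steps while the light was bright, then stopped
```

A ‘play-book’ is the same idea scaled up: more stimuli, more rules, but still a lookup from situation to response, with no understanding required.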


Oh good.


In my opinion it’s a mistake to mistake this ability for cognitive progress. Following even our smartest algorithm to the letter does not make us smart; if it did, we would have to stop calling machines dumb for doing just the same!


Another example of acquired dumbness is the rise and rise of ‘Big Data’, or rather the rise of the algorithms that ‘mine’ Big Data. Machines can look for patterns in vast data sets; finding patterns is one step from finding associations, which are of course correlations.


Increasingly, postdoctoral students, particularly in medicine, are given vast sets of medical data to research. In other words, they apply algorithms they do not understand to find the correlations they are looking for. Of course they find them, and we seem programmed to be unable not to attribute significance (even causal significance) to all but the most ludicrous associations.
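You can see why they always find them. A quick sketch in Python (everything here is made up for illustration: fifty ‘patients’, a thousand ‘variables’, all pure random noise) shows that mining enough meaningless variables will always turn up a respectable-looking correlation:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
n_patients = 50
outcome = [random.random() for _ in range(n_patients)]  # pure noise

# 'Mine' 1,000 equally meaningless variables for an association
best = max(
    abs(pearson([random.random() for _ in range(n_patients)], outcome))
    for _ in range(1000)
)
print(f"strongest 'discovery': r = {best:.2f}")  # typically well above 0.3
```

Nothing in that data means anything, yet the strongest of a thousand chance correlations looks like a finding. Multiply the variables into the millions, as Big Data does, and the ‘discoveries’ only get more impressive.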


I used to tease my students in the 1980s with the strong association between dyslexia in a child and the wearing of dungarees by mothers who drove Volvo cars. It’s real, so I advised mothers to avoid this clothing and to buy a Fiat. The correlation did not mention that dyslexia was being newly diagnosed in those times and was the preserve of a certain kind of middle-class family whose mothers would then have met the criteria above.

Homo algorithmicus is dumb and getting dumber. Machines are dumb but getting smarter. We need to nurture our creativity; we are at our best when we are makers: makers of art, tools, music, buildings and fables. Not followers of algorithms; that day is nearly done, and we make rubbish robots.