Tuesday, November 28, 2017

UK Computer Cheats




As a final comment on computing in UK schools: Ofqual has cancelled, mid-year, all non-examination coding assessments from all 9-1 GCSEs, due to widespread cheating by teachers and students in the (now legacy) ‘easier’ exams taken this summer.

Yes, cheating was and is widespread in GCSE coursework generally; teachers have become corrupt as a result of, amongst other things, having their pay linked to their GCSE grades through their annual performance ratings. Computing teachers barely exist at all, and press-ganged ICT teachers make as much use of the online forums containing the answers as the kids do (as Ofqual found out from the IP addresses).

Unfortunately for computing, it is almost impossible to assess coding skills through written exam papers, hence all except the IGCSE have programming coursework … or rather did have coursework. All coders I know code with one web-linked screen open on Stack Overflow, another on the docs and one on an IDE (integrated development environment) to format and debug as they go. None, as far as I know, do it all with pencil and paper as I did with punch cards in the past.

I moved my computing classes in 2016 to the IGCSE (International GCSE) run by the Cambridge board. No coursework, and deadly dull, but we saw a lot of this coming. The sheer difficulty of the new ‘hard’ 9-1 syllabuses put us off; god knows what the ICT teachers thought. The assessments I saw were obscure and difficult. It is blindingly obvious that this level of challenge will produce almost universal cheating.

Put yourself in this position:

  • you have been teaching ICT successfully for years, leveraging your MS Excel skills (the envy of your colleagues indeed), and are still keen enough to attend three days of Python training to upskill into the coding world;
  • you work in a bully-boy Academy/independent school with a ‘no one fails here’ ethos, teaching Y11;
  • your progression to the higher tiers of pay depends on your performance (ie the grades);
  • the GCSE suddenly gets a lot harder.

Enough said.

The UK government said in its Budget this autumn that it will recruit 8,000 new computing teachers … good luck with that. I’ll say it for the third and mercifully final time (I retire this year): computing in schools is finished. Ofqual delivered the coup de grâce this week.

Friday, November 10, 2017

UK School Computing is terminal (official)



It’s official: UK school computing is dying before it could even toddle.
The Royal Society reports this week:
  • Just 11% of students in England took GCSE computer science
  • 20% of them were female, and the figure fell to 10% at A level
Anyone who reads my blogs will know that I have been warning for some time that Computer Science is going to fail to establish itself as a subject in schools. There are three problems encapsulated in the bullet points above but for the time being I’ll let those points just hang there to be fully absorbed in all their bleakness.
At the school where I teach we offer Computer Science at GCSE and A level and introduce it in Y6. There are two of us in the department, both of whom can code to professional/hobbyist level in two languages. Girls make up our very best students from Y6 but fail to opt for it at GCSE (see above); forty years separate me from my colleague; I will retire this year and my colleague, from Eastern Europe, will leave after Brexit.
We struggle to work out exactly what the exam boards want for their wildly varying syllabuses, and our subject is the most expensive in the school due to the small numbers taking it compared with, say, Geography or History.
I write the anecdote above because it explains everything. There are vanishingly few UK computing teachers between 30 and 60 years old; there is little to no substantial training for existing teachers to fill that gap; the subject appeals to those with the geek mindset, which means a small uptake and a shortage of girls; and finally this means it’s expensive - schools are short of money, by the way.

That’s it folks, stop the hand-wringing: none of the above will be solved anytime soon. Even if you inject loads of cash we’ll just buy toys with it, and the training courses will simply prove to the trainees that this is not for them.

And if you (misguidedly) make it ‘girl friendly’ you’ll patronise both women and the subject with pink-eco-friendly-code ... even ‘relational’ databases are so yesterday.
The computer-trained Euros are going home and the ZX80 generation are dying off. Just get over it and think afresh.

There is a solution, and computing should lead the way. The UK is a third-world country with regard to teaching computing, so we need third-world tech solutions. All schools have broadband and whiteboards. The exam boards themselves should directly employ teachers to deliver lessons to any signed-up school in the 4pm to 5pm slot: done. Google Classroom will provide the glue. ‘But what about marking?’ I hear you say. Don’t be daft, do you want to get rid of the teachers all over again?

Tuesday, September 19, 2017

The End of UK School Computing 2


Recently I wrote a post in which I argued that the much trumpeted reboot of computer science in UK schools to replace the mocked ICT qualifications was going to end in tears.
Well, now into my third year of teaching the new Computer Science syllabuses, I am more convinced than ever that it’s all gone horribly wrong. Being a generous soul I give it two more years.

Here are a few problems to mull over:

Raspberry Pi: what a wonderful British Linux machine this is, and what a glorious world it has opened … but for whom? The school children it was aimed at? Not at all: the Raspberry Pi Bs are all in a drawer. Pi sales are very healthy indeed, but worldwide, to hobbyist crusties like me. It’s easy to see why: just look at the syllabuses.

The syllabuses at GCSE and A level are very diverse, especially with regard to coursework and assessment. One GCSE syllabus had substantial coding exercises in its exams without any access to an IDE; one has no coursework; another had 60% of the final marks as controlled assessment (class-based project work under ‘exam conditions’); one produced a 9-1 GCSE syllabus so difficult that first-year undergrads would respect it.

All this is not, you may say, exactly the end of the world; maybe the lack of consensus is merely unnerving, and you can at least pick a syllabus to suit taste and ability. However, what follows is more serious.

Few take CompSci at GCSE, AS and A level. At GCSE the numbers rival those of German GCSE, and for those in the business that will tell you all you need to know. Of those few, a fifth are girls. Additionally, post-16, most who start CompSci abandon the subject after taking the AS in Y12, leaving a very select cohort to be ‘norm referenced’ into grade bands. In other words, a good student by any objective criteria may well get a mediocre grade by comparison with the uber-geeks.

However you view the above situation, it means that low numbers of mostly boys take CompSci qualifications. Low numbers mean very high costs. This can be sustained for a few years as a subject is introduced, but not long term; budget cuts will see it off.

Finally, there is a teacher shortage in general, a STEM teacher shortage specifically and a dire shortage of CompSci teachers in particular, most of whom in any case are ICT teachers who have been pressed into teaching the most arcane of subjects after being offered a three-day Python course. A specialised teacher shortage also translates into a shortage of specialised exam moderators, which is death to quality control and so to customer satisfaction.

CompSci is great to teach, don’t get me wrong; it’s just that I don’t think this model will fly. The National Project to create coders rather than MS Office users won’t be achieved this way. Which raises the question: is there any solution?

I think, and I think I said this years ago, that we need a technical baccalaureate post-16. An academic artisan: a uniquely UK oxymoron. Maths, Design and Technology and Computing in one qualification worth two current A levels. Until this happens we will be stuck in a Groundhog Day of ‘more of the same’ post-16 ‘academic’ qualifications; we need more humanities specialists, don’t we.

Monday, August 07, 2017

Diesel fumes are good for your mitochondria.

A hot topic for urban dwellers is exhaust emissions from diesel cars. Specifically, the debate centres on NOx emissions and various figures for the number of premature deaths resulting from their inhalation: The Guardian reports that 38,000 people globally die each year, but is not specific as to whether this is due to NOx from diesel cars or more generally to the particulate or hydrocarbon inhalation concomitant with car use.

The good, bad and ugly of exhaust emissions usually get conflated. First up, particulates, especially the fine particulates, are bad, potentially very bad. Larger visible particles from inefficient combustion (soot) will damage lungs and exacerbate all pulmonary disorders, but the very fine, near-invisible particles can penetrate the blood-brain barrier and even damage mitochondrial function. Particles such as these are as much a product of wear on the modern ‘safe’ fat, soft tyres as they are of exhaust pipes, which have particulate filters fitted. But what about NOx?

NOx stands for the oxides of nitrogen; it’s unfortunate that NOx sounds like ‘noxious’, for memetically this is a very potent thematic link to its deprecation. The oxides of nitrogen in exhaust emissions are N2O, NO and NO2, traditionally called nitrous oxide, nitric oxide and nitrogen dioxide respectively. N2O is a potent ozone-depleting gas which persists in the atmosphere for over a hundred years until photochemically oxidised to NO. NO2 is a choking, dense brown gas which dissolves in water to form nitrous acid and ultimately nitrites and nitrates.

This brings us to NO, nitric oxide, the primary nitrogen oxide of exhaust emissions [1]. And here’s the thing … nitric oxide is good for you! The list of good stuff is long and varied, so here are three of the best.

The ancient Egyptians were famed for their extensive use of kohl (galena-based) around the eyes of men and women. They had good medical reasons for doing so: the local release of NO would have made good use of its antibacterial properties, reducing eye infections and parasitism in a region where these were an endemic health risk. Bringing the story up to date, 2017 saw the release of a nitric-oxide-based cream for treating psoriasis. The list of topical applications of nitric oxide in treating skin diseases, from eczema to acne, is long.

Nitric oxide has important internal biological signalling functions. It naturally causes large-vessel vasodilation in the body, improving circulation generally, which has encouraged the sports community to explore its potential for performance enhancement (before it’s banned, I guess).

Nitric oxide appears to be one of, if not the, major signalling factors in mitochondrial biogenesis [2]. New, young mitochondria are the holy grail of the aging research community, as has been mentioned many times before in this blog.



You will get the idea by now. We should be looking to urban populations for better skin, athletic performance and longevity, all down to nitric oxide from the modern diesel engine … if the particulates don’t get them first; even electric cars have tyres.





  1. http://jcs.biologists.org/content/119/14/2855

Friday, August 04, 2017

Overclocking, mitochondria and aging

Overclocking Mitochondria.

‘Overclocking’ is a concept well known to computer-game enthusiasts. Simply put, they can increase the performance of their computer by speeding up the CPU (central processing unit) or the GPU (graphics processing unit), that is, by increasing the clock speed (instruction cycles per second). They do this essentially by upping the voltage supplied to the chips … up to and including the point where they become too hot and inevitably unstable. Typically, such voltages are between 3 and 5 volts and CPU cycles run at many MHz (million cycles per second). The ‘free’ extra performance is much prized by ‘overclockers’, who come with equally serious cooling systems and a penchant for breaking things.
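
To put rough numbers on why overclocking runs hot, here is a minimal sketch of my own (not from the original post) using the standard dynamic-power approximation for CMOS chips, P ≈ C·V²·f; the capacitance figure is an arbitrary placeholder, not a real chip spec.

```python
# Rough illustration of why overclockers need serious cooling:
# dynamic CPU power scales roughly as capacitance x voltage^2 x frequency.
# The capacitance value is an arbitrary placeholder, not a real chip spec.

def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Approximate dynamic switching power in watts: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

stock = dynamic_power(1e-9, 3.3, 100e6)        # stock voltage and clock
overclocked = dynamic_power(1e-9, 3.6, 133e6)  # modest voltage and clock bump

print(f"stock: {stock:.2f} W, overclocked: {overclocked:.2f} W")
print(f"extra heat to shift: {overclocked / stock - 1:.0%}")
```

A roughly 10% voltage bump plus a 33% clock bump gives something like 60% more heat to dissipate, hence the serious cooling systems.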

Mitochondria have some things in common with CPUs. They too are cycle-driven electronic devices; their cycle is called (variously) the Krebs cycle, the TCA cycle or the citric acid cycle. As it cycles, a mitochondrion fetches acetyl ‘food’ molecules from its surroundings, spewing out waste water, carbon dioxide and heat as it makes ATP: magical, universal ATP. ATP in turn is used to provide the free energy needed to drive all negative-entropy biological processes such as synthesis, repair, nervous conduction and locomotion. It makes over 100 ATP molecules per second, which is about 3 Hz in computer parlance. It does this with normal (trans-membrane) voltages between 110 and 150 mV (around a tenth of a volt), 140 mV being associated with optimum ATP efficiency.
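
For a back-of-envelope feel for what a tenth of a volt means energetically, here is a small illustration of my own (not from the post): the electrical work per mole of protons crossing the membrane is W = F·Δψ, ignoring the pH-gradient part of the full proton-motive force.

```python
# Electrical work available per mole of protons crossing the inner membrane:
# W = F * delta_psi (Faraday constant x membrane potential), ignoring the
# pH-gradient contribution to the full proton-motive force.

FARADAY = 96485.0  # coulombs per mole of charge

def work_per_mol_protons(delta_psi_mv):
    """Electrical work in kJ/mol for a membrane potential given in millivolts."""
    return FARADAY * (delta_psi_mv / 1000.0) / 1000.0  # joules -> kilojoules

for mv in (110, 140, 180, 220):
    print(f"{mv} mV  ->  {work_per_mol_protons(mv):.1f} kJ per mol of protons")
```

At 140 mV that is about 13.5 kJ/mol of protons; since making ATP under cellular conditions costs of the order of 50 kJ/mol, several protons must cross per ATP made.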

Overclocked mitochondria, like CPUs, fall into two categories: working well under conditions of high demand, and just broken. The former would be the case, say, for substrate-stimulated mitochondria using molecules such as glutamate and malate, which speed up the TCA cycle and raise membrane potentials to 180 mV. The latter would be, for example, the hyperpolarised mitochondria found in cancer cells. These do not produce any ATP at all and are effectively inactive. They have membrane potentials of 220 mV, over a fifth of a volt. This is put in perspective when 200 mV is usually recognised as the upper limit supported by lipid bilayers before breakdown as a result of exceeding the dielectric capacity.

Computer CPUs and mitochondria, not unexpectedly, work best in their ‘Goldilocks zone’: just the right amount of volts! With regard to mitochondria, very low membrane potentials, flirting with depolarisation, are associated with cell death, enlarged mitochondria and fewer cristae. The opposite also seems to hold: higher membrane potentials are associated with smaller mitochondria with many cristae … and possibly with the inhibition of cell death, which, intriguingly, may be the ‘motivation’ behind the cancer cell’s hyperpolarisation of its mitochondria, thereby turning off the mitochondria’s ability to kill a rogue cell.





Exploring The Goldilocks Zone

We can downregulate (or even destroy) the membrane potential and speed up the cycle by increasing metabolic demand, or by using natural or exogenous uncoupling agents. These include the natural uncoupling proteins (UCPs) and chemicals like DNP (2,4-dinitrophenol) … too much of either will uncouple the mitochondria and cause complete collapse of the potential … followed by cell death.

Conversely, membrane potential can be increased by stimulation (without extra metabolic demand) using substrates such as malate and glutamate salts or, as mentioned earlier, by blocking ATP synthesis.

If we assume a level of metabolic demand reflecting good levels of physical activity, nervous activity and tissue repair, it would be good if our mitochondria were operating above the minimum ‘tick-over’ threshold (108 mV). Otherwise, increasing the demand on these mitochondria any further might depolarise them ... with disastrous effects. A capacity to work hard is probably best reflected in the 150-180 mV range, and probably the higher the better. We know that cancer cells can achieve massive potentials of over 200 mV by blocking metabolic activity, but it’s not clear whether such overpotentials can be utilised when oxidative metabolic demands are being made; even so, stimulated mitochondria that have not been uncoupled will show potentials of 180 mV.

We also know that animals with lots of brown fat, the cells of which are thermogenic because they contain partially uncoupled mitochondria, have longer life spans than animals of a similar size (cf. squirrels at 20 years, bats at 30 years). In these animals the mitochondria typically still manage membrane potentials of 140 mV despite being partially uncoupled. Their cycles run very quickly but generate heat rather than being coupled to the generation of ATP.

Finally, we also know from the work of Bruce Ames a decade ago that stimulating mitochondria with acetyl-L-carnitine (which increases cycling and membrane potential, as malate and glutamate do) was associated with improved cognitive behaviour in aged rats.

As an aside, they added lipoic acid to ‘mop up’ excess free radicals. This point is worth expanding on. Free radical production by highly active mitochondria was once the bogeyman of the aging world. Free radicals cause damage to proteins and DNA which then needs repair/replacement, and free radical damage accumulation is still one of the most popular aging theories. My point of view is that all damage can be repaired if there is the free energy to do so. In effect this means that if you have enough ATP available then repair is not an issue. It’s an entropy thing. In other words, the free radical damage that accumulates with age reflects a deficit of repair free-energy, not a surfeit of damage. This is borne out by the well-established fact that higher metabolic rates (ie higher free radical damage) correlate not with shorter life spans but with longer ones.


So, keeping mitochondria spinning within their Goldilocks zone seems to be the trick. In an oxidative environment where ATP production is not blocked there seems to be no downside to high membrane potentials. However, there is one fly in the ointment: mitochondria in poor condition need to be destroyed so that they do not divide and compromise the cell’s total population of mitochondria. This, as has been explored in a previous blog, is achieved through autophagy, which in turn is stimulated or signalled by low mitochondrial membrane potentials.

Outside the Goldilocks Zone

So far so confusing; what about outside the Goldilocks zone? Ultra-high membrane potentials can be dealt with quickly. They can only be produced in intact mitochondria in good condition, and they reach their maximum when metabolic demand is zero, like the pressure in a water line, which is at its maximum when the tap is closed. Otherwise they seem to be harmless. Low potentials, on the other hand, have major effects.

Complete depolarisation, the collapse of the membrane potential, usually results in the release of cytochrome c from the inner membrane of the mitochondrion, triggering the cascade of reactions that leads to cell death. It looks like flickering depolarisation may act as a signal or a label for autophagy. This is potentially a ‘good thing’ in that sub-standard mitochondria will be eliminated from the population. A flickering state could be triggered by high metabolic demand. For example, bats, which have exceptionally well-coupled, low-ROS mitochondria and ultra-long life spans of 30 years (remarkable for such a small mammal; cf. the rat at 3 years), only develop these mitochondria after they start flying: in other words, after extreme oxidative metabolic demands are made.

Hypoxia (low oxygen tension at altitude, for example) causes both mitochondrial depletion and hyperpolarisation. High-altitude training of athletes results in fewer mitochondria within muscle cells, but those that remain are better coupled.


Summary

Like many before me, I am getting lost in the woods, as it were. My overclocking analogy does not really hold for very long. CPUs cycle at a clock rate proportional to voltage but independent of external load, whereas mitochondria behave more like simple electric motors, which cycle according to voltage and load, that is, according to the work done. What has worked, though, is the exploration of mitochondrial voltage: it looks like there is an optimal voltage and cycle rate which generates enough ATP to keep the system working with an excess of free energy.

I think that mitochondrial stimulation and energetic loads must be important in preventing senescence; finding the sweet spot is the challenge.

Wednesday, July 19, 2017

Middle Class Life Expectancy Falls: shock!

The ‘ever increasing’ life expectancy of men and women in the UK has come to a shuddering halt according to University College London expert Sir Michael Marmot.

Austerity measures got the blame, but I don’t think that’s so, as analysis of the figures shows that the decrease is down to the middle classes. Now that is shocking; this after all is a demographic that is notoriously health-aware, a group that devours articles on diet, exercise and lifestyle. Worse is to come: the data show that deaths from dementia make up a significant portion of the decrease.

Alongside the above comes data that supports the age-related cognitive benefits of the Mediterranean diet and (trendily) the Nordic diet (basically fish, berries and apples).
So what is going on?

I think it’s simple. Middle-class people over 60 tend to follow the advice of their Sunday-supplement style gurus and local GPs. This means that they are: a) most likely to have reduced the fat and cholesterol in their diet, b) most likely to take statin prophylactics to keep cholesterol down, c) most likely to take ‘precautionary’ medication for borderline (usually systolic) hypertension and d) most likely to use sunscreen to block harmful UV rays.

This is to my way of thinking a ‘perfect storm’ of disastrous choices.

Good diets, by consensus, are very high in cholesterol or the progenitors of cholesterol: fish, olive oil and fungi top the list. Good diets are also high in fruit acids: citric (in citrus fruits), malic (in apples) and both (in tomatoes).

1) We need cholesterol for so many functions of our body, but principally for electrical integrity in the nervous system and the mitochondrial energy-generation systems. We need cholesterol to make vitamin D, a vitamin well recognised as important to nervous function, which is synthesised when its cholesterol-derived precursor is exposed to sunlight.

2) Citric acid and malic acid are well known stimulators of mitochondrial function.

3) Mild hypertension is an age-related adaptation needed to supply oxygen and nutrients (see above) to the brain as the circulatory system becomes more resistant to blood flow with time.

So here’s the deal. Let’s deprive the brain’s neurons and nerve fibres of their fatty raw materials, their mitochondrial energy stimulants and even their oxygen … and do this deliberately by limiting them in the diet and with the aid of chemicals such as statins, calcium-channel blockers and sunscreen.

I’ve just read that the UK pension age for Generation Y has been raised to 68. I just hope they stay away from their GP!

Tuesday, May 09, 2017

The End of Computing My Friends


Computing in schools, has it gone terribly wrong?

It was back in 2013 that I wrote a blog in ComputerworldUK entitled The End of ICT my Friends, which presaged the wholesale deprecation of ICT in the school curriculum by the then Education Secretary, Michael Gove.

The reasons for ICT’s demise were easy to understand. ICT had become little more than training in MS Office applications (Word, Excel, Access, Outlook and PowerPoint). Proficiency with these applications had been identified by the previous Labour Government as an essential modern workplace skill, and through its quango BECTA it had zealously introduced (suitably equipped) computers to all schools.

Then it all went pear-shaped. In 2011, the famously ‘flabbergasted’ Eric Schmidt, Chairman of Google, lambasted the almost complete absence of computing in the curriculum of the country that produced Turing and so many of computing’s pioneers. And, in the tradition of the double whammy, it had by then become painfully obvious that the cost of Microsoft Office licensing to education was, shall we say, poorly controlled.

The rest is history, as we say: ICT was given the chop, the BCS (British Computer Society) was summoned, MIT’s Scratch programming was endorsed by all, and by 2015 62,000 students (nearly 15,000 girls) took GCSE Computing.

And there the story should end, all British mini-Turings being turned out on a regular basis, evil Microsoft consigned to ignominy with only Geography trying to keep ‘Death by PowerPoint’ alive. But, and there is always a but, all may not be well as we approach the revamped, ‘demanding’, world-class (viz ‘hard’) 9-1 GCSEs of 2016, the GCE A level having been updated in 2015 in a similarly macho way.

You guessed it: we are struggling with numbers, the number of students wanting to do computing that is. The new GCSE and GCEs are hard and getting harder. It looks like every crusty/nerdy/Stack Overflow, BCS-esque illuminatus has put their pet ‘indispensable’ knowledge into every nook and cranny. This is déjà vu all over again. The same thing happened when all the GCSEs were created 30 years ago: ‘experts in their fields’ created a curriculum that took 210% of the available time. Maggie Thatcher never forgave the education experts for their idiocy.

So pity the poor ICT teacher trying to convert to Computing at their school; even a Comp Sci graduate would still have to do his prep to teach much of the syllabuses (I say ‘his’ as the number of Comp Sci girls is very small). Except that Comp Sci graduates already have jobs that pay better and are less stressful.

Don’t worry, we can fill the gap with folk from Eastern Europe, they have loads of coders looking for jobs (my colleague is from Slovenia); oops, we’ve Brexit to contend with, so that’s banjaxed that idea. Oh, and I forgot to mention, now that we don’t teach ICT most of our youngsters come out without MS Office skills … as our parents constantly remind us. Good job immigrants do have these skills … rats! I forgot about immigration targets. Maybe we should not be looking to schools for the future of British coders; obviously we will find them in the new CTC Universities (City Technology Colleges). What? They’re only half full. Oh dear.

Did I mention that the small uptake for computing means small option groups, which means big costs for schools (1 teacher to 3 students at GCE? 1 to 7?), during a funding crisis?

So here we go again: ‘The End of Computing my Friends’.

I give it two years.

Tuesday, May 02, 2017

Homo Algorithmicus (AI for Monkeys)


The ascent of Homo Algorithmicus

The modern workplace, be it the operating theatre, the city office, the sports field or even the race track, is a highly competitive environment. Here responses to stimuli are finely tuned, honed to operate at maximum efficiency and gain advantage. Often these environments now respond in accordance with a ‘play-book’: a book of rules which has been refined over many years and which embodies the very best of practice. The concept of ‘marginal gains’, familiar to most of us today, implicitly acknowledges this refinement and at the same time hints at a limiting ceiling of perfection.


I once listened with great attention to an international rugby player lament a loss with the words “at that stage in the game after the second breakdown (in play) we take a scrum ... but we took a kick and the lads didn’t know what to do”. Something jarred in me, and not for the first time: they failed because of a failure to follow the algorithm?


I have worked in a school of 4,000 16-18-year-old students which produced great results through its systems of teaching, learning and operation. Conformity to the model produced the results required at A level, and the place was duly rated an ‘outstanding’ establishment. Clearly systems work. So what’s my problem?


There are many more examples of success in the vein described above and it does not take a genius to spot the emergence of  ‘algorithms of success’: there is a smart way of doing things in these contexts.


Artificial Intelligence ‘bots’ obviously use smart algorithms; it’s hardly saying anything at all to point this out, and increasingly, unsurprisingly, they are finding a place working alongside us (when not actually taking our jobs). This is to be expected, because we and they are on convergent cognitive paths. I have pointed this out before in a previous blog, but to reiterate explicitly: increasingly, robots ‘think’ like us and we ‘think’ like them.


I call this AUI, or Acquired Un-Intelligence. From the dawn of time an organism has had to acquire a set of useful algorithms of behaviour in order to survive in a dynamic (ie changing) environment. Shine a light on a woodlouse and it quickly (for a woodlouse) moves into the dark … and so on throughout a million examples from Nature. We humans have complex brains and we can move from such simple ‘rules of thumb’ to more elaborate and complex behaviours, thanks to the fact that they can be written down, thought about and refined by a ‘thousand eyes’ distributed globally.
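
For what it’s worth, here is a toy sketch (entirely my own illustration, with a made-up light field) of just such a hard-wired rule of thumb: an ‘agent’ that simply steps towards whichever neighbouring spot is dimmest. No thinking required, yet it reliably ends up in the dark.

```python
# Toy 'woodlouse': a hard-wired rule of thumb, not intelligence.
# It samples the light at its position and one step either side,
# then moves to wherever the light is dimmest.

import random

def light_level(x):
    """Hypothetical light field: brightest at x = 0, darker further away."""
    return max(0.0, 10.0 - abs(x))

def step(x):
    """Move to the dimmest of the three candidate positions."""
    return min([x - 1, x, x + 1], key=light_level)

position = random.randint(-3, 3)
for _ in range(10):
    position = step(position)
print("woodlouse ends up at", position, "light level:", light_level(position))
```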


Oh good.


In my opinion it’s a mistake to see this ability as cognitive progress. The effectiveness of our smartest algorithm, followed to the letter, does not make us smart, unless we also stop calling machines dumb for doing just the same!


Another example of acquired dumbness is the rise and rise of ‘Big Data’, or rather the rise of the algorithms that ‘mine’ Big Data. Machines can look for patterns in vast data sets, which is another way of saying that they can find associations, which are of course correlations.


Increasingly, postdoc students, particularly in medicine, are given vast sets of medical data to research. In other words, they apply algorithms they do not understand to find the correlations they are looking for. Of course they find them, and we seem programmed to be unable not to attribute significance (even causal significance) to all but the most ludicrous associations.


I used to tease my students in the 1980s with the strong association between dyslexia in a child and the wearing of dungarees by a mother who drove a Volvo. It’s real, so I advised mothers to avoid this clothing and buy a Fiat. The correlation did not mention that dyslexia was being newly diagnosed in those days and was the preserve of a certain kind of middle-class family whose mothers at that time would have met the criteria above.
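
A minimal sketch of my own (all numbers made up) of how a hidden common factor, here an arbitrary ‘affluence’ score, manufactures a strong correlation between two things that never touch each other causally:

```python
# Spurious correlation via a confounder: neither variable causes the other,
# both simply track a hidden 'affluence' factor. All numbers are made up.

import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
affluence = [random.gauss(0, 1) for _ in range(1000)]
dungaree_wearing = [a + random.gauss(0, 1) for a in affluence]    # driven by affluence
dyslexia_diagnosis = [a + random.gauss(0, 1) for a in affluence]  # also driven by affluence

print(f"correlation: {pearson(dungaree_wearing, dyslexia_diagnosis):.2f}")
# Prints a strong positive correlation (around 0.5) despite zero causal link.
```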

Homo algorithmicus is dumb and getting dumber. Machines are dumb but getting smarter. We need to nurture our creativity; we are at our best when we are makers: makers of art, tools, music, buildings and fables. Not followers of algorithms; that day is nearly done, and we make rubbish robots.

Wednesday, April 26, 2017

Memory Loss: Exams and beer


I have spent thirty-five years teaching students during the 16-18 phase of education, well characterised as their ‘exam years’. My job is clear: follow a syllabus, do my best to make it understandable, provide opportunities to apply new knowledge and, through testing, revise and learn stuff in preparation for a final public exam. Memory recall is of course fundamental to their final performance.

It has been shown recently [1] that short-term memories (located in the hippocampus) and long-term memories (in the cortex) are laid down at the same time in response to an event. There is a lag of a day or so before the long-term memory can be accessed, though it’s there. The researchers state that communication occurs between the two types of memory, but their findings move away from the older idea that long-term memory is created by input from short-term memory, often by repetition … until, so to speak, it ‘sticks’.

This conventional model of ‘short creating long’ also asserted that powerful and meaningful events ‘stuck’ more readily than the trivial. In short, in a teaching context, it is noticeably harder for a young person studying computing to recall the features of ‘Big O’ theory than it is to remember any errors relating to, say, their first sexual encounter!

Our response as teachers, unable or unwilling to compete with the last sentence, is to saturate their days with repetition of the deeply unmemorable … hoping for a ‘stick’. Sticky by repetition? It doesn’t really work for me. The new findings, though, beg the question of why two copies are created at the same time. I think the answer must lie in a concept of ‘synchrony’.

If, hypothetically, long-term memory constantly ‘polls’ short-term memory, looking for similarities and differences, then the act of comparing its latest image with the current or recent contents of short-term memory may itself point to a mechanism that explains much of how we remember.

So, for example, given an everyday event such as an encounter with a pet over a period of time, there will be a lot of synchrony between the recent ‘updates’ to the long-term memory and the current store of short-term memory. There will of course be small differences, due to the dynamic relationship between owner and pet, so using the categories ‘same and different’ there will be some things in the ‘different’ camp and much in the ‘same’ camp. The long-term store could be using the equivalent of a computer’s journaled file system, where only the differences are written into a permanent store and the similarities are acknowledged but with no action taken.
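
A crude sketch of the journalling idea (purely illustrative, my own toy example, and not a claim about how memory actually works): only the fields that differ from the last stored snapshot get written to the ‘permanent’ journal; everything that matches is silently ignored.

```python
# Crude 'journaled' update: compare the latest short-term snapshot against
# the long-term store and permanently record only the differences.

long_term = {"pet_location": "basket", "pet_mood": "sleepy", "vase": "on shelf"}
journal = []  # permanent record of changes only

def sync(short_term_snapshot):
    changes = {k: v for k, v in short_term_snapshot.items() if long_term.get(k) != v}
    if changes:
        journal.append(changes)   # 'different' -> written to the permanent store
        long_term.update(changes)
    # 'same' -> acknowledged, but no action taken

sync({"pet_location": "basket", "pet_mood": "sleepy", "vase": "on shelf"})   # pure wallpaper
sync({"pet_location": "garden", "pet_mood": "playful", "vase": "on shelf"})  # small difference

print(journal)   # [{'pet_location': 'garden', 'pet_mood': 'playful'}]
```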

Obviously, near-perfect synchrony could be achieved when, say, encountering a vase in the house, maybe many times each day. This will always result in ‘no action’, and enough ‘no action’ will result in the ‘event’ being downgraded to ‘wallpaper’, that is, near cognitive invisibility. On the other hand, the sudden loss of a loved pet, for example, will result in the long-term memory being unable to get any significant synchrony with the short-term memory. It may register total absence as a kind of change that cannot easily be integrated as an update, and duly inform the conscious brain. Items like the pet’s basket may now incidentally increase in visual-memory significance (in the absence of the pet) during the brain’s repeated attempts at ‘normal synchrony’. Such attempts, if regarded as important (to memory integrity), may be the source of the conditions recognised as distress, grief and so on.

A severely frightening event, say in a combat environment, will be stored in both memories, so removal from the situation could cause a similar absence of synchronous events in short-term memory, as described above. Again, this lack of synchrony will cause an alert, only this time (as your very survival is at stake) the long-term memory will rapidly and repeatedly scavenge the short-term memory for synchrony ‘hooks’. This may be much more significant than the empty pet basket, possibly resulting in inappropriate behaviour as non-synchronous events are labelled significant and, in desperation, labelled falsely as candidates for synchrony. For example, a car backfire could be mistaken for a gunshot as both the short- and long-term memories are updated and ‘synced’.

In my own case, having suffered multiple bee stings resulting in anaphylaxis many years ago, and having since been removed from any real likelihood of this happening again, I reacted, or rather over-reacted, to the sound of a ‘drone’ copter, mistaking it for a bee swarm. I had no idea I was looking out for this sound, but part of me certainly was.

There is a lot to be said, however, for having two copies of an event: one quickly refreshed, the other permanently stored. It gives us the ability to recognise change. An animal that fails to spot that a stone has been moved overnight may miss the predator lurking behind it! ‘Same and different’ is a very common and familiar ability for us. We soon ignore the wallpaper and we quickly spot changes in behaviour.

Back to the more prosaic world of teaching. As a teacher my job is to get students to understand, and hopefully remember, a body of knowledge. I worked using the principles of the old model (repeat, practise, test), so what difference could the new model make?

Hmm... OK, both long- and short-term memories are created at the same time.

Empirically, ‘rememberability’ is proportional to the significance the student’s brain attributes to the ‘learning event’ … which is usually quite low, probably because in a school day short-term memory updates (ie lessons with facts) are so frequent that they become ever-changing wallpaper, and therefore ‘same’ not ‘different’. Hence the teacher’s endless search for novelty, impact, stimulation and so on in order to upgrade the events to a ‘stickier’ significance … this is conventionally regarded as good teaching.

If, however, they have had the lesson, no matter how quotidian, it’s already in short-term memory, and it should be in long-term memory (the Holy Grail of teaching) too, and so the only problem is accessing it! But how?

The answer must be to force the brain’s hand, as it were, by wiping out the short-term memory on demand after a period of continuous revision. Maybe surgery is too drastic a thing to copy from the original experiments with mice, but there is another way.

The easiest way to do this is with alcohol consumption. Many have suffered short-term memory loss from over-indulgence.

Research has helpfully shown that alcohol the night before does not affect academic performance in student exams [2]; other work with greater consumption (seven pints of beer) even showed an improvement [3].

I think we’re onto something here.


1) http://www.bbc.co.uk/news/health-39518580
3) https://www.theregister.co.uk/2010/03/24/drinking_and_exams_mix_well/


Thursday, March 23, 2017

Quantum computing: mitochondrial style



‘Robots will take our jobs’, so go the headlines as Artificial Intelligence flexes its muscles ever more impressively and yet again we debate the coming or not of machine consciousness.

In the 1980s Roger Penrose [1] first concluded that ‘mind’, or intelligence, was not the result of processes analogous to digital computing. Thirty years on, despite the astonishing achievements of computing, he remains of that opinion. Somewhere, he and others believe, the answer to mind lies in the quantum world, replete with its mysteries of probability, superposition and entanglement.

Today [2], his search is for biological mechanisms and structures that could support such quantum states and put clear blue conceptual water between man and machine. Ironically, at the same time, in the hitherto deterministic digital world of computing, machines are being constructed that use the same quantum principles to create probabilistic computational devices based on ‘qubits’.

For a biologist these directions of travel look like another example of convergent evolution, viz the similarities between sharks and dolphins, or birds and bats*. Convergence may be the key to understanding the gulf between AI/robotics and the mind of living creatures. I think we should see it this way:

The biological mind allowed creatures to behave like robots.

Robots increasingly are able to replicate our ‘robotic’ behaviours ...

To be more specific, a creature behaves ‘robotically’ when it carries out simple or complex algorithms in response to its environment. For example, the move towards food or light by a simple organism is robotic and readily replicated by a robot equipped with appropriate sensors and algorithms. Equally, more complex human behaviours such as car assembly, playing poker, Go or chess, playing the stock market or diagnosing illness can be replicated or even bettered by robots, so long as the behaviour can be represented algorithmically. We should not then be surprised when robots out-robot us! But how can we behave like robots?

The ability to do this comes with the appearance of the eukaryotes. Environmental responses by simpler organisms are known (taxis along chemical gradients) but rare.

Eukaryotes are, by contrast, very complex: they have a nucleus, mitochondria and cytoskeletons. The DNA of the nucleus, with its store of genes amounting to a vast permanent library of information, tempts us strongly to think in terms of familiar digital-storage metaphors such as ‘hard drives’ and ‘files’. Seductively, the nervous system, especially the brain, is a shoo-in for the modern computer’s CPU as first envisaged by von Neumann in the 1940s. Together they create the computer analogy rejected by Penrose.

But single-celled creatures without a nervous system have minds too! The advanced behaviour of the single-celled Dictyostelium discoideum, or ‘slime mould’, is a good starting point. It can transition between motile hunting amoebae and aggregated, mould-like multicellular forms. No nervous system is available to act as a classical computational device, a CPU, so what is?

There are two candidates in my view. Penrose favours the cytoskeleton and asserts that this structure is able to maintain coherent quantum states. My mitochondrial chauvinism disposes me to favour the mitochondrion for the following reasons:

  1. Mitochondria separate electrons from protons and are able to move electrons through a redox chain of proteins using quantum tunnelling, in the process known as oxidative phosphorylation. Ultimately, hydrogen and electrons are united with atomic oxygen radicals to form water.
  2. The mitochondrial membrane is rich in cholesterol and hydrophobic lipids and so has a high dielectric strength, which it uses to maintain a constant charge separation known as the membrane potential.
  3. Each mitochondrion has thousands of these respiratory-chain supercomplexes.
  4. Mitochondria can hold and trap ions within their multi-protein supercomplexes.


In mitochondria we have permanently charged devices capable of thousands of quantum separations, each with the possibilities known as entanglement and superposition.
Meanwhile (to fuel my fantasy further), in the digital world promising qubit implementations have been made using trapped-ion quantum states [3].

How would a mitochondrial qubit machine work? Well, how does a digital qubit machine work?
Basically they would work in the same way. That is, qubits would be in a superpositional state until an external interaction causes this to collapse into a regular state. So, for example, in computing a ‘bit’ can exist in two states, 0 or 1 (or any equivalent binary state). A qubit, though, can be 0 or 1, or both at the same time, until an interaction causes it to ‘collapse’ into either 0 or 1. You will have to trust me that these computers do work, though the output is not determined absolutely as in the non-quantum world; it is expressed as the most probable outcome.
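
A tiny sketch of my own (illustrative only) of a single simulated qubit: an equal superposition of 0 and 1 that, on ‘measurement’, collapses to one definite value with probabilities given by the squared amplitudes. Real quantum hardware does this with physics rather than random numbers; a classical simulation like this only mimics the statistics.

```python
# One simulated qubit: amplitudes for |0> and |1>. Measurement collapses the
# superposition to a definite 0 or 1 with probability |amplitude|^2.

import random

amplitudes = [2 ** -0.5, 2 ** -0.5]   # equal superposition of 0 and 1

def measure(amps):
    p_zero = abs(amps[0]) ** 2
    outcome = 0 if random.random() < p_zero else 1
    # collapse: after measurement the qubit is definitely in the observed state
    amps[:] = [1.0, 0.0] if outcome == 0 else [0.0, 1.0]
    return outcome

first = measure(amplitudes)
print("measured:", first, "state after collapse:", amplitudes)
print("measured again:", measure(amplitudes))   # gives the same answer every time now
```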

Enough of quantum computers; a little knowledge is enough to make it conceivable that a similarly indeterminate superpositional state of affairs held by mitochondria could collapse into alternative fixed states as a result of environmental stimuli. A nervous system, a brain indeed, is simply an elaboration of the basic setup. Here we have cells that specialise in quantum computing, rich in mitochondria and responding to simple electrical stimuli rather than the full spectrum of the outside world.

In summary we should not be surprised that the external lives or behaviours of computers and creatures are on a converging path. It may be that their underlying technologies are also converging, but of their inner lives there is not much we can say other than they have them; like birds and bats they will be similar but utterly unalike.



*Dolphins and bats are mammals, which appeared about 50 million years ago, whereas sharks appeared over 400 million years ago and birds around 150 million years ago.

  1. The Emperor’s New Mind by Roger Penrose 1989 OUP ISBN 0-19-851973
  2. Shadows of the Mind by Roger Penrose 1994 OUP ISBN 978-0679454434
  3. https://en.wikipedia.org/wiki/Trapped_ion_quantum_computer