I thought this up after reading an article about how US Govt. intelligence agencies are using ‘new’ tools, e.g. blogs and wikis, to share data. It made the point that allowing anyone in the organisation to post information works well, but that only the very smartest people should be in control of whittling down what’s posted.
It seems to me that that’s largely what consciousness does. It acts as a kind of arbiter between the multiple calls to action which the mind throws up, and chooses what should be done next (we seem able to do only a very small number of things in parallel, i.e. one, although some women seem capable of time-sharing numerous different thought streams effectively).
This filtering also appears to do more than just that. It looks to me as if, when you have, say, 1000 mental processes competing to be chosen as the next one to be given control, consciousness doesn’t just make a choice: it also reorders the remaining candidates, a kind of page ranking applied to queueing mental programs. Exactly how this gets accomplished isn’t clear at all.
(I’m reminded that a similar thing occurs in attempts to minimise network congestion: when packets compete to pass through some gateway, those not selected are forced backwards in the queue by some pseudorandom amount.)
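That pseudorandom push-back is easy to sketch. The function below is an illustrative toy, not any particular protocol: the packet at the head of the queue is sent, and each packet left behind slips back by a random number of places (the name `transmit_next` and the `max_backoff` parameter are my own inventions).

```python
import random

def transmit_next(queue, max_backoff=4):
    """Toy congestion sketch: send the head packet; every packet left
    behind is demoted by a pseudorandom amount (0..max_backoff places)."""
    winner = queue.pop(0)
    # Add a random penalty to each loser's position, then re-sort.
    penalised = [(i + random.randint(0, max_backoff), pkt)
                 for i, pkt in enumerate(queue)]
    queue[:] = [pkt for _, pkt in sorted(penalised, key=lambda p: p[0])]
    return winner
```

Run repeatedly, this keeps the queue moving while scrambling the order of the stragglers, which is roughly the behaviour the analogy needs.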
What I know about operating systems could be written in fat felt tip on the head of a pin, but process scheduling in a computer has always looked pretty primitive, especially by comparison with the complexity of some of the programs themselves. Processes wait in a queue and get intermittent access to resources if they are next and no higher-priority process appears. Scheduling tends to be based on some very simple, static rule for each OS (e.g. round-robin, first-in-first-out, etc.).
Invention of the day, therefore, is a ‘page ranking’ system for computing processes, using a simple model of conscious supervision. This would almost certainly need to involve a feedback mechanism whereby certain system outputs caused a state of happiness and others fear, disgust etc. This effectively defines, e.g., fear as ‘the degree to which some event makes me select a self-protecting process’. Anger would therefore be ‘the extent to which some event makes me select an aggressive response process’. Notice I’m not using scare quotes here: the machine would actually be feeling these responses. Processes could then be reordered according to the extent to which they had contributed to increasing system happiness in the past. According to this model, certain processes finding themselves repeatedly demoted in the queue (starved) could be regarded as ‘repressed’.
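The whole proposal can be sketched in a few lines. This is my own hypothetical rendering of the idea, not a real scheduler: every process carries a score reflecting how much it has raised system ‘happiness’ in the past, the queue is re-ranked on each selection, and processes that keep losing out are flagged as ‘repressed’ (all class, method, and parameter names here are invented for illustration).

```python
class AffectiveScheduler:
    """Hypothetical sketch of the proposed 'page ranking' scheduler:
    feedback events adjust each process's affect score, and the queue
    is reordered by accumulated score rather than a static rule."""

    def __init__(self):
        self.scores = {}    # process name -> accumulated affect score
        self.starved = {}   # consecutive times a process was passed over

    def add(self, name):
        self.scores.setdefault(name, 0.0)
        self.starved.setdefault(name, 0)

    def feedback(self, name, happiness_delta):
        # Positive delta ~ happiness; negative ~ fear/disgust at the outcome.
        self.scores[name] = self.scores.get(name, 0.0) + happiness_delta

    def next_process(self):
        # Choose the top-ranked process; every loser is demoted (starved).
        ranked = sorted(self.scores, key=self.scores.get, reverse=True)
        chosen = ranked[0]
        for loser in ranked[1:]:
            self.starved[loser] += 1
        self.starved[chosen] = 0
        return chosen

    def repressed(self, threshold=3):
        # Processes repeatedly demoted count as 'repressed' in this model.
        return [n for n, s in self.starved.items() if s >= threshold]
```

The design choice worth noticing is that priority is learned from outcomes rather than fixed in advance, so ‘fear’ and ‘anger’ fall out of the bookkeeping as patterns of selection, exactly as the definitions above suggest.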