Some time ago I agreed to do a weekly short podcast which I entitled my Disjecta Membra or ‘Scattered Thoughts’ (the Latin being a reference to the dry bones of Ezechiel 37). It was re-branded as ‘Father Fessio in Five’. (My nomenclature, however, remains the more accurate.)
Because of the growing interest in (and hype about) artificial intelligence, I decided to connect some of the dry bones and do four or five episodes (not sure of the number because I haven’t finished yet) explaining in simple terms just what artificial intelligence is (and what it is not).
Since AI is an operation of computers, the first episode is about computers.
Brief summary:
We use words to refer to things. (E.g. a ‘book’.) Words are sounds which generally have no natural relation to things. We, because we have natural intelligence, are able to relate the sounds to the things by convention and we communicate by speech. The next step is to devise written signs to represent the sounds—for us the alphabet (hieroglyphics for the Egyptians, characters for the Chinese, etc.)—so we can ‘store’ words and communicate by writing. Again, this is by convention, which is why different languages have different sounds and signs. (So ‘book’ in English is ‘livre’ in French.)
Computers process information (which is why the core of a computer is the CPU or central processing unit), but they need to ingest information in order to process it; and computers can’t really process sounds or signs. We have to convert the signs (e.g. letters) into something computers can process—numbers. Again this is done by convention. As it happens, there is an agreed convention (the ASCII standard) by which ‘book’, written in our normal decimal system, becomes 98 111 111 107.
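For readers who want to see this worked out, here is a minimal sketch in Python (my own illustration, not part of the podcast) of the agreed convention that turns each letter of ‘book’ into its decimal number:

```python
# By the agreed convention (ASCII/Unicode), each character has a number.
word = "book"
codes = [ord(ch) for ch in word]  # ord() gives the conventional number for a character
print(codes)  # [98, 111, 111, 107]
```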
But computers aren’t smart enough to process decimal numbers either, so we have to convert the numbers to binary form, in this case: 01100010 01101111 01101111 01101011. Nothing but zeroes and ones.
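Continuing the same illustrative sketch, the very same convention lets us write those numbers in binary, eight digits per letter:

```python
# Write each letter's number as eight binary digits (zeroes and ones).
word = "book"
bits = [format(ord(ch), "08b") for ch in word]  # "08b" pads each number to 8 binary digits
print(" ".join(bits))  # 01100010 01101111 01101111 01101011
```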
But numbers themselves are too much for a computer’s ‘brain’, so we (humans with natural intelligence) have to create another convention. Computers contain transistors, small unitary devices, which can hold many electrons (a charged or high-voltage state) or few electrons (a discharged or low-voltage state). We, by convention, associate the charged state with the number 1 and the discharged state with the number 0.
So everything that goes into a computer (not just words, but musical notes, images, etc.) is stored as a sequence of zeroes and ones, which are then processed by the CPU (following instructions, which are themselves sequences of zeroes and ones that we have conventionally associated with operations—like addition and subtraction).
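As a final illustrative sketch (again my own, with a made-up instruction code, not how any real CPU numbers its instructions), here is the idea that both the data and the instruction are nothing but zeroes and ones, and only a convention makes one pattern mean ‘add’ and another ‘subtract’:

```python
# Data: two numbers stored only as strings of zeroes and ones.
a = "00000011"  # by convention, the number 3
b = "00000101"  # by convention, the number 5

# Instruction: also just a bit pattern. In this made-up convention,
# "0001" means add and "0010" means subtract.
instruction = "0001"

x, y = int(a, 2), int(b, 2)     # read the bit patterns as numbers
if instruction == "0001":       # the pattern we agreed means 'add'
    result = x + y
elif instruction == "0010":     # the pattern we agreed means 'subtract'
    result = x - y

print(format(result, "08b"))    # 00001000, i.e. the number 8
```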
There’s more on this in the podcast. But from the above it’s already clear that computers are not so smart—they can’t count past 1, although they can do it very quickly. And most importantly, they don’t ‘know’ what they are doing.
I noticed tonight that X is having a lively discussion about AI ‘scheming’. Apparently creators have noticed undesirable behavior creeping into their AI products, such as a refusal by the AI to turn itself off even when given an unambiguous directive to do so. This is called ‘scheming’ when different AIs collude on their own to produce the same subversive result. I asked Grok whether a moral code like the Ten Commandments could be incorporated into the AI. We had an interesting discussion.