I use ‘occult’ in its secondary sense of ‘unseen’, not the mystical one, just to be clear, although there may be a tangential connection with the usual meaning, at least for those who don’t mind stretching a point.
Last night, as I tried to relax on the couch following an unfortunate incident while I slept, an analogy came to mind, one I’ve not seen elsewhere, between what is erroneously called artificial intelligence and a different human capacity. I’ve discussed machine learning before, which is often taken to be a form of artificial intelligence, but I’d like to restate the point of interest (if only to me) right here:
When a programmer is given a task to solve, the steps we encode for the computer to follow are typically either well known at the time of the assignment, deducible through simple inspection, or collectable out in the real world. An example of the last case comes from medicine, where early attempts at creating a diagnostic AI began by collecting information from doctors on how they map symptoms to diagnoses.
These steps may be laborious or tricky to code, whether due to their nature or to the limitations of the computers they will run on, but at their heart they’re well known and describable.
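To make the contrast concrete, here is a minimal sketch of that hand-coded style in Python. The symptoms, diseases, and rules are invented for illustration, not drawn from any real diagnostic system; the point is only that the whole recipe is written down explicitly, so anyone can read it back out of the code.

```python
# Hand-coded "expert" rules: the recipe is fully specified by the
# programmer in advance. Symptoms and diseases here are invented
# placeholders, not medical knowledge.
RULES = {
    frozenset({"fever", "cough", "fatigue"}): "flu",
    frozenset({"sneezing", "runny nose"}): "common cold",
    frozenset({"rash", "itching"}): "allergic reaction",
}

def diagnose(symptoms):
    """Return the first disease whose required symptoms all appear."""
    observed = set(symptoms)
    for required, disease in RULES.items():
        if required <= observed:  # rule fires when its symptoms are all present
            return disease
    return None

print(diagnose(["fever", "cough", "fatigue", "headache"]))  # -> flu
```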
My observation of ML, on the other hand, is that ML systems are coded so as not to assume that the recipe is known. At its heart, ML must discover the recipe that leads to the solution through observation and feedback from some authoritative source. And, to pick up the tangential connection I deferred at the top: the encoding of the discovered recipe is often opaque and difficult to understand, as the algorithms are typically statistical in nature.
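By contrast, here is an equally minimal sketch of a recipe being discovered rather than written down: a tiny perceptron is shown labeled examples (the "authority" providing feedback) and nudged toward the right answers on each mistake. The feature names and data are invented for illustration.

```python
# Nobody writes the diagnosis rules here. The perceptron is shown
# labeled examples and adjusts its weights on every mistake until it
# finds a workable recipe on its own. Invented toy data throughout.
FEATURES = ["fever", "cough", "fatigue", "sneezing", "runny nose"]

# Training examples the "authority" has labeled: 1 = flu, 0 = not flu.
TRAINING = [
    ([1, 1, 1, 0, 0], 1),
    ([1, 1, 0, 0, 0], 1),
    ([0, 0, 0, 1, 1], 0),
    ([0, 1, 0, 1, 1], 0),
]

weights = [0.0] * len(FEATURES)
bias = 0.0

def predict(x):
    activation = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if activation > 0 else 0

# Feedback loop: each wrong answer nudges the weights toward the label.
for _ in range(20):
    for x, label in TRAINING:
        error = label - predict(x)
        if error:
            weights = [w + 0.1 * error * xi for w, xi in zip(weights, x)]
            bias += 0.1 * error

print(predict([1, 1, 1, 0, 0]))      # -> 1, a flu-like case
print(dict(zip(FEATURES, weights)))  # the discovered "recipe": bare numbers
```

Print the learned weights and you get a row of numbers, not the if-then statements a programmer would have written; that, in miniature, is the opacity I mean.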
Last night it occurred to me that there’s an analogy to something other than human intelligence, and that’s human intuition. Intuition is
The faculty of knowing or understanding something without reasoning or proof. [wordnik]
Or, more accurately, reasoning without knowing the rules. In my observation about machine learning, above, I suggested that for something to qualify as such, the algorithm must work out the rules from experience rather than have them encoded by the programmer. That deduced set of rules isn’t necessarily explicable, and, to my mind, that obscurity suggests that what is currently called artificial intelligence, and is sometimes categorized as machine learning, might be better described as machine intuition.
And while I can’t think of how that will generally advantage us in the future, it always makes a scientist or engineer happier to have categorized something properly. It’s just the way we are. And my relative lack of respect for same is why I more or less inhabit the fringes of the profession.