AI - 1.04 - history - patterns
Another area of artificial intelligence research is in regard to pattern recognition. Human beings are very good at recognizing patterns. Human beings are also very good at seeing patterns which they have not seen before, and recognizing that they are patterns. Computers are no good at recognizing patterns at all. Computers will identify an exact match, but they have great difficulty in recognizing two items as being similar in any way, if they are not identical. (I tend to tell people that computers are bad at pattern recognition because they have no natural predators. Human beings got very good at recognizing patterns while watching for sabretooth tigers hidden in tall grass. The human beings that didn't recognize patterns quickly, didn't survive.)
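To make the exact-match point concrete, here is a minimal sketch in Python. The equality test is trivial for a computer, but judging "similar" requires an explicit algorithm; I use the standard library's `difflib.SequenceMatcher` as one illustrative way to score similarity (the example strings are my own, not from any particular system).

```python
from difflib import SequenceMatcher

a = "grandfather clock"
b = "grandfather c1ock"  # one character garbled

# Exact matching is easy: one character differs, so the
# strings simply are not equal.
print(a == b)  # False

# "Similar" has to be computed. SequenceMatcher's ratio gives a
# rough 0-to-1 similarity score based on matching subsequences;
# a human sees these as nearly the same, and so does the score.
score = SequenceMatcher(None, a, b).ratio()
print(round(score, 2))
```

The point is not that this particular score is the "right" notion of similarity, but that the computer has none at all until we hand it one.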
Pattern recognition is very important when we want to get computers to see something. Computer vision is an area that we have been working on for a great many years, indeed, a number of decades, and we still haven't got it completely right. Human children are very adept at recognizing patterns, and do it all the time. My grandson's first word was "clock," and he was very good at recognizing all kinds of different clocks, and identifying them as clocks. There was one clock that had numerals on the face, and was surrounded by a sunburst pattern. There was another wall clock where a number of the numerals had fallen off. It was mounted on a burl with irregular and ragged edges, but was still recognized as a clock. Wrist watches were also recognized as clocks, including his mother's wrist watch, which had absolutely nothing on the face of it except the hands. He recognized the pattern that made for a clock. As I say, this was his first word. He was probably about seven or eight months old when he started recognizing things as clocks.
Recognizing patterns is also important in speech recognition, that is, parsing out the words in verbal speech when we speak to computers. This is definitely not the same as voice recognition, which we use in biometric authentication. Recognizing words, despite different intonations, and possibly even dialects, is very important to being able to speak to computers and have them understand what we are saying. Similar types of pattern recognition are involved, first in parsing out the words that we speak, and then in parsing the meaning of what we say, whether as commands to the computer, or simply as typed-out words when we dictate to our phones.
Interestingly, the same type of pattern recognition also comes into play once the words have been identified, when we get the computer to do what we know as natural language processing: identifying what it is that we are requesting the computer to do, and identifying meanings in what we say.
Going back to computer vision, we are trying to improve computer vision in order to implement driverless cars. While computer vision is still imperfect, and we are constantly working to improve it, it is interesting to note, if you look at the actual statistics, that driverless cars are already better drivers than we are. Yes, you will hear a number of bad news reports about a driverless car that has failed, or stalled, or hit someone, or caused some kind of an accident. But driverless cars have driven millions of miles, and while a number of situations are still very tricky for them, the fact that any accident with a driverless car makes the news indicates how rare such accidents actually are. We cannot retrofit all the existing cars on the road with driving software, and not all the cars on the road have the necessary sensors, but if we did ban human drivers, and gave driving over to driverless cars, we would, even at this point of development, be saving lives.
One of the areas relating to this is that of fuzzy logic. As I have said, computers are good at finding an exact match, but very poor at finding something that is similar. Fuzzy logic is an attempt to implement the idea of "similar" in computers.
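One classic way fuzzy logic implements "similar" is with membership functions: instead of a yes/no answer to "is this temperature warm?", the computer gets a degree between 0 and 1. Here is a minimal sketch; the triangular shape and the particular temperature thresholds are illustrative choices of mine, not any standard.

```python
def warm_membership(temp_c):
    """Degree (0.0 to 1.0) to which a temperature counts as 'warm'.

    Triangular membership function: fully warm at 22 C, tapering
    linearly to 0 at 10 C and 34 C. Thresholds are illustrative.
    """
    if temp_c <= 10 or temp_c >= 34:
        return 0.0
    if temp_c <= 22:
        return (temp_c - 10) / 12
    return (34 - temp_c) / 12

# Crisp (classical) logic would force a yes/no cutoff;
# fuzzy logic returns a degree of belonging instead.
for t in (5, 16, 22, 30):
    print(t, round(warm_membership(t), 2))
```

A fuzzy controller combines several such degrees (warm AND humid, say, using min for AND) to get graded, "similar enough" behaviour out of exact-match machinery.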
An interesting point is that, at the same time that we are pursuing artificial intelligence with increasing vigor, we are also developing quantum computers. Quantum computing is quite different from traditional computing, and one of the areas in which quantum computers will probably excel is in regard to pattern recognition, and the identification of items or situations which are similar.
Introduction and ToC: https://fibrecookery.blogspot.com/2026/01/ai-000-intro-table-of-contents.html
Next: TBA