Pattern Recognition: A Double-Edged Sword

Nov 29, 2016 • Joe Schueller

Why is pattern recognition so great?

Just take a second to think about pattern recognition. Long ago, cavemen recognized the patterns of the seasons. This helped humans prepare for seasonal changes (…and not die). In more recent history, scientists and inventors spotted patterns in nature and society, and we got discoveries and cool stuff out of it. Sounds pretty sweet, right? Not so fast there…

Why does pattern recognition suck?

Even though humans are really good at recognizing patterns, sometimes we fall into the trap of spotting patterns that aren’t actually there. The stars in the night sky become gods and goddesses. The pattern I see in the stock market means I’m gonna lose my retirement account. The burn marks on my grilled cheese kinda look like a face (that one is pretty sweet!).

There is no way around it. If humans are good at spotting patterns, we are bound to pick up some false patterns as well. Our tendency to pick up false patterns actually has a name: Apophenia.

Where do we find false patterns?

False patterns crop up in situations where there is a lot of randomness. For example, a random blob of ink could look like a face, or a cloud could kinda look like a boat. Our minds are constantly trying to make sense of the world through patterns, whether or not there is any true meaning behind them.

How do we deal with it?

Most of the time, we are pretty good at determining whether the patterns we see are real. After a bit of investigation, we can usually conclude that something happened just by chance or was a big coincidence. Still, there is plenty of room for a false pattern to slip past our reasoning and logic.

There is also a certain level of risk that comes with false pattern recognition. False positives and false negatives in a medical setting carry a high amount of risk. Accounting for that risk can get complicated: it might involve attaching a percentage that represents the probability of error, or developing a margin of error for the results.
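To make that a little more concrete, here is a minimal sketch in Python (with made-up numbers, not real medical data) of how you might attach error rates to a batch of test results:

```python
# Hypothetical test results: each pair is (test_said_positive, actually_positive).
results = [
    (True, True), (True, False), (False, False), (False, True),
    (True, True), (False, False), (False, False), (True, False),
]

false_positives = sum(1 for predicted, actual in results if predicted and not actual)
false_negatives = sum(1 for predicted, actual in results if not predicted and actual)
actual_negatives = sum(1 for _, actual in results if not actual)
actual_positives = sum(1 for _, actual in results if actual)

# The "percentage that represents the probability of error" for each failure mode.
print(f"False positive rate: {false_positives / actual_negatives:.0%}")
print(f"False negative rate: {false_negatives / actual_positives:.0%}")
```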

Now we’ve already covered pattern recognition in humans, but we have yet to discuss pattern recognition in computers. Computers are getting (creepily) good at detecting patterns, especially faces. When uploading pictures to Facebook, you’re asked to tag certain people detected in the picture. But just like humans, computers can fall prey to false pattern recognition. “Yes, that person in the background of my cover photo does look a bit like my dad, but it’s not him… stupid Facebook.”

Why do computers recognize false patterns?

Reason 1. Computers recognize false patterns because humans are the ones programming them. Humans make mistakes, and those mistakes get programmed right into the computer. This isn’t really the computer’s fault, so we’ll just ignore this one.

Reason 2. Computers also recognize false patterns because they really suck at knowing the context of a situation. A pattern might yield good results in 95% of situations but fail in the other 5%. For example, a computer can find faces in a photo pretty easily, but it has trouble figuring out whether a face belongs to an actual person or is just a picture of a face (or a picture of a picture of a face).

Gosling and Culkin Shirt-ception
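A rough sketch of this, using OpenCV’s stock face detector (the image path is just a placeholder): the detector reports every face-like pattern it finds, whether the input is a live person, a poster, or a shirt like the one above. It sees pixels, not context.

```python
# Rough sketch: a stock face detector only sees pixel patterns, so it will
# happily report faces on shirts, posters, and pictures of pictures.
# Requires opencv-python; "photo.jpg" is a placeholder path.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("photo.jpg")  # a person, a poster, a printed shirt...
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

print(f"Found {len(faces)} face-like pattern(s)")
```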

So why is this dangerous?

Even though computers suck at knowing context, they are amazing at repetition and speed. Computers have no trouble repeating the same mistakes a hundred, a thousand, or a million times. They just follow the algorithms that were programmed.

The combination of limited context and speed can lead to things like the Flash Crash of 2010. In the Flash Crash, the stock market dropped rapidly for no apparent reason. Computer systems that recognized a sharp decline decided to sell, which only accelerated the drop. Within a matter of minutes, the market dropped about 9%. The trillion-dollar crash, which recovered soon afterward, prompted investigations into the role of automated systems in the stock market. Situations like this show the speed and scale at which computer systems operate.
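Here’s a toy feedback loop in Python (assumed numbers, not a model of the real 2010 crash) that shows how quickly a mistake compounds when every system follows the same “it’s falling, sell!” rule:

```python
# Toy simulation: a crowd of trading bots that all follow the same simple
# rule -- "the price is falling fast, so sell" -- turns a small dip into a plunge.
price = 100.0
last_price = 100.5  # a tiny initial dip gets the loop started

for minute in range(1, 11):
    recent_drop = max(last_price - price, 0)
    # Every bot reacts to the same drop and sells, which pushes the price down
    # further, which triggers even more selling on the next tick.
    sell_pressure = 1.5 * recent_drop
    last_price, price = price, max(price - sell_pressure, 0.0)
    print(f"minute {minute:2d}: price = {price:6.2f}")
```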

In conclusion…

False patterns that humans just “obviously know are false” pose a real danger when we build pattern recognition systems for computers. As technology and research advance, we will have to be mindful of the speed and scale at which computers operate. We should create systems that recognize their own limits and fail gracefully.
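What might “recognizing its own limits” look like? One simple, hypothetical version: don’t act on a match unless the confidence is high enough, and hand the decision back to a human otherwise.

```python
# Sketch of failing gracefully: below a confidence threshold, the system
# admits uncertainty instead of tagging the wrong person. Names and numbers
# here are made up for illustration.
def tag_face(match_name: str, confidence: float, threshold: float = 0.9) -> str:
    if confidence >= threshold:
        return f"Tagged as {match_name}"
    return "Not sure who this is -- asking the user instead"

print(tag_face("Dad", confidence=0.97))  # confident match, auto-tag
print(tag_face("Dad", confidence=0.55))  # looks a bit like Dad, but defer
```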
