I have a breakthrough article to share with you. It’s about a technology that detects skin cancer. Before I tell you about that, however, I need to teach you a few things. For example, do you know what AI is? How about machine learning? What about CNN? (This column is a nonpolitical arena, so, no, not that CNN.)

AI stands for artificial intelligence. We are surrounded by it – computers, cars, and cell phones all use AI. AI describes a machine with the ability to solve problems, to create, to understand, to learn. These are characteristics we call “intelligence” – hence, artificial intelligence.

When machines do things that we recognize as human, we describe them in anthropomorphic terms. Alexa “listens” for my voice, my MacBook Pro “sees” me in photos, and Siri “understands” me. And now, when computers get better through practice, we say they “learn,” thus “machine learning.” But how?

You and I intuitively know that a picture of a chair is a chair. This is true of a folding chair, a Barcelona chair, or a Ghost chair. This ability – to intuit – is a hallmark of humans. Computers don’t intuit; they learn. We don’t need to study 3 million chairs to identify chairs. (Nor could we study 3 million pictures of chairs, a feat that would take years.) Computers, in contrast, can review 3 million pictures of chairs. And learn. In minutes.

Not only do computers learn from millions of examples, they also layer learning. For example, one layer of programming looks only for lines that appear to be the legs of chairs. This information is then passed on to another layer that looks for seats, then another for backs, then another and another until a final layer puts it all together. Do these layers remind you of something we all learned in medical school? They are analogous to the mammalian visual cortex! In the brain, one layer of neurons talks with another. In machines, one layer of programs pushes information to another. We call these machine layers “neural networks.” A convolutional neural network, or CNN, therefore, describes a complex layered network that is analogous to the brain’s cortex. The implications are astounding.
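For the curious, here is what those stacked layers look like in code. This is a toy sketch written in Python with the PyTorch library – it is not the Stanford team’s model or anyone’s real chair detector. The TinyCNN name, the layer sizes, and the two-class output are all made-up illustrations.

```python
# A toy sketch of layered learning. Layer sizes and names are
# illustrative assumptions, not a real model's architecture.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            # First layer: detects simple patterns, such as the lines
            # that might be chair legs.
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            # Next layer: combines those lines into parts, such as
            # seats and backs.
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Final layer: puts the parts together into a single answer.
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)

    def forward(self, x):
        x = self.features(x)       # one layer pushes information to the next
        x = torch.flatten(x, 1)    # flatten the feature maps into a vector
        return self.classifier(x)  # "chair" vs. "not a chair"

model = TinyCNN()
fake_photo = torch.randn(1, 3, 224, 224)  # one 224x224 RGB "photo"
print(model(fake_photo).shape)            # torch.Size([1, 2])
```

Each nn.Conv2d line is one layer of pattern detectors; stack enough of them and you have the “deep” network described below.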

Things get interesting when a CNN is given a complex task to learn and a massive set of examples to learn from. With recent advances in chips called GPUs (graphics processing units), deeply nested layers of programs can accomplish difficult tasks like recognizing faces, understanding voices, and avoiding a bicyclist on a foggy day. Self-driving cars, airport security, and voice-activated assistants all rely on this “deep learning.” And they are getting smarter every day.
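Again for the curious, the GPU part is nearly a one-liner in code. This sketch reuses the hypothetical TinyCNN from above and simply moves it – and a batch of photos – onto a graphics chip when one is available.

```python
# A toy sketch of running a network on a GPU, reusing the hypothetical
# TinyCNN class from the earlier sketch.
import torch

# Use a GPU if one is present; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = TinyCNN().to(device)                      # push the layers onto the chip
photos = torch.randn(64, 3, 224, 224).to(device)  # a batch of 64 fake photos

with torch.no_grad():                             # scoring only, no training
    scores = model(photos)                        # all 64 scored in one pass
print(scores.shape)                               # torch.Size([64, 2])
```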

So, now when I say a team at Stanford University has used a CNN and deep learning to diagnose melanoma from pictures, you’ll understand what I mean. And you’ll realize computers can do something heretofore unthinkable – make diagnoses as accurately as a doctor. That story should make you both a little giddy and afraid. But wait, there’s more! Read all about it next time.

Dr. Benabio is a partner physician and chief of service for the department of dermatology of the Southern California Permanente Group in San Diego. Dr. Benabio is @Dermdoc on Twitter. Write to him at dermnews@frontlinemedcom.com. He has no disclosures related to this column.