What are the Main Types of Neural Networks?: CNNs vs RNNs

By now, you’ve begun to familiarize yourself with neural networks and just how important they are to the continued success of the Artificial Intelligence industry. Still, once you delve into the technical details of Artificial Neural Networks (ANNs), it’s easy to get lost in the weeds. More than one type of ANN exists, and each one functions differently. In truth, most ANNs can be boiled down to two main types, which are the central focus of this post.

To set the scene, it’s important to review what backpropagation is. In a basic sense, it’s the process by which most neural networks learn. To be more specific, forward propagation sends input data through a network’s hidden layers until it emerges as a specific sort of output; backpropagation then sends the resulting error back through those same layers, so the network can learn from its own mistakes.

Backpropagation builds on that forward pass: by measuring how far the output fell from the ideal answer and adjusting each weight accordingly, it lets the ANN improve on the data it just processed, especially when said data does not come out in an ideal fashion. In effect, this algorithm is what gave rise to the research trends that treat AI as capable of human-like “learning.”
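To make this concrete, here is a minimal sketch of one forward and backward pass for a toy one-hidden-layer network, written with NumPy. All of the function and variable names here are illustrative, not from any particular library:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, y, W1, W2, lr=0.1):
    # Forward propagation: data flows through the hidden layer to the output.
    h = sigmoid(x @ W1)
    y_hat = sigmoid(h @ W2)
    # Backpropagation: the output error is pushed back through the layers
    # and each weight matrix is nudged to reduce that error.
    err = y_hat - y                   # how far the output missed the target
    d2 = err * y_hat * (1 - y_hat)    # gradient at the output layer
    d1 = (d2 @ W2.T) * h * (1 - h)    # gradient carried back to the hidden layer
    W2 -= lr * np.outer(h, d2)
    W1 -= lr * np.outer(x, d1)
    return y_hat, W1, W2
```

Calling `train_step` repeatedly on the same example nudges the weights so the prediction moves toward the target; that repeated nudging is the “learning from mistakes” described above.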

With all of this in mind, we can more easily move into a discussion of the two primary types of artificial neural networks mentioned above. Convolutional Neural Networks (CNNs) are feed-forward networks: data flows through them in one direction, from input to output, and they specialize in analyzing grid-like data such as images. With that said, it is logical to wonder what makes them different from Recurrent Neural Networks (RNNs).
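To illustrate the feed-forward idea, here is a bare-bones 2D convolution of the kind a CNN layer applies, again with illustrative names; real CNN libraries add channels, strides, and padding on top of this:

```python
import numpy as np

def conv2d(image, kernel):
    # Slide the kernel over the image (no padding, stride 1) and take a
    # weighted sum at each position -- a single forward pass, no loops back.
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```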

What this comes down to is how data flows through each sort of network. A CNN, like any feed-forward ANN, processes each input in a single pass from input to output. An RNN, by contrast, contains loops: the hidden state produced at one step is fed back in at the next, so the network carries a kind of memory across a sequence. Training such a network means unrolling those loops and backpropagating through every time step, a technique known as backpropagation through time. Some sources say this looping memory makes RNNs the most “human-like” AI systems around, and stacking RNN layers together yields one of the architectures associated with “Deep Learning.”
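The loop itself can be sketched in a few lines. Note how the same weight matrices are reused at every time step while the hidden state `h` carries information forward; the names here are illustrative:

```python
import numpy as np

def rnn_forward(inputs, Wx, Wh, h0):
    # Process a sequence one step at a time. The hidden state h is the
    # "loop": each step's output is fed back in as part of the next step.
    h = h0
    states = []
    for x in inputs:
        h = np.tanh(x @ Wx + h @ Wh)
        states.append(h)
    return states
```

Because `h` threads through every step, the final state depends on the entire sequence, not just the most recent input.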

Imagine a loop that, theoretically, might never end. Imagine the insights that could be discovered, given the level of optimization such an AI is afforded.

As a general rule, industry experts appear to believe that the deeper a network is, the more it can understand about the data it takes in. Both CNNs and RNNs can be built deep, but because RNNs also add depth through time, you can think of CNNs as a sort of first iteration of a neural network and RNNs as an evolved version of them.

Furthermore, as mentioned above, both network types learn through backpropagation; RNNs simply apply it through time as well as through layers. In the end, I hope that this quick note on the differences inherent in ANNs helps you progress on your journey toward understanding all things AI and ML. If you’re interested in continuing the discussion, check out our list of suggested resources below.

In the future, we’ll take a more nuanced look at the various use cases of both CNNs and RNNs, as well as the importance of backpropagation. Until then, keep in mind that backpropagation is most often used in Supervised Learning, so any further research on this subject should keep that connection in mind.

Resources:

https://towardsdatascience.com/a-comprehensive-guide-to-convolutional-neural-networks-the-eli5-way-3bd2b1164a53

http://news.mit.edu/2019/how-tell-whether-machine-learning-systems-are-robust-enough-real-worl-0510

https://medium.com/explore-artificial-intelligence/an-introduction-to-recurrent-neural-networks-72c97bf0912

https://skymind.ai/wiki/backpropagation

About Ian LeViness 113 Articles
Professional Writer/Teacher, dedicated to making emergent industries accessible to the general populace