Nand Kishor, Contributor

Nand Kishor is the Product Manager of House of Bots. After finishing his studies in computer science, he ideated and relaunched a Real Estate Business Intelligence Tool, creating one of the leading business intelligence tools for property price analysis in 2012. He also writes, researches, and shares knowledge about Artificial Intelligence (AI), Machine Learning (ML), Data Science, Big Data, the Python language, and more.



3 Thoughts on Why Deep Learning Works So Well

By Nand Kishor | Email | Apr 11, 2018 | 18,069 Views

While answering a question posed in his recent Quora Session, Yann LeCun shared three high-level thoughts on why deep learning works so well.

Last week, deep learning research leader Yann LeCun took part in a Quora Session, during which he answered questions from community members on a wide variety of (mostly machine/deep learning) topics.

During the session, this question was posed:

When will we see a theoretical background and mathematical foundation for deep learning?

The answer turned into an eloquent overview of three particular thoughts on why deep learning works so well. Here is a quick summary.

LeCun's first point, which speaks directly to why deep learning works so well, is as follows:

One theoretical puzzle is why the type of non-convex optimization that needs to be done when training deep neural nets seems to work reliably.

The main idea here is that poor local minima are rare in very high-dimensional spaces, so greedy gradient-based optimization does not get trapped in a "box." As LeCun states:

It's hard to build a box in 100 million dimensions.
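
To make this concrete, here is a minimal NumPy sketch (our illustration, not part of LeCun's answer): even the loss surface of this tiny two-layer network is non-convex, yet plain gradient descent reliably drives the loss toward zero from a random start.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # XOR inputs
    y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)  # hidden layer
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)  # output layer
    lr = 0.5

    for step in range(2000):
        h = np.tanh(X @ W1 + b1)              # forward pass
        p = 1 / (1 + np.exp(-(h @ W2 + b2)))  # sigmoid output
        loss = np.mean((p - y) ** 2)

        # backward pass: hand-derived gradients for this tiny net
        dz = 2 * (p - y) / len(X) * p * (1 - p)
        dW2, db2 = h.T @ dz, dz.sum(0)
        dh = dz @ W2.T * (1 - h ** 2)
        dW1, db1 = X.T @ dh, dh.sum(0)

        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(f"final loss: {loss:.4f}")  # typically near zero despite non-convexity

Despite the many symmetric, non-convex directions in this weight space, gradient descent rarely gets boxed in, which is the behavior LeCun is pointing at.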

Moving on, LeCun introduces his next point as:

Another interesting theoretical question is why multiple layers help.

The point here, beyond LeCun's acknowledgment that the reasons are not completely understood, is that multiple layers let a network implement complex functions more concisely. While he points out that computer scientists are accustomed to the idea of sequential steps and multiple stages of computation, that intuition does not fully explain why multiple layers in deep neural networks work as they do. A small sketch of the conciseness argument follows.
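
As a hedged illustration (our example, not from LeCun's answer), depth buys exponential conciseness in a classic construction: composing a tiny two-unit ReLU "tent" layer with itself k times traces a sawtooth with 2**k linear pieces, while a single hidden layer needs on the order of 2**k units to match it.

    import numpy as np

    def tent(x):
        # a two-unit ReLU layer computing the tent map on [0, 1]:
        # 2*relu(x) - 4*relu(x - 0.5)
        return 2 * np.maximum(x, 0) - 4 * np.maximum(x - 0.5, 0)

    x = np.linspace(0, 1, 9)
    deep = x
    for _ in range(3):        # 3 layers of 2 ReLU units each (6 units total)
        deep = tent(deep)

    # "deep" now oscillates through 2**3 = 8 linear pieces; a one-hidden-layer
    # ReLU net needs roughly 8 units for the same function, and the gap grows
    # exponentially with depth.
    print(np.round(deep, 2))  # [0. 1. 0. 1. 0. 1. 0. 1. 0.]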

For his last point, he turns to a specific neural network architecture.

A third interesting question is why ConvNets work so well.

An interesting question indeed. LeCun cites an article as further reading on why ConvNet architectures are the right fit for certain classes of signals, touching on the fact that ConvNets work very well on some types of signals, spatial ones in particular. Sorting out why this is so is one question; simply noting that it is true puts into perspective just how well ConvNets work when they do.
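
The structural match to spatial signals is easy to see in code. Below is a minimal NumPy sketch (our illustration; the toy image and filter are made up): a single 3x3 filter slides across the whole image, so the same nine shared weights detect a local pattern wherever it appears.

    import numpy as np

    def conv2d(image, kernel):
        # valid cross-correlation: slide the kernel over every position
        kh, kw = kernel.shape
        H, W = image.shape
        out = np.zeros((H - kh + 1, W - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
        return out

    image = np.zeros((8, 8))
    image[2:5, 2:5] = np.eye(3)   # a diagonal stroke somewhere in the image
    detector = np.eye(3)          # a 9-weight "diagonal" filter

    response = conv2d(image, detector)
    print(np.unravel_index(response.argmax(), response.shape))  # fires at (2, 2)

    # A fully connected layer mapping the 8x8 input to the 6x6 output would
    # need 64 * 36 = 2304 weights; the conv layer reuses the same 9 everywhere.

Locality and weight sharing are exactly the priors that spatial signals such as images obey, which is one way to read LeCun's point.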

Although Yann LeCun was answering a question about when we can expect a mathematical foundation for deep learning to emerge, in doing so he provided valuable insight into why deep learning functions as well as it does.

Read the rest of his Quora Session here.

Source: HOB