
Beyond Backpropagation

Advances in Machine Learning Research

Gartner predicts a very bright near future for machine learning. 2015 was the peak of inflated expectations; 2016 is the period of disillusionment; and the plateau of productivity should be reached in 2017. Elsewhere this process usually lasts about ten years. One of the most popular branches of modern machine learning is "Deep Learning", which is another name for neural networks with a few more layers, perhaps with convolution and/or recursion. Until now, the learning of such networks has usually been based on gradient descent: the slow, iterative, unreliable process named Backpropagation. That kind of learning is very demanding and extensive: on an ordinary computer it can last for hours or even many days, and it often concludes unsuccessfully. Recently, two algorithms have appeared that significantly improve this kind of machine learning: "Bipropagation" and the "Border Pairs Method".

The Bipropagation algorithm is still iterative, like Backpropagation, but the internal layers are no longer hidden: their desired values are calculated in advance, before learning begins. Machine learning can therefore be conducted layer by layer, which represents a great improvement (it can be more than a few tens of times faster).
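The layer-by-layer idea can be sketched as follows. This is only a minimal illustration, not the published algorithm: it assumes the intermediate (no longer hidden) targets are simply supplied up front, here chosen by hand for XOR so that each layer faces a linearly separable sub-problem, and each layer is then trained independently with the delta rule.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_layer(X, T, epochs=4000, lr=0.5, seed=0):
    """Train one sigmoid layer so that sigmoid(X @ W + b) ~ T (delta rule)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.5, size=(X.shape[1], T.shape[1]))
    b = np.zeros(T.shape[1])
    for _ in range(epochs):
        Y = sigmoid(X @ W + b)
        grad = (Y - T) * Y * (1 - Y)      # gradient of squared error for sigmoid units
        W -= lr * X.T @ grad
        b -= lr * grad.sum(axis=0)
    return W, b

# XOR data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)

# Hypothetical desired values for the internal layer, fixed in advance
# (here: OR and AND of the inputs), so each layer learns on its own.
H_target = np.array([[0, 0], [1, 0], [1, 0], [1, 1]], float)

W1, b1 = train_layer(X, H_target)          # first layer, trained alone
H = sigmoid(X @ W1 + b1)
W2, b2 = train_layer(H, T)                 # second layer, trained on first layer's output
Y = sigmoid(H @ W2 + b2)
print(np.round(Y).ravel())                 # expect [0. 1. 1. 0.]
```

Because each layer has its own target, no error needs to be propagated backwards through the network; each layer is a small, independent learning problem.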

The "Border Pairs Method" is a totally new algorithm with many advantages over the Backpropagation algorithm. First, we look for pairs of patterns of opposite classes that lie so close together that no third pattern lies between them. These are the border pairs, and only these patterns are significant when we want to draw a borderline between the classes. The majority of the learning patterns (sometimes more than 90%) is therefore eliminated even before learning begins. We then try to separate all border pairs with a minimal number of borderlines, which represent the neurons of the first layer, and so we find the minimal structure of the neural network. While we draw the borderlines one by one, the first layer is trained one neuron at a time. Since these neurons have a hard limiter, they produce binary output values, so the subsequent layers could even be replaced with a logic circuit. Border pairs also allow a simple reduction of the noise.
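The first step, finding the border pairs, can be sketched as below. This is a rough illustration under one assumption not spelled out above: "no third pattern lies between them" is interpreted here as no other pattern falling strictly inside the hypersphere whose diameter is the segment joining the pair.

```python
import numpy as np

def border_pairs(X, y):
    """Return index pairs (i, j) of opposite-class patterns with no third
    pattern between them ('between' = strictly inside the hypersphere whose
    diameter is the segment i-j; this interpretation is an assumption)."""
    X = np.asarray(X, float)
    n = len(X)
    pairs = []
    for i in range(n):
        for j in range(i + 1, n):
            if y[i] == y[j]:
                continue                      # only opposite-class pairs qualify
            center = (X[i] + X[j]) / 2
            radius = np.linalg.norm(X[i] - X[j]) / 2
            blocked = any(
                k not in (i, j) and np.linalg.norm(X[k] - center) < radius
                for k in range(n)
            )
            if not blocked:
                pairs.append((i, j))
    return pairs

# Tiny 1-D example: the classes meet between x=2 and x=3,
# so (1, 2) should be the only border pair.
X = [[0.0], [2.0], [3.0], [5.0]]
y = [0, 0, 1, 1]
print(border_pairs(X, y))  # [(1, 2)]
```

The patterns at x=0 and x=5 are far from the class boundary and never form a border pair, which illustrates how most patterns can be discarded before any neuron is trained.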

More info here.

Please feel free to press the G+ button if you like these algorithms.


Popular posts from this blog

Bipropagation demo in TensorFlow

Bipropagation is a new Deep Learning algorithm. It is much faster and much more reliable than Backpropagation. Here is the demo from ResearchGate and GitHub. The inner layers of the neural network are no longer hidden, and learning is done layer by layer with far fewer iterations. Please cite me in your work.

Click the G+ button if you like this demo. Any comments are welcome.

A new Deep Learning Algorithm: One-Step Method

We are living in the AI era, where progress accelerates every single day. Here is another discovery in this field: the One-Step Method, a new machine learning algorithm that can do many things; among others, it can replace digital circuits with neurons and can find an even better construction of the neural network than the Border Pairs Method. You can find more in the 3rd chapter of our book, Machine Learning: Advances in Research and Applications, from Nova Science Publishers.

This new algorithm is also suitable for Deep Learning in combination with other methods such as convolutional learning, bipropagation, the Border Pairs Method, autoencoders, and others.

Three new Deep Learning Algorithms at IBM Developers Unconference 2018

I was an invited speaker at the IBM Developers Unconference 2018 and at the IBM research lab, both located in Zurich, Switzerland, where I presented three new Deep Learning algorithms (click to view). One of them (the Border Pairs Method) has 11 advantages over the famous Backpropagation. The audience was large, the response was good, and the talk was followed by a lively debate.