
Beyond Backpropagation


Advances in Machine Learning Research



Gartner predicts a very bright near future for machine learning. 2015 was the peak of inflated expectations, 2016 is the trough of disillusionment, and the plateau of productivity should be reached in 2017; elsewhere this cycle usually takes about ten years. One of the most popular branches of modern machine learning is called "Deep Learning", which is another name for neural networks with a few more layers, sometimes with convolution and/or recursion. Learning in networks of this kind has until now usually been based on gradient descent: the slow, iterative, unreliable process called Backpropagation. That kind of learning is computationally demanding. On an ordinary computer it can take hours or even many days, and it often ends unsuccessfully. Two algorithms have recently appeared that significantly improve this kind of machine learning: "Bipropagation" and the "Border Pairs Method".


The Bipropagation algorithm is still iterative like backpropagation, but the internal layers are no longer hidden, because their desired values are calculated in advance, before learning begins. Machine learning can therefore be conducted layer by layer, which is a great improvement (it can be more than a few tens of times faster).
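The layer-by-layer idea can be sketched roughly as follows. The post does not specify how the desired internal values are computed, so the interpolation rule below is only an illustrative assumption: each internal layer is given a target that lies part-way between the input patterns and the final targets, after which every layer can be trained on its own, since both its input and its desired output are known in advance.

```python
import numpy as np

def bipropagation_targets(X, Y, n_layers):
    """Illustrative sketch only: assign each internal layer a desired
    output by linearly interpolating between the input patterns X and
    the final targets Y.  The actual bipropagation rule for computing
    internal targets may differ; this assumes X and Y have equal width."""
    return [X + (k / n_layers) * (Y - X) for k in range(1, n_layers + 1)]

# With these precomputed targets, each layer becomes an independent
# single-layer learning problem (e.g. trainable with the delta rule),
# instead of one monolithic backpropagation run through all layers.
```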


"Border pairs method" is a totally new algorithm which has many advantages over "backpropagation" algorithm. Firstly we look for the pairs of patterns of opposite class which are so near, that no third pattern does lie in between them. This is the Border pairs and only this patterns are significant when we want to draw a borderline between classes. So the majority of learning patterns (sometimes more than 90%) is eliminated even before the learning begins. Then we are trying to separate all border pairs with a minimal number of border lines, which represent neurons of the 1st layer, so we find the minimal structure of the neural network. While we draw one by one borderline, the learning of the first layer is done with only one neuron at the same time. Since these neurons have a hard limiter, they put binary output values and so the next layers could even be replaced with a logical circuit. Border pairs also allow a simple reduction of the noise.


More info here.

Please feel free to press the G+ button if you like these algorithms.

