

Showing posts with the label "deep learning"

Bipropagation demo in TensorFlow

Bipropagation is a new deep learning algorithm. It is much faster and more reliable than backpropagation. A demo is available on ResearchGate and GitHub. The inner layers of the neural network are no longer hidden: learning proceeds layer by layer, with far fewer iterations. Please cite me in your work. Click the G+ button if you like this demo. Comments are welcome.
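The post does not spell out the layer-by-layer procedure, but one common reading of bipropagation is that each inner layer is given an explicit target interpolated between the network input and the desired output, so every layer can be trained on its own with a simple delta rule. Below is a minimal NumPy sketch under that assumption; the toy data, layer sizes, and learning rate are illustrative choices, not taken from the demo.

```python
import numpy as np

# Hedged sketch of the bipropagation idea, assuming each layer's target is
# an interpolation between the input and the desired output. Not the
# author's reference implementation.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: 2-D inputs; class = which coordinate is larger, encoded as a
# soft one-hot target in [0.1, 0.9] so sigmoid outputs can reach it.
X = rng.uniform(0.1, 0.9, size=(200, 2))
Y = np.where(X[:, :1] > X[:, 1:], [[1.0, 0.0]], [[0.0, 1.0]]) * 0.8 + 0.1

n_layers = 2
layers = []
inp = X
for k in range(1, n_layers + 1):
    # Internal target: morph the input toward the final target, step by step.
    # Because the targets are explicit, this layer is no longer "hidden".
    T = X + (k / n_layers) * (Y - X)
    W = rng.normal(0.0, 0.5, size=(inp.shape[1], T.shape[1]))
    b = np.zeros(T.shape[1])
    for _ in range(2000):               # plain delta rule on this single layer
        out = sigmoid(inp @ W + b)
        grad = (T - out) * out * (1.0 - out)
        W += 0.5 * inp.T @ grad / len(inp)
        b += 0.5 * grad.mean(axis=0)
    layers.append((W, b))
    inp = sigmoid(inp @ W + b)          # this layer's output feeds the next

acc = np.mean((inp[:, 0] > inp[:, 1]) == (Y[:, 0] > Y[:, 1]))
print(f"training accuracy: {acc:.2f}")
```

Because each layer is fitted against a known target, there is no error signal to propagate backwards through the stack, which is where the claimed speed-up comes from.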

A historic moment for artificial intelligence

Deep learning, and with it artificial intelligence as a whole, is currently flourishing: over the past year several important and widely noted results have been achieved, such as image and speech recognition, translation of spoken language, and text generation from given keywords. Today, 12 March 2016, another success was added to this collection. Google's deep neural network AlphaGo defeated the world's best Go player, Lee Sedol.

[Image: the game board of the Asian strategy game Go]

In this Asian strategy game, two opponents take turns placing black and white stones on a 19 × 19 grid. An opponent's stones that are surrounded on all four sides are removed from the board, and the goal is to control as large a part of the board as possible. The score of this historic duel between human and machine was 3 to 0 in favour of the machine. After the match, the defeated Lee said he was positively surprised by the capability of artificial intelligence. Ta d...

Bipropagation demonstration in MATLAB

Here is an example of the "bipropagation" algorithm for learning neural networks. It is written in the MATLAB language (R2015a) and is kept as similar as possible to the "deep learning" example "AutoencoderDigitsExample.m" included in MATLAB's Neural Network Toolbox, so you can easily compare the two algorithms. I believe my algorithm has a few advantages over the autoencoder. Please tell me what you think about it, and please cite me in your works. Thanks a lot.

Download demo

======================================================
%% Training a Deep Neural Network for Digit Classification
% This example shows how to use the Neural Network Toolbox(TM) to train a
% deep neural network to classify images of digits and is very similar to
% "AutoencoderDigitsExample.m", which is included in the Neural Network
% Toolbox from MATLAB. This example is made for comparison of both
% algorithms.
%
% Neural networks with multiple hidden layers can be usefu...

Optimization for Multi Layer Perceptron: Without the Gradient

Recently, the publishing house Nova Publishers released a book entitled Advances in Machine Learning Research. It includes a chapter entitled OPTIMIZATION FOR MULTI LAYER PERCEPTRON: WITHOUT THE GRADIENT, in which I describe two new algorithms for neural network learning (Bipropagation and the Border Pairs Method). Both are much more powerful than their predecessor, the Backpropagation algorithm. The second algorithm is, among other things, constructive.

Abstract of the book chapter

During the last twenty years, gradient-based methods have dominated the feed-forward artificial neural network learning field. They are derivatives of the Backpropagation method and share several deficiencies, including the inability to cluster and reduce noise, to quantify data quality, and to eliminate redundant learning data. Other potential areas for improvement have been identified, including the random initialization of the values of fre...

Border Pairs Method—constructive MLP learning classification algorithm

The Border Pairs Method (BPM) is a new constructive method for supervised learning of the multilayer perceptron (MLP), which calculates the values of weights and biases directly from the geometry of the learning patterns. To assess BPM's capabilities, we compared it with three other supervised machine learning methods: Backpropagation, SVM, and Decision Trees. The comparison was made on six databases: XOR, Triangle, Iris, Pen-Based Recognition of Handwritten Digits, Online Pen-Based Recognition of Handwritten Digits, and synthetically generated noisy data. The Border Pairs Method found a near-minimal MLP architecture in all described cases. For the classification of Iris Setosa, only two border pairs (just four patterns out of 150) were enough to learn the whole data set correctly. In the classification of the 'Pen-Based Recognition of Handwritten Digits' dataset, only 200 learning patterns were used for learning. The BPM correctly identified more than 95% of the 3498 handwritten digits, ...
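As a rough illustration of the geometric idea only (not the published algorithm), a "border pair" can be read as two patterns of opposite classes that are each other's nearest opposite-class neighbours, so the pair straddles the decision boundary. A minimal NumPy sketch under that assumption:

```python
import numpy as np

# Hedged sketch of one plausible first step of BPM: finding mutually
# nearest opposite-class pattern pairs. The pairing rule is an assumption
# for illustration, not the method as published.

def border_pairs(X, y):
    """Return index pairs (i, j), i < j, where the two patterns belong to
    different classes and each is the other's nearest opposite-class
    neighbour."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    d[y[:, None] == y[None, :]] = np.inf   # ignore same-class distances
    nearest = d.argmin(axis=1)             # nearest opposite-class neighbour
    pairs = []
    for i, j in enumerate(nearest):
        if nearest[j] == i and i < j:      # mutual nearest -> border pair
            pairs.append((i, j))
    return pairs

# 1-D toy set: two classes separated around x = 2.5
X = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
y = [0, 0, 0, 1, 1, 1]
print(border_pairs(X, y))   # → [(2, 3)]
```

Only the pair (2, 3) survives, which matches the Iris Setosa remark above in spirit: a handful of boundary-straddling patterns can carry all the information needed to place a separating hyperplane.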