My Realm

Adventure Time: HAL9000 recommending you Daisy Bell and some Future House



A god's gift, that's what I would call the invention of Neural Networks ("God" here being the creator of Neural Networks; atheist here). I am not a strong maths person, actually not a maths person at all. But for my little HAL9000 I had to learn the elvish language. Not like pick up books and sleep with them; please, Neural Network enthusiasts don't need to do that to begin their adventure. I started with the Udacity course, which covered Decision Trees, Regression, and finally Neural Networks (and a lot more, which I skipped). And boy, I saw that math after a long time. Honestly, I hadn't done any calculus in a year or two. But then all of those basic concepts came rushing back and made me realize it is not that tough. For those who are in it and have just begun: Gradient Descent IS Back Propagation. I had to search to learn that; no one wrote or said it directly. And since that course was very theoretical, I had to look somewhere else and found the perfect place. Perfect because it is text (I love text, text over videos any day), and also because it is practical. With knowledge in hand and laziness in heart, I searched for an ANN library for Java and there it was: Neuroph. Simple, easy and powerful. Well, it became simple, easy and powerful after a lot of Googling. I find the lack of a proper wiki disturbing.
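To make that "Gradient Descent IS Back Propagation" realization concrete, here is a minimal sketch in plain Java (not Neuroph) of gradient descent on a single sigmoid neuron. Back propagation is this same weight-update rule applied through every layer via the chain rule. All the names and numbers here are illustrative, not from any library:

```java
// Minimal sketch of gradient descent on a single sigmoid neuron.
// Backpropagation is this same update rule, chained through every
// layer of a network. Names and values are illustrative only.
public class GradientDescentSketch {

    static double sigmoid(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

    // Trains one neuron on a single example (x = 1.0, target = 1.0)
    // and returns the neuron's final output.
    static double train() {
        double w = 0.5, b = 0.0;      // initial weight and bias
        double x = 1.0, target = 1.0; // one training example
        double rate = 0.5;            // learning rate

        for (int i = 0; i < 1000; i++) {
            double out = sigmoid(w * x + b);
            // For squared error E = (out - target)^2 / 2, the chain rule gives
            // dE/dw = (out - target) * out * (1 - out) * x
            double delta = (out - target) * out * (1 - out);
            w -= rate * delta * x;    // step downhill along the gradient
            b -= rate * delta;
        }
        return sigmoid(w * x + b);
    }

    public static void main(String[] args) {
        System.out.println(train()); // creeps toward the target of 1.0
    }
}
```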

Now, now. I have some basic idea of how ANNs work, I have a library to make them, and I have successfully tested it. What should I do next? The answer is to make an effing service built on top of the Grid, because people hate an empty jar of cookies. I might not have told you before, but we got the network working (the Grid, not the neural one); we have tested it, and it can send packets and do amazing stuff, which it currently doesn't do. So if you think you can make amazing stuff on it, ping me. So, services. One thing we plan to do is a media streaming app: just listen to cool music, other people's cool music, without the internet. Think of it like Twitch. Actually, exactly like Twitch; there goes our originality. Speaking of originality, I thought of making a recommendation system based on what your taste in music is, and then suggesting streams based on that.

Plan:

To have a multilayer perceptron with 128 input neurons, 64 neurons in the hidden layer, and 1 output. I will get the raw data from the MP3 as shown here, and as that raw data is sliced, I will take 128 data points from each slice. For testing purposes I will just mark each slice's data points as good or bad (0.0 or 1.0). So a full song will create a huge data set, with each row having 128 inputs and one output. That data set will be fed to the ANN, and it will learn through back propagation.
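A rough sketch of how those rows could be built, in plain Java. The `pcm` array and the label value are placeholders; the real samples would come from the MP3 decoding step mentioned above:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: turn raw PCM samples into training rows of 128 inputs plus
// one output label (0.0 = bad, 1.0 = good). Names are illustrative.
public class SliceDataSet {
    static final int INPUTS = 128;

    // Each returned row has INPUTS + 1 values: 128 data points + label.
    static List<double[]> slice(double[] pcm, double label) {
        List<double[]> rows = new ArrayList<>();
        for (int start = 0; start + INPUTS <= pcm.length; start += INPUTS) {
            double[] row = new double[INPUTS + 1];
            System.arraycopy(pcm, start, row, 0, INPUTS);
            row[INPUTS] = label;   // target for the single output neuron
            rows.add(row);
        }
        return rows;
    }

    public static void main(String[] args) {
        double[] fakePcm = new double[1280]; // stand-in for decoded MP3 data
        List<double[]> rows = slice(fakePcm, 1.0);
        System.out.println(rows.size() + " rows of " + rows.get(0).length);
        // prints: 10 rows of 129
    }
}
```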

Execution:



Result:

Not so epic fail. Yes, the ANN can recognize when the test song matches the trained song. But it matches everything and anything.

Problem:

The problem here seems to be that the 128 input neurons are supposed to have meaning and purpose. Each of them should represent something about the data, but instead I took the raw PCM data and fed it to them directly. Since each neuron has now been trained with almost every data point, the neurons have lost the game. Look at it like this: one neuron received inputs such as 25, 10, 30, 45, 100, 26. With that much variety, it can fire on any of those inputs in the test data and hence yields wrong results.

Solution:

To extract features and have a limited number of meaningful input neurons, feeding the features to the neurons rather than the raw substance. It's 3:04 AM and I will be continuing with the solution tomorrow.
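The solution itself is for tomorrow, but as a first guess at what "features" could mean here, this plain-Java sketch computes two classic ones per slice, RMS energy and zero-crossing rate, so a handful of feature neurons could replace 128 raw-sample ones. This is an assumption about the approach, not what ends up being used:

```java
// Sketch: reduce a 128-sample slice to a couple of meaningful features
// instead of feeding raw PCM to 128 input neurons. Illustrative only.
public class SliceFeatures {

    // Root-mean-square energy: how loud the slice is overall.
    static double rms(double[] slice) {
        double sum = 0;
        for (double s : slice) sum += s * s;
        return Math.sqrt(sum / slice.length);
    }

    // Zero-crossing rate: a rough proxy for how high-pitched/noisy it is.
    static double zeroCrossingRate(double[] slice) {
        int crossings = 0;
        for (int i = 1; i < slice.length; i++) {
            if ((slice[i - 1] < 0) != (slice[i] < 0)) crossings++;
        }
        return (double) crossings / (slice.length - 1);
    }

    public static void main(String[] args) {
        double[] slice = new double[128];
        for (int i = 0; i < slice.length; i++) {
            slice[i] = Math.sin(2 * Math.PI * i / 16); // toy 8-cycle sine
        }
        // RMS of a full-scale sine is about 0.707; the crossing rate
        // reflects its 8 cycles across the slice.
        System.out.println(rms(slice) + " " + zeroCrossingRate(slice));
    }
}
```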


I am on Wordpress too. | No Copyright © 2014
