Making a neural network yourself is a good way to familiarize yourself with neural network concepts. It is also relatively simple to do. The library we will use, Dannjs, makes it even easier to keep the code clean and simple.
Let’s understand the basics of binary numbers first.
To teach our neural network to count, we will translate our well-known integers into binary data (4 bits). Each bit will correspond to one input/output neuron in the neural network.
Ideally, we want to train our network to output the number that follows the one we feed it. For example, if we feed the network [0,0,0,1], we want it to output [0,0,1,0]. Since there are only 4 slots for boolean information, our model will only count up to 15, which is equal to 1111 in binary. You could try adding more inputs and outputs to your model; in that case, your dataset would have to be bigger to accommodate 5 bits of information.
First off, we’re going to install Dannjs locally in our project directory.
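The package is published on npm under the name dannjs, so assuming Node.js and npm are already set up:

```shell
npm install dannjs
```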
We now need to require the installed node module in our main js file.
Creating the Neural Network
This is how we would create the neural network. We are giving it 4 input neurons for the 4 bits of binary data to input. We are also going to give it 4 output neurons for the 4 bits of binary data the model is going to have to output.
We’re not done yet! We need to add a hidden layer. I’ve found that 16 neurons work well enough; you can experiment with this value later. A hidden layer is basically a neuron layer that performs intermediate computations. The name ‘hidden’ comes from the fact that you do not need to see the values of its neurons, unlike the input/output layers. You can learn more about hidden layers & the basics surrounding them here. We are also going to set the activation function to 'leakyReLU'; activation functions are also explained in the link above.
Technically, we have finished the creation of our model by now. We could test it right away with the Dann.log(); command or by feeding the model some 4-bit data.
The log function displays information about the model we just created. We also specify in the feedForward options that we want to log the predictions of the model.
We can see that this gives us some random predictions. Obviously, this was going to happen because we never trained the model… In order to train our model, we need some sort of dataset telling the neural network what to output according to what input is given.
Setting up the Dataset
To train the model, we’re going to need a dataset. Here is a lightweight js dataset for 4-bits binary counting. It basically looks like this:
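A shortened excerpt of that structure (the field names `input` and `target` follow the gist linked below):

```javascript
// 4-bit binary counting data: each target is the input's value
// plus one, in binary.
const dataset = [
  { input: [0, 0, 0, 0], target: [0, 0, 0, 1] },
  { input: [0, 0, 0, 1], target: [0, 0, 1, 0] },
  { input: [0, 0, 1, 0], target: [0, 0, 1, 1] },
  // ... and so on, up to:
  { input: [1, 1, 1, 0], target: [1, 1, 1, 1] },
];
```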
You can access the dataset here
We can see that this dataset pairs a number x in 4-bit binary as the input value with the number x+1 in 4-bit binary as the target value. I commented out the element [1,0,1,1] so we have a test sample the neural network has never seen. To access the data, we can copy the code included in the GitHub gist above and save it as binaryCountData.js in the same directory as our project. We can then require the new file:
We can now access the data this way:
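A sketch of both steps. In the actual project the array comes from the saved gist via `require('./binaryCountData.js')`; a shortened inline copy is used here so the snippet runs on its own:

```javascript
// In the project this would be:
//   const dataset = require('./binaryCountData.js');
// (assuming binaryCountData.js exports the array with module.exports)
const dataset = [
  { input: [0, 0, 0, 0], target: [0, 0, 0, 1] },
  { input: [0, 0, 0, 1], target: [0, 0, 1, 0] },
];

// Each entry exposes its bits through .input and .target:
console.log(dataset[0].input);  // [0, 0, 0, 0]
console.log(dataset[0].target); // [0, 0, 0, 1]
```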
Now that we have access to the dataset, let’s apply it by calling Dann.backpropagate(); for each data point in our dataset array. This will tune the weights of the model according to the data you give it.
This counts as 1 epoch: we iterated through every element in our dataset once. Sadly, 1 epoch is not enough for a neural network to train properly; we need to perform multiple epochs in order to achieve satisfying results. Let's also add a feedForward() call to test what the model outputs after the training.
And after training for 100,000 epochs, it outputs:
We made it! We can see that it guesses pretty close to [1,1,0,0], which is a good answer.
This is what the final js code should look like:
In this tutorial, we trained a neural network to count binary integers. We learned how binary numbers work in order to encode our integers for the neural network. We then learned how to use Dannjs, the new neural network library that just came out, without any prerequisite ML knowledge. Feel free to tweak all the settings you want, experiment, play around; you’ll eventually get a grasp of what affects what. The only source of knowledge is experience.