Fluent training with the MNIST database

In the following example, we will train our CNN on the MNIST database of handwritten digit images.
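
The snippets below reference a few class-level fields: this._net, this._trainer, and two accuracy windows. As a rough sketch (the types and the window size of 100 are assumptions based on the ConvNetSharp MNIST sample, with the relevant ConvNetSharp namespaces imported), those fields could be declared like this:

// Fields assumed by the walkthrough below; types and window size are assumptions.
private INet<double> _net;
private SgdTrainer<double> _trainer;
private readonly CircularBuffer<double> _trainAccWindow = new CircularBuffer<double>(100);
private readonly CircularBuffer<double> _testAccWindow = new CircularBuffer<double>(100);

Any sliding-window helper with an Add method and an Items collection will do for the accuracy windows; the reporting line at the end of this section only needs to average their Items.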

To declare the method, use the following code:

private void MnistDemo()
{

Next, create the object that holds the training and testing datasets with the following command:

var datasets = new DataSets();

Load the data, setting aside 100 samples for validation; if loading fails, the method simply returns:

if (!datasets.Load(100))
{
    return;
}

Now it's time to create the neural network using the Fluent API, as follows:

this._net = FluentNet<double>.Create(24, 24, 1)
    .Conv(5, 5, 8).Stride(1).Pad(2)
    .Relu()
    .Pool(2, 2).Stride(2)
    .Conv(5, 5, 16).Stride(1).Pad(2)
    .Relu()
    .Pool(3, 3).Stride(3)
    .FullyConn(10)
    .Softmax(10)
    .Build();
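
To see how the data shrinks as it flows through the network, the following trace applies the standard output-size formula, out = (in + 2 * pad - kernel) / stride + 1, to each layer:

// Layer-by-layer output sizes for the network above:
// Input                -> 24 x 24 x 1
// Conv 5x5x8, pad 2    -> (24 + 4 - 5)/1 + 1 = 24 -> 24 x 24 x 8
// Relu                 -> 24 x 24 x 8 (unchanged)
// Pool 2x2, stride 2   -> (24 - 2)/2 + 1 = 12     -> 12 x 12 x 8
// Conv 5x5x16, pad 2   -> (12 + 4 - 5)/1 + 1 = 12 -> 12 x 12 x 16
// Relu                 -> 12 x 12 x 16 (unchanged)
// Pool 3x3, stride 3   -> (12 - 3)/3 + 1 = 4      -> 4 x 4 x 16
// FullyConn(10)        -> 1 x 1 x 10
// Softmax(10)          -> 10 class probabilities, one per digit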

Create the stochastic gradient descent trainer for the network with the following command:

this._trainer = new SgdTrainer<double>(this._net)
{
    LearningRate = 0.01,
    BatchSize = 20,
    L2Decay = 0.001,
    Momentum = 0.9
};
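
These hyperparameters map onto the classic SGD-with-momentum update. As a conceptual sketch of a single-weight step (an illustration of the formula, not ConvNetSharp's actual implementation), the update looks roughly like this:

// Conceptual single-weight SGD-with-momentum step; the trainer applies the same
// idea to every parameter, averaging the gradient over BatchSize samples.
double weight = 0.5, velocity = 0.0, gradient = 0.1;      // example values
const double learningRate = 0.01, momentum = 0.9, l2Decay = 0.001;

gradient += l2Decay * weight;                              // L2 weight decay
velocity = momentum * velocity - learningRate * gradient;  // momentum update
weight += velocity;                                        // apply the step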

Start the training loop; it will keep running until a key is pressed in the console window:

do
{

Next, get the next batch of training data by calling NextBatch, as follows:

var trainSample = datasets.Train.NextBatch(this._trainer.BatchSize);
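
NextBatch returns a three-item tuple. Based on how the items are consumed by the Train and Test calls below (an assumption drawn from the usage in this section, not a formal API listing), they can be read as follows:

// Assumed contents of the NextBatch tuple, inferred from how it is used below:
//   trainSample.Item1 - Volume<double> holding the batch of input images
//   trainSample.Item2 - Volume<double> holding the one-hot encoded labels
//   trainSample.Item3 - int[] with the raw label values (0-9)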

Train the network on the batch you just received with the following command:

Train(trainSample.Item1, trainSample.Item2, trainSample.Item3);
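
The Train helper itself is not listed in this section. A minimal sketch of what it might look like, assuming the argument order used above (input volume, one-hot label volume, integer labels) and the Test helper sketched a little further down:

// Sketch of a possible Train helper: run one optimizer step, then record the
// batch accuracy in the training accuracy window.
private void Train(Volume<double> x, Volume<double> y, int[] labels)
{
    this._trainer.Train(x, y);                     // forward + backward + weight update
    Test(x, labels, this._trainAccWindow, false);  // reuse the forward pass for accuracy
}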

It's now time to get the next batch of test data; to do so, use the following command:

var testSample = datasets.Test.NextBatch(this._trainer.BatchSize);

The network can then be tested on that batch with the following command:

Test(testSample.Item1, testSample.Item3, this._testAccWindow);
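
As with Train, the Test helper is assumed rather than shown here. One possible implementation, assuming the network exposes a GetPrediction helper that returns the predicted class index for each entry in the batch, and that the forward flag controls whether a fresh forward pass is needed:

// Sketch of a possible Test helper: compare the predicted class for each batch
// entry against the true label and push 1 or 0 into the accuracy window.
private void Test(Volume<double> x, int[] labels, CircularBuffer<double> accuracy, bool forward = true)
{
    if (forward)
    {
        this._net.Forward(x);                      // compute class scores for the batch
    }

    var predictions = this._net.GetPrediction();   // predicted digit per batch entry

    for (var i = 0; i < labels.Length; i++)
    {
        accuracy.Add(labels[i] == predictions[i] ? 1.0 : 0.0);
    }
}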

To report the loss and the training and test accuracy, add the following command:

Console.WriteLine("Loss: {0} Train accuracy: {1}% Test accuracy: {2}%", this._trainer.Loss, Math.Round(this._trainAccWindow.Items.Average() * 100.0, 2),
Math.Round(this._testAccWindow.Items.Average() * 100.0, 2));
} while (!Console.KeyAvailable);
}

The loop keeps training and reporting until a key is pressed, at which point the method returns.
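
To run the demo, call MnistDemo from your program's entry point. A minimal, hypothetical example (the Program class and Main method shown here are assumptions, not part of this section):

private static void Main()
{
    var program = new Program();
    program.MnistDemo();
}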