How to do it...

  1. Add a search space for the layer size using IntegerParameterSpace:
ParameterSpace<Integer> layerSizeParam = new IntegerParameterSpace(startLimit, endLimit);
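Here, startLimit and endLimit are the inclusive bounds of the layer-size search; they are not defined in the snippet above, so the following values are purely illustrative:

int startLimit = 5;   // illustrative lower bound for the hidden layer size
int endLimit = 10;    // illustrative upper bound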
  2. Add a search space for the learning rate using ContinuousParameterSpace:
ParameterSpace<Double> learningRateParam = new ContinuousParameterSpace(0.0001, 0.01);
  3. Use MultiLayerSpace to build a configuration space by adding all the search spaces to the relevant network configuration:
MultiLayerSpace hyperParameterSpace = new MultiLayerSpace.Builder()
    .updater(new AdamSpace(learningRateParam))
    .addLayer(new DenseLayerSpace.Builder()
        .activation(Activation.RELU)
        .nIn(11)
        .nOut(layerSizeParam)
        .build())
    .addLayer(new DenseLayerSpace.Builder()
        .activation(Activation.RELU)
        .nIn(layerSizeParam)
        .nOut(layerSizeParam)
        .build())
    .addLayer(new OutputLayerSpace.Builder()
        .activation(Activation.SIGMOID)
        .lossFunction(LossFunctions.LossFunction.XENT)
        .nOut(1)
        .build())
    .build();

  4. Create a CandidateGenerator from the MultiLayerSpace:
Map<String, Object> dataParams = new HashMap<>();
dataParams.put("batchSize", 10);

CandidateGenerator candidateGenerator = new RandomSearchGenerator(hyperParameterSpace, dataParams);

  5. Create a data source by implementing the DataSource interface:
public static class ExampleDataSource implements DataSource {
    public ExampleDataSource() {
    }
    // implement the methods declared by DataSource here
}

We will need to implement four methods: configure(), trainData(), testData(), and getDataType(). Example implementations follow; a sketch of the supporting fields and helper methods they rely on appears after these snippets:

    • The following is an example implementation of configure():
public void configure(Properties properties) {
    this.minibatchSize = Integer.parseInt(properties.getProperty("minibatchSize", "16"));
}
    • Here's an example implementation of getDataType():
public Class<?> getDataType() {
    return DataSetIterator.class;
}
    • Here's an example implementation of trainData():
public Object trainData() {
    try {
        DataSetIterator iterator = new RecordReaderDataSetIterator(dataPreprocess(), minibatchSize, labelIndex, numClasses);
        return dataSplit(iterator).getTrainIterator();
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
    • Here's an example implementation of testData():
public Object testData() {
    try {
        DataSetIterator iterator = new RecordReaderDataSetIterator(dataPreprocess(), minibatchSize, labelIndex, numClasses);
        return dataSplit(iterator).getTestIterator();
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
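    • The snippets above refer to fields (minibatchSize, labelIndex, numClasses) and helper methods (dataPreprocess(), dataSplit()) that are not shown here. The following is a minimal sketch of what they might look like, assuming a CSV dataset read with DataVec's CSVRecordReader and split with DL4J's DataSetIteratorSplitter; the file path, header handling, batch count, and split ratio are all illustrative:
private int minibatchSize;   // set in configure()
private int labelIndex;      // column index of the label; set to match your dataset
private int numClasses;      // number of label classes; set to match your dataset

// Hypothetical helper: reads the raw CSV data into a RecordReader.
private RecordReader dataPreprocess() throws Exception {
    RecordReader recordReader = new CSVRecordReader(1, ',');          // skip one header line
    recordReader.initialize(new FileSplit(new File("dataset.csv")));  // illustrative path
    return recordReader;
}

// Hypothetical helper: wraps the iterator in a train/test splitter.
private DataSetIteratorSplitter dataSplit(DataSetIterator iterator) {
    return new DataSetIteratorSplitter(iterator, 100, 0.8);           // assumption: 100 batches, 80/20 split
}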
  6. Create an array of termination conditions:
TerminationCondition[] conditions = {
    new MaxTimeCondition(maxTimeOutInMinutes, TimeUnit.MINUTES),
    new MaxCandidatesCondition(maxCandidateCount)
};
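Neither limit is defined in the snippet above; illustrative values might be:

long maxTimeOutInMinutes = 120;  // stop the search after two hours
int maxCandidateCount = 30;      // or after 30 candidates, whichever comes first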
  7. Create a score function to score all the models built from the different configuration combinations:
ScoreFunction scoreFunction = new EvaluationScoreFunction(Evaluation.Metric.ACCURACY);
  8. Create OptimizationConfiguration and add the termination conditions and the score function (the dataSourceProperties and modelSaver objects it references are sketched after the snippet):
OptimizationConfiguration optimizationConfiguration = new OptimizationConfiguration.Builder()
    .candidateGenerator(candidateGenerator)
    .dataSource(ExampleDataSource.class, dataSourceProperties)
    .modelSaver(modelSaver)
    .scoreFunction(scoreFunction)
    .terminationConditions(conditions)
    .build();
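Here is a minimal sketch of those two objects, assuming the "minibatchSize" key read in configure() and Arbiter's FileModelSaver for persisting candidate models locally; the property value and directory name are illustrative:

Properties dataSourceProperties = new Properties();
dataSourceProperties.setProperty("minibatchSize", "64");  // read back in configure()

File modelSaveDir = new File("arbiterOutput/");           // illustrative directory
if (!modelSaveDir.exists()) {
    modelSaveDir.mkdirs();                                // create it if absent
}
ResultSaver modelSaver = new FileModelSaver(modelSaveDir.getAbsolutePath());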
  9. Create LocalOptimizationRunner to run the hyperparameter tuning process:
IOptimizationRunner runner = new LocalOptimizationRunner(optimizationConfiguration, new MultiLayerNetworkTaskCreator());
  10. Add listeners to LocalOptimizationRunner to ensure events are logged properly (see step 12 for adding ArbiterStatusListener):
runner.addListeners(new LoggingStatusListener());
  11. Execute the hyperparameter tuning by calling the execute() method:
runner.execute();

  12. Store the tuning session's statistics and replace LoggingStatusListener with ArbiterStatusListener:
StatsStorage storage = new FileStatsStorage(new File("HyperParamOptimizationStatsModel.dl4j"));
runner.addListeners(new ArbiterStatusListener(storage));
  13. Attach the storage to UIServer:
UIServer.getInstance().attach(storage);
  14. Run the hyperparameter tuning session and go to the following URL to view the visualization:
http://localhost:9000/arbiter
  15. Evaluate the best score from the hyperparameter tuning session and display the results in the console:
double bestScore = runner.bestScore();
int bestCandidateIndex = runner.bestScoreCandidateIndex();
int numberOfConfigsEvaluated = runner.numCandidatesCompleted();
System.out.println("Best score: " + bestScore);
System.out.println("Index of best scoring model: " + bestCandidateIndex);
System.out.println("Number of configurations evaluated: " + numberOfConfigsEvaluated);

The console output displays the model's best score, the index of the best-scoring model, and the number of configurations evaluated during the process.
