How to do it...

  1. Create a method to generate a schema for the user input:

private static Schema generateSchema() {
    Schema schema = new Schema.Builder()
            .addColumnString("RowNumber")
            .addColumnInteger("CustomerId")
            .addColumnString("Surname")
            .addColumnInteger("CreditScore")
            .addColumnCategorical("Geography", Arrays.asList("France", "Germany", "Spain"))
            .addColumnCategorical("Gender", Arrays.asList("Male", "Female"))
            .addColumnsInteger("Age", "Tenure")
            .addColumnDouble("Balance")
            .addColumnsInteger("NumOfProducts", "HasCrCard", "IsActiveMember")
            .addColumnDouble("EstimatedSalary")
            .build();
    return schema;
}
  2. Create a TransformProcess from the schema:

private static RecordReader applyTransform(RecordReader recordReader, Schema schema) {
    final TransformProcess transformProcess = new TransformProcess.Builder(schema)
            .removeColumns("RowNumber", "CustomerId", "Surname")
            .categoricalToInteger("Gender")
            .categoricalToOneHot("Geography")
            // Drop one of the three one-hot columns so the remaining
            // indicators are linearly independent (dummy-variable trap)
            .removeColumns("Geography[France]")
            .build();
    final TransformProcessRecordReader transformProcessRecordReader = new TransformProcessRecordReader(recordReader, transformProcess);
    return transformProcessRecordReader;
}
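The categoricalToOneHot step expands Geography into three 0/1 columns, and removing Geography[France] keeps only two of them. A pure-Java sketch of the resulting encoding (the class and helper names here are illustrative, not part of the DataVec API):

```java
import java.util.Arrays;
import java.util.List;

public class GeographyEncoding {
    static final List<String> GEOGRAPHY = Arrays.asList("France", "Germany", "Spain");

    // One-hot encode the Geography value, then drop the first column
    // ("France"), mirroring the removeColumns("Geography[France]") step.
    static int[] encodeGeography(String value) {
        int[] oneHot = new int[GEOGRAPHY.size()];
        oneHot[GEOGRAPHY.indexOf(value)] = 1;
        return Arrays.copyOfRange(oneHot, 1, oneHot.length);
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(encodeGeography("Germany"))); // [1, 0]
        System.out.println(Arrays.toString(encodeGeography("France")));  // [0, 0]
    }
}
```

A France row thus becomes all zeros in the two surviving columns, which is still unambiguous.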

  3. Load the data into a record reader instance:

private static RecordReader generateReader(File file) throws IOException, InterruptedException {
    final RecordReader recordReader = new CSVRecordReader(1, ',');
    recordReader.initialize(new FileSplit(file));
    final RecordReader transformProcessRecordReader = applyTransform(recordReader, generateSchema());
    return transformProcessRecordReader;
}
  4. Restore the model using ModelSerializer:

File modelFile = new File(modelFilePath);
MultiLayerNetwork network = ModelSerializer.restoreMultiLayerNetwork(modelFile);
NormalizerStandardize normalizerStandardize = ModelSerializer.restoreNormalizerFromFile(modelFile);
  5. Create an iterator to traverse through the entire set of input records:

DataSetIterator dataSetIterator = new RecordReaderDataSetIterator.Builder(recordReader, 1).build();
normalizerStandardize.fit(dataSetIterator);
dataSetIterator.setPreProcessor(normalizerStandardize);
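NormalizerStandardize rescales every feature to zero mean and unit variance before it reaches the network. Conceptually it applies a per-column z-score transform, sketched here in plain Java (illustrative only; DL4J computes and applies these statistics internally):

```java
import java.util.Arrays;

public class ZScoreSketch {
    // Conceptual per-column z-score standardization, as performed by
    // NormalizerStandardize: subtract the column mean, then divide by
    // the column's standard deviation.
    static double[] standardize(double[] column) {
        double mean = Arrays.stream(column).average().orElse(0.0);
        double variance = Arrays.stream(column)
                .map(v -> (v - mean) * (v - mean))
                .average().orElse(0.0);
        double std = Math.sqrt(variance);
        double[] out = new double[column.length];
        for (int i = 0; i < column.length; i++) {
            out[i] = std == 0 ? 0.0 : (column[i] - mean) / std;
        }
        return out;
    }

    public static void main(String[] args) {
        // ≈ [-1.22, 0.00, 1.22]
        System.out.println(Arrays.toString(standardize(new double[]{1.0, 2.0, 3.0})));
    }
}
```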
  6. Design an API function to generate output from user input:

public static INDArray generateOutput(File inputFile, String modelFilePath) throws IOException, InterruptedException {
    File modelFile = new File(modelFilePath);
    MultiLayerNetwork network = ModelSerializer.restoreMultiLayerNetwork(modelFile);
    RecordReader recordReader = generateReader(inputFile);
    NormalizerStandardize normalizerStandardize = ModelSerializer.restoreNormalizerFromFile(modelFile);
    DataSetIterator dataSetIterator = new RecordReaderDataSetIterator.Builder(recordReader, 1).build();
    normalizerStandardize.fit(dataSetIterator);
    dataSetIterator.setPreProcessor(normalizerStandardize);
    return network.output(dataSetIterator);
}

For a further example, see: https://github.com/PacktPublishing/Java-Deep-Learning-Cookbook/blob/master/03_Building_Deep_Neural_Networks_for_Binary_classification/sourceCode/cookbookapp/src/main/java/com/javadeeplearningcookbook/api/CustomerRetentionPredictionApi.java 
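The INDArray returned by generateOutput holds the network's prediction for each input row, which can be read as a churn probability. A minimal sketch of how a caller might turn such probabilities into labels (the toLabel helper and the 0.5 cutoff are assumptions for illustration, not part of the recipe):

```java
public class ChurnLabels {
    // Hypothetical post-processing of the per-row probabilities returned
    // by generateOutput: threshold each churn probability at 0.5.
    static String toLabel(double churnProbability) {
        return churnProbability >= 0.5
                ? "Customer is likely to leave"
                : "Customer is likely to stay";
    }

    public static void main(String[] args) {
        System.out.println(toLabel(0.9)); // Customer is likely to leave
        System.out.println(toLabel(0.1)); // Customer is likely to stay
    }
}
```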

  7. Build a shaded JAR of your DL4J API project by running the Maven command:

mvn clean install
  8. Run the Spring Boot project included in the source directory by importing the Maven project into your IDE: https://github.com/PacktPublishing/Java-Deep-Learning-Cookbook/tree/master/03_Building_Deep_Neural_Networks_for_Binary_classification/sourceCode/spring-dl4j.

Add the following VM option under the run configurations:

-DmodelFilePath={PATH-TO-MODEL-FILE}

PATH-TO-MODEL-FILE is the location where you stored the actual model file. It can be on your local disk or in the cloud.

Then, run the SpringDl4jApplication.java file.

  9. Test your Spring Boot app at http://localhost:8080/.

  10. Verify the functionality by uploading an input CSV file.

Use a sample CSV file to upload into the web application: https://github.com/PacktPublishing/Java-Deep-Learning-Cookbook/blob/master/03_Building_Deep_Neural_Networks_for_Binary_classification/sourceCode/cookbookapp/src/main/resources/test.csv.

The prediction results will then be displayed in the web application.
