Java implementation of Multilayer Perceptron
It has three main modules: `MLP`, `Gradient Function`, and `Activation Function`.
`MLP`: The main driver class used to create the model, fit it to data, and make predictions.
`Activation Function`: This Java implementation currently supports the following activation functions:
- Relu
- LeakyRelu
- Tanh
- Sigmoid
`Gradient Function`: This Java implementation currently supports the following gradient functions:
- AdaGrad
- Adam
- Momentum
- RMSProp
- SGD
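Each of these corresponds to a class whose object is passed to the model. Below is a minimal instantiation sketch; only `AdaGrad`, `Tanh`, and `Relu` appear in the example later in this README, so the remaining class names (and the `GradientFunction`/`ActivationFunction` type names) are assumptions based on the lists above:

```java
// Gradient functions (AdaGrad is confirmed by the example below; the other
// class names are assumed to mirror the list above)
GradientFunction optimizer = new AdaGrad();   // or: new Adam(), new Momentum(), new RMSProp(), new SGD()

// Activation functions (Tanh and Relu are confirmed by the example below;
// LeakyRelu and Sigmoid are assumed to mirror the list above)
ActivationFunction activation = new Tanh();   // or: new Relu(), new LeakyRelu(), new Sigmoid()
```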
Prerequisites:
- Windows, Linux, or macOS
- Java 1.8
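To confirm the installed Java version, you can run `java -version` from a terminal.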
Once all the prerequisites mentioned above are installed on the machine, follow the steps below.
- Clone this repository: https://github.com/sreetamparida/Taso.git and open `MLP.java`.
- Provide the paths to the training and testing datasets as shown below.
```java
File inputFile = new File("cancer.csv");
File targetFile = new File("cancer_target.csv");
```
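The README does not show how these CSV files are turned into the `input` and `target` matrices used by `fit()` later on. The sketch below is a hypothetical plain-Java loader, not code from this repository, assuming `fit()` accepts `double[][]` arrays:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

class CsvLoader {
    // Hypothetical helper (not part of this repository): read a comma-separated
    // numeric file into a double matrix.
    static double[][] load(String path) throws IOException {
        List<double[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] cells = line.split(",");
                double[] row = new double[cells.length];
                for (int i = 0; i < cells.length; i++) {
                    row[i] = Double.parseDouble(cells[i].trim());
                }
                rows.add(row);
            }
        }
        return rows.toArray(new double[0][]);
    }
}
```

With such a helper, the matrices used later could be prepared as `double[][] input = CsvLoader.load("cancer.csv");` and `double[][] target = CsvLoader.load("cancer_target.csv");`.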
- Create the model with your chosen `Gradient Function`, `Activation Function`, and `Hidden Layers`.
- Specify the `Gradient Function` while creating the model object by passing the `Gradient Function` object as a parameter, for example:
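```java
// The constructor takes the Gradient Function object (AdaGrad here, as in the full example below).
MLP model = new MLP(new AdaGrad());
```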
- To add `Hidden Layers`, use the `addHiddenLayer()` function.
- Specify the `Activation Function` for a layer by passing the `Activation Function` object as a parameter, for example:
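```java
// addHiddenLayer(size, activation) adds one hidden layer with the given activation
// (noAttr neurons and Tanh here, mirroring the full example below).
model.addHiddenLayer(noAttr, new Tanh());
```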
- Then perform the `fit()` operation by providing the number of `epochs`.
- Use the `predict()` function, passing the data you want predictions for.
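A short sketch of those two calls follows. The `fit()` call matches the full example below; the exact signature and return type of `predict()` are not shown in this README, so the usage here is an assumption:

```java
// Train for 5000 epochs on the input/target matrices (as in the example below).
model.fit(input, target, 5000);

// Assumed usage of predict(): pass the data to predict on; the actual
// signature and return type in the repository may differ.
double[][] predictions = model.predict(input);
```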
An example implementation of a model with:
- Gradient Function: AdaGrad
- Activation Function: Tanh in the hidden layers and Relu at the output layer
- Number of Hidden Layers: 4
```java
// noAttr = number of attributes (columns) in the input dataset
MLP model = new MLP(new AdaGrad());            // gradient function: AdaGrad
model.addInputLayer(noAttr);                   // input layer sized to the input attributes
model.addHiddenLayer(noAttr, new Tanh());      // hidden layer 1
model.addHiddenLayer(noAttr + 1, new Tanh());  // hidden layer 2
model.addHiddenLayer(noAttr + 1, new Tanh());  // hidden layer 3
model.addHiddenLayer(noAttr, new Tanh());      // hidden layer 4
model.addOutputLayer(1, new Relu());           // output layer with Relu activation
model.generateModel();                         // build the network
model.fit(input, target, 5000);                // train for 5000 epochs
```
- Compile with `javac MLP.java` and run the compiled class with `java MLP` to get your predictions.