File Exchange > DataAnalysis > Neural Network Regression

Author:
OriginLab Technical Support
Date Added:
7/28/2020
Last Update:
12/16/2020
Downloads (90 Days):
866
Total Ratings:
2
File Size:
299 KB
Average Rating:
File Name:
NNR.opx
File Version:
1.25
Minimum Versions:
License:
Free
Summary:

Perform neural network fitting using Python.

Screen Shot and Video:
Description:

Purpose

This App provides a tool for fitting data with a neural network trained by backpropagation. It trains a neural network to map a set of inputs to an output, and you can then use the fitted network to predict the response for new values of the independent variables.
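As a rough sketch of the kind of backpropagation regression this App performs (the App's internal implementation is not documented, so the details here are assumptions), scikit-learn's MLPRegressor can be fit to data and then used to predict the response for new independent values:

```python
# Sketch of neural-network regression with scikit-learn's MLPRegressor.
# The data below are illustrative, not from the App's sample project.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))            # independent variable
y = np.sin(X).ravel() + rng.normal(0, 0.1, 200)  # noisy response

# One hidden layer of 10 neurons; gradients are computed by backpropagation
net = MLPRegressor(hidden_layer_sizes=(10,), solver='lbfgs',
                   max_iter=2000, random_state=0)
net.fit(X, y)

# Predict the response for new independent data
X_new = np.array([[0.0], [1.5]])
print(net.predict(X_new))
```

Note that the Neural Network Fitting App uses RPROP/GRPROP instead, so its results need not match a backpropagation fit like this one.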

Notes:

  • It requires Embedded Python and the scikit-learn library. Other dependencies include joblib, threadpoolctl, numpy, and scipy.

  • This App uses the backpropagation algorithm, which differs from that of the Neural Network Fitting App (RPROP and GRPROP algorithms). The regression results of the two Apps can therefore differ in some cases.

Installation

  1. Download the NNR.opx file, then drag-and-drop onto the Origin workspace.
  2. The App will start downloading the dependent Python libraries. Wait a few minutes until the download completes, then restart Origin.

Operation

  1. Activate a worksheet or a graph. Click the App icon to bring up the dialog.
  2. On the Input Data tab, select one or more datasets for Independent Variables, and specify the Dependent Variable by selecting a single dataset.
  3. On the Options tab, adjust the settings that control how the neural network is fit.
    • Number of Hidden Neurons in Each Layer: Specify a space-separated list of the number of hidden neurons per layer. For example, if you have 3 hidden layers containing 3, 5, and 4 neurons respectively, enter '3 5 4'.
    • Maximum Iterations: Maximum number of iterations. The solver iterates until convergence or until this number of iterations is reached.
    • Learning Rate: Learning rate schedule for weight updates.
    • Loss Tolerance: Tolerance for the optimization.
    • Maximum Number of Epochs without Change: Maximum number of consecutive epochs allowed without an improvement of at least Loss Tolerance. When the loss fails to improve by at least Loss Tolerance for this many epochs, convergence is considered reached and training stops.
    • Activation Function: Activation function for the hidden layer.
    • K-fold Cross Validation: The data sample is split into K groups. Each group in turn is held out as the test set while the remaining groups form the training set; a model is fit on the training set and evaluated on the test set.
    • Standardize Independent Variables: (a) None: Variables are not standardized. (b) Z scores (standardize to N(0, 1)): Variables are standardized to zero mean and unit variance. (c) Normalize to (0, 1): Variables are scaled to the range 0 to 1.
    • Print Progress Messages: Print progress messages from the packages.
  4. On Quantities and Plots tab, choose which quantities and plots to output.
  5. On Prediction tab, you can select a range of independent data to predict the response with the fitted neural network.
  6. Click OK to output reports.

Sample OPJU File
This App provides a sample OPJU file. Right-click the App icon in the Apps Gallery window and choose Show Samples Folder from the shortcut menu. A folder will open. Drag-and-drop the project file Neural Network Regression Sample.opju from the folder onto Origin. The Notes window in the project shows detailed steps.
Note: If you wish to save the OPJU after making changes, it is recommended that you save it to a different folder location (e.g. the User Files Folder).

Updates:

v1.25: Add Print Progress Messages option.
v1.1: Add Standardize Independent Variables option.

Reviews and Comments:
03/16/2021 OriginLab
Hi,
Prediction is a much more complicated process than linear fitting. We don't support it right now.

03/16/2021 kbrandenburg
Works well. A more detailed discussion of parameter settings would be nice. I would also like to see added a way to export the trained network in a format that can be inserted in the user's code and/or used as a prediction tool separate from the training algorithm. In the meantime, if anybody has any thoughts on a way to achieve this right now, please share.