# NAG Library Function Document

## 1 Purpose

nag_rand_kfold_xyw (g05pvc) generates training and validation datasets suitable for use in cross-validation or jack-knifing.

## 2 Specification

 #include <nag.h>
 #include <nagg05.h>
 void nag_rand_kfold_xyw (Integer k, Integer fold, Integer n, Integer m, Nag_DataByObsOrVar sordx, double x[], Integer pdx, double y[], double w[], Integer *nt, Integer state[], NagError *fail)

## 3 Description

Let ${X}_{o}$ denote a matrix of $n$ observations on $m$ variables and ${y}_{o}$ and ${w}_{o}$ each denote a vector of length $n$. For example, ${X}_{o}$ might represent a matrix of independent variables, ${y}_{o}$ the dependent variable and ${w}_{o}$ the associated weights in a weighted regression.
nag_rand_kfold_xyw (g05pvc) generates a series of training datasets, denoted by the matrix, vector, vector triplet $\left({X}_{t},{y}_{t},{w}_{t}\right)$ of ${n}_{t}$ observations, and validation datasets, denoted $\left({X}_{v},{y}_{v},{w}_{v}\right)$ with ${n}_{v}$ observations. These training and validation datasets are generated as follows.
Each of the original $n$ observations is randomly assigned to one of $K$ equally sized groups or folds. For the $k$th sample the validation dataset consists of those observations in group $k$ and the training dataset consists of all those observations not in group $k$. Therefore at most $K$ samples can be generated.
If $n$ is not divisible by $K$ then the observations are assigned to groups as evenly as possible; therefore, any group will be at most one observation larger or smaller than any other group.
When using $K=n$ the resulting datasets are suitable for leave-one-out cross-validation, or the training dataset on its own for jack-knifing. When using $K<n$ the resulting datasets are suitable for $K$-fold cross-validation. Datasets suitable for reversed cross-validation can be obtained by switching the training and validation datasets, i.e., use the $k$th group as the training dataset and the rest of the data as the validation dataset.
One of the initialization functions nag_rand_init_repeatable (g05kfc) (for a repeatable sequence if computed sequentially) or nag_rand_init_nonrepeatable (g05kgc) (for a non-repeatable sequence) must be called prior to the first call to nag_rand_kfold_xyw (g05pvc).

## 4 References

None.

## 5 Arguments

1:    $\mathbf{k}$ – Integer Input
On entry: $K$, the number of folds.
Constraint: $2\le {\mathbf{k}}\le {\mathbf{n}}$.
2:    $\mathbf{fold}$ – Integer Input
On entry: the number of the fold to return as the validation dataset.
On the first call to nag_rand_kfold_xyw (g05pvc) ${\mathbf{fold}}$ should be set to $1$ and then incremented by one at each subsequent call until all $K$ sets of training and validation datasets have been produced. See Section 8 for more details on how a different calling sequence can be used.
Constraint: $1\le {\mathbf{fold}}\le {\mathbf{k}}$.
3:    $\mathbf{n}$ – Integer Input
On entry: $n$, the number of observations.
Constraint: ${\mathbf{n}}\ge 1$.
4:    $\mathbf{m}$ – Integer Input
On entry: $m$, the number of variables.
Constraint: ${\mathbf{m}}\ge 1$.
5:    $\mathbf{sordx}$ – Nag_DataByObsOrVar Input
On entry: determines how variables are stored in x.
Constraint: ${\mathbf{sordx}}=\mathrm{Nag_DataByVar}$ or $\mathrm{Nag_DataByObs}$.
6:    $\mathbf{x}\left[\mathit{dim}\right]$ – double Input/Output
Note: the dimension, dim, of the array x must be at least
• ${\mathbf{pdx}}×{\mathbf{m}}$ when ${\mathbf{sordx}}=\mathrm{Nag_DataByVar}$;
• ${\mathbf{pdx}}×{\mathbf{n}}$ when ${\mathbf{sordx}}=\mathrm{Nag_DataByObs}$.
The way the data is stored in x is defined by sordx.
If ${\mathbf{sordx}}=\mathrm{Nag_DataByVar}$, ${\mathbf{x}}\left[\left(\mathit{j}-1\right)×{\mathbf{pdx}}+\mathit{i}-1\right]$ contains the $\mathit{i}$th observation for the $\mathit{j}$th variable, for $i=1,2,\dots ,{\mathbf{n}}$ and $j=1,2,\dots ,{\mathbf{m}}$.
If ${\mathbf{sordx}}=\mathrm{Nag_DataByObs}$, ${\mathbf{x}}\left[\left(\mathit{i}-1\right)×{\mathbf{pdx}}+\mathit{j}-1\right]$ contains the $\mathit{i}$th observation for the $\mathit{j}$th variable, for $i=1,2,\dots ,{\mathbf{n}}$ and $j=1,2,\dots ,{\mathbf{m}}$.
On entry: if ${\mathbf{fold}}=1$, x must hold ${X}_{o}$, the values of $X$ for the original dataset, otherwise, x must not be changed since the last call to nag_rand_kfold_xyw (g05pvc).
On exit: values of $X$ for the training and validation datasets, with ${X}_{t}$ held in observations $1$ to ${\mathbf{nt}}$ and ${X}_{v}$ in observations ${\mathbf{nt}}+1$ to ${\mathbf{n}}$.
7:    $\mathbf{pdx}$ – Integer Input
On entry: the stride separating row elements in the two-dimensional data stored in the array x.
Constraints:
• if ${\mathbf{sordx}}=\mathrm{Nag_DataByObs}$, ${\mathbf{pdx}}\ge {\mathbf{m}}$;
• otherwise ${\mathbf{pdx}}\ge {\mathbf{n}}$.
8:    $\mathbf{y}\left[{\mathbf{n}}\right]$ – double Input/Output
If the original dataset does not include ${y}_{o}$ then y must be set to NULL.
On entry: if ${\mathbf{fold}}=1$, y must hold ${y}_{o}$, the values of $y$ for the original dataset, otherwise, y must not be changed since the last call to nag_rand_kfold_xyw (g05pvc).
On exit: values of $y$ for the training and validation datasets, with ${y}_{t}$ held in elements $1$ to ${\mathbf{nt}}$ and ${y}_{v}$ in elements ${\mathbf{nt}}+1$ to ${\mathbf{n}}$.
9:    $\mathbf{w}\left[{\mathbf{n}}\right]$ – double Input/Output
If the original dataset does not include ${w}_{o}$ then w must be set to NULL.
On entry: if ${\mathbf{fold}}=1$, w must hold ${w}_{o}$, the values of $w$ for the original dataset, otherwise, w must not be changed since the last call to nag_rand_kfold_xyw (g05pvc).
On exit: values of $w$ for the training and validation datasets, with ${w}_{t}$ held in elements $1$ to ${\mathbf{nt}}$ and ${w}_{v}$ in elements ${\mathbf{nt}}+1$ to ${\mathbf{n}}$.
10:  $\mathbf{nt}$ – Integer * Output
On exit: ${n}_{t}$, the number of observations in the training dataset.
11:  $\mathbf{state}\left[\mathit{dim}\right]$ – Integer Communication Array
Note: the dimension, $\mathit{dim}$, of this array is dictated by the requirements of associated functions that must have been previously called. This array MUST be the same array passed as argument state in the previous call to nag_rand_init_repeatable (g05kfc) or nag_rand_init_nonrepeatable (g05kgc).
On entry: contains information on the selected base generator and its current state.
On exit: contains updated information on the state of the generator.
12:  $\mathbf{fail}$ – NagError * Input/Output
The NAG error argument (see Section 3.7 in How to Use the NAG Library and its Documentation).

## 6 Error Indicators and Warnings

NE_ALLOC_FAIL
Dynamic memory allocation failed.
See Section 2.3.1.2 in How to Use the NAG Library and its Documentation for further information.
NE_ARRAY_SIZE
On entry, ${\mathbf{pdx}}=〈\mathit{\text{value}}〉$ and ${\mathbf{m}}=〈\mathit{\text{value}}〉$.
Constraint: if ${\mathbf{sordx}}=\mathrm{Nag_DataByObs}$, ${\mathbf{pdx}}\ge {\mathbf{m}}$.
On entry, ${\mathbf{pdx}}=〈\mathit{\text{value}}〉$ and ${\mathbf{n}}=〈\mathit{\text{value}}〉$.
Constraint: if ${\mathbf{sordx}}=\mathrm{Nag_DataByVar}$, ${\mathbf{pdx}}\ge {\mathbf{n}}$.
NE_BAD_PARAM
On entry, argument $〈\mathit{\text{value}}〉$ had an illegal value.
NE_INT
On entry, ${\mathbf{m}}=〈\mathit{\text{value}}〉$.
Constraint: ${\mathbf{m}}\ge 1$.
On entry, ${\mathbf{n}}=〈\mathit{\text{value}}〉$.
Constraint: ${\mathbf{n}}\ge 1$.
NE_INT_2
On entry, ${\mathbf{fold}}=〈\mathit{\text{value}}〉$ and ${\mathbf{k}}=〈\mathit{\text{value}}〉$.
Constraint: $1\le {\mathbf{fold}}\le {\mathbf{k}}$.
On entry, ${\mathbf{k}}=〈\mathit{\text{value}}〉$ and ${\mathbf{n}}=〈\mathit{\text{value}}〉$.
Constraint: $2\le {\mathbf{k}}\le {\mathbf{n}}$.
NE_INTERNAL_ERROR
An internal error has occurred in this function. Check the function call and any array sizes. If the call is correct then please contact NAG for assistance.
See Section 2.7.6 in How to Use the NAG Library and its Documentation for further information.
NE_INVALID_STATE
On entry, state vector has been corrupted or not initialized.
NE_NO_LICENCE
Your licence key may have expired or may not have been installed correctly.
See Section 2.7.5 in How to Use the NAG Library and its Documentation for further information.
NW_POTENTIAL_PROBLEM
More than $50\%$ of the data did not move when the data was shuffled. $〈\mathit{\text{value}}〉$ of the $〈\mathit{\text{value}}〉$ observations remained in their original position.

## 7 Accuracy

Not applicable.

## 8 Further Comments

nag_rand_kfold_xyw (g05pvc) will be computationally more efficient if each observation in x is contiguous, that is, ${\mathbf{sordx}}=\mathrm{Nag_DataByObs}$.
Because of the way nag_rand_kfold_xyw (g05pvc) stores the data you should usually generate the $K$ training and validation datasets in order, i.e., set ${\mathbf{fold}}=1$ on the first call and increment it by one at each subsequent call. However, there are times when a different calling sequence would be beneficial, for example, when performing different cross-validation analyses on different threads. This is possible, as long as the following is borne in mind:
• nag_rand_kfold_xyw (g05pvc) must be called with ${\mathbf{fold}}=1$ first.
• Other than the first set, the training and validation datasets can be obtained in any order, but for a given x each can only be obtained once.
For example, if you have three threads, you would call nag_rand_kfold_xyw (g05pvc) once with ${\mathbf{fold}}=1$. You would then copy the x returned onto each thread and generate the remaining ${\mathbf{k}}-1$ sets of data by splitting them between the threads. For example, the first thread runs with ${\mathbf{fold}}=2,\dots ,{L}_{1}$, the second with ${\mathbf{fold}}={L}_{1}+1,\dots ,{L}_{2}$ and the third with ${\mathbf{fold}}={L}_{2}+1,\dots ,{\mathbf{k}}$.

## 9 Example

This example uses nag_rand_kfold_xyw (g05pvc) to facilitate $K$-fold cross-validation.
A set of simulated data is split into $5$ training and validation datasets. nag_glm_binomial (g02gbc) is used to fit a logistic regression model to each training dataset and then nag_glm_predict (g02gpc) is used to predict the response for the observations in the validation dataset.
The counts of true and false positives and negatives, along with the sensitivity and specificity, are then reported.

### 9.1 Program Text

Program Text (g05pvce.c)

### 9.2 Program Data

Program Data (g05pvce.d)

### 9.3 Program Results

Program Results (g05pvce.r)

© The Numerical Algorithms Group Ltd, Oxford, UK. 2017