ForeTiS.model._additionalmodels.lstmbayes_intel
Module Contents
Classes
LSTM — Implementation of a class for a Bayesian Long Short-Term Memory (LSTM) network.
- class ForeTiS.model._additionalmodels.lstmbayes_intel.LSTM(optuna_trial, datasets, featureset_name, optimize_featureset, pca_transform=None, current_model_name=None, batch_size=None, n_epochs=None, target_column=None)
Bases:
ForeTiS.model._torch_model.TorchModel
Implementation of a class for a Bayesian Long Short-Term Memory (LSTM) network.
See BaseModel and TorchModel for more information on the attributes.
- Parameters:
- define_model()
Definition of a Bayesian LSTM network.
- Architecture:
Bayesian LSTM, Dropout, Linear
Bayesian Linear output layer
The number of output channels of the first layer, the dropout rate, the frequency of doubling the output channels, and the number of units in the first linear layer may be fixed or optimized.
- Return type:
torch.nn.Sequential
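The architecture above can be sketched with standard torch layers. This is a structural sketch only: the real model uses Bayesian LSTM and Bayesian Linear layers (whose weights are distributions rather than point estimates), and the class name, layer sizes, and the use of a `Module` subclass in place of the documented `torch.nn.Sequential` are assumptions for illustration.

```python
import torch

# Structural sketch of the documented architecture:
# Bayesian LSTM, Dropout, Linear, then a Bayesian Linear output layer.
# Standard torch layers stand in for the Bayesian variants here.
class LSTMSketch(torch.nn.Module):
    def __init__(self, n_features: int, lstm_hidden: int,
                 linear_units: int, dropout: float = 0.1):
        super().__init__()
        # Bayesian LSTM in the real model
        self.lstm = torch.nn.LSTM(input_size=n_features,
                                  hidden_size=lstm_hidden,
                                  batch_first=True)
        self.dropout = torch.nn.Dropout(dropout)
        self.linear = torch.nn.Linear(lstm_hidden, linear_units)
        # Bayesian Linear output layer in the real model
        self.output = torch.nn.Linear(linear_units, 1)

    def forward(self, x):
        out, _ = self.lstm(x)   # x: (batch, seq_len, n_features)
        last = out[:, -1, :]    # use the last time step
        return self.output(self.linear(self.dropout(last)))
```

The hidden size, dropout rate, and linear-layer width correspond to the quantities the docstring says may be fixed or optimized.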
- define_hyperparams_to_tune()
See BaseModel for more information on the format. See TorchModel for more information on hyperparameters common to all torch models.
- Return type:
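As a minimal sketch of what such a search-space definition might look like: the dict-of-dicts format (parameter name mapped to datatype and bounds) follows the BaseModel convention referenced above, but the specific parameter names and ranges below are illustrative assumptions, not the actual tuned hyperparameters.

```python
# Hedged sketch of a hyperparameter search space in the assumed
# BaseModel format: {param_name: {datatype, bounds, ...}}.
# Names and ranges are hypothetical, not the real defaults.
def define_hyperparams_to_tune():
    return {
        'lstm_hidden_dim': {
            'datatype': 'int', 'lower_bound': 8, 'upper_bound': 128,
        },
        'dropout': {
            'datatype': 'float', 'lower_bound': 0.0, 'upper_bound': 0.5,
            'step': 0.1,
        },
        'n_units_first_linear_layer': {
            'datatype': 'int', 'lower_bound': 16, 'upper_bound': 256,
        },
    }
```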
- train_val_loader(train, val)
Get the DataLoader with training and validation data.
- Parameters:
- train (pandas.DataFrame) – training data
- val (pandas.DataFrame) – validation data
- Returns:
train_loader, val_loader, val
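A minimal sketch of wrapping already-split training and validation data into loaders. The real method takes pandas DataFrames and also returns the validation frame itself; here feature/target arrays are passed separately, and the function name and batching choices are assumptions.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hedged sketch: build one DataLoader each for training and validation
# data. The real train_val_loader additionally returns the validation
# frame so it can be reused for evaluation.
def train_val_loader_sketch(train_X, train_y, val_X, val_y, batch_size=32):
    def to_loader(X, y, shuffle):
        ds = TensorDataset(torch.as_tensor(X, dtype=torch.float32),
                           torch.as_tensor(y, dtype=torch.float32))
        return DataLoader(ds, batch_size=batch_size, shuffle=shuffle)
    # shuffle only the training data; keep validation order stable
    return to_loader(train_X, train_y, True), to_loader(val_X, val_y, False)
```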
- predict(X_in)
Implementation of a prediction based on input features for the Bayesian LSTM model. See BaseModel for more information.
- Parameters:
X_in (pandas.DataFrame) – input features
- Return type:
numpy.array
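Prediction with a Bayesian network is typically done by averaging several stochastic forward passes, since each pass samples different weights. The sketch below illustrates that idea with dropout as the noise source; the function name, sample count, and use of dropout (rather than true Bayesian weight sampling) are assumptions.

```python
import numpy as np
import torch

# Hedged sketch of Monte-Carlo prediction: keep the model's stochastic
# layers active, run several forward passes, and average the samples.
def predict_mc(model: torch.nn.Module, x: torch.Tensor,
               n_samples: int = 10) -> np.ndarray:
    model.train()  # keep stochastic layers (e.g. dropout) sampling
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    # mean over the samples; the spread could serve as an uncertainty estimate
    return samples.mean(dim=0).cpu().numpy()
```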
- get_dataloader(X, y=None, only_transform=None, predict=False, shuffle=False)
Get a PyTorch DataLoader using the specified data and batch size.
- Parameters:
- Returns:
PyTorch DataLoader
- Return type:
torch.utils.data.DataLoader
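A minimal sketch of how such a DataLoader is typically assembled from feature and (optional) target data; the function name, tensor dtypes, and default batch size are assumptions, not the actual implementation.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hedged sketch: wrap features (and targets, if given) in a TensorDataset
# and hand it to a DataLoader with the requested batch size.
def get_dataloader_sketch(X, y=None, batch_size=32, shuffle=False):
    X_t = torch.as_tensor(X, dtype=torch.float32)
    if y is None:
        dataset = TensorDataset(X_t)  # predict mode: features only
    else:
        dataset = TensorDataset(X_t, torch.as_tensor(y, dtype=torch.float32))
    return DataLoader(dataset, batch_size=batch_size, shuffle=shuffle)
```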