[source]

SearchParam

galaxy_ml.keras_galaxy_models.SearchParam(s_param, value)

Sortable wrapper class for search parameters


[source]

KerasLayers

galaxy_ml.keras_galaxy_models.KerasLayers(name='sequential_1', layers=[])

Parameters

  • name: str
  • layers: list of dict
    the layer configurations of the model
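
As an illustrative sketch only (the Keras import path depends on your installation; `tensorflow.keras` is assumed here), a layer configuration list can be taken from an ordinary Keras model and wrapped in KerasLayers:

```python
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

from galaxy_ml.keras_galaxy_models import KerasLayers

# Build an ordinary Keras model and pull out its configuration.
model = Sequential([
    Dense(32, activation='relu', input_shape=(10,)),
    Dense(1, activation='sigmoid'),
])
config = model.get_config()

# Wrap the per-layer config dicts; `name` and `layers` are the
# constructor arguments documented above.
layers = KerasLayers(name=config['name'], layers=config['layers'])
```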

[source]

BaseKerasModel

galaxy_ml.keras_galaxy_models.BaseKerasModel(config, model_type='sequential', optimizer='sgd', loss='binary_crossentropy', metrics=[], lr=None, momentum=None, decay=None, nesterov=None, rho=None, amsgrad=None, beta_1=None, beta_2=None, schedule_decay=None, epochs=1, batch_size=None, seed=None, callbacks=None, validation_fraction=0.1, steps_per_epoch=None, validation_steps=None, verbose=0)

Base class for the Galaxy Keras wrappers; a construction sketch follows the parameter list below.

Parameters

  • config: dictionary
    from model.get_config()
  • model_type: str
    'sequential' or 'functional'
  • optimizer: str, default 'sgd'
    'sgd', 'rmsprop', 'adagrad', 'adadelta', 'adam', 'adamax', 'nadam'
  • loss: str, default 'binary_crossentropy'
    same as Keras loss
  • metrics: list of strings, default []
  • lr: None or float
    optimizer parameter; the default changes with the optimizer
  • momentum: None or float
    for the 'sgd' optimizer only; ignored otherwise
  • nesterov: None or bool
    for the 'sgd' optimizer only; ignored otherwise
  • decay: None or float
    optimizer parameter; the default changes with the optimizer
  • rho: None or float
    optimizer parameter; the default changes with the optimizer
  • amsgrad: None or bool
    for the 'adam' optimizer only; ignored otherwise
  • beta_1: None or float
    optimizer parameter; the default changes with the optimizer
  • beta_2: None or float
    optimizer parameter; the default changes with the optimizer
  • schedule_decay: None or float
    optimizer parameter; the default changes with the optimizer
  • epochs: int
    fit_param from Keras
  • batch_size: None or int, default=None
    fit_param; if None, defaults to 32
  • callbacks: None or list of dict
    fit_param; each dict contains the configuration of one callback, e.g. {"callback_selection": {"callback_type": "EarlyStopping", "monitor": "val_loss", "baseline": None, "min_delta": 0.0, "patience": 10, "mode": "auto", "restore_best_weights": False}}
  • validation_fraction: float, default=0.1
    The proportion of training data to set aside as a validation set. Must be within [0, 1). Ignored if validation_data is set via fit_params.
  • steps_per_epoch: None or int, default=None
    fit_param; the number of training batches per epoch
  • validation_steps: None or int, default=None
    fit_param; if None, the number of samples divided by batch_size is used.
  • seed: None or int, default=None
    backend random seed
  • verbose: 0, 1 or 2
    Verbosity mode: 0 = silent, 1 = progress bar, 2 = one line per epoch. If > 0, device placement is also logged.
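
BaseKerasModel is not meant to be instantiated directly; the subclasses below are. A minimal construction sketch, assuming a tf.keras Sequential model and the parameters documented above (optimizer hyperparameters such as lr and momentum are passed flat, and only the ones relevant to the chosen optimizer take effect):

```python
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

from galaxy_ml.keras_galaxy_models import KerasGClassifier

model = Sequential([
    Dense(32, activation='relu', input_shape=(20,)),
    Dense(1, activation='sigmoid'),
])

# `config` is the raw Keras model configuration; `momentum` and
# `nesterov` apply because the optimizer is 'sgd'.
clf = KerasGClassifier(
    config=model.get_config(),
    model_type='sequential',
    optimizer='sgd',
    loss='binary_crossentropy',
    lr=0.01,
    momentum=0.9,
    nesterov=True,
    epochs=10,
    batch_size=32,
    validation_fraction=0.1,
    seed=42,
)
```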

[source]

KerasGClassifier

galaxy_ml.keras_galaxy_models.KerasGClassifier(config, model_type='sequential', optimizer='sgd', loss='binary_crossentropy', metrics=[], lr=None, momentum=None, decay=None, nesterov=None, rho=None, amsgrad=None, beta_1=None, beta_2=None, schedule_decay=None, epochs=1, batch_size=None, seed=None, callbacks=None, validation_fraction=0.1, steps_per_epoch=None, validation_steps=None, verbose=0)

Scikit-learn classifier API for Keras
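
Because the wrapper exposes the scikit-learn estimator API, it can be dropped into standard scikit-learn tooling. A hedged sketch with synthetic data (parameter names follow the constructor above; GridSearchCV is plain scikit-learn):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

from galaxy_ml.keras_galaxy_models import KerasGClassifier

X = np.random.rand(100, 20).astype(np.float32)
y = np.random.randint(0, 2, 100)

model = Sequential([
    Dense(16, activation='relu', input_shape=(20,)),
    Dense(1, activation='sigmoid'),
])

clf = KerasGClassifier(config=model.get_config(),
                       loss='binary_crossentropy',
                       epochs=5, batch_size=16, seed=0)

# Optimizer and fit hyperparameters are ordinary estimator params,
# so they can be searched like any other scikit-learn estimator.
search = GridSearchCV(clf, {'lr': [0.01, 0.001], 'epochs': [5, 10]}, cv=3)
search.fit(X, y)
print(search.best_params_)
```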


[source]

KerasGRegressor

galaxy_ml.keras_galaxy_models.KerasGRegressor(config, model_type='sequential', optimizer='sgd', loss='binary_crossentropy', metrics=[], lr=None, momentum=None, decay=None, nesterov=None, rho=None, amsgrad=None, beta_1=None, beta_2=None, schedule_decay=None, epochs=1, batch_size=None, seed=None, callbacks=None, validation_fraction=0.1, steps_per_epoch=None, validation_steps=None, verbose=0)

Scikit-learn API wrapper for Keras regressor
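
A corresponding regression sketch (again assuming tf.keras; the loss string is a standard Keras loss):

```python
import numpy as np
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

from galaxy_ml.keras_galaxy_models import KerasGRegressor

X = np.random.rand(200, 8).astype(np.float32)
y = np.random.rand(200).astype(np.float32)

model = Sequential([
    Dense(16, activation='relu', input_shape=(8,)),
    Dense(1),  # linear output for regression
])

reg = KerasGRegressor(config=model.get_config(),
                      optimizer='adam', lr=0.001,
                      loss='mean_squared_error',
                      epochs=10, batch_size=32)
reg.fit(X, y)
predictions = reg.predict(X[:5])
```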


[source]

KerasGBatchClassifier

galaxy_ml.keras_galaxy_models.KerasGBatchClassifier(config, data_batch_generator, model_type='sequential', optimizer='sgd', loss='binary_crossentropy', metrics=[], lr=None, momentum=None, decay=None, nesterov=None, rho=None, amsgrad=None, beta_1=None, beta_2=None, schedule_decay=None, epochs=1, batch_size=None, seed=None, n_jobs=1, callbacks=None, validation_fraction=0.1, steps_per_epoch=None, validation_steps=None, verbose=0, prediction_steps=None, class_positive_factor=1)

Keras classifier with a batch data generator

Parameters

  • config: dictionary
    from model.get_config()
  • data_batch_generator: instance of a batch data generator
  • model_type: str
    'sequential' or 'functional'
  • optimizer: str, default 'sgd'
    'sgd', 'rmsprop', 'adagrad', 'adadelta', 'adam', 'adamax', 'nadam'
  • loss: str, default 'binary_crossentropy'
    same as Keras loss
  • metrics: list of strings, default []
  • lr: None or float
    optimizer parameter; the default changes with the optimizer
  • momentum: None or float
    for the 'sgd' optimizer only; ignored otherwise
  • nesterov: None or bool
    for the 'sgd' optimizer only; ignored otherwise
  • decay: None or float
    optimizer parameter; the default changes with the optimizer
  • rho: None or float
    optimizer parameter; the default changes with the optimizer
  • amsgrad: None or bool
    for the 'adam' optimizer only; ignored otherwise
  • beta_1: None or float
    optimizer parameter; the default changes with the optimizer
  • beta_2: None or float
    optimizer parameter; the default changes with the optimizer
  • schedule_decay: None or float
    optimizer parameter; the default changes with the optimizer
  • epochs: int
    fit_param from Keras
  • batch_size: None or int, default=None
    fit_param; if None, defaults to 32
  • callbacks: None or list of dict
    each dict contains the configuration of one callback, e.g. {"callback_selection": {"callback_type": "EarlyStopping", "monitor": "val_loss", "baseline": None, "min_delta": 0.0, "patience": 10, "mode": "auto", "restore_best_weights": False}}
  • validation_fraction: float, default=0.1
    The proportion of training data to set aside as a validation set. Must be within [0, 1). Ignored if validation_data is set via fit_params.
  • steps_per_epoch: None or int, default=None
    fit_param; the number of training batches per epoch
  • validation_steps: None or int, default=None
    fit_param; if None, the number of samples divided by batch_size is used.
  • seed: None or int, default=None
    backend random seed
  • verbose: 0, 1 or 2
    Verbosity mode: 0 = silent, 1 = progress bar, 2 = one line per epoch. If > 0, device placement is also logged.
  • n_jobs: int, default=1
  • prediction_steps: None or int, default=None
    prediction steps; if None, the number of samples divided by batch_size is used.
  • class_positive_factor: int or float, default=1
    For binary classification only. An int, e.g. 5, is converted to class_weight {0: 1, 1: 5}; a float, e.g. 0.2, corresponds to class_weight {0: 1/0.2, 1: 1} (see the sketch below).
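
To make the class_positive_factor rule concrete, here is a small stand-alone illustration of the conversion described above (this is not the library's internal code, just the documented mapping):

```python
def to_class_weight(class_positive_factor):
    """Map class_positive_factor to a Keras class_weight dict
    (binary classification), following the rule documented above."""
    if isinstance(class_positive_factor, int):
        # e.g. 5 -> {0: 1, 1: 5}
        return {0: 1, 1: class_positive_factor}
    # e.g. 0.2 -> {0: 1 / 0.2, 1: 1} == {0: 5.0, 1: 1}
    return {0: 1.0 / class_positive_factor, 1: 1}


print(to_class_weight(5))    # {0: 1, 1: 5}
print(to_class_weight(0.2))  # {0: 5.0, 1: 1}
```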