ReLU unscaled results

Joy: train on 9 folds, test on 1
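
For reference, this is a standard 10-fold cross-validation arrangement. A minimal sketch of how one fold could be carved out, assuming scikit-learn's KFold; the load_joy_instances() loader is a placeholder, not the actual training script:

from sklearn.model_selection import KFold

# Hypothetical loader returning the labelled joy instances as a list.
instances = load_joy_instances()  # placeholder, not the real data-loading code

kf = KFold(n_splits=10, shuffle=True, random_state=13)
for fold, (train_idx, test_idx) in enumerate(kf.split(instances)):
    train = [instances[i] for i in train_idx]  # 9 folds for training
    test = [instances[i] for i in test_idx]    # 1 held-out fold for testing
    # ... write the split to disk and launch one `allennlp train` run per fold
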

2020-05-13 12:15:50,734 - INFO - allennlp.common.util - Metrics: {
  "best_epoch": 12,
  "peak_cpu_memory_MB": 2626.072,
  "peak_gpu_0_memory_MB": 8085,
  "peak_gpu_1_memory_MB": 21478,
  "training_duration": "0:23:41.867688",
  "training_start_epoch": 0,
  "training_epochs": 21,
  "epoch": 21,
  "training_pearson": 0.9939782728494088,
  "training_mae": 0.14345509548909208,
  "training_loss": 0.03389307007519076,
  "training_cpu_memory_MB": 2626.072,
  "training_gpu_0_memory_MB": 7409,
  "training_gpu_1_memory_MB": 18680,
  "validation_pearson": 0.8524911266953036,
  "validation_mae": 0.7244842611554498,
  "validation_loss": 0.8840915312369665,
  "best_validation_pearson": 0.8559774687113151,
  "best_validation_mae": 0.6610433984638224,
  "best_validation_loss": 0.7440112556020418
}

Sadness: train on 9 folds, test on 1

The sadness examples are longer and some are being clipped. I don't know how many are being clipped, though, because AllenNLP only reports the first instance of clipping.
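
One way to get an actual count is to tokenize the sadness examples separately and count how many exceed the length limit. A rough sketch, assuming a bert-base-uncased tokenizer, a 512-token limit, and a hypothetical one-example-per-line file; the tokenizer name, limit, and file path are assumptions, not necessarily what the training config uses:

from transformers import AutoTokenizer

MAX_LEN = 512  # assumed limit; check the actual max_length in the training config
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed model

clipped = 0
with open("sadness_train.txt") as f:  # hypothetical file: one example per line
    for line in f:
        n_tokens = len(tokenizer.tokenize(line.strip()))
        if n_tokens + 2 > MAX_LEN:  # +2 for the [CLS] and [SEP] special tokens
            clipped += 1
print(f"{clipped} examples would be clipped at {MAX_LEN} tokens")
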

2020-05-13 12:22:49,449 - INFO - allennlp.common.util - Metrics: {
  "best_epoch": 1,
  "peak_cpu_memory_MB": 2759.188,
  "peak_gpu_0_memory_MB": 7409,
  "peak_gpu_1_memory_MB": 18680,
  "training_duration": "0:17:02.438299",
  "training_start_epoch": 0,
  "training_epochs": 10,
  "epoch": 10,
  "training_pearson": -0.01633020200422353,
  "training_mae": 1.3776687749682646,
  "training_loss": 2.781699788029837,
  "training_cpu_memory_MB": 2759.188,
  "training_gpu_0_memory_MB": 7409,
  "training_gpu_1_memory_MB": 11,
  "validation_pearson": 0,
  "validation_mae": 1.355329878786777,
  "validation_loss": 2.7624370823515223,
  "best_validation_pearson": 0.23847302176509885,
  "best_validation_mae": 1.3548242284896526,
  "best_validation_loss": 2.7604094781774156
}