PyTorch version: 2.4.1+cu121
CUDA available: True
Model: PFCFuse
Number of epochs: 60
Epoch gap: 40
Learning rate: 0.0001
Weight decay: 0
Batch size: 1
GPU number: 0
Coefficient of MSE loss VF: 1.0
Coefficient of MSE loss IF: 1.0
Coefficient of RMI loss VF: 1.0
Coefficient of RMI loss IF: 1.0
Coefficient of Cosine loss VF: 1.0
Coefficient of Cosine loss IF: 1.0
Coefficient of Decomposition loss: 2.0
Coefficient of Total Variation loss: 5.0
Clip gradient norm value: 0.01
Optimization step: 20
Optimization gamma: 0.5
[Epoch 0/60] [Batch 0/6487] [loss: 5.934072] ETA: 10 days, 0
[Epoch 0/60] [Batch 1/6487] [loss: 7.354875] ETA: 10:34:34.9
[Epoch 0/60] [Batch 2/6487] [loss: 7.649462] ETA: 11:02:11.7
[Epoch 0/60] [Batch 3/6487] [loss: 4.681341] ETA: 9:54:34.61
[Epoch 0/60] [Batch 4/6487] [loss: 15.397819] ETA: 9:54:57.26
[Epoch 0/60] [Batch 5/6487] [loss: 11.085931] ETA: 10:22:22.5
[Epoch 0/60] [Batch 6/6487] [loss: 13.419497] ETA: 11:44:06.8
[Epoch 0/60] [Batch 7/6487] [loss: 8.841534] ETA: 10:31:40.0
[Epoch 0/60] [Batch 8/6487] [loss: 4.809514] ETA: 10:40:48.2
[Epoch 0/60] [Batch 9/6487] [loss: 5.460008] ETA: 10:35:16.4
[Epoch 0/60] [Batch 10/6487] [loss: 6.607483] ETA: 11:04:44.2
[Epoch 0/60] [Batch 11/6487] [loss: 8.002920] ETA: 10:30:11.7
[Epoch 0/60] [Batch 12/6487] [loss: 6.442471] ETA: 10:17:31.8
[Epoch 0/60] [Batch 13/6487] [loss: 12.265147] ETA: 11:30:47.2
[Epoch 0/60] [Batch 14/6487] [loss: 4.954008] ETA: 11:42:32.6
[Epoch 0/60] [Batch 15/6487] [loss: 10.585257] ETA: 10:55:33.8
[Epoch 0/60] [Batch 16/6487] [loss: 8.780766] ETA: 10:38:00.8
[Epoch 0/60] [Batch 17/6487] [loss: 8.221046] ETA: 10:50:32.7
[Epoch 0/60] [Batch 18/6487] [loss: 4.333150] ETA: 10:44:08.0
[Epoch 0/60] [Batch 19/6487] [loss: 3.702891] ETA: 11:06:37.4
[Epoch 0/60] [Batch 20/6487] [loss: 5.839406] ETA: 11:43:10.3
[Epoch 0/60] [Batch 21/6487] [loss: 3.961552] ETA: 10:05:33.6
[Epoch 0/60] [Batch 22/6487] [loss: 3.017392] ETA: 10:41:59.1
[Epoch 0/60] [Batch 23/6487] [loss: 10.637247] ETA: 10:19:07.7
[Epoch 0/60] [Batch 24/6487] [loss: 2.622610] ETA: 10:14:55.3
[Epoch 0/60] [Batch 25/6487] [loss: 4.779226] ETA: 9:54:12.09
[Epoch 0/60] [Batch 26/6487] [loss: 2.632162] ETA: 9:37:18.26
[Epoch 0/60] [Batch 27/6487] [loss: 3.904268] ETA: 9:56:16.71
[Epoch 0/60] [Batch 28/6487] [loss: 7.274397] ETA: 9:40:39.06
[Epoch 0/60] [Batch 29/6487] [loss: 2.386863] ETA: 10:30:09.7
[Epoch 0/60] [Batch 30/6487] [loss: 2.582200] ETA: 10:08:56.5
[Epoch 0/60] [Batch 31/6487] [loss: 3.013002] ETA: 10:15:19.2
[Epoch 0/60] [Batch 32/6487] [loss: 5.693004] ETA: 10:05:53.4
[Epoch 0/60] [Batch 33/6487] [loss: 2.899549] ETA: 10:03:38.5
[Epoch 0/60] [Batch 34/6487] [loss: 3.194182] ETA: 10:33:07.2
[Epoch 0/60] [Batch 35/6487] [loss: 3.742230] ETA: 10:10:17.6
[Epoch 0/60] [Batch 36/6487] [loss: 3.236870] ETA: 9:40:05.59
[Epoch 0/60] [Batch 37/6487] [loss: 2.928835] ETA: 9:37:05.77
[Epoch 0/60] [Batch 38/6487] [loss: 2.313494] ETA: 9:51:05.79
[Epoch 0/60] [Batch 39/6487] [loss: 2.723912] ETA: 9:23:56.15
[Epoch 0/60] [Batch 40/6487] [loss: 1.680832] ETA: 9:50:13.36
[Epoch 0/60] [Batch 41/6487] [loss: 1.744040] ETA: 9:30:19.47
[Epoch 0/60] [Batch 42/6487] [loss: 1.740687] ETA: 9:33:38.13
[Epoch 0/60] [Batch 43/6487] [loss: 2.356352] ETA: 9:46:33.46
[Epoch 0/60] [Batch 44/6487] [loss: 2.442476] ETA: 10:26:42.3
[Epoch 0/60] [Batch 45/6487] [loss: 1.624857] ETA: 9:38:42.11
[Epoch 0/60] [Batch 46/6487] [loss: 1.195396] ETA: 10:01:12.9
[Epoch 0/60] [Batch 47/6487] [loss: 1.149045] ETA: 10:24:53.1
[Epoch 0/60] [Batch 48/6487] [loss: 1.695918] ETA: 9:39:42.25
[Epoch 0/60] [Batch 49/6487] [loss: 2.567844] ETA: 9:28:53.50
[Epoch 0/60] [Batch 50/6487] [loss: 1.230891] ETA: 9:34:51.10
[Epoch 0/60] [Batch 51/6487] [loss: 1.958381] ETA: 10:21:43.3
[Epoch 0/60] [Batch 52/6487] [loss: 1.503905] ETA: 9:47:06.89
[Epoch 0/60] [Batch 53/6487] [loss: 2.220990] ETA: 9:38:16.26
[Epoch 0/60] [Batch 54/6487] [loss: 1.354937] ETA: 10:13:29.8
[Epoch 0/60] [Batch 55/6487] [loss: 1.741669] ETA: 9:47:41.13
[Epoch 0/60] [Batch 56/6487] [loss: 1.656092] ETA: 9:38:35.94
[Epoch 0/60] [Batch 57/6487] [loss: 1.487051] ETA: 9:43:43.98
[Epoch 0/60] [Batch 58/6487] [loss: 1.158252] ETA: 9:37:07.43
[Epoch 0/60] [Batch 59/6487] [loss: 1.418594] ETA: 9:45:47.30
[Epoch 0/60] [Batch 60/6487] [loss: 0.926793] ETA: 9:34:48.45
[Epoch 0/60] [Batch 61/6487] [loss: 1.154947] ETA: 9:45:05.27
[Epoch 0/60] [Batch 62/6487] [loss: 1.088354] ETA: 9:36:13.17
[Epoch 0/60] [Batch 63/6487] [loss: 1.382724] ETA: 9:45:53.15
[Epoch 0/60] [Batch 64/6487] [loss: 1.270256] ETA: 9:34:49.67
[Epoch 0/60] [Batch 65/6487] [loss: 1.212800] ETA: 9:38:32.91
[Epoch
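The ETA column looks like a printed datetime.timedelta (the truncated "10 days, 0" on the first batch matches timedelta's "N days, H:MM:SS" string form). The huge first estimate is normal: it extrapolates from batch 0 alone, which absorbs CUDA warm-up. A hypothetical sketch of how such an estimate is commonly computed (not PFCFuse's actual code):

import datetime
import time

# Hypothetical sketch: ETA = remaining batches * mean time per batch so far.
# An oversized first estimate falls out naturally when batch 0 is the slowest.
epochs, batches_per_epoch = 60, 6487          # values from the log above
total = epochs * batches_per_epoch
start = time.time()
for done in range(1, 4):                      # a few fake steps; a real loop runs to total
    time.sleep(0.01)                          # stand-in for forward/backward/step
    mean_batch_time = (time.time() - start) / done
    eta = datetime.timedelta(seconds=mean_batch_time * (total - done))
    print(f"[Batch {done - 1}/{batches_per_epoch}] ETA: {eta}")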
Traceback (most recent call last):
  File "/home/star/whaiDir/PFCFuse/train.py", line 180, in <module>
    loss.backward()
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/_tensor.py", line 521, in backward
    torch.autograd.backward(
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/autograd/__init__.py", line 289, in backward
    _engine_run_backward(
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/autograd/graph.py", line 769, in _engine_run_backward
    return Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
KeyboardInterrupt
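The traceback above is not a crash: KeyboardInterrupt means the run was stopped manually (Ctrl+C) while loss.backward() was executing. If runs are expected to be interrupted this way, a common pattern is to catch the exception and save a checkpoint before exiting. A minimal runnable sketch with placeholder names (the tiny Linear model and the checkpoint path are invented for illustration; this is a general pattern, not PFCFuse's own checkpointing):

import torch
from torch import nn
from torch.optim import Adam

# Hypothetical sketch: catch Ctrl+C so an interrupted run saves a checkpoint
# instead of discarding progress. All names and the path are placeholders.
model = nn.Linear(4, 4)
optimizer = Adam(model.parameters(), lr=1e-4)
epoch = 0
try:
    for epoch in range(60):
        x = torch.randn(8, 4)
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()          # a Ctrl+C here raises KeyboardInterrupt, as in the log
        optimizer.step()
except KeyboardInterrupt:
    torch.save({"model": model.state_dict(),
                "optimizer": optimizer.state_dict(),
                "epoch": epoch}, "interrupted_checkpoint.pth")
    print("Interrupted: checkpoint saved.")

Saving the optimizer state alongside the model lets a resumed run continue mid-schedule instead of restarting from epoch 0.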