pfcfuse/logs/log_20241006_164707.log

2.4.1+cu121
True
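
The two lines above are the PyTorch build string and the CUDA availability flag printed at startup. A minimal sketch of the kind of check that produces them is shown here; the exact print statements in PFCFuse's train.py are an assumption, the log only shows their output.

import torch

# Print the installed PyTorch build (e.g. "2.4.1+cu121") and whether CUDA is usable.
print(torch.__version__)
print(torch.cuda.is_available())
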
Model: PFCFuse
Number of epochs: 60
Epoch gap: 40
Learning rate: 0.0001
Weight decay: 0
Batch size: 1
GPU number: 0
Coefficient of MSE loss VF: 1.0
Coefficient of MSE loss IF: 1.0
Coefficient of RMI loss VF: 1.0
Coefficient of RMI loss IF: 1.0
Coefficient of Cosine loss VF: 1.0
Coefficient of Cosine loss IF: 1.0
Coefficient of Decomposition loss: 2.0
Coefficient of Total Variation loss: 5.0
Clip gradient norm value: 0.01
Optimization step: 20
Optimization gamma: 0.5
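
The header block above maps onto a handful of training hyperparameters. The sketch below shows one plausible way to declare them with argparse and wire the optimization-related values into Adam, gradient clipping, and a StepLR schedule. All argument names, the optimizer choice, and the scheduler choice are assumptions (the log only records the values); the loss-term coefficients are used in the sketch after the progress log below.

import argparse
import torch

parser = argparse.ArgumentParser(description="PFCFuse training (hypothetical argument names)")
parser.add_argument("--epochs", type=int, default=60)               # Number of epochs
parser.add_argument("--epoch_gap", type=int, default=40)            # Epoch gap (likely the phase-I training length)
parser.add_argument("--lr", type=float, default=1e-4)               # Learning rate
parser.add_argument("--weight_decay", type=float, default=0.0)      # Weight decay
parser.add_argument("--batch_size", type=int, default=1)            # Batch size
parser.add_argument("--gpu", type=int, default=0)                   # GPU number
parser.add_argument("--clip_grad_norm", type=float, default=0.01)   # Clip gradient norm value
parser.add_argument("--step_size", type=int, default=20)            # Optimization step (LR schedule interval)
parser.add_argument("--gamma", type=float, default=0.5)             # Optimization gamma (LR decay factor)
args = parser.parse_args()

# Hypothetical wiring: the log does not say which optimizer or scheduler PFCFuse uses.
model = torch.nn.Linear(8, 8)  # stand-in for the real encoder/decoder networks
optimizer = torch.optim.Adam(model.parameters(), lr=args.lr, weight_decay=args.weight_decay)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=args.step_size, gamma=args.gamma)
# args.clip_grad_norm would be passed to torch.nn.utils.clip_grad_norm_ inside the training loop.
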
[Epoch 0/60] [Batch 0/6487] [loss: 6.843450] ETA: 10 days, 1
[Epoch 0/60] [Batch 1/6487] [loss: 17.473789] ETA: 10:18:13.0
[Epoch 0/60] [Batch 2/6487] [loss: 6.973145] ETA: 9:26:17.17
[Epoch 0/60] [Batch 3/6487] [loss: 7.598927] ETA: 9:24:09.86
[Epoch 0/60] [Batch 4/6487] [loss: 7.397294] ETA: 9:21:32.02
[Epoch 0/60] [Batch 5/6487] [loss: 17.675234] ETA: 9:24:20.36
[Epoch 0/60] [Batch 6/6487] [loss: 11.842889] ETA: 9:43:36.42
[Epoch 0/60] [Batch 7/6487] [loss: 8.561872] ETA: 9:32:16.41
[Epoch 0/60] [Batch 8/6487] [loss: 8.628882] ETA: 9:40:58.48
[Epoch 0/60] [Batch 9/6487] [loss: 3.025908] ETA: 9:26:52.01
[Epoch 0/60] [Batch 10/6487] [loss: 12.309198] ETA: 9:37:37.31
[Epoch 0/60] [Batch 11/6487] [loss: 10.065054] ETA: 9:23:22.58
[Epoch 0/60] [Batch 12/6487] [loss: 5.186013] ETA: 9:38:36.34
[Epoch 0/60] [Batch 13/6487] [loss: 5.387490] ETA: 9:49:38.61
[Epoch 0/60] [Batch 14/6487] [loss: 5.509142] ETA: 9:21:29.86
[Epoch 0/60] [Batch 15/6487] [loss: 6.785795] ETA: 9:27:37.98
[Epoch 0/60] [Batch 16/6487] [loss: 7.973134] ETA: 9:29:15.88
[Epoch 0/60] [Batch 17/6487] [loss: 21.794090] ETA: 9:29:22.66
[Epoch 0/60] [Batch 18/6487] [loss: 4.961427] ETA: 9:33:11.30
[Epoch 0/60] [Batch 19/6487] [loss: 14.073445] ETA: 9:27:57.20
[Epoch 0/60] [Batch 20/6487] [loss: 6.013936] ETA: 9:26:16.99
[Epoch 0/60] [Batch 21/6487] [loss: 13.236930] ETA: 9:24:25.93
[Epoch 0/60] [Batch 22/6487] [loss: 8.306091] ETA: 9:36:14.12
[Epoch 0/60] [Batch 23/6487] [loss: 3.355170] ETA: 9:59:40.66
[Epoch 0/60] [Batch 24/6487] [loss: 2.904986] ETA: 9:07:01.95
[Epoch 0/60] [Batch 25/6487] [loss: 2.231014] ETA: 9:44:41.24
[Epoch 0/60] [Batch 26/6487] [loss: 6.787667] ETA: 9:43:30.81
[Epoch 0/60] [Batch 27/6487] [loss: 7.387001] ETA: 9:34:29.57
[Epoch 0/60] [Batch 28/6487] [loss: 4.501630] ETA: 9:39:13.60
[Epoch 0/60] [Batch 29/6487] [loss: 2.489206] ETA: 9:28:18.69
[Epoch 0/60] [Batch 30/6487] [loss: 3.574013] ETA: 9:29:23.56
[Epoch 0/60] [Batch 31/6487] [loss: 6.969161] ETA: 9:43:31.66
[Epoch 0/60] [Batch 32/6487] [loss: 3.075920] ETA: 9:34:19.75
[Epoch 0/60] [Batch 33/6487] [loss: 2.088318] ETA: 9:28:40.24
[Epoch 0/60] [Batch 34/6487] [loss: 3.432371] ETA: 9:32:47.99
[Epoch 0/60] [Batch 35/6487] [loss: 4.036960] ETA: 9:39:26.43
[Epoch 0/60] [Batch 36/6487] [loss: 2.675624] ETA: 9:32:24.90
[Epoch 0/60] [Batch 37/6487] [loss: 2.401388] ETA: 9:36:22.25
[Epoch 0/60] [Batch 38/6487] [loss: 2.432465] ETA: 9:29:30.37
[Epoch 0/60] [Batch 39/6487] [loss: 3.220938] ETA: 9:31:25.53
[Epoch 0/60] [Batch 40/6487] [loss: 2.949226] ETA: 9:56:07.63
[Epoch 0/60] [Batch 41/6487] [loss: 2.188518] ETA: 9:39:39.16
[Epoch 0/60] [Batch 42/6487] [loss: 2.371767] ETA: 9:21:48.31
[Epoch 0/60] [Batch 43/6487] [loss: 2.663700] ETA: 9:31:15.52
[Epoch 0/60] [Batch 44/6487] [loss: 1.953101] ETA: 9:36:49.47
[Epoch 0/60] [Batch 45/6487] [loss: 1.967318] ETA: 9:28:22.67
[Epoch 0/60] [Batch 46/6487] [loss: 1.681611] ETA: 9:40:13.14
[Epoch 0/60] [Batch 47/6487] [loss: 1.203847] ETA: 11:38:35.6
[Epoch 0/60] [Batch 48/6487] [loss: 1.616149] ETA: 10:20:08.6
[Epoch 0/60] [Batch 49/6487] [loss: 2.641722] ETA: 10:36:38.7
[Epoch 0/60] [Batch 50/6487] [loss: 2.627393] ETA: 10:16:08.0
[Epoch 0/60] [Batch 51/6487] [loss: 2.047213] ETA: 10:27:03.4
[Epoch 0/60] [Batch 52/6487] [loss: 1.524367] ETA: 10:14:40.0
[Epoch 0/60] [Batch 53/6487] [loss: 1.499193] ETA: 10:31:07.6
[Epoch 0/60] [Batch 54/6487] [loss: 1.482741] ETA: 10:04:11.4
[Epoch 0/60] [Batch 55/6487] [loss: 0.953166] ETA: 10:25:37.7
[Epoch 0/60] [Batch 56/6487] [loss: 1.346713] ETA: 10:16:22.0
[Epoch 0/60] [Batch 57/6487] [loss: 1.526123] ETA: 10:15:25.8
[Epoch 0/60] [Batch 58/6487] [loss: 1.643487] ETA: 10:17:12.1
[Epoch 0/60] [Batch 59/6487] [loss: 0.794820] ETA: 10:02:32.9
[Epoch 0/60] [Batch 60/6487] [loss: 1.490152] ETA: 10:10:54.4
[Epoch 0/60] [Batch 61/6487] [loss: 1.175192] ETA: 10:23:03.9
[Epoch 0/60] [Batch 62/6487] [loss: 1.577219] ETA: 10:32:47.5
[Epoch 0/60] [Batch 63/6487] [loss: 1.808056] ETA: 10:17:46.9
[Epoch 0/60] [Batch 64/6487] [loss: 1.146155] ETA: 10:13:44.6
[Epoch 0/60] [Batch 65/6487] [loss: 0.982462] ETA: 10:10:09.3
[Epo
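
Each progress line reports the running total loss and an ETA for the remaining iterations. A minimal sketch of how such a line can be produced is shown below; the individual loss terms, the helper functions, and the ETA formula are assumptions based only on the coefficients and the log format, not on PFCFuse's actual train.py. Because the progress line is typically written with a carriage return rather than a newline, a captured log file tends to show many updates run together on one physical line.

import datetime
import sys
import time

import torch

def weighted_total_loss(mse_vf, mse_if, rmi_vf, rmi_if, cos_vf, cos_if, decomp, tv):
    """Combine the individual loss terms with the coefficients listed in the log header.

    All eight inputs are assumed to be scalar tensors computed elsewhere; the
    1.0 / 2.0 / 5.0 weights match the 'Coefficient of ...' lines above.
    """
    return (1.0 * mse_vf + 1.0 * mse_if
            + 1.0 * rmi_vf + 1.0 * rmi_if
            + 1.0 * cos_vf + 1.0 * cos_if
            + 2.0 * decomp + 5.0 * tv)

def log_progress(epoch, n_epochs, batch, n_batches, loss_value, batch_time):
    """Print one '[Epoch ..] [Batch ..] [loss: ..] ETA: ..' line, overwriting it in place."""
    batches_left = n_batches * (n_epochs - epoch) - batch
    eta = datetime.timedelta(seconds=batches_left * batch_time)
    sys.stdout.write(
        "\r[Epoch %d/%d] [Batch %d/%d] [loss: %f] ETA: %s"
        % (epoch, n_epochs, batch, n_batches, loss_value, eta)
    )

# Tiny usage example with dummy scalar loss terms.
terms = [torch.tensor(0.5) for _ in range(8)]
start = time.time()
loss = weighted_total_loss(*terms)
log_progress(epoch=0, n_epochs=60, batch=0, n_batches=6487,
             loss_value=loss.item(), batch_time=time.time() - start)
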
File "/home/star/whaiDir/PFCFuse/train.py", line 151, in <module>
feature_V_B, feature_V_D, _ = DIDF_Encoder(data_VIS)
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 170, in forward
for t in chain(self.module.parameters(), self.module.buffers()):
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2310, in buffers
for _, buf in self.named_buffers(recurse=recurse):
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2337, in named_buffers
yield from gen
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2223, in _named_members
for module_prefix, module in modules:
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2435, in named_modules
yield from module.named_modules(memo, submodule_prefix, remove_duplicate)
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2435, in named_modules
yield from module.named_modules(memo, submodule_prefix, remove_duplicate)
File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2435, in named_modules
yield from module.named_modules(memo, submodule_prefix, remove_duplicate)
[Previous line repeated 2 more times]
KeyboardInterrupt
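
The traceback shows the run stopping inside the call to DIDF_Encoder, which is wrapped in torch.nn.DataParallel (hence the data_parallel.py and parameters()/buffers() frames), and the final exception is a KeyboardInterrupt, i.e. the process was stopped manually (Ctrl+C) rather than failing on its own. A minimal sketch of the calling pattern implied by line 151 of train.py follows; the encoder body is a placeholder, since only its name and its three return values appear in the log.

import torch
import torch.nn as nn

class DummyEncoder(nn.Module):
    """Placeholder for PFCFuse's real encoder; only the 3-tuple return shape is taken from the log."""
    def __init__(self):
        super().__init__()
        self.base = nn.Conv2d(1, 16, 3, padding=1)
        self.detail = nn.Conv2d(1, 16, 3, padding=1)

    def forward(self, x):
        # Return (base feature, detail feature, extra output), mirroring
        # `feature_V_B, feature_V_D, _ = DIDF_Encoder(data_VIS)` in train.py.
        return self.base(x), self.detail(x), None

device = "cuda" if torch.cuda.is_available() else "cpu"
# DataParallel is what introduces the module.parameters()/buffers() frames in the traceback.
DIDF_Encoder = nn.DataParallel(DummyEncoder()).to(device)

data_VIS = torch.rand(1, 1, 128, 128, device=device)  # dummy visible-image batch
feature_V_B, feature_V_D, _ = DIDF_Encoder(data_VIS)
print(feature_V_B.shape, feature_V_D.shape)
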