pfcfuse/logs/log_20241006_221548.log

2.4.1+cu121
True
Model: PFCFuse
Number of epochs: 60
Epoch gap: 40
Learning rate: 0.0001
Weight decay: 0
Batch size: 1
GPU number: 0
Coefficient of MSE loss VF: 1.0
Coefficient of MSE loss IF: 1.0
Coefficient of RMI loss VF: 1.0
Coefficient of RMI loss IF: 1.0
Coefficient of Cosine loss VF: 1.0
Coefficient of Cosine loss IF: 1.0
Coefficient of Decomposition loss: 2.0
Coefficient of Total Variation loss: 5.0
Clip gradient norm value: 0.01
Optimization step: 20
Optimization gamma: 0.5
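The hyperparameters above map onto a standard PyTorch setup. A minimal sketch, assuming Adam with a StepLR schedule (the "Optimization step/gamma" pair) and norm-based gradient clipping; the model here is a stand-in, only the numeric values come from this log:

```python
import torch
import torch.nn as nn

model = nn.Conv2d(1, 8, 3, padding=1)  # illustrative stand-in for the fusion network

# "Learning rate: 0.0001", "Weight decay: 0"
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=0)

# "Optimization step: 20", "Optimization gamma: 0.5" -> halve the LR every 20 epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=20, gamma=0.5)

x = torch.randn(1, 1, 16, 16)  # "Batch size: 1"
loss = model(x).abs().mean()   # placeholder loss, not the logged loss terms
loss.backward()

# "Clip gradient norm value: 0.01" -> rescale gradients to a total norm of at most 0.01
nn.utils.clip_grad_norm_(model.parameters(), max_norm=0.01)
optimizer.step()
scheduler.step()
```

Note the clip value of 0.01 is unusually aggressive; combined with the logged coefficients (decomposition 2.0, total variation 5.0) it keeps the early loss spikes visible below from destabilizing training.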
[Epoch 0/60] [Batch 0/6487] [loss: 9.264997] ETA: 9 days, 23
[Epoch 0/60] [Batch 1/6487] [loss: 34.969955] ETA: 10:10:55.4
[Epoch 0/60] [Batch 2/6487] [loss: 4.685072] ETA: 9:18:17.32
[Epoch 0/60] [Batch 3/6487] [loss: 6.396908] ETA: 9:07:01.12
[Epoch 0/60] [Batch 4/6487] [loss: 7.493115] ETA: 9:13:54.91
[Epoch 0/60] [Batch 5/6487] [loss: 10.458361] ETA: 9:13:42.39
[Epoch 0/60] [Batch 6/6487] [loss: 6.037536] ETA: 9:16:39.82
[Epoch 0/60] [Batch 7/6487] [loss: 8.048889] ETA: 9:16:37.41
[Epoch 0/60] [Batch 8/6487] [loss: 8.864985] ETA: 9:21:25.64
[Epoch 0/60] [Batch 9/6487] [loss: 5.235039] ETA: 9:21:18.97
[Epoch 0/60] [Batch 10/6487] [loss: 8.189146] ETA: 9:10:39.52
[Epoch 0/60] [Batch 11/6487] [loss: 28.157166] ETA: 9:12:02.58
[Epoch 0/60] [Batch 12/6487] [loss: 24.957096] ETA: 9:20:00.30
[Epoch 0/60] [Batch 13/6487] [loss: 6.462063] ETA: 9:14:48.89
[Epoch 0/60] [Batch 14/6487] [loss: 7.996860] ETA: 9:10:29.91
[Epoch 0/60] [Batch 15/6487] [loss: 31.331648] ETA: 9:09:35.26
[Epoch 0/60] [Batch 16/6487] [loss: 3.947951] ETA: 9:13:43.21
[Epoch 0/60] [Batch 17/6487] [loss: 3.096982] ETA: 9:13:46.84
[Epoch 0/60] [Batch 18/6487] [loss: 10.590578] ETA: 9:20:45.90
[Epoch 0/60] [Batch 19/6487] [loss: 24.223045] ETA: 9:16:02.33
[Epoch 0/60] [Batch 20/6487] [loss: 4.455335] ETA: 9:18:29.60
[Epoch 0/60] [Batch 21/6487] [loss: 6.397137] ETA: 9:17:36.25
[Epoch 0/60] [Batch 22/6487] [loss: 5.610237] ETA: 9:14:34.20
[Epoch 0/60] [Batch 23/6487] [loss: 14.335532] ETA: 9:16:27.60
[Epoch 0/60] [Batch 24/6487] [loss: 6.131247] ETA: 9:21:42.91
[Epoch 0/60] [Batch 25/6487] [loss: 25.698673] ETA: 9:48:32.66
[Epoch 0/60] [Batch 26/6487] [loss: 17.905416] ETA: 8:45:44.79
[Epoch 0/60] [Batch 27/6487] [loss: 12.629596] ETA: 9:29:06.19
[Epoch 0/60] [Batch 28/6487] [loss: 2.759912] ETA: 9:19:09.09
[Epoch 0/60] [Batch 29/6487] [loss: 3.437797] ETA: 9:16:02.49
[Epoch 0/60] [Batch 30/6487] [loss: 4.955575] ETA: 9:15:34.85
[Epoch 0/60] [Batch 31/6487] [loss: 12.267755] ETA: 9:25:52.84
[Epoch 0/60] [Batch 32/6487] [loss: 6.603993] ETA: 9:23:30.97
[Epoch 0/60] [Batch 33/6487] [loss: 4.700416] ETA: 9:19:23.87
[Epoch 0/60] [Batch 34/6487] [loss: 2.034435] ETA: 9:21:06.78
[Epoch 0/60] [Batch 35/6487] [loss: 2.524534] ETA: 9:20:06.57
[Epoch 0/60] [Batch 36/6487] [loss: 2.145031] ETA: 9:21:24.33
[Epoch 0/60] [Batch 37/6487] [loss: 5.916213] ETA: 9:10:36.86
[Epoch 0/60] [Batch 38/6487] [loss: 2.583171] ETA: 9:21:26.02
[Epoch 0/60] [Batch 39/6487] [loss: 3.443858] ETA: 9:21:26.21
[Epoch 0/60] [Batch 40/6487] [loss: 2.324850] ETA: 9:18:54.13
[Epoch 0/60] [Batch 41/6487] [loss: 1.963822] ETA: 9:18:47.65
[Epoch 0/60] [Batch 42/6487] [loss: 2.487358] ETA: 9:15:25.28
[Epoch 0/60] [Batch 43/6487] [loss: 4.819328] ETA: 9:22:20.42
[Epoch 0/60] [Batch 44/6487] [loss: 2.229982] ETA: 9:17:03.19
[Epoch 0/60] [Batch 45/6487] [loss: 2.486492] ETA: 9:20:22.41
[Epoch 0/60] [Batch 46/6487] [loss: 1.955426] ETA: 9:20:36.80
[Epoch 0/60] [Batch 47/6487] [loss: 2.649206] ETA: 9:26:42.38
[Epoch 0/60] [Batch 48/6487] [loss: 2.574780] ETA: 9:20:39.78
[Epoch 0/60] [Batch 49/6487] [loss: 2.521971] ETA: 9:21:56.24
[Epoch 0/60] [Batch 50/6487] [loss: 1.926707] ETA: 9:48:40.60
[Epoch 0/60] [Batch 51/6487] [loss: 1.874200] ETA: 8:55:21.46
[Epoch 0/60] [Batch 52/6487] [loss: 1.727591] ETA: 9:21:21.56
[Epoch 0/60] [Batch 53/6487] [loss: 1.662356] ETA: 9:21:41.05
[Epoch 0/60] [Batch 54/6487] [loss: 1.870807] ETA: 9:23:10.96
[Epoch 0/60] [Batch 55/6487] [loss: 2.336739] ETA: 9:31:00.64
[Epoch 0/60] [Batch 56/6487] [loss: 2.303160] ETA: 9:16:50.56
[Epoch 0/60] [Batch 57/6487] [loss: 1.943992] ETA: 9:22:21.99
[Epoch 0/60] [Batch 58/6487] [loss: 2.118214] ETA: 9:20:51.35
[Epoch 0/60] [Batch 59/6487] [loss: 1.870196] ETA: 9:27:12.69
[Epoch 0/60] [Batch 60/6487] [loss: 2.090588] ETA: 9:13:20.16
[Epoch 0/60] [Batch 61/6487] [loss: 1.935504] ETA: 9:29:00.42
[Epoch 0/60] [Batch 62/6487] [loss: 1.839970] ETA: 10:03:28.4
[Epoch 0/60] [Batch 63/6487] [loss: 1.744011] ETA: 9:11:34.13
[Epoch 0/60] [Batch 64/6487] [loss: 2.003899] ETA: 10:14:38.9
[Epoch 0/60] [Batch 65/6487] [loss: 1.508669] ETA: 9:44:13.88
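The per-batch ETA figures above follow the usual pattern of scaling elapsed time by the remaining batch count, which is why the batch-0 estimate ("9 days, 23 …") is wildly off before the average stabilizes. A minimal sketch, assuming this standard formula (the exact code in train.py is not shown in this log):

```python
import datetime

def eta(elapsed_seconds, batches_done, batches_total):
    """Estimate remaining wall-clock time from the average seconds per batch so far."""
    per_batch = elapsed_seconds / max(batches_done, 1)
    return datetime.timedelta(seconds=(batches_total - batches_done) * per_batch)

# e.g. 10 s elapsed after 2 of 12 batches -> 5 s/batch, 10 batches left -> 0:00:50
```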
Traceback (most recent call last):
  File "/home/star/whaiDir/PFCFuse/train.py", line 151, in <module>
    feature_V_B, feature_V_D, _ = DIDF_Encoder(data_VIS)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 184, in forward
    return self.module(*inputs[0], **module_kwargs[0])
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 455, in forward
    out_enc_level1 = self.encoder_level1(inp_enc_level1)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/container.py", line 219, in forward
    input = module(input)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 412, in forward
    x = x + self.ffn(self.norm2(x))
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 331, in forward
    return to_4d(self.body(to_3d(x)), h, w)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 317, in forward
    sigma = x.var(-1, keepdim=True, unbiased=False)
KeyboardInterrupt
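The KeyboardInterrupt is a manual stop, not a crash; the traceback merely shows where execution happened to be, inside a LayerNorm that reshapes 4-D feature maps to 3-D token form (net.py:331 calls `to_4d(self.body(to_3d(x)), h, w)`, and net.py:317 computes `sigma = x.var(-1, keepdim=True, unbiased=False)`). A minimal sketch of that structure, in the Restormer style such encoders typically follow; the class and helper names here are illustrative, only the two quoted lines come from the trace:

```python
import torch
import torch.nn as nn

def to_3d(x):
    # (B, C, H, W) -> (B, H*W, C): flatten spatial dims, put channels last
    return x.flatten(2).transpose(1, 2)

def to_4d(x, h, w):
    # (B, H*W, C) -> (B, C, H, W): inverse of to_3d
    return x.transpose(1, 2).reshape(x.shape[0], -1, h, w)

class BiasFreeLayerNorm(nn.Module):
    """Normalizes over the channel dim using only a learned scale (no shift)."""
    def __init__(self, channels):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(channels))

    def forward(self, x):
        # the line the traceback points at: biased per-position channel variance
        sigma = x.var(-1, keepdim=True, unbiased=False)
        return x / torch.sqrt(sigma + 1e-5) * self.weight

class LayerNorm(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.body = BiasFreeLayerNorm(channels)

    def forward(self, x):
        h, w = x.shape[-2:]
        # net.py:331 in the traceback has exactly this shape round-trip
        return to_4d(self.body(to_3d(x)), h, w)
```

The round-trip through `to_3d`/`to_4d` lets a token-wise norm operate on convolutional feature maps without changing their layout.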