pfcfuse/logs/log_20241006_221548.log
zjut 0e6064181a Add SAR image processing support and optimize model performance
- Add BaseFeatureExtractionSAR and DetailFeatureExtractionSAR classes, dedicated to feature extraction from SAR images
- Add SAR support to Restormer_Encoder; the new SAR feature-extraction modules improve the model's ability to process SAR imagery
- Update test_IVF.py with SAR image tests to verify model performance across datasets
- With these changes, the model improves markedly on the TNO and RoadScene datasets; detailed metrics are in the log file
2024-10-09 12:09:30 +08:00
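The commit above names two new SAR feature-extraction modules but this log does not show their code. A minimal hypothetical sketch of how such branches could look (the real implementations live in net.py; every internal below is an assumption, only the class names come from the commit):

```python
import torch
import torch.nn as nn

class BaseFeatureExtractionSAR(nn.Module):
    """Illustrative base (low-frequency) feature branch for SAR inputs."""
    def __init__(self, dim=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(dim, dim, 3, padding=1, bias=False),
            nn.GELU(),
            nn.Conv2d(dim, dim, 3, padding=1, bias=False),
        )

    def forward(self, x):
        # residual refinement keeps the input scale intact
        return x + self.body(x)

class DetailFeatureExtractionSAR(nn.Module):
    """Illustrative detail (high-frequency) feature branch for SAR inputs."""
    def __init__(self, dim=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(dim, dim, 1, bias=False),
            nn.LeakyReLU(0.2),
            nn.Conv2d(dim, dim, 3, padding=1, bias=False),
        )

    def forward(self, x):
        return self.body(x)
```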

2.4.1+cu121
True
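The two lines above are most likely the printed output of a PyTorch environment check at the top of train.py, e.g.:

```python
import torch

# PyTorch build string, e.g. "2.4.1+cu121" as in this log
print(torch.__version__)
# True when a usable CUDA GPU is visible to the process
print(torch.cuda.is_available())
```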
Model: PFCFuse
Number of epochs: 60
Epoch gap: 40
Learning rate: 0.0001
Weight decay: 0
Batch size: 1
GPU number: 0
Coefficient of MSE loss VF: 1.0
Coefficient of MSE loss IF: 1.0
Coefficient of RMI loss VF: 1.0
Coefficient of RMI loss IF: 1.0
Coefficient of Cosine loss VF: 1.0
Coefficient of Cosine loss IF: 1.0
Coefficient of Decomposition loss: 2.0
Coefficient of Total Variation loss: 5.0
Clip gradient norm value: 0.01
Optimization step: 20
Optimization gamma: 0.5
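The coefficient table above suggests the training objective is a weighted sum of per-term losses. A hedged sketch with an assumed signature (the actual loss lives in train.py; the RMI and cosine terms are omitted for brevity, and `base_corr`/`detail_corr` are illustrative stand-ins for the decomposition statistics):

```python
import torch
import torch.nn.functional as F

def total_loss(fused, vis, ir, base_corr, detail_corr,
               w_mse_vf=1.0, w_mse_if=1.0, w_decomp=2.0, w_tv=5.0):
    """Illustrative weighted sum matching the coefficients in this log."""
    mse_vf = F.mse_loss(fused, vis)   # MSE loss VF
    mse_if = F.mse_loss(fused, ir)    # MSE loss IF
    # decomposition loss: favor correlated base features and
    # decorrelated detail features (CDDFuse-style; an assumption here)
    decomp = detail_corr / (base_corr + 1e-6)
    # total-variation smoothness on the fused image
    tv = (fused[..., 1:, :] - fused[..., :-1, :]).abs().mean() + \
         (fused[..., :, 1:] - fused[..., :, :-1]).abs().mean()
    return w_mse_vf * mse_vf + w_mse_if * mse_if \
        + w_decomp * decomp + w_tv * tv
```

The `Optimization step: 20` / `gamma: 0.5` entries likely correspond to a `StepLR(step_size=20, gamma=0.5)` schedule, and the clip value to `torch.nn.utils.clip_grad_norm_(..., 0.01)`, though neither is confirmed by this log.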
[Epoch 0/60] [Batch 0/6487] [loss: 9.264997] ETA: 9 days, 23
[Epoch 0/60] [Batch 1/6487] [loss: 34.969955] ETA: 10:10:55.4
[Epoch 0/60] [Batch 2/6487] [loss: 4.685072] ETA: 9:18:17.32
[Epoch 0/60] [Batch 3/6487] [loss: 6.396908] ETA: 9:07:01.12
[Epoch 0/60] [Batch 4/6487] [loss: 7.493115] ETA: 9:13:54.91
[Epoch 0/60] [Batch 5/6487] [loss: 10.458361] ETA: 9:13:42.39
[Epoch 0/60] [Batch 6/6487] [loss: 6.037536] ETA: 9:16:39.82
[Epoch 0/60] [Batch 7/6487] [loss: 8.048889] ETA: 9:16:37.41
[Epoch 0/60] [Batch 8/6487] [loss: 8.864985] ETA: 9:21:25.64
[Epoch 0/60] [Batch 9/6487] [loss: 5.235039] ETA: 9:21:18.97
[Epoch 0/60] [Batch 10/6487] [loss: 8.189146] ETA: 9:10:39.52
[Epoch 0/60] [Batch 11/6487] [loss: 28.157166] ETA: 9:12:02.58
[Epoch 0/60] [Batch 12/6487] [loss: 24.957096] ETA: 9:20:00.30
[Epoch 0/60] [Batch 13/6487] [loss: 6.462063] ETA: 9:14:48.89
[Epoch 0/60] [Batch 14/6487] [loss: 7.996860] ETA: 9:10:29.91
[Epoch 0/60] [Batch 15/6487] [loss: 31.331648] ETA: 9:09:35.26
[Epoch 0/60] [Batch 16/6487] [loss: 3.947951] ETA: 9:13:43.21
[Epoch 0/60] [Batch 17/6487] [loss: 3.096982] ETA: 9:13:46.84
[Epoch 0/60] [Batch 18/6487] [loss: 10.590578] ETA: 9:20:45.90
[Epoch 0/60] [Batch 19/6487] [loss: 24.223045] ETA: 9:16:02.33
[Epoch 0/60] [Batch 20/6487] [loss: 4.455335] ETA: 9:18:29.60
[Epoch 0/60] [Batch 21/6487] [loss: 6.397137] ETA: 9:17:36.25
[Epoch 0/60] [Batch 22/6487] [loss: 5.610237] ETA: 9:14:34.20
[Epoch 0/60] [Batch 23/6487] [loss: 14.335532] ETA: 9:16:27.60
[Epoch 0/60] [Batch 24/6487] [loss: 6.131247] ETA: 9:21:42.91
[Epoch 0/60] [Batch 25/6487] [loss: 25.698673] ETA: 9:48:32.66
[Epoch 0/60] [Batch 26/6487] [loss: 17.905416] ETA: 8:45:44.79
[Epoch 0/60] [Batch 27/6487] [loss: 12.629596] ETA: 9:29:06.19
[Epoch 0/60] [Batch 28/6487] [loss: 2.759912] ETA: 9:19:09.09
[Epoch 0/60] [Batch 29/6487] [loss: 3.437797] ETA: 9:16:02.49
[Epoch 0/60] [Batch 30/6487] [loss: 4.955575] ETA: 9:15:34.85
[Epoch 0/60] [Batch 31/6487] [loss: 12.267755] ETA: 9:25:52.84
[Epoch 0/60] [Batch 32/6487] [loss: 6.603993] ETA: 9:23:30.97
[Epoch 0/60] [Batch 33/6487] [loss: 4.700416] ETA: 9:19:23.87
[Epoch 0/60] [Batch 34/6487] [loss: 2.034435] ETA: 9:21:06.78
[Epoch 0/60] [Batch 35/6487] [loss: 2.524534] ETA: 9:20:06.57
[Epoch 0/60] [Batch 36/6487] [loss: 2.145031] ETA: 9:21:24.33
[Epoch 0/60] [Batch 37/6487] [loss: 5.916213] ETA: 9:10:36.86
[Epoch 0/60] [Batch 38/6487] [loss: 2.583171] ETA: 9:21:26.02
[Epoch 0/60] [Batch 39/6487] [loss: 3.443858] ETA: 9:21:26.21
[Epoch 0/60] [Batch 40/6487] [loss: 2.324850] ETA: 9:18:54.13
[Epoch 0/60] [Batch 41/6487] [loss: 1.963822] ETA: 9:18:47.65
[Epoch 0/60] [Batch 42/6487] [loss: 2.487358] ETA: 9:15:25.28
[Epoch 0/60] [Batch 43/6487] [loss: 4.819328] ETA: 9:22:20.42
[Epoch 0/60] [Batch 44/6487] [loss: 2.229982] ETA: 9:17:03.19
[Epoch 0/60] [Batch 45/6487] [loss: 2.486492] ETA: 9:20:22.41
[Epoch 0/60] [Batch 46/6487] [loss: 1.955426] ETA: 9:20:36.80
[Epoch 0/60] [Batch 47/6487] [loss: 2.649206] ETA: 9:26:42.38
[Epoch 0/60] [Batch 48/6487] [loss: 2.574780] ETA: 9:20:39.78
[Epoch 0/60] [Batch 49/6487] [loss: 2.521971] ETA: 9:21:56.24
[Epoch 0/60] [Batch 50/6487] [loss: 1.926707] ETA: 9:48:40.60
[Epoch 0/60] [Batch 51/6487] [loss: 1.874200] ETA: 8:55:21.46
[Epoch 0/60] [Batch 52/6487] [loss: 1.727591] ETA: 9:21:21.56
[Epoch 0/60] [Batch 53/6487] [loss: 1.662356] ETA: 9:21:41.05
[Epoch 0/60] [Batch 54/6487] [loss: 1.870807] ETA: 9:23:10.96
[Epoch 0/60] [Batch 55/6487] [loss: 2.336739] ETA: 9:31:00.64
[Epoch 0/60] [Batch 56/6487] [loss: 2.303160] ETA: 9:16:50.56
[Epoch 0/60] [Batch 57/6487] [loss: 1.943992] ETA: 9:22:21.99
[Epoch 0/60] [Batch 58/6487] [loss: 2.118214] ETA: 9:20:51.35
[Epoch 0/60] [Batch 59/6487] [loss: 1.870196] ETA: 9:27:12.69
[Epoch 0/60] [Batch 60/6487] [loss: 2.090588] ETA: 9:13:20.16
[Epoch 0/60] [Batch 61/6487] [loss: 1.935504] ETA: 9:29:00.42
[Epoch 0/60] [Batch 62/6487] [loss: 1.839970] ETA: 10:03:28.4
[Epoch 0/60] [Batch 63/6487] [loss: 1.744011] ETA: 9:11:34.13
[Epoch 0/60] [Batch 64/6487] [loss: 2.003899] ETA: 10:14:38.9
[Epoch 0/60] [Batch 65/6487] [loss: 1.508669] ETA: 9:44:13.88
[Epoch 0/60] [Batch 66/6487] [loss: 1.015858] ETA: 9:36:19.12
[Epoch 0/60] [Batch 67/6487] [loss: 2.166922] ETA: 9:57:38.49
[Epoch 0/60] [Batch 68/6487] [loss: 1.349658] ETA: 9:53:34.94
[Epoch 0/60] [Batch 69/6487] [loss: 2.234611] ETA: 9:53:26.31
[Epoch 0/60] [Batch 70/6487] [loss: 1.512321] ETA: 9:49:59.22
[Epoch 0/60] [Batch 71/6487] [loss: 1.137223] ETA: 9:35:58.17
[Epoch 0/60] [Batch 72/6487] [loss: 2.020911] ETA: 9:28:33.67
[Epoch 0/60] [Batch 73/6487] [loss: 1.814069] ETA: 9:35:30.72
[Epoch 0/60] [Batch 74/6487] [loss: 1.172289] ETA: 9:38:04.83
[Epoch 0/60] [Batch 75/6487] [loss: 1.442941] ETA: 10:07:36.7
[Epoch 0/60] [Batch 76/6487] [loss: 3.720819] ETA: 8:59:20.53
[Epoch 0/60] [Batch 77/6487] [loss: 1.971423] ETA: 9:34:30.98
[Epoch 0/60] [Batch 78/6487] [loss: 2.130239] ETA: 9:36:38.93
[Epoch 0/60] [Batch 79/6487] [loss: 1.784470] ETA: 9:38:30.45
[Epoch 0/60] [Batch 80/6487] [loss: 1.534155] ETA: 9:35:42.25
[Epoch 0/60] [Batch 81/6487] [loss: 1.588059] ETA: 9:45:49.02
[Epoch 0/60] [Batch 82/6487] [loss: 1.342675] ETA: 9:40:06.77
[Epoch 0/60] [Batch 83/6487] [loss: 1.641068] ETA: 9:36:54.44
[Epoch 0/60] [Batch 84/6487] [loss: 1.234618] ETA: 10:04:31.4
[Epoch 0/60] [Batch 85/6487] [loss: 1.534733] ETA: 9:36:36.08
[Epoch 0/60] [Batch 86/6487] [loss: 1.438340] ETA: 9:30:23.77
[Epoch 0/60] [Batch 87/6487] [loss: 1.263020] ETA: 9:43:36.64
[Epoch 0/60] [Batch 88/6487] [loss: 1.189186] ETA: 9:27:47.64
[Epoch 0/60] [Batch 89/6487] [loss: 1.248143] ETA: 9:32:12.15
[Epoch 0/60] [Batch 90/6487] [loss: 1.272425] ETA: 9:45:12.58
[Epoch 0/60] [Batch 91/6487] [loss: 1.629742] ETA: 9:35:25.50
[Epoch 0/60] [Batch 92/6487] [loss: 1.344467] ETA: 9:28:24.12
[Epoch 0/60] [Batch 93/6487] [loss: 1.508719] ETA: 9:53:04.26
[Epoch 0/60] [Batch 94/6487] [loss: 2.321456] ETA: 9:36:06.98
[Epoch 0/60] [Batch 95/6487] [loss: 1.912823] ETA: 9:41:35.22
[Epoch 0/60] [Batch 96/6487] [loss: 2.139884] ETA: 9:35:29.42
[Epoch 0/60] [Batch 97/6487] [loss: 0.978228] ETA: 9:33:52.19
[Epoch 0/60] [Batch 98/6487] [loss: 1.276696] ETA: 9:34:23.93
[Epoch 0/60] [Batch 99/6487] [loss: 1.071877] ETA: 9:36:02.46
[Epoch 0/60] [Batch 100/6487] [loss: 1.387129] ETA: 9:43:49.30
[Epoch 0/60] [Batch 101/6487] [loss: 0.843462] ETA: 9:35:28.42
[Epoch 0/60] [Batch 102/6487] [loss: 2.240291] ETA: 9:27:49.29
[Epoch 0/60] [Batch 103/6487] [loss: 1.120261] ETA: 9:44:39.87
[Epoch 0/60] [Batch 104/6487] [loss: 1.016900] ETA: 9:41:50.84
[Epoch 0/60] [Batch 105/6487] [loss: 1.226098] ETA: 9:44:45.62
[Epoch 0/60] [Batch 106/6487] [loss: 0.977422] ETA: 9:58:58.94
[Epoch 0/60] [Batch 107/6487] [loss: 1.398026] ETA: 9:25:55.85
[Epoch 0/60] [Batch 108/6487] [loss: 1.806949] ETA: 9:27:47.19
[Epoch 0/60] [Batch 109/6487] [loss: 1.668649] ETA: 9:32:05.56
[Epoch 0/60] [Batch 110/6487] [loss: 1.131265] ETA: 9:45:49.46
[Epoch 0/60] [Batch 111/6487] [loss: 1.278082] ETA: 9:30:43.84
[Epoch 0/60] [Batch 112/6487] [loss: 1.386241] ETA: 9:31:44.14
[Epoch 0/60] [Batch 113/6487] [loss: 1.487655] ETA: 9:36:19.95
[Epoch 0/60] [Batch 114/6487] [loss: 2.873362] ETA: 9:33:28.52
[Epoch 0/60] [Batch 115/6487] [loss: 1.397036] ETA: 9:47:07.40
[Epoch 0/60] [Batch 116/6487] [loss: 1.702965] ETA: 9:27:01.12
[Epoch 0/60] [Batch 117/6487] [loss: 1.580123] ETA: 9:45:47.81
[Epoch 0/60] [Batch 118/6487] [loss: 1.535376] ETA: 9:57:00.95
[Epoch 0/60] [Batch 119/6487] [loss: 1.180449] ETA: 9:28:43.74
[Epoch 0/60] [Batch 120/6487] [loss: 1.315663] ETA: 9:25:18.08
[Epoch 0/60] [Batch 121/6487] [loss: 1.181417] ETA: 9:32:45.60
[Epoch 0/60] [Batch 122/6487] [loss: 1.230970] ETA: 9:26:48.91
[Epoch 0/60] [Batch 123/6487] [loss: 1.158526] ETA: 9:23:05.43
[Epoch 0/60] [Batch 124/6487] [loss: 1.047434] ETA: 9:39:59.48
[Epoch 0/60] [Batch 125/6487] [loss: 1.000449] ETA: 9:52:41.11
[Epoch 0/60] [Batch 126/6487] [loss: 0.979729] ETA: 9:13:14.62
[Epoch 0/60] [Batch 127/6487] [loss: 0.960404] ETA: 9:37:42.11
[Epoch 0/60] [Batch 128/6487] [loss: 1.194977] ETA: 9:51:09.83
[Epoch 0/60] [Batch 129/6487] [loss: 1.269805] ETA: 9:38:29.89
[Epoch 0/60] [Batch 130/6487] [loss: 1.356613] ETA: 9:33:26.08
[Epoch 0/60] [Batch 131/6487] [loss: 1.014038] ETA: 9:32:13.45
[Epoch 0/60] [Batch 132/6487] [loss: 1.064105] ETA: 9:35:38.84
[Epoch 0/60] [Batch 133/6487] [loss: 0.845038] ETA: 9:33:11.25
[Epoch 0/60] [Batch 134/6487] [loss: 1.129201] ETA: 9:37:27.10
[Epoch 0/60] [Batch 135/6487] [loss: 1.482561] ETA: 9:35:32.08
[Epoch 0/60] [Batch 136/6487] [loss: 1.449351] ETA: 9:32:26.83
[Epoch 0/60] [Batch 137/6487] [loss: 1.237889] ETA: 9:31:36.74
[Epoch 0/60] [Batch 138/6487] [loss: 1.296223] ETA: 9:31:15.32
[Epoch 0/60] [Batch 139/6487] [loss: 2.374418] ETA: 9:33:03.77
[Epoch 0/60] [Batch 140/6487] [loss: 1.336745] ETA: 9:53:23.80
[Epoch 0/60] [Batch 141/6487] [loss: 1.438642] ETA: 9:45:41.75
[Epoch 0/60] [Batch 142/6487] [loss: 1.407925] ETA: 9:49:21.32
[Epoch 0/60] [Batch 143/6487] [loss: 1.250783] ETA: 9:38:01.74
[Epoch 0/60] [Batch 144/6487] [loss: 1.145167] ETA: 9:26:50.60
[Epoch 0/60] [Batch 145/6487] [loss: 1.113495] ETA: 9:30:50.03
[Epoch 0/60] [Batch 146/6487] [loss: 1.338002] ETA: 9:34:35.72
[Epoch 0/60] [Batch 147/6487] [loss: 1.714365] ETA: 9:33:09.74
[Epoch 0/60] [Batch 148/6487] [loss: 1.792521] ETA: 9:35:12.74
[Epoch 0/60] [Batch 149/6487] [loss: 1.617640] ETA: 9:28:31.65
[Epoch 0/60] [Batch 150/6487] [loss: 1.396900] ETA: 9:35:55.14
[Epoch 0/60] [Batch 151/6487] [loss: 1.346544] ETA: 9:30:48.85
[Epoch 0/60] [Batch 152/6487] [loss: 0.957624] ETA: 9:29:31.40
[Epoch 0/60] [Batch 153/6487] [loss: 1.148031] ETA: 9:33:38.89
[Epoch 0/60] [Batch 154/6487] [loss: 3.903023] ETA: 9:38:24.88
[Epoch 0/60] [Batch 155/6487] [loss: 1.112403] ETA: 9:38:39.07
[Epoch 0/60] [Batch 156/6487] [loss: 2.077783] ETA: 9:31:41.75
[Epoch 0/60] [Batch 157/6487] [loss: 2.053895] ETA: 9:33:39.46
[Epoch 0/60] [Batch 158/6487] [loss: 2.626952] ETA: 9:35:02.02
[Epoch 0/60] [Batch 159/6487] [loss: 0.956100] ETA: 9:34:55.63
[Epoch 0/60] [Batch 160/6487] [loss: 2.261317] ETA: 9:33:13.60
[Epoch 0/60] [Batch 161/6487] [loss: 0.918781] ETA: 9:38:06.07
[Epoch 0/60] [Batch 162/6487] [loss: 1.693949] ETA: 9:33:19.45
[Epoch 0/60] [Batch 163/6487] [loss: 0.839539] ETA: 9:30:22.01
[Epoch 0/60] [Batch 164/6487] [loss: 1.108890] ETA: 9:34:10.29
[Epoch 0/60] [Batch 165/6487] [loss: 1.471871] ETA: 9:50:27.87
[Epoch 0/60] [Batch 166/6487] [loss: 1.319818] ETA: 9:44:21.20
[Epoch 0/60] [Batch 167/6487] [loss: 1.503072] ETA: 9:49:04.39
[Epoch 0/60] [Batch 168/6487] [loss: 0.896298] ETA: 9:42:26.93
[Epoch 0/60] [Batch 169/6487] [loss: 1.548724] ETA: 10:07:42.4
[Epoch 0/60] [Batch 170/6487] [loss: 1.222259] ETA: 9:56:36.77
[Epoch 0/60] [Batch 171/6487] [loss: 1.127339] ETA: 10:07:42.4
[Epoch 0/60] [Batch 172/6487] [loss: 1.107970] ETA: 9:48:36.30
[Epoch 0/60] [Batch 173/6487] [loss: 1.561673] ETA: 10:42:36.6
[Epoch 0/60] [Batch 174/6487] [loss: 1.351226] ETA: 11:16:50.0
[Epoch 0/60] [Batch 175/6487] [loss: 1.491185] ETA: 11:35:26.6
[Epoch 0/60] [Batch 176/6487] [loss: 1.395391] ETA: 12:48:57.2
[Epoch 0/60] [Batch 177/6487] [loss: 0.986128] ETA: 11:22:11.9
[Epoch 0/60] [Batch 178/6487] [loss: 1.064102] ETA: 12:02:39.7
[Epoch 0/60] [Batch 179/6487] [loss: 1.170317] ETA: 11:47:03.5
[Epoch 0/60] [Batch 180/6487] [loss: 0.842737] ETA: 9:50:33.00
[Epoch 0/60] [Batch 181/6487] [loss: 1.032115] ETA: 9:54:11.53
[Epoch 0/60] [Batch 182/6487] [loss: 0.780621] ETA: 9:47:51.42
[Epoch 0/60] [Batch 183/6487] [loss: 0.836482] ETA: 9:54:14.68
[Epoch 0/60] [Batch 184/6487] [loss: 0.951862] ETA: 8:54:12.61
[Epoch 0/60] [Batch 185/6487] [loss: 1.273444] ETA: 9:53:33.60
[Epoch 0/60] [Batch 186/6487] [loss: 1.206780] ETA: 9:51:58.99
[Epoch 0/60] [Batch 187/6487] [loss: 1.061020] ETA: 12:26:16.1
[Epoch 0/60] [Batch 188/6487] [loss: 0.924903] ETA: 16:50:23.1
[Epoch 0/60] [Batch 189/6487] [loss: 1.036489] ETA: 9:24:24.66
[Epoch 0/60] [Batch 190/6487] [loss: 0.976098] ETA: 12:40:30.3
[Epoch 0/60] [Batch 191/6487] [loss: 0.987703] ETA: 10:06:22.7
[Epoch 0/60] [Batch 192/6487] [loss: 3.107231] ETA: 10:55:38.0
[Epoch 0/60] [Batch 193/6487] [loss: 0.772683] ETA: 10:41:51.3
[Epoch 0/60] [Batch 194/6487] [loss: 0.946480] ETA: 10:39:10.2
[Epoch 0/60] [Batch 195/6487] [loss: 0.976230] ETA: 16:51:01.0
[Epoch 0/60] [Batch 196/6487] [loss: 0.992235] ETA: 10:18:47.9
[Epoch 0/60] [Batch 197/6487] [loss: 0.936621] ETA: 15:57:35.8
[Epoch 0/60] [Batch 198/6487] [loss: 0.978449] ETA: 10:11:02.2
[Epoch 0/60] [Batch 199/6487] [loss: 1.187767] ETA: 9:32:48.45
[Epoch 0/60] [Batch 200/6487] [loss: 1.089627] ETA: 9:59:25.32
[Epoch 0/60] [Batch 201/6487] [loss: 0.997098] ETA: 9:57:26.88
[Epoch 0/60] [Batch 202/6487] [loss: 1.034188] ETA: 9:54:09.05
[Epoch 0/60] [Batch 203/6487] [loss: 1.012351] ETA: 9:38:44.99
[Epoch 0/60] [Batch 204/6487] [loss: 0.706249] ETA: 9:29:51.50
[Epoch 0/60] [Batch 205/6487] [loss: 0.639864] ETA: 9:34:36.34
[Epoch 0/60] [Batch 206/6487] [loss: 0.763759] ETA: 9:26:21.72
[Epoch 0/60] [Batch 207/6487] [loss: 1.835739] ETA: 9:28:11.91
[Epoch 0/60] [Batch 208/6487] [loss: 1.017551] ETA: 9:30:26.21
[Epoch 0/60] [Batch 209/6487] [loss: 0.759422] ETA: 9:42:10.82
[Epoch 0/60] [Batch 210/6487] [loss: 0.868113] ETA: 9:13:41.86
[Epoch 0/60] [Batch 211/6487] [loss: 0.824947] ETA: 9:39:29.17
[Epoch 0/60] [Batch 212/6487] [loss: 0.768332] ETA: 9:24:17.38
[Epoch 0/60] [Batch 213/6487] [loss: 1.508149] ETA: 9:45:10.94
[Epoch 0/60] [Batch 214/6487] [loss: 1.059426] ETA: 10:51:35.7
[Epoch 0/60] [Batch 215/6487] [loss: 0.997546] ETA: 9:08:52.90
[Epoch 0/60] [Batch 216/6487] [loss: 0.894794] ETA: 9:39:58.95
[Epoch 0/60] [Batch 217/6487] [loss: 1.146052] ETA: 10:39:22.8
[Epoch 0/60] [Batch 218/6487] [loss: 1.169286] ETA: 8:53:31.31
[Epoch 0/60] [Batch 219/6487] [loss: 1.430303] ETA: 10:05:00.9
[Epoch 0/60] [Batch 220/6487] [loss: 1.306383] ETA: 9:59:43.78
[Epoch 0/60] [Batch 221/6487] [loss: 0.734779] ETA: 9:19:41.51
[Epoch 0/60] [Batch 222/6487] [loss: 0.823837] ETA: 9:28:08.55
[Epoch 0/60] [Batch 223/6487] [loss: 1.148168] ETA: 9:29:32.12
[Epoch 0/60] [Batch 224/6487] [loss: 1.276476] ETA: 9:33:11.37
[Epoch 0/60] [Batch 225/6487] [loss: 0.850903] ETA: 9:48:17.57
[Epoch 0/60] [Batch 226/6487] [loss: 0.838463] ETA: 9:59:23.29
[Epoch 0/60] [Batch 227/6487] [loss: 0.793223] ETA: 10:30:13.0
[Epoch 0/60] [Batch 228/6487] [loss: 1.316302] ETA: 9:01:30.81
[Epoch 0/60] [Batch 229/6487] [loss: 0.875635] ETA: 9:43:40.47
[Epoch 0/60] [Batch 230/6487] [loss: 1.370393] ETA: 9:56:30.88
[Epoch 0/60] [Batch 231/6487] [loss: 0.916656] ETA: 10:09:28.3
[Epoch 0/60] [Batch 232/6487] [loss: 1.030352] ETA: 9:32:16.60
[Epoch 0/60] [Batch 233/6487] [loss: 0.995410] ETA: 9:39:07.91
[Epoch 0/60] [Batch 234/6487] [loss: 0.744982] ETA: 10:09:24.7
[Epoch 0/60] [Batch 235/6487] [loss: 1.081333] ETA: 10:04:24.9
[Epoch 0/60] [Batch 236/6487] [loss: 1.095582] ETA: 9:36:30.82
[Epoch 0/60] [Batch 237/6487] [loss: 1.291879] ETA: 10:33:27.3
[Epoch 0/60] [Batch 238/6487] [loss: 0.764991] ETA: 10:27:58.5
[Epoch 0/60] [Batch 239/6487] [loss: 0.885778] ETA: 9:39:38.44
[Epoch 0/60] [Batch 240/6487] [loss: 0.754958] ETA: 9:58:45.27
[Epoch 0/60] [Batch 241/6487] [loss: 0.648712] ETA: 10:28:16.6
[Epoch 0/60] [Batch 242/6487] [loss: 0.767040] ETA: 9:52:28.19
[Epoch 0/60] [Batch 243/6487] [loss: 0.746988] ETA: 10:09:11.4
[Epoch 0/60] [Batch 244/6487] [loss: 0.757815] ETA: 10:22:31.3
[Epoch 0/60] [Batch 245/6487] [loss: 0.847708] ETA: 9:03:49.05
[Epoch 0/60] [Batch 246/6487] [loss: 0.835909] ETA: 8:55:07.31
[Epoch 0/60] [Batch 247/6487] [loss: 1.204782] ETA: 10:15:00.7
[Epoch 0/60] [Batch 248/6487] [loss: 0.697187] ETA: 9:54:13.46
[Epoch 0/60] [Batch 249/6487] [loss: 0.864193] ETA: 9:48:31.16
[Epoch 0/60] [Batch 250/6487] [loss: 0.679215] ETA: 9:55:04.00
[Epoch 0/60] [Batch 251/6487] [loss: 2.023219] ETA: 9:48:05.11
[Epoch 0/60] [Batch 252/6487] [loss: 0.958474] ETA: 11:27:19.2
[Epoch 0/60] [Batch 253/6487] [loss: 0.979305] ETA: 10:50:49.3
[Epoch 0/60] [Batch 254/6487] [loss: 0.834743] ETA: 9:32:21.42
[Epoch 0/60] [Batch 255/6487] [loss: 0.943245] ETA: 9:03:17.24
[Epoch 0/60] [Batch 256/6487] [loss: 0.638397] ETA: 9:39:46.38
[Epoch 0/60] [Batch 257/6487] [loss: 0.552354] ETA: 9:32:30.34
[Epoch 0/60] [Batch 258/6487] [loss: 0.872985] ETA: 9:36:28.30
[Epoch 0/60] [Batch 259/6487] [loss: 0.787122] ETA: 9:44:40.36
Traceback (most recent call last):
  File "/home/star/whaiDir/PFCFuse/train.py", line 151, in <module>
    feature_V_B, feature_V_D, _ = DIDF_Encoder(data_VIS)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 184, in forward
    return self.module(*inputs[0], **module_kwargs[0])
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 455, in forward
    out_enc_level1 = self.encoder_level1(inp_enc_level1)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/container.py", line 219, in forward
    input = module(input)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 412, in forward
    x = x + self.ffn(self.norm2(x))
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 331, in forward
    return to_4d(self.body(to_3d(x)), h, w)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/star/anaconda3/envs/pfcfuse/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/star/whaiDir/PFCFuse/net.py", line 317, in forward
    sigma = x.var(-1, keepdim=True, unbiased=False)
KeyboardInterrupt
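The run ends with a manual interrupt (Ctrl+C raising `KeyboardInterrupt`), not a crash: the exception fires mid-computation inside a LayerNorm variance call. The `to_3d`/`to_4d` frames in the traceback match the usual Restormer pattern of flattening spatial dimensions before normalizing over channels. A sketch of that pattern, assuming standard Restormer-style code (net.py itself is not shown in this log):

```python
import torch

def to_3d(x):
    # (B, C, H, W) -> (B, H*W, C): tokens over spatial positions
    b, c, h, w = x.shape
    return x.permute(0, 2, 3, 1).reshape(b, h * w, c)

def to_4d(x, h, w):
    # (B, H*W, C) -> (B, C, H, W): restore the spatial layout
    b, hw, c = x.shape
    return x.reshape(b, h, w, c).permute(0, 3, 1, 2)

def bias_free_layernorm(x, weight, eps=1e-5):
    # matches the interrupted line: variance over the channel dim,
    # no mean subtraction and no bias (Restormer's "bias-free" variant)
    sigma = x.var(-1, keepdim=True, unbiased=False)
    return x / torch.sqrt(sigma + eps) * weight
```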