# glpn-nyu-finetuned-diode-221228-072509
This model is a fine-tuned version of vinvino02/glpn-nyu on the diode-subset dataset.
It achieves the following results on the evaluation set (a sketch of how these metrics are conventionally computed follows the list):
- Loss: 0.4012
- Mae: 0.4030
- Rmse: 0.6173
- Abs Rel: 0.3487
- Log Mae: 0.1574
- Log Rmse: 0.2110
- Delta1: 0.4308
- Delta2: 0.6997
- Delta3: 0.8249
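
These are the usual monocular depth-estimation metrics. The sketch below shows how they are conventionally computed; the invalid-pixel mask and the base-10 logarithm for the Log metrics are assumptions, since the exact evaluation script is not included in this card.

```python
import numpy as np

def depth_metrics(pred, target, eps=1e-6):
    """Conventional depth metrics matching the names reported above.
    The valid-pixel mask and log10 for the Log metrics are assumptions."""
    pred = np.asarray(pred, dtype=np.float64)
    target = np.asarray(target, dtype=np.float64)
    valid = target > eps                      # ignore zero / invalid depth
    pred, target = pred[valid], target[valid]

    diff = pred - target
    log_diff = np.log10(pred + eps) - np.log10(target + eps)
    ratio = np.maximum(pred / target, target / pred)

    return {
        "mae": np.abs(diff).mean(),
        "rmse": np.sqrt((diff ** 2).mean()),
        "abs_rel": (np.abs(diff) / target).mean(),
        "log_mae": np.abs(log_diff).mean(),
        "log_rmse": np.sqrt((log_diff ** 2).mean()),
        "delta1": (ratio < 1.25).mean(),
        "delta2": (ratio < 1.25 ** 2).mean(),
        "delta3": (ratio < 1.25 ** 3).mean(),
    }
```

The Delta metrics are threshold accuracies: the fraction of pixels whose predicted/ground-truth depth ratio (taken in whichever direction is larger) falls below 1.25, 1.25², and 1.25³ respectively.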
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 50
- mixed_precision_training: Native AMP
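
As a rough illustration, the listed values map onto `transformers.TrainingArguments` as follows. This is a minimal sketch assuming a single GPU and the standard `Trainer` setup; `output_dir` and anything not listed above are placeholders, and Adam's betas and epsilon are the library defaults, which match the values listed in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="glpn-nyu-finetuned-diode-221228-072509",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=24,   # assumes a single device
    per_device_eval_batch_size=48,
    seed=2022,
    lr_scheduler_type="linear",
    warmup_ratio=0.15,
    num_train_epochs=50,
    fp16=True,                        # "Native AMP" mixed-precision training
)
```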
### Training results
Training Loss | Epoch | Step | Validation Loss | Mae | Rmse | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
---|---|---|---|---|---|---|---|---|---|---|---|
1.1571 | 1.0 | 72 | 0.6604 | 0.6233 | 0.8403 | 0.5125 | 0.3119 | 0.3691 | 0.1726 | 0.3423 | 0.4877 |
0.4895 | 2.0 | 144 | 0.4506 | 0.4460 | 0.6404 | 0.4241 | 0.1812 | 0.2299 | 0.3325 | 0.6053 | 0.7943 |
0.4709 | 3.0 | 216 | 0.4414 | 0.4370 | 0.6305 | 0.4243 | 0.1764 | 0.2253 | 0.3537 | 0.6145 | 0.7988 |
0.4436 | 4.0 | 288 | 0.4335 | 0.4324 | 0.6285 | 0.4045 | 0.1746 | 0.2245 | 0.3444 | 0.6506 | 0.8096 |
0.4656 | 5.0 | 360 | 0.4552 | 0.4515 | 0.6328 | 0.4614 | 0.1838 | 0.2307 | 0.3374 | 0.5762 | 0.7722 |
0.4482 | 6.0 | 432 | 0.4234 | 0.4166 | 0.6233 | 0.3805 | 0.1654 | 0.2179 | 0.4035 | 0.6623 | 0.8130 |
0.4099 | 7.0 | 504 | 0.4176 | 0.4185 | 0.6238 | 0.3676 | 0.1662 | 0.2150 | 0.3937 | 0.6589 | 0.8153 |
0.3987 | 8.0 | 576 | 0.4515 | 0.4431 | 0.6300 | 0.4497 | 0.1792 | 0.2283 | 0.3561 | 0.5906 | 0.7781 |
0.396 | 9.0 | 648 | 0.4235 | 0.4267 | 0.6347 | 0.3591 | 0.1716 | 0.2224 | 0.3934 | 0.6310 | 0.7963 |
0.3608 | 10.0 | 720 | 0.4312 | 0.4181 | 0.6227 | 0.4022 | 0.1666 | 0.2217 | 0.4014 | 0.6586 | 0.8173 |
0.3568 | 11.0 | 792 | 0.4322 | 0.4198 | 0.6183 | 0.4047 | 0.1674 | 0.2186 | 0.3870 | 0.6420 | 0.8071 |
0.3923 | 12.0 | 864 | 0.4225 | 0.4196 | 0.6294 | 0.3630 | 0.1668 | 0.2181 | 0.3910 | 0.6537 | 0.8151 |
0.3971 | 13.0 | 936 | 0.4086 | 0.4105 | 0.6219 | 0.3541 | 0.1614 | 0.2144 | 0.4234 | 0.6820 | 0.8144 |
0.372 | 14.0 | 1008 | 0.4127 | 0.4099 | 0.6172 | 0.3668 | 0.1612 | 0.2119 | 0.4046 | 0.6727 | 0.8260 |
0.3884 | 15.0 | 1080 | 0.4060 | 0.4074 | 0.6176 | 0.3528 | 0.1598 | 0.2119 | 0.4109 | 0.6925 | 0.8225 |
0.3616 | 16.0 | 1152 | 0.4078 | 0.4092 | 0.6198 | 0.3532 | 0.1615 | 0.2139 | 0.4162 | 0.6791 | 0.8186 |
0.3504 | 17.0 | 1224 | 0.4202 | 0.4320 | 0.6408 | 0.3613 | 0.1740 | 0.2261 | 0.3769 | 0.6301 | 0.7915 |
0.3823 | 18.0 | 1296 | 0.4328 | 0.4218 | 0.6182 | 0.4198 | 0.1684 | 0.2207 | 0.3916 | 0.6371 | 0.8113 |
0.3437 | 19.0 | 1368 | 0.4133 | 0.4138 | 0.6205 | 0.3638 | 0.1636 | 0.2162 | 0.3967 | 0.6761 | 0.8188 |
0.3739 | 20.0 | 1440 | 0.4040 | 0.4070 | 0.6187 | 0.3486 | 0.1594 | 0.2124 | 0.4214 | 0.6813 | 0.8214 |
0.3397 | 21.0 | 1512 | 0.4180 | 0.4300 | 0.6360 | 0.3601 | 0.1732 | 0.2239 | 0.3708 | 0.6362 | 0.8006 |
0.332 | 22.0 | 1584 | 0.4025 | 0.4050 | 0.6182 | 0.3505 | 0.1582 | 0.2114 | 0.4274 | 0.6909 | 0.8275 |
0.3552 | 23.0 | 1656 | 0.4120 | 0.4179 | 0.6305 | 0.3569 | 0.1650 | 0.2188 | 0.4002 | 0.6753 | 0.8102 |
0.3804 | 24.0 | 1728 | 0.4093 | 0.4111 | 0.6223 | 0.3594 | 0.1620 | 0.2152 | 0.4068 | 0.6851 | 0.8166 |
0.3519 | 25.0 | 1800 | 0.4039 | 0.4122 | 0.6237 | 0.3511 | 0.1621 | 0.2137 | 0.4109 | 0.6895 | 0.8171 |
0.3276 | 26.0 | 1872 | 0.4044 | 0.4117 | 0.6183 | 0.3533 | 0.1623 | 0.2127 | 0.3979 | 0.6824 | 0.8251 |
0.3167 | 27.0 | 1944 | 0.4091 | 0.4099 | 0.6189 | 0.3600 | 0.1613 | 0.2135 | 0.4069 | 0.6898 | 0.8218 |
0.3547 | 28.0 | 2016 | 0.4051 | 0.4055 | 0.6192 | 0.3521 | 0.1586 | 0.2119 | 0.4216 | 0.6921 | 0.8256 |
0.3297 | 29.0 | 2088 | 0.4025 | 0.4091 | 0.6215 | 0.3500 | 0.1605 | 0.2126 | 0.4155 | 0.6960 | 0.8224 |
0.3305 | 30.0 | 2160 | 0.4040 | 0.4045 | 0.6171 | 0.3507 | 0.1584 | 0.2120 | 0.4281 | 0.6938 | 0.8255 |
0.34 | 31.0 | 2232 | 0.4036 | 0.4082 | 0.6194 | 0.3492 | 0.1606 | 0.2132 | 0.4196 | 0.6851 | 0.8207 |
0.3507 | 32.0 | 2304 | 0.4057 | 0.4120 | 0.6245 | 0.3482 | 0.1619 | 0.2148 | 0.4195 | 0.6777 | 0.8172 |
0.3617 | 33.0 | 2376 | 0.4036 | 0.4098 | 0.6241 | 0.3477 | 0.1606 | 0.2141 | 0.4219 | 0.6871 | 0.8186 |
0.3268 | 34.0 | 2448 | 0.4015 | 0.4060 | 0.6197 | 0.3440 | 0.1593 | 0.2122 | 0.4326 | 0.6868 | 0.8211 |
0.3188 | 35.0 | 2520 | 0.4018 | 0.4032 | 0.6154 | 0.3504 | 0.1575 | 0.2107 | 0.4306 | 0.6952 | 0.8250 |
0.3286 | 36.0 | 2592 | 0.4046 | 0.4103 | 0.6237 | 0.3507 | 0.1611 | 0.2139 | 0.4179 | 0.6883 | 0.8173 |
0.3279 | 37.0 | 2664 | 0.3995 | 0.3993 | 0.6118 | 0.3460 | 0.1558 | 0.2091 | 0.4401 | 0.6979 | 0.8272 |
0.3439 | 38.0 | 2736 | 0.4052 | 0.4063 | 0.6196 | 0.3555 | 0.1590 | 0.2117 | 0.4207 | 0.6972 | 0.8256 |
0.3188 | 39.0 | 2808 | 0.4028 | 0.4028 | 0.6176 | 0.3482 | 0.1574 | 0.2112 | 0.4351 | 0.6916 | 0.8253 |
0.3334 | 40.0 | 2880 | 0.4059 | 0.4093 | 0.6218 | 0.3534 | 0.1607 | 0.2137 | 0.4201 | 0.6885 | 0.8217 |
0.3393 | 41.0 | 2952 | 0.4043 | 0.4048 | 0.6193 | 0.3492 | 0.1584 | 0.2118 | 0.4300 | 0.6906 | 0.8246 |
0.3099 | 42.0 | 3024 | 0.4029 | 0.4041 | 0.6161 | 0.3499 | 0.1583 | 0.2118 | 0.4274 | 0.6966 | 0.8239 |
0.3339 | 43.0 | 3096 | 0.4032 | 0.4056 | 0.6213 | 0.3515 | 0.1584 | 0.2122 | 0.4257 | 0.6995 | 0.8239 |
0.3086 | 44.0 | 3168 | 0.4024 | 0.4049 | 0.6173 | 0.3509 | 0.1586 | 0.2120 | 0.4243 | 0.6994 | 0.8227 |
0.3262 | 45.0 | 3240 | 0.4007 | 0.4035 | 0.6185 | 0.3467 | 0.1575 | 0.2112 | 0.4304 | 0.6994 | 0.8246 |
0.3265 | 46.0 | 3312 | 0.4017 | 0.4033 | 0.6170 | 0.3495 | 0.1574 | 0.2110 | 0.4271 | 0.7043 | 0.8247 |
0.3324 | 47.0 | 3384 | 0.4015 | 0.4056 | 0.6192 | 0.3471 | 0.1587 | 0.2119 | 0.4281 | 0.6944 | 0.8220 |
0.3159 | 48.0 | 3456 | 0.4012 | 0.4036 | 0.6156 | 0.3487 | 0.1581 | 0.2114 | 0.4279 | 0.6982 | 0.8234 |
0.3238 | 49.0 | 3528 | 0.4017 | 0.4024 | 0.6161 | 0.3499 | 0.1571 | 0.2106 | 0.4304 | 0.7008 | 0.8255 |
0.3112 | 50.0 | 3600 | 0.4012 | 0.4030 | 0.6173 | 0.3487 | 0.1574 | 0.2110 | 0.4308 | 0.6997 | 0.8249 |
### Framework versions
- Transformers 4.24.0
- PyTorch 1.12.1+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
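
For context, a minimal inference sketch using the `depth-estimation` pipeline; the checkpoint id is taken from this card's repository, the image path is a placeholder, and any preprocessing beyond the pipeline defaults is not documented here.

```python
from PIL import Image
from transformers import pipeline

# Load the fine-tuned checkpoint through the standard depth-estimation pipeline.
depth_estimator = pipeline(
    "depth-estimation",
    model="sayakpaul/glpn-nyu-finetuned-diode-221228-072509",
)

image = Image.open("example.jpg")                  # placeholder RGB image
outputs = depth_estimator(image)
predicted_depth = outputs["predicted_depth"]       # torch.Tensor of per-pixel depth
outputs["depth"].save("depth_visualization.png")   # PIL image for quick inspection
```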