glpn-nyu-finetuned-diode-230131-041708

This model is a fine-tuned version of vinvino02/glpn-nyu on the diode-subset dataset.
It achieves the following results on the evaluation set (an inference sketch follows the metrics):

  • Loss: 0.4425
  • Mae: 0.4270
  • Rmse: 0.6196
  • Abs Rel: 0.4543
  • Log Mae: 0.1732
  • Log Rmse: 0.2288
  • Delta1: 0.3787
  • Delta2: 0.6298
  • Delta3: 0.8083
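
The card itself ships no usage code. As a minimal sketch, assuming this checkpoint loads through the standard Transformers depth-estimation pipeline (as GLPN checkpoints generally do):

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned checkpoint via the depth-estimation pipeline.
depth_estimator = pipeline(
    "depth-estimation",
    model="sayakpaul/glpn-nyu-finetuned-diode-230131-041708",
)

image = Image.open("scene.jpg")  # hypothetical input image
result = depth_estimator(image)

# result["predicted_depth"] holds the raw depth tensor; result["depth"] is
# a PIL image of the predicted depth map, rescaled for viewing.
result["depth"].save("depth_map.png")
```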


Model description

More information needed


Intended uses & limitations

More information needed


Training and evaluation data

More information needed


Training procedure


Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 24
  • eval_batch_size: 48
  • seed: 2022
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
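
The training script is not published with the card. A hypothetical reconstruction of these settings as Transformers TrainingArguments (values mirror the list above; the output_dir name is an assumption, and everything unlisted stays at its default):

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above;
# not the author's actual training script.
training_args = TrainingArguments(
    output_dir="glpn-nyu-finetuned-diode-230131-041708",  # assumed name
    learning_rate=3e-4,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=48,
    seed=2022,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```

The Adam settings (betas=(0.9, 0.999), epsilon=1e-08) match the Trainer's default optimizer, so they need no explicit arguments.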


Training results

| Training Loss | Epoch | Step | Validation Loss | Mae | Rmse | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
|:-------------:|:-----:|:----:|:---------------:|:---:|:----:|:-------:|:-------:|:--------:|:------:|:------:|:------:|
| 0.5276 | 1.0 | 72 | 0.4701 | 0.4590 | 0.6348 | 0.4983 | 0.1903 | 0.2393 | 0.3169 | 0.5544 | 0.7661 |
| 0.4595 | 2.0 | 144 | 0.4867 | 0.4690 | 0.6369 | 0.5588 | 0.1956 | 0.2483 | 0.3090 | 0.5269 | 0.7532 |
| 0.4802 | 3.0 | 216 | 0.4854 | 0.4648 | 0.6344 | 0.5581 | 0.1935 | 0.2475 | 0.3135 | 0.5355 | 0.7531 |
| 0.4566 | 4.0 | 288 | 0.4709 | 0.4559 | 0.6756 | 0.4223 | 0.1890 | 0.2516 | 0.3668 | 0.6329 | 0.7696 |
| 0.4916 | 5.0 | 360 | 0.4835 | 0.4555 | 0.6343 | 0.5302 | 0.1881 | 0.2447 | 0.3435 | 0.5716 | 0.7437 |
| 0.4822 | 6.0 | 432 | 0.4756 | 0.4585 | 0.6301 | 0.5264 | 0.1894 | 0.2414 | 0.3238 | 0.5628 | 0.7435 |
| 0.4588 | 7.0 | 504 | 0.4655 | 0.4481 | 0.6509 | 0.4425 | 0.1843 | 0.2413 | 0.3498 | 0.6157 | 0.7809 |
| 0.4214 | 8.0 | 576 | 0.4869 | 0.4706 | 0.6391 | 0.5669 | 0.1961 | 0.2500 | 0.3033 | 0.5388 | 0.7371 |
| 0.426 | 9.0 | 648 | 0.4835 | 0.4679 | 0.6472 | 0.5117 | 0.1951 | 0.2486 | 0.3216 | 0.5474 | 0.7399 |
| 0.4135 | 10.0 | 720 | 0.4621 | 0.4439 | 0.6287 | 0.4803 | 0.1825 | 0.2365 | 0.3451 | 0.5887 | 0.7878 |
| 0.3778 | 11.0 | 792 | 0.4756 | 0.4566 | 0.6337 | 0.5174 | 0.1892 | 0.2431 | 0.3297 | 0.5560 | 0.7690 |
| 0.426 | 12.0 | 864 | 0.4542 | 0.4362 | 0.6219 | 0.4621 | 0.1779 | 0.2303 | 0.3572 | 0.6083 | 0.7835 |
| 0.4282 | 13.0 | 936 | 0.4514 | 0.4306 | 0.6195 | 0.4678 | 0.1754 | 0.2307 | 0.3661 | 0.6228 | 0.8083 |
| 0.4045 | 14.0 | 1008 | 0.4575 | 0.4390 | 0.6315 | 0.4530 | 0.1794 | 0.2343 | 0.3641 | 0.6128 | 0.7787 |
| 0.4351 | 15.0 | 1080 | 0.4669 | 0.4373 | 0.6423 | 0.4322 | 0.1796 | 0.2423 | 0.3917 | 0.6233 | 0.7850 |
| 0.4001 | 16.0 | 1152 | 0.4540 | 0.4356 | 0.6331 | 0.4336 | 0.1767 | 0.2320 | 0.3919 | 0.6132 | 0.7732 |
| 0.3741 | 17.0 | 1224 | 0.4890 | 0.4645 | 0.6361 | 0.5707 | 0.1926 | 0.2494 | 0.3253 | 0.5469 | 0.7386 |
| 0.4128 | 18.0 | 1296 | 0.4815 | 0.4593 | 0.6328 | 0.5511 | 0.1899 | 0.2457 | 0.3302 | 0.5571 | 0.7471 |
| 0.3809 | 19.0 | 1368 | 0.5002 | 0.4768 | 0.6425 | 0.6061 | 0.1991 | 0.2560 | 0.3105 | 0.5222 | 0.7118 |
| 0.4089 | 20.0 | 1440 | 0.4503 | 0.4311 | 0.6449 | 0.4081 | 0.1752 | 0.2370 | 0.4147 | 0.6445 | 0.7823 |
| 0.3612 | 21.0 | 1512 | 0.4541 | 0.4280 | 0.6215 | 0.4543 | 0.1735 | 0.2302 | 0.3823 | 0.6291 | 0.7968 |
| 0.3664 | 22.0 | 1584 | 0.4425 | 0.4251 | 0.6347 | 0.3970 | 0.1717 | 0.2300 | 0.4181 | 0.6374 | 0.7860 |
| 0.3787 | 23.0 | 1656 | 0.4722 | 0.4477 | 0.6378 | 0.4868 | 0.1846 | 0.2432 | 0.3541 | 0.6041 | 0.7733 |
| 0.4184 | 24.0 | 1728 | 0.4749 | 0.4506 | 0.6303 | 0.5329 | 0.1857 | 0.2434 | 0.3465 | 0.5752 | 0.7698 |
| 0.3928 | 25.0 | 1800 | 0.4646 | 0.4485 | 0.6395 | 0.4744 | 0.1847 | 0.2407 | 0.3528 | 0.5946 | 0.7816 |
| 0.3704 | 26.0 | 1872 | 0.4492 | 0.4340 | 0.6331 | 0.4344 | 0.1765 | 0.2326 | 0.3778 | 0.6314 | 0.7916 |
| 0.3462 | 27.0 | 1944 | 0.4467 | 0.4307 | 0.6314 | 0.4296 | 0.1751 | 0.2317 | 0.3840 | 0.6359 | 0.7983 |
| 0.3808 | 28.0 | 2016 | 0.4758 | 0.4622 | 0.6331 | 0.5236 | 0.1913 | 0.2425 | 0.3230 | 0.5439 | 0.7438 |
| 0.3641 | 29.0 | 2088 | 0.4609 | 0.4452 | 0.6315 | 0.4545 | 0.1824 | 0.2339 | 0.3484 | 0.5934 | 0.7716 |
| 0.3602 | 30.0 | 2160 | 0.4546 | 0.4413 | 0.6230 | 0.4729 | 0.1804 | 0.2318 | 0.3515 | 0.5944 | 0.7778 |
| 0.3638 | 31.0 | 2232 | 0.4498 | 0.4340 | 0.6245 | 0.4449 | 0.1764 | 0.2296 | 0.3725 | 0.6079 | 0.7923 |
| 0.3699 | 32.0 | 2304 | 0.4472 | 0.4305 | 0.6228 | 0.4568 | 0.1750 | 0.2307 | 0.3757 | 0.6239 | 0.8000 |
| 0.3805 | 33.0 | 2376 | 0.4647 | 0.4439 | 0.6325 | 0.4875 | 0.1823 | 0.2392 | 0.3609 | 0.5921 | 0.7833 |
| 0.3454 | 34.0 | 2448 | 0.4640 | 0.4442 | 0.6276 | 0.5008 | 0.1820 | 0.2376 | 0.3573 | 0.5865 | 0.7866 |
| 0.3452 | 35.0 | 2520 | 0.4646 | 0.4454 | 0.6276 | 0.4966 | 0.1827 | 0.2374 | 0.3489 | 0.5913 | 0.7726 |
| 0.3509 | 36.0 | 2592 | 0.4522 | 0.4394 | 0.6259 | 0.4605 | 0.1799 | 0.2321 | 0.3534 | 0.6001 | 0.7944 |
| 0.3432 | 37.0 | 2664 | 0.4656 | 0.4484 | 0.6290 | 0.5067 | 0.1841 | 0.2390 | 0.3487 | 0.5802 | 0.7687 |
| 0.381 | 38.0 | 2736 | 0.4630 | 0.4405 | 0.6287 | 0.4970 | 0.1807 | 0.2387 | 0.3565 | 0.6067 | 0.7907 |
| 0.3591 | 39.0 | 2808 | 0.4637 | 0.4452 | 0.6269 | 0.4995 | 0.1825 | 0.2374 | 0.3487 | 0.5966 | 0.7654 |
| 0.3826 | 40.0 | 2880 | 0.4723 | 0.4527 | 0.6307 | 0.5279 | 0.1867 | 0.2421 | 0.3338 | 0.5745 | 0.7713 |
| 0.3585 | 41.0 | 2952 | 0.4485 | 0.4306 | 0.6238 | 0.4470 | 0.1749 | 0.2297 | 0.3736 | 0.6251 | 0.7995 |
| 0.3518 | 42.0 | 3024 | 0.4369 | 0.4229 | 0.6293 | 0.4111 | 0.1701 | 0.2277 | 0.4004 | 0.6563 | 0.8009 |
| 0.359 | 43.0 | 3096 | 0.4545 | 0.4348 | 0.6274 | 0.4607 | 0.1777 | 0.2338 | 0.3592 | 0.6237 | 0.8000 |
| 0.3274 | 44.0 | 3168 | 0.4595 | 0.4359 | 0.6278 | 0.4781 | 0.1779 | 0.2357 | 0.3729 | 0.6093 | 0.7980 |
| 0.3368 | 45.0 | 3240 | 0.4617 | 0.4434 | 0.6253 | 0.5001 | 0.1819 | 0.2368 | 0.3400 | 0.5966 | 0.7953 |
| 0.3638 | 46.0 | 3312 | 0.4634 | 0.4380 | 0.6264 | 0.4925 | 0.1794 | 0.2371 | 0.3576 | 0.6158 | 0.7907 |
| 0.3698 | 47.0 | 3384 | 0.4559 | 0.4343 | 0.6223 | 0.4890 | 0.1776 | 0.2346 | 0.3579 | 0.6103 | 0.8110 |
| 0.3392 | 48.0 | 3456 | 0.4646 | 0.4477 | 0.6267 | 0.5029 | 0.1837 | 0.2374 | 0.3451 | 0.5798 | 0.7665 |
| 0.3548 | 49.0 | 3528 | 0.4598 | 0.4394 | 0.6245 | 0.4885 | 0.1793 | 0.2351 | 0.3647 | 0.6016 | 0.7815 |
| 0.3375 | 50.0 | 3600 | 0.4441 | 0.4271 | 0.6226 | 0.4487 | 0.1729 | 0.2293 | 0.3808 | 0.6354 | 0.8075 |
| 0.3315 | 51.0 | 3672 | 0.4613 | 0.4403 | 0.6292 | 0.4868 | 0.1805 | 0.2373 | 0.3630 | 0.6016 | 0.7905 |
| 0.3313 | 52.0 | 3744 | 0.4445 | 0.4307 | 0.6442 | 0.4108 | 0.1746 | 0.2342 | 0.3942 | 0.6577 | 0.7932 |
| 0.3372 | 53.0 | 3816 | 0.4456 | 0.4258 | 0.6269 | 0.4404 | 0.1720 | 0.2308 | 0.3924 | 0.6489 | 0.8027 |
| 0.3285 | 54.0 | 3888 | 0.4526 | 0.4348 | 0.6241 | 0.4723 | 0.1772 | 0.2328 | 0.3615 | 0.6160 | 0.8027 |
| 0.3474 | 55.0 | 3960 | 0.4498 | 0.4369 | 0.6258 | 0.4595 | 0.1782 | 0.2315 | 0.3617 | 0.6070 | 0.7978 |
| 0.3349 | 56.0 | 4032 | 0.4613 | 0.4428 | 0.6307 | 0.4858 | 0.1819 | 0.2376 | 0.3523 | 0.6012 | 0.7875 |
| 0.3207 | 57.0 | 4104 | 0.4476 | 0.4342 | 0.6230 | 0.4500 | 0.1765 | 0.2289 | 0.3658 | 0.6151 | 0.7910 |
| 0.3399 | 58.0 | 4176 | 0.4600 | 0.4413 | 0.6248 | 0.4940 | 0.1812 | 0.2360 | 0.3531 | 0.5954 | 0.7814 |
| 0.3327 | 59.0 | 4248 | 0.4463 | 0.4339 | 0.6215 | 0.4570 | 0.1770 | 0.2294 | 0.3590 | 0.6069 | 0.8063 |
| 0.3215 | 60.0 | 4320 | 0.4482 | 0.4317 | 0.6203 | 0.4595 | 0.1756 | 0.2295 | 0.3698 | 0.6154 | 0.8034 |
| 0.3276 | 61.0 | 4392 | 0.4406 | 0.4218 | 0.6192 | 0.4370 | 0.1705 | 0.2268 | 0.3878 | 0.6425 | 0.8111 |
| 0.3179 | 62.0 | 4464 | 0.4530 | 0.4331 | 0.6217 | 0.4765 | 0.1764 | 0.2327 | 0.3660 | 0.6121 | 0.8068 |
| 0.3129 | 63.0 | 4536 | 0.4614 | 0.4398 | 0.6263 | 0.5002 | 0.1803 | 0.2378 | 0.3529 | 0.6023 | 0.7974 |
| 0.3354 | 64.0 | 4608 | 0.4538 | 0.4374 | 0.6234 | 0.4777 | 0.1788 | 0.2333 | 0.3565 | 0.6013 | 0.7995 |
| 0.3261 | 65.0 | 4680 | 0.4367 | 0.4258 | 0.6283 | 0.4249 | 0.1724 | 0.2291 | 0.3861 | 0.6484 | 0.8026 |
| 0.3114 | 66.0 | 4752 | 0.4565 | 0.4366 | 0.6225 | 0.4852 | 0.1780 | 0.2334 | 0.3647 | 0.6073 | 0.7939 |
| 0.3377 | 67.0 | 4824 | 0.4519 | 0.4308 | 0.6185 | 0.4771 | 0.1755 | 0.2314 | 0.3681 | 0.6175 | 0.8079 |
| 0.3266 | 68.0 | 4896 | 0.4372 | 0.4216 | 0.6167 | 0.4345 | 0.1702 | 0.2245 | 0.3850 | 0.6392 | 0.8163 |
| 0.3347 | 69.0 | 4968 | 0.4343 | 0.4193 | 0.6179 | 0.4318 | 0.1690 | 0.2252 | 0.3890 | 0.6557 | 0.8114 |
| 0.3207 | 70.0 | 5040 | 0.4426 | 0.4269 | 0.6180 | 0.4465 | 0.1728 | 0.2266 | 0.3810 | 0.6296 | 0.8038 |
| 0.3313 | 71.0 | 5112 | 0.4362 | 0.4234 | 0.6177 | 0.4360 | 0.1712 | 0.2252 | 0.3777 | 0.6386 | 0.8133 |
| 0.326 | 72.0 | 5184 | 0.4392 | 0.4251 | 0.6182 | 0.4431 | 0.1723 | 0.2265 | 0.3783 | 0.6356 | 0.8088 |
| 0.3141 | 73.0 | 5256 | 0.4532 | 0.4385 | 0.6214 | 0.4818 | 0.1796 | 0.2327 | 0.3513 | 0.5981 | 0.8044 |
| 0.3301 | 74.0 | 5328 | 0.4536 | 0.4361 | 0.6230 | 0.4808 | 0.1783 | 0.2333 | 0.3585 | 0.6097 | 0.8037 |
| 0.3194 | 75.0 | 5400 | 0.4501 | 0.4335 | 0.6216 | 0.4698 | 0.1765 | 0.2312 | 0.3623 | 0.6164 | 0.8033 |
| 0.3071 | 76.0 | 5472 | 0.4455 | 0.4310 | 0.6201 | 0.4598 | 0.1751 | 0.2292 | 0.3625 | 0.6231 | 0.8087 |
| 0.3174 | 77.0 | 5544 | 0.4472 | 0.4316 | 0.6219 | 0.4625 | 0.1756 | 0.2307 | 0.3654 | 0.6256 | 0.8022 |
| 0.3171 | 78.0 | 5616 | 0.4461 | 0.4305 | 0.6204 | 0.4614 | 0.1750 | 0.2298 | 0.3663 | 0.6263 | 0.8052 |
| 0.3244 | 79.0 | 5688 | 0.4501 | 0.4328 | 0.6226 | 0.4725 | 0.1765 | 0.2324 | 0.3611 | 0.6233 | 0.8083 |
| 0.3188 | 80.0 | 5760 | 0.4427 | 0.4280 | 0.6199 | 0.4507 | 0.1735 | 0.2281 | 0.3757 | 0.6307 | 0.8054 |
| 0.3212 | 81.0 | 5832 | 0.4383 | 0.4222 | 0.6196 | 0.4365 | 0.1702 | 0.2266 | 0.3875 | 0.6476 | 0.8093 |
| 0.3234 | 82.0 | 5904 | 0.4434 | 0.4278 | 0.6216 | 0.4479 | 0.1735 | 0.2288 | 0.3728 | 0.6337 | 0.8064 |
| 0.3024 | 83.0 | 5976 | 0.4502 | 0.4331 | 0.6214 | 0.4728 | 0.1764 | 0.2317 | 0.3645 | 0.6192 | 0.8070 |
| 0.3145 | 84.0 | 6048 | 0.4409 | 0.4258 | 0.6199 | 0.4475 | 0.1726 | 0.2280 | 0.3778 | 0.6357 | 0.8075 |
| 0.329 | 85.0 | 6120 | 0.4491 | 0.4302 | 0.6221 | 0.4710 | 0.1749 | 0.2322 | 0.3755 | 0.6246 | 0.8065 |
| 0.3034 | 86.0 | 6192 | 0.4504 | 0.4321 | 0.6241 | 0.4699 | 0.1757 | 0.2325 | 0.3767 | 0.6217 | 0.8035 |
| 0.3074 | 87.0 | 6264 | 0.4373 | 0.4224 | 0.6188 | 0.4396 | 0.1706 | 0.2267 | 0.3878 | 0.6439 | 0.8107 |
| 0.3089 | 88.0 | 6336 | 0.4379 | 0.4235 | 0.6191 | 0.4402 | 0.1709 | 0.2266 | 0.3893 | 0.6410 | 0.8089 |
| 0.2995 | 89.0 | 6408 | 0.4448 | 0.4292 | 0.6193 | 0.4597 | 0.1744 | 0.2292 | 0.3740 | 0.6225 | 0.8065 |
| 0.3248 | 90.0 | 6480 | 0.4413 | 0.4279 | 0.6227 | 0.4494 | 0.1732 | 0.2283 | 0.3766 | 0.6305 | 0.8100 |
| 0.3203 | 91.0 | 6552 | 0.4445 | 0.4290 | 0.6213 | 0.4568 | 0.1740 | 0.2295 | 0.3754 | 0.6290 | 0.8085 |
| 0.3109 | 92.0 | 6624 | 0.4452 | 0.4295 | 0.6203 | 0.4597 | 0.1744 | 0.2297 | 0.3749 | 0.6245 | 0.8035 |
| 0.3241 | 93.0 | 6696 | 0.4419 | 0.4258 | 0.6190 | 0.4533 | 0.1725 | 0.2285 | 0.3833 | 0.6313 | 0.8092 |
| 0.3078 | 94.0 | 6768 | 0.4446 | 0.4278 | 0.6201 | 0.4597 | 0.1736 | 0.2297 | 0.3778 | 0.6284 | 0.8097 |
| 0.3141 | 95.0 | 6840 | 0.4466 | 0.4305 | 0.6208 | 0.4660 | 0.1749 | 0.2306 | 0.3720 | 0.6233 | 0.8058 |
| 0.3198 | 96.0 | 6912 | 0.4440 | 0.4275 | 0.6194 | 0.4584 | 0.1736 | 0.2293 | 0.3774 | 0.6279 | 0.8088 |
| 0.3 | 97.0 | 6984 | 0.4426 | 0.4269 | 0.6192 | 0.4545 | 0.1731 | 0.2287 | 0.3770 | 0.6302 | 0.8091 |
| 0.3096 | 98.0 | 7056 | 0.4433 | 0.4274 | 0.6197 | 0.4568 | 0.1735 | 0.2292 | 0.3765 | 0.6296 | 0.8088 |
| 0.3317 | 99.0 | 7128 | 0.4394 | 0.4244 | 0.6196 | 0.4448 | 0.1716 | 0.2276 | 0.3844 | 0.6374 | 0.8096 |
| 0.3132 | 100.0 | 7200 | 0.4425 | 0.4270 | 0.6196 | 0.4543 | 0.1732 | 0.2288 | 0.3787 | 0.6298 | 0.8083 |
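
The evaluation script itself is not included with the card, but the column names follow the standard monocular depth-estimation metrics. A sketch of those definitions (masking of invalid pixels and the log base in the actual script may differ):

```python
import numpy as np

def depth_metrics(pred: np.ndarray, target: np.ndarray) -> dict:
    """Standard monocular-depth metrics over positive-depth pixels."""
    diff = pred - target
    log_diff = np.log(pred) - np.log(target)
    # Threshold accuracy (Delta_i): fraction of pixels whose prediction is
    # within a factor of 1.25**i of the ground truth.
    ratio = np.maximum(pred / target, target / pred)
    return {
        "mae": np.abs(diff).mean(),
        "rmse": np.sqrt((diff**2).mean()),
        "abs_rel": (np.abs(diff) / target).mean(),
        "log_mae": np.abs(log_diff).mean(),
        "log_rmse": np.sqrt((log_diff**2).mean()),
        "delta1": (ratio < 1.25).mean(),
        "delta2": (ratio < 1.25**2).mean(),
        "delta3": (ratio < 1.25**3).mean(),
    }
```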


Framework versions
