
TL;DR
- I read this because.. : the trick used in #57
- task : long-tail image classification
- problem : in real-world data, classes are often imbalanced
- idea : adjust the logits based on label frequency
- architecture : ResNet-32, ResNet-50
- objective : add $\tau \cdot \log(\text{class frequency})$ to the values that go into the softmax exponential (i.e., to the logits).
- baseline : ERM, weight normalisation, Adaptive, Equalized
- data : CIFAR-10-LT, CIFAR-100-LT, ImageNet-LT, iNaturalist2018
- evaluation : balanced error (average of per-class errors)
- result : outperform baselines
- limitation / things I cannot understand : I read the paper without working through the equations and the underlying logic
Details
Post-hoc logit adjustment

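A minimal NumPy sketch of the test-time rule as I understand it: subtract $\tau \cdot \log \pi_y$ (the log class prior) from each logit before taking the argmax, which boosts rare classes. Function and variable names here are mine, not from the paper's code.

```python
import numpy as np

def posthoc_adjust(logits, class_counts, tau=1.0):
    """Post-hoc logit adjustment: predict argmax_y f_y(x) - tau * log(pi_y),
    where pi_y is the empirical class prior estimated from training counts."""
    priors = np.asarray(class_counts, dtype=float)
    priors = priors / priors.sum()               # normalise counts to a prior
    return logits - tau * np.log(priors)         # rare classes get a boost
```

For example, with class counts [90, 10] and raw logits [2.0, 1.9], the unadjusted argmax picks the head class, but after adjustment the tail class wins because its prior penalty is much smaller.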
Logit adjusted softmax cross-entropy

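A rough sketch of the training-time variant, assuming the standard formulation (add $\tau \cdot \log \pi_y$ to the logits inside the softmax, then apply ordinary cross-entropy); names are mine:

```python
import numpy as np

def logit_adjusted_ce(logits, labels, class_counts, tau=1.0):
    """Logit-adjusted softmax cross-entropy: shift logits by tau * log(pi_y)
    before the softmax, then compute standard cross-entropy."""
    priors = np.asarray(class_counts, dtype=float)
    priors = priors / priors.sum()
    adj = logits + tau * np.log(priors)              # broadcast over the batch
    adj = adj - adj.max(axis=1, keepdims=True)       # numerical stability
    log_probs = adj - np.log(np.exp(adj).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

With $\tau = 0$ this reduces to plain cross-entropy; with $\tau > 0$, tail-class examples incur a larger loss (their logit is shifted down by the small prior), which encourages larger margins for rare classes.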
Result
