label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful
A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
Corrupted labels and label smoothing
[ICML 2022 Long Talk] Official PyTorch implementation of "To Smooth or Not? When Label Smoothing Meets Noisy Labels"
An implementation of MobileNetV3 in PyTorch
Code for our method MbLS (Margin-based Label Smoothing) for network calibration. To appear at CVPR 2022. Paper: https://arxiv.org/abs/2111.15430
A PyTorch implementation of label smoothing
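Several of the repositories above implement the same core idea. As a minimal, framework-free sketch (plain Python; the eps-uniform formulation used by `torch.nn.CrossEntropyLoss`'s `label_smoothing` argument — the function names here are illustrative, not taken from any listed repo):

```python
import math

def smooth_labels(num_classes, target, eps=0.1):
    """Label smoothing: move eps probability mass from the one-hot
    target to a uniform distribution over all classes."""
    off = eps / num_classes
    on = 1.0 - eps + off  # true class keeps most of the mass
    return [on if k == target else off for k in range(num_classes)]

def cross_entropy(logits, target_dist):
    """Cross-entropy between a target distribution and softmax(logits),
    via a numerically stable log-softmax."""
    m = max(logits)
    log_z = m + math.log(sum(math.exp(z - m) for z in logits))
    return -sum(t * (z - log_z) for t, z in zip(target_dist, logits))

# Smoothed targets for class 1 of 4, eps=0.1: [0.025, 0.925, 0.025, 0.025]
dist = smooth_labels(4, 1, eps=0.1)
loss = cross_entropy([0.2, 2.0, -1.0, 0.3], dist)
```

With `eps=0` this reduces to the usual one-hot cross-entropy; increasing `eps` penalizes over-confident predictions, which is the regularization effect the papers above study.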
Noise Injection Techniques: a comprehensive exploration of methods for making machine learning models more robust to noisy real-world data. This repository explains and demonstrates Gaussian noise, dropout, mixup, masking, adversarial noise, and label smoothing, with intuitive explanations, theory, and practical code examples.
Mean Teacher-based Cross-Domain Activity Recognition using WiFi Signals, IoTJ 2023
Source code of our paper "Focus on the Target's Vocabulary: Masked Label Smoothing for Machine Translation" @acl-2022
Build an algorithm that can predict multiple future states of Limit Order Books using high-frequency, multi-variate, short time-frame data
Supplementary material and code for "From Label Smoothing to Label Relaxation" as published at AAAI 2021.
Label-smoothed aggregation cross-entropy loss for generalisation in sequence-to-sequence tasks
Label Smoothing applied in Focal Loss
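A hypothetical sketch of how the two losses can be combined (this is one plausible formulation, not the repository's actual code): each class term of a label-smoothed cross-entropy is down-weighted by the focal factor (1 - p_k)^gamma, so easy, well-classified terms contribute less.

```python
import math

def focal_loss_with_smoothing(logits, target, gamma=2.0, eps=0.1):
    """Focal loss over a label-smoothed target distribution.
    With gamma=0 and eps=0 this reduces to plain cross-entropy."""
    n = len(logits)
    # Stable log-softmax and softmax probabilities.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(z - m) for z in logits))
    log_p = [z - log_z for z in logits]
    p = [math.exp(lp) for lp in log_p]
    # eps-uniform label smoothing of the one-hot target.
    off = eps / n
    smoothed = [1.0 - eps + off if k == target else off for k in range(n)]
    # Focal modulation (1 - p_k)^gamma applied per class term.
    return -sum(t * (1.0 - pk) ** gamma * lp
                for t, pk, lp in zip(smoothed, p, log_p))

loss = focal_loss_with_smoothing([0.2, 2.0, -1.0], target=1)
```

The per-class focal factor is a design choice; another common variant modulates only the true-class term and leaves the smoothing mass unweighted.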
[ICML 2022] This work investigates the compatibility between label smoothing (LS) and knowledge distillation (KD). We suggest using an LS-trained teacher with low-temperature transfer to produce high-performance students.
Simple Tool Box with Pytorch
Soft targets and label smoothing in text classification for probability calibration of output distributions
Adding Image-context in the Label Smoothing process via Geodesic distance
Building High Performance Convolutional Neural Networks with TensorFlow