Awesome papers by date

Here we list papers related to transfer learning by date, starting from 2021-07. For papers published before 2021-07, please refer to the papers by topic, which covers more of the earlier work.

2024-10

  • Transfer Learning on Multi-Dimensional Data: A Novel Approach to Neural Network-Based Surrogate Modeling [arxiv]

    • Transfer learning on multi-dimensional data
  • TransAgent: Transfer Vision-Language Foundation Models with Heterogeneous Agent Collaboration [arxiv]

    • Transfer vision-language foundation models via heterogeneous agent collaboration
  • Test-time adaptation for image compression with distribution regularization [arxiv]

    • Test-time adaptation for image compression with distribution regularization
  • WeatherDG: LLM-assisted Procedural Weather Generation for Domain-Generalized Semantic Segmentation [arxiv]

    • Weather domain generalization
  • Can In-context Learning Really Generalize to Out-of-distribution Tasks? [arxiv]

    • Can in-context learning generalize to OOD tasks?
  • Domain-Conditioned Transformer for Fully Test-time Adaptation [arxiv]

    • Fully test-time adaptation with a domain-conditioned transformer
  • Safety-Aware Fine-Tuning of Large Language Models [arxiv]

    • Safety-aware fine-tuning of LLMs
  • Deep Transfer Learning: Model Framework and Error Analysis [arxiv]

    • Deep transfer learning framework and error analysis
  • Cross-Domain Distribution Alignment for Segmentation of Private Unannotated 3D Medical Images [arxiv]

    • Cross-domain adaptation of private unannotated 3D medical images
  • Stratified Domain Adaptation: A Progressive Self-Training Approach for Scene Text Recognition [arxiv]

    • Stratified domain adaptation
  • Efficiently Learning at Test-Time: Active Fine-Tuning of LLMs [arxiv]

    • Active fine-tuning of LLMs
  • LLM Embeddings Improve Test-time Adaptation to Tabular Y|X shifts [arxiv]

    • Test-time adaptation to tabular Y|X shifts via LLM embeddings
  • AHA: Human-Assisted Out-of-Distribution Generalization and Detection [arxiv]

    • Human-assisted OOD generalization and detection

2024-09

  • Transfer Learning Applied to Computer Vision Problems: Survey on Current Progress, Limitations, and Opportunities [arxiv]

    • A survey of transfer learning for computer vision
  • Spatial Adaptation Layer: Interpretable Domain Adaptation For Biosignal Sensor Array Applications [arxiv]

    • Interpretable domain adaptation
  • DICS: Find Domain-Invariant and Class-Specific Features for Out-of-Distribution Generalization [arxiv]

    • Domain-invariant and class-specific features for OOD generalization
  • Unsupervised Domain Adaptation Via Data Pruning [arxiv]

    • Unsupervised domain adaptation via data pruning
  • LLM-wrapper: Black-Box Semantic-Aware Adaptation of Vision-Language Foundation Models [arxiv]

    • Black-box adaptation of vision language models
  • Can Your Generative Model Detect Out-of-Distribution Covariate Shift? [arxiv]

    • Can your generative models detect OOD covariate shift?
  • Fine-tuning large language models for domain adaptation: Exploration of training strategies, scaling, model merging and synergistic capabilities [arxiv]

    • Fine-tuning LLMs for domain adaptation
  • Dual-Path Adversarial Lifting for Domain Shift Correction in Online Test-time Adaptation [arxiv]

    • Online test-time adaptation using dual-path adversarial lifting
  • Rethinking Knowledge Transfer in Learning Using Privileged Information [arxiv]

    • Using privileged information for knowledge transfer
  • Transfer Learning from Simulated to Real Scenes for Monocular 3D Object Detection [arxiv]

    • Transfer learning from simulated to real scenes for monocular 3D object detection
  • Multi-source Domain Adaptation for Panoramic Semantic Segmentation [arxiv]

    • Multi-source domain adaptation for panoramic semantic segmentation
  • Adapting Vision-Language Models to Open Classes via Test-Time Prompt Tuning [arxiv]

    • Test-time prompt tuning for open classes
  • A More Unified Theory of Transfer Learning [arxiv]

    • A more unified theory of transfer learning
  • Low Saturation Confidence Distribution-based Test-Time Adaptation for Cross-Domain Remote Sensing Image Classification [arxiv]

    • Test-time adaptation for remote sensing image classification

2024-08

  • Unsupervised Domain Adaption Harnessing Vision-Language Pre-training [arxiv]

    • Domain adaptation using vision-language pre-training
  • Domain penalisation for improved Out-of-Distribution Generalisation [arxiv]

    • OOD generalization via domain penalisation
  • Weighted Risk Invariance: Domain Generalization under Invariant Feature Shift [arxiv]

    • Domain generalization under invariant feature shift

2024-07

  • Reducing Spurious Correlation for Federated Domain Generalization [arxiv]

    • Federated domain generalization by reducing spurious correlation
  • Can Modifying Data Address Graph Domain Adaptation? [arxiv]

    • Alignment and rescaling for graph DA
  • Improving Domain Adaptation Through Class Aware Frequency Transformation [arxiv]

    • Class aware frequency transformation for domain adaptation
  • Rethinking Domain Adaptation and Generalization in the Era of CLIP [arxiv]

    • Rethinking domain adaptation in CLIP era
  • Training-Free Model Merging for Multi-target Domain Adaptation [arxiv]

    • Model merging for multi-target domain adaptation
  • SAFT: Towards Out-of-Distribution Generalization in Fine-Tuning [arxiv]

    • Towards OOD generalization in fine-tuning of foundation models
  • Multi-Task Domain Adaptation for Language Grounding with 3D Objects [arxiv]

    • Multi-task domain adaptation for language grounding

2024-05

  • Transfer Learning for CSI-based Positioning with Multi-environment Meta-learning [arxiv]

    • Transfer learning for CSI-based positioning
  • Versatile Teacher: A Class-aware Teacher-student Framework for Cross-domain Adaptation [arxiv]

    • A class-aware teacher-student framework for cross-domain adaptation
  • MICCAI'24 MediCLIP: Adapting CLIP for Few-shot Medical Image Anomaly Detection [arxiv]

    • Adapting CLIP for few-shot medical image anomaly detection

2024-04

  • MDDD: Manifold-based Domain Adaptation with Dynamic Distribution for Non-Deep Transfer Learning in Cross-subject and Cross-session EEG-based Emotion Recognition [arxiv]

    • Manifold-based domain adaptation for EEG-based emotion recognition
  • Domain Adaptation for Learned Image Compression with Supervised Adapters [arxiv]

    • Domain adaptation for learned image compression
  • Test-Time Training on Graphs with Large Language Models (LLMs) [arxiv]

    • Test-time training on graphs with LLMs
  • DACAD: Domain Adaptation Contrastive Learning for Anomaly Detection in Multivariate Time Series [arxiv]

    • Domain adaptation for anomaly detection in multivariate time series
  • CVPR'24 Exploring the Transferability of Visual Prompting for Multimodal Large Language Models [arxiv]

    • Explore the transferability of visual prompting for multimodal LLMs
  • DGMamba: Domain Generalization via Generalized State Space Model [arXiv]

    • Domain generalization using Mamba
  • CVPR'24 Unified Language-driven Zero-shot Domain Adaptation [arxiv]

    • Language-driven zero-shot domain adaptation
  • ICASSP'24 Learning Inference-Time Drift Sensor-Actuator for Domain Generalization [IEEE]

    • Inference-time drift actuator for OOD generalization
  • ICASSP'24 SBM: Smoothness-Based Minimization for Domain Generalization [IEEE]

    • Smoothness-based minimization for OOD generalization
  • ICASSP'24 G2G: Generalized Learning by Cross-Domain Knowledge Transfer for Federated Domain Generalization [IEEE]

    • Federated domain generalization
  • ICASSP'24 Single-Source Domain Generalization in Fundus Image Segmentation Via Moderating and Interpolating Input Space Augmentation [IEEE]

    • Single-source DG in fundus image segmentation
  • ICASSP'24 Style Factorization: Explore Diverse Style Variation for Domain Generalization [IEEE]

    • Style variation for domain generalization
  • ICASSP'24 SPDG-Net: Semantics Preserving Domain Augmentation through Style Interpolation for Multi-Source Domain Generalization [IEEE]

    • Domain augmentation for multi-source DG
  • ICASSP'24 Domaindiff: Boost out-of-Distribution Generalization with Synthetic Data [IEEE]

    • Using synthetic data for OOD generalization
  • ICASSP'24 Multi-Level Augmentation Consistency Learning and Sample Selection for Semi-Supervised Domain Generalization [IEEE]

    • Multi-level augmentation for semi-supervised domain generalization
  • ICASSP'24 MMS: Morphology-Mixup Stylized Data Generation for Single Domain Generalization in Medical Image Segmentation [IEEE]

    • Morphology-mixup for domain generalization

2024-03

  • On the Benefits of Over-parameterization for Out-of-Distribution Generalization [arxiv]

    • Analyzes the effect of over-parameterization on OOD generalization
  • CoDA: Instructive Chain-of-Domain Adaptation with Severity-Aware Visual Prompt Tuning [arxiv]

    • Chain-of-domain adaptation with severity-aware visual prompt tuning
  • Deep Domain Adaptation: A Sim2Real Neural Approach for Improving Eye-Tracking Systems [arxiv]

    • Domain adaptation for eye-tracking systems
  • EAGLE: A Domain Generalization Framework for AI-generated Text Detection [arxiv]

    • Domain generalization for AI-generated text detection
  • DPStyler: Dynamic PromptStyler for Source-Free Domain Generalization [arxiv]

    • Dynamic PromptStyler for source-free domain generalization
  • Neurocomputing'24 Uncertainty-Aware Pseudo-Label Filtering for Source-Free Unsupervised Domain Adaptation [arxiv]

    • Uncertainty-aware pseudo-label filtering for source-free domain adaptation
  • Efficient Domain Adaptation for Endoscopic Visual Odometry [arxiv]

    • Efficient domain adaptation for endoscopic visual odometry
  • Potential of Domain Adaptation in Machine Learning in Ecology and Hydrology to Improve Model Extrapolability [arxiv]

    • Domain adaptation in ecology and hydrology to improve model extrapolability
  • ICLR'24 SF(DA)2: Source-free Domain Adaptation Through the Lens of Data Augmentation [arxiv]

    • Source-free DA through the lens of data augmentation
  • CVPR'24 Universal Semi-Supervised Domain Adaptation by Mitigating Common-Class Bias [arxiv]

    • Universal semi-supervised DA by mitigating common-class bias
  • Domain Adaptation Using Pseudo Labels for COVID-19 Detection [arxiv]

    • Domain adaptation for COVID-19 detection using pseudo labels (a generic pseudo-labelling sketch follows this list)
  • Ensembling and Test Augmentation for Covid-19 Detection and Covid-19 Domain Adaptation from 3D CT-Scans [arxiv]

    • Ensembling and test augmentation for COVID-19 detection and domain adaptation from 3D CT scans
  • V2X-DGW: Domain Generalization for Multi-agent Perception under Adverse Weather Conditions [arxiv]

    • DG for multi-agent perception under adverse weather conditions
  • Bidirectional Multi-Step Domain Generalization for Visible-Infrared Person Re-Identification [arxiv]

    • Bidirectional multi-step DG for visible-infrared person re-identification
  • MedMerge: Merging Models for Effective Transfer Learning to Medical Imaging Tasks [arxiv]

    • Model merging for medical transfer learning
  • SPA: A Graph Spectral Alignment Perspective for Domain Adaptation [NeurIPS 2023] [Pytorch]

    • Graph spectral alignment and neighbor-aware propagation for domain adaptation
  • Addressing Source Scale Bias via Image Warping for Domain Adaptation [arxiv]

    • Addressing source scale bias via image warping for domain adaptation
  • ICLR'24 extended version Learning with Noisy Foundation Models [arxiv]

    • How to fine-tune when the foundation model was pre-trained on noisy data
  • Visual Foundation Models Boost Cross-Modal Unsupervised Domain Adaptation for 3D Semantic Segmentation [arxiv]

    • Visual foundation models boost cross-modal domain adaptation
  • Attention Prompt Tuning: Parameter-efficient Adaptation of Pre-trained Models for Spatiotemporal Modeling [arxiv]

    • Parameter-efficient adaptation for spatiotemporal modeling
  • ICASSP'24 Test-time Distribution Learning Adapter for Cross-modal Visual Reasoning [arxiv]

    • Test-time distribution learning adapter
  • A Study on Domain Generalization for Failure Detection through Human Reactions in HRI [arxiv]

    • Domain generalization for failure detection through human reactions in HRI
  • ICLR'24 Towards Robust Out-of-Distribution Generalization Bounds via Sharpness [arxiv]

    • Robust OOD generalization bounds
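
A number of the adaptation entries above (for example the pseudo-label based COVID-19 detection paper) build on pseudo-label self-training. The snippet below is a minimal, generic sketch of that recipe in PyTorch, not the method of any particular paper; `model`, `optimizer`, the data batches and the confidence threshold are placeholders.

```python
import torch
import torch.nn.functional as F

def self_training_step(model, optimizer, source_batch, target_batch, threshold=0.9):
    """One generic self-training step: supervised loss on labelled source data
    plus a confidence-filtered pseudo-label loss on unlabelled target data."""
    (xs, ys), xt = source_batch, target_batch

    loss = F.cross_entropy(model(xs), ys)      # supervised source loss

    with torch.no_grad():                      # pseudo-labels for confident target samples
        probs = F.softmax(model(xt), dim=1)
        confidence, pseudo_labels = probs.max(dim=1)
        mask = confidence >= threshold
    if mask.any():
        loss = loss + F.cross_entropy(model(xt[mask]), pseudo_labels[mask])

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```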

2024-02

  • Unsupervised Domain Adaptation within Deep Foundation Latent Spaces [arxiv]
    • Domain adaptation using foundation models

2024-01

  • Facing the Elephant in the Room: Visual Prompt Tuning or Full Finetuning? [arxiv]

    • A comparison between visual prompt tuning and full fine-tuning
  • Out-of-Distribution Detection & Applications With Ablated Learned Temperature Energy [arxiv]

    • OOD detection with ablated learned temperature energy
  • LanDA: Language-Guided Multi-Source Domain Adaptation [arxiv]

    • Language-guided multi-source domain adaptation
  • AdaEmbed: Semi-supervised Domain Adaptation in the Embedding Space [arxiv]

    • Semi-supervised domain adaptation in the embedding space
  • Inter-Domain Mixup for Semi-Supervised Domain Adaptation [arxiv]

    • Inter-domain mixup for semi-supervised domain adaptation
  • Source-Free and Image-Only Unsupervised Domain Adaptation for Category Level Object Pose Estimation [arxiv]

    • Source-free and image-only unsupervised domain adaptation
  • ICLR'24 spotlight Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks [arxiv]

    • A new research direction for transfer learning in the era of foundation models: how label noise in pre-training data affects downstream tasks
  • ICLR'24 Supervised Knowledge Makes Large Language Models Better In-context Learners [arxiv]

    • Small models help large language models achieve better OOD performance
  • NeurIPS'23 Geodesic Multi-Modal Mixup for Robust Fine-Tuning [paper]

    • Geodesic mixup for robust fine-tuning
  • NeurIPS'23 Parameter and Computation Efficient Transfer Learning for Vision-Language Pre-trained Models [paper]

    • Parameter and computation efficient transfer learning by reinforcement learning
  • NeurIPS'23 Test-Time Distribution Normalization for Contrastively Learned Visual-language Models [paper]

    • Test-time distribution normalization for contrastively learned VLM
  • NeurIPS'23 A Closer Look at the Robustness of Contrastive Language-Image Pre-Training (CLIP) [paper]

    • A fine-grained analysis of CLIP robustness
  • NeurIPS'23 When Visual Prompt Tuning Meets Source-Free Domain Adaptive Semantic Segmentation [paper]

    • Source-free domain adaptation using visual prompt tuning
  • NeurIPS'23 CODA: Generalizing to Open and Unseen Domains with Compaction and Disambiguation [arxiv]

    • Open set domain generalization using extra classes
  • CPAL'24 FIXED: Frustratingly Easy Domain Generalization with Mixup [arxiv]

    • Easy domain generalization with mixup (see the mixup sketch after this list)
  • SDM'24 Towards Optimization and Model Selection for Domain Generalization: A Mixup-guided Solution [arxiv]

    • Optimization and model selection for domain generalization
  • Leveraging SAM for Single-Source Domain Generalization in Medical Image Segmentation [arxiv]

    • SAM for single-source domain generalization
  • Multi-Source Domain Adaptation with Transformer-based Feature Generation for Subject-Independent EEG-based Emotion Recognition [arxiv]

    • Multi-source DA with Transformer-based feature generation
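
Mixup-based domain generalization appears several times in this list (e.g. FIXED above). The snippet below is a minimal inter-domain mixup sketch, assuming a standard classifier and two same-shaped batches of labelled samples from different domains; it is not the exact objective of any listed paper.

```python
import torch
import torch.nn.functional as F

def interdomain_mixup_loss(model, x_a, y_a, x_b, y_b, alpha=0.2, num_classes=10):
    """Generic mixup across two domains: interpolate inputs and one-hot labels,
    then compute cross-entropy against the soft mixed labels."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    x_mix = lam * x_a + (1.0 - lam) * x_b
    y_mix = lam * F.one_hot(y_a, num_classes).float() \
          + (1.0 - lam) * F.one_hot(y_b, num_classes).float()
    log_probs = F.log_softmax(model(x_mix), dim=1)
    return -(y_mix * log_probs).sum(dim=1).mean()
```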

2023-12

  • Multi-Modal Domain Adaptation Across Video Scenes for Temporal Video Grounding [arxiv]

    • Multi-modal domain adaptation for temporal video grounding
  • Domain Adaptive Graph Classification [arxiv]

    • Domain-adaptive graph classification
  • Understanding and Estimating Domain Complexity Across Domains [arxiv]

    • Understanding and estimating domain complexity across domains
  • Prompt-based Domain Discrimination for Multi-source Time Series Domain Adaptation [arxiv]

    • Prompt-based domain discrimination for multi-source time series domain adaptation
  • NeurIPS'23 SwapPrompt: Test-Time Prompt Adaptation for Vision-Language Models [arxiv]

    • Test-time prompt adaptation for vision-language models
  • AAAI24 Relax Image-Specific Prompt Requirement in SAM: A Single Generic Prompt for Segmenting Camouflaged Objects [arxiv][code]

    • A training-free test-time adaptation approach to relax the instance-specific prompt requirement in SAM
  • Open Domain Generalization with a Single Network by Regularization Exploiting Pre-trained Features [arxiv]

    • Open domain generalization with a single network
  • Stronger, Fewer, & Superior: Harnessing Vision Foundation Models for Domain Generalized Semantic Segmentation [arxiv]

    • Using vision foundation models for domain-generalized semantic segmentation
  • DARNet: Bridging Domain Gaps in Cross-Domain Few-Shot Segmentation with Dynamic Adaptation [arxiv]

    • Dynamic adaptation for cross-domain few-shot segmentation
  • A Unified Framework for Unsupervised Domain Adaptation based on Instance Weighting [arxiv]

    • Instance weighting for unsupervised domain adaptation
  • Target-agnostic Source-free Domain Adaptation for Regression Tasks [arxiv]

    • Target-agnostic source-free DA for regression tasks
  • On the Out-Of-Distribution Robustness of Self-Supervised Representation Learning for Phonocardiogram Signals [arxiv]

    • OOD robustness of self-supervised representation learning for phonocardiogram signals
  • Student Activity Recognition in Classroom Environments using Transfer Learning [arxiv]

    • Using transfer learning to recognize student activities in the classroom

2023-11

  • A2XP: Towards Private Domain Generalization [arxiv]

    • Privacy-preserving domain generalization
  • Layer-wise Auto-Weighting for Non-Stationary Test-Time Adaptation [arxiv]

    • Layer-wise auto-weighting for non-stationary test-time adaptation
  • Domain Generalization by Learning from Privileged Medical Imaging Information [arxiv]

    • Domain generalization by learning from privileged medical imaging information
  • SSL-DG: Rethinking and Fusing Semi-supervised Learning and Domain Generalization in Medical Image Segmentation [arxiv]

    • Combining semi-supervised learning and domain generalization
  • WACV'24 Learning Class and Domain Augmentations for Single-Source Open-Domain Generalization [arxiv]

    • Class and domain augmentation for single-source open-domain generalization
  • Proposal-Level Unsupervised Domain Adaptation for Open World Unbiased Detector [arxiv]

    • Proposal-level unsupervised domain adaptation
  • Robust Fine-Tuning of Vision-Language Models for Domain Generalization [arxiv]

    • Robust fine-tuning of vision-language models for domain generalization
  • NeurIPS 2023 Distilling Out-of-Distribution Robustness from Vision-Language Foundation Models [arxiv]

    • Distilling OOD robustness from vision-language foundation models
  • UbiComp 2024 Optimization-Free Test-Time Adaptation for Cross-Person Activity Recognition [arxiv]

    • Test-time adaptation for cross-person activity recognition (a generic TTA sketch follows this list)
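
Many of the test-time adaptation entries in this list follow the same basic recipe popularized by Tent: minimize the entropy of predictions on incoming test batches while updating only a small set of parameters, typically the affine parameters of normalization layers. Below is a minimal sketch of that generic recipe, not the method of any specific paper above.

```python
import torch
import torch.nn as nn

def collect_norm_params(model):
    """Collect the affine parameters of normalization layers; in Tent-style TTA
    these are usually the only parameters updated at test time."""
    params = []
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.LayerNorm)):
            params += [p for p in (m.weight, m.bias) if p is not None]
    return params

def entropy_minimization_step(model, optimizer, x):
    """One adaptation step: predict on the test batch and minimize prediction entropy."""
    logits = model(x)
    probs = logits.softmax(dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return logits.detach()

# Typical usage: optimizer = torch.optim.SGD(collect_norm_params(model), lr=1e-3)
```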

2023-10

  • PromptStyler: Prompt-driven Style Generation for Source-free Domain Generalization [arxiv]

    • Prompt-driven style generation for source-free domain generalization
  • A Survey of Heterogeneous Transfer Learning [arxiv]

    • A recent survey of heterogeneous transfer learning
  • Equivariant Adaptation of Large Pre-Trained Models [arxiv]

    • Equivariant adaptation of large pre-trained models
  • Effective and Parameter-Efficient Reusing Fine-Tuned Models [arxiv]

    • Effective and parameter-efficient reuse of fine-tuned models
  • Prompting-based Efficient Temporal Domain Generalization [arxiv]

    • Prompt-based temporal domain generalization
  • Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks [arxiv]

    • Noisy model learning: lightweight fine-tuning to suppress the effect of noisy pre-training data on downstream tasks
  • ZooPFL: Exploring Black-box Foundation Models for Personalized Federated Learning [arxiv]

    • Black-box foundation models for personalized federated learning

2023-09

  • Domain Generalization with Fourier Transform and Soft Thresholding [arxiv]

    • Domain generalization with Fourier transform and soft thresholding
  • DePT: Decomposed Prompt Tuning for Parameter-Efficient Fine-tuning [arxiv]

    • Decomposed prompt tuning for parameter-efficient fine-tuning
  • Better Practices for Domain Adaptation [arxiv]

    • Better practices for domain adaptation
  • Domain Adaptation for Efficiently Fine-tuning Vision Transformer with Encrypted Images [arxiv]

    • Domain adaptation for efficiently fine-tuning ViT with encrypted images
  • Robust Activity Recognition for Adaptive Worker-Robot Interaction using Transfer Learning [arxiv]

    • Activity recognition using domain adaptation

2023-08

  • IJCV'23 Exploring Vision-Language Models for Imbalanced Learning [arxiv] [code]

    • Exploring vision-language models for imbalanced learning
  • ICCV'23 Improving Generalization of Adversarial Training via Robust Critical Fine-Tuning [arxiv] [code]

    • Trading off adversarial robustness and generalization via robust critical fine-tuning
  • ICCV'23 Domain-Specificity Inducing Transformers for Source-Free Domain Adaptation [arxiv]

    • Domain-specificity inducing transformers for source-free DA
  • Unsupervised Domain Adaptation via Domain-Adaptive Diffusion [arxiv]

    • Domain-adaptive diffusion for domain adaptation
  • Multi-Scale and Multi-Layer Contrastive Learning for Domain Generalization [arxiv]

    • Multi-scale and multi-layer contrastive learning for domain generalization
  • Exploring the Transfer Learning Capabilities of CLIP in Domain Generalization for Diabetic Retinopathy [arxiv]

    • Domain generalization for diabetic retinopathy
  • Federated Fine-tuning of Billion-Sized Language Models across Mobile Devices [arxiv]

    • Federated fine-tuning of billion-sized language models on mobile devices
  • Source-Free Collaborative Domain Adaptation via Multi-Perspective Feature Enrichment for Functional MRI Analysis [arxiv]

    • Source-free domain adaptation for MRI analysis
  • Towards Realistic Unsupervised Fine-tuning with CLIP [arxiv]

    • Unsupervised fine-tuning of CLIP
  • Fine-tuning can cripple your foundation model; preserving features may be the solution [arxiv]

    • Fine-tuning can cripple foundation models; preserving features may be the solution
  • Exploring Transfer Learning in Medical Image Segmentation using Vision-Language Models [arxiv]

    • Transfer learning for medical image segmentation
  • Transfer Learning for Portfolio Optimization [arxiv]

    • Transfer learning for portfolio optimization
  • NormAUG: Normalization-guided Augmentation for Domain Generalization [arxiv]

    • Normalization augmentation for domain generalization

2023-07

  • Benchmarking Algorithms for Federated Domain Generalization [arxiv]

    • Benchmarking algorithms for federated domain generalization
  • DISPEL: Domain Generalization via Domain-Specific Liberating [arxiv]

    • Domain generalization via domain-specific liberating
  • Review of Large Vision Models and Visual Prompt Engineering [arxiv]

    • A survey of large vision models and visual prompt engineering
  • Intra- & Extra-Source Exemplar-Based Style Synthesis for Improved Domain Generalization [arxiv]

    • Exemplar-based style synthesis for domain generalization
  • SAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation [arxiv]

    • Using SAM for domain adaptation in nighttime UAV tracking
  • Unified Transfer Learning Models for High-Dimensional Linear Regression [arxiv]

    • Transfer learning for high-dimensional linear regression

2023-06

  • Pruning for Better Domain Generalizability [arxiv]

    • Using pruning for better domain generalizability
  • TMLR'23 Generalizability of Adversarial Robustness Under Distribution Shifts [openreview]

    • Evaluating the OOD robustness of adversarially trained models
  • Scaling Down to Scale Up: A Guide to Parameter-Efficient Fine-Tuning [arxiv]

    • A comprehensive guide to parameter-efficient fine-tuning (a minimal PEFT sketch follows this list)
  • ICML'23 A Kernel-Based View of Language Model Fine-Tuning [arxiv]

    • A kernel-based view of language model fine-tuning
  • ICML'23 Improving Visual Prompt Tuning for Self-supervised Vision Transformers [arxiv]

    • Improving visual prompt tuning for self-supervised vision transformers
  • Cross-Database and Cross-Channel ECG Arrhythmia Heartbeat Classification Based on Unsupervised Domain Adaptation [arxiv]

    • ECG heartbeat classification using unsupervised domain adaptation
  • Real-Time Online Unsupervised Domain Adaptation for Real-World Person Re-identification [arxiv]

    • Real-time online unsupervised domain adaptation for person re-identification
  • Federated Domain Generalization: A Survey [arxiv]

    • A survey on federated domain generalization
  • Domain Generalization for Domain-Linked Classes [arxiv]

    • Domain generalization for domain-linked classes
  • Can We Evaluate Domain Adaptation Models Without Target-Domain Labels? A Metric for Unsupervised Evaluation of Domain Adaptation [arxiv]

    • Evaluating domain adaptation models without target-domain labels
  • Universal Test-time Adaptation through Weight Ensembling, Diversity Weighting, and Prior Correction [arxiv]

    • Universal test-time adaptation
  • Adapting Pre-trained Language Models to Vision-Language Tasks via Dynamic Visual Prompting [arxiv]

    • Using dynamic visual prompting to adapt pre-trained language models to vision-language tasks
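
Parameter-efficient fine-tuning (adapters, prompts, and related methods) recurs throughout this list; the PEFT guide above surveys the options. The simplest instance is the pattern sketched below: freeze the pretrained backbone and train only a small new head (adapters and prompts follow the same freeze-most-of-it pattern). This is a minimal sketch assuming a torchvision ResNet-50 (torchvision ≥ 0.13); the class count and optimizer settings are placeholders.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights

def build_linear_probe(num_classes=10):
    """Freeze an ImageNet-pretrained backbone and attach a trainable linear head."""
    model = resnet50(weights=ResNet50_Weights.IMAGENET1K_V2)
    for p in model.parameters():
        p.requires_grad = False                                   # freeze pretrained weights
    model.fc = nn.Linear(model.fc.in_features, num_classes)       # new head is trainable
    return model

model = build_linear_probe(num_classes=10)
optimizer = torch.optim.AdamW((p for p in model.parameters() if p.requires_grad), lr=1e-3)
```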

2023-05

  • Selective Mixup Helps with Distribution Shifts, But Not (Only) because of Mixup [arxiv]

    • A systematic study of why mixup helps with distribution shifts
  • ACL'23 Parameter-Efficient Fine-Tuning without Introducing New Latency [arxiv]

    • Parameter-efficient fine-tuning without introducing new latency
  • Universal Domain Adaptation from Foundation Models [arxiv]

    • Using foundation models for universal domain adaptation
  • Ahead-of-Time P-Tuning [arxiv]

    • Ahead-of-time P-tuning for language models
  • Multi-Source to Multi-Target Decentralized Federated Domain Adaptation [arxiv]

    • Decentralized federated domain adaptation
  • Benchmarking Low-Shot Robustness to Natural Distribution Shifts [arxiv]

    • Low-shot robustness to distribution shifts

2023-04

  • Multi-Source to Multi-Target Decentralized Federated Domain Adaptation [arxiv]

    • Multi-source to multi-target decentralized federated domain adaptation
  • ICML'23 AdaNPC: Exploring Non-Parametric Classifier for Test-Time Adaptation [arxiv]

    • Test-time adaptation with a non-parametric classifier
  • Improved Test-Time Adaptation for Domain Generalization [arxiv]

    • Improved test-time adaptation for domain generalization
  • Reweighted Mixup for Subpopulation Shift [arxiv]

    • Reweighted mixup for subpopulation shift
  • CVPR'23 Zero-shot Generative Model Adaptation via Image-specific Prompt Learning [arxiv]

    • Zero-shot generative model adaptation via image-specific prompt learning
  • Source-free Domain Adaptation Requires Penalized Diversity [arxiv]

    • Source-free DA requires penalized diversity
  • Domain Generalization with Adversarial Intensity Attack for Medical Image Segmentation [arxiv]

    • Domain generalization for medical image segmentation
  • CVPR'23 Meta-causal Learning for Single Domain Generalization [arxiv]

    • Meta-causal learning for domain generalization
  • Domain Generalization In Robust Invariant Representation [arxiv]

    • Domain generalization in robust invariant representation
  • Beyond Empirical Risk Minimization: Local Structure Preserving Regularization for Improving Adversarial Robustness [arxiv]

    • Local structure preservation for adversarial robustness
  • TFS-ViT: Token-Level Feature Stylization for Domain Generalization [arxiv]

    • Token-level feature stylization for domain generalization
  • Are Data-driven Explanations Robust against Out-of-distribution Data? [arxiv]

    • Are data-driven explanations robust against OOD data?
  • ERM++: An Improved Baseline for Domain Generalization [arxiv]

    • An improved ERM baseline for domain generalization
  • CVPR'23 Feature Alignment and Uniformity for Test Time Adaptation [arxiv]

    • Feature alignment and uniformity for test-time adaptation
  • Finding Competence Regions in Domain Generalization [arxiv]

    • Finding competence regions in domain generalization
  • CVPR'23 TWINS: A Fine-Tuning Framework for Improved Transferability of Adversarial Robustness and Generalization [arxiv]

    • Improving adversarial robustness and generalization simultaneously
  • CVPR'23 Trainable Projected Gradient Method for Robust Fine-tuning [arxiv]

    • Trainable projected gradient method for robust fine-tuning
  • Parameter-Efficient Tuning Makes a Good Classification Head [arxiv]

    • Parameter-efficient tuning makes a good classification head
  • Complementary Domain Adaptation and Generalization for Unsupervised Continual Domain Shift Learning [arxiv]

    • Continual domain shift learning using adaptation and generalization

2023-03

  • CVPR'23 A New Benchmark: On the Utility of Synthetic Data with Blender for Bare Supervised Learning and Downstream Domain Adaptation [arxiv]

    • A new benchmark with synthetic Blender data for supervised learning and domain adaptation
  • Unsupervised domain adaptation by learning using privileged information [arxiv]

    • Domain adaptation by learning using privileged information
  • A Unified Continual Learning Framework with General Parameter-Efficient Tuning [arxiv]

    • A unified continual learning framework with parameter-efficient tuning
  • CVPR'23 Sharpness-Aware Gradient Matching for Domain Generalization [arxiv]

    • Sharpness-aware gradient matching for domain generalization
  • TempT: Temporal consistency for Test-time adaptation [arxiv]

    • Temporal consistency for test-time adaptation
  • TMLR'23 Learn, Unlearn and Relearn: An Online Learning Paradigm for Deep Neural Networks [arxiv]

    • An online learning paradigm for deep neural networks
  • ICLR'23 workshop SPDF: Sparse Pre-training and Dense Fine-tuning for Large Language Models [arxiv]

    • Sparse pre-training and dense fine-tuning
  • CVPR'23 ALOFT: A Lightweight MLP-like Architecture with Dynamic Low-frequency Transform for Domain Generalization [arxiv]

    • A lightweight MLP-like module for domain generalization
  • ICLR'23 Contrastive Alignment of Vision to Language Through Parameter-Efficient Transfer Learning [arxiv]

    • Contrastive vision-language alignment via parameter-efficient transfer learning
  • Probabilistic Domain Adaptation for Biomedical Image Segmentation [arxiv]

    • Probabilistic domain adaptation for biomedical image segmentation
  • Imbalanced Domain Generalization for Robust Single Cell Classification in Hematological Cytomorphology [arxiv]

    • Imbalanced domain generalization for single-cell classification
  • Revisit Parameter-Efficient Transfer Learning: A Two-Stage Paradigm [arxiv]

    • A two-stage paradigm for parameter-efficient transfer learning
  • Unsupervised Cumulative Domain Adaptation for Foggy Scene Optical Flow [arxiv]

    • Domain adaptation for foggy-scene optical flow
  • ICLR'23 AutoTransfer: AutoML with Knowledge Transfer -- An Application to Graph Neural Networks [arxiv]

    • AutoML with knowledge transfer for graph neural networks
  • Transfer Learning for Real-time Deployment of a Screening Tool for Depression Detection Using Actigraphy [arxiv]

    • Transfer learning for depression detection using actigraphy
  • Domain Generalization via Nuclear Norm Regularization [arxiv]

    • Domain generalization via nuclear norm regularization
  • To Stay or Not to Stay in the Pre-train Basin: Insights on Ensembling in Transfer Learning [arxiv]

    • Investigating ensembling in transfer learning
  • CVPR'23 Masked Images Are Counterfactual Samples for Robust Fine-tuning [arxiv]

    • Masked images as counterfactual samples for robust fine-tuning
  • FedCLIP: Fast Generalization and Personalization for CLIP in Federated Learning [arxiv]

    • Fast generalization and personalization for CLIP in federated learning
  • Robust Representation Learning with Self-Distillation for Domain Generalization [arxiv]

    • Robust representation learning with self-distillation
  • ICLR-23 Temporal Coherent Test-Time Optimization for Robust Video Classification [arxiv]

    • Temporal distribution shift in video classification
  • WSDM-23 A tutorial on domain generalization [link] | [website]

    • A tutorial on domain generalization

2023-02

  • On the Robustness of ChatGPT: An Adversarial and Out-of-distribution Perspective [arxiv] | [code]

    • Adversarial and OOD evaluation of ChatGPT's robustness
  • Transfer learning for process design with reinforcement learning [arxiv]

    • Transfer learning for process design with reinforcement learning
  • Domain Adaptation for Time Series Under Feature and Label Shifts [arxiv]

    • Domain adaptation for time series under feature and label shifts
  • How Reliable is Your Regression Model's Uncertainty Under Real-World Distribution Shifts? [arxiv]

    • Reliability of regression model uncertainty under real-world distribution shifts
  • ICLR'23 SoftMatch: Addressing the Quantity-Quality Tradeoff in Semi-supervised Learning [arxiv]

    • A semi-supervised learning method addressing the quantity-quality tradeoff of pseudo labels
  • Empirical Study on Optimizer Selection for Out-of-Distribution Generalization [arxiv]

    • Optimizer selection for OOD generalization
  • ICML'22 Understanding the failure modes of out-of-distribution generalization [arxiv]

    • Understanding the failure modes of OOD generalization
  • ICLR'23 Out-of-distribution Representation Learning for Time Series Classification [arxiv]

    • OOD representation learning for time series classification

2023-01

  • ICLR'23 FreeMatch: Self-adaptive Thresholding for Semi-supervised Learning [arxiv]

    • A new baseline for semi-supervised learning with self-adaptive thresholding
  • CLIP the Gap: A Single Domain Generalization Approach for Object Detection [arxiv]

    • Using CLIP for single-domain generalization in object detection
  • Language-Informed Transfer Learning for Embodied Household Activities [arxiv]

    • Language-informed transfer learning for embodied household activities
  • Does progress on ImageNet transfer to real-world datasets? [arxiv]

    • ImageNet accuracy does not transfer to down-stream tasks
  • TPAMI'23 Source-Free Unsupervised Domain Adaptation: A Survey [arxiv]

    • A recent survey on source-free domain adaptation
  • Discriminative Radial Domain Adaptation [arxiv]

    • Discriminative radial domain adaptation

2022-12

  • WACV'23 Cross-Domain Video Anomaly Detection without Target Domain Adaptation [arxiv]

    • Cross-domain video anomaly detection without target domain adaptation
  • Co-Learning with Pre-Trained Networks Improves Source-Free Domain Adaptation [arxiv]

    • Pre-trained models improve source-free domain adaptation
  • TMLR'22 A Unified Survey on Anomaly, Novelty, Open-Set, and Out of-Distribution Detection: Solutions and Future Challenges [openreview]

    • A recent survey on OOD/anomaly detection
  • NeurIPS'18 A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks [paper]

    • Using class-conditional distributions for OOD detection
  • ICLR'22 Discrete Representations Strengthen Vision Transformer Robustness [arxiv]

    • Adding discrete representations to ViT strengthens OOD robustness
  • CONDA: Continual Unsupervised Domain Adaptation Learning in Visual Perception for Self-Driving Cars [arxiv]

    • Continual domain adaptation for self-driving cars
  • Finetune like you pretrain: Improved finetuning of zero-shot vision models [arxiv]

    • Improved fine-tuning of zero-shot vision models

2022-11

  • ECCV-22 DecoupleNet: Decoupled Network for Domain Adaptive Semantic Segmentation [arXiv] [Code]

    • Domain adaptation for semantic segmentation
  • Robust Mean Teacher for Continual and Gradual Test-Time Adaptation [arxiv]

    • Mean teacher for continual and gradual test-time adaptation
  • HMOE: Hypernetwork-based Mixture of Experts for Domain Generalization [arxiv]

    • Hypernetwork-based mixture of experts for domain generalization
  • The Evolution of Out-of-Distribution Robustness Throughout Fine-Tuning [arxiv]

    • A systematic study of OOD robustness throughout fine-tuning
  • GLUE-X: Evaluating Natural Language Understanding Models from an Out-of-distribution Generalization Perspective [arxiv]

    • GLUE-X: evaluating natural language understanding models from an OOD generalization perspective
  • CVPR'22 Delving Deep Into the Generalization of Vision Transformers Under Distribution Shifts [arxiv]

    • Generalization of vision transformers under distribution shifts
  • NeurIPS'22 Models Out of Line: A Fourier Lens on Distribution Shift Robustness [arxiv]

    • A Fourier lens on distribution shift robustness
  • CVPR'22 Does Robustness on ImageNet Transfer to Downstream Tasks? [arxiv]

    • Does robustness on ImageNet transfer to downstream tasks?
  • Normalization Perturbation: A Simple Domain Generalization Method for Real-World Domain Shifts [arxiv]

    • Normalization perturbation for domain generalization
  • FIXED: Frustratingly Easy Domain Generalization with Mixup [arxiv]

    • Domain generalization using mixup
  • Learning to Learn Domain-invariant Parameters for Domain Generalization [arxiv]

    • Learning to learn domain-invariant parameters for domain generalization
  • NeurIPS'22 Improved Fine-Tuning by Better Leveraging Pre-Training Data [openreview]

    • Better leveraging pre-training data for fine-tuning
  • NeurIPS'22 Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning [openreview]

    • Adaptive contrastive learning for source-free domain adaptation
  • NeurIPS'22 LOG: Active Model Adaptation for Label-Efficient OOD Generalization [openreview]

    • Model adaptation for label-efficient OOD generalization
  • NeurIPS'22 MetaTeacher: Coordinating Multi-Model Domain Adaptation for Medical Image Classification [openreview]

    • Multi-model domain adaptation for medical image classification
  • NeurIPS'22 Domain Adaptation under Open Set Label Shift [openreview]

    • Domain adaptation under open-set label shift
  • NeurIPS'22 Domain Generalization without Excess Empirical Risk [openreview]

    • Domain generalization without excess empirical risk
  • NeurIPS'22 FedSR: A Simple and Effective Domain Generalization Method for Federated Learning [openreview]

    • A simple and effective domain generalization method for federated learning
  • NeurIPS'22 Probable Domain Generalization via Quantile Risk Minimization [openreview]

    • Domain generalization via quantile risk minimization
  • NeurIPS'22 Beyond Not-Forgetting: Continual Learning with Backward Knowledge Transfer [arxiv]

    • Continual learning with backward knowledge transfer
  • NeurIPS'22 Test Time Adaptation via Conjugate Pseudo-labels [openreview]

    • Test-time adaptation via conjugate pseudo-labels
  • NeurIPS'22 Your Out-of-Distribution Detection Method is Not Robust! [openreview]

    • OOD detection methods are not robust

2022-10

  • NeurIPS'22 Respecting Transfer Gap in Knowledge Distillation [arxiv]

    • Respecting the transfer gap in knowledge distillation
  • Transfer of Machine Learning Fairness across Domains [arxiv]

    • Transferring machine learning fairness across domains
  • On Fine-Tuned Deep Features for Unsupervised Domain Adaptation [arxiv]

    • Fine-tuned deep features for unsupervised domain adaptation
  • WACV-23 ConfMix: Unsupervised Domain Adaptation for Object Detection via Confidence-based Mixing [arxiv]

    • Domain adaptation for object detection via confidence-based mixing
  • CVPR-20 Regularizing CNN Transfer Learning With Randomised Regression [arxiv]

    • Regularizing CNN transfer learning with randomized regression
  • AAAI-21 TransTailor: Pruning the Pre-trained Model for Improved Transfer Learning [arxiv]

    • Pruning the pre-trained model for improved transfer learning
  • PhDthesis Generalizing in the Real World with Representation Learning [arxiv]

    • A PhD thesis on generalizing in the real world with representation learning
  • The Evolution of Out-of-Distribution Robustness Throughout Fine-Tuning [arxiv]

    • Evolution of OOD robustness by fine-tuning
  • Visual Prompt Tuning for Test-time Domain Adaptation [arxiv]

    • Visual prompt tuning for test-time domain adaptation
  • Unsupervised Domain Adaptation for COVID-19 Information Service with Contrastive Adversarial Domain Mixup [arxiv]

    • Domain adaptation for a COVID-19 information service
  • ICONIP'22 IDPL: Intra-subdomain adaptation adversarial learning segmentation method based on Dynamic Pseudo Labels [arxiv]

    • Intra-subdomain adversarial adaptation for segmentation with dynamic pseudo labels
  • NeurIPS'22 Polyhistor: Parameter-Efficient Multi-Task Adaptation for Dense Vision Tasks [arxiv]

    • Parameter-efficient multi-task adaptation for dense vision tasks
  • Out-of-Distribution Generalization in Algorithmic Reasoning Through Curriculum Learning [arxiv]

    • OOD generalization in algorithmic reasoning through curriculum learning
  • Towards Out-of-Distribution Adversarial Robustness [arxiv]

    • Towards OOD adversarial robustness
  • TripleE: Easy Domain Generalization via Episodic Replay [arxiv]

    • Easy domain generalization by episodic replay
  • Deep Spatial Domain Generalization [arxiv]

    • Deep spatial domain generalization

2022-09

  • Assaying Out-Of-Distribution Generalization in Transfer Learning [arXiv]

    • Extensive experiments assaying OOD generalization in transfer learning
  • ICML-21 Accuracy on the Line: on the Strong Correlation Between Out-of-Distribution and In-Distribution Generalization [arxiv]

    • Strong correlation between ID and OOD
  • Deep Domain Adaptation for Detecting Bomb Craters in Aerial Images [arxiv]

    • Bomb crater detection in aerial images using domain adaptation
  • WACV-23 TeST: Test-time Self-Training under Distribution Shift [arxiv]

    • Test-time self-training under distribution shift
  • StyleTime: Style Transfer for Synthetic Time Series Generation [arxiv]

    • Style transfer for synthetic time series generation
  • Robust Domain Adaptation for Machine Reading Comprehension [arxiv]

    • Domain adaptation for machine reading comprehension
  • Generalized representations learning for time series classification [arxiv]

    • Generalized representation learning for time series classification
  • USB: A Unified Semi-supervised Learning Benchmark [arxiv] [code]

    • A unified semi-supervised learning benchmark and codebase
  • Test-Time Training with Masked Autoencoders [arxiv]

    • Test-time training with masked autoencoders
  • Test-Time Prompt Tuning for Zero-Shot Generalization in Vision-Language Models [arxiv]

    • Test-time prompt tuning for zero-shot generalization (a generic soft-prompt sketch follows this list)
  • Language-aware Domain Generalization Network for Cross-Scene Hyperspectral Image Classification [arxiv]

    • Domain generalization for cross-scene hyperspectral image classification
  • IEEE-TMM'22 Uncertainty Modeling for Robust Domain Adaptation Under Noisy Environments [IEEE]

    • Uncertainty modeling for robust domain adaptation in noisy environments
  • Improving Robustness to Out-of-Distribution Data by Frequency-based Augmentation [arxiv]

    • Improving OOD robustness by frequency-based augmentation
  • Domain Generalization for Prostate Segmentation in Transrectal Ultrasound Images: A Multi-center Study [arxiv]

    • Domain generalization for prostate segmentation
  • Domain Adaptation from Scratch [arxiv]

    • Domain adaptation from scratch
  • Towards Optimization and Model Selection for Domain Generalization: A Mixup-guided Solution [arxiv]

    • Optimization and model selection for domain generalization
  • Conv-Adapter: Exploring Parameter Efficient Transfer Learning for ConvNets

    • A parameter-efficient CNN adapter for transfer learning
  • Equivariant Disentangled Transformation for Domain Generalization under Combination Shift

    • Equivariant disentangled transformation for domain generalization under combination shift
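
Prompt tuning shows up repeatedly above (test-time prompt tuning, visual prompt tuning, and so on). The common core is a small set of trainable prompt vectors prepended to a frozen model's input embeddings. Below is a model-agnostic sketch of that core idea, not the interface of any particular vision-language library; the prompt length, embedding size, and example tensor shapes are placeholders.

```python
import torch
import torch.nn as nn

class SoftPrompt(nn.Module):
    """Learnable prompt vectors prepended to (frozen) token embeddings before
    they enter a frozen encoder; only the prompt parameters are trained."""

    def __init__(self, prompt_len: int, embed_dim: int):
        super().__init__()
        self.prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        # token_embeddings: (batch, seq_len, embed_dim)
        batch = token_embeddings.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, token_embeddings], dim=1)

# Example: prepend 8 trainable prompt tokens to a batch of embeddings.
soft_prompt = SoftPrompt(prompt_len=8, embed_dim=512)
embeddings = torch.randn(4, 16, 512)
print(soft_prompt(embeddings).shape)  # torch.Size([4, 24, 512])
```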

2022-08

2022-07

2022-06

2022-05

2022-04

Updated at 2022-04-29:

2022-03

2022-02

2022-01

2021-12

2021-11

2021-10

2021-09

2021-08

2021-07