# Super In-Context Learning (SuperICL)

Code for "Small Models are Valuable Plug-ins for Large Language Models".

*(Figure: SuperICL workflow)*

## How to Run Code

### Setup

**Install Requirements**

```bash
pip install -r requirements.txt
```

**Add OpenAI API Key**

```bash
cp api_config_example.py api_config.py
vi api_config.py
```
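`api_config.py` is expected to hold your OpenAI credentials. A hypothetical sketch of what you fill in — the actual variable names are defined in `api_config_example.py` and may differ:

```python
# Hypothetical contents of api_config.py -- copy api_config_example.py
# and fill in your own key; variable names here are illustrative.
api_key = "sk-..."  # your OpenAI API key
```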

### GLUE

```bash
python run_glue.py \
  --model_path roberta-large-mnli \
  --model_name RoBERTa-Large \
  --dataset mnli-m \
  --explanation  # include this to enable explanations for overrides
```
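The core idea behind the command above (per the paper) is that a small plug-in model's predictions and confidences are injected into the LLM's in-context prompt, so the LLM can keep or override them. A minimal illustrative sketch of such a prompt format — the field names, wording, and example data here are assumptions; the real construction lives in `run_glue.py`:

```python
def build_supericl_prompt(train_examples, test_input, plugin_pred, plugin_conf):
    """Build a SuperICL-style prompt: each in-context example carries the
    small model's prediction and confidence alongside the gold label, and
    the test input ends with an open "Label:" slot for the LLM to fill.
    Illustrative only -- the actual format in this repo may differ."""
    lines = []
    for ex in train_examples:
        lines.append(
            f"Input: {ex['text']}\n"
            f"RoBERTa-Large Prediction: {ex['plugin_pred']} "
            f"(Confidence: {ex['plugin_conf']:.2f})\n"
            f"Label: {ex['label']}\n"
        )
    lines.append(
        f"Input: {test_input}\n"
        f"RoBERTa-Large Prediction: {plugin_pred} "
        f"(Confidence: {plugin_conf:.2f})\n"
        f"Label:"
    )
    return "\n".join(lines)
```

The prompt string would then be sent to the OpenAI API, and the returned label compared against the plug-in model's prediction.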

For all supported tasks, see here.

For the complete set of parameters, see the code here.

### XNLI

```bash
python run_xnli.py \
  --model_path /path/to/model \
  --model_name XLM-V \
  --lang en,ar,bg,de,el,es,fr,hi,ru,sw,th,tr,ur,vi,zh
```

For the complete set of parameters, see the code here.

## Citation

```bibtex
@article{xu2023small,
  title={Small Models are Valuable Plug-ins for Large Language Models},
  author={Canwen Xu and Yichong Xu and Shuohang Wang and Yang Liu and Chenguang Zhu and Julian McAuley},
  journal={arXiv preprint arXiv:2305.08848},
  year={2023}
}
```