Repositories list (12 repositories)
MoH (Public)
Vitron (Public) — NeurIPS 2024 Paper: A Unified Pixel-level Vision LLM for Understanding, Generating, Segmenting, Editing
MoE++ — Accelerating Mixture-of-Experts Methods with Zero-Computation Experts
Skywork-Reward (Public)
DAQ-VS (Public)
LLMs-as-Instructors (Public)
Gamba (Public)
Skywork-MoE (Public)
PointCloudMamba (Public)
vllm (Public)
Skywork (Public) — Skywork series models are pre-trained on 3.2TB of high-quality multilingual (mainly Chinese and English) and code data. We have open-sourced the model, training data, evaluation data, and evaluation methods.