Does the app support Chinese? #2377
Replies: 15 comments · 55 replies
-
You can help Immich translate into Chinese by contacting @alextran1502 on Discord or by mail. He will send you an invitation to the translation service, and you can start translating.
-
I admire this project and its smart search function. I am personally comfortable with English, but my parents are not good at English and want to search in Chinese. However, searching in Chinese does not work correctly. Take the following example: "昆虫" means "insects" in English. I searched in Chinese on the left and in English on the right. As the picture illustrates, the Chinese search gave wrong results while the English search worked perfectly. I would appreciate it if the project improved the search to fully support Chinese.
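The failure mode here is the text encoder: the default ViT-B-32__openai text tower was trained on English, so a Chinese query embeds essentially arbitrarily and lands far from the matching image embedding. A toy illustration of that ranking effect, using made-up vectors (these are not real CLIP outputs, just stand-ins):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy embeddings standing in for CLIP outputs (assumed values for illustration).
image_insect = [0.9, 0.1, 0.1]              # image embedding of an insect photo
text_en = {"insects": [0.88, 0.12, 0.05]}   # English query embeds near the image
# An English-only text encoder has never seen "昆虫", so its embedding is
# effectively arbitrary and lands far from the image embedding.
text_zh = {"昆虫": [0.1, 0.2, 0.95]}

print(cosine(image_insect, text_en["insects"]))  # high similarity
print(cosine(image_insect, text_zh["昆虫"]))      # low similarity
```

A multilingual text encoder fixes this by mapping "昆虫" and "insects" to nearby points in the same embedding space.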
-
How did you solve this? I need the same thing.
-
Did you ever solve this? I was considering a domestic app called MT Photo, which seems to be built on the Immich framework, but unfortunately MT Photo does not support panoramic photos. Supporting Chinese would probably require rewriting the backend's integration with the AI recognition service.
-
I looked at the source code: only the M-CLIP and OpenCLIP model families are supported; loading anything else raises an error. The model-loading code is:

    from typing import Any

    from app.schemas import ModelType

    from .base import InferenceModel
    from .clip import MCLIPEncoder, OpenCLIPEncoder
    from .constants import is_insightface, is_mclip, is_openclip
    from .facial_recognition import FaceRecognizer


    def from_model_type(model_type: ModelType, model_name: str, **model_kwargs: Any) -> InferenceModel:
        match model_type:
            case ModelType.CLIP:
                if is_openclip(model_name):
                    return OpenCLIPEncoder(model_name, **model_kwargs)
                elif is_mclip(model_name):
                    return MCLIPEncoder(model_name, **model_kwargs)
            case ModelType.FACIAL_RECOGNITION:
                if is_insightface(model_name):
                    return FaceRecognizer(model_name, **model_kwargs)
            case _:
                raise ValueError(f"Unknown model type {model_type}")
        raise ValueError(f"Unknown {model_type} model {model_name}")

The immich_machine_learning service only supports M-CLIP or OpenCLIP, so simply replacing the model file does not work.
-
I tried changing it to support CN-CLIP; you can pull and try it if you are interested: https://github.com/si9ma/immich
-
@isolierte It WORKS!!!
-
Just submitted it.
-
I deployed via Docker Compose on an N5105 NAS. The tasks under Jobs run normally, but the integrated GPU is not being used. Do I need to set some additional parameters? I am using immich-machine-learning-cn-clip-cpu @Tea-NT
-
Thank you for your hard work; this app has unlimited potential.
-
Regarding the Chinese model question, I have to say something here for anyone who needs it. The official documentation has a Q&A that briefly mentions how to switch models. The multilingual models Immich supports cover more than 100 languages, including Chinese, and all the supported models can be fetched from the download page. Among the multilingual models, the one I most recommend is the very strong XLM-Roberta-Large-Vit-B-16Plus. This model not only supports 100+ languages, its results in Chinese and even in English exceed the original CLIP. I got this from the model's published comparison data, so it should be credible. If anyone is still following this issue, give it a try; I am planning to switch today.
-
The detailed Chinese-language comparison is there as well, under Validation & Training Curves. If that holds, there is no need to keep using CN-CLIP. If I have misunderstood, or this model's performance is actually far behind CN-CLIP, please ignore this.
-
If you cannot download the models, you can get them from my cloud drive; I have packaged and uploaded them there. If your network struggles to pull these models, download them from the drive, then extract and upload them to the model directory. The model used by Smart Search is immich-app/XLM-Roberta-Large-Vit-B-16Plus (it supports Chinese and works very well; the default ViT-B-32__openai does not support Chinese).
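If you do fetch an archive manually, the extracted folder just needs to end up where the machine-learning service looks for models. A hypothetical sketch of the expected layout (the `cache_dir/clip/<model-name>` structure and the file names are my assumptions; verify against your own container's mounted model-cache volume):

```python
from pathlib import Path
import tempfile

# Stands in for the mounted model-cache volume of the ML container.
cache_dir = Path(tempfile.mkdtemp())

model_name = "XLM-Roberta-Large-Vit-B-16Plus"
model_dir = cache_dir / "clip" / model_name
model_dir.mkdir(parents=True)

# Simulate the extracted archive contents (real archives hold ONNX weights etc.).
(model_dir / "textual.onnx").touch()
(model_dir / "visual.onnx").touch()

# With the files in place, the service can load the model without the network.
print(sorted(p.name for p in model_dir.iterdir()))
# → ['textual.onnx', 'visual.onnx']
```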
-
Someone just asked me about this, so one reminder: if your network is fine, you do not need to download the models from any website. To switch, just enter XLM-Roberta-Large-Vit-B-16Plus in the settings page of the Immich web UI. Yes, pasting it is enough! The download and the remaining steps happen automatically.
-
https://github.com/deadash/immich_ai I wrote a standalone machine-recognition service for Immich in Rust (currently it only supports CLIP). It uses chinese_clip for recognition and supports TensorRT/CUDA/DML, so an Intel GPU can be used for acceleration (works on an N100). You can run the official repo and point its machine-learning server address at this program, and recognition works. Models are downloaded from ModelScope, which is also fast within China.
-
The feature
Thanks for the product, but my English is poor. Will it support Chinese?
Platform