Replies: 2 comments
-
Smart Connections will use whatever tokens are available to it. So if you use the 16k model, it will include more notes as context. The effect this has on results depends entirely on your query, since some queries benefit from or need more context than others. 🌴 Brian
-
I answer the tokens-per-query question here: #330 (comment). But yeah, as Brian said, it's mostly to do with your prompt and/or context. With more tokens you can write larger, more specific, and more robust prompts; you can also send more text from the source you're summarizing, say, so the model has more context to produce a proper summary, or draw on more of the notes in your vault. So basically, you're able to give the AI more data to work with.
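To make the "more tokens, more notes" point concrete, here's a rough sketch of how a larger context window changes how many notes can be packed into a single request. This is illustrative only, not Smart Connections' actual code: the token counts use the common ~4-characters-per-token heuristic, the context limits are the standard 4k/16k GPT-3.5 ones, and the `reserved` parameter (tokens set aside for the prompt and reply) is a made-up placeholder.

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token for English)."""
    return max(1, len(text) // 4)

def notes_that_fit(notes, context_limit, reserved=1000):
    """Greedily pack notes into the token budget left after reserving
    room for the prompt and the model's reply (illustrative logic)."""
    budget = context_limit - reserved
    packed = []
    for note in notes:
        cost = estimate_tokens(note)
        if cost > budget:
            break
        packed.append(note)
        budget -= cost
    return packed

notes = ["x" * 4000] * 20              # twenty notes of roughly 1000 tokens each
small = notes_that_fit(notes, 4096)    # gpt-3.5-turbo
large = notes_that_fit(notes, 16384)   # gpt-3.5-turbo-16k
print(len(small), len(large))          # → 3 15
```

Same notes, same query: the 16k window simply leaves room for several times as many notes of context after the prompt and reply are accounted for.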
-
Hi, I wonder how it would affect Smart Connections if I used the normal GPT-3.5 model instead of GPT-3.5-16k. Does a smaller token window mean some long notes cannot be processed?
Also, how many tokens does one query usually take?
Thanks