diff --git a/_posts/2024-11-29-quarkus-jlama.adoc b/_posts/2024-11-29-quarkus-jlama.adoc
index 5eb576cae0..e218c7e5fa 100644
--- a/_posts/2024-11-29-quarkus-jlama.adoc
+++ b/_posts/2024-11-29-quarkus-jlama.adoc
@@ -11,7 +11,7 @@ Currently the vast majority of LLM-based applications relie on external services
 
 Even worse, this usage pattern also comes with both privacy and security concerns, since it is virtually impossible to be sure how those service providers will eventually re-use the prompts of their customers, which in some cases could also contain sensitive information.
 
-For these reasons many companies are deciding to train or fine-tune smaller models that do not claim to be usable in any context, but that will be tailored for the business specific needs and to run these models on premise or on private clouds.
+For these reasons many companies are exploring the option of training or fine-tuning smaller models that do not claim to be usable in a general context, but that are tailored towards specific business needs and subsequently running (serving in LLM terms) these models on premise or on private clouds.
 The features provided by these specialized models need to be integrated into the existing software infrastructure, that in the enterprise world are very often written in Java. This could be accomplished following a traditional client-server architecture, for instance serving the model through an external server like https://ollama.com/[Ollama] and querying it through REST calls. While this should not present any particular problem for Java developers, they could work more efficiently, if they could consume the model directly in Java and without any need to install additional tools.
 
 Finally the possibility of embedding the LLM interaction directly in the same Java process running the application will make it easier to move from local dev to deployment, relieving IT from the burden of managing an external server, thus bypassing the need for a more mature platform engineering strategy. This is where Jlama comes into play.