diff --git a/_posts/2023-11-16-quarkus-meets-langchain4j.adoc b/_posts/2023-11-16-quarkus-meets-langchain4j.adoc
new file mode 100644
index 0000000000..d1dc5c61df
--- /dev/null
+++ b/_posts/2023-11-16-quarkus-meets-langchain4j.adoc
@@ -0,0 +1,277 @@
+---
+layout: post
+title: 'When Quarkus meets LangChain4j'
+date: 2023-11-15
+tags: AI langchain
+synopsis: 'Learn about the new quarkus-langchain4j extension to integrate LLMs in Quarkus applications.'
+author: cescoffier
+---
+:imagesdir: /assets/images/posts/llms
+
+Large language models (LLMs) are reshaping the world of software, altering the way we interact with users and develop business logic.
+
+Popularized by https://openai.com/[OpenAI]'s https://chat.openai.com/[ChatGPT], LLMs are now available in many flavors and sizes. The https://huggingface.co/models[Hugging Face] platform references hundreds of them, and major tech companies like Facebook, Google, Microsoft, Amazon, and IBM also provide their own models.
+
+LLMs are not a new concept. They have been around for a while, but they were never as powerful or as accessible as they became when OpenAI made the ChatGPT API publicly available. Since then, the Quarkus team has been thinking about what it would mean to integrate LLMs into the Quarkus ecosystem. The talk https://www.youtube.com/watch?app=desktop&v=BD1MSLbs9KE[Java Meets AI] from Lize Raes at Devoxx 2023 has been a great source of inspiration.
+
+Since then, the Quarkus team, in collaboration with Dmytro Liubarskyi and the LangChain4j team, has been working on an extension to integrate LLMs in Quarkus applications. This extension is based on the https://github.com/langchain4j[LangChain4j library], which provides a common API to interact with LLMs. The LangChain4j project is a Java re-implementation of the famous https://www.langchain.com/[langchain] library.
+
+In this blog post, we will see how to use the just-released https://docs.quarkiverse.io/quarkus-langchain4j/dev/index.html[quarkus-langchain4j] 0.1 extension to integrate LLMs in Quarkus applications. This extension is an exploration to understand how LLMs can be used in Quarkus applications.
+
+== Overview
+
+First, let's have a look at the big picture. When integrating an LLM into a Quarkus application, you need to describe what you want the AI to do. Unlike traditional code, you are going to explain the behavior of the AI using natural language. Of course, there are a few techniques to tame the AI, but we will explore that later.
+
+Relying strictly on the LLM's built-in knowledge might not be enough. Thus, the Quarkus LangChain4j extension provides two mechanisms to extend it:
+
+- Tools - a tool lets the LLM execute actions in your application. For instance, you can use a tool to send an email, call a REST endpoint, or execute a database query. The LLM decides when to use the tool, the method parameters, and what to do with the result.
+- Document stores - LLMs are not good at remembering things. In addition, their context has a size limit. Thus, the extension provides a way to store and retrieve information from document stores. Before calling the LLM, the extension can ask for relevant documents in a document store and attach them to the context. The LLM can then use this data to make a decision. For instance, you can load spreadsheet data, reports, or data from a database.
+
+The following diagram illustrates the interactions between the LLM, the tools, and the document stores:
+
+image::llms-big-picture.png[Quarkus LLM integration - the big picture,float="right",align="center"]
+
+
+== Show me some code!
+
+Alright, enough "bla bla", let's see some code! We are going to use OpenAI GPT-3.5 (note that it's not the state-of-the-art model, but it's good enough for this demo), give it some product reviews, and ask the LLM to classify them as positive or negative. The full code is available in the https://github.com/quarkiverse/quarkus-langchain4j/tree/main/samples/review-triage[quarkus-langchain4j repository].
+
+First, we need the `quarkus-langchain4j-openai` extension:
+
+[source, xml]
+----
+<dependency>
+    <groupId>io.quarkiverse.langchain4j</groupId>
+    <artifactId>quarkus-langchain4j-openai</artifactId>
+    <version>0.1.0</version>
+</dependency>
+----
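+
+To talk to OpenAI, the extension also needs an API key. Assuming the configuration property documented by the extension, you can set it in `application.properties` like this:
+
+[source, properties]
+----
+# Your OpenAI API key (property name taken from the extension documentation)
+quarkus.langchain4j.openai.api-key=sk-...
+----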
+
+Once we have the extension, it's time to tell the LLM what we want to do. The Quarkus LangChain4j extension provides a declarative way to describe LLM interactions. The idea is the same as the Quarkus REST client: we model the interaction using an interface annotated with `@RegisterAiService`:
+
+[source, java]
+----
+@RegisterAiService
+public interface TriageService {
+ // methods.
+}
+----
+
+The rest of the application can then use the LLM by injecting the `TriageService` interface and calling its methods.
+
+Speaking about methods, that's where the magic happens. You will describe what you want the LLM to do using natural language. First, you start with `@SystemMessage` to define the role and scope. Then, you can use `@UserMessage` to describe the task.
+
+[source, java]
+----
+@RegisterAiService
+public interface TriageService {
+ @SystemMessage("""
+ You are working for a bank, processing reviews about
+ financial products. Triage reviews into positive and
+ negative ones, responding with a JSON document.
+ """
+ )
+ @UserMessage("""
+ Your task is to process the review delimited by ---.
+ Apply sentiment analysis to the review to determine
+ if it is positive or negative, considering various languages.
+
+ For example:
+ - `I love your bank, you are the best!` is a 'POSITIVE' review
+ - `J'adore votre banque` is a 'POSITIVE' review
+ - `I hate your bank, you are the worst!` is a 'NEGATIVE' review
+
+ Respond with a JSON document containing:
+ - the 'evaluation' key set to 'POSITIVE' if the review is
+ positive, 'NEGATIVE' otherwise
+ - the 'message' key set to a message thanking or apologizing
+ to the customer. These messages must be polite and match the
+ review's language.
+
+ ---
+ {review}
+ ---
+ """)
+ TriagedReview triage(String review);
+}
+----
+
+Voilà! That's all you need to do to describe the interaction with the LLM. The instructions follow a set of principles to shape the LLM response. Learn more about these techniques in https://docs.quarkiverse.io/quarkus-langchain4j/dev/prompt-engineering.html[the dedicated prompt engineering page].
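+
+The `triage` method returns a `TriagedReview`: the extension maps the LLM's JSON response to this type. Based on the 'evaluation' and 'message' keys requested in the prompt, a minimal sketch of such a record (the sample's exact shape may differ) could be:
+
+[source, java]
+----
+public record TriagedReview(Evaluation evaluation, String message) {
+
+    // Matches the 'POSITIVE' / 'NEGATIVE' values requested in the prompt
+    public enum Evaluation {
+        POSITIVE,
+        NEGATIVE
+    }
+}
+----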
+
+Now, to call the LLM from the application code, just inject the `TriageService` and call the `triage` method:
+
+[source, java]
+----
+@Path("/review")
+public class ReviewResource {
+
+ @Inject
+ TriageService triage;
+
+ record Review(String review) {
+ // User text
+ }
+
+ @POST
+ public TriagedReview triage(Review review) {
+ return triage.triage(review.review());
+ }
+
+}
+----
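+
+Assuming a JSON serialization extension (such as RESTEasy Reactive Jackson) is present and the application runs on the default port, a hypothetical request could look like:
+
+[source, bash]
+----
+curl -X POST http://localhost:8080/review \
+     -H "Content-Type: application/json" \
+     -d '{"review": "I love your bank, you are the best!"}'
+----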
+
+That's it! The LLM is now integrated into the application. The `TriageService` interface is used as an ambassador to call the LLM. This declarative approach has many advantages:
+
+- Testability - you can easily mock the LLM by providing a fake implementation of the interface.
+- Observability - you can use the Quarkus metrics annotation to monitor the LLM methods.
+- Resilience - you can use the Quarkus fault-tolerance annotations to handle failures, timeouts, and other transient issues, as sketched below.
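+
+For instance, assuming the SmallRye Fault Tolerance extension is on the classpath, a resilient variant of the AI service could look like this hypothetical sketch:
+
+[source, java]
+----
+@RegisterAiService
+public interface TriageService {
+
+    // Prompt annotations omitted for brevity
+    @Timeout(value = 10, unit = ChronoUnit.SECONDS)
+    @Fallback(fallbackMethod = "fallback")
+    TriagedReview triage(String review);
+
+    // Invoked when the LLM call fails or times out
+    default TriagedReview fallback(String review) {
+        return new TriagedReview(TriagedReview.Evaluation.NEGATIVE,
+                "Sorry, we could not process your review at the moment.");
+    }
+}
+----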
+
+== Tools and Document loader
+
+The previous example is a bit simplistic. In the real world, you will need to extend the LLM knowledge with tools and document stores. The `@RegisterAiService` annotation lets you define the tools and document stores to use.
+
+=== Tools
+
+Tools are methods that the LLM can invoke.
+
+To declare a tool, just use the `@Tool` annotation on a _bean_ method:
+
+[source, java]
+----
+@ApplicationScoped
+public class CustomerRepository implements PanacheRepository<Customer> {
+
+ @Tool("get the customer name for the given customerId")
+ public String getCustomerName(long id) {
+ return find("id", id).firstResult().name;
+ }
+
+}
+----
+
+In this example, we are using the Panache repository pattern to access the database. We have a specific method annotated with `@Tool` to retrieve the customer name. When the LLM needs to get the customer name, it instructs Quarkus to call this method and receives the result.
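+
+This assumes a minimal `Customer` entity with a public `name` field (Panache's public-field style); the entity below is a hypothetical sketch, not shown in the sample:
+
+[source, java]
+----
+@Entity
+public class Customer {
+
+    @Id
+    @GeneratedValue
+    public Long id;
+
+    public String name;
+}
+----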
+
+Obviously, it's not a good idea to expose every operation to the LLM. So, in addition to `@Tool`, you need to list the set of tools you allow the LLM to invoke in the `@RegisterAiService` annotation:
+
+[source, java]
+----
+@RegisterAiService(
+ tools = { TransactionRepository.class, CustomerRepository.class },
+ chatMemoryProviderSupplier = RegisterAiService.BeanChatMemoryProviderSupplier.class
+)
+public interface FraudDetectionAi {
+ // ...
+}
+----
+
+The `chatMemoryProviderSupplier` configuration may raise questions. When using tools, a sequence of messages unfolds behind the scenes, so the AI service needs a memory to keep track of these interactions. The `chatMemoryProviderSupplier` attribute configures how that memory is handled. The value `BeanChatMemoryProviderSupplier.class` instructs Quarkus to look for a `ChatMemoryProvider` bean, like the following:
+
+[source, java]
+----
+@RequestScoped
+public class ChatMemoryBean implements ChatMemoryProvider {
+
+    Map<Object, ChatMemory> memories = new ConcurrentHashMap<>();