[Question]: How to save and load knowledge graph. #8391
Replies: 10 comments · 3 replies
-
🤖 Hi, to save the knowledge graph created from KnowledgeGraphIndex, you can persist the underlying index store to disk, for example:
# Assuming `index_store` is an instance of SimpleIndexStore
index_store.persist(persist_path="path_to_your_file.json")
In this example, the index data is written to path_to_your_file.json. Please note that the exact steps depend on which storage backend you are using. I hope this helps! If you have any other questions, feel free to ask. Best,
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
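For completeness, here is a minimal sketch of the usual end-to-end save pattern in llama_index. The `./data` and `./kg_storage` paths are placeholders, not values from this thread:

```python
from llama_index import KnowledgeGraphIndex, SimpleDirectoryReader

# Load source documents (the ./data path is a placeholder)
documents = SimpleDirectoryReader("./data").load_data()

# Build the index once, then persist all of its stores (docstore, index store,
# simple graph store) to a local directory in a single call.
kg_index = KnowledgeGraphIndex.from_documents(documents, max_triplets_per_chunk=10)
kg_index.storage_context.persist(persist_dir="./kg_storage")
```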
-
But I'm not sure whether the code below is correct for loading the knowledge graph, because the results I generate are not relevant to my document.
-
That is indeed the correct way to load an index. What's the exact issue you are facing? Is there a difference in behavior before and after saving/loading?
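For anyone reading along, a minimal reload sketch, assuming the index was previously saved with storage_context.persist(persist_dir="./kg_storage") as shown earlier (the path and the query string are placeholders):

```python
from llama_index import StorageContext, load_index_from_storage

# Rebuild the storage context from the persisted directory, then reconstruct
# the index without re-parsing or re-embedding the source documents.
storage_context = StorageContext.from_defaults(persist_dir="./kg_storage")
kg_index = load_index_from_storage(storage_context)

response = kg_index.as_query_engine().query("your question here")
```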
-
I have an idea to integrate the knowledge graph with a vector index to get better URL link references. At the moment I need to save and load the knowledge graph because the data is too large to rebuild each time. However, I am encountering issues where the knowledge graph results don't match the information in my documents and return incorrect information, and sometimes I receive empty results from the knowledge graph. My code is below:
-
Empty results can happen if no keywords from the query match the knowledge graph; then there is no info to generate a response with 🤔 However, you can combine a vector index with a kg index by following this notebook. No need to use nebula though, the default kg index works fine too.
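A rough, simplified sketch of that combination (no Nebula; `documents` and the query string are placeholders, and the merge below is a cruder stand-in for the notebook's custom retriever):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex, KnowledgeGraphIndex

documents = SimpleDirectoryReader("./data").load_data()

# Build both indexes over the same documents
vector_index = VectorStoreIndex.from_documents(documents)
kg_index = KnowledgeGraphIndex.from_documents(documents, max_triplets_per_chunk=10)

# Retrieve from both and merge, de-duplicating by node id
query = "What does the document say about X?"
vector_nodes = vector_index.as_retriever(similarity_top_k=3).retrieve(query)
kg_nodes = kg_index.as_retriever().retrieve(query)
merged = list({n.node.node_id: n for n in vector_nodes + kg_nodes}.values())
```

The linked notebook wraps this merge in a custom BaseRetriever subclass and hands it to a RetrieverQueryEngine, which is the cleaner way to do it once the idea is clear.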
-
I'm trying to save and reload a Nebula graph knowledge index using the following example code, but I'm unsure how to proceed. Could you provide some guidance? What you provided above is only for SimpleIndexStore; I am using a Nebula graph database.
Reproducible example:
from llama_index import ServiceContext, StorageContext, KnowledgeGraphIndex
from llama_index.llms import OpenAI
from llama_index.embeddings import OpenAIEmbedding
from llama_index.graph_stores import NebulaGraphStore
# Initialize Nebula Graph Store
space_name = "MYEAC_ASSISTANT_TEST"
edge_types, rel_prop_names = ["relationship"], ["relationship"]
tags = ["entity"]
graph_store = NebulaGraphStore(
space_name=space_name,
edge_types=edge_types,
rel_prop_names=rel_prop_names,
tags=tags,
)
# Initialize Language Models
llm_model_name = 'gpt-3.5-turbo-16k'
embedding_model_name = 'text-embedding-ada-002'
llm = OpenAI(temperature=0, model=llm_model_name)
embedding_llm = OpenAIEmbedding(
model=embedding_model_name,
api_key="your_openai_api_key",
api_base="your_openai_api_base",
api_type="your_openai_api_type",
api_version="your_openai_api_version",
)
# Initialize Service Context
service_context = ServiceContext.from_defaults(
llm=llm,
embed_model=embedding_llm,
chunk_size_limit=512
)
# Initialize Graph Storage Context
graph_storage_context = StorageContext.from_defaults(graph_store=graph_store)
# Create Knowledge Graph Index
kg_index = KnowledgeGraphIndex.from_documents(
"your_base_document_data",
storage_context=graph_storage_context,
max_triplets_per_chunk=10,
space_name=space_name,
edge_types=edge_types,
rel_prop_names=rel_prop_names,
tags=tags,
include_embeddings=True,
)
Question: Now, I want to save the kg_index so that I can load it back later without rebuilding it from the documents. How can I do that with a Nebula graph store?
-
Hey @soufianechami right now Dosu only responds to issue authors to keep the discussion focused on their issue. We're working on a way to allow users to "fork" a conversation and chat outside of GitHub, so stay tuned!
-
@soufianechami the sample code there is basically correct. Set up the graph store to point to your existing graph in Nebula, throw it into the storage context, and off you go (since the data lives in Nebula, there is no separate saving step).
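A sketch of what that can look like, assuming the lightweight index metadata was persisted locally with storage_context.persist(persist_dir="./storage_graph") when the index was first built; the space/edge settings mirror the code in the question, the persist_dir path is a placeholder, and the triplets themselves stay in Nebula:

```python
from llama_index import StorageContext, load_index_from_storage
from llama_index.graph_stores import NebulaGraphStore

# Point a graph store at the Nebula space that already holds the triplets
graph_store = NebulaGraphStore(
    space_name="MYEAC_ASSISTANT_TEST",
    edge_types=["relationship"],
    rel_prop_names=["relationship"],
    tags=["entity"],
)

# Rebuild the storage context: graph data comes from Nebula, while the
# docstore/index store are read back from the local persist_dir.
storage_context = StorageContext.from_defaults(
    graph_store=graph_store,
    persist_dir="./storage_graph",
)

# service_context: the same ServiceContext built in the question's code above
kg_index = load_index_from_storage(storage_context, service_context=service_context)
```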
-
Hey @logan-markewich,
-
The loading of a simple graph store is taking a very long time; is there any way to load it faster?
-
Question
Hi
Can anyone help me find a solution for saving the knowledge graph created from KnowledgeGraphIndex?
Thanks