I use Rasqal to parse a Turtle (TTL) file with ~1000 triples, then execute ~350 SPARQL queries to extract all the information I need from it.
It takes ~45 seconds, which seems very slow to me.
I have been trying to reduce this time, without success so far:

- I tried to reuse the same data graph for all my queries, but that is impossible, since the data graph is freed when I free the query.
- I tried to reuse the same query object for all my queries (multiple calls of `rasqal_query_prepare` and `rasqal_query_execute` on the same query), but that does not work.
- For each query, I now use a copy of the data graph, created with `rasqal_new_data_graph_from_data_graph`, instead of creating a new one with `rasqal_new_data_graph_from_uri`, but that does not reduce the execution time.

Is the file re-read from the filesystem for each query? That would explain the slowness.
JonathanGirardeau changed the title to "How to make multiple queries on the same file efficiently" (Oct 14, 2024).