⚠️ This issue is generated, which means the naming might be done differently in this package (ex: `add_documents_json` instead of `addDocumentsJson`). Keep the already existing way of naming in this package to stay idiomatic with the language and this repository.
📣 We strongly recommend doing multiple PRs to solve all the points of this issue
MeiliSearch v0.23.0 introduces two changes:

- new valid formats to push data files, in addition to the JSON format: the CSV and NDJSON formats.
- it enforces the `Content-Type` header for every route requiring a payload (`POST` and `PUT` routes)
Here are the expected changes to completely close the issue:
Currently, the SDKs always send `Content-Type: application/json` with every request. Only the `POST` and `PUT` requests should send `Content-Type: application/json`, not the `DELETE` and `GET` ones.
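A minimal sketch of that rule in TypeScript, assuming a hypothetical `request` helper (the name and shape are illustrative, not this repository's actual HTTP layer):

```ts
// Sketch: only attach Content-Type to requests that carry a payload.
type Method = 'GET' | 'POST' | 'PUT' | 'DELETE';

async function request(
  method: Method,
  url: string,
  body?: string,
  contentType = 'application/json', // overridden for CSV/NDJSON payloads
): Promise<Response> {
  const headers: Record<string, string> = {};
  // GET and DELETE carry no payload, so they must not send Content-Type.
  if (method === 'POST' || method === 'PUT') {
    headers['Content-Type'] = contentType;
  }
  return fetch(url, { method, headers, body });
}
```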
Add the following methods and 🔥 the associated tests 🔥 to ADD the documents. Depending on the format type (`csv` or `ndjson`), the SDK should send `Content-Type: text/csv` or `Content-Type: application/x-ndjson` (see the sketch after this list):
- `addDocumentsJson(string docs, string primaryKey)`
- `addDocumentsCsv(string docs, string primaryKey)`
- `addDocumentsCsvInBatches(string docs, int batchSize, string primaryKey)`
- `addDocumentsNdjson(string docs, string primaryKey)`
- `addDocumentsNdjsonInBatches(string docs, int batchSize, string primaryKey)`
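As an illustration only (the `request` helper comes from the sketch above, and `INDEX_URL` is an assumed placeholder, not this repository's API), the format-specific variants reduce to choosing the right `Content-Type`:

```ts
// Sketch: each variant forwards the raw string payload with the
// matching Content-Type. `request` is the helper sketched earlier.
const INDEX_URL = 'http://localhost:7700/indexes/movies/documents'; // placeholder

function addDocumentsCsv(docs: string, primaryKey?: string): Promise<Response> {
  const url = primaryKey ? `${INDEX_URL}?primaryKey=${primaryKey}` : INDEX_URL;
  return request('POST', url, docs, 'text/csv');
}

function addDocumentsNdjson(docs: string, primaryKey?: string): Promise<Response> {
  const url = primaryKey ? `${INDEX_URL}?primaryKey=${primaryKey}` : INDEX_URL;
  return request('POST', url, docs, 'application/x-ndjson');
}
```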
Add the following methods and 🔥 the associated tests 🔥 to UPDATE the documents. Depending on the format type (`csv` or `ndjson`), the SDK should send `Content-Type: text/csv` or `Content-Type: application/x-ndjson`:

- `updateDocumentsJson(string docs, string primaryKey)`
- `updateDocumentsCsv(string docs, string primaryKey)`
- `updateDocumentsCsvInBatches(string docs, int batchSize, string primaryKey)`
- `updateDocumentsNdjson(string docs, string primaryKey)`
- `updateDocumentsNdjsonInBatches(string docs, int batchSize, string primaryKey)`
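The UPDATE variants would presumably mirror the ADD sketches above with `PUT` in place of `POST` (MeiliSearch's add-or-update documents route); only the HTTP method changes, not the Content-Type logic.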
- `docs` are the documents, sent as a String
- `primaryKey` is the primary key of the index
- `batchSize` is the size of the batch. Example: you can send 2000 documents as a raw String in `docs` and ask for a `batchSize` of 1000, so your documents will be sent to MeiliSearch in two batches.
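For the `InBatches` variants, here is a sketch of the batching idea for NDJSON, where one line is one document (`addDocumentsNdjson` is the assumed variant sketched above):

```ts
// Sketch: split an NDJSON string into chunks of `batchSize` lines and
// send each chunk as its own request. With 2000 lines and a batchSize
// of 1000, this produces exactly two requests.
async function addDocumentsNdjsonInBatches(
  docs: string,
  batchSize: number,
  primaryKey?: string,
): Promise<Response[]> {
  const lines = docs.split('\n').filter((line) => line.trim() !== '');
  const responses: Response[] = [];
  for (let i = 0; i < lines.length; i += batchSize) {
    const batch = lines.slice(i, i + batchSize).join('\n');
    responses.push(await addDocumentsNdjson(batch, primaryKey));
  }
  return responses;
}
```

A CSV variant would additionally need to repeat the header row at the top of every batch so each chunk stays a valid CSV document.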
Example of PRs:

- CSV
- NDJSON
- meilisearch-python#329

Related to: meilisearch/integration-guides#146
If this issue is partially/completely implemented, feel free to let us know.