Description
I'm implementing bulk insert using the `pq.CopyIn` feature. One of the columns is of `jsonb` type and it contains non-Latin (Cyrillic) symbols. This JSON can be inserted with a regular `INSERT`, but when I wrapped the code with `pq.CopyIn`, I got the error shown below. When I insert the same JSON with Latin-only strings, everything works properly.
code example
what I want to see
Inserted successfully
what I get
Same code with `Payload{Title: "title"}` inserts successfully.

Research
I've discovered that this issue happens because, in `pq.CopyIn` mode, input is encoded as text using `appendEncodedText`. So Cyrillic symbols are encoded like this:
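The encoded bytes were not preserved here, but the likely mechanism can be illustrated. In PostgreSQL's legacy bytea "escape" text format, printable ASCII passes through unchanged while every other byte is replaced with a three-digit octal escape, so the multi-byte UTF-8 sequences of Cyrillic characters come out as `\320\277`-style escapes that are no longer valid JSON. A self-contained sketch of that escaping (this reimplements the idea for illustration, not pq's exact code):

```go
package main

import "fmt"

// escapeBytea mimics the legacy PostgreSQL bytea "escape" text format:
// printable ASCII passes through, backslashes are doubled, and every
// other byte becomes a three-digit octal escape.
func escapeBytea(v []byte) []byte {
	var out []byte
	for _, b := range v {
		switch {
		case b == '\\':
			out = append(out, '\\', '\\')
		case b < 0x20 || b > 0x7e:
			out = append(out, []byte(fmt.Sprintf("\\%03o", b))...)
		default:
			out = append(out, b)
		}
	}
	return out
}

func main() {
	latin := []byte(`{"title": "title"}`)
	cyrillic := []byte(`{"title": "Заголовок"}`)

	// Latin-only JSON survives the escaping unchanged...
	fmt.Println(string(escapeBytea(latin)))
	// ...but each Cyrillic character turns into octal escapes, which
	// PostgreSQL then rejects as invalid input for a jsonb column.
	fmt.Println(string(escapeBytea(cyrillic)))
}
```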
Apparently PostgreSQL does not accept such JSON and returns the error.

This is why no issue appears with the same JSON when the `title` field has a Latin-only value.

On the other hand, I've checked how a regular `INSERT` handles this case: it uses `encode`, so the JSON is simply not encoded with `encodeBytea`.

Are there any workarounds for this issue?