Call pgconn_set_internal_encoding_index in all branches of pgconn_set_default_encoding #541
This PR reintroduces a call to `pgconn_set_internal_encoding_index` in `pgconn_set_default_encoding`. The call was removed in version `1.3.0` (specifically in this PR).

When `Encoding.default_internal` is set, ruby-pg sends a `SET client_encoding TO ...` query. If that query succeeds, everything works great. If it fails and the connection is closed, nothing breaks either.

Now, assuming Ruby and Postgres are both using UTF-8, if that query fails AND the connection remains open, we end up with a desynchronized encoding between Ruby and ruby-pg: Ruby's encoding is UTF-8 and Postgres's encoding is UTF-8, but ruby-pg falls back to `SQL_ASCII` (I think?), causing it to encode the incoming UTF-8 strings from Postgres as ASCII.

Version `1.2.3` handles encoding query failure gracefully. The following output shows this behavior in action in versions `1.2.3`, `1.5.3`, and a patched version of `1.5.3`.
This repo has a docker stack that reproduces that behavior and demonstrates the fix.
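The symptom described above can be sketched in plain Ruby without a database. This is a minimal illustration, not ruby-pg's actual code: `force_encoding` stands in for the library tagging result strings with the wrong encoding after the `SET client_encoding` query fails.

```ruby
# What Postgres actually sends over the wire: UTF-8 text.
utf8 = "héllo"

# What the application sees when ruby-pg falls back to SQL_ASCII:
# the same bytes, but tagged as ASCII-8BIT (binary) instead of UTF-8.
ascii = utf8.dup.force_encoding(Encoding::ASCII_8BIT)

puts utf8.encoding   # => UTF-8
puts ascii.encoding  # => ASCII-8BIT

# Identical bytes, but the strings no longer compare equal, because
# a non-7-bit binary string is not comparable with a UTF-8 one.
puts utf8 == ascii               # => false
puts utf8.bytes == ascii.bytes   # => true
```

Downstream code that concatenates such a mistagged string with UTF-8 text raises `Encoding::CompatibilityError`, which is how this desynchronization typically surfaces in applications.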