MapScan values are garbled #225
Comments
Can someone help me with this? I found a related issue here, but it looks like this should have been solved by now. Maybe I'm just not getting it. Any help would be appreciated. |
Hi @volkanunsal, can you replicate this on an sqlite3 database? I received a similar report about base64-encoded values in MapScan in the past (#191), but I wasn't really sure how to replicate it. If you can give me some more detail about your database, schema, and query, I can look into it, especially if the database is one of the three in the sqlx test suite (postgres, sqlite3, or mysql). |
@jmoiron It happens with postgres. I'm not exactly sure how to create a test case for it in sqlite, but I'll give it a shot this weekend. |
The schema of the table is very simple.

The query is

This is the part where I'm encoding it into JSON. Could this be where the encoding is getting messed up?

```go
type rowMap map[string]interface{}
type fieldMap map[string]string
type fieldsMap map[string]fieldMap

type successPayload struct {
	Rows      []rowMap  `json:"rows"`
	Fields    fieldsMap `json:"fields"`
	Elapsed   float64   `json:"time"`
	TotalRows int64     `json:"total_rows"`
}

res := successPayload{scannedRows, fields, elapsed, count}

// Write it back to the client.
w.Header().Set("Content-Type", "application/json; charset=utf-8")
json.NewEncoder(w).Encode(res)
```
|
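For what it's worth, the base64 comes from encoding/json itself, not from sqlx: a []byte stored in an interface{} is marshaled as a base64 string. A minimal standalone sketch of the symptom (the literal values are made up for illustration):

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// MapScan stores whatever the driver returns; for text columns many
	// drivers hand back []byte rather than string.
	row := map[string]interface{}{
		"id":   int64(1),
		"name": []byte("volkan"),
	}
	out, _ := json.Marshal(row)
	// encoding/json encodes []byte as base64:
	fmt.Println(string(out)) // {"id":1,"name":"dm9sa2Fu"}
}
```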
I just tried it with sqlite3 using the following schema.

```sql
CREATE TABLE `users` (
  `id`   INTEGER,
  `name` VARCHAR(64) NULL
);
```

In sqlite3, the id field works, but the name field is still encoded in base64. Here is my test case: https://gist.github.com/volkanunsal/e5b84aef87317fb4dff75c97f4c875a8 |
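For reference, a minimal reproduction along the lines of that schema might look like the sketch below (assuming the mattn/go-sqlite3 driver; printing the concrete types shows what MapScan actually stores):

```go
package main

import (
	"fmt"
	"log"

	"github.com/jmoiron/sqlx"
	_ "github.com/mattn/go-sqlite3"
)

func main() {
	db := sqlx.MustConnect("sqlite3", ":memory:")
	db.MustExec("CREATE TABLE users (id INTEGER, name VARCHAR(64) NULL)")
	db.MustExec("INSERT INTO users VALUES (1, 'alice')")

	rows, err := db.Queryx("SELECT * FROM users")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		m := map[string]interface{}{}
		if err := rows.MapScan(m); err != nil {
			log.Fatal(err)
		}
		// Per the report above, id comes back as int64 but name as []byte.
		fmt.Printf("id: %T  name: %T\n", m["id"], m["name"])
	}
}
```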
@jmoiron Have you had a chance to look at the example above? Do you see any clues as to what might be happening? Is there anything I can do to fix this on my own end? Thanks. |
Definitely seeing this with a MySQL backend. |
I have the same problem with SliceScan(): it returns []byte for string values, which the JSON encoder then converts to base64. My workaround is to manually convert them to string before JSON encoding. |
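A sketch of that workaround, written as a small helper (the function name and shape are just illustrative):

```go
package example

import "github.com/jmoiron/sqlx"

// sliceScanStrings wraps SliceScan and converts any []byte values to
// string, so a later json.Marshal emits plain text instead of base64.
func sliceScanStrings(rows *sqlx.Rows) ([]interface{}, error) {
	vals, err := rows.SliceScan()
	if err != nil {
		return nil, err
	}
	for i, v := range vals {
		if b, ok := v.([]byte); ok {
			vals[i] = string(b)
		}
	}
	return vals, nil
}
```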
I got the same error with mysql for MapScan() and for SliceScan() as well. |
This may be related to prepared queries vs non-prepared queries. Try preparing the queries first and see if it fixes the problem. |
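In sqlx terms that suggestion would look roughly like the sketch below (the query and argument are placeholders; whether the prepared path changes the returned types depends on the driver):

```go
package example

import (
	"fmt"
	"log"

	"github.com/jmoiron/sqlx"
)

// queryPrepared runs the query through a prepared statement and prints the
// concrete types MapScan produces, for comparison with the non-prepared path.
func queryPrepared(db *sqlx.DB) {
	stmt, err := db.Preparex("SELECT id, name FROM users WHERE id = ?")
	if err != nil {
		log.Fatal(err)
	}
	defer stmt.Close()

	rows, err := stmt.Queryx(1)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		m := map[string]interface{}{}
		if err := rows.MapScan(m); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%#v\n", m)
	}
}
```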
When performing a SELECT combined with Rows.MapScan, it base64-encodes the Postgres JSON data type instead of either a) parsing it or b) returning it as a string unaltered. Any advice? |
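One workaround on the caller's side (not sqlx-specific): after MapScan, re-type the columns you know contain JSON as json.RawMessage, so encoding/json embeds them verbatim instead of base64-encoding the []byte. A sketch, where the column names passed in are up to the caller:

```go
package example

import (
	"encoding/json"

	"github.com/jmoiron/sqlx"
)

// mapScanJSON scans one row into a map and re-types the named json/jsonb
// columns as json.RawMessage so they marshal as JSON objects, not base64.
func mapScanJSON(rows *sqlx.Rows, jsonCols ...string) (map[string]interface{}, error) {
	m := map[string]interface{}{}
	if err := rows.MapScan(m); err != nil {
		return nil, err
	}
	for _, col := range jsonCols {
		if b, ok := m[col].([]byte); ok {
			m[col] = json.RawMessage(b)
		}
	}
	return m, nil
}
```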
The standard encoding/json package encodes byte slices as base64.

Reproduce sample: `SHOW FULL COLUMNS FROM _tablename_`, Go 1.9 (rows.Scan => []uint8):

```go
type dummy struct {
	Field      *string `db:"Field"`
	Type       *string `db:"Type"`
	Collation  *string `db:"Collation"`
	Null       *string `db:"Null"`
	Key        *string `db:"Key"`
	Default    *string `db:"Default"`
	Extra      *string `db:"Extra"`
	Privileges *string `db:"Privileges"`
	Comment    *string `db:"Comment"`
}
```

For the fields, see the doc for Rows.Scan in database/sql/sql.go. |
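For context, the Rows.Scan documentation in database/sql says that scanning into *interface{} copies whatever value the driver provides without conversion, which for many drivers is []byte, while scanning into a typed destination such as *string forces a conversion. That is why the typed struct above behaves differently from MapScan. A sketch of using it with sqlx's StructScan (assuming the dummy struct above is in scope; the table name is a placeholder):

```go
package example

import "github.com/jmoiron/sqlx"

// showFullColumns scans each row of SHOW FULL COLUMNS into the typed
// struct, so every field arrives as *string rather than []uint8.
func showFullColumns(db *sqlx.DB, table string) ([]dummy, error) {
	rows, err := db.Queryx("SHOW FULL COLUMNS FROM " + table) // illustrative only; don't build SQL from untrusted input
	if err != nil {
		return nil, err
	}
	defer rows.Close()

	var out []dummy
	for rows.Next() {
		var d dummy
		if err := rows.StructScan(&d); err != nil {
			return nil, err
		}
		out = append(out, d)
	}
	return out, rows.Err()
}
```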
I also encountered this problem, so I transform the fields like this. It works:

```go
tmp := make(map[string]interface{})
rows.MapScan(tmp)
for k, encoded := range tmp {
	switch encoded.(type) {
	case []byte:
		tmp[k] = string(encoded.([]byte))
	}
}
```
|
How do I solve this problem? |
```go
// Scan columns whose ScanType is RawBytes into *sql.RawBytes, then
// convert them to string afterwards.
columns, _ := rows.Columns()
columnTypes, _ := rows.ColumnTypes()
columnPointers := make([]interface{}, len(columns))
for i := 0; i < len(columns); i++ {
	t := columnTypes[i].ScanType()
	// ...
	if t.Name() == "RawBytes" {
		columnPointers[i] = new(sql.RawBytes)
	}
	// ...
}
// ...
for col, result := range results {
	switch result.(type) {
	case *sql.RawBytes:
		v, _ := result.(*sql.RawBytes)
		results[col] = string(*v)
		// ...
	}
}
```
|
I'm also having this problem. 10.2.32-MariaDB =-( |
Facing the same problem with MySQL: the data type after doing a MapScan is []byte, whereas the actual types in the DB are int and varchar. |
Also experiencing this with AWS Aurora (MySQL). Any good solutions? |
Have the same issue with MySQL |
My solution:

```go
for rows.Next() {
	data := make(map[string]interface{})
	err = rows.MapScan(data)
	if err != nil {
		fmt.Println(err, sqlStr, params.params)
	}
	for k, v := range data {
		if value, ok := v.([]byte); ok {
			data[k] = string(value)
		}
	}
	res = append(res, data)
}
```
|
There are two problems.

1. `values := make([]interface{}, len(columns))`

Maybe database/sql can do the conversion, if a small backward incompatibility is acceptable. |
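Until something like that lands, the conversion can also be pushed to scan time on the caller's side with a small sql.Scanner implementation. Note this helps when scanning into typed fields (Scan/StructScan), not with MapScan, which always scans into interface{}. A hedged sketch (the type name is made up):

```go
package example

import (
	"database/sql"
	"fmt"
)

// Text accepts both string and []byte driver values and stores a plain
// string, so json.Marshal never sees a byte slice.
type Text string

// Scan implements sql.Scanner.
func (t *Text) Scan(src interface{}) error {
	switch v := src.(type) {
	case nil:
		*t = ""
	case string:
		*t = Text(v)
	case []byte:
		*t = Text(v)
	default:
		return fmt.Errorf("cannot scan %T into Text", src)
	}
	return nil
}

// Compile-time check that *Text satisfies sql.Scanner.
var _ sql.Scanner = (*Text)(nil)
```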
I'm implementing a SQL server and I got a strange result from using MapScan. It looks garbled:

I wonder if there is something I'm missing about how to use MapScan? Can you help me understand what's going wrong here? Here is my code: