- Bug fix for issue 193: convertNamedValuesToSparkParams was incorrectly creating a Spark parameter value as "%!s()" when a named param was nil (databricks#199 by @esdrasbeleza)
- Fix formatting of *float64 parameters (databricks#215 by @esdrasbeleza)
- Added OAuth support for GCP (databricks#189 by @rcypher-databricks)
- Staging operations: stream files instead of loading into memory (databricks#197 by @mdibaiee)
- Staging operations: don't panic on REMOVE (databricks#205 by @candiduslynx)
- Fix formatting of Date/Time query parameters (databricks#207 by @candiduslynx)
- Bug fix for ArrowBatchIterator.HasNext(), which incorrectly returned true for result sets with zero rows
- Added .us domain to inference list for AWS OAuth
- Bug fix for OAuth M2M scopes: updated the M2M authenticator to use the "all-apis" scope
- Logging improvements
- Added handling for staging remove
- Named parameter support
- Better handling of bad connection errors and specifying server protocol
- OAuth implementation
- Expose Arrow batches to users
- Add support for staging operations
- Improve error information when query terminates in unexpected state
- Do not override global logger time format
- Enable Transport configuration for http client
- Fix: update arrow to v12
- Updated doc.go for retrieving query id and connection id
- Bug fix for issue 147: bug when reading a table that contains a copied map
- Allow WithServerHostname to specify protocol
- Bug fix for panic when executing non-record-producing statements using DB.Query()/DB.Exec()
- Allow client-provided authenticator
- More robust retry behaviour
- Bug fix for null values in complex types
- Improved error types and info
- Feat: support retrying on specific failures
- Fetch results in arrow format
- Improve error message and retry behaviour
- Fix cancel race condition
- Package doc (doc.go)
- Handle FLOAT values as float32
- Fix for result.AffectedRows
- Use new ctx when closing operation after cancel
- Set default port to 443
- Package doc (doc.go)
- Handle FLOAT values as float32
- Fix for result.AffectedRows
- Add or edit documentation above methods
- Tweaks to readme
- Use new ctx when closing operation after cancel
- Handle parsing negative years in dates
- Fix thread safety issue
- Don't ignore error in InitThriftClient
- Close optimization for Rows
- Close operation after executing statement
- Minor change to examples
- P&R improvements
- Fix thread safety issue in connector
- Support for DirectResults
- Support for context cancellation and timeout
- Session parameters (e.g.: timezone)
- Thrift Protocol update
- Several logging improvements
- Added better examples. See workflow
- Added dbsql.NewConnector() function to help initialize DB
- Many other small improvements and bug fixes
- Removed support for client-side query parameterization
- Removed need to start DSN with "databricks://"
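With the "databricks://" prefix gone, a DSN follows the driver's `token:...@host:port/http-path` shape. A hedged sketch with hypothetical placeholders; exact query parameters vary:

```
token:[your_token]@[hostname]:[port]/[http_path]?timezone=UTC
```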
- Fix: Could not fetch rowsets greater than the value of maxRows (#18)
- Updated default user agent
- Updated README and CONTRIBUTING
- Add escaping of string parameters
- Fix timeout units to be milliseconds instead of nanoseconds
- Fix module name
- Initial release