- Added support for Python 3.13.
- Dropped support for Python 3.8, which has reached end of life. Python 3.9+ is now required.
- `PipelineApiClient.get_pipeline_instances` now supports querying pipeline instances by dataset ID and version.
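
  A minimal sketch of how the new filtering might be used. The keyword argument names, module path, and client setup are assumptions based on this entry, not confirmed signatures:

  ```python
  from okdata.sdk.config import Config
  # Module path is an assumption.
  from okdata.sdk.pipelines.client import PipelineApiClient

  pipeline_client = PipelineApiClient(config=Config(env="dev"))

  # Hypothetical: list the pipeline instances bound to one dataset version.
  instances = pipeline_client.get_pipeline_instances(
      dataset_id="my-dataset", version="1"
  )
  ```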
- New methods `Dataset.delete_version`, `Dataset.delete_edition`, and `Dataset.delete_distribution` for deleting dataset versions, editions, and distributions respectively.
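
  A hypothetical usage sketch; the entry only names the methods, so the argument lists below are assumptions:

  ```python
  from okdata.sdk.config import Config
  from okdata.sdk.data.dataset import Dataset  # module path is an assumption

  dataset_client = Dataset(config=Config(env="dev"))

  # Hypothetical order: delete the leaves before their containers.
  dataset_client.delete_distribution("my-dataset", "1", "20240101T120000", "dist-id")
  dataset_client.delete_edition("my-dataset", "1", "20240101T120000")
  dataset_client.delete_version("my-dataset", "1")
  ```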
- Removed dependency on the vulnerable (and seemingly abandoned) python-jose library.
- PyJWT is no longer a dependency.
- New method `Dataset.auto_create_edition` for creating a new edition with an automatic name based on the current time.
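
  A short hypothetical sketch; only the method name comes from this entry, the arguments and client setup are assumptions:

  ```python
  from okdata.sdk.config import Config
  from okdata.sdk.data.dataset import Dataset  # module path is an assumption

  dataset_client = Dataset(config=Config(env="dev"))

  # Hypothetical: the edition name is derived from the current timestamp,
  # so no edition body needs to be supplied.
  edition = dataset_client.auto_create_edition("my-dataset", "1")
  ```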
- Added support for Python 3.12.
- Dropped support for Python 3.7, which has reached end of life. Python 3.8+ is now required.
- Fixed version requirement for urllib3.
- The optional search filter in `Dataset.get_datasets` has been relaxed to allow matches anywhere in the dataset name (instead of only at the beginning). In addition, it now also searches the dataset's ID.
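
  A hypothetical sketch of the relaxed filter; the keyword name `filter`, the module path, and the client setup are assumptions:

  ```python
  from okdata.sdk.config import Config
  from okdata.sdk.data.dataset import Dataset  # module path is an assumption

  dataset_client = Dataset(config=Config(env="dev"))

  # Hypothetical: "bike" now matches anywhere in the dataset name, and the
  # search also considers dataset IDs.
  matches = dataset_client.get_datasets(filter="bike")
  ```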
- Added support for Python 3.11.
- New method `TeamClient.get_user_by_username` for looking up Keycloak users.
- New method `TeamClient.update_team_members` for adding and/or removing members to/from a team.
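
  A hypothetical sketch of the two new methods above; the argument shapes, module path, and client setup are assumptions:

  ```python
  from okdata.sdk.config import Config
  from okdata.sdk.team.client import TeamClient  # module path is an assumption

  team_client = TeamClient(config=Config(env="dev"))

  # Hypothetical: look up a Keycloak user, then set a team's member list.
  user = team_client.get_user_by_username("jane.doe")
  team_client.update_team_members("some-team-id", ["jane.doe", "john.doe"])
  ```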
- Fixed a bug in `TeamClient.update_team_attribute` when `value` is falsy.
- New parameter `include` added to `TeamClient.get_teams`.
- New method `TeamClient.get_team_members` for getting the members of a team.
- New methods `TeamClient.update_team_name` and `TeamClient.update_team_attribute` for updating team names and their attributes, respectively.
- New method `TeamClient.get_team_by_name`.
- Fixed a deprecation warning from urllib3.
- A new client class `TeamClient` for retrieving information about teams has been added.
- The classes `PostEvent`, `EventStreamClient`, and `ElasticsearchQueries` for working with event streams have all been removed.
- The `WebhookClient` class for managing webhooks has been removed.
- The `SimpleDatasetAuthorizerClient` class was deprecated and has been removed.
- Added support for Python 3.10.
- Dropped support for Python 3.6.
- Use built-in JSON encoder from `requests` for `post`, `put` and `patch` methods.
- Handle error responses from status API update calls.
- Tweaked the webhook client to create and authorize tokens based on operations (read, write) instead of tying them to a specific service.
- Add new SDK `WebhookClient` for managing webhook tokens with okdata-permission-api.
- Fix handling of Keycloak tokens that don't contain a refresh token.
- Added support for the new permission API.
- Retries have been re-enabled for low-level network errors (connection errors, read errors, and redirects). The `retry` parameter now only controls the maximum number of retries to perform on bad HTTP status codes.
- Fix refresh of the Keycloak access token when the refresh token is invalid, e.g. due to an inactive session caused by a Keycloak server restart.
- `Dataset.update_dataset` now supports partial metadata updates when the keyword argument `partial` is true.
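
  A hypothetical sketch of a partial update; apart from the `partial` keyword, the argument names, module path, and client setup are assumptions:

  ```python
  from okdata.sdk.config import Config
  from okdata.sdk.data.dataset import Dataset  # module path is an assumption

  dataset_client = Dataset(config=Config(env="dev"))

  # Hypothetical: change only the title, leaving other metadata untouched.
  dataset_client.update_dataset("my-dataset", {"title": "A nicer title"}, partial=True)
  ```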
- PyJWT 2.0.0 or above is now required.
- Authentication is no longer necessary for downloading public ("green") datasets.
- `PostEvent.post_event` now also supports retries.
- Added `Dataset.get_distribution` method.
- Added `Dataset.update_*` methods for updating dataset, version, edition and distribution metadata.
- Added retry parameter to SDK methods.
- The `confidentiality` metadata field is now fully replaced by `accessRights`.
- The `okdata` namespace package now uses the old-style `pkg_resources` declaration instead of being an implicit namespace package.
- Added `Status.update_status` method.
- Rename project to `okdata-sdk`.
  - Repository name will be `okdata-sdk-python`
  - PyPI package `okdata-sdk`
  - Python modules will reside in `okdata.sdk.*`, where `okdata` is an implicit namespace.
- No changes
- Modules reorganized to leave the top level `origo` namespace empty; all modules are now under `origo.sdk`. I.e. `from origo.data.upload import Upload` now becomes `from origo.sdk.data.upload import Upload`. This allows other packages to add their modules under the `origo` namespace.
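
  The same change expressed as code, using the imports given in this entry:

  ```python
  # Before this release:
  # from origo.data.upload import Upload

  # After this release, the module lives under the `origo.sdk` namespace:
  from origo.sdk.data.upload import Upload
  ```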
- Support the new status API response format. This affects the upload command response, now returning a `trace_id` key instead of `status`.
- `PipelineInstance` now takes an optional parameter `pipelineProcessorId`, intended to supersede `pipelineArn` once all users have been updated. The `pipelineArn` parameter is now optional.
- The `taskConfig` to `PipelineInstance` is now also optional.
- `PipelineInstance` no longer accepts the obsolete parameters `schemaId`, `transformation`, and `useLatestEdition`.
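
  Taken together, the `PipelineInstance` changes above might look like this in use. This is a hypothetical sketch: apart from `pipelineProcessorId`, `pipelineArn`, and `taskConfig`, every name and value below is invented for illustration, and the import paths are assumptions:

  ```python
  from origo.config import Config  # module path is an assumption
  # Module path is an assumption.
  from origo.pipelines.resources.pipeline_instance import PipelineInstance

  sdk_config = Config(env="dev")

  # Hypothetical construction under the new rules: pipelineProcessorId is
  # preferred over pipelineArn, taskConfig may be omitted, and schemaId,
  # transformation and useLatestEdition are no longer accepted.
  instance = PipelineInstance(
      sdk_config,
      id="my-dataset-instance",          # illustrative only
      datasetUri="output/my-dataset/1",  # illustrative only
      pipelineProcessorId="data-copy",
  )
  ```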
- Enable, disable and get event stream sinks by type (not id).
- Add function for listing webhook tokens for a dataset.
- Rename event stream methods:
  - `enable_subscribable` -> `enable_subscription`
  - `disable_subscribable` -> `disable_subscription`
  - `add_sink` -> `enable_sink`
  - `remove_sink` -> `disable_sink`
- Add event stream API support.
- Use the same default value for `cacheCredentials` in both development and production environments.
- Update Makefile for better Python 3 compatibility.
- Add function for downloading data.
- Call `raise_for_status()` when not using SDK class.
- Reintroduce Python 3.6 support.
- Check if the refresh token is expired before calling `KeycloakOpenID.refresh_token()`.
- Change default environment to production.
- Add `simple-dataset-authorizer`.
- Get default value from `s3SignedData` if it doesn't exist.
- Return both status and status ID for uploads to a dataset.
- Add support for `taskConfig`.
- Add support for status API.
- Remove `DataExistError`.
- Use `self.post()` in `PostEvent`.
- Use `response.raise_for_status()` as sole error handling in SDK.
- Use `python-keycloak`.
- Use API gateway mapped URL for Elasticsearch queries.
- Add Elasticsearch queries SDK.
- Update documentation with new code examples and structure.
- Add stream manager to SDK.
- Distribute JSON schemas for pipeline inputs and schemas.
- No changes
- Schema bugfix.
- Fix homepage URL in `setup.py`.
- Add schema.
- Add code examples for creating datasets, uploading files, and sending events.
- Move developer documentation to separate file.
- Add support for pipeline input.
- Update README.
- Export schema for pipeline and instance.
- Add possibility to create distribution via API.
- Use correct production value for `s3BucketUrl`.
- Use correct resource name for pipeline instances.
- Update tests to match new filter implementation.
- Add filter possibilities on dataset list.
- Add helpers for pipeline API.
- Add pipeline instances to SDK client.
- Add pipeline API SDK client for pipelines.
- Run `is-git-clean` before `bump-patch`.
- Add constructor argument for custom configuration object.
- Use correct production URLs.
- Add `__init__.py` to `event` module.
- No changes
- Include all packages, but exclude tests, in `setup.py`.
- Bump version before building.
- Automatic tests with GitHub Actions.
- Add make targets for publishing to PyPI.
- Add classifiers for license.
- Initial release.