Download Sample Datasets #3725

Merged

merged 29 commits on Mar 6, 2019
Changes from 25 commits

Commits
567ab8f
[WIP] download demo dataset
fm3 Feb 4, 2019
ab27fcd
prepare download + guard
fm3 Feb 11, 2019
0c95bcd
Merge branch 'master' into demo-dataset
fm3 Feb 11, 2019
0df51af
list available sample datasets, enable download
fm3 Feb 12, 2019
d7a243e
Merge branch 'master' of github.com:scalableminds/webknossos into dem…
daniel-wer Feb 14, 2019
e2ee248
add sample datasets modal, add temp button to datasets view to open m…
daniel-wer Feb 14, 2019
7c1e3ac
fix typo in downloading status string
daniel-wer Feb 14, 2019
16fdaa5
fix reporting of downloading state
fm3 Feb 18, 2019
8fba42d
Merge branch 'master' of github.com:scalableminds/webknossos into dem…
daniel-wer Feb 18, 2019
d2a57be
use all datastores for sample dataset downloads, fix error case
daniel-wer Feb 18, 2019
7f4884b
deterministic sorting for datastore list (by name)
fm3 Feb 19, 2019
1be1b42
use single datastore, detect failed datasets, refactoring
daniel-wer Feb 19, 2019
06609aa
rework onboarding dataset step and dataset list placeholder, add link…
daniel-wer Feb 20, 2019
a268dc9
Update messages
fm3 Feb 21, 2019
756d4fb
merge master into demo-dataset
fm3 Feb 22, 2019
c49628d
Merge branch 'master' of github.com:scalableminds/webknossos into dem…
daniel-wer Feb 25, 2019
9bf5e61
refactor sample dataset modal state management according to PR feedback
daniel-wer Feb 25, 2019
054e84b
create generic useFetch method, make sure opening the modal with down…
daniel-wer Feb 26, 2019
155825a
Merge branch 'master' of github.com:scalableminds/webknossos into dem…
daniel-wer Feb 26, 2019
23caa4c
small style tweaks for onboarding dataset view
daniel-wer Feb 26, 2019
75caaea
insert actual sample datasets
fm3 Feb 27, 2019
93ac035
add description to sample datasets modal - untested
daniel-wer Feb 28, 2019
b0d5b50
preserve whitespace in sample datasets modal
daniel-wer Feb 28, 2019
2c7564e
Merge branch 'master' of github.com:scalableminds/webknossos into dem…
daniel-wer Feb 28, 2019
c9cf39f
Merge branch 'master' into demo-dataset
fm3 Mar 4, 2019
171b681
update docs + changelog
fm3 Mar 5, 2019
835748d
merge
fm3 Mar 5, 2019
96d046b
merge master into demo-dataset
fm3 Mar 5, 2019
5300199
Merge branch 'master' into demo-dataset
fm3 Mar 6, 2019
6 changes: 3 additions & 3 deletions CHANGELOG.md
@@ -10,16 +10,16 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.md).
[Commits](https://github.com/scalableminds/webknossos/compare/19.03.0...HEAD)

### Added
-
- webKnossos now comes with a list of sample datasets that can be automatically downloaded and imported from the menu. [#3725](https://github.com/scalableminds/webknossos/pull/3725)

### Changed
- The brush size will now be remembered across page reloads. [#3827](https://github.com/scalableminds/webknossos/pull/3827)

### Fixed
-
-

### Removed
-
-


## [19.03.0](https://github.com/scalableminds/webknossos/releases/tag/19.03.0) - 2019-02-25
6 changes: 6 additions & 0 deletions app/models/binary/DataStore.scala
@@ -82,6 +82,12 @@ class DataStoreDAO @Inject()(sqlClient: SQLClient)(implicit ec: ExecutionContext
parsed
}

override def findAll(implicit ctx: DBAccessContext): Fox[List[DataStore]] =
for {
r <- run(sql"select #${columns} from webknossos.datastores_ order by name".as[DatastoresRow])
parsed <- Fox.combined(r.toList.map(parse))
} yield parsed

def updateUrlByName(name: String, url: String)(implicit ctx: DBAccessContext): Fox[Unit] = {
val q = for { row <- Datastores if notdel(row) && row.name === name } yield row.url
for { _ <- run(q.update(url)) } yield ()
3 changes: 3 additions & 0 deletions conf/messages
@@ -82,6 +82,7 @@ dataSet.notImported=Dataset {0} is not imported
dataSet.name.invalid=A dataset name can only contain letters, digits and underscores
dataSet.import.impossible.name=Import impossible. Dataset name can only consist of a-Z, 0-9, "-" and "_".
dataSet.name.alreadyTaken=This dataset name is already in use.
dataSet.name.notInSamples=This dataset name is not one of the available sample datasets.
dataSet.source.notFound=The data source for the data source couldn’t be found
dataSet.import.success=The import of the dataset was successful
dataSet.import.failed=Failed to import the dataset
@@ -93,6 +94,8 @@ dataSet.list.failed=Failed to retrieve list of data sets.
dataSet.url.missing=URL missing in the supplied json
dataSet.dataStore.missing=dataStore missing in the supplied json
dataSet.dataSet.missing=dataSet missing in the supplied json
dataSet.downloadAlreadyRunning=Sample dataset download is already running.
dataSet.alreadyPresent=Sample dataset is already present.

dataSource.notFound=Datasource not found on datastore server

25 changes: 24 additions & 1 deletion docs/datasets.md
@@ -42,6 +42,29 @@ The dataset is now ready to use.
If you uploaded the dataset along with a `datasource-properties.json` metadata file the dataset will be imported automatically without any additional manual steps.
{% endhint %}

### Sample Datasets

A list of sample datasets is provided with webKnossos. Click `Add a Sample Dataset` on the upload page to access it and choose datasets to be added and imported automatically. The three sample datasets currently available are:

- Sample_e2006_wkw: https://static.webknossos.org/data/e2006_wkw.zip
Raw SBEM data and segmentation (sample cutout, 120MB).
Connectomic reconstruction of the inner plexiform layer in the mouse retina.
M Helmstaedter, KL Briggman, S Turaga, V Jain, HS Seung, W Denk.
Nature. 08 August 2013. https://doi.org/10.1038/nature12346

- Sample_FD0144_wkw: https://static.webknossos.org/data/FD0144_wkw.zip
Raw SBEM data and segmentation (sample cutout, 316 MB).
FluoEM, virtual labeling of axons in three-dimensional electron microscopy data for long-range connectomics.
F Drawitsch, A Karimi, KM Boergens, M Helmstaedter.
eLife. 14 August 2018. https://doi.org/10.7554/eLife.38976

- Sample_MPRAGE_250um: https://static.webknossos.org/data/MPRAGE_250um.zip
MRI data (250 MB).
T1-weighted in vivo human whole brain MRI dataset with an ultrahigh isotropic resolution of 250 μm.
F Lüsebrink, A Sciarra, H Mattern, R Yakupov, O Speck.
Scientific Data. 14 March 2017. https://doi.org/10.1038/sdata.2017.32


## Edit Dataset
You can edit the properties of a dataset at any time.
In addition to the required properties that you need to fill in during import, there are more advanced properties that you can set.
@@ -81,4 +104,4 @@ Read more in the [Sharing guide](./sharing.md#dataset-sharing)
## Using External Datastores
The system architecture of webKnossos allows for versatile deployment options where you can install a dedicated datastore server directly on your lab's cluster infrastructure.
This may be useful when dealing with large datasets that should remain in your data center.
[Please contact us](mailto:[email protected]) if you require any assistance with your setup.
[Please contact us](mailto:[email protected]) if you require any assistance with your setup.
15 changes: 7 additions & 8 deletions docs/getting_started.md
@@ -30,7 +30,7 @@ This will create a folder for your data at `webknossos/binaryData/<organization

{% hint style='info' %}
For production setups, we recommend more elaborate configurations with a public domain name and HTTPS support.
[Please contact us](mailto:[email protected]) if you require any assistance with your production setup.
[Please contact us](mailto:[email protected]) if you require any assistance with your production setup.
{% endhint %}

You may also install webKnossos without Docker.
@@ -59,8 +59,7 @@ For small datasets (max. 1GB), you can use the upload functionality provide in t
For larger datasets, we recommend the file system upload.
Read more about the import functionality in the [Datasets guide](./datasets.md).

If you do not have a compatible dataset available, you can use [this small dataset excerpt (300 MB)](https://static.webknossos.org/data/e2006_wkw.zip) for testing purposes.
The data was published by [Helmstaedter et al., 2011](https://www.nature.com/articles/nn.2868).
If you do not have a compatible dataset available, you can use one of the [sample datasets](./datasets.md#sample-datasets) for testing purposes.

By default, datasets are visible to all users in your organization.
However, webKnossos includes fine-grained permissions to assign datasets to groups of users.
@@ -70,12 +69,12 @@ However, webKnossos includes fine-grained permissions to assign datasets to grou


## Your First Annotation
To get started with your first annotation, navigate to the `Datasets` tab on your [dashboard](./dashboard.md).
Identify a dataset that your interested in and click on `Start Skeleton Tracing` to create a new skeleton annotation.
webKnossos will launch the main annotation screen allowing you to navigate your dataset and place markers to reconstruct skeletons.
To get started with your first annotation, navigate to the `Datasets` tab on your [dashboard](./dashboard.md).
Identify a dataset that your interested in and click on `Start Skeleton Tracing` to create a new skeleton annotation.

Review comment (Member):
I know that you only changed the whitespace here, but it should be you are 🙈

webKnossos will launch the main annotation screen allowing you to navigate your dataset and place markers to reconstruct skeletons.

Drag the mouse while pressing the left mouse button to navigate the dataset.
Right-click in the data to place markers, called nodes.
Drag the mouse while pressing the left mouse button to navigate the dataset.
Right-click in the data to place markers, called nodes.
Basic movement in the dataset is done with the mouse wheel or by pressing the spacebar keyboard shortcut.

Learn more about the skeleton, volume, and hybrid annotations as well as the interface in the [Tracing UI guide](./tracing_ui.md).
23 changes: 23 additions & 0 deletions frontend/javascripts/admin/admin_rest_api.js
@@ -21,6 +21,7 @@ import {
type APIProjectProgressReport,
type APIProjectUpdater,
type APIProjectWithAssignments,
type APISampleDataset,
type APIScript,
type APIScriptCreator,
type APIScriptUpdater,
@@ -904,6 +905,28 @@ export async function getMappingsForDatasetLayer(
);
}

export function getSampleDatasets(
datastoreUrl: string,
organizationName: string,
): Promise<Array<APISampleDataset>> {
return doWithToken(token =>
Request.receiveJSON(`${datastoreUrl}/data/datasets/sample/${organizationName}?token=${token}`),
);
}

export async function triggerSampleDatasetDownload(
datastoreUrl: string,
organizationName: string,
datasetName: string,
) {
await doWithToken(token =>
Request.triggerRequest(
`${datastoreUrl}/data/datasets/sample/${organizationName}/${datasetName}/download?token=${token}`,
{ method: "POST" },
),
);
}

// #### Datastores
export async function getDatastores(): Promise<Array<APIDataStore>> {
const datastores = await Request.receiveJSON("/api/datastores");
6 changes: 6 additions & 0 deletions frontend/javascripts/admin/api_flow_types.js
@@ -135,6 +135,12 @@ export type APIDataset = APIDatasetBase & {
+isActive: true,
};

export type APISampleDataset = {
+name: string,
+description: string,
+status: "available" | "downloading" | "present",
};

export type APIDataSourceWithMessages = {
+dataSource?: APIDataSource,
+messages: Array<APIMessage>,
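For context, the two new REST wrappers in `admin_rest_api.js` and the `APISampleDataset` type above fit together into a simple list/trigger/poll flow. The sketch below is illustrative only and not part of this PR: `getSampleDatasets`, `triggerSampleDatasetDownload`, and the `"available" | "downloading" | "present"` status values come from the diff, while the `downloadAllSamples` helper and the 5-second polling interval are assumptions.

```js
// @flow
// Illustrative sketch (not part of this PR): trigger downloads for all sample
// datasets that are still "available" and poll until everything is "present".
import {
  getSampleDatasets,
  triggerSampleDatasetDownload,
} from "admin/admin_rest_api";
import type { APISampleDataset } from "admin/api_flow_types";

async function downloadAllSamples(
  datastoreUrl: string,
  organizationName: string,
): Promise<void> {
  const samples: Array<APISampleDataset> = await getSampleDatasets(
    datastoreUrl,
    organizationName,
  );

  // Datasets that are already downloading or present must not be triggered again,
  // otherwise the backend answers with dataSet.downloadAlreadyRunning / alreadyPresent.
  const pending = samples.filter(sample => sample.status === "available");
  await Promise.all(
    pending.map(sample =>
      triggerSampleDatasetDownload(datastoreUrl, organizationName, sample.name),
    ),
  );

  // Re-fetch the list every 5 seconds (assumed interval) until all downloads finished.
  let remaining = pending.length;
  while (remaining > 0) {
    await new Promise(resolve => setTimeout(resolve, 5000));
    const refreshed = await getSampleDatasets(datastoreUrl, organizationName);
    remaining = refreshed.filter(sample => sample.status !== "present").length;
  }
}
```

The modal added in this PR (`sample_datasets_modal.js`, opened via `renderIndependently` in `dataset_add_view.js`) manages the same states with its own fetching logic; the sketch only shows how the new API calls and status values are meant to be combined.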
@@ -78,7 +78,7 @@ class DatasetAddForeignView extends React.PureComponent<Props> {

return (
<div className="container" style={{ paddingTop: 20 }}>
<Card title={<h3>Add Dataset</h3>}>
<Card title={<h3>Add Dataset</h3>} bordered={false}>
<p>
Specify the Dataset in the following format:
<br />
@@ -112,7 +112,7 @@ class DatasetAddForeignView extends React.PureComponent<Props> {
)}
</FormItem>
<FormItem>
<Button type="primary" htmlType="submit">
<Button size="large" type="primary" htmlType="submit" style={{ width: "100%" }}>
Add Dataset
</Button>
</FormItem>
89 changes: 62 additions & 27 deletions frontend/javascripts/admin/dataset/dataset_add_view.js
@@ -2,49 +2,84 @@
import { type RouterHistory, withRouter } from "react-router-dom";
import { Tabs, Icon } from "antd";
import React from "react";
import { connect } from "react-redux";
import _ from "lodash";

import type { APIUser } from "admin/api_flow_types";
import type { OxalisState } from "oxalis/store";
import { enforceActiveUser } from "oxalis/model/accessors/user_accessor";
import DatasetAddForeignView from "admin/dataset/dataset_add_foreign_view";
import DatasetUploadView from "admin/dataset/dataset_upload_view";
import SampleDatasetsModal from "dashboard/dataset/sample_datasets_modal";
import features from "features";
import renderIndependently from "libs/render_independently";

const { TabPane } = Tabs;

type Props = {
type Props = {|
activeUser: APIUser,
|};
type PropsWithRouter = {|
...Props,
history: RouterHistory,
|};

const renderSampleDatasetsModal = (user: APIUser, history: RouterHistory) => {
renderIndependently(destroy => (
<SampleDatasetsModal
organizationName={user.organization}
destroy={destroy}
onOk={() => history.push("/dashboard/datasets")}
/>
));
};

const DatasetAddView = ({ history }: Props) => (
<Tabs defaultActiveKey="1" className="container">
<TabPane
tab={
<span>
<Icon type="upload" />
Upload Dataset
</span>
}
key="1"
>
<DatasetUploadView
onUploaded={(organization: string, datasetName: string) => {
const url = `/datasets/${organization}/${datasetName}/import`;
history.push(url);
}}
/>
</TabPane>
{features().addForeignDataset ? (
const DatasetAddView = ({ history, activeUser }: PropsWithRouter) => (
<React.Fragment>
<Tabs defaultActiveKey="1" className="container">
<TabPane
tab={
<span>
<Icon type="bars" />
Add foreign Dataset
<Icon type="upload" />
Upload Dataset
</span>
}
key="2"
key="1"
>
<DatasetAddForeignView onAdded={() => history.push("/dashboard")} />
<DatasetUploadView
onUploaded={(organization: string, datasetName: string) => {
const url = `/datasets/${organization}/${datasetName}/import`;
history.push(url);
}}
/>
</TabPane>
) : null}
</Tabs>
{features().addForeignDataset ? (
<TabPane
tab={
<span>
<Icon type="bars" />
Add foreign Dataset
</span>
}
key="2"
>
<DatasetAddForeignView onAdded={() => history.push("/dashboard")} />
</TabPane>
) : null}
</Tabs>
<div style={{ textAlign: "center" }}>
<p>or</p>
<p>
<a href="#" onClick={() => renderSampleDatasetsModal(activeUser, history)}>
Add a Sample Dataset
</a>
</p>
</div>
</React.Fragment>
);

export default withRouter(DatasetAddView);
const mapStateToProps = (state: OxalisState) => ({
activeUser: enforceActiveUser(state.activeUser),
});

export default connect<Props, {||}, _, _, _, _>(mapStateToProps)(withRouter(DatasetAddView));