diff --git a/.github/CODE_OF_CONDUCT.md b/.github/CODE_OF_CONDUCT.md new file mode 100644 index 000000000..0240e7aa8 --- /dev/null +++ b/.github/CODE_OF_CONDUCT.md @@ -0,0 +1,76 @@ +# Contributor Covenant Code of Conduct + +## Our Pledge + +In the interest of fostering an open and welcoming environment, we as +contributors and maintainers pledge to making participation in our project and +our community a harassment-free experience for everyone, regardless of age, body +size, disability, ethnicity, sex characteristics, gender identity and expression, +level of experience, education, socio-economic status, nationality, personal +appearance, race, religion, or sexual identity and orientation. + +## Our Standards + +Examples of behavior that contributes to creating a positive environment +include: + +- Using welcoming and inclusive language +- Being respectful of differing viewpoints and experiences +- Gracefully accepting constructive criticism +- Focusing on what is best for the community +- Showing empathy towards other community members + +Examples of unacceptable behavior by participants include: + +- The use of sexualized language or imagery and unwelcome sexual attention or + advances +- Trolling, insulting/derogatory comments, and personal or political attacks +- Public or private harassment +- Publishing others' private information, such as a physical or electronic + address, without explicit permission +- Other conduct which could reasonably be considered inappropriate in a + professional setting + +## Our Responsibilities + +Project maintainers are responsible for clarifying the standards of acceptable +behavior and are expected to take appropriate and fair corrective action in +response to any instances of unacceptable behavior. 
+ +Project maintainers have the right and responsibility to remove, edit, or +reject comments, commits, code, wiki edits, issues, and other contributions +that are not aligned to this Code of Conduct, or to ban temporarily or +permanently any contributor for other behaviors that they deem inappropriate, +threatening, offensive, or harmful. + +## Scope + +This Code of Conduct applies both within project spaces and in public spaces +when an individual is representing the project or its community. Examples of +representing a project or community include using an official project e-mail +address, posting via an official social media account, or acting as an appointed +representative at an online or offline event. Representation of a project may be +further defined and clarified by project maintainers. + +## Enforcement + +Instances of abusive, harassing, or otherwise unacceptable behavior may be +reported by contacting the project team at support at dataverse dot org. All +complaints will be reviewed and investigated and will result in a response that +is deemed necessary and appropriate to the circumstances. The project team is +obligated to maintain confidentiality with regard to the reporter of an incident. +Further details of specific enforcement policies may be posted separately. + +Project maintainers who do not follow or enforce the Code of Conduct in good +faith may face temporary or permanent repercussions as determined by other +members of the project's leadership. 
+ +## Attribution + +This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, +available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html + +[homepage]: https://www.contributor-covenant.org + +For answers to common questions about this code of conduct, see +https://www.contributor-covenant.org/faq diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md new file mode 100644 index 000000000..67a47326c --- /dev/null +++ b/.github/CONTRIBUTING.md @@ -0,0 +1,123 @@ +# Guidance on how to contribute + +> All contributions to this project will be released under the Apache License, Version 2.0. +> By submitting a pull request or filing a bug, issue, or +> feature request, you are agreeing to comply with this waiver of copyright interest. +> Details can be found in our [LICENSE](LICENSE). + +Thank you for your interest in contributing to Dataverse Frontend! We are open to contributions from everyone. You don't +need permission to participate. Just jump in. If you have questions, please reach out using one or more of the channels +described below. + +We aren't just looking for developers. There are many ways to contribute to Dataverse. We welcome contributions +of ideas, bug reports, usability research/feedback, documentation, code, and more! Please check the [Dataverse main repo] +for more information on how to contribute to the Dataverse project. + +Here are ways to get involved: + +1. [Star] the project! +2. Answer questions and join in on the [issue tracker]. +3. [Report a bug] that you find. +4. Browse the ["help wanted"] and ["good first issue"] labels for areas of the Dataverse/JavaScript code you know well and could build on or document. + +## Bug Reports/Issues + +An issue is either a bug (a feature that no longer behaves the way it should) or a feature (something new to Dataverse that helps +users complete tasks). You can browse the Dataverse Frontend [issue tracker] on GitHub and filter it by open or closed issues.
+ +Before submitting an issue, please search the existing issues by using the search bar at the top of the page. If there +is an existing open issue that matches the issue you want to report, please add a comment to it. + +If there is no pre-existing issue or it has been closed, please click on the "New Issue" button, log in, and write in +what the issue is (unless it is a security issue which should be reported privately to security@dataverse.org). + +If you do not receive a reply to your new issue or comment in a timely manner, please email support@dataverse.org with a link to the issue. + +### Writing an Issue + +For the subject of an issue, please start with the feature or functionality it relates to, e.g. "Create Account:..." +or "Dataset Page:...". In the body of the issue, please outline the issue you are reporting with as much detail as possible. +In order for the Dataverse development team to best respond to the issue, we need as much information about the issue as +you can provide. Include steps to reproduce bugs. Indicate which version you're using, which is shown at the bottom of the page. We love screenshots! + +### Issue Attachments + +You can attach certain files (images, screenshots, logs, etc.) by dragging and dropping, selecting them, or pasting from +the clipboard. Files must be one of GitHub's [supported attachment formats] such as png, gif, jpg, txt, pdf, zip, etc. +(Pro tip: A file ending in .log can be renamed to .txt, so you can upload it.) If there's no easy way to attach your file, +please include a URL that points to the file in question. + +[supported attachment formats]: https://help.github.com/articles/file-attachments-on-issues-and-pull-requests/ + +## Documenting + +This is one of the most important and also one of the easiest ways to contribute. +For the moment we only use the [README], which you can edit right here on GitHub!
+ +If you spot a mistake, you can press the edit button on the file and make the change directly, even without an IDE. +These pull requests are highly appreciated and will help future generations of developers! + +[README]: https://github.com/IQSS/dataverse-frontend/edit/develop/README.md + +## Changing the code-base + +If you are interested in working on the Dataverse Frontend code, great! Before you start working on something, we would +suggest talking to us in the [Zulip UI Dev Stream]. + +Generally speaking, you should fork this repository, make changes in your own fork, and then submit a pull request. All +new code should have associated unit tests that validate implemented features and the presence or lack of defects. +Additionally, the code should follow any stylistic and architectural guidelines prescribed by the project. In the absence +of such guidelines, mimic the styles and patterns in the existing code-base. + +Pull requests are highly appreciated. More than [5 people] have written parts of Dataverse Frontend (so far). Here are some +guidelines to help: + +1. **Solve a problem** – Features are great, but even better is cleaning up and fixing issues in the code that you discover. +2. **Write tests** – This helps preserve functionality as the codebase grows and demonstrates how your change affects the code. +3. **Write documentation** – We have a [README] that always needs updating. +4. **Small > big** – Better to have a few small pull requests that address specific parts of the code than one big pull request that jumps all over. +5. **Comply with Coding Standards** – See next section. + +### How to start + +After you’ve forked the Dataverse Frontend repository, you should follow the Getting Started instructions in the +[Developer Guide](DEVELOPER_GUIDE.md) to get your local environment up and running. + +### GitHub reviews & assignments + +All PRs receive a review from at least one maintainer.
We’ll do our best to do that review as soon as possible, but we’d +rather go slow and get it right than merge in code with issues that just lead to trouble. + +You might see us assign multiple reviewers; in this case, these are OR checks (i.e. either Person1 or Person2) unless we +explicitly say it’s an AND type thing (i.e. can both Person3 and Person4 check this out?). + +We use the assignee to show who’s responsible at that moment. We’ll assign back to the submitter if we need additional +info/code/response, or it might be assigned to a branch maintainer if it needs more thought/revision (perhaps it’s directly +related to an issue that's actively being worked on). + +After making your pull request, your goal should be to help it advance through our kanban board at [IQSS Dataverse Project]. +If no one has moved your pull request to the code review column in a timely manner, please reach out. Note that once a pull request +is created for an issue, we'll remove the issue from the board so that we only track one card (the pull request). + +Thanks for your contribution! + +## Other ways to contribute to the code + +We love code contributions. Developers are not limited to the Dataverse Frontend code in this git repo. You can help with +API client libraries in your favorite language that are mentioned in the [API Guide][] or create a new library. In this +repo we are using the [JavaScript Dataverse API client library], to which you can also contribute.
+ +[API Guide]: http://guides.dataverse.org/en/latest/api +[issue tracker]: https://github.com/IQSS/dataverse-frontend/issues +[Dataverse main repo]: https://github.com/IQSS/dataverse/blob/develop/CONTRIBUTING.md +[Star]: https://github.com/iqss/dataverse-frontend/stargazers +[Report a bug]: https://github.com/iqss/dataverse-frontend/issues/new?assignees=&labels=&projects=&title=%5BBUG%5D+Your+title +["help wanted"]: https://github.com/iqss/dataverse-frontend/labels/help%20wanted +["good first issue"]: https://github.com/iqss/dataverse-frontend/labels/good%20first%20issue +[Zulip UI Dev Stream]: https://dataverse.zulipchat.com/#narrow/stream/410361-ui-dev +[5 people]: https://github.com/iqss/dataverse-frontend/graphs/contributors +[Getting Started Section]: https://github.com/IQSS/dataverse-frontend?tab=readme-ov-file#getting-started +[TypeScript Deep Dive Style Guide]: https://basarat.gitbook.io/typescript/styleguide +[pre-commit]: https://www.npmjs.com/package/pre-commit +[IQSS Dataverse Project]: https://github.com/orgs/IQSS/projects/34 +[JavaScript Dataverse API client library]: https://github.com/IQSS/dataverse-client-javascript diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md index cf668e2fe..999e90fbe 100644 --- a/.github/ISSUE_TEMPLATE/bug_report.md +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -1,20 +1,41 @@ -# What steps does it take to reproduce the issue? +--- +name: Bug report +about: Did you encounter something unexpected or incorrect in the Dataverse Frontend? + We'd like to hear about it! +title: '' +labels: 'Type: Bug' +assignees: '' +--- -## When does this issue occur? + -## What did you expect to happen? +**What steps does it take to reproduce the issue?** -## Which version of Dataverse Frontend are you using? +- When does this issue occur? -## Any related open or closed issues to this bug report? +- Which page(s) does it occur on? -# Screenshots: +- What happens?
+ +- To whom does it occur (all users, curators, superusers)? + +- What did you expect to happen? + +**Which version of Dataverse Frontend are you using?** + +**Any related open or closed issues to this bug report?** + +**Screenshots:** No matter the issue, screenshots are always welcome. diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md index 8c62e03b4..e1fa50e35 100644 --- a/.github/ISSUE_TEMPLATE/feature_request.md +++ b/.github/ISSUE_TEMPLATE/feature_request.md @@ -1,13 +1,30 @@ -## Overview of the Feature Request +--- +name: Feature request +about: Suggest an idea or new feature for the Dataverse software! +title: 'Feature Request/Idea:' +labels: 'Type: Feature' +assignees: '' +--- -## What kind of user is the feature intended for? + -## What existing behavior do you want changed? +**Overview of the Feature Request** -## Any brand new behavior do you want to add to Dataverse? +**What kind of user is the feature intended for?** +(Example user roles: API User, Curator, Depositor, Guest, Superuser, Sysadmin) -## Any open or closed issues related to this feature request? +**What inspired the request?** + +**What existing behavior do you want changed?** + +**Any brand-new behavior you want to add to Dataverse?** + +**Any open or closed issues related to this feature request?** diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md index f28b0de8c..b57aa23fc 100644 --- a/.github/PULL_REQUEST_TEMPLATE.md +++ b/.github/PULL_REQUEST_TEMPLATE.md @@ -1,15 +1,15 @@ -## What this PR does / why we need it: +**What this PR does / why we need it**: -## Which issue(s) this PR closes: +**Which issue(s) this PR closes**: -- Closes # +Closes # -## Special notes for your reviewer: +**Special notes for your reviewer**: -## Suggestions on how to test this: +**Suggestions on how to test this**: -## Does this PR introduce a user interface change?
If mockups are available, please link/include them here: +**Does this PR introduce a user interface change? If mockups are available, please link/include them here**: -## Is there a release notes update needed for this change?: +**Is there a release notes update needed for this change?**: -## Additional documentation: +**Additional documentation**: diff --git a/.gitignore b/.gitignore index a9076c592..a53c72760 100644 --- a/.gitignore +++ b/.gitignore @@ -46,3 +46,82 @@ yarn-error.log* /dev-env/dataverse /dev-env/dataverse-sample-data /packages/design-system/.nyc_output + + +# Compiled source # +################### +*.com +*.class +*.dll +*.exe +*.o +*.so +_site/ + +# Packages # +############ +# it's better to unpack these files and commit the raw source +# git has its own built in compression methods +*.7z +*.dmg +*.gz +*.iso +*.jar +*.rar +*.tar +*.zip + +# Logs and databases # +###################### +*.log +*.sql +*.sqlite + +# OS generated files # +###################### +.DS_Store +.DS_Store? +.Spotlight-V100 +.Trashes +Icon? +ehthumbs.db +Thumbs.db + +# Vim swap files # +################## +*.swp + +# Python # +################# +*.pyc +*.egg-info/ +__pycache__/ +*.py[cod] +.env +.python-version + +# pyenv # +######### +.python-version + +# Django # +################# +*.egg-info +.installed.cfg + +# Unit test / coverage reports +################# +htmlcov/ +.tox/ +.coverage +.cache +nosetests.xml +coverage.xml + +# Front-End # +############# +node_modules/ +bower_components/ +.grunt/ +src/vendor/ +dist/ \ No newline at end of file diff --git a/DEVELOPER_GUIDE.md b/DEVELOPER_GUIDE.md new file mode 100644 index 000000000..3e121292b --- /dev/null +++ b/DEVELOPER_GUIDE.md @@ -0,0 +1,869 @@ + + +
+# Developer Guide
+
+Guidelines and instructions for developers working on the Dataverse Frontend
+ +--- + +## Table of Contents + +
+1. [Getting Started](#getting-started)
+2. [Coding Standards](#coding-standards)
+3. [Writing test cases](#writing-test-cases)
+4. [Deployment](#deployment)
+5. [Publishing the Design System](#publishing-the-design-system)
+ +--- + +## Getting Started + +_To get a local copy up and running, follow these simple steps._ + +### Prerequisites + +[![DockerDesktop][_shield_docker]][_uses_docker_url] + +1. **Node & NPM**: `node >= v16` and `npm >= v8`. Recommended versions for this project are `node v19` and `npm v9`. +2. **Docker**: We use Docker Desktop, but the Docker CLI will also work. +3. **Create a Copy of .npmrc**: In the project directory root, duplicate `.npmrc.example`, saving the copy as `.npmrc`. + + > ```bash + > # root project directory + > cp .npmrc.example .npmrc + > ``` + +4. **Create and Add Your Npm and GitHub Tokens**: In order to connect with the Dataverse API, developers will need to + install [@iqss/dataverse-client-javascript][dv_repo_dvclientjs_npm_url] from the GitHub registry by following the steps + outlined below. Read more about access tokens on [GitHub Docs][dv_docs_github_userauthtoken_url]. + + > **Getting a GitHub Token** + > + > 1. Go to your GitHub [Personal Access Tokens][dv_docs_github_token_url] settings + > 2. Select `Generate new token (classic)` + > 3. Give the token a name and select scope `read:packages` + > 4. Copy the generated token and replace the string `YOUR_GITHUB_AUTH_TOKEN` in the previously created `.npmrc` file. + > Now, you should be able to install the [Dataverse JavaScript][dv_repo_dvclientjs_url] client using npm. + +Afterwards, your .npmrc file should resemble the following: + +```properties +# .npmrc +legacy-peer-deps=true +# js-dataverse registry +//npm.pkg.github.com/:_authToken=YOUR_GITHUB_AUTH_TOKEN +@iqss:registry=https://npm.pkg.github.com/ +``` + +> **Note:** The .npmrc file is not identical to .npmrc.example, as the latter contains the registry to publish the design +> system; see [Publishing the Design System](#publishing-the-design-system) for more information. To run the project you only +> need the above configuration.
+(back to top)
+ +### Installation & Setup + +1. Install the project & its dependencies + +```bash +# root project directory +npm install +``` + +> [!WARNING] +> You may see a message about vulnerabilities after running this command. +> +> Please check this announcement from Create React App repository +> [facebook/create-react-app#11174][_uses_repo_cra_error_url]. These vulnerabilities will not be included in the +> production build since they come from libraries only used during development. + +2. Build the UI Library, needed for running the application. + +```bash +# root project directory +cd packages/design-system && npm run build +``` + +**Running & Building the App:** + +Run the app in the development mode. Open [http://localhost:5173][dv_app_localhost_build_url] to view it in your browser. + +```bash +# root project directory +npm start +``` + +The application will actively watch the directory for changes and reload when changes are saved. You may also see any +existing linting errors in the console. + +```bash +# root project directory + +# Build the app for production to the `/dist/` directory: +npm run build + +# Locally preview the production build: +npm run preview + +``` + +
+ +**Storybook:** + +Runs the Storybook in the development mode. + +There are 2 Storybook instances, one for the general Design System and one for the Dataverse Frontend component +specifications. Both should be started automatically and available at: + +- Dataverse Frontend Storybook: [http://localhost:6006/][dv_app_localhost_storybook_url] +- Dataverse Design System Storybook: [http://localhost:6007/][dv_app_localhost_designsystem_url] + +```bash +# $ cd packages/design-system + +# `npm run storybook` should automatically open in your default browser +npm run storybook + +# $ cd packages/design-system +npm run build-storybook + +``` + +Note that both Storybook instances are also published to Chromatic as part of the build and merge processes, located at: + +- [DataverseFrontend-Chromatic](https://www.chromatic.com/builds?appId=646f68aa9beb01b35c599acd) +- [DataverseDesignSystem-Chromatic](https://www.chromatic.com/builds?appId=646fbe232a8d3b501a1943f3) + +
+(back to top)
+ +### Running the Project Locally + +A containerized environment, oriented to local development, is available to be run from the repository. + +This environment contains a dockerized instance of the Dataverse backend with its dependent services (database, +mail server, etc.), as well as an npm development server running the SPA frontend (with code watching). + +This environment is intended for locally testing any functionality that requires access to the Dataverse API from the +SPA frontend. + +There is an Nginx reverse proxy container on top of the frontend and backend containers to avoid CORS issues while +testing the application.
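To illustrate why this avoids CORS issues: the proxy serves both applications from a single origin, routing by path. The snippet below is purely illustrative and is not the actual dev-env configuration; the container names and the backend port are assumptions.

```nginx
# Hypothetical sketch only; the real dev-env proxy configuration may differ.
server {
  listen 8000;

  # SPA dev server container (assumed name; 5173 is the Vite dev port)
  location /spa {
    proxy_pass http://frontend:5173;
  }

  # Dataverse backend container (assumed name and port)
  location / {
    proxy_pass http://dataverse:8080;
  }
}
```

Because the browser only ever talks to `localhost:8000`, requests from the SPA to the Dataverse API are same-origin.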
+ +**Run the Environment** + +As the script argument, add the name of the Dataverse image tag you want to deploy. + +```bash +# /dev-env/ directory + +# Installs and runs project off latest tagged container image +$ ./run-env.sh + +# Removes the project and its dependencies +$ ./rm-env.sh +``` + +Please note that the image tag must be pre-pushed to the Docker registry; otherwise, the script will fail. You can find +the existing tags for alpha and unstable versions on DockerHub at [@gdcc/dataverse][dv_app_docker_image_url]. Images +associated with pull requests (PRs) are available in the [GitHub Container Registry]. + +If you are running the script for the first time, it may take a while, since npm has to install all project dependencies. +This can also happen if you added new dependencies to `package.json`, or used the _uninstall_ script to remove current +project files and shut down any running containers. + +Once the script has finished, you will be able to access Dataverse via: + +- Dataverse SPA Frontend: [http://localhost:8000/spa][dv_app_localhost_spa_url] +- Dataverse JSF Application: [http://localhost:8000][dv_app_localhost_legacy_url] + +Note: The Dataverse configbaker takes some time to start the application, so the application will not be accessible until +the bootstrapping is complete.
+ +**Adding Custom Test Data** + +If you want to add test data (collections and datasets) to the Dataverse instance, run the following command: + +```bash +# /dev-env/ directory + +$ ./add-env-data.sh +``` + +> Note: The above command uses the [dataverse-sample-data][dv_repo_dvsampledata_url] repository whose scripts occasionally +> fail, so some test data may not be added.
+(back to top)
+ +## Coding Standards + +This project adheres to the following coding standards to ensure consistency, readability, and maintainability of the codebase. + +### General Principles + +- Code should be self-documenting. Choose descriptive names for variables and functions. +- Comment your code when necessary. Comments should explain the 'why' and not the 'what'. +- Keep functions small and focused. A function should ideally do one thing. +- Follow the DRY (Don't Repeat Yourself) principle. Reuse code as much as possible. But don't over-engineer; sometimes + it's better to duplicate code than to overcomplicate it. + +### TypeScript + +- Follow the [TypeScript Deep Dive Style Guide]. +- Use `PascalCase` for class names, `camelCase` for variables and functions, `UPPERCASE` for constants. +- Always specify the type when declaring variables, parameters, and return types. + +### JavaScript Standards + +- Use ES6+ syntax whenever possible. +- Prefer `const` and `let` to `var` for variable declarations. +- Use arrow functions (`() => {}`) for anonymous functions. + +### React Standards + +- Component Design: Prefer functional components with hooks over class components. +- State Management: Use local state (`useState`, `useReducer`) where possible and consider context or Redux for global state. +- Event Handlers: Prefix handler names with `handle`, e.g., `handleClick`. +- JSX: Keep JSX clean and readable. Split into smaller components if necessary. + +### CSS/SASS Standards + +Modularization: Store stylesheets near their respective components and import them as modules. + +### Linting + +We use ESLint to automatically check and apply the coding standards to our codebase, reducing the manual work to a minimum. + +To run all checks, you can run the `lint` script. + +```bash +npm run lint +``` + +You can also apply coding style fixes automatically. + +```bash +npm run lint:fix +``` + +### Check and apply formatting standards + +Launches the Prettier formatter.
We recommend configuring your IDE to run Prettier on save. + +```bash +npm run format +``` + +### Enforcing coding standards using pre-commit hooks + +We use the [pre-commit] library to add pre-commit hooks which automatically check the committed +code changes for any coding standard violations. + +### Tests + +Use the following commands to ensure your build passes checks for coding standards and coverage: + +`npm run test:unit` launches the test runner for the unit tests in the interactive watch mode. +If you prefer to see the tests executing in Cypress, you can run `npm run cy:open-unit`. +You can check the coverage with `npm run test:coverage`. + +`npm run test:e2e` launches the Cypress test runner for the end-to-end tests. +If you prefer to see the tests executing in Cypress, you can run `npm run cy:open-e2e`. + +```bash +# root project directory + +# Launches the Cypress test runner for the end-to-end [or unit] tests: +npm run test:e2e [test:unit] + +# If you prefer to see the tests executing in Cypress you can run: +npm run cy:open-e2e [cy:open-unit] + +# See current code coverage +npm run test:coverage + +```
+ +
+ Running specific tests + > The project includes [@cypress/grep][_uses_lib_grep_url] for running specific tests. > > Some examples used by the grep library are below for reference: > > ```bash > # root project directory > > # run only the tests with "auth user" in the title > $ npx cypress run --env grep="auth user" > > # run tests with "hello" or "auth user" in their titles by separating them with ";" character > $ npx cypress run --env grep="hello; auth user" > ``` > > To target specific tests, add the `--spec` argument > > ```bash > # root project directory > > $ npx cypress run --env grep="loads the restricted files when the user is logged in as owner" > \ --spec tests/e2e-integration/e2e/sections/dataset/Dataset.spec.tsx > ``` > > **Repeat and burn tests** > You can run the selected tests multiple times by using the `burn=N` parameter. For example, run all the tests in > the spec five times using: > > ```bash > # root project directory > > $ npx cypress run --env burn=5 --spec tests/e2e-integration/e2e/sections/dataset/Dataset.spec.tsx > ```
+ +
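As a quick recap of the TypeScript, JavaScript, and React conventions described in this section, here is a small illustrative sketch (the names are hypothetical, not project code):

```typescript
// Illustrative sketch of the naming conventions above; names are hypothetical.
const MAX_RESULTS = 10 // UPPERCASE for constants

// PascalCase for class names
class DatasetSearchService {
  // camelCase for functions; explicit parameter and return types
  findTitles(datasets: { title: string }[]): string[] {
    // arrow function for the anonymous callback
    return datasets.slice(0, MAX_RESULTS).map((dataset) => dataset.title)
  }
}

const searchService = new DatasetSearchService() // camelCase for variables
```

ESLint and Prettier (see above) will flag most deviations from these conventions automatically.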
+(back to top)
+ +## Writing test cases + +Testing is a crucial part of the SPA development. Our React application employs three main types of tests: Unit tests, +Integration tests, and End-to-End (e2e) tests. In this section, we'll describe each type of test and how to implement them. + +### 1. **Unit Tests or Component tests:** + +Unit tests are designed to test individual React components in isolation. In our approach, we focus on testing components +from the user's perspective, following the principles of the [React Testing Library](https://testing-library.com/docs/react-testing-library/intro/). +This means: + +- **Test What Users See:** Focus on the output of the components, such as text, interactions, and the DOM, rather than + internal implementation details like classes or functions. +- **Exceptions to the Rule:** In complex scenarios or where performance is a critical factor, we might test implementation + details to ensure a repository is correctly called, for example. However, these cases are exceptions, and the primary + goal remains on user-centric testing. +- **Avoid Testing Implementation Details:** Avoid testing implementation details like internal state or methods, as these + tests are more brittle and less valuable than user-centric tests. +- **Mocking:** We use mocks to isolate components from their dependencies, ensuring that tests are focused on the component + itself and not on its dependencies. +- **Tools and Frameworks:** We use [Cypress Component testing](https://docs.cypress.io/guides/component-testing/overview) + alongside React Testing Library to render components in isolation and test their behavior. + +
+Example of a Unit Test + +```javascript +//tests/component/sections/home/Home.spec.tsx + +import { Home } from '../../../../src/sections/home/Home' +import { DatasetRepository } from '../../../../src/dataset/domain/repositories/DatasetRepository' +import { DatasetPreviewMother } from '../../dataset/domain/models/DatasetPreviewMother' + +const datasetRepository: DatasetRepository = {} as DatasetRepository +const totalDatasetsCount = 10 +const datasets = DatasetPreviewMother.createMany(totalDatasetsCount) +describe('Home page', () => { + beforeEach(() => { + datasetRepository.getAll = cy.stub().resolves(datasets) + datasetRepository.getTotalDatasetsCount = cy.stub().resolves(totalDatasetsCount) + }) + + it('renders Root title', () => { + cy.customMount() + cy.findByRole('heading').should('contain.text', 'Root') + }) +}) +``` + +
+ +### 2. **Integration Tests:** + +Test the integration of the SPA with external systems, such as APIs, third-party +libraries (like js-dataverse), or databases. This ensures that the application communicates correctly with outside +resources and services. + +- **External Integrations:** Cover the SPA's communication with external systems, such as APIs, third-party + libraries (like [js-dataverse](https://github.com/IQSS/dataverse-client-javascript/pkgs/npm/dataverse-client-javascript/97068340)), or databases. +- **Tools and Frameworks:** We use Cypress for integration tests, to test the repositories' implementations.
+Example of an Integration Test + +```javascript +//tests/e2e-integration/integration/datasets/DatasetJSDataverseRepository.spec.ts + +describe('Dataset JSDataverse Repository', () => { + before(() => TestsUtils.setup()) + beforeEach(() => { + TestsUtils.login() + }) + it('gets the dataset by persistentId', async () => { + const datasetResponse = await DatasetHelper.create() + + await datasetRepository.getByPersistentId(datasetResponse.persistentId).then((dataset) => { + if (!dataset) { + throw new Error('Dataset not found') + } + const datasetExpected = datasetData(dataset.persistentId, dataset.version.id) + + expect(dataset.license).to.deep.equal(datasetExpected.license) + expect(dataset.metadataBlocks).to.deep.equal(datasetExpected.metadataBlocks) + expect(dataset.summaryFields).to.deep.equal(datasetExpected.summaryFields) + expect(dataset.version).to.deep.equal(datasetExpected.version) + expect(dataset.metadataBlocks[0].fields.publicationDate).not.to.exist + expect(dataset.metadataBlocks[0].fields.citationDate).not.to.exist + expect(dataset.permissions).to.deep.equal(datasetExpected.permissions) + expect(dataset.locks).to.deep.equal(datasetExpected.locks) + expect(dataset.downloadUrls).to.deep.equal(datasetExpected.downloadUrls) + expect(dataset.fileDownloadSizes).to.deep.equal(datasetExpected.fileDownloadSizes) + }) + }) +}) +``` + +
+ +
+Wait for no locks + +Some integration tests require waiting for no locks to be present in the dataset. This is done using the `waitForNoLocks` +method from the `TestsUtils` class. + +```javascript +it('waits for no locks', async () => { + const datasetResponse = await DatasetHelper.create() + + await DatasetHelper.publish(datasetResponse.persistentId) + await TestsUtils.waitForNoLocks(datasetResponse.persistentId) + + await datasetRepository.getByPersistentId(datasetResponse.persistentId).then((dataset) => { + if (!dataset) { + throw new Error('Dataset not found') + } + expect(dataset.locks).to.deep.equal([]) + }) +}) +``` + +
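A helper like `waitForNoLocks` boils down to polling until a condition holds. The sketch below is illustrative only and is not the actual `TestsUtils` implementation; the option names and defaults are assumptions:

```typescript
// Illustrative polling helper, similar in spirit to TestsUtils.waitForNoLocks.
// Not the project's actual implementation; option names and defaults are assumed.
async function waitFor(
  condition: () => Promise<boolean>,
  { retries = 20, delayMs = 500 } = {}
): Promise<void> {
  for (let attempt = 0; attempt < retries; attempt++) {
    if (await condition()) return // e.g. the dataset has no locks left
    await new Promise((resolve) => setTimeout(resolve, delayMs))
  }
  throw new Error(`Condition not met after ${retries} attempts`)
}
```

A `waitForNoLocks(persistentId)` could then be expressed as `waitFor(async () => (await getLocks(persistentId)).length === 0)`, where `getLocks` stands for whatever call fetches the dataset's current locks.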
+
+### 3. **End-to-End (e2e) Tests:**
+
+End-to-end tests simulate real user scenarios, covering the complete flow of the application:
+
+- **User Workflows:** Focus on common user paths, like searching for a file, logging in, or creating a Dataset. Ensure
+  these workflows work from start to finish.
+- **Avoid Redundancy:** Do not replicate tests covered by unit tests. E2E tests should cover broader user experiences.
+  It is important to note that e2e tests are slower and more brittle than unit tests, so we use them sparingly.
+- **Tools and Frameworks:** We use Cypress for e2e tests, exercising the application's behavior from the user's
+  perspective. It lets us simulate real user interactions and verify complete workflows in a browser.
+
+Example of an E2E Test + +```javascript +//tests/e2e-integration/e2e/sections/create-dataset/CreateDatasetForm.spec.tsx + +describe('Create Dataset', () => { + before(() => { + TestsUtils.setup() + }) + beforeEach(() => { + TestsUtils.login() + }) + + it('navigates to the new dataset after submitting a valid form', () => { + cy.visit('/spa/datasets/create') + + cy.findByLabelText(/Title/i).type('Test Dataset Title') + cy.findByLabelText(/Author Name/i).type('Test author name', { force: true }) + cy.findByLabelText(/Point of Contact E-mail/i).type('email@test.com') + cy.findByLabelText(/Description Text/i).type('Test description text') + cy.findByLabelText(/Arts and Humanities/i).check() + + cy.findByText(/Save Dataset/i).click() + + cy.findByRole('heading', { name: 'Test Dataset Title' }).should('exist') + cy.findByText(DatasetLabelValue.DRAFT).should('exist') + cy.findByText(DatasetLabelValue.UNPUBLISHED).should('exist') + }) +}) +``` + +
+ +> **Note:** Some end-to-end (e2e) tests are failing in local development environments despite passing in GitHub Actions. +> This discrepancy appears to be due to variations in machine resources. +> +> We need to investigate and potentially optimize several aspects of our local setup. Check the issue +> [here](https://github.com/IQSS/dataverse-frontend/issues/371). + +### Patterns and Conventions + +#### **Folder Structure** + +We have a `tests` folder in the root of the project, with subfolders for each type of test (component, +integration, e2e). The folders for integration and e2e are together in the `e2e-integration` folder. We follow the same +structure that we use for the application. + +#### **Test Naming** + +We use the same naming conventions as the application, with the suffix `.spec`. + +#### **Test Data for unit tests** + +We use [faker](https://www.npmjs.com/package/@faker-js/faker) to create test data for unit tests. This library allows us +to generate realistic and varied test data with minimal effort. We use it to create random strings, numbers, and other +values, ensuring that tests cover a wide range of cases without introducing unpredictable failures. + +**[Object Mother Pattern](https://martinfowler.com/bliki/ObjectMother.html)** + +We use the Object Mother pattern to create test data for unit tests. These classes are located in the `tests/component` +folder. + +The primary goal of the Object Mother pattern is to simplify the creation and management of test objects. It serves as a +centralized place where test objects are defined, making the testing process more streamlined and less error-prone. In this +pattern, we create a class or a set of functions dedicated solely to constructing and configuring objects needed for tests. + +Some benefits of this pattern are: + +- **Consistency:** It ensures consistent object setup across different tests, improving test reliability. 
+- **Readability:** It makes tests more readable and understandable by hiding complex object creation details.
+- **Flexibility:** It allows us to create test data with different values and configurations.
+- **Reusability:** It allows for the reuse of object configurations, reducing code duplication and saving time.
+- **Maintainability:** It centralizes object creation, making tests easier to maintain and update when object structures change.
+- **Simplicity:** It simplifies test setup, allowing testers to focus more on the test logic than on object configuration.
+- **Controlled Randomness:** It allows creating realistic and varied test scenarios while maintaining control over the randomness.
+  This ensures tests cover a wide range of cases without introducing unpredictable failures.
+
+Example of an Object Mother class
+
+```javascript
+//tests/component/dataset/domain/models/DatasetMother.ts
+
+export class DatasetMother {
+  static create(props?: Partial<Dataset>): Dataset {
+    return {
+      id: faker.datatype.uuid(),
+      title: faker.lorem.words(3),
+      ...props
+    }
+  }
+}
+```
+
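In a test, the mother lets each case spell out only the fields it cares about. A self-contained sketch of the pattern, using an illustrative (not the real) `Dataset` shape:

```typescript
// Illustrative model and mother; the real Dataset model has many more fields.
interface Dataset {
  id: string
  title: string
}

class DatasetMother {
  static create(props?: Partial<Dataset>): Dataset {
    // Fixed defaults stand in for the randomized faker values used in the real mother.
    return {
      id: 'dataset-1',
      title: 'Default Dataset Title',
      ...props
    }
  }
}

// Only the field under test is made explicit; everything else comes from the mother.
const dataset = DatasetMother.create({ title: 'My Custom Title' })
```

The overrides keep each test focused on the one value it asserts on, while defaults cover the rest of the object.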
+ +#### **Test Data for integration and e2e tests** + +We use some helper classes to create test data for the integration and e2e tests. These classes are located in the +`tests/e2e-integration/shared` folder. + +The test data is created using [axios](https://www.npmjs.com/package/axios), which allows us to make requests to the +Dataverse API and create test data for the integration and e2e tests. + +
+Example of a Helper class to create test data
+
+```javascript
+//tests/e2e-integration/shared/datasets/DatasetHelper.ts
+export class DatasetHelper extends DataverseApiHelper {
+  static async create(): Promise<DatasetResponse> {
+    return this.request<DatasetResponse>(`/dataverses/root/datasets`, 'POST', newDatasetData)
+  }
+}
+```
+
+ +#### **Test Organization** + +We organize tests into suites, grouping them by feature or user workflow. This makes it easier to find and run tests. + +#### **Test Isolation** + +We aim to keep tests as isolated as possible, avoiding dependencies between tests and ensuring that each test can run +independently. + +### Continuous Integration (CI) + +We run tests on every pull request and merge to ensure that the application is always stable and functional. You can +find the CI workflow in the `.github/workflows/test.yml` file. + +CI checks include: + +- **Unit Tests:** We run all unit tests to ensure that the application's components work as expected. +- **Integration Tests:** We run integration tests to ensure that the application communicates correctly with external + systems. +- **E2E Tests:** We run e2e tests to ensure that the application's behavior is correct from the user's perspective. +- **Accessibility Tests:** We run checks to ensure that the application is accessible and that it meets the highest standards + for accessibility compliance. +- **Code Coverage:** We check the test coverage to ensure that the application is well-tested and that new code is + covered by tests. + +### Test coverage + +We aim for high test coverage, especially in critical areas of the application, like user workflows or complex components. +However, we prioritize user-centric testing over coverage numbers. + +- **Coverage Threshold:** We aim for a test coverage of 95% for the unit tests. This threshold is set in the `.nycrc.json` file. +- **Coverage Reports:** We use [nyc](https://www.npmjs.com/package/nyc) to generate coverage reports, which are available + in the `coverage` folder after running the tests. These reports are also published to [Coveralls](https://coveralls.io/github/IQSS/dataverse-frontend?branch=develop) + with every pull request and merge. The coverage badge is displayed at the top of the README. 
+- **Tests included in the coverage:** We include all unit tests in the coverage report.
+- **Pre-commit hook:** We use [pre-commit](https://www.npmjs.com/package/pre-commit) to run the unit tests before every commit,
+  ensuring that no code is committed without passing the tests. It also runs the coverage checks to ensure that the coverage
+  threshold is met.
+
+#### How to run the code coverage
+
+To generate the code coverage, you first need to run the tests with the `test:unit` script. After running the tests, you
+can check the coverage with the `test:coverage` script.
+
+If you want to browse the coverage report, open the `coverage/lcov-report/index.html` file in your browser.
+
+```bash
+# root project directory
+
+# Run the unit tests and generate the coverage report
+npm run test:unit
+
+# Check the coverage
+npm run test:coverage
+```
+
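The pre-commit hook mentioned above is configured through a `pre-commit` entry in `package.json`. A sketch of what such a configuration might look like (the script names here are assumptions, not the project's exact scripts):

```json
{
  "scripts": {
    "lint": "eslint .",
    "test:unit": "cypress run --component"
  },
  "pre-commit": ["lint", "test:unit"]
}
```

Each entry in the `pre-commit` array names an npm script that must succeed before the commit is created.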

(back to top)

+
+
+## Deployment
+
+Once the site is built through the `npm run build` command, it can be deployed in different ways to different types of
+infrastructure, depending on the needs of the installation.
+
+We are working to provide different preconfigured automated deployment options, seeking to support common use cases
+today for installing applications of this nature.
+
+The current automated deployment options are available within the GitHub `deploy` workflow, which is designed to be run
+manually from GitHub Actions. Both the deployment option and the target environment are selected via dropdown menus.
+
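As a sketch, such a manually-triggered workflow would declare its dropdowns as `workflow_dispatch` inputs, roughly like the following (illustrative only; the input names in the actual `deploy` workflow may differ):

```yaml
on:
  workflow_dispatch:
    inputs:
      deployment_option:
        description: 'Deployment infrastructure to target'
        type: choice
        options:
          - aws-s3
          - payara
      environment:
        description: 'GitHub environment holding the deployment secrets'
        type: environment
```

The `choice` input renders as a dropdown in the GitHub Actions UI, and the `environment` input lists the repository's configured environments.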
+Examples for AWS and Payara
+
+#### Deployment with AWS S3
+
+This option will build and deploy the application to a remote S3 bucket.
+
+For this workflow to work, a GitHub environment must be configured with the following environment secrets:
+
+- `AWS_ACCESS_KEY_ID`
+- `AWS_SECRET_ACCESS_KEY`
+- `AWS_S3_BUCKET_NAME`
+- `AWS_DEFAULT_REGION`
+
+Note that for the deployment to the S3 bucket to succeed, you must make the following changes to the bucket via the S3
+web interface (or equivalent changes using aws-cli or similar tools):
+
+- Under _`Permissions`_ ⏵ _`Permissions Overview`_ ⏵ _`Block public access (bucket settings)`_ ⏵ click _`Edit`_, then
+  **uncheck** _`Block all public access`_ and save.
+- Under _`Properties`_ ⏵ _`Static website hosting`_ ⏵ click _`Edit`_ and enable it. Change _`Index document`_ and
+  _`Error document`_ to `index.html`.
+- Under _`Bucket Policy`_, click _`Edit`_ and paste the following policy (changing `<BUCKET_NAME>` to your bucket name)
+  and save.
+
+```json
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Sid": "PublicReadGetObject",
+      "Principal": "*",
+      "Effect": "Allow",
+      "Action": ["s3:GetObject"],
+      "Resource": ["arn:aws:s3:::<BUCKET_NAME>/*"]
+    }
+  ]
+}
+```
+
+You should see the deployed app at `http://[BUCKET-NAME].s3-website-[REGION].amazonaws.com`, for example:
+`http://my-dataverse-bucket.s3-website-us-east-1.amazonaws.com/`
+
+#### Deployment with Payara
+
+This option will build and deploy the application to a remote Payara server.
+
+This option is intended for an "all-in-one" solution, where the Dataverse backend application and the frontend
+application run on the same Payara server.
+ +For this workflow to work, a GitHub environment must be configured with the following environment secrets: + +- `PAYARA_INSTANCE_HOST` +- `PAYARA_INSTANCE_USERNAME` +- `PAYARA_INSTANCE_SSH_PRIVATE_KEY` + +It is important that the remote instance is correctly pre-configured, with the Payara server running, and a service +account for Dataverse with the corresponding SSH key pair established. + +A base path for the frontend application can be established on the remote server by setting the corresponding field in +the workflow inputs. This mechanism prevents conflicts between the frontend application and any pre-existing deployed +application running on Payara, which can potentially be a Dataverse backend. This way, only the routes with the base +path included will redirect to the frontend application. + +
+ +

(back to top)

+
+
+## Publishing the Design System
+
+The Design System is published to the npm Package Registry. To publish a new version, follow these steps:
+
+1. **Update the version**
+
+   Update the version by running the lerna command:
+
+   ```shell
+   lerna version --no-push
+   ```
+
+   This command will ask you for the new version, update the `package.json` files, and create a new commit with the changes.
+
+2. **Review the auto-generated CHANGELOG.md**
+
+   The lerna command will generate a new `CHANGELOG.md` file with the changes for the new version. Review the changes and make sure that the file is correct.
+
+   If it looks good, you can push the changes to the repository.
+
+   ```shell
+   git push && git push --tags
+   ```
+
+   Optional:
+
+   If you need to make any changes to the `CHANGELOG.md` file, you can do it manually.
+
+   After manually updating the `CHANGELOG.md` file, you can commit the changes.
+
+   ```shell
+   git add .
+   git commit --amend --no-edit
+   git push --force && git push --tags --force
+   ```
+
+   This command will amend the lerna commit and push the changes to the repository.
+
+3. **Review the new tag in GitHub**
+
+   After pushing the changes, you can review the new tag in the [GitHub repository](https://github.com/IQSS/dataverse-frontend/tags).
+
+   The tag should be created with the new version.
+
+4. **Publish the package**
+
+   After the version is updated, you can publish the package by running the lerna command:
+
+   ```shell
+   lerna publish from-package
+   ```
+
+   This command will publish the package to the npm registry.
+
+   Remember that you need a valid npm token to publish the packages.
+
+   Get a new token from the npm website and update the `.npmrc` file with the new token.
+
+   Open the `.npmrc` file and replace `YOUR_NPM_TOKEN` with your actual npm token.
+
+   ```plaintext
+   legacy-peer-deps=true
+
+   //npm.pkg.github.com/:_authToken=YOUR_NPM_TOKEN
+   @iqss:registry=https://npm.pkg.github.com/
+   ```
+
+5. **Review the new version in the npm registry**
+
+   After publishing the packages, you can review the new version in the [npm registry](https://www.npmjs.com/package/@iqss/dataverse-design-system?activeTab=versions).
+
+   The new version should be available in the npm registry.
+

(back to top)

+
+ + + + + + + +[dv_repo_dvclientjs_url]: https://github.com/IQSS/dataverse-client-javascript/pkgs/npm/dataverse-client-javascript +[dv_repo_dvclientjs_npm_url]: https://www.npmjs.com/package/js-dataverse +[dv_repo_dvsampledata_url]: https://github.com/IQSS/dataverse-sample-data + + + + +[dv_app_localhost_build_url]: http://localhost:5173 +[dv_app_localhost_storybook_url]: http://localhost:6006/ +[dv_app_localhost_designsystem_url]: http://localhost:6007/ +[dv_app_localhost_spa_url]: http://localhost:8000/spa +[dv_app_localhost_legacy_url]: http://localhost:8000/ + + + +[dv_app_docker_image_url]: https://hub.docker.com/r/gdcc/dataverse/tags +[Github Container Registry]: https://github.com/orgs/gdcc/packages/container/package/dataverse + + + + +[dv_docs_github_userauthtoken_url]: https://docs.github.com/en/apps/creating-github-apps/authenticating-with-a-github-app/generating-a-user-access-token-for-a-github-app +[dv_docs_github_token_url]: https://github.com/settings/tokens +[TypeScript Deep Dive Style Guide]: https://basarat.gitbook.io/typescript/styleguide + + + + +[_uses_docker_url]: https://www.docker.com/products/docker-desktop/ +[_uses_repo_cra_error_url]: https://github.com/facebook/create-react-app/issues/11174 + + + +[_uses_lib_grep_url]: https://www.npmjs.com/package/@cypress/grep + + + + + + +[_shield_docker]: https://img.shields.io/badge/Docker-2496ED?style=for-the-badge&logo=docker&logoColor=white + + + diff --git a/README.md b/README.md index 37315fc22..8e2740317 100644 --- a/README.md +++ b/README.md @@ -1,374 +1,419 @@ -# Dataverse Frontend - -[![Project Status: WIP – Initial development is in progress, but there has not yet been a stable, usable release suitable for the public.](https://www.repostatus.org/badges/latest/wip.svg)](https://www.repostatus.org/#wip) -[![Tests](https://github.com/IQSS/dataverse-frontend/actions/workflows/test.yml/badge.svg)](https://github.com/IQSS/dataverse-frontend/actions/workflows/test.yml) 
-[![Accessibility](https://github.com/IQSS/dataverse-frontend/actions/workflows/accessibility.yml/badge.svg)](https://github.com/IQSS/dataverse-frontend/actions/workflows/accessibility.yml) -[![Unit Tests Coverage](https://coveralls.io/repos/github/IQSS/dataverse-frontend/badge.svg?branch=develop)](https://coveralls.io/github/IQSS/dataverse-frontend?branch=develop) - -## Demo videos +[![CI][_shield_projectstatus]][dv_repo_projectstatus_url] +[![CI][_shield_contributors]][dv_repo_contributors_url] +[![CI][_shield_stargazers]][dv_repo_stargazers_url] +[![CI][_shield_coveralls]][dv_repo_coveralls_url] +[![CI][_shield_workflow]][dv_repo_workflow_url] +[![CI][_shield_accessibility]][dv_repo_accessibility_url] +[![CI][_shield_issues]][dv_repo_issues_url] +[![CI][_shield_forks]][dv_repo_forks_url] + + + + + + + +
+
+ + Logo + +

Dataverse Frontend

+

+ A New Way To Create and View Datasets & Collections! +
+ Explore the Docs » +
+
+ Website | + View Demo (BETA) | + Report Bug | + Request Feature +

+
+

Progress Demo Videos

+

+ Dataset Overview Page (Aug. '23) | + Dataset Files Table (Dec. '23) +

+

Chat with us!

+
+ +[![Zulip][_shield_zulip]][dv_community_zulip_url] +[![GoogleDevsGrp][_shield_googledevs]][dv_community_google_devs_url] +[![GoogleUsersGrp][_shield_googleusers]][dv_community_google_users_url] + +--- + +### ⚠️ Important Information About the Dataverse Frontend ⚠️ + +> Dataverse Frontend is currently in beta and under active development. While it offers exciting new features, please note that it may not be stable for production use. We recommend sticking to the latest stable [Dataverse][dv_repo_legacyjsf_url] release for mission-critical applications. If you choose to use this repository in production, be prepared for potential bugs and breaking changes. Always check the official documentation and release notes for updates and proceed with caution. + +To stay up-to-date with all the latest changes, join the [Google Group][dv_community_google_users_url] + +--- + +## Table of Contents + +
    +
+1. About The Project
+2. Roadmap
+3. Contributing
+4. License
+5. Contact
+6. Acknowledgments
+ +--- + +## About the Project + +### What is Dataverse? + +The Dataverse Project is an open source web application to share, preserve, cite, explore, and analyze research data. It +facilitates making data available to others, and allows you to replicate others' work more easily. Researchers, journals, +data authors, publishers, data distributors, and affiliated institutions all receive academic credit and web visibility. +Read more on the [Dataverse Website][dv_docs_dataverse_url]. + + + +### What is Dataverse Frontend _& How is it different?_ + +The Dataverse Frontend repository is an initiative undertaken in 2023 to modernize the UI and design of the Dataverse +Project by creating a stand-alone interface to allow users and organizations to implement their own Dataverse installations +and utilize the JavaScript framework of their choice. + +**The goals of Dataverse Frontend:** + +- Modernize the application +- Separate the frontend and backend logic, transition away from Monolithic Architecture +- Reimagine the current Dataverse backend as a headless API-first instance. +- The Dataverse Frontend becomes a stand-alone SPA (Single Page Application) +- Modularize the UI to allow third-party extension of the base project +- Increase cadence of development, decrease time between release cycles to implement new features +- Introduce testing automation +- Give priority and transparency to coding and design to support Harvard University's commitment to ensuring the highest + standards for Accessibility Compliance +- Empower the community to create, contribute, and improve. 
+ +**New Features:** + +- Node Application using ReactJS for the project baseline +- Native localization support through the i18n library +- Accessibility compliant code built from the ground-up +- Improved modularity via Web Components +- Cypress testing automation +- Storybook for UI Component Library + +--- + +#### Demo videos - 2023-08-01: [View mode of the dataset page](https://groups.google.com/g/dataverse-community/c/cxZ3Bal_-uo/m/h3kh3iVNCwAJ) - 2023-12-13: [Files table on the dataset page](https://groups.google.com/g/dataverse-community/c/w_rEMddESYc/m/6F7QC1p-AgAJ) -## Getting Started - -First install node >=16 and npm >=8. Recommended versions `node v19` and `npm v9`. - -### Create a `.npmrc` file and add a token - -To install the [@iqss/dataverse-client-javascript](https://github.com/IQSS/dataverse-client-javascript/pkgs/npm/dataverse-client-javascript) -from the GitHub registry, necessary for connecting with the Dataverse API, follow these steps to create an `.npmrc` file in -the root of your project using your GitHub token. - -1. **Copy `.npmrc.example`** - - Duplicate the `.npmrc.example` file in your project and save it as `.npmrc`. - -2. **Replace the Token** - - Open the newly created `.npmrc` file and replace `YOUR_GITHUB_TOKEN` with your actual GitHub token. - - ```plaintext - legacy-peer-deps=true - - //npm.pkg.github.com/:_authToken= - @iqss:registry=https://npm.pkg.github.com/ - ``` - -#### How to Get a GitHub Token - -If you don't have a GitHub token yet, follow these steps: - -1. Go to your GitHub account settings. - -2. Navigate to "Developer settings" -> "Personal access tokens." - -3. Click "Personal access tokens" -> "Tokens (classic)" -> "Generate new token (classic)". - -4. Give the token a name and select the "read:packages" scope. - -5. Copy the generated token. - -6. Replace `YOUR_GITHUB_AUTH_TOKEN` in the `.npmrc` file with the copied token. - -Now, you should be able to install the Dataverse JavaScript client using npm. 
- -### `npm install` - -Run this command to install the dependencies. You may see a message about vulnerabilities after running this command. \ -Please check this announcement from Create React App repository https://github.com/facebook/create-react-app/issues/11174 . -These vulnerabilities won't be in the production build since they come from libraries only used during the development. - -### `cd packages/design-system && npm run build` - -Run this command to build the UI library. This is needed to be able to run the app. - -## Available Scripts - -In the project directory, you can run at any time: - -### `npm start` - -Runs the app in the development mode. -Open [http://localhost:5173](http://localhost:5173) to view it in your browser. - -The page will reload when you make changes. -You may also see any lint errors in the console. - -### `npm run test:unit` - -Launches the test runner for the unit tests in the interactive watch mode. -If you prefer to see the tests executing in cypress you can run `npm run cy:open-unit` -You can check the coverage with `npm run test:coverage` - -### `npm run build` - -Builds the app for production to the `dist` folder. - -### `npm run preview` - -Locally preview the production build. - -### `npm run test:e2e` - -Launches the Cypress test runner for the end-to-end tests. -If you prefer to see the tests executing in cypress you can run `npm run cy:open-e2e` - -### `npm run lint` - -Launches the linter. To automatically fix the errors run `npm run lint:fix` - -### `npm run format` - -Launches the prettier formatter. We recommend you to configure your IDE to run prettier on save. - -### `npm run storybook` - -Runs the Storybook in the development mode. - -There are 2 Storybook instances, one for the Design System and one for the Dataverse Frontend. - -Open [http://localhost:6006](http://localhost:6006) to view the Dataverse Frontend Storybook in your browser. 
-Open [http://localhost:6007](http://localhost:6007) to view the Design System Storybook in your browser. - -Note that both Storybook instances are also published to Chromatic (the Chromatic build contains -less dynamic content than the local Storybook): - -- [Dataverse Frontend](https://www.chromatic.com/builds?appId=646f68aa9beb01b35c599acd) -- [Dataverse Design System](https://www.chromatic.com/builds?appId=646fbe232a8d3b501a1943f3) - -### Cypress Testing Tool - -The project includes [@cypress/grep](https://www.npmjs.com/package/@cypress/grep) for running specific tests. -To run the tests, use --env grep="your test name" in the command line. - -To run a specific test multiple times, use --env burn=10 in the command line. -Running tests multilpe times is useful for detecting flaky tests. - -For example: - -`npx cypress run --spec tests/e2e-integration/e2e/sections/dataset/Dataset.spec.tsx --env grep="loads the restricted files when the user is logged in as owner",burn=10` - -## Local development environment - -A containerized environment, oriented to local development, is available to be run from the repository. - -This environment contains a dockerized instance of the Dataverse backend with its dependent services (database, mailserver, etc), as well as an npm development server running the SPA frontend (With code autoupdating). - -This environment is intended for locally testing any functionality that requires access to the Dataverse API from the SPA frontend. - -There is an Nginx reverse proxy container on top of the frontend and backend containers to avoid CORS issues while testing the application. - -### Run the environment - -Inside the `dev-env` folder, run the following command: - -``` -./run-env.sh -``` - -As the script argument, add the name of the Dataverse image tag you want to deploy. - -Note that the image tag in the docker registry must to be pre pushed, otherwise the script will fail. 
- -If you are running the script for the first time, it may take a while, since `npm install` has to install all the dependencies. This can also happen if you added new dependencies to package.json. - -Once the script has finished, you will be able to access Dataverse via: - -- [http://localhost:8000/spa](http://localhost:8000/spa): SPA Frontend -- [http://localhost:8000](http://localhost:8000): Dataverse Backend and JSF Frontend - -_Note: The Dataverse configbaker takes some time to start the application, so the application will not be accessible until the bootstrapping is complete._ - -If you want to add test data (collections and datasets) to the Dataverse instance, run the following command: - -``` -./add-env-data.sh -``` - -_Note: The above command uses the dataverse-sample-data repository whose scripts occasionally fail, so some of the test data may not be added_ - -### Remove the environment +### Beta Testing Environment -To clean up your environment of any running environment containers, as well as any associated data volumes, run this script inside the `dev-env` folder: +_Track our progress and compare it to the current Dataverse application!_ -``` -./rm-env -``` - -## Deployment - -Once the site is built through the `npm run build` command, it can be deployed in different ways to different types of infrastructure, depending on installation needs. - -We are working to provide different preconfigured automated deployment options, seeking to support common use cases today for installing applications of this nature. - -The current automated deployment options are available within the GitHub `deploy` workflow, which is designed to be run manually from GitHub Actions. The deployment option is selected via a dropdown menu, as well as the target environment. - -### AWS S3 Deployment - -This option will build and deploy the application to a remote S3 bucket. 
- -For this workflow to work, a GitHub environment must be configured with the following environment secrets: - -- AWS_ACCESS_KEY_ID -- AWS_SECRET_ACCESS_KEY -- AWS_S3_BUCKET_NAME -- AWS_DEFAULT_REGION - -Note that for the deployment to the S3 bucket to succeed, you must make the following changes to the bucket via the S3 web interface (or equivalent changes using aws-cli or similar tools): - -- Under "Permissions", "Permissions overview", "Block public access (bucket settings)", click "Edit", then uncheck "Block all public access" and save. -- Under "Properties", "Static website hosting", click "Edit" and enable it. Change "Index document" and "Error document" to "index.html". -- Under "Bucket policy", click "Edit" and paste the following policy (changing `` to your bucket name) and save. - -``` -{ - "Version": "2012-10-17", - "Statement": [ - { - "Sid": "PublicReadGetObject", - "Principal": "*", - "Effect": "Allow", - "Action": [ - "s3:GetObject" - ], - "Resource": [ - "arn:aws:s3:::/*" - ] - } - ] -} -``` - -You should see the deployed app at http://BUCKET-NAME.s3-website-REGION.amazonaws.com such as http://mybucket.s3-website-us-east-1.amazonaws.com - -### Payara Deployment - -This option will build and deploy the application to a remote Payara server. - -This option is intended for an "all-in-one" solution, where the Dataverse backend application and the frontend application run on the same Payara server. - -For this workflow to work, a GitHub environment must be configured with the following environment secrets: - -- PAYARA_INSTANCE_HOST -- PAYARA_INSTANCE_USERNAME -- PAYARA_INSTANCE_SSH_PRIVATE_KEY - -It is important that the remote instance is correctly pre-configured, with the Payara server running, and a service account for Dataverse with the corresponding SSH key pair established. - -A base path for the frontend application can be established on the remote server by setting the corresponding field in the workflow inputs. 
This mechanism prevents conflicts between the frontend application and any pre-existing deployed application running on Payara, which can potentially be a Dataverse backend. This way, only the routes with the base path included will redirect to the frontend application. - -#### Beta Testing Environment - -To make the SPA Frontend accesible and testable by people interested in the project, there is a remote beta testing environment that includes the latest changes developed both for the frontend application and the Dataverse backend application (develop branches). +To make the SPA Frontend accessible and testable by people interested in the project, there is a remote beta testing +environment that includes the latest changes developed both for the frontend application and the Dataverse backend +application (develop branches). This environment follows the "all-in-one" solution described above, where both applications coexist on a Payara server. -Environment updates are carried out automatically through GitHub actions, present both in this repository and in the Dataverse backend repository, which deploy the develop branches when any change is pushed to them. +Environment updates are carried out automatically through GitHub actions, present both in this repository and in the +Dataverse backend repository, which deploy the develop branches when any change is pushed to them. The environment is accessible through the following URLs: -- SPA: https://beta.dataverse.org/spa -- JSF: https://beta.dataverse.org - -## Changes from the original JSF application - -### Changes from the Style Guide - -The design system and frontend in this repo are inspired by the Dataverse Project [Style Guide](https://guides.dataverse.org/en/latest/style/index.html), but the following changes have been made, especially for accessibility. - -#### Links - -We added an underline to links to make them accessible. 
- -#### File label - -Now we are using Bootstrap with a theme, so there is only one definition for the secondary color. Since Bootstrap applies -the secondary color to the labels automatically, the color of the file label is now the global secondary color which is -a lighter shade of grey than what it used to be. - -We changed the citation block to be white with a colored border, to make the text in the box more accessible. - -#### Breadcrumbs - -We have introduced an update to the breadcrumb navigation UI. Unlike in the original JSF application, where breadcrumbs -did not reflect the user's current location within the site, our new SPA design now includes this feature in the breadcrumbs. -Additionally, we have aligned with best practices by positioning all breadcrumbs at the top, before anything else in the UI. - -This update gives users a clear indication of their current position within the application's hierarchy. - -### Changes in functionality behavior - -Our main goal is to replicate the behavior of the original JSF application in all its functionalities, although during development we have found opportunities to review certain behaviors and apply changes where we find appropriate. - -#### Dataset files tab search +- Dataverse Frontend SPA: [beta.dataverse.org/spa][dv_app_beta_spa_url] +- Dataverse JSF: [beta.dataverse.org][dv_app_beta_legacyjsf_url] + +### How Existing Dataverse Installations May Be Affected + +- The existing Dataverse API will be added to and extended from the present backend architecture while the existing UI + and current Dataverse functionalities are preserved. +- The SPA will continue its life as a separate application, supported on its own maintenance schedule. +- When the SPA has matured enough for an official release, we will switch to the new version and the [old backend][dv_repo_legacyjsf_url] + will be moved into maintenance mode with no new features being introduced and focusing only on critical bugfixes. + +
+ Changes from the original Dataverse JSF application
+
+> ### Changes From the Style Guide
+>
+> The design system and frontend in this repo are inspired by the Dataverse Project [Style Guide][dv_docs_styleguide_url],
+> but the following changes have been made, especially for accessibility.
+>
+> #### Links
+>
+> We added an underline to links to make them accessible.
+>
+> #### File Labels
+>
+> Now we are using Bootstrap with a theme, so there is only one definition for the secondary color. Since Bootstrap applies
+> the secondary color to the labels automatically, the color of the file label is now the global secondary color, which is
+> a lighter shade of grey than it used to be.
+>
+> We changed the citation block to be white with a colored border, to make the text in the box more accessible.
+>
+> #### Breadcrumbs
+>
+> We have introduced an update to the breadcrumb navigation UI. Unlike in the original JSF application, where breadcrumbs
+> did not reflect the user's current location within the site, our new SPA design now includes this feature in the breadcrumbs.
+> Additionally, we have aligned with best practices by positioning all breadcrumbs at the top, before anything else in the UI.
+>
+> This update gives users a clear indication of their current position within the application's hierarchy.
+>
+> ### Changes in Functionality & Behavior
+>
+> Our main goal is to replicate the behavior of the original JSF application in all its functionalities, although during
+> development we have found opportunities to review certain behaviors and apply changes where we find appropriate.
+>
+> #### Dataset Files Tab Search
+>
+> The original Dataset JSF page uses Solr to search for files based on the available filters. Past dataset versions are
+> not indexed in Solr, so the filter option is not available (hidden) for such versions. When a version is indexed, the
+> search text is searched in Solr, and Solr grammar can be applied. When the version is not indexed, the search text is
+> searched in the database.
+>
+> The new SPA does not use Solr, since the API endpoint it relies on performs all queries on the database. Filters and search
+> options are available for all versions in the same way, homogenizing behavior, although losing the possibility of
+> using the Solr grammar.
+>
+> This change is based on the assumption that Solr may not be required in the context of files tab
+> search, whose search facets are reduced compared to other in-application searches. Therefore, if we find evidence that
+> the assumption is incorrect, we will work on extending the search capabilities to support Solr.
+>
+> #### Dataverses/Datasets List
+>
+> The original JSF Dataverses/Datasets list on the home page uses normal paging buttons at the bottom of the list.
+> We have implemented infinite scrolling in this list, replacing the paging buttons, but the goal is to let users
+> toggle between normal paging and infinite scrolling via a setting or button.
+
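The base-path mechanism described at the top of this section (only routes that include the SPA base path reach the frontend; everything else still reaches the backend application on Payara) can be sketched as a tiny shell function. This is purely illustrative; the function name and example paths are hypothetical and not part of either application:

```shell
# Illustrative sketch of the base-path routing rule (hypothetical names):
# request paths that start with the /spa base path are handled by the
# frontend application; all other routes fall through to the backend.
route_request() {
  case "$1" in
    /spa|/spa/*) echo "frontend" ;;
    *)           echo "backend" ;;
  esac
}

route_request /spa/datasets # frontend
route_request /dataverse    # backend
```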
+
+
+(back to top)
+
+
+## Roadmap
+
+Interested in what's being developed currently? See the [open issues][dv_repo_issues_url] for a full list of proposed
+features (and known issues), and what we are working on in the [currently planned sprint][dv_repo_sprint_url].
+
+Keep an eye on [The Institute for Quantitative Social Science (IQSS) Dataverse Roadmap][hvd_iqss_roadmap_url] at
+Harvard University to get a look at upcoming initiatives for the project.
+
+
+(back to top)
+
+
+
+## Contributing
+
+We love PRs! Read the [Contributor Guidelines](.github/CONTRIBUTING.md) for more info. Any contributions you make are **greatly appreciated**.
+
+Got questions? Join the conversation on [Zulip][dv_community_zulip_url], or our Google Groups for
+[Developers][dv_community_google_devs_url] and [Users][dv_community_google_users_url]. Or attend community meetings
+hosted by the Global Dataverse Community Consortium to collaborate with the interest groups for
+[Frontend Development][dv_community_gdcc_ui_url] and [Containerization][dv_community_gdcc_containers_url],
+and learn and share with communities around the world!
+
+
+(back to top)
+
+
+
+## Acknowledgments
-The original Dataset JSF page uses Solr to search for files based on the available filters. Past dataset versions are not indexed in Solr, so the filter option is not available (hidden) for such versions. When a version is indexed, the search text is searched in Solr, and Solr grammar can be applied. When the version is not indexed, the search text is searched in the database.
-
-The new SPA does not use Solr as the API endpoint it uses performs all queries on the database. Filters and search options are available for all versions in the same way, homogenizing behavior, although losing the possibility of using the Solr grammar.
-
-The decision of this change is made on the assumption that Solr may not be required in the context of files tab search, whose search facets are reduced compared to other in-application searches. Therefore, if we find evidence that the assumption is incorrect, we will work on extending the search capabilities to support Solr.
-
-#### Dataverses/Datasets list
-
-The original JSF Dataverses/Datasets list on the home page uses normal paging buttons at the bottom of the list.
-We have implemented infinite scrolling in this list, replacing the normal paging buttons, but the goal would be to be able to toggle between normal paging and infinite scrolling via a toggle setting or button.
-
-## Publishing the Design System
-
-The Design System is published to the npm Package Registry. To publish a new version, follow these steps:
-
-1. **Update the version**
-
-   Update the version running the lerna command:
-
-   ```shell
-   lerna version --no-push
-   ```
-
-   This command will ask you for the new version and will update the `package.json` files and create a new commit with the changes.
-
-2. **Review the auto generated CHANGELOG.md**
-
-   The lerna command will generate a new `CHANGELOG.md` file with the changes for the new version. Review the changes and make sure that the file is correct.
-
-   If it looks good, you can push the changes to the repository.
-
-   ```shell
-   git push && git push --tags
-   ```
-
-   Optional:
-
-   If you need to make any changes to the `CHANGELOG.md` file, you can do it manually.
-
-   After manually updating the `CHANGELOG.md` file, you can commit the changes.
-
-   ```shell
-   git add .
-   git commit --amend --no-edit
-   git push --force && git push --tags --force
-   ```
-
-   This command will amend the lerna commit and push the changes to the repository.
-
-3. **Review the new tag in GitHub**
-
-   After pushing the changes, you can review the new tag in the [GitHub repository](https://github.com/IQSS/dataverse-frontend/tags).
-
-   The tag should be created with the new version.
-
-4. **Publish the package**
-
-   After the version is updated, you can publish the package running the lerna command:
-
-   ```shell
-   lerna publish from-package
-   ```
-
-   This command will publish the package to the npm registry.
-
-   Remember that you need a valid npm token to publish the packages.
-
-   Get a new token from the npm website and update the `.npmrc` file with the new token.
-
-   Open the `.npmrc` file and replace `YOUR_NPM_TOKEN` with your actual npm token.
-
-   ```plaintext
-   legacy-peer-deps=true
-
-   //npm.pkg.github.com/:_authToken=YOUR_NPM_TOKEN
-   @iqss:registry=https://npm.pkg.github.com/
-   ```
-
-5. **Review the new version in the npm registry**
+Chromatic
-   After publishing the packages, you can review the new version in the [npm registry](https://www.npmjs.com/package/@iqss/dataverse-design-system?activeTab=versions).
+Thanks to [Chromatic][_uses_chromatic_url] for providing the visual testing platform that helps us review UI changes
+and catch visual regressions.
+
+
+(back to top)
+
+
+
+
+---
+
+### Built With
-   The new version should be available in the npm registry.
+- [![ReactJS][_shield_reactjs]][_uses_reactjs_url]
+- [![NodeJS][_shield_nodejs]][_uses_nodejs_url]
+- [![TypeScript][_shield_typescript]][_uses_typescript_url]
+- [![Bootstrap][_shield_bootstrap]][_uses_bootstrap_url]
+- [![Cypress][_shield_cypress]][_uses_cypress_url]
+- [![TestingLibrary][_shield_testinglibrary]][_uses_testinglibrary_url]
+- [![Storybook][_shield_storybook]][_uses_storybook_url]
+- [![AmazonS3][_shield_amazons3]][_uses_aws3_url]
+- [![Docker][_shield_docker]][_uses_docker_url]
+
+
+(back to top)
+
+
+
+
+---
+
+## License
+
+Distributed under the Apache License, Version 2.0. See [LICENSE](LICENSE) for more information.
+
+
+(back to top)
+
+
+
+
+[dv_repo_url]: https://github.com/IQSS/dataverse-frontend
+[dv_repo_issues_url]: https://github.com/IQSS/dataverse-frontend/issues
+[dv_repo_sprint_url]: https://github.com/orgs/IQSS/projects/34/views/23
+[dv_repo_contributors_url]: https://github.com/IQSS/dataverse-frontend/graphs/contributors
+[dv_repo_stargazers_url]: https://github.com/IQSS/dataverse-frontend/stargazers
+[dv_repo_coveralls_url]: https://coveralls.io/github/IQSS/dataverse-frontend?branch=develop
+[dv_repo_workflow_url]: https://github.com/IQSS/dataverse-frontend/actions
+[dv_repo_accessibility_url]: https://github.com/IQSS/dataverse-frontend/actions/workflows/accessibility.yml
+[dv_repo_forks_url]: https://github.com/IQSS/dataverse-frontend/forks
+[dv_repo_tag_url]: https://github.com/IQSS/dataverse-frontend/tags
+[dv_repo_projectstatus_url]: https://www.repostatus.org/#wip
+[dv_repo_releases_url]: https://github.com/IQSS/dataverse-frontend/releases
+
+[dv_repo_dvclientjs_url]: https://github.com/IQSS/dataverse-client-javascript/pkgs/npm/dataverse-client-javascript
+[dv_repo_dvclientjs_npm_url]: https://www.npmjs.com/package/js-dataverse
+[dv_repo_dvsampledata_url]: https://github.com/IQSS/dataverse-sample-data
+[dv_repo_legacyjsf_url]: https://github.com/IQSS/dataverse/
+[dv_repo_legacyjsf_releases_url]: https://github.com/IQSS/dataverse/releases
+[dv_repo_legacyjsf_issues_url]: https://github.com/IQSS/dataverse/issues
+[dv_repo_vscode_url]: https://github.com/IQSS/vscode-settings
+
+[dv_app_beta_spa_url]: https://beta.dataverse.org/spa
+[dv_app_beta_legacyjsf_url]: https://beta.dataverse.org
+[dv_app_legacyjsf_demo_url]: https://demo.dataverse.org/
+[dv_app_localhost_build_url]: http://localhost:5173
+[dv_app_localhost_storybook_url]: http://localhost:6006/
+[dv_app_localhost_designsystem_url]: http://localhost:6007/
+[dv_app_localhost_spa_url]: http://localhost:8000/spa
+[dv_app_localhost_legacy_url]: http://localhost:8000/
+
+[dv_app_docker_image_url]: https://hub.docker.com/r/gdcc/dataverse/tags
+
+[dv_community_gdcc_url]: https://www.gdcc.io/
+[dv_community_gdcc_ui_url]: https://ui.gdcc.io/
+[dv_community_gdcc_containers_url]: https://ct.gdcc.io/
+[dv_community_google_devs_url]: https://groups.google.com/g/dataverse-dev
+[dv_community_google_users_url]: https://groups.google.com/g/dataverse-community
+[dv_community_zulip_url]: https://dataverse.zulipchat.com/#narrow/stream/410361-ui-dev
+
+[hvd_iqss_url]: https://www.iq.harvard.edu/
+[hvd_iqss_roadmap_url]: https://www.iq.harvard.edu/roadmap-dataverse-project
+[hvd_legacyjsf_url]: https://dataverse.harvard.edu/
+
+[dv_docs_dataverse_url]: https://dataverse.org/
+[dv_docs_about_url]: https://dataverse.org/about
+[dv_docs_styleguide_url]: https://guides.dataverse.org/en/latest/style/index.html
+[dv_docs_api_url]: http://guides.dataverse.org/en/latest/api/index.html
+[dv_docs_devs_url]: https://guides.dataverse.org/en/latest/developers/index.html
+[dv_docs_github_userauthtoken_url]: https://docs.github.com/en/apps/creating-github-apps/authenticating-with-a-github-app/generating-a-user-access-token-for-a-github-app
+[dv_docs_github_token_url]: https://github.com/settings/tokens
+
+[_uses_reactjs_url]: https://reactjs.org/
+[_uses_nodejs_url]: https://nodejs.org/
+[_uses_typescript_url]: https://typescriptlang.org/
+[_uses_bootstrap_url]: https://getbootstrap.com
+[_uses_cypress_url]: https://cypress.io/
+[_uses_testinglibrary_url]: https://testing-library.com/docs/react-testing-library/intro/
+[_uses_storybook_url]: https://storybook.js.org/
+[_uses_payara_url]: https://www.payara.fish/
+[_uses_docker_url]: https://www.docker.com/products/docker-desktop/
+[_uses_aws3_url]: https://aws.amazon.com/
+[_uses_chromatic_url]: https://www.chromatic.com/
+[_uses_repo_cra_error_url]: https://github.com/facebook/create-react-app/issues/11174
+[_uses_tool_chromatic_url]: https://www.chromatic.com/builds?appId=646f68aa9beb01b35c599acd
+
+[_uses_lib_grep_url]: https://www.npmjs.com/package/@cypress/grep
+
+[_shield_reactjs]: https://img.shields.io/badge/React-20232A?style=for-the-badge&logo=react&logoColor=61DAFB
+[_shield_nodejs]: https://img.shields.io/badge/node.js-000000?style=for-the-badge&logo=nodedotjs&logoColor=white
+[_shield_typescript]: https://img.shields.io/badge/TypeScript-3178C6?style=for-the-badge&logo=typescript&logoColor=white
+[_shield_bootstrap]: https://img.shields.io/badge/Bootstrap-563D7C?style=for-the-badge&logo=bootstrap&logoColor=white
+[_shield_cypress]: https://img.shields.io/badge/Cypress-69D3A7?style=for-the-badge&logo=cypress&logoColor=black
+[_shield_testinglibrary]: https://img.shields.io/badge/TestingLibrary-E33332?style=for-the-badge&logo=testinglibrary&logoColor=white
+[_shield_storybook]: https://img.shields.io/badge/Storybook-FF4785?style=for-the-badge&logo=storybook&logoColor=white
+[_shield_docker]: https://img.shields.io/badge/Docker-2496ED?style=for-the-badge&logo=docker&logoColor=white
+[_shield_amazons3]: https://img.shields.io/badge/AmazonS3-569A31?style=for-the-badge&logo=amazons3&logoColor=white
+[_shield_zulip]: https://img.shields.io/badge/zulip-chat?style=for-the-badge&logo=zulip&logoColor=%236492FE
+[_shield_googledevs]: https://img.shields.io/badge/Developer_Group-white?style=for-the-badge&logo=google
+[_shield_googleusers]: https://img.shields.io/badge/User_Group-white?style=for-the-badge&logo=google
+
+[_shield_projectstatus]: https://img.shields.io/badge/repo_status-WIP-yellow?style=for-the-badge
+[_shield_contributors]: https://img.shields.io/github/contributors/IQSS/dataverse-frontend?branch=develop&style=for-the-badge
+[_shield_stargazers]: https://img.shields.io/github/stars/iqss/dataverse-frontend?style=for-the-badge
+[_shield_coveralls]: https://img.shields.io/coverallsCoverage/github/IQSS/dataverse-frontend?branch=develop&style=for-the-badge
+
+[_shield_workflow]: https://img.shields.io/github/actions/workflow/status/IQSS/dataverse-frontend/test.yml?branch=develop&style=for-the-badge
+[_shield_accessibility]: https://img.shields.io/github/actions/workflow/status/IQSS/dataverse-frontend/accessibility.yml?branch=develop&style=for-the-badge&label=Accessibility
+[_shield_issues]: https://img.shields.io/github/issues/IQSS/dataverse-frontend?style=for-the-badge
+[_shield_forks]: https://img.shields.io/github/forks/IQSS/dataverse-frontend?style=for-the-badge
+[_shield_tag]: https://img.shields.io/github/v/tag/iqss/dataverse-frontend?style=for-the-badge
+
+[_img_dv_logo_withbackground]: https://github.com/IQSS/dataverse-frontend/assets/7512607/6986476f-39ba-46a4-9be0-f05cd8e92244
+[_img_dv_logo_nobackground]: https://github.com/IQSS/dataverse-frontend/assets/7512607/6c4d79e4-7be5-4102-88bd-dfa167dc79d3
+[_img_screenshot]: images/screenshot.png
+
+[_video_demo_datasetpage_url]: https://groups.google.com/g/dataverse-community/c/cxZ3Bal_-uo/m/h3kh3iVNCwAJ
-## Thanks
+
-Chromatic
+[_video_demo_filetable_url]: https://groups.google.com/g/dataverse-community/c/w_rEMddESYc/m/6F7QC1p-AgAJ
-Thanks to [Chromatic](https://www.chromatic.com/) for providing the visual testing platform that helps us review UI changes and catch visual regressions.
+