
Prevent user sign up on LOD imports #11827

Conversation

@rtibbles (Member) commented on Feb 1, 2024

Summary

  • Adds a full facility import flag to the dataset API response
  • Uses that flag to determine whether sign-up should be offered in the frontend
  • Adds a permission check in the SignupViewset to prevent sign-ups under the same condition at the API level (a sketch of such a check follows below)
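
A minimal sketch of what an API-level guard like this can look like, assuming Django REST Framework; the helper and class names below are illustrative, not Kolibri's actual identifiers:

    from rest_framework import permissions


    def facility_allows_signup():
        """Hypothetical placeholder for the full-facility-import flag check."""
        # In the real code this would consult the dataset API / device state.
        return False  # pretend this device only holds a learner-only (LOD) import


    class NoSignupOnLearnerOnlyImport(permissions.BasePermission):
        """Deny account creation when the facility was imported as learner-only."""

        message = "Sign-up is disabled on learner-only device imports."

        def has_permission(self, request, view):
            return facility_allows_signup()

Adding a permission like this to the viewset's permission_classes means the API rejects the request even if a stale frontend still renders the sign-up form.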

References

Fixes #11817

Reviewer guidance

Do a learner-only import from a facility that allows learners to sign up.
Ensure that you cannot create an account.


Testing checklist

  • Contributor has fully tested the PR manually
  • If there are any front-end changes, before/after screenshots are included
  • Critical user journeys are covered by Gherkin stories
  • Critical and brittle code paths are covered by unit tests

PR process

  • PR has the correct target branch and milestone
  • PR has 'needs review' or 'work-in-progress' label
  • If PR is ready for review, a reviewer has been added. (Don't use 'Assignees')
  • If this is an important user-facing change, PR or related issue has a 'changelog' label
  • If this includes an internal dependency change, a link to the diff is provided

Reviewer checklist

  • Automated test coverage is satisfactory
  • PR is fully functional
  • PR has been tested for accessibility regressions
  • External dependency files were updated if necessary (yarn and pip)
  • Documentation is updated
  • Contributor is in AUTHORS.md

@github-actions bot added the DEV: backend, APP: User, DEV: frontend, and SIZE: small labels on Feb 1, 2024
@pcenov self-requested a review on February 2, 2024 at 15:16
@pcenov (Member) commented on Feb 2, 2024

LGTM - learners on a LOD can no longer sign up. Tested on Android and Ubuntu with imported, created, and migrated learners, plus regression testing.

@radinamatic (Member) left a comment

Manual QA passes, good to go! 💯 :shipit: 👏🏽

@jredrejo (Member) left a comment

Code looks good. I've left a couple of notes as a reminder to myself of things to do in future Kolibri versions.

@@ -19,6 +19,7 @@
 from kolibri.core.content.constants.schema_versions import MIN_CONTENT_SCHEMA_VERSION
 from kolibri.utils.android import ANDROID_PLATFORM_SYSTEM_VALUE
 from kolibri.utils.android import on_android
+from kolibri.utils.lru_cache import lru_cache

let's remember to remove this file in 0.17, after dropping Python 2.7 support

@@ -486,6 +487,7 @@ def get_device_info(version=DEVICE_INFO_VERSION):
     return info


+@lru_cache()

Using lru_cache without a max_size does not use LRU capabilities; it's a normal cache with no memory limit. This is not a problem in this case, because not many dataset_ids will be used to call this function. Maybe using a @cache decorator would result in clearer code for future devs.
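
For illustration (standard-library functools shown here, not Kolibri's backport; expensive_lookup is a stand-in), maxsize is what decides whether least-recently-used entries are ever evicted:

    from functools import lru_cache


    def expensive_lookup(dataset_id):
        """Stand-in for a costly database query."""
        return {"dataset_id": dataset_id}


    @lru_cache(maxsize=128)
    def bounded(dataset_id):
        # Bounded cache: least-recently-used entries are evicted past 128 entries.
        return expensive_lookup(dataset_id)


    @lru_cache(maxsize=None)
    def unbounded(dataset_id):
        # Unbounded cache: plain memoization, nothing is ever evicted.
        return expensive_lookup(dataset_id)

On Python 3.9+, functools.cache is an alias for lru_cache(maxsize=None), which makes the no-eviction intent explicit.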

@rtibbles (Member, Author) replied:

Ah, I was misled by our backport, which has a default max_size parameter: https://github.com/learningequality/kolibri/blob/develop/kolibri/utils/lru_cache.py#L54 - we should ensure other uses of lru_cache in the codebase add a max_size parameter as well, as none of them do.

@rtibbles (Member, Author) added:

Looks like the functools version has a default maxsize too? https://docs.python.org/3/library/functools.html#functools.lru_cache
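
For reference, the standard-library default is indeed maxsize=128, so a bare decorator still gives a bounded LRU cache (get_info below is just an illustrative name):

    from functools import lru_cache


    @lru_cache()  # no argument: functools defaults to maxsize=128
    def get_info(dataset_id):
        return {"dataset_id": dataset_id}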

@rtibbles merged commit be24516 into learningequality:release-v0.16.x on Feb 2, 2024
34 checks passed
@rtibbles deleted the lod_or_not_lod_that_is_the_question branch on February 2, 2024 at 18:54