Abstracting permissions frameworks #1

Draft · wants to merge 31 commits into base: bug/coral_dockerfile_for_arches_7.5
Commits (31)
4f05067
centralize permissions imports
philtweir Jun 24, 2023
fea39e6
resolve circular imports
philtweir Jun 24, 2023
efe7807
initial attempt to centralize permissions
philtweir Jun 24, 2023
612457c
make permissions framework settable
philtweir Jun 24, 2023
00dbfba
tidy missing methods
philtweir Jun 27, 2023
b7f5b5c
minor permission framework bugfixes
philtweir Sep 23, 2023
2c39bd6
fix base-manager.htm
philtweir Sep 23, 2023
ba7e91f
correct signature for process_new_user
philtweir Oct 2, 2023
9497a83
trial casbin dependencies
philtweir Oct 1, 2023
7fac9ea
basic functioning casbin alignment with user/group/perm
philtweir Oct 2, 2023
56c5b8c
add user accounts and remove dependence on casbin, allowing projects …
philtweir Nov 18, 2023
108eed5
bump python and node versions to allow ubuntu upgrade, and restore po…
philtweir Nov 18, 2023
7de6525
remove migration merge
philtweir Nov 19, 2023
c815a2d
FIXME: need to work out why this is not normally required, but consis…
philtweir Nov 19, 2023
9048f79
accept sequences of CSVs in business data
philtweir Nov 19, 2023
c378e04
remove extra slash from urls.py
philtweir Nov 20, 2023
efe85dd
remove debug
philtweir Nov 21, 2023
f411912
allow not to include business data in package load (useful for webpac…
philtweir Nov 28, 2023
cfc3c46
implement dummy hooks for standard model for user/group/permission up…
philtweir Dec 7, 2023
7729612
remove request from internal search routines and allow updating by query
philtweir Dec 10, 2023
86802fd
add support for filtering by sets in ES
philtweir Dec 17, 2023
117bef0
enable async ES tasks
philtweir Dec 17, 2023
9e2a976
enable users to see the usual nodes
philtweir Dec 19, 2023
789a3da
fix issues with request passing to datatypes
philtweir Dec 21, 2023
150abf3
add a permissions framework that is a blanket allow for users with an…
philtweir Dec 23, 2023
b2163ae
speed up permission checking
philtweir Dec 27, 2023
310b5dd
user_is_resource_reviewer no longer throws attributeerror is user is …
philtweir Dec 27, 2023
f319e36
ensure recreating database recreates common connection
philtweir Dec 28, 2023
579d9d1
corrections to get_permitted_graphids
philtweir Dec 28, 2023
98f89e7
include concept of a principal user
philtweir Jan 22, 2024
65c9622
fix permissions return type bugs
philtweir Jan 22, 2024
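The commit series ("centralize permissions imports", "make permissions framework settable", "add a permissions framework that is a blanket allow…") implies a pluggable framework resolved from configuration. A minimal sketch of how such resolution could work; the registry, setting name, and dotted paths here are illustrative assumptions, not Arches' actual API:

```python
from importlib import import_module

# Hypothetical default and registry of dotted paths to framework classes.
# "builtins.object" is a placeholder target so the sketch is self-contained.
DEFAULT_FRAMEWORK = "arches_default"

_REGISTRY = {
    "arches_default": "builtins.object",
}

def resolve_framework(name=None):
    """Return the permissions-framework class registered under `name`,
    falling back to the configured default."""
    dotted = _REGISTRY[name or DEFAULT_FRAMEWORK]
    module_path, _, attr = dotted.rpartition(".")
    return getattr(import_module(module_path), attr)
```

A project would then swap frameworks by pointing the registry entry (or a settings value) at its own class, which is the shape the "settable" commit suggests.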
29 changes: 16 additions & 13 deletions Dockerfile
@@ -1,4 +1,4 @@
FROM ubuntu:18.04 as base
FROM ubuntu:22.04 as base
USER root

## Setting default environment variables
@@ -30,17 +30,17 @@ RUN set -ex \
docbook-mathml \
libgdal-dev \
libpq-dev \
python3.8 \
python3.8-dev \
python3.10 \
python3.10-dev \
curl \
python3.8-distutils \
python3.10-distutils \
libldap2-dev libsasl2-dev ldap-utils \
dos2unix \
" \
&& apt-get update -y \
&& apt-get install -y --no-install-recommends $BUILD_DEPS \
&& curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py \
&& python3.8 get-pip.py
&& python3.10 get-pip.py

RUN pip3 wheel --no-cache-dir -r ${WHEELS}/requirements.txt \
&& pip3 wheel --no-cache-dir -r ${WHEELS}/requirements_dev.txt \
@@ -70,20 +70,23 @@ RUN set -ex \
libgdal-dev \
python3-venv \
postgresql-client-12 \
python3.8 \
python3.8-distutils \
python3.8-venv \
python3.10 \
python3.10-distutils \
python3.10-venv \
" \
&& apt-get update -y \
&& apt-get install -y --no-install-recommends curl \
&& curl -sL https://deb.nodesource.com/setup_14.x | bash - \
&& apt-get install -y --no-install-recommends curl ca-certificates gnupg \
&& mkdir -p /etc/apt/keyrings \
&& curl -fsSL https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key | gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg \
&& NODE_MAJOR=16 \
&& (echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_$NODE_MAJOR.x nodistro main" > /etc/apt/sources.list.d/nodesource.list) \
&& curl -sL https://www.postgresql.org/media/keys/ACCC4CF8.asc | apt-key add - \
&& add-apt-repository "deb http://apt.postgresql.org/pub/repos/apt/ $(lsb_release -sc)-pgdg main" \
&& apt-get update -y \
&& apt-get install -y --no-install-recommends $RUN_DEPS \
&& curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py \
&& python3.8 get-pip.py \
&& apt-get install -y nodejs \
&& python3.10 get-pip.py \
&& apt-get -y install --no-install-recommends nodejs \
&& npm install -g yarn

# Install Yarn components
@@ -99,7 +102,7 @@ WORKDIR ${WEB_ROOT}

RUN mv ${WHEELS}/entrypoint.sh entrypoint.sh

RUN python3.8 -m venv ENV \
RUN python3.10 -m venv ENV \
&& . ENV/bin/activate \
&& pip install wheel setuptools requests \
&& pip install rjsmin==1.2.0 MarkupSafe==2.0.0 \
6 changes: 3 additions & 3 deletions arches/app/datatypes/base.py
@@ -222,14 +222,14 @@ def get_search_terms(self, nodevalue, nodeid=None):
"""
return []

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
"""
Allows for modification of an elasticsearch bool query for use in
advanced search
"""
pass

def append_null_search_filters(self, value, node, query, request):
def append_null_search_filters(self, value, node, query, parameters):
"""
Appends the search query dsl to search for fields that have not been populated
"""
@@ -298,7 +298,7 @@ def get_default_language_value_from_localized_node(self, tile, nodeid):
"""
return tile.data[str(nodeid)]

def post_tile_save(self, tile, nodeid, request):
def post_tile_save(self, tile, nodeid, parameters, user):
"""
Called after the tile is saved to the database

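The base.py changes above replace the Django `request` argument with a generic `parameters` mapping, so permissions frameworks and background tasks can call datatype hooks outside a request cycle. A minimal sketch of a datatype implementing the new hook; the stub `Query` and `BaseDataType` classes stand in for the Elasticsearch DSL wrapper and the real Arches base class, and only the signature change mirrors the diff:

```python
class Query:
    """Stand-in for the ES bool-query wrapper used in the diff."""
    def __init__(self):
        self.clauses = []

    def must(self, clause):
        self.clauses.append(("must", clause))

class BaseDataType:
    def append_search_filters(self, value, node, query, parameters):
        """`parameters` is a plain mapping rather than a Django request."""
        pass

class ExactMatchDataType(BaseDataType):
    def append_search_filters(self, value, node, query, parameters):
        # Only add a clause when a search value was actually supplied.
        if value.get("val") not in ("", None):
            query.must({"term": {f"tiles.data.{node}": value["val"]}})
```

Because nothing in the hook touches request state, the same datatype code now serves both view-driven and internal search paths.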
6 changes: 3 additions & 3 deletions arches/app/datatypes/concept_types.py
@@ -124,10 +124,10 @@ def append_to_document(self, document, nodevalue, nodeid, tile, provisional=Fals
)
document["strings"].append({"string": value.value, "nodegroup_id": tile.nodegroup_id, "provisional": provisional})

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "":
match_query = Match(field="tiles.data.%s" % (str(node.pk)), type="phrase", query=value["val"])
if "!" in value["op"]:
@@ -385,4 +385,4 @@ def collects_multiple_values(self):
return True

def ignore_keys(self):
return ["http://www.w3.org/2000/01/rdf-schema#label http://www.w3.org/2000/01/rdf-schema#Literal"]
return ["http://www.w3.org/2000/01/rdf-schema#label http://www.w3.org/2000/01/rdf-schema#Literal"]
76 changes: 44 additions & 32 deletions arches/app/datatypes/datatypes.py
@@ -188,7 +188,7 @@ def transform_export_values(self, value, *args, **kwargs):
else:
return value[get_language()]["value"]
except KeyError:
# sometimes certain requested language values aren't populated. Just pass back with implicit None.
pass

def get_search_terms(self, nodevalue, nodeid=None):
@@ -203,7 +203,7 @@ def get_search_terms(self, nodevalue, nodeid=None):
pass
return terms

def append_null_search_filters(self, value, node, query, request):
def append_null_search_filters(self, value, node, query, parameters):
"""
Appends the search query dsl to search for fields that have not been populated or are empty strings
"""
@@ -228,10 +228,10 @@ def append_null_search_filters(self, value, node, query, request):
non_blank_string_query = Term(field=f"tiles.data.{str(node.pk)}.{value['lang']}.value.keyword", query="")
query.should(Nested(path="tiles", query=non_blank_string_query))

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "":
exact_terms = re.search('"(?P<search_string>.*)"', value["val"])
if exact_terms:
@@ -430,10 +430,10 @@ def pre_tile_save(self, tile, nodeid):
def append_to_document(self, document, nodevalue, nodeid, tile, provisional=False):
document["numbers"].append({"number": nodevalue, "nodegroup_id": tile.nodegroup_id, "provisional": provisional})

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "":
if value["op"] != "eq":
operators = {"gte": None, "lte": None, "lt": None, "gt": None}
@@ -521,11 +521,11 @@ def to_json(self, tile, node):
def transform_value_for_tile(self, value, **kwargs):
return bool(util.strtobool(str(value)))

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["val"] == "null" or value["val"] == "not_null":
value["op"] = value["val"]
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "" and value["val"] is not None:
term = True if value["val"] == "t" else False
query.must(Term(field="tiles.data.%s" % (str(node.pk)), term=term))
@@ -655,10 +655,10 @@ def append_to_document(self, document, nodevalue, nodeid, tile, provisional=Fals
{"date": ExtendedDateFormat(nodevalue).lower, "nodegroup_id": tile.nodegroup_id, "nodeid": nodeid, "provisional": provisional}
)

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "" and value["val"] is not None:
try:
date_value = datetime.strptime(value["val"], "%Y-%m-%d %H:%M:%S%z").astimezone().isoformat()
@@ -781,7 +781,7 @@ def add_date_to_doc(document, edtf):
add_date_to_doc(document, edtf)
add_date_to_doc(tile.data[nodeid], edtf)

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
def add_date_to_doc(query, edtf):
if value["op"] == "eq":
if edtf.lower != edtf.upper:
@@ -809,7 +809,7 @@ def add_date_to_doc(query, edtf):
raise Exception(_("Invalid date specified."))

if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "" and value["val"] is not None:
edtf = ExtendedDateFormat(value["val"])
if edtf.result_set:
@@ -1570,14 +1570,20 @@ def to_json(self, tile, node):
if data:
return self.compile_json(tile, node, file_details=data[str(node.nodeid)])

def post_tile_save(self, tile, nodeid, request):
if request is not None:
def post_tile_save(self, tile, nodeid, parameters, user=None):
if parameters is not None:
# this does not get called when saving data from the mobile app
previously_saved_tile = models.TileModel.objects.filter(pk=tile.tileid)
user = request.user
if hasattr(request.user, "userprofile") is not True:
models.UserProfile.objects.create(user=request.user)
user_is_reviewer = user_is_resource_reviewer(request.user)
user_is_reviewer = False
if user is True:
user_is_reviewer = True
elif user:
if hasattr(user, "userprofile") is not True:
models.UserProfile.objects.create(user=user)
user_is_reviewer = user_is_resource_reviewer(user)
else:
# There must be a user to be able to upload files.
return
current_tile_data = self.get_tile_data(tile)
if previously_saved_tile.count() == 1:
previously_saved_tile_data = self.get_tile_data(previously_saved_tile[0])
@@ -1594,7 +1600,7 @@ def post_tile_save(self, tile, nodeid, request):
except models.File.DoesNotExist:
logger.exception(_("File does not exist"))

files = request.FILES.getlist("file-list_" + nodeid + "_preloaded", []) + request.FILES.getlist("file-list_" + nodeid, [])
files = parameters["FILES"].getlist("file-list_" + nodeid + "_preloaded", []) + parameters["FILES"].getlist("file-list_" + nodeid, [])
tile_exists = models.TileModel.objects.filter(pk=tile.tileid).exists()

for file_data in files:
@@ -1649,7 +1655,7 @@ def transform_value_for_tile(self, value, **kwargs):
Accepts a comma delimited string of file paths as 'value' to create a file datatype value
with corresponding file record in the files table for each path. Only the basename of each path is used, so
the accuracy of the full path is not important. However the name of each file must match the name of a file in
the directory from which Arches will request files. By default, this is the directory in a project as defined
the directory from which Arches will request files. By default, this is the directory in a project as defined
in settings.UPLOADED_FILES_DIR.

"""
@@ -1696,7 +1702,7 @@ def transform_value_for_tile(self, value, **kwargs):
return json.loads(json.dumps(tile_data))

def pre_tile_save(self, tile, nodeid):
# TODO If possible this method should probably replace 'handle request'
if tile.data[nodeid]:
for file in tile.data[nodeid]:
try:
@@ -1960,10 +1966,10 @@ def transform_export_values(self, value, *args, **kwargs):
ret = value
return ret

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "":
search_query = Match(field="tiles.data.%s" % (str(node.pk)), type="phrase", query=value["val"])
if "!" in value["op"]:
@@ -2136,10 +2142,10 @@ def transform_export_values(self, value, *args, **kwargs):
new_values.append(val)
return ",".join(new_values)

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "" and value["val"] != []:
search_query = Match(field="tiles.data.%s" % (str(node.pk)), type="phrase", query=value["val"])
if "!" in value["op"]:
@@ -2235,7 +2241,7 @@ def pre_tile_save(self, tile, nodeid):
for relationship in relationships:
relationship["resourceXresourceId"] = str(uuid.uuid4())

def post_tile_save(self, tile, nodeid, request):
def post_tile_save(self, tile, nodeid, parameters, user):
ret = False
sql = """
SELECT * FROM __arches_create_resource_x_resource_relationships('%s') as t;
@@ -2303,10 +2309,16 @@ def transform_value_for_tile(self, value, **kwargs):
return json.loads(value)
except ValueError:
# do this if json (invalid) is formatted with single quotes, re #6390
try:
return ast.literal_eval(value)
except:
if "'" in value:
try:
return ast.literal_eval(value)
except (ValueError, SyntaxError):
return value
elif isinstance(value, str):
return [{"resourceId": val} for val in value.split(";")]
else:
return value

except TypeError:
# data should come in as json but python list is accepted as well
if isinstance(value, list):
@@ -2315,10 +2327,10 @@ def transform_value_for_tile(self, value, **kwargs):
def transform_export_values(self, value, *args, **kwargs):
return json.dumps(value)

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
try:
if value["op"] == "null" or value["op"] == "not_null":
self.append_null_search_filters(value, node, query, request)
self.append_null_search_filters(value, node, query, parameters)
elif value["val"] != "" and value["val"] != []:
# search_query = Match(field="tiles.data.%s.resourceId" % (str(node.pk)), type="phrase", query=value["val"])
search_query = Terms(field="tiles.data.%s.resourceId.keyword" % (str(node.pk)), terms=value["val"])
@@ -2454,7 +2466,7 @@ def get_display_value(self, tile, node, **kwargs):
def append_to_document(self, document, nodevalue, nodeid, tile, provisional=False):
pass

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
pass


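The file datatype's `post_tile_save` above now reads uploaded files from `parameters["FILES"]` rather than `request.FILES`. Callers that still hold a Django-style request need a thin adapter; this sketch is an assumption about the calling convention, with only the "FILES" key taken from the diff:

```python
# Hedged sketch: adapt a Django-style request object into the plain mapping
# the refactored post_tile_save expects. The adapter and the "user" key are
# illustrative; only "FILES" appears in the diff.
def request_to_parameters(request):
    return {
        "FILES": getattr(request, "FILES", None),
        "user": getattr(request, "user", None),
    }
```

With an adapter like this, view code keeps passing requests while internal callers can construct the mapping directly (or pass `user=True` for the blanket-allow framework, as the new reviewer check suggests).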
2 changes: 1 addition & 1 deletion arches/app/datatypes/url.py
@@ -151,7 +151,7 @@ def get_search_terms(self, nodevalue, nodeid=None):
# terms.append(nodevalue['url']) FIXME: URLs searchable?
return terms

def append_search_filters(self, value, node, query, request):
def append_search_filters(self, value, node, query, parameters):
# Match the label in the same manner as a String datatype
try:
if value["val"] != "":
5 changes: 5 additions & 0 deletions arches/app/media/css/arches.scss
@@ -10339,6 +10339,11 @@ ul.pagination {
padding: 10px 5px 0px 10px;
}

.search-listing-title.principal-user {
font-weight: 1000;
background: rgb(204,230,244);
padding-bottom: 5px;
}
.search-listing-title.i18n-alt a span {
font-size: 13px;
}
@@ -202,6 +202,9 @@ function($, _, BaseFilter, bootstrap, arches, select2, ko, koMapping, GraphModel
resourceinstanceid: result._source.resourceinstanceid,
displaydescription: result._source.displaydescription,
alternativelanguage: result._source.displayname_language != arches.activeLanguage,
principaluser: ko.computed(function () {
return result._source.permissions.principal_user && result._source.permissions.principal_user.includes(ko.unwrap(self.userid));
}),
"map_popup": result._source.map_popup,
"provisional_resource": result._source.provisional_resource,
geometries: ko.observableArray(result._source.geometries),
@@ -252,4 +255,4 @@
}),
template: searchResultsTemplate
});
});
});
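The knockout `principaluser` computed added above flags a search hit when its `permissions.principal_user` list includes the current user id. The same check expressed server-side, as a hedged Python mirror (field names follow the diff; the helper itself is illustrative and not part of the PR):

```python
# True when the search hit's ES _source names this user as a principal user.
# Mirrors: result._source.permissions.principal_user.includes(userid)
def is_principal_user(source, userid):
    principal = (source.get("permissions") or {}).get("principal_user") or []
    return userid in principal
```

Guarding with `or {}` / `or []` keeps the check safe for documents indexed before the permissions block existed, matching the defensive `&&` in the JS computed.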