Use calculate_max_sqlite_variables to avoid errors in merge_users #8473

Merged 1 commit on Oct 5, 2021
kolibri/core/auth/utils.py (12 changes: 11 additions & 1 deletion)
@@ -1,6 +1,8 @@
 import sys
 
+from django.db import connections
 from django.utils.six.moves import input
+from morango.sync.backends.utils import calculate_max_sqlite_variables
 from morango.sync.operations import LocalOperation
 
 from kolibri.core.auth.models import AdHocGroup
@@ -107,7 +109,15 @@ def _merge_log_data(LogModel):
             log_map[log.id] = new_log.id
             if new_log.id not in target_log_ids:
                 new_logs.append(new_log)
-        LogModel.objects.bulk_create(new_logs, batch_size=750)
+
+        vendor = connections[LogModel.objects.db].vendor
+        batch_size = (
+            calculate_max_sqlite_variables() // len(LogModel._meta.fields)
+            if vendor == "sqlite"
+            else 750
+        )
+
+        LogModel.objects.bulk_create(new_logs, batch_size=batch_size)
 
     _merge_log_data(ContentSessionLog)

Inline review comment from the PR author on the new batch_size calculation:

(I used an inline conditional because the linting was complaining about function complexity and I didn't have it in me to try to start refactoring other stuff :) )
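For context on why dividing by the field count avoids the errors: SQLite caps the number of bound variables a single statement may use (SQLITE_MAX_VARIABLE_NUMBER, 999 by default in older builds), and Django's bulk_create binds roughly one variable per field per row, so the safe row count per batch is the variable cap divided by the model's field count. That is the quantity morango's calculate_max_sqlite_variables() feeds into here. A minimal sketch of the arithmetic (the 999-variable cap and the 12-field model below are illustrative assumptions, not Kolibri's actual values):

```python
# Illustrative sketch only; the cap and field count are assumed values.
SQLITE_MAX_VARIABLE_NUMBER = 999  # historical SQLite default; newer builds allow more


def max_rows_per_batch(num_fields, cap=SQLITE_MAX_VARIABLE_NUMBER):
    # bulk_create binds ~one variable per field per row, so keep
    # num_fields * rows at or under the per-statement cap.
    return cap // num_fields


NUM_LOG_FIELDS = 12  # hypothetical field count for a log model
print(max_rows_per_batch(NUM_LOG_FIELDS))  # -> 83 rows per INSERT

# The old hard-coded batch_size=750 would try to bind 750 * 12 == 9000
# variables in one INSERT, which SQLite rejects with
# "sqlite3.OperationalError: too many SQL variables".
```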
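As for the refactoring the author deferred: one way the inline conditional could later be extracted to satisfy a function-complexity linter is a small helper along these lines (a hypothetical sketch, not part of this PR; _bulk_create_batch_size and DEFAULT_BATCH_SIZE are invented names):

```python
from django.db import connections
from morango.sync.backends.utils import calculate_max_sqlite_variables

DEFAULT_BATCH_SIZE = 750  # invented constant for the previous hard-coded value


def _bulk_create_batch_size(model):
    """Batch size for bulk_create that respects SQLite's bound-variable cap."""
    if connections[model.objects.db].vendor == "sqlite":
        # One bound variable per field per row, so divide the per-statement
        # variable limit by the model's field count.
        return calculate_max_sqlite_variables() // len(model._meta.fields)
    return DEFAULT_BATCH_SIZE


# Usage inside _merge_log_data would then reduce to:
#     LogModel.objects.bulk_create(
#         new_logs, batch_size=_bulk_create_batch_size(LogModel)
#     )
```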