
Fallback to cookie session store when mongo session store is not needed #44

Open
drgcms opened this issue Mar 28, 2020 · 2 comments

drgcms commented Mar 28, 2020

Hi!

I've been thinking about this for quite some time.

At first I was annoyed that crawlers visiting my web pages do not retain cookies, so every visit wrote a document into the session store collection that was never used again. I ended up writing a batch job that deleted from the session store every document whose created_at value equaled its updated_at value. Later I found out that setting the session to nil avoids this behavior, so this problem no longer exists.
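
For context, a minimal sketch of how such writes can be skipped per request in Rails, assuming a simple user-agent check (the CRAWLER_UA pattern and the callback name are illustrative, not something this gem provides):

```ruby
class ApplicationController < ActionController::Base
  # Illustrative pattern; real crawler detection is usually more involved.
  CRAWLER_UA = /bot|crawl|spider|slurp/i

  before_action :skip_session_for_crawlers

  private

  def skip_session_for_crawlers
    # With :skip set, Rack's session middleware neither persists the session
    # nor sets the cookie, so no document lands in the sessions collection.
    request.session_options[:skip] = true if request.user_agent.to_s.match?(CRAWLER_UA)
  end
end
```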

Then I had an idea. I really need the mongo session store only when users are logged in. The data required for a guest user is so small that it can easily be stored in the Rails cookie store. Even users who are not CMS editors don't need their session data stored in the mongo session store. On the positive side, this would save one MongoDB read and one write per visit, which can have a huge impact on speed-optimized web sites.
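
For reference, the two stores are normally configured mutually exclusively in config/initializers/session_store.rb; switching between them per request is what this issue asks about. A sketch, assuming the :mongo_store symbol this gem registers:

```ruby
# Either Rails' built-in cookie store...
Rails.application.config.session_store :cookie_store, key: "_my_app_session"

# ...or the MongoDB-backed store from this gem, but not both at once.
Rails.application.config.session_store :mongo_store
```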

My question, or proposal for an update, is: is there a way to fall back to the cookie session store when the mongo session store is not required, and if so, how?

Thanks, and stay healthy
Damjan Rems

tombruijn (Collaborator) commented

Hi @drgcms, I wouldn't suggest mixing multiple session stores. They might create issues down the line where it's unclear where a session lives, if you rely on the session being available in the database. If you set the session to nil with your own crawler-detection logic, do you also avoid the extra read and write operations?

drgcms (Author) commented May 1, 2020

As far as I can see in the logs, there is no extra database read when a new session is started (you can check this by deleting the session cookie).

There is also no extra write to the session collection when the session is set to nil.

I've been using session=nil for robots for years, ever since starting a web site with lots of different pages. I was getting more than 50k documents per day in the session store collection from crawlers alone, so I ran a cron job every night just to remove documents with created_at equal to updated_at. The problem went away after setting the session to nil for crawlers.
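
A sketch of that nightly cleanup with the Ruby mongo driver, assuming the default sessions collection name and created_at/updated_at fields on each document ($expr requires MongoDB 3.6 or newer):

```ruby
require "mongo"

client = Mongo::Client.new("mongodb://localhost:27017/my_app")

# Delete every session document that was written once and never touched again.
client[:sessions].delete_many(
  "$expr" => { "$eq" => ["$created_at", "$updated_at"] }
)
```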

My idea for a combined store would be:

  • Cookie store first: everything goes to the cookie store, and the combine flag is set to false.
  • Setting the combine flag to true creates a session key and saves the session data to the database store. The cookie now holds only the session key and the combine flag value.
  • Flipping the combine flag back to false removes the session key and saves the session to the cookie. The process should fail if the session data is too large, or in the worst case the session is reset. That would be the programmer's problem.

Of course, this is just an idea, and the solution might be more complicated, since I don't know a lot about the cookie mechanism. A rough sketch of the flag transitions follows below.
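
A very rough sketch of those transitions, assuming hypothetical cookie and Mongo backends (CombinedSession and both backend objects are invented names, not anything mongo_session_store offers; a real implementation would live inside a Rack session store):

```ruby
require "securerandom"

class CombinedSession
  MAX_COOKIE_BYTES = 4096 # browsers cap a single cookie at roughly 4 KB

  def initialize(cookie_backend, mongo_backend)
    @cookie = cookie_backend # hypothetical: reads/writes the session cookie
    @mongo  = mongo_backend  # hypothetical: reads/writes the sessions collection
  end

  # Flag flips to true: move the data into MongoDB; the cookie keeps
  # only the session key and the flag.
  def combine!(data)
    key = SecureRandom.hex(16)
    @mongo.write(key, data)
    @cookie.write(combine: true, key: key)
    key
  end

  # Flag flips back to false: pull the data into the cookie and drop the
  # database document. Fails loudly when the data no longer fits, as proposed.
  def uncombine!(key)
    data = @mongo.read(key)
    raise "session too large for cookie store" if Marshal.dump(data).bytesize > MAX_COOKIE_BYTES
    @cookie.write(combine: false, data: data)
    @mongo.delete(key)
    data
  end
end
```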

by
TheR
