Hi!
I've been thinking about this for quite some time.
At first I was annoyed that crawlers visiting my web pages never retain cookies, so every visit wrote a document into the session store collection that was never read again. I ended up making a batch job that deleted from the session store every document whose created_at value was equal to its updated_at value. Later I found out that setting the session to nil avoids this behavior, so this problem no longer exists.
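A minimal sketch of how crawler detection plus skipping the session write could look in a Rails controller; the `CRAWLER_PATTERN` regexp is just a placeholder, and Rack's `:skip` session option is used here as one way to get the "session set to nil" effect, not necessarily the exact call I use:

```ruby
class ApplicationController < ActionController::Base
  # Placeholder pattern; real crawler detection would be more thorough.
  CRAWLER_PATTERN = /bot|crawler|spider|slurp/i

  before_action :skip_session_for_crawlers

  private

  def skip_session_for_crawlers
    return unless request.user_agent.to_s.match?(CRAWLER_PATTERN)

    # Tell the Rack session middleware not to persist this session,
    # so nothing is written to the session store for crawler requests.
    request.session_options[:skip] = true
  end
end
```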
Then I had an idea. I really only need the mongo session store when users are logged in. The data required for a guest user is so small that it can easily be stored in the Rails cookie store. Even users who are not CMS editors don't need their session data stored in the mongo session store. On the positive side, this would mean one less read and one less write to MongoDB per visit, which can have a huge impact on speed-optimized web sites.
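For reference, Rails normally picks one session store globally in an initializer, which is why a per-request fallback would need something custom. A minimal sketch (the cookie key is a placeholder, and `:mongo_store` is the symbol I believe this gem registers):

```ruby
# config/initializers/session_store.rb
# Either store is chosen once for the whole application:
Rails.application.config.session_store :cookie_store, key: "_my_app_session"
# or, with this gem:
# Rails.application.config.session_store :mongo_store
```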
So my question, or proposal for an update, is: is there a way to fall back to the cookie session store when the mongo session store is not required, and how would I do it?
Thanks and stay healthy,
Damjan Rems
Hi @drgcms, I wouldn't suggest mixing multiple session stores. They might create issues down the line where it's unclear which store holds a session, if you rely on the session being available in the database. If you set the session to nil with your own logic to detect crawlers, do you also avoid the extra read and write operations?
As far as I can see in the logs, there is no extra database read when a new session is started (you can check this by deleting the session cookie).
There is also no extra write to the session store collection when the session is set to nil.
I've been using session = nil for robots for years, ever since I started a web site with lots of different pages. I was getting more than 50k documents per day in the session store collection from crawlers alone, so I ran a cron job every night just to remove documents whose created_at equalled their updated_at. The problem went away after setting the session to nil for crawlers.
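For completeness, that nightly cleanup could look roughly like the rake task below; the collection name `sessions`, the created_at/updated_at field names, and the use of Mongoid are assumptions about a default setup:

```ruby
# lib/tasks/sessions.rake -- a sketch of the nightly cleanup, assuming the
# default "sessions" collection with created_at/updated_at timestamps.
namespace :sessions do
  desc "Delete session documents that were written once and never touched again"
  task cleanup: :environment do
    sessions = Mongoid.default_client[:sessions]

    # $expr allows comparing two fields of the same document.
    result = sessions.delete_many(
      "$expr" => { "$eq" => ["$created_at", "$updated_at"] }
    )
    puts "Removed #{result.deleted_count} abandoned sessions"
  end
end
```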
My idea for a combined store would be:
1. Cookie store first: everything goes to the cookie store, and the combine flag is set to false.
2. Setting the combine flag to true creates a session key and saves the session data to the database store. The cookie now only holds the session key and the combine flag value.
3. Flipping the combine flag back to false removes the session key and saves the session back into the cookie. This should fail if the session data is too large, or in the worst case the session is reset. That would be the programmer's problem.
Of course, this is just an idea, and the solution might be more complicated since I don't know a lot about the cookie mechanism.
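To make the idea above a bit more concrete, here is a rough, self-contained sketch of just the combine-flag switching logic; the backends are plain hashes standing in for the browser cookie and the MongoDB collection, so this is not a working Rack session store:

```ruby
require "securerandom"

# Sketch of the combine-flag idea only; @cookie and @db are hash stand-ins
# for the browser cookie and the MongoDB session collection.
class CombinedSessionSketch
  MAX_COOKIE_BYTES = 3_000 # cookies top out around 4 KB, so leave headroom

  def initialize(cookie_backend: {}, db_backend: {})
    @cookie = cookie_backend
    @db     = db_backend
  end

  # Read the session: straight from the cookie, or from the database
  # via the session key when the combine flag is set.
  def load
    return @cookie.fetch(:data, {}) unless @cookie[:combine]

    @db.fetch(@cookie[:session_key], {})
  end

  # Combine flag goes true: move the data into the database store and keep
  # only the session key and the flag in the cookie.
  def combine!(data)
    key = SecureRandom.hex(16)
    @db[key] = data
    @cookie.clear
    @cookie[:combine]     = true
    @cookie[:session_key] = key
  end

  # Combine flag goes back to false: move the data back into the cookie,
  # resetting the session when it no longer fits.
  def uncombine!
    data = @db.delete(@cookie[:session_key]) || {}
    @cookie.clear
    @cookie[:combine] = false
    @cookie[:data] = Marshal.dump(data).bytesize > MAX_COOKIE_BYTES ? {} : data
  end
end
```

This only covers the switching logic; wiring it into Rack's session API, where the concern above about knowing which store currently holds a session applies, would be the hard part.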