Keep feeds in Redis synchronized with wiki over time #294
I'd like to take this up.

This is potentially a duplicate of #503.
@Silvyre, I was thinking about this today. Our wiki feed list is (currently) our "source of truth," and we want to use it as the official list of what we store in our system. Given this, here are the cases I can think of that we need to deal with:
The code that currently does a bunch of this is located in
What I think we need to also do is get the complete set of all feed Objects from Redis, and put them in a JavaScript Set Object. We then need to create a second JavaScript Set Object, and put all the parsed feed Objects (i.e., from the wiki) into it. Now we can easily calculate the difference between the two:

```js
// Create a set of IDs for feeds in the wiki
let wiki = new Set(['one', 'two', 'three']);

// Create a set of IDs for feeds in Redis
let redis = new Set(['one', 'two', 'four']);

// Figure out which feeds need to get added to Redis (in wiki, not in redis)
let newFeeds = new Set([...wiki].filter((feed) => !redis.has(feed)));
// this will give us: Set { 'three' }

// Figure out which feeds need to get removed from Redis (gone from wiki, but still in redis)
let deletedFeeds = new Set([...redis].filter((feed) => !wiki.has(feed)));
// this will give us: Set { 'four' }
```

So now we can iterate over `newFeeds` and `deletedFeeds`, adding and removing feeds as necessary. Writing tests for this will be a bit of work, but we can do that in a follow-up. What do you think of this?
I plan to collaborate on this as part of Release 0.7.

We agreed to delay this until the user structure (including 'null'/'wiki' users and/or 'admin' users) is implemented, e.g. #709.

@Silvyre Did you have your own triage meeting today? How did you remember this issue from November? XD

This particular issue has earned a special place in my heart, as it is the only one I have had to completely scrap my initial work on.

Bring it back! Bring it back! I'm actually planning to bring back my own issue from November as well (#290), after @agarcia-caicedo and I finish #896.

We'll see if I have time before the code freeze! 🥶

Same
As far as I know, our system currently handles the above three use cases.
We have yet to implement this use case, I believe.
This use case is a combination of use cases 3 (handled) and 4 (not yet handled). As such, resolving use case 4 should resolve this one, as well.
I believe we could implement the remaining missing logic (i.e. for use case 4) in /backend/index.js...

telescope/src/backend/index.js, lines 76 to 85 in 80aa426
...like so:

```js
// telescope/src/backend/index.js
// ...
const normalizeUrl = require('normalize-url');
const hash = require('./data/hash');

const urlToId = (url) => hash(normalizeUrl(url));

// ...

async function processAllFeeds() {
  try {
    // Get an Array of Feed objects from the wiki feed list and Redis
    let [all, wiki] = await Promise.all([Feed.all(), getWikiFeeds()]);

    // Ensure no duplicate feeds exist
    all = Array.from(new Set(all));
    wiki = Array.from(new Set(wiki));

    const wikiIds = wiki.map((feed) => urlToId(feed.url));

    // Prevent deleted wiki feeds from being processed, and also delete
    // such feeds from Redis. Note: Array.prototype.filter() can't take
    // an async callback (the returned Promises are always truthy), so
    // we delete first and then filter synchronously.
    const removed = all.filter((feed) => !wikiIds.includes(feed.id));
    await Promise.all(removed.map((feed) => feed.delete()));
    all = all.filter((feed) => wikiIds.includes(feed.id));

    // Process remaining feeds into the database and feed queue
    await processFeeds([...all, ...wiki]);
  } catch (err) {
    logger.error({ err }, 'Error queuing feeds');
  }
}
```

On a related note, there now exists a new problematic use case that we may wish to address within a separate issue:
Here's another way to look at this problem: What if we got rid of the wiki feed list? All we need is a way for an Admin to be able to bulk insert a bunch of feeds once, and then this problem goes away. Maybe we should morph this bug into: "Allow an Admin to import an OPML or Planet Feed List formatted file, and add all the associated feeds." I don't see any value in keeping the feed list now that we have Add Feed via authenticated My Feeds.
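The OPML import half of that idea could be sketched like this, assuming feeds appear as `<outline>` elements with `xmlUrl` attributes (the conventional OPML shape). A real implementation would use a proper XML parser; a regex is enough to illustrate, and `extractFeedUrls` is a hypothetical helper, not existing Telescope code:

```javascript
// A small OPML sample in the conventional feed-list shape
const opml = `
<opml version="2.0">
  <body>
    <outline text="Alice" xmlUrl="https://alice.dev/feed" />
    <outline text="Bob" xmlUrl="https://bob.dev/rss.xml" />
  </body>
</opml>`;

// Pull every xmlUrl attribute value out of the document
function extractFeedUrls(xml) {
  return [...xml.matchAll(/xmlUrl="([^"]+)"/g)].map((m) => m[1]);
}

const urls = extractFeedUrls(opml);
// An Admin bulk-insert would then loop over urls, calling the same
// "add feed" path that authenticated My Feeds uses.
```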
This problem is going away now that feeds are moving into Supabase. Closing. |
This is related to #266, #264. We are going to be downloading the feed list (names, urls) from the wiki, and placing them in Redis. We'll need to keep this data in sync:
I suspect that this will involve altering how we store feeds in Redis so that we can easily look them up by name/url, which we can't do at the moment.