Closes #2083: Migrate legacy accounts away from wiki #2096

Merged: 1 commit, Apr 9, 2021
1 change: 1 addition & 0 deletions tools/migrate/.gitignore
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
legacy_users.json
24 changes: 24 additions & 0 deletions tools/migrate/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,24 @@
# Planet CDOT Feed List migration tool

This tool downloads all users from the wiki page and dumps them into a JSON file.
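Each entry in the output file holds the author's name and feed URL. An illustrative sample (the values below are made up, not taken from the real wiki):

```json
[
  { "firstName": "Jane", "lastName": "Doe", "feed": "https://blog.example.com/feed" }
]
```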

## Customization

In `migrate.js` you can find the following two constants: `URL` and `FILE`.

- `URL` points to the current location of the [Planet CDOT Feed List](https://wiki.cdot.senecacollege.ca/wiki/Planet_CDOT_Feed_List#Feeds)
- `FILE` specifies the filename of the output file

## Install Dependencies

```
cd tools/migrate
npm install
```

## Usage

```
cd tools/migrate
npm start
```
65 changes: 65 additions & 0 deletions tools/migrate/migrate.js
Original file line number Diff line number Diff line change
@@ -0,0 +1,65 @@
const fetch = require('node-fetch');
const jsdom = require('jsdom');
const { isWebUri } = require('valid-url');
const fs = require('fs');

const { JSDOM } = jsdom;
const URL = 'https://wiki.cdot.senecacollege.ca/wiki/Planet_CDOT_Feed_List';
const FILE = 'legacy_users.json';

const getWikiText = async (url) => {
try {
const response = await fetch(url);
const data = await response.text();

const dom = new JSDOM(data);
return dom.window.document.querySelector('pre').textContent;
} catch {
throw new Error(`Unable to download wiki feed data from url ${url}`);
}
};

(async () => {
let wikiText;

// Try to fetch the feed list from URL
try {
wikiText = await getWikiText(URL);
console.info(`Extracting users from ${URL}`);
} catch (error) {
console.error(error);
process.exit(1);
}

// store every line in an array
const lines = wikiText.split(/\r\n|\r|\n/);
const commentRegex = /^\s*#/;

let firstName;
let lastName;
let feed;
const users = [];

// Iterate through all lines and find url/name pairs, then parse them.
lines.forEach((line, index) => {
if (!commentRegex.test(line) && line.startsWith('[')) {
feed = line.replace(/[[\]']/g, '');

if (feed.length && isWebUri(feed)) {
[firstName, lastName] = lines[index + 1].replace(/^\s*name\s*=\s*/, '').split(' ');
users.push({ firstName, lastName, feed });
} else {
console.error(`Skipping invalid wiki feed url ${feed}`);
}
}
});

try {
fs.writeFileSync(FILE, JSON.stringify(users));
console.log(
`Processed ${users.length} records. Legacy users were successfully written to file: ${FILE}.`
);
} catch (err) {
console.error(err);
}
})();
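For reference, the parser above expects wiki source where a bracketed feed URL is followed by a `name = First Last` line. A minimal, self-contained sketch of the same extraction (the sample `wikiText` is illustrative, and a loose regex stands in for valid-url's `isWebUri()`):

```javascript
// Standalone sketch of the line-pair extraction used in migrate.js.
// The sample text below is made up; real entries come from the wiki's <pre> block.
const wikiText = [
  '# Comment lines are skipped',
  '[https://blog.example.com/feed]',
  ' name = Jane Doe',
  '[not-a-url]',
  ' name = Bad Entry',
].join('\n');

const lines = wikiText.split(/\r\n|\r|\n/);
const commentRegex = /^\s*#/;
const users = [];

lines.forEach((line, index) => {
  if (!commentRegex.test(line) && line.startsWith('[')) {
    // Strip the surrounding brackets (and stray quotes) from the URL line
    const feed = line.replace(/[[\]']/g, '');
    // Loose stand-in for isWebUri(): accept only http(s) URLs
    if (/^https?:\/\//.test(feed)) {
      // The following line carries the author's name after "name ="
      const [firstName, lastName] = lines[index + 1]
        .replace(/^\s*name\s*=\s*/, '')
        .split(' ');
      users.push({ firstName, lastName, feed });
    }
  }
});

console.log(JSON.stringify(users));
// → [{"firstName":"Jane","lastName":"Doe","feed":"https://blog.example.com/feed"}]
```

The invalid `[not-a-url]` entry is dropped by the URL check, mirroring the "Skipping invalid wiki feed url" branch in the real script.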
16 changes: 16 additions & 0 deletions tools/migrate/package.json
Original file line number Diff line number Diff line change
@@ -0,0 +1,16 @@
{
"name": "migrate",
"version": "1.0.0",
"description": "Migrates users from the planet feed wiki list to a JSON file.",
"main": "migrate.js",
"scripts": {
"start": "node migrate.js"
},
"author": "",
"license": "ISC",
"dependencies": {
"jsdom": "^16.5.2",
"node-fetch": "^2.6.1",
"valid-url": "^1.0.9"
}
}