
Commit

Co-authored-by: Josue <[email protected]>
* Initial implementation of planet feed wiki migration tool
* Removed satellite & migrated to console.log (and friends) instead
* Added README, review fixes
* Refactor README to be more a-temporal
* Change feeds into string
* Removed env.local
chrispinkney committed Apr 9, 2021
1 parent f8aee7b commit da076a4
Showing 4 changed files with 106 additions and 0 deletions.
1 change: 1 addition & 0 deletions tools/migrate/.gitignore
@@ -0,0 +1 @@
legacy_users.json
24 changes: 24 additions & 0 deletions tools/migrate/README.md
@@ -0,0 +1,24 @@
# Planet CDOT Feed List migration tool

This tool downloads all users from the wiki page and dumps them into a JSON file.

## Customization

In `migrate.js` you can find the following two variables: `URL` and `FILE`.

- `URL` points to the current location of the [Planet CDOT Feed List](https://wiki.cdot.senecacollege.ca/wiki/Planet_CDOT_Feed_List#Feeds)
- `FILE` allows users to specify the desired filename of the output file

## Install Dependencies

```
cd src/tools/migrate
npm install
```

## Usage

```
cd src/tools/migrate
npm start
```
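
## Output

The tool writes a JSON array of user objects, one per feed entry. Assuming a wiki entry for a hypothetical author "Jane Doe" (the name and URL below are invented for illustration), `legacy_users.json` would look roughly like:

```
[
  {
    "firstName": "Jane",
    "lastName": "Doe",
    "feed": "https://blog.example.com/feed/"
  }
]
```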
65 changes: 65 additions & 0 deletions tools/migrate/migrate.js
@@ -0,0 +1,65 @@
const fetch = require('node-fetch');
const jsdom = require('jsdom');
const { isWebUri } = require('valid-url');
const fs = require('fs');

const { JSDOM } = jsdom;
const URL = 'https://wiki.cdot.senecacollege.ca/wiki/Planet_CDOT_Feed_List';
const FILE = 'legacy_users.json';

const getWikiText = async (url) => {
  try {
    const response = await fetch(url);
    const data = await response.text();

    const dom = new JSDOM(data);
    return dom.window.document.querySelector('pre').textContent;
  } catch {
    throw new Error(`Unable to download wiki feed data from url ${url}`);
  }
};

(async () => {
  let wikiText;

  // Try to fetch the feed list from 'URL'
  try {
    wikiText = await getWikiText(URL);
    console.info(`Extracting users from ${URL}`);
  } catch (error) {
    console.error(error);
    process.exit(1);
  }

  // Store every line in an array
  const lines = wikiText.split(/\r\n|\r|\n/);
  const commentRegex = /^\s*#/;

  let firstName;
  let lastName;
  let feed;
  const users = [];

  // Iterate through all lines and find url/name pairs, then parse them.
  lines.forEach((line, index) => {
    if (!commentRegex.test(line) && line.startsWith('[')) {
      feed = line.replace(/[[\]']/g, '');
      // The author's name follows the feed URL on the next line; parse it
      // before validation so error messages name the right author.
      const nameLine = (lines[index + 1] || '').replace(/^\s*name\s*=\s*/, '');
      [firstName, lastName] = nameLine.split(' ');

      if (feed.length && isWebUri(feed)) {
        users.push({ firstName, lastName, feed });
      } else {
        console.error(`Skipping invalid wiki feed url ${feed} for author ${firstName} ${lastName}`);
      }
    }
  });

  try {
    fs.writeFileSync(FILE, JSON.stringify(users));
    console.log(
      `Processed ${users.length} records. Legacy users were successfully written to file: ${FILE}.`
    );
  } catch (err) {
    console.error(err);
  }
})();
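
The parser above assumes each feed entry in the wiki's `pre` block is a bracketed URL followed by a `name = ...` line. A minimal sketch of that per-entry logic, run against a hypothetical two-line entry (the sample data is invented, not taken from the real wiki):

```
// Hypothetical two-line wiki entry mirroring the format migrate.js expects.
const sample = ['[https://blog.example.com/feed/]', 'name = Jane Doe'];

// Strip the surrounding brackets (and stray quotes) from the feed URL.
const feed = sample[0].replace(/[[\]']/g, '');

// Drop the leading "name = " prefix, then split into first/last name.
const [firstName, lastName] = sample[1].replace(/^\s*name\s*=\s*/, '').split(' ');

console.log({ firstName, lastName, feed });
// { firstName: 'Jane', lastName: 'Doe', feed: 'https://blog.example.com/feed/' }
```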
16 changes: 16 additions & 0 deletions tools/migrate/package.json
@@ -0,0 +1,16 @@
{
"name": "migrate",
"version": "1.0.0",
"description": "Migrates users from the planet feed wiki list to a JSON file.",
"main": "migrate.js",
"scripts": {
"start": "node migrate.js"
},
"author": "",
"license": "ISC",
"dependencies": {
"jsdom": "^16.5.2",
"node-fetch": "^2.6.1",
"valid-url": "^1.0.9"
}
}
