v3.7.0 #307

Merged 10 commits on Dec 21, 2019
3 changes: 2 additions & 1 deletion .eslintrc.js
@@ -48,7 +48,8 @@ module.exports = {
"prettier/prettier": "error",
"indent": [
"error",
4
4,
{ "SwitchCase": 1 }
],
"no-restricted-syntax": ["error", "ForInStatement", "LabeledStatement", "WithStatement"],
"object-curly-spacing": ["error", "always"],
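For reference, `{ "SwitchCase": 1 }` makes the `indent` rule expect `case` clauses to be indented one level (4 spaces here) relative to their enclosing `switch`. A minimal illustrative snippet (not from the repo) that satisfies the rule:

```js
const describe = value => {
    switch (typeof value) {
        case 'string':
            return 'a string';
        case 'number':
            return 'a number';
        default:
            return 'something else';
    }
};

console.log(describe('a1')); // a string
```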
10 changes: 10 additions & 0 deletions History.md
@@ -1,3 +1,13 @@
# v3.7.0

* [ADDED] Ability to Transform Headers [#287](https://github.com/C2FO/fast-csv/issues/287)
* [ADDED] Example require and import to README [#301](https://github.com/C2FO/fast-csv/issues/301)
* [ADDED] Added new formatting option `alwaysWriteHeaders` to always write headers even if no rows are provided [#300](https://github.com/C2FO/fast-csv/issues/300)
* [ADDED] Appending to csv example and docs [#272](https://github.com/C2FO/fast-csv/issues/272)
* [FIXED] Issue with duplicate headers causing data loss; duplicate headers will cause an error to be emitted. [#276](https://github.com/C2FO/fast-csv/issues/276)
* [FIXED] Issue where an error thrown while processing rows caused the stream to continue parsing, resulting in duplicate writes or swallowed exceptions.


# v3.6.0

* [ADDED] `maxRows` option to limit the number of rows parsed. [#275](https://github.com/C2FO/fast-csv/issues/275) [#277](https://github.com/C2FO/fast-csv/pull/277) - [@cbrittingham](https://github.com/cbrittingham)
15 changes: 15 additions & 0 deletions README.md
@@ -11,6 +11,20 @@ Fast-csv is a library for parsing and formatting CSVs or any other delimited value

`npm install -S fast-csv`

## Usage

To use `fast-csv` in `javascript` you can require the module:

```js
const csv = require('fast-csv');
```

To import with `typescript`:

```typescript
import * as csv from 'fast-csv';
```
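For a quick, minimal sketch of the API (parsing a small inline CSV string; see the parsing docs below for the full option list):

```js
const csv = require('fast-csv');

const CSV_STRING = 'a,b\na1,b1\na2,b2';

const stream = csv
    .parse({ headers: true })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row)) // { a: 'a1', b: 'b1' }, then { a: 'a2', b: 'b2' }
    .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```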

## Documentation

* [Parsing Docs](./docs/parsing.md)
@@ -55,3 +69,4 @@ MIT <https://github.com/C2FO/fast-csv/raw/master/LICENSE>
* Website: <http://c2fo.com>
* Twitter: [http://twitter.com/c2fo](http://twitter.com/c2fo) - 877.465.4045


2 changes: 1 addition & 1 deletion benchmark/index.js
@@ -28,7 +28,7 @@ const benchmarkFastCsv = type => num => {
    const file = path.resolve(__dirname, `./assets/${num}.${type}.csv`);
    const stream = fs
        .createReadStream(file)
        .pipe(fastCsv.parse({ headers: true, maxRows: 10 }))
        .pipe(fastCsv.parse({ headers: true }))
        .transform(data => {
            const ret = {};
            ['first_name', 'last_name', 'email_address'].forEach(prop => {
87 changes: 87 additions & 0 deletions docs/formatting.md
@@ -26,6 +26,7 @@
* [`quoteColumns`](#examples-quote-columns)
* [`quoteHeaders`](#examples-quote-headers)
* [Transforming Rows](#examples-transforming)
* [Appending To A CSV](#examples-appending)

<a name="options"></a>
## Options
@@ -52,6 +53,8 @@
    * If there is not a headers row and you want to provide one then set to a `string[]`
        * **NOTE** If the row is an object the headers must match fields in the object, otherwise you will end up with empty fields
        * **NOTE** If there are more headers than columns then additional empty columns will be added
* `alwaysWriteHeaders: {boolean} = false`: Set to true if you always want headers written, even if no rows are written (see the sketch below this options list).
    * **NOTE** This will throw an error if headers are not specified as an array.
* `quoteColumns: {boolean|boolean[]|{[string]: boolean}} = false`
    * If `true` then columns and headers will be quoted (unless `quoteHeaders` is specified).
    * If it is an object then each key that has a true value will be quoted (unless `quoteHeaders` is specified).
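As a rough sketch of the new `alwaysWriteHeaders` option (assuming `csv` is required as in the README), formatting a stream with no rows still emits the header row when headers are given as an array:

```javascript
const csv = require('fast-csv');

const stream = csv.format({ headers: ['header1', 'header2'], alwaysWriteHeaders: true });
stream.pipe(process.stdout);

// no rows are written, but the header row is still emitted
stream.end(); // prints: header1,header2
```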
@@ -836,4 +839,88 @@ VALUE1A,VALUE2A
VALUE1A,VALUE2A
VALUE1A,VALUE2A
VALUE1A,VALUE2A
```

<a name="examples-appending"></a>
### Appending To A CSV

[`examples/formatting/append.example.js`](../examples/formatting/append.example.js)

In this example a new CSV is created and then appended to.

```javascript
const path = require('path');
const fs = require('fs');
const csv = require('fast-csv');

const write = (filestream, rows, options) => {
    return new Promise((res, rej) => {
        csv.writeToStream(filestream, rows, options)
            .on('error', err => rej(err))
            .on('finish', () => res());
    });
};

// create a new csv
const createCsv = (filePath, rows) => {
    const csvFile = fs.createWriteStream(filePath);
    return write(csvFile, rows, { headers: true, includeEndRowDelimiter: true });
};

// append the rows to the csv
const appendToCsv = (filePath, rows = []) => {
    const csvFile = fs.createWriteStream(filePath, { flags: 'a' });
    // notice how headers are set to false
    return write(csvFile, rows, { headers: false });
};

// read the file
const readFile = filePath => {
    return new Promise((res, rej) => {
        fs.readFile(filePath, (err, contents) => {
            if (err) {
                return rej(err);
            }
            return res(contents);
        });
    });
};

const csvFilePath = path.resolve(__dirname, 'tmp', 'append.csv');

// 1. create the csv
createCsv(csvFilePath, [
    { a: 'a1', b: 'b1', c: 'c1' },
    { a: 'a2', b: 'b2', c: 'c2' },
    { a: 'a3', b: 'b3', c: 'c3' },
])
    .then(() => {
        // 2. append to the csv
        return appendToCsv(csvFilePath, [
            { a: 'a4', b: 'b4', c: 'c4' },
            { a: 'a5', b: 'b5', c: 'c5' },
            { a: 'a6', b: 'b6', c: 'c6' },
        ]);
    })
    .then(() => readFile(csvFilePath))
    .then(contents => {
        console.log(`${contents}`);
    })
    .catch(err => {
        console.error(err.stack);
        process.exit(1);
    });
```

Expected output

```
a,b,c
a1,b1,c1
a2,b2,c2
a3,b3,c3
a4,b4,c4
a5,b5,c5
a6,b6,c6
```
43 changes: 41 additions & 2 deletions docs/parsing.md
@@ -13,6 +13,7 @@
* [First Row As Headers](#csv-parse-first-row-as-headers)
* [Custom Headers](#csv-parse-custom-headers)
* [Renaming Headers](#csv-parse-renaming-headers)
* [Transforming Headers](#csv-parse-transforming-headers)
* [Skipping Columns](#csv-parse-skipping-columns)
* [Ignoring Empty Rows](#csv-parse-ignoring-empty-rows)
* [Transforming Rows](#csv-parse-transforming)
@@ -31,12 +32,16 @@
    * `"first,name",last name`
* `escape: {string} = '"'`: The character used to escape quotes inside of a quoted field.
    * `i.e.`: `First,"Name"' => '"First,""Name"""`
* `headers: {boolean|string[]} = false`:
* `headers: {boolean|string[]|((string[]) => string[])} = false`:
    * If you want the first row to be treated as headers then set to `true`
    * If there is not a headers row and you want to provide one then set to a `string[]`
    * If you wish to discard the first row and use your own headers set to a `string[]` and set the `renameHeaders` option to `true`
* `renameHeaders: {boolean} = false`: If you want the first line of the file to be removed and replaced by the one provided in the `headers` option.
    * If you wish to transform the headers you can provide a transform function.
        * **NOTE** This will always rename the headers
    * **NOTE** If the headers, whether parsed, provided, or transformed, are NOT unique, then an error will be emitted and the stream will stop parsing (see the sketch after this options list).
* `renameHeaders: {boolean} = false`: If you want the first line of the file to be removed and replaced by the one provided in the `headers` option
    * **NOTE** This option should only be used if the `headers` option is a `string[]`
    * **NOTE** If the `headers` option is a function then this option is always set to true.
* `ignoreEmpty: {boolean} = false`: If you wish to ignore empty rows.
    * **NOTE** this will discard columns that are all white space or delimiters.
* `comment: {string} = null`: If your CSV contains comments you can use this option to ignore lines that begin with the specified character (e.g. `#`).
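For illustration, a minimal sketch of the duplicate-header behavior noted above (the exact error message may differ):

```javascript
const csv = require('fast-csv');

const stream = csv
    .parse({ headers: true })
    .on('error', error => console.error(error.message)) // non-unique headers are rejected
    .on('data', row => console.log(row)) // not reached; parsing stops on the error
    .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));

stream.write('col1,col1\na1,a2\n');
stream.end();
```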
@@ -337,6 +342,39 @@ Expected output
Parsed 2 rows
```

<a name="csv-parse-transforming-headers"></a>
### Transforming Headers

If the CSV contains a header row but you want to transform the headers, you can provide a function to the `headers` option.

[`examples/parsing/transform_headers.example.js`](../examples/parsing/transform_headers.example.js)

```javascript
const { EOL } = require('os');

const CSV_STRING = ['header1,header2', 'a1,b1', 'a2,b2'].join(EOL);

const stream = csv
    .parse({
        headers: headers => headers.map(h => h.toUpperCase()),
    })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```

Expected output

```
{ HEADER1: 'a1', HEADER2: 'b1' }
{ HEADER1: 'a2', HEADER2: 'b2' }
Parsed 2 rows
```

<a name="csv-parse-skipping-columns"></a>
### Skipping Columns

@@ -711,3 +749,4 @@ Parsed 4 rows
```



61 changes: 61 additions & 0 deletions examples/formatting/append.example.js
@@ -0,0 +1,61 @@
const path = require('path');
const fs = require('fs');
const csv = require('../..');

const write = (filestream, rows, options) => {
    return new Promise((res, rej) => {
        csv.writeToStream(filestream, rows, options)
            .on('error', err => rej(err))
            .on('finish', () => res());
    });
};

// create a new csv
const createCsv = (filePath, rows) => {
    const csvFile = fs.createWriteStream(filePath);
    return write(csvFile, rows, { headers: true, includeEndRowDelimiter: true });
};

// append the rows to the csv
const appendToCsv = (filePath, rows = []) => {
    const csvFile = fs.createWriteStream(filePath, { flags: 'a' });
    // notice how headers are set to false
    return write(csvFile, rows, { headers: false });
};

// read the file
const readFile = filePath => {
    return new Promise((res, rej) => {
        fs.readFile(filePath, (err, contents) => {
            if (err) {
                return rej(err);
            }
            return res(contents);
        });
    });
};

const csvFilePath = path.resolve(__dirname, 'tmp', 'append.csv');

// 1. create the csv
createCsv(csvFilePath, [
    { a: 'a1', b: 'b1', c: 'c1' },
    { a: 'a2', b: 'b2', c: 'c2' },
    { a: 'a3', b: 'b3', c: 'c3' },
])
    .then(() => {
        // 2. append to the csv
        return appendToCsv(csvFilePath, [
            { a: 'a4', b: 'b4', c: 'c4' },
            { a: 'a5', b: 'b5', c: 'c5' },
            { a: 'a6', b: 'b6', c: 'c6' },
        ]);
    })
    .then(() => readFile(csvFilePath))
    .then(contents => {
        console.log(`${contents}`);
    })
    .catch(err => {
        console.error(err.stack);
        process.exit(1);
    });
15 changes: 15 additions & 0 deletions examples/parsing/transform_headers.example.js
@@ -0,0 +1,15 @@
const { EOL } = require('os');
const csv = require('../../');

const CSV_STRING = ['header1,header2', 'a1,b1', 'a2,b2'].join(EOL);

const stream = csv
    .parse({
        headers: headers => headers.map(h => h.toUpperCase()),
    })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', rowCount => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
30 changes: 29 additions & 1 deletion package-lock.json
