- Web Standards first.
  - Utilizing the Web Standards APIs, such as the Web Streams API.
- TypeScript friendly & User friendly.
  - Fully typed and documented.
- Zero dependencies.
  - Using only Web Standards APIs.
- Property-based testing.
  - Using fast-check and vitest.
- Cross-platform.
  - Works on browsers, Node.js, and Deno.
- Efficient CSV Parsing with Streams
  - Leveraging the WHATWG Streams API and other Web APIs for seamless and efficient data processing.
- Flexible Source Support
  - Parse CSVs directly from `string`s, `ReadableStream`s, or `Response` objects.
- Advanced Parsing Options: Customize your experience with various delimiters and quotation marks.
  - Defaults to `,` and `"` respectively.
- Specialized Binary CSV Parsing: Leverage stream-based processing for versatility and strength.
  - Flexible BOM handling.
  - Supports various compression formats.
  - Charset specification for diverse encodings.
- Lightweight and Zero Dependencies: No external dependencies, only Web Standards APIs.
- Fully Typed and Documented: Fully typed and documented with TypeDoc.
- Using WebAssembly for High Performance: WebAssembly is used for high-performance parsing. (Experimental)
This package can be installed using a package manager:
# Install with npm
$ npm install web-csv-toolbox
# Or Yarn
$ yarn add web-csv-toolbox
# Or pnpm
$ pnpm add web-csv-toolbox
<script src="https://unpkg.com/web-csv-toolbox"></script>
<script>
const csv = `name,age
Alice,42
Bob,69`;
(async function () {
for await (const record of CSV.parse(csv)) {
console.log(record);
}
})();
</script>
<script type="module">
import { parse } from 'https://unpkg.com/web-csv-toolbox?module';
const csv = `name,age
Alice,42
Bob,69`;
for await (const record of parse(csv)) {
console.log(record);
}
</script>
In Deno, you can install and use the package by importing it with the `npm:` specifier:
import { parse } from "npm:web-csv-toolbox";
import { parse } from 'web-csv-toolbox';
const csv = `name,age
Alice,42
Bob,69`;
for await (const record of parse(csv)) {
console.log(record);
}
// Prints:
// { name: 'Alice', age: '42' }
// { name: 'Bob', age: '69' }
import { parse } from 'web-csv-toolbox';
const csv = `name,age
Alice,42
Bob,69`;
const stream = new ReadableStream({
start(controller) {
controller.enqueue(csv);
controller.close();
},
});
for await (const record of parse(stream)) {
console.log(record);
}
// Prints:
// { name: 'Alice', age: '42' }
// { name: 'Bob', age: '69' }
import { parse } from 'web-csv-toolbox';
const response = await fetch('https://example.com/data.csv');
for await (const record of parse(response)) {
console.log(record);
}
// Prints:
// { name: 'Alice', age: '42' }
// { name: 'Bob', age: '69' }
import { parse } from 'web-csv-toolbox';
const csv = `name\tage
Alice\t42
Bob\t69`;
for await (const record of parse(csv, { delimiter: '\t' })) {
console.log(record);
}
// Prints:
// { name: 'Alice', age: '42' }
// { name: 'Bob', age: '69' }
import { parse } from 'web-csv-toolbox';
const csv = `Alice,42
Bob,69`;
for await (const record of parse(csv, { headers: ['name', 'age'] })) {
console.log(record);
}
// Prints:
// { name: 'Alice', age: '42' }
// { name: 'Bob', age: '69' }
Versions | Status |
---|---|
20.x | ✅ |
18.x | ✅ |
OS | Chrome | Firefox | Default |
---|---|---|---|
Windows | ✅ | ✅ | ✅ (Edge) |
macOS | ✅ | ✅ | ⬜ (Safari *) |
Linux | ✅ | ✅ | - |
* To Be Tested: I couldn't launch Safari in headless mode on GitHub Actions, so I couldn't verify it, but it probably works.
These APIs are designed for Simplicity and Ease of Use, providing an intuitive and straightforward experience for users.
function parse(input[, options]): AsyncIterableIterator<CSVRecord>
: Parses various CSV input formats into an asynchronous iterable of records.

function parse.toArray(input[, options]): Promise<CSVRecord[]>
: Parses CSV input into an array of records, ideal for smaller data sets. See the sketch after the parameter notes below.
The `input` parameter can be a `string`, a `ReadableStream` of `string`s or `Uint8Array`s, a `Uint8Array`, an `ArrayBuffer`, or a `Response` object.
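
As a minimal sketch of the array-based variant, using the same sample data as above:

import { parse } from 'web-csv-toolbox';

const csv = `name,age
Alice,42
Bob,69`;

// Collect every record into an array instead of iterating one by one.
const records = await parse.toArray(csv);
console.log(records);
// Prints:
// [{ name: 'Alice', age: '42' }, { name: 'Bob', age: '69' }]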
These APIs are optimized for Enhanced Performance and Control, catering to users who need more detailed and fine-tuned functionality. A usage sketch follows the list below.
function parseString(string[, options])
: Efficient parsing of CSV strings.

function parseBinary(buffer[, options])
: Parses CSV binary data from an ArrayBuffer or Uint8Array.

function parseResponse(response[, options])
: Customized parsing directly from Response objects.

function parseStream(stream[, options])
: Stream-based parsing for larger or continuous data.

function parseStringStream(stream[, options])
: Combines string-based parsing with stream processing.

function parseUint8ArrayStream(stream[, options])
: Parses binary streams with precise control over data types.
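
A rough sketch of two of these variants; it assumes parseString and parseStringStream yield records the same way parse does, as their signatures above suggest:

import { parseString, parseStringStream } from 'web-csv-toolbox';

const csv = `name,age
Alice,42
Bob,69`;

// String-specific parsing.
for await (const record of parseString(csv)) {
  console.log(record);
}

// Stream-based parsing of the same data.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(csv);
    controller.close();
  },
});
for await (const record of parseStringStream(stream)) {
  console.log(record);
}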
These APIs are built for Advanced Customization and Pipeline Design, ideal for developers looking for in-depth control and flexibility. A pipeline sketch follows the list below.
class LexerTransformer
: A TransformStream class for lexical analysis of CSV data.

class RecordAssemblerTransformer
: Handles the assembly of parsed data into records.
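
The sketch below wires both classes into a manual pipeline. It assumes LexerTransformer consumes chunks of CSV text and RecordAssemblerTransformer emits one object per record, as their descriptions above imply:

import { LexerTransformer, RecordAssemblerTransformer } from 'web-csv-toolbox';

await new ReadableStream({
  start(controller) {
    controller.enqueue('name,age\n');
    controller.enqueue('Alice,42\nBob,69\n');
    controller.close();
  },
})
  // Tokenize the raw text, then assemble the tokens into records.
  .pipeThrough(new LexerTransformer())
  .pipeThrough(new RecordAssemblerTransformer())
  .pipeTo(
    new WritableStream({
      write(record) {
        console.log(record);
      },
    }),
  );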
These APIs are experimental and may change in the future.
You can use WebAssembly to parse CSV data for high performance.
- Parsing with WebAssembly is faster than parsing with JavaScript, but it takes time to load the WebAssembly module.
- Supports only UTF-8 encoded CSV data.
- The only supported quotation character is `"` (double quotation mark). Passing a different character throws an error.
import { loadWASM, parseStringToArraySyncWASM } from "web-csv-toolbox";
// load WebAssembly module
await loadWASM();
const csv = "a,b,c\n1,2,3";
// parse CSV string
const result = parseStringToArraySyncWASM(csv);
console.log(result);
// Prints:
// [{ a: "1", b: "2", c: "3" }]
function loadWASM(): Promise<void>
: Loads the WebAssembly module.

function parseStringToArraySyncWASM(string[, options]): CSVRecord[]
: Parses CSV strings into an array of records.
Option | Description | Default | Notes |
---|---|---|---|
`delimiter` | Character to separate fields | `,` | |
`quotation` | Character used for quoting fields | `"` | |
`headers` | Custom headers for the parsed records | First row | If not provided, the first row is used as headers. |
Option | Description | Default | Notes |
---|---|---|---|
`charset` | Character encoding for binary CSV inputs | `utf-8` | See Encoding API Compatibility for the encodings that can be specified. |
`decompression` | Decompression algorithm for compressed CSV inputs | | See DecompressionStream Compatibility for the formats that can be specified. |
`ignoreBOM` | Whether to ignore the Byte Order Mark (BOM) | `false` | See TextDecoderOptions.ignoreBOM for more information about the BOM. |
`fatal` | Throw an error on invalid characters | `false` | See TextDecoderOptions.fatal for more information. |
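
A sketch of combining these options when parsing a fetched, compressed CSV. The URL, character encoding, and compression format here are illustrative assumptions:

import { parse } from 'web-csv-toolbox';

// Assumes the server responds with a gzip-compressed, Shift_JIS-encoded CSV.
const response = await fetch('https://example.com/data.csv.gz');

for await (const record of parse(response, {
  charset: 'shift-jis',  // decode with the specified character encoding
  decompression: 'gzip', // decompress before decoding
  ignoreBOM: true,       // skip a leading Byte Order Mark if present
  fatal: true,           // throw on invalid byte sequences
})) {
  console.log(record);
}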
The easiest way to contribute is to use the library and star the repository.
Feel free to ask questions on GitHub Discussions.
Please register at GitHub Issues.
Please support kamiazya.
Even just a dollar is enough motivation to keep developing.
This software is released under the MIT License, see LICENSE.