Releases: huggingface/transformers.js
2.2.0
What's new?
Multilingual speech recognition and translation w/ Whisper
You can now transcribe and translate speech for over 100 different languages, directly in your browser, with Whisper! Play around with our demo application here.
Example: Transcribe English.
import { pipeline } from '@xenova/transformers';

let url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav';
let transcriber = await pipeline('automatic-speech-recognition', 'Xenova/whisper-tiny.en');
let output = await transcriber(url);
// { text: " And so my fellow Americans ask not what your country can do for you, ask what you can do for your country." }
Example: Transcribe English w/ timestamps.
let url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav';
let transcriber = await pipeline('automatic-speech-recognition', 'Xenova/whisper-tiny.en');
let output = await transcriber(url, { return_timestamps: true });
// {
// text: " And so my fellow Americans ask not what your country can do for you, ask what you can do for your country."
// chunks: [
// { timestamp: [0, 8], text: " And so my fellow Americans ask not what your country can do for you" }
// { timestamp: [8, 11], text: " ask what you can do for your country." }
// ]
// }
Example: Transcribe French.
let url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/french-audio.mp3';
let transcriber = await pipeline('automatic-speech-recognition', 'Xenova/whisper-small');
let output = await transcriber(url, { language: 'french', task: 'transcribe' });
// { text: " J'adore, j'aime, je n'aime pas, je déteste." }
Example: Translate French to English.
let url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/french-audio.mp3';
let transcriber = await pipeline('automatic-speech-recognition', 'Xenova/whisper-small');
let output = await transcriber(url, { language: 'french', task: 'translate' });
// { text: " I love, I like, I don't like, I hate." }
Misc
- Aligned the .generate() function with the original Python implementation (see the sketch below)
- Minor improvements to documentation (+ some examples). More to come in the future.
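Because generation options now flow through like they do in Python, you can pass parameters such as max_new_tokens directly to a pipeline call. A minimal sketch, assuming a text2text-generation model exported for Transformers.js (the model and values below are illustrative, not taken from this release):

import { pipeline } from '@xenova/transformers';

// Illustrative example: forward a generation option (max_new_tokens) to .generate().
let generator = await pipeline('text2text-generation', 'Xenova/t5-small');
let output = await generator('translate English to German: I love transformers!', {
    max_new_tokens: 50,
});
console.log(output);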
Full Changelog: 2.1.1...2.2.0
2.1.1
2.1.0
What's new?
Improved feature extraction pipeline for Embeddings
You can now perform feature extraction with models other than sentence-transformers models! All you need to do is target a repo (and/or revision) that was exported with --task default. Also be sure to use the correct quantization for your use case (see the sketch below).
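For example, the pipeline options include a quantized flag (enabled by default), so you can opt back into the full-precision weights when accuracy matters more than download size. A hedged sketch, assuming the repo also hosts unquantized ONNX weights:

import { pipeline } from '@xenova/transformers';

let extractor = await pipeline('feature-extraction', 'Xenova/bert-base-uncased', {
    revision: 'default', // revision exported with --task default
    quantized: false,    // assumption: unquantized weights are available in this repo
});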
Example: Run feature extraction with bert-base-uncased (without pooling/normalization).
let extractor = await pipeline('feature-extraction', 'Xenova/bert-base-uncased', { revision: 'default' });
let result = await extractor('This is a simple test.');
console.log(result);
// Tensor {
// type: 'float32',
// data: Float32Array [0.05939924716949463, 0.021655935794115067, ...],
// dims: [1, 8, 768]
// }
Example: Run feature extraction with bert-base-uncased (with pooling/normalization).
let extractor = await pipeline('feature-extraction', 'Xenova/bert-base-uncased', { revision: 'default' });
let result = await extractor('This is a simple test.', { pooling: 'mean', normalize: true });
console.log(result);
// Tensor {
// type: 'float32',
// data: Float32Array [0.03373778983950615, -0.010106077417731285, ...],
// dims: [1, 768]
// }
Example: Calculating embeddings with sentence-transformers models.
let extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');
let result = await extractor('This is a simple test.', { pooling: 'mean', normalize: true });
console.log(result);
// Tensor {
// type: 'float32',
// data: Float32Array [0.09094982594251633, -0.014774246141314507, ...],
// dims: [1, 384]
// }
This also means you can do things like semantic search directly in JavaScript/TypeScript! Check out the Pinecone docs for an example app that uses Transformers.js!
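As a rough sketch of what that can look like, here is a hand-rolled cosine similarity over two embeddings (the cosineSimilarity helper and example sentences below are for illustration only and are not part of the library):

import { pipeline } from '@xenova/transformers';

let extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');

// Embed a query and a candidate document with mean pooling + normalization.
let query = await extractor('How do I run transformers in the browser?', { pooling: 'mean', normalize: true });
let doc = await extractor('Transformers.js lets you run models client-side.', { pooling: 'mean', normalize: true });

// Since the vectors are normalized, cosine similarity reduces to a dot product.
function cosineSimilarity(a, b) {
    let dot = 0;
    for (let i = 0; i < a.length; ++i) dot += a[i] * b[i];
    return dot;
}

console.log(cosineSimilarity(query.data, doc.data));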
Over 100 Transformers.js models on the hub!
We now have 109 models to choose from! Check them out at https://huggingface.co/models?other=transformers.js! If you'd like to contribute models (exported with Optimum), you can tag them with library_name: "transformers.js"! Let's make ML more web-friendly!
Misc
- Fixed various quantization/exporting issues
Full Changelog: 2.0.2...2.1.0
2.0.2
Fixes issues stemming from ONNX Runtime's recent release of a buggy version 1.15.0 🙄 (https://www.npmjs.com/package/onnxruntime-web).
Also freezes examples and updates links to use the latest stable WASM files.
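If you need to pin the WASM files yourself, the library's env object exposes the underlying onnxruntime-web setting; a hedged sketch (the CDN URL and version below are illustrative, not a recommendation from this release):

import { env } from '@xenova/transformers';

// Assumption: point onnxruntime-web at a known-good set of .wasm files on a CDN.
env.backends.onnx.wasm.wasmPaths = 'https://cdn.jsdelivr.net/npm/onnxruntime-web@1.14.0/dist/';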
2.0.1
2.0.0
Transformers.js v2.0.0
It's finally here! 🔥
Run Hugging Face transformers directly in your browser, with no need for a server!
GitHub: https://github.com/xenova/transformers.js
Demo site: https://xenova.github.io/transformers.js/
Documentation: https://huggingface.co/docs/transformers.js
Main features:
🛠️ Complete ES6 rewrite
📄 Documentation and examples
🤗 Improved Hugging Face Hub integration
🖥️ Server-side model caching (in Node.js)
Dev-related features:
🧪 Improved testing framework w/ Jest
⚙️ CI/CD with GitHub Actions
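A minimal quickstart, to give a flavour of the API (the default model for the task is downloaded and cached on first use; the output comment is approximate):

import { pipeline } from '@xenova/transformers';

// Allocate a pipeline for sentiment analysis and run it entirely client-side.
let classifier = await pipeline('sentiment-analysis');
let result = await classifier('I love transformers!');
console.log(result);
// e.g. [{ label: 'POSITIVE', score: 0.99... }]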
2.0.0-alpha.4
Pre-release for Transformers.js v2.0.0
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.3 with various improvements, including:
- GitHub Actions workflow to build the demo site (https://xenova.github.io/transformers.js/)
- Calculate Whisper mel filters when not present in the processor's config.json
2.0.0-alpha.3
Pre-release for Transformers.js v2.0.0
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.2 but with an added allowLocalModels setting and improved handling of errors (e.g., CORS errors).
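A hedged sketch of how the new setting can be used (assuming the @xenova/transformers entry point):

import { env } from '@xenova/transformers';

// Only fetch models from the Hugging Face Hub; skip the check for locally hosted copies.
env.allowLocalModels = false;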
2.0.0-alpha.2
Pre-release for Transformers.js v2.0.0
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.1 but with an updated jsdelivr entry point in package.json
2.0.0-alpha.1
Pre-release for Transformers.js v2.0.0
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.0 but with CDN-specific entry points in package.json