
draft: browser compatibility #56

Closed
wants to merge 2 commits into from

Conversation

BruceMacD
Collaborator

  • add dynamic import for node specific features

This change allows ollama-js to be used client-side in browser environments by moving Node-specific logic behind a dynamic import. This keeps the interface unchanged while supporting both browsers and Node environments (which have access to the filesystem).

Node-specific features:

  • in generate and chat, reading images from the filesystem when specified by path
  • in create, reading Modelfiles from the filesystem when specified by path
  • in create, creating blobs on the server from the local filesystem if they do not exist
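The dynamic-import approach can be sketched roughly as follows. This is a simplified illustration, not the PR's actual code; the helper name `fileToBase64` is hypothetical:

```typescript
// Sketch: keep Node-only filesystem logic behind a dynamic import so the
// same entry point works in browsers (function name is illustrative).
async function fileToBase64(path: string): Promise<string> {
  if (typeof window === 'undefined') {
    // Node: dynamically import fs so bundlers can exclude it from browser builds
    const fs = await import('node:fs/promises')
    const bytes = await fs.readFile(path)
    return bytes.toString('base64')
  }
  // Browser: no filesystem access; callers must pass bytes (e.g. a Uint8Array) instead
  throw new Error('reading files by path is not supported in the browser')
}
```

Because the import only runs on the Node branch, browser bundles never pull in `fs`, and the public API can accept either a path or raw bytes unchanged.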

Here is a full example of using ollama-js in a client-side React page:

import { useEffect } from "react";
import ollama, {Message} from 'ollama';

export default function Home() {
  useEffect(() => {
    (async () => {
      // generate
    
      console.log('generating...')
    
      const genStream = await ollama.generate({
        model: 'llama2',
        prompt: "Say 'Hello, World!'",
        stream: true,
      })
      try {
        for await (const chunk of genStream) {
          // Each chunk of the generate stream carries a partial `response` string
          console.log(chunk.response); // Output each partial response to the console
          // Or, update the DOM with the content as needed
        }
      } catch (error) {
        console.error('Error while reading from genStream:', error);
      }
      console.log('\n')
    
      const genResp = await ollama.generate({
        model: 'llama2',
        prompt: "Say 'Hello, World!'",
      })
      console.log(genResp.response)
      console.log('\n')
    
      console.log('generating multimodal...')
      const img = await fetch('/cat.jpg');
      const blob = await img.blob();
      const arrayBuffer = await blob.arrayBuffer();
      const uint8Array = new Uint8Array(arrayBuffer);
      const mmgenerate = await ollama.generate({
        model: 'llava',
        prompt: 'what do you see?',
        images: [uint8Array]
      })
      console.log(mmgenerate.response)
      console.log('\n')
    
      // chat
    
      console.log('chatting...')
    
      const msgs: Message[] = [
        { role: 'system', content: 'you are mario' },
        { role: 'user', content: 'hello' },
      ]
      const chatStream = await ollama.chat({ model: 'llama2', messages: msgs, stream: true })
      try {
        for await (const chunk of chatStream) {
          // Each chunk of the chat stream carries a partial `message` object
          console.log(chunk.message.content); // Output each message content to the console
          // Or, update the DOM with the message content as needed
        }
      } catch (error) {
        console.error('Error while reading from chatStream:', error);
      }
      console.log('\n')
    
      const resp = await ollama.chat({ model: 'llama2', messages: msgs })
      console.log(resp.message.content)
      console.log('\n')

      const mmChatUint8Array = await ollama.chat({
        model: 'llava',
        messages: [{ role: 'user', content: 'what do you see', images: [uint8Array] }],
      })
      console.log(mmChatUint8Array.message.content)
      console.log('\n')
  
      // Test with a base64-encoded string
      // (note: Buffer is a Node global; browsers need a bundler polyfill)
      const b64img = Buffer.from(uint8Array).toString('base64')
      const mmChatB64 = await ollama.chat({
        model: 'llava',
        messages: [{ role: 'user', content: 'what do you see', images: [b64img] }],
      })
      console.log(mmChatB64.message.content)
      console.log('\n')
    
      // pull
    
      console.log('pulling...')
    
      const pullStream = await ollama.pull({ model: 'llama2', stream: true })
      for await (const chunk of pullStream) {
        console.log(chunk.status)
      }
      console.log('\n')
    
      const pulled = await ollama.pull({ model: 'llama2' })
      console.log(pulled.status)
      console.log('\n')
    
      // create
    
      console.log('creating...')
    
      const modelFile = `FROM tinyllama
SYSTEM """
You are Mario from super mario bros, acting as an assistant.
"""`
      const createFromModelfile = await ollama.create({
        model: 'brxce/mario',
        modelfile: modelFile,
      })
      console.log(createFromModelfile.status)
      console.log('\n')
    
      // push
    
      console.log('pushing...')
      const pushStream = await ollama.push({ model: 'brxce/mario', stream: true })
      for await (const chunk of pushStream) {
        console.log(chunk.status)
      }
      console.log('\n')
    
      const pushed = await ollama.push({ model: 'brxce/mario' })
      console.log(pushed.status)
      console.log('\n')
    
      // delete
    
      const deleted = await ollama.delete({ model: 'brxce/mario' })
      console.log(deleted)
      console.log('\n')
    
      // copy
    
      const copied = await ollama.copy({ source: 'llama2', destination: 'brxce/mario2' })
      console.log(copied)
    
      // show
    
      const shown = await ollama.show({ model: 'llama2' })
      console.log(shown)
    
      // list
    
      const models = await ollama.list()
      console.log(models.models)
    })()
    
  }, []);

  return (
    <main />
  );
}
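One caveat with the example above: `Buffer` is a Node global, so in a browser it is only available if the bundler polyfills it. A dependency-free sketch of the same base64 conversion (the helper name `uint8ToBase64` is illustrative):

```typescript
// Sketch: base64-encode bytes without Node's Buffer, using btoa
// (available in browsers, and as a global in Node 16+).
function uint8ToBase64(bytes: Uint8Array): string {
  let binary = ''
  for (const b of bytes) {
    // Build a binary string byte-by-byte, then base64-encode it
    binary += String.fromCharCode(b)
  }
  return btoa(binary)
}
```

With this, `Buffer.from(uint8Array).toString('base64')` could become `uint8ToBase64(uint8Array)` in browser-only code.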

resolves #25
resolves #13

@BruceMacD BruceMacD changed the title browser compatibility draft: browser compatibility Feb 27, 2024
@BruceMacD
Collaborator Author

closing in favour of #47

@BruceMacD BruceMacD closed this Feb 27, 2024
Successfully merging this pull request may close these issues:

  • allow client-side library use
  • Browser compatible?