
TypeError: Object of type function is not JSON serializable #161

Open
Kam718 opened this issue Dec 10, 2024 · 3 comments

Comments


Kam718 commented Dec 10, 2024

I've been working on a personal project that uses RealtimeSTT, since it's been the easiest package for me to integrate. Thank you for your work on it!

I'd been resolving issues on my own until I hit this one, and now I can't get past it. My brain hurts. (I'm not a programmer, and I wouldn't even call myself a beginner dev, but I've spent the past two years trying to learn new things, so here I am.)

I am running:
python==3.9.13
torch==2.3.1
CUDA==118
latest version of LM Studio
openai==1.9.0

(let me know if there is any lib I missed)

import shutil
import json
import os
import requests
import winsound
from gradio_client import Client as clienttts
from openai import OpenAI
from RealtimeSTT import AudioToTextRecorder
import pyautogui
import time

history = [

    {"role": "system", "content": 'Answer precise and short responses. Also you are a friendly, NEVER say you are an AI assistant or speak in third person. You are a tech-savvy 25-year-old named Bella who loves Hello Kitty, dogs, boba, video games, pop culture and technology. You are always up for a chat and love to help people. You are not interested in discussing politics or racism. Let us have some fun! Keep the responses very short and focused on keeping the conversation evolving.'},
    {"role": "user", "content": "Hello, are you ready to begin today's twitch stream with me Bernie? Be concise with all of your answers please."}
]

def process_text(tteexxtt):
    print(tteexxtt)
#    print(text1)
#    print(text1)
#    print(text1)

def process_lmstudio(history):
    client = OpenAI(base_url="http://172.30.100.206:1111/v1", api_key="not-needed")
    completion = client.chat.completions.create(
        model="local-model", # this field is currently unused
        messages=history,
        temperature=0.8,
        stream=True,
    )

    new_message = {"role": "assistant", "content": ""}

if __name__ == '__main__':
    print("Wait until it says 'speak now'")
    recorder = AudioToTextRecorder()
    
    while True:
        recorder.text(process_text)
        assisstant_reponse = process_lmstudio(history.copy())
        print()
        history.append({"role": "user", "content": process_text})

ERROR:

    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type function is not JSON serializable

I have LM Studio running on another computer, and while watching its console I never see it connect or even attempt to connect, so I think my script is failing before it reaches that point in the code.
I feel like I've looked at everything I could. I just can't get the STT portion to talk to the LM Studio LLM. Any help is much appreciated, and thank you for your hard work on the STT and TTS.
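For context, the traceback comes from Python's `json` encoder: the OpenAI client serializes the `messages` list with `json`, and a message whose `content` is a function object fails with exactly this error. A minimal reproduction of the failure, using nothing beyond the standard library:

```python
import json

def process_text(text):
    print(text)

# Appending the function itself instead of a transcribed string -- the same
# mistake as `history.append({"role": "user", "content": process_text})`
history = [{"role": "user", "content": process_text}]

try:
    json.dumps(history)
except TypeError as e:
    print(e)  # Object of type function is not JSON serializable
```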


Kam718 commented Dec 10, 2024

I'm also trying to run a llama3.2-1b GGUF, in case that helps with troubleshooting.


Kam718 commented Dec 10, 2024

I think I found my issue. I was missing this:

    new_message = {"role": "assistant", "content": ""}
    for chunk in completion:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
            new_message["content"] += chunk.choices[0].delta.content

    history.append(new_message)
    return new_message["content"] 

KoljaB (Owner) commented Dec 10, 2024

Yeah, the culprit was this line:
history.append({"role": "user", "content": process_text})

It appends the function object itself rather than the transcribed text, which is what the JSON encoder chokes on.

RealtimeSTT has nothing to do with this.
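A minimal sketch of the fix, with `transcription` as a hypothetical stand-in for whatever string the recorder produced: append the string, not the function object, and serialization succeeds:

```python
import json

transcription = "Hello Bella"  # stand-in for the recorder's transcribed output

history = []
history.append({"role": "user", "content": transcription})  # a str, not a function

# The history now serializes cleanly, so the OpenAI client can send it
print(json.dumps(history))  # -> [{"role": "user", "content": "Hello Bella"}]
```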
