Hello,

While looking through this question, I was looking at the punctuation in your code and the Chinese language test in your tests. The Chinese comma ("，") seems to be missing from the punctuation in `_tokenize`, here. Since Chinese rarely uses periods or spaces, it is entirely possible that he was getting a maximum recursion error from a span of 99+ characters broken only by commas, with no spaces. I thought I would let you know, as I can't imagine the fix requires more than inserting that one character into the list.
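To illustrate the failure mode, here is a minimal sketch of punctuation-based splitting (not gTTS's actual `_tokenize`; the `PUNCTUATION` list and `MAX_CHARS` limit are assumptions for the example). Without "，" in the list, a long Chinese sentence never finds a split point; with it, the text cuts cleanly at each comma:

```python
import re

MAX_CHARS = 100  # hypothetical per-request character limit

# Hypothetical punctuation list; note the full-width Chinese comma "，"
PUNCTUATION = "?!？！.,¡()[]¿…;:—。，、"

def split_on_punctuation(text):
    """Split text into chunks no longer than MAX_CHARS,
    preferring punctuation boundaries."""
    pattern = "[" + re.escape(PUNCTUATION) + "]"
    parts = [p.strip() for p in re.split(pattern, text) if p.strip()]
    chunks = []
    for part in parts:
        # Fall back to hard cuts if a part still exceeds the limit
        while len(part) > MAX_CHARS:
            chunks.append(part[:MAX_CHARS])
            part = part[MAX_CHARS:]
        if part:
            chunks.append(part)
    return chunks

if __name__ == "__main__":
    zh = "这是一个很长的句子，" * 12  # commas, but no spaces or periods
    for chunk in split_on_punctuation(zh):
        print(len(chunk), chunk)
```

With "，" absent from `PUNCTUATION`, the demo string above would arrive as one unsplittable 120-character run, which is exactly the situation where a recursive splitter can blow the recursion limit.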
Sorry for the super long delay on this, but I wanted to thank you a lot for it. I had not seen the original issue (I should set a watch on SO), but I'm currently refactoring a lot of gTTS, and this dragged me down quite an interesting rabbit hole to deeper underlying issues (which are fixed for the upcoming 2.0.0). I added the Chinese comma too, but the problem was deeper than that!