Reuses compiled regex, fixes #10, and flexible m3u generation #11

Merged 2 commits on Jul 21, 2021

Conversation

@dineiar (Contributor) commented Jul 21, 2021

By reusing compiled regexes, we avoid recompiling them for each stream item being processed, and they now sit alongside the other regexes defined in the M3uParser class.
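
A minimal sketch of the pattern (class and attribute names here are illustrative, not necessarily the ones used in M3uParser):

```python
import re

class M3uParser:
    # Compiled once when the class is defined and reused for every stream
    # entry, instead of compiling the pattern again for each #EXTINF line.
    _tvg_name_regex = re.compile(r'tvg-name="(.*?)"')
    _tvg_id_regex = re.compile(r'tvg-id="(.*?)"')

    def _parse_extinf(self, line):
        # Reuse the precompiled patterns for each stream item.
        name = self._tvg_name_regex.search(line)
        tvg_id = self._tvg_id_regex.search(line)
        return {
            "tvg-name": name.group(1) if name else None,
            "tvg-id": tvg_id.group(1) if tvg_id else None,
        }
```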

I added extra flexibility when parsing the M3U: the only field that is always stored is the url (which is always present). All other fields are stored only if they were present in the original M3U. If a field was defined as an empty string (e.g. #EXTINF:-1 tvg-id="", Google), it is stored as an empty string.
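
Roughly, the parsing now behaves like this (a simplified sketch; the real field and helper names in the codebase may differ):

```python
def build_stream_info(extinf_line, url, patterns):
    # url is the only field that is always stored.
    info = {"url": url}
    for field, regex in patterns.items():
        match = regex.search(extinf_line)
        if match is not None:
            # Fields are stored only when present in the original #EXTINF line;
            # an empty attribute such as tvg-id="" is kept as an empty string.
            info[field] = match.group(1)
    return info
```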

I also changed the way the M3U is generated to avoid repeatedly concatenating large strings in memory, which consumes a lot of memory and slows down as the string grows (I'm processing M3U files of several megabytes). Appending lines to a list avoids these issues, since the full string is only built once, at the end of processing.
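
The idea, in simplified form (method and field names are assumptions, not the exact ones in the codebase):

```python
def to_m3u(streams_info):
    # Collect lines in a list and join them once at the end, instead of
    # repeatedly concatenating onto one growing string.
    lines = ["#EXTM3U"]
    for stream in streams_info:
        attrs = " ".join(
            f'{key}="{value}"'
            for key, value in stream.items()
            if key not in ("url", "name")
        )
        # Only the fields present in the parsed entry appear in the output.
        lines.append(f"#EXTINF:-1 {attrs}, {stream.get('name', '')}")
        lines.append(stream["url"])
    return "\n".join(lines)
```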

This way, when generating JSON or M3U, only the fields that were originally present in the parsed M3U are included.

For example, with this input:

```
#EXTM3U
#EXTINF:-1 tvg-id="google" tvg-name="Google" tvg-logo="" group-title="Google Group", Google
http://www.google.com/
```

Now, using parse_m3u and then to_file( ... , 'm3u'), the generated file is exactly the same as the original (parsed) M3U.
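
A rough usage sketch of that round trip (file names are placeholders and argument defaults may differ between versions):

```python
from m3u_parser import M3uParser

parser = M3uParser()
parser.parse_m3u("input.m3u")        # parse the original playlist
parser.to_file("output.m3u", "m3u")  # output.m3u matches input.m3u
```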

Related to #7 and #10

@pawanpaudel93 pawanpaudel93 merged commit eaa2d87 into pawanpaudel93:master Jul 21, 2021