Fix bugs regarding connector.start() & connector.stop() #34
base: master
Conversation
+ fixes (sousa-andre#18)
+ Each WebSocket registered by the user is now stored in a list called self.connections. This lets us iterate over every stored connection object when connector.stop() is called and unregister each one, so the method can close the connections as intended. This prevents the infinite loop that could previously occur if the @connector.ws.register() decorator was used to register a connection beforehand.
+ When the League Client's "Close client during game" setting was set to "Always", the in-game client was open exclusively during games. This caused lcu-driver's connector.start() method to fail during initialization because the LeagueClientUx process was not running.
+ This change automatically kills the in-game client if it is detected as running during lcu-driver's initialization, allowing lcu-driver to start as intended.
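The "kill the in-game client so LeagueClientUx can relaunch" workaround could be sketched roughly as below. This is only an illustration, not lcu-driver's actual code: the process name and the function name are assumptions, and process enumeration is passed in (in practice it would come from something like psutil.process_iter()) so the matching logic is easy to test in isolation.

```python
# Hypothetical sketch of the start() workaround described above.
# GAME_CLIENT_NAME is an assumed process name, not confirmed by the source.
GAME_CLIENT_NAME = "League of Legends.exe"

def kill_running_game_client(processes):
    """Terminate any process whose name matches the in-game client.

    `processes` is an iterable of objects exposing .name() and .terminate(),
    mirroring the psutil.Process interface. Returns how many were killed.
    """
    killed = 0
    for proc in processes:
        if proc.name() == GAME_CLIENT_NAME:
            # Asking the in-game client to exit lets the launcher bring
            # LeagueClientUx back up, which connector.start() needs.
            proc.terminate()
            killed += 1
    return killed
```

With psutil this would be invoked as `kill_running_game_client(psutil.process_iter())` before the connector starts looking for the client.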
This comment was marked as outdated.
+ fix a ValueError that was raised if connector.stop() was called from some arbitrary function that didn't have the connector.close() function decorator
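One common way to make a stop() call safe from anywhere is to guard it with a running flag, so a stray call is ignored instead of raising. The class and attribute names below are illustrative only, not lcu-driver's implementation:

```python
# Hypothetical sketch of an idempotent stop(); not lcu-driver's actual code.
class SafeStopConnector:
    def __init__(self):
        self._running = False

    def start(self):
        self._running = True

    def stop(self):
        # Calling stop() while not running is a no-op rather than an error,
        # so it can safely be called from any function, decorated or not.
        if not self._running:
            return
        self._running = False
```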
I just realised that you have a completely undocumented and unmentioned class in connector.py called MultipleClientConnector, which perfectly implements everything I was trying to implement in the Connector class. Just write something similar to how you've kept track of and cancelled previously registered websocket connections in MultipleClientConnector into the Connector class, and that will fix #18. I would still also be happy about a merge of the change I labeled as

Edit: I noticed the MultipleClientConnector class didn't have a stop event similar to the Connector class, so I added one, and now it can actually terminate and re-start as intended in 4b81934. Previously it wouldn't actually close out and would instead just get stuck doing nothing whenever
…t into MultipleClientConnector Class
+ Reverted the changes made to the Connector class, due to the many bugs encountered.
+ Implemented a stop event in the MultipleClientConnector class, enabling it to receive stop signals. Similar to the Connector class, this ensures proper termination and restart. Previously, attempting to exit and re-start this class in a loop would result in errors, or it would get stuck instead of closing gracefully.
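The stop-event pattern described above can be sketched with asyncio.Event. The class and method names here are illustrative, not lcu-driver's actual implementation:

```python
import asyncio

# Minimal sketch of a connector whose run loop exits on a stop event,
# so it can terminate cleanly and later be restarted.
class StoppableConnector:
    def __init__(self):
        self._stop_event = None

    async def run(self):
        # Create the event inside the running loop to avoid loop-binding
        # surprises on older Python versions.
        self._stop_event = asyncio.Event()
        while not self._stop_event.is_set():
            # ... poll for clients / pump websocket messages here ...
            await asyncio.sleep(0.01)
        return "stopped"

    def stop(self):
        # Signal the run loop so the connector terminates instead of
        # getting stuck; a fresh run() can then be started afterwards.
        self._stop_event.set()

async def demo():
    conn = StoppableConnector()
    task = asyncio.create_task(conn.run())
    await asyncio.sleep(0.05)
    conn.stop()
    return await task  # returns instead of hanging forever
```

Without the event, the `while` loop above would have no exit condition, which matches the "gets stuck instead of closing gracefully" behaviour described in the thread.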
+ addresses an issue in the WebSocket connection code where JSON decoding and logging were not handled correctly, resulting in errors and incomplete log messages, by using an f-string instead
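A minimal sketch of that kind of fix is shown below: catch the decode error and build the log message with an f-string so the error and payload both appear in full. The function and logger names are assumptions for illustration, not lcu-driver's code:

```python
import json
import logging

logger = logging.getLogger("lcu_ws")  # illustrative logger name

def handle_ws_frame(raw: str):
    """Decode a websocket frame, logging a complete message on failure."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError as exc:
        # The f-string interpolates both the error and the offending
        # payload, so the log entry is complete instead of truncated.
        logger.warning(f"failed to decode websocket frame: {exc} (payload={raw!r})")
        return None
```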
+ previously, when a task started within the event loop tried to kill MCC, it wouldn't work
+ it's kind of a sloppy fix to just add .stop() to the finally clause. For a more robust design, it should eventually be rewritten to use a context manager to close the connection rather than manually calling stop() there.
+ also, ideally we should use asyncio.run() instead of managing the event loop's lifetime within the class itself, so that Python handles it
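The suggested redesign could look something like the sketch below: shutdown is tied to scope exit via an async context manager (so it runs even if the body raises, replacing the manual .stop() in a finally clause), and asyncio.run() owns the event loop's lifetime. All names here are illustrative assumptions, not lcu-driver's API:

```python
import asyncio

# Hypothetical sketch of a context-managed connector.
class ManagedConnector:
    def __init__(self):
        self.closed = False

    async def __aenter__(self):
        # Created inside the running loop provided by asyncio.run().
        self._stop_event = asyncio.Event()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Shutdown runs on every exit path, even if the body raised,
        # so no manual stop() call in a finally clause is needed.
        self._stop_event.set()
        self.closed = True

async def main():
    async with ManagedConnector() as conn:
        pass  # tasks would run here and observe conn._stop_event
    return conn.closed

# asyncio.run() creates and closes the event loop itself, so the class
# no longer has to manage the loop's lifetime.
print(asyncio.run(main()))  # prints True
```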
- killing the in-game client this way isn't really as purposeful as I originally intended
Firstly, thank you for taking the time to open the pull request; it's great to have someone helping 😄.

Regarding the .stop() issue and the undocumented class MultipleClientConnector: I added that class to the library to solve a specific problem that some people were facing (I don't remember exactly which). Since it's close to impossible to cover all use cases of the library, I think it's better to write documentation on how to extend the BaseConnector class to fit your needs than to create classes for specific use cases.

As for the start bug, I agree that it needs to be fixed, but I don't think closing the game itself is the way to go, at least not by default. I would love to hear your opinion on some solutions for this.
The pull request seems alright, even though some commits should have been addressed in a different pull request.
Could this be the exact issue I was referring to with the websockets? The only difference between your Connector class and the MultipleClientConnector (MCC) class is that MCC handles tracking and closing of websockets effectively. My additions to the MCC class allow lcu-driver to stop and restart as intended, so merging that change would be beneficial.

As for the normal Connector class, the issues I mentioned with websockets not closing properly persist. This prevents lcu-driver from closing at all if a websocket has been registered during lcu-driver's runtime. You may consider modifying the Connector class yourself, but since MultipleClientConnector already works well, renaming it to Connector and removing the other buggy class seems like a viable option. I mean, both classes are quite similar, except one works and the other doesn't.
Yes, this is the same conclusion I came to after a week of testing the change (which is why I reverted the changes to utils.py).
Register a websocket and use the normal Connector class, then try to close lcu-driver and see whether it closes or just gets stuck doing nothing, because the stale websocket connection is preventing it from closing.
Oh, are you saying this behaviour is intended; that once lcu-driver has started, it can never be entirely stopped again?
When you register a connection with a websocket, it should block until the client closes or the user calls the
When you call the .stop() method on a Connector instance, not only does the connector stop looking for new clients/connections, it also interrupts the current connection that is blocking while listening to incoming WebSocket messages. As a result, execution returns to the main Python program.
Except that this doesn't work unless registered websockets are actively tracked and closed when a close event is received; otherwise they end up stale, blocking lcu-driver from closing. That is the whole point of these commits: MCC already had the tracking necessary, but it was lacking the stop event to actually close them all out.

MCC can already track multiple registered websockets within a list (lcu_driver/connector.py, line 88 at 768d79b).
While the normal Connector class only tracks the latest registered websocket (?) (lcu_driver/connector.py, line 38 at 768d79b).
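The difference between the two tracking strategies can be sketched as below. The class names are illustrative stand-ins, not lcu-driver's actual code; the point is why a list enables a working stop():

```python
# Sketch: single-attribute tracking loses earlier registrations,
# list tracking lets stop() close every connection.

class SingleTracking:
    def __init__(self):
        self.connection = None  # only the latest connection survives

    def register(self, conn):
        self.connection = conn  # earlier registrations are silently lost

class ListTracking:
    def __init__(self):
        self.connections = []  # every registration is remembered

    def register(self, conn):
        self.connections.append(conn)

    def stop(self):
        # With the full list we can close every connection, so no stale
        # websocket is left blocking shutdown.
        for conn in self.connections:
            conn.close()
        self.connections.clear()
```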
You're right about the issue with the .stop() method. In the context of the connector, the .stop() method only prevents the connector from finding new clients and doesn't stop any connection. I'm considering creating a new method on the Connection class to stop the run_ws method (a simple flag should solve this).
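The "simple flag" idea could be sketched as follows. The names Connection and run_ws mirror the discussion, but the implementation is purely illustrative, not lcu-driver's code:

```python
import asyncio

# Hypothetical sketch: a flag checked between messages lets a stop
# request break run_ws out of its listening loop instead of blocking.
class Connection:
    def __init__(self):
        self._ws_should_run = True

    async def run_ws(self, messages):
        received = []
        async for msg in messages:
            if not self._ws_should_run:
                break  # a stop was requested; leave the loop
            received.append(msg)
        return received

    def stop_ws(self):
        self._ws_should_run = False

async def demo():
    conn = Connection()

    async def messages():
        for i in range(5):
            if i == 2:
                conn.stop_ws()  # request a stop mid-stream
            yield i

    return await conn.run_ws(messages())  # returns [0, 1]
```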
Sorry for the delayed feedback.
connector.stop() bug: The change prevents the infinite loop that would previously occur if the @connector.ws.register() decorator was used to register a connection. This is done by storing each WebSocket registered by the user in a list called self.connections, which, when .stop() is called, allows us to first iterate over each stored connection object and unregister stale websocket connections, so that the stop method can successfully close the connections as intended. (fixes connector.stop() issue #18)

connector.start() bug: When the League Client's "Close client during game" setting was set to "Always", the in-game client was open exclusively during games. This caused lcu-driver's connector.start() method to fail during initialization because the LeagueClientUx process was not running. This change automatically kills the in-game client if it is detected as running during lcu-driver's initialization; this triggers the automatic relaunch of the LeagueClientUx process, finally enabling lcu-driver to start as intended.