Asyncio TimeoutError #152
Hello! I saw that error too. Here is a simple piece of code to reproduce it:

```python
import asyncio
from arsenic import get_session, browsers, services

async def launch_browser():
    service = services.Geckodriver()
    browser = browsers.Firefox(
        acceptInsecureCerts=True,
    )
    async with get_session(service, browser) as session:
        await session.get('http://127.0.0.1:8000/index.php', timeout=5)
        page_source = await session.get_page_source()

async def main():
    await asyncio.gather(
        launch_browser(),
    )

asyncio.run(main())
```

Here the timeout is set to 5 for arsenic. The PHP script sleeps for 6 seconds:

```php
<?php
sleep(6);
?>
```

Traceback:
Also, that timeout is not raised after the given delay.

Behavior seen:
Expected behavior:
Versions used:
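For reference, a minimal sketch of how the timeout was expected to surface, reusing the arsenic calls from the repro above (the print statements are only for illustration, and the expectation that `asyncio.TimeoutError` is raised is exactly what this issue is about):

```python
import asyncio
from arsenic import get_session, browsers, services

async def launch_browser():
    service = services.Geckodriver()
    browser = browsers.Firefox(acceptInsecureCerts=True)
    async with get_session(service, browser) as session:
        try:
            # The page sleeps for 6 seconds, so with a 5-second timeout
            # this call is expected to raise asyncio.TimeoutError.
            await session.get('http://127.0.0.1:8000/index.php', timeout=5)
        except asyncio.TimeoutError:
            print('page load timed out as expected')
            return
        page_source = await session.get_page_source()
        print(len(page_source))

asyncio.run(launch_browser())
```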
Hello everyone, I would like to mention a minor issue with arsenic.

When the browser takes a long time to start up, a timeout exception is raised, as expected. However, when the current code attempts to handle this exception, it catches `asyncio.futures.TimeoutError`. According to the Python documentation, that exception is not available in `asyncio.futures`. Searching the documentation, I found that the `asyncio.TimeoutError` exception is the one meant to handle such asynchronous timeout errors. I hope the code can be fixed, as this may cause problems for Python programs, especially ones that use multiple processes. That is all, and thanks for your attention and time.
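For illustration, a minimal sketch of the suggested exception handling (the `start_session` helper, the 15-second default, and the surrounding logic are hypothetical and not arsenic's actual internals; only the choice of exception class reflects the suggestion above):

```python
import asyncio

async def wait_for_browser(start_session, timeout=15):
    """Hypothetical helper: wait for the browser to come up and turn a
    slow start into a clear error instead of an unhandled exception."""
    try:
        # asyncio.TimeoutError is the documented exception raised by
        # asyncio.wait_for, so catch it rather than
        # asyncio.futures.TimeoutError.
        return await asyncio.wait_for(start_session(), timeout=timeout)
    except asyncio.TimeoutError:
        raise RuntimeError(
            f'browser did not start within {timeout} seconds'
        )
```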