
test_dask_multiprocessing fails under Python 3.9 #258

Closed
ArchangeGabriel opened this issue Nov 15, 2020 · 5 comments

Comments

@ArchangeGabriel
Contributor

We (Arch Linux) are currently rebuilding all of our Python packages against Python 3.9, and spyder-kernels is one of the packages showing a new test failure:

============================= test session starts ==============================
platform linux -- Python 3.9.0, pytest-6.1.2, py-1.9.0, pluggy-0.13.1
rootdir: /build/python-spyder-kernels/src/spyder-kernels-1.10.0
plugins: flaky-3.7.0
collected 58 items

spyder_kernels/console/tests/test_console_kernel.py .................F.. [ 34%]
s.......                                                                 [ 48%]
spyder_kernels/customize/tests/test_umr.py .....                         [ 56%]
spyder_kernels/customize/tests/test_utils.py .                           [ 58%]
spyder_kernels/utils/tests/test_dochelpers.py s                          [ 60%]
spyder_kernels/utils/tests/test_iofuncs.py ...........                   [ 79%]
spyder_kernels/utils/tests/test_nsview.py ............                   [100%]

=================================== FAILURES ===================================
__________________________ test_dask_multiprocessing ___________________________

tmpdir = local('/tmp/pytest-of-builduser/pytest-0/test_dask_multiprocessing2')

    @flaky(max_runs=3)
    @pytest.mark.skipif(not PY3,
                        reason="Only meant for Python 3")
    def test_dask_multiprocessing(tmpdir):
        """
        Test that dask multiprocessing works on Python 3.
        """
        # Command to start the kernel
        cmd = "from spyder_kernels.console import start; start.main()"
    
        with setup_kernel(cmd) as client:
            # Remove all variables
            client.execute("%reset -f")
            client.get_shell_msg(block=True, timeout=TIMEOUT)
    
            # Write multiprocessing code to a file
            # Runs two times to verify that in the second case it doesn't break
            code = """
    from dask.distributed import Client
    
    if __name__=='__main__':
        client = Client()
        client.close()
        x = 'hello'
    """
            p = tmpdir.join("mp-test.py")
            p.write(code)
    
            # Run code two times
            client.execute("runfile(r'{}')".format(to_text_string(p)))
>           client.get_shell_msg(block=True, timeout=TIMEOUT)

spyder_kernels/console/tests/test_console_kernel.py:468: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.9/site-packages/jupyter_client/client.py:78: in get_shell_msg
    return self.shell_channel.get_msg(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <jupyter_client.blocking.channels.ZMQSocketChannel object at 0x7f1ea87b8fd0>
block = True, timeout = 15000

    def get_msg(self, block=True, timeout=None):
        """ Gets a message if there is one that is ready. """
        if block:
            if timeout is not None:
                timeout *= 1000  # seconds to ms
            ready = self.socket.poll(timeout)
        else:
            ready = self.socket.poll(timeout=0)
    
        if ready:
            return self._recv()
        else:
>           raise Empty
E           _queue.Empty

/usr/lib/python3.9/site-packages/jupyter_client/blocking/channels.py:54: Empty
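For context, the `Empty` raised at the bottom of the traceback is the standard library's `queue.Empty` (`_queue` is its C accelerator module): the blocking shell channel polls the socket for up to `timeout` and raises `Empty` when no reply arrives, the same contract as `queue.Queue.get`. A minimal stdlib illustration of that timeout behavior (purely illustrative, not jupyter_client code):

```python
import queue

q = queue.Queue()
timed_out = False
try:
    # Block for up to 0.1 s; nothing is ever put on the queue,
    # so get() raises queue.Empty -- the same exception class
    # (_queue.Empty) shown in the traceback above.
    q.get(block=True, timeout=0.1)
except queue.Empty:
    timed_out = True

print("timed out:", timed_out)
```

So the failure above means the kernel never sent a shell reply for the `runfile` call within TIMEOUT, not that the channel itself broke.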

I'm not sure whether this is an issue with dask/distributed or with spyder_kernels, though. Could you try adding Python 3.9 to your CI and see whether it works for you?

@ccordoba12
Member

Hey @ArchangeGabriel, this looks like an issue with jupyter-client. Were you successful in creating packages for it?

If not, I must say I haven't seen any work in the Jupyter community to support Python 3.9. Until they do, there's nothing we can do about it, sorry.

@ArchangeGabriel
Contributor Author

We did rebuild jupyter-client, but apparently we don't run its tests, so… It could also be an issue with dask/distributed, because we do have failing tests for distributed.

@ccordoba12
Member

It could be that too, that's right.

@foutrelis

Looks like backporting dask/distributed#4234 onto distributed 2.30.1 allows test_dask_multiprocessing to succeed.

(For what it's worth, jupyter-client's test suite seems to pass regardless of the above.)

@ArchangeGabriel
Contributor Author

Closing as this is an issue with upstream distributed.
