test_upload_file_egg started failing on mindeps CI #8784

Open
hendrikmakait opened this issue Jul 19, 2024 · 0 comments
Labels
tests Unit tests and/or continuous integration

Comments

@hendrikmakait
Member

(Example) CI run: https://github.com/dask/distributed/actions/runs/10008758535/job/27666108355#step:20:1460

Traceback and logs:

_____________________________ test_upload_file_egg _____________________________

c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:45467', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:42479', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:45969', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @pytest.mark.slow
    @gen_cluster(client=True)
    async def test_upload_file_egg(c, s, a, b):
        pytest.importorskip("setuptools")
    
        def g():
            import package_1
            import package_2
    
            return package_1.a, package_2.b
    
        # c.upload_file tells each worker to
        # - put this file in their local_directory
        # - modify their sys.path to include it
        # we don't care about the local_directory
        # but we do care about restoring the path
    
        with save_sys_modules():
            for value in [123, 456]:
                with tmpfile() as dirname:
                    os.mkdir(dirname)
    
                    with open(os.path.join(dirname, "setup.py"), "w") as f:
                        f.write("from setuptools import setup, find_packages\n")
                        f.write(
                            'setup(name="my_package", packages=find_packages(), '
                            f'version="{value}")\n'
                        )
    
                    # test a package with an underscore in the name
                    package_1 = os.path.join(dirname, "package_1")
                    os.mkdir(package_1)
                    with open(os.path.join(package_1, "__init__.py"), "w") as f:
                        f.write(f"a = {value}\n")
    
                    # test multiple top-level packages
                    package_2 = os.path.join(dirname, "package_2")
                    os.mkdir(package_2)
                    with open(os.path.join(package_2, "__init__.py"), "w") as f:
                        f.write(f"b = {value}\n")
    
                    # compile these into an egg
>                   subprocess.check_call(
                        [sys.executable, "setup.py", "bdist_egg"], cwd=dirname
                    )

distributed/tests/test_client.py:1773: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

popenargs = (['/home/runner/miniconda3/envs/dask-distributed/bin/python3.9', 'setup.py', 'bdist_egg'],)
kwargs = {'cwd': '/tmp/tmp2eglu4x4'}, retcode = 1
cmd = ['/home/runner/miniconda3/envs/dask-distributed/bin/python3.9', 'setup.py', 'bdist_egg']

    def check_call(*popenargs, **kwargs):
        """Run command with arguments.  Wait for command to complete.  If
        the exit code was zero then return, otherwise raise
        CalledProcessError.  The CalledProcessError object will have the
        return code in the returncode attribute.
    
        The arguments are the same as for the call function.  Example:
    
        check_call(["ls", "-l"])
        """
        retcode = call(*popenargs, **kwargs)
        if retcode:
            cmd = kwargs.get("args")
            if cmd is None:
                cmd = popenargs[0]
>           raise CalledProcessError(retcode, cmd)
E           subprocess.CalledProcessError: Command '['/home/runner/miniconda3/envs/dask-distributed/bin/python3.9', 'setup.py', 'bdist_egg']' returned non-zero exit status 1.

../../../miniconda3/envs/dask-distributed/lib/python3.9/subprocess.py:373: CalledProcessError
----------------------------- Captured stdout call -----------------------------
running bdist_egg
running egg_info
creating my_package.egg-info
writing my_package.egg-info/PKG-INFO
writing dependency_links to my_package.egg-info/dependency_links.txt
writing top-level names to my_package.egg-info/top_level.txt
writing manifest file 'my_package.egg-info/SOURCES.txt'
reading manifest file 'my_package.egg-info/SOURCES.txt'
Dumped cluster state to test_cluster_dump/test_upload_file_egg.yaml
----------------------------- Captured stderr call -----------------------------
2024-07-19 13:21:22,829 - distributed.scheduler - INFO - State start
2024-07-19 13:21:22,832 - distributed.scheduler - INFO -   Scheduler at:     tcp://127.0.0.1:45467
2024-07-19 13:21:22,833 - distributed.scheduler - INFO -   dashboard at:  http://127.0.0.1:46669/status
2024-07-19 13:21:22,833 - distributed.scheduler - INFO - Registering Worker plugin shuffle
2024-07-19 13:21:22,838 - distributed.worker - INFO -       Start worker at:      tcp://127.0.0.1:42479
2024-07-19 13:21:22,839 - distributed.worker - INFO -          Listening to:      tcp://127.0.0.1:42479
2024-07-19 13:21:22,839 - distributed.worker - INFO -           Worker name:                          0
2024-07-19 13:21:22,839 - distributed.worker - INFO -          dashboard at:            127.0.0.1:33685
2024-07-19 13:21:22,839 - distributed.worker - INFO - Waiting to connect to:      tcp://127.0.0.1:45467
2024-07-19 13:21:22,840 - distributed.worker - INFO - -------------------------------------------------
2024-07-19 13:21:22,840 - distributed.worker - INFO -               Threads:                          1
2024-07-19 13:21:22,840 - distributed.worker - INFO -                Memory:                  15.61 GiB
2024-07-19 13:21:22,840 - distributed.worker - INFO -       Local Directory: /tmp/dask-scratch-space/worker-l2plt12c
2024-07-19 13:21:22,841 - distributed.worker - INFO - -------------------------------------------------
2024-07-19 13:21:22,841 - distributed.worker - INFO -       Start worker at:      tcp://127.0.0.1:45969
2024-07-19 13:21:22,842 - distributed.worker - INFO -          Listening to:      tcp://127.0.0.1:45969
2024-07-19 13:21:22,842 - distributed.worker - INFO -           Worker name:                          1
2024-07-19 13:21:22,842 - distributed.worker - INFO -          dashboard at:            127.0.0.1:42473
2024-07-19 13:21:22,842 - distributed.worker - INFO - Waiting to connect to:      tcp://127.0.0.1:45467
2024-07-19 13:21:22,843 - distributed.worker - INFO - -------------------------------------------------
2024-07-19 13:21:22,843 - distributed.worker - INFO -               Threads:                          2
2024-07-19 13:21:22,843 - distributed.worker - INFO -                Memory:                  15.61 GiB
2024-07-19 13:21:22,843 - distributed.worker - INFO -       Local Directory: /tmp/dask-scratch-space/worker-s6vb8kja
2024-07-19 13:21:22,843 - distributed.worker - INFO - -------------------------------------------------
2024-07-19 13:21:22,862 - distributed.scheduler - INFO - Register worker <WorkerState 'tcp://127.0.0.1:42479', name: 0, status: init, memory: 0, processing: 0>
2024-07-19 13:21:22,869 - distributed.scheduler - INFO - Starting worker compute stream, tcp://127.0.0.1:42479
2024-07-19 13:21:22,869 - distributed.core - INFO - Starting established connection to tcp://127.0.0.1:40490
2024-07-19 13:21:22,869 - distributed.scheduler - INFO - Register worker <WorkerState 'tcp://127.0.0.1:45969', name: 1, status: init, memory: 0, processing: 0>
2024-07-19 13:21:22,876 - distributed.scheduler - INFO - Starting worker compute stream, tcp://127.0.0.1:45969
2024-07-19 13:21:22,877 - distributed.core - INFO - Starting established connection to tcp://127.0.0.1:40500
2024-07-19 13:21:22,877 - distributed.worker - INFO - Starting Worker plugin shuffle
2024-07-19 13:21:22,878 - distributed.worker - INFO - Starting Worker plugin shuffle
2024-07-19 13:21:22,878 - distributed.worker - INFO -         Registered to:      tcp://127.0.0.1:45467
2024-07-19 13:21:22,879 - distributed.worker - INFO - -------------------------------------------------
2024-07-19 13:21:22,879 - distributed.worker - INFO -         Registered to:      tcp://127.0.0.1:45467
2024-07-19 13:21:22,879 - distributed.worker - INFO - -------------------------------------------------
2024-07-19 13:21:22,880 - distributed.core - INFO - Starting established connection to tcp://127.0.0.1:45467
2024-07-19 13:21:22,880 - distributed.core - INFO - Starting established connection to tcp://127.0.0.1:45467
2024-07-19 13:21:22,891 - distributed.scheduler - INFO - Receive client connection: Client-cb0be2cf-45d1-11ef-883e-6045bdb71ecc
2024-07-19 13:21:22,898 - distributed.core - INFO - Starting established connection to tcp://127.0.0.1:40514
Traceback (most recent call last):
  File "/tmp/tmp2eglu4x4/setup.py", line 2, in <module>
    setup(name="my_package", packages=find_packages(), version="123")
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/__init__.py", line 106, in setup
    return distutils.core.setup(**attrs)
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_distutils/core.py", line 184, in setup
    return run_commands(dist)
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_distutils/core.py", line 200, in run_commands
    dist.run_commands()
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_distutils/dist.py", line 970, in run_commands
    self.run_command(cmd)
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/dist.py", line 974, in run_command
    super().run_command(command)
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
    cmd_obj.run()
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/command/bdist_egg.py", line 158, in run
    self.run_command("egg_info")
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_distutils/cmd.py", line 316, in run_command
    self.distribution.run_command(command)
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/dist.py", line 974, in run_command
    super().run_command(command)
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_distutils/dist.py", line 989, in run_command
    cmd_obj.run()
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/command/egg_info.py", line 321, in run
    self.find_sources()
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/command/egg_info.py", line 329, in find_sources
    mm.run()
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/command/egg_info.py", line 555, in run
    self.prune_file_list()
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/command/egg_info.py", line 621, in prune_file_list
    base_dir = self.distribution.get_fullname()
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_core_metadata.py", line 266, in get_fullname
    return _distribution_fullname(self.get_name(), self.get_version())
  File "/home/runner/miniconda3/envs/dask-distributed/lib/python3.9/site-packages/setuptools/_core_metadata.py", line 284, in _distribution_fullname
    canonicalize_version(version, strip_trailing_zero=False),
TypeError: canonicalize_version() got an unexpected keyword argument 'strip_trailing_zero'
2024-07-19 13:21:23,432 - distributed.scheduler - INFO - Remove client Client-cb0be2cf-45d1-11ef-883e-6045bdb71ecc
2024-07-19 13:21:23,433 - distributed.core - INFO - Received 'close-stream' from tcp://127.0.0.1:40514; closing.
2024-07-19 13:21:23,433 - distributed.scheduler - INFO - Remove client Client-cb0be2cf-45d1-11ef-883e-6045bdb71ecc
2024-07-19 13:21:23,434 - distributed.scheduler - INFO - Close client connection: Client-cb0be2cf-45d1-11ef-883e-6045bdb71ecc
2024-07-19 13:21:23,435 - distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:42479. Reason: worker-close
2024-07-19 13:21:23,436 - distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:45969. Reason: worker-close
2024-07-19 13:21:23,438 - distributed.core - INFO - Connection to tcp://127.0.0.1:45467 has been closed.
2024-07-19 13:21:23,439 - distributed.core - INFO - Connection to tcp://127.0.0.1:45467 has been closed.
2024-07-19 13:21:23,439 - distributed.core - INFO - Received 'close-stream' from tcp://127.0.0.1:40490; closing.
2024-07-19 13:21:23,439 - distributed.core - INFO - Received 'close-stream' from tcp://127.0.0.1:40500; closing.
2024-07-19 13:21:23,439 - distributed.scheduler - INFO - Remove worker <WorkerState 'tcp://127.0.0.1:42479', name: 0, status: closing, memory: 0, processing: 0> (stimulus_id='handle-worker-cleanup-1721395283.439893')
2024-07-19 13:21:23,440 - distributed.scheduler - INFO - Remove worker <WorkerState 'tcp://127.0.0.1:45969', name: 1, status: closing, memory: 0, processing: 0> (stimulus_id='handle-worker-cleanup-1721395283.4404929')
2024-07-19 13:21:23,440 - distributed.scheduler - INFO - Lost all workers
2024-07-19 13:21:23,442 - distributed.scheduler - INFO - Scheduler closing due to unknown reason...
2024-07-19 13:21:23,442 - distributed.scheduler - INFO - Scheduler closing all comms
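
The failing call comes from setuptools' `_core_metadata._distribution_fullname`, which passes `strip_trailing_zero=False` to `packaging.utils.canonicalize_version`. As far as I know, that keyword was only added in packaging 22.0, so the mindeps environment, which pins the oldest supported packaging, likely predates it. A minimal sketch of the suspected incompatibility:

```python
# Minimal reproduction sketch of the suspected root cause, assuming the mindeps
# environment installs a packaging release older than 22.0 (the release that,
# as far as I know, added the strip_trailing_zero keyword).
from packaging.utils import canonicalize_version

# Available on any packaging release:
print(canonicalize_version("123"))

# Newer setuptools (setuptools/_core_metadata.py, _distribution_fullname) passes
# the keyword unconditionally; on packaging < 22.0 this raises the TypeError
# seen in the traceback above:
print(canonicalize_version("123", strip_trailing_zero=False))
```

If that is the case, raising the minimum supported packaging version (or constraining setuptools on the mindeps build) should resolve the failure.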
hendrikmakait added the tests Unit tests and/or continuous integration label Jul 19, 2024