
Getting pickle error #27

Open
moo-joshua opened this issue Dec 11, 2023 · 2 comments

Comments


moo-joshua commented Dec 11, 2023

Hi, I am getting this weird pickle error; any ideas on how to fix it? I was able to download the pretrained weights and tried to run python train.py with all the options specified on the main page.

Main process:

Traceback (most recent call last):
  File "C:\Users\joshu\OneDrive\Desktop\Lightpoint\pose_estimation\ImageTools\posenet-pytorch\train.py", line 33, in <module>
    for i, data in enumerate(dataset):
  File "C:\Users\joshu\OneDrive\Desktop\Lightpoint\pose_estimation\ImageTools\posenet-pytorch\data\custom_dataset_data_loader.py", line 43, in __iter__
    for i, data in enumerate(self.dataloader):
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\site-packages\torch\utils\data\dataloader.py", line 438, in __iter__
    return self._get_iterator()
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\site-packages\torch\utils\data\dataloader.py", line 386, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\site-packages\torch\utils\data\dataloader.py", line 1039, in __init__
    w.start()
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\multiprocessing\context.py", line 336, in _Popen
    return Popen(process_obj)
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\multiprocessing\popen_spawn_win32.py", line 93, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'get_posenet_transform.<locals>.<lambda>'

Spawned worker process:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "C:\Users\joshu\anaconda3\envs\arvis\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input
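
For context: on Windows, DataLoader worker processes are created with the "spawn" start method, so the dataset (including its transforms) must be picklable, and a lambda defined inside get_posenet_transform cannot be pickled. Below is a minimal sketch of a possible workaround, not the repo's actual code: the SubtractMean class, the transform pipeline, and the mean argument are illustrative assumptions. Replacing the inner lambda with a module-level callable like this, or disabling worker processes (num_workers=0, or the repo's worker-count option if it exposes one), usually avoids this error.

import torchvision.transforms as transforms

class SubtractMean:
    """Picklable replacement for a `transforms.Lambda(lambda img: img - mean)` style step."""
    def __init__(self, mean):
        self.mean = mean

    def __call__(self, img):
        # img is a tensor here because ToTensor runs first in the pipeline below
        return img - self.mean

def get_posenet_transform(size, mean):
    # A top-level class instance pickles cleanly, unlike a lambda closed over `mean`
    return transforms.Compose([
        transforms.Resize(size),
        transforms.ToTensor(),
        SubtractMean(mean),
    ])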

@splinexuan

Hi, I can't even download the pretrained weights with wget ... How can I get the googlenet.pickle?
