Update dependencies, drop py35, test in py310, improve itemadapter performance #66

Merged 4 commits on Mar 22, 2022
18 changes: 9 additions & 9 deletions .github/workflows/tests.yml
@@ -10,24 +10,24 @@ jobs:
       fail-fast: false
       matrix:
         include:
-        - python-version: "3.5"
-          env:
-            TOXENV: py35-scrapy18
-        - python-version: "3.5"
+        - python-version: "3.6"
           env:
-            TOXENV: py35
+            TOXENV: py36-scrapy16
         - python-version: "3.6"
           env:
-            TOXENV: py36
+            TOXENV: py
         - python-version: "3.7"
           env:
-            TOXENV: py37
+            TOXENV: py
         - python-version: "3.8"
           env:
-            TOXENV: py38
+            TOXENV: py
         - python-version: "3.9"
           env:
-            TOXENV: py39
+            TOXENV: py
+        - python-version: "3.10"
+          env:
+            TOXENV: py
 
     steps:
     - uses: actions/checkout@v2
8 changes: 4 additions & 4 deletions setup.py
@@ -8,8 +8,8 @@
     long_description=open('README.md').read(),
     packages=find_packages(),
     install_requires=[
-        'Scrapy>=1.0',
-        'scrapinghub>=1.9.0',
+        'Scrapy>=1.6',
+        'scrapinghub>=2.1.0',
     ],
     entry_points={
         'console_scripts': [
@@ -18,7 +18,7 @@
             'shub-image-info = sh_scrapy.crawl:shub_image_info',
         ],
     },
-    python_requires='>=3.5',
+    python_requires='>=3.6',
     classifiers=[
         'Framework :: Scrapy',
         'Development Status :: 5 - Production/Stable',
@@ -27,11 +27,11 @@
         'Operating System :: OS Independent',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.5',
         'Programming Language :: Python :: 3.6',
         'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
+        'Programming Language :: Python :: 3.10',
         'Topic :: Utilities',
     ],
 )
8 changes: 6 additions & 2 deletions sh_scrapy/extension.py
@@ -19,14 +19,18 @@
 
 
 try:
-    from itemadapter import is_item
+    from itemadapter import ItemAdapter
 except ImportError:
     _base_item_cls = [dict, scrapy.item.Item]
     with suppress(AttributeError):
         _base_item_cls.append(scrapy.item.BaseItem)
+    _base_item_cls = tuple(_base_item_cls)
 
     def is_item(item):
-        return isinstance(item, tuple(_base_item_cls))
+        return isinstance(item, _base_item_cls)
+else:
+    def is_item(item):
+        return ItemAdapter.is_item(item)
 
 
 class HubstorageExtension(object):
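Note on the "improve itemadapter performance" part of the title: the fallback branch now builds the isinstance() tuple once at import time instead of on every is_item() call, and when itemadapter is available is_item delegates to ItemAdapter.is_item. A minimal standalone sketch of the precomputed-tuple idea (stand-in classes only, not the actual sh_scrapy code):

import timeit

# Stand-in for the real _base_item_cls list in sh_scrapy.extension;
# the exact classes do not matter for the comparison.
_base_item_cls_list = [dict]
_base_item_cls = tuple(_base_item_cls_list)  # built once, as after this PR


def is_item_before(item):
    # old fallback: the tuple is rebuilt on every call
    return isinstance(item, tuple(_base_item_cls_list))


def is_item_after(item):
    # new fallback: reuses the precomputed tuple
    return isinstance(item, _base_item_cls)


if __name__ == "__main__":
    item = {"field": "value"}
    print("before:", timeit.timeit(lambda: is_item_before(item), number=100_000))
    print("after: ", timeit.timeit(lambda: is_item_after(item), number=100_000))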
2 changes: 1 addition & 1 deletion tests/test_writer.py
@@ -3,8 +3,8 @@
 import logging
 import os
 import threading
+from queue import Queue
 
-from six.moves.queue import Queue
 import pytest
 
 from sh_scrapy.writer import _PipeWriter
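With Python 2 support dropped, the six compatibility shim is no longer needed here; on Python 3 the same class comes straight from the standard library. A trivial standalone check (illustration only, not part of the test suite):

# Python 3: queue.Queue is the class six.moves.queue used to re-export.
from queue import Queue

q = Queue()
q.put("item")
assert q.get() == "item"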
5 changes: 2 additions & 3 deletions tox.ini
@@ -1,15 +1,14 @@
 # tox.ini
 [tox]
-envlist = py35-scrapy18, py35, py36, py37, py38, py39
+envlist = py36-scrapy16, py
 
 [testenv]
 deps =
     pytest
     pytest-cov
     mock
     hubstorage
-    six
     packaging
-    py35-scrapy18: Scrapy==1.8
+    py36-scrapy16: Scrapy==1.6
 commands =
     pytest --verbose --cov=sh_scrapy --cov-report=term-missing --cov-report=html --cov-report=xml {posargs: sh_scrapy tests}