Describe the bug
...
#210 added some caching helper functions, including `set_cache_size`, `get_cache_size`, and `clear_cache`, but these functions don't really work as intended.
`dacite/cache.py` defines a `cache` decorator, and the decorator itself is cached with `@lru_cache(maxsize=None)`. This layer of "meta" caching has no effect: `@cache` is applied to many functions, but only once each, because the Python interpreter applies a decorator exactly once per decorated function when the module is imported. You can see the effect using `cache_info()`:
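For illustration, a sketch of that inspection (the counts below are reproduced from the numbers discussed in this report and may differ between dacite versions):

```python
import dacite  # importing the package applies @cache to every decorated function
from dacite.cache import cache  # the decorator itself, wrapped in lru_cache(maxsize=None)

# Each decorated function produces exactly one miss at import time and is
# never looked up again, so this "meta" cache never records a hit.
print(cache.cache_info())
# e.g. CacheInfo(hits=0, misses=17, maxsize=None, currsize=17)
```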
These 17 misses and the `currsize` of 17 correspond to the 17 decorated functions: 15 in `dacite/types.py` and 2 in `dacite/dataclasses.py`. But there are never any cache hits, and it would be difficult to even make use of this cache (I think you would have to manually call `cache(f.__wrapped__)` to re-apply the caching to the original, un-decorated function).
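For example, the only way to score a hit on the meta-cache would be to re-apply it by hand, which the library never does. A sketch, assuming the standard `__wrapped__` attribute set by `lru_cache`:

```python
from dacite.cache import cache
from dacite.types import is_generic

# is_generic.__wrapped__ is the original, un-decorated function, so passing
# it to cache() again hits the meta-cache and simply hands back the wrapper
# that was already created at import time - not useful in practice.
rewrapped = cache(is_generic.__wrapped__)
print(cache.cache_info().hits)  # now 1
```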
`clear_cache` clears the above "meta" cache, but this has no effect on the underlying cached functions such as `dacite.types.is_generic`. To clear those caches you would have to go through them one by one, e.g. `dacite.types.is_generic.cache_clear()`. In theory, the `cache` function could maintain a global list of the functions it has wrapped, so that `clear_cache` could iterate over that list and clear each one, as sketched below.
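A minimal sketch of that registry idea (hypothetical, not dacite's current code):

```python
from functools import lru_cache
from typing import Callable, List

# Registry of every function wrapped by the cache decorator.
_cached_functions: List[Callable] = []


def cache(function: Callable) -> Callable:
    wrapper = lru_cache(maxsize=None)(function)
    _cached_functions.append(wrapper)  # remember the wrapper so it can be cleared later
    return wrapper


def clear_cache() -> None:
    # Clear each underlying per-function cache, not just a "meta" cache.
    for wrapped in _cached_functions:
        wrapped.cache_clear()
```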
`set_cache_size` sets a global variable, but this has no effect on the caching behavior: `dacite.types.is_generic` has already been decorated by the time any user calls `set_cache_size`, and you can see by inspecting `dacite.types.is_generic.cache_info()` that the `maxsize` does not get updated. As far as I can tell, there is no good way to change an existing `lru_cache`'s size dynamically.
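To see this (a sketch; I'm assuming `set_cache_size` is importable from `dacite.cache`, and the default `maxsize` depends on the dacite version):

```python
from dacite.cache import set_cache_size
from dacite.types import is_generic

before = is_generic.cache_info().maxsize
set_cache_size(16)  # only rebinds a module-level variable
after = is_generic.cache_info().maxsize

# is_generic was already wrapped at import time with whatever maxsize was in
# effect then, so the call above cannot change it.
assert before == after
```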
To Reproduce
...
Expected behavior
...
Environment
Python version: ...
dacite version: ...
Additional context
Add any other context about the problem here.