fix(lindera-search): tokenizer is not callable, use method
attakei committed Dec 10, 2024
1 parent f76536d commit 4f372dc
Showing 1 changed file with 1 addition and 1 deletion.
src/atsphinx/toybox/lindera_search/__init__.py (1 addition, 1 deletion)
@@ -13,4 +13,4 @@ def __init__(self, options: dict[str, str]) -> None:  # noqa: D107
         self.tokenizer = Tokenizer(self.segmenter)
 
     def split(self, input: str) -> list[str]:  # noqa: D102
-        return [token.text for token in self.tokenizer(input)]
+        return [token.text for token in self.tokenizer.tokenize(input)]
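
For context, the fix reflects that lindera-py Tokenizer objects are not callable; tokenization goes through the tokenize() method. Below is a minimal standalone sketch of that usage. Only Tokenizer(segmenter) and tokenizer.tokenize(...) returning tokens with a .text attribute are confirmed by this diff; the load_dictionary call, dictionary name, and Segmenter arguments are assumptions about lindera-py and may differ in your version.

# Hypothetical standalone example; the lindera_search splitter wraps this pattern.
from lindera import Segmenter, Tokenizer, load_dictionary  # assumes lindera-py is installed

dictionary = load_dictionary("ipadic")       # assumed dictionary name
segmenter = Segmenter("normal", dictionary)  # assumed segmentation mode
tokenizer = Tokenizer(segmenter)

# Tokenizer instances are not callable, so tokenizer(text) fails;
# the supported entry point is tokenize(), as in the fix above.
tokens = tokenizer.tokenize("関西国際空港限定トートバッグ")
print([token.text for token in tokens])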
