Bugfix: Fix a bug where the model's device setting does not match the flag when running Prompt Inference #26
Bug description
When running the existing code in a GPU environment, the following error occurs.
Fix
The error above appears to be caused by the `KoGPTInference` constructor not setting the GPT model's device from the `device` argument it receives. The problem was therefore fixed by adding the code below.
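The fix can be sketched as follows. This is a minimal illustration of the pattern described above, not the repository's exact code: the model class, attribute names, and constructor signature are assumptions, and a stand-in model object replaces the real GPT model.

```python
class FakeModel:
    """Stand-in for the GPT model; the real code would use a torch module."""

    def __init__(self):
        self.device = "cpu"  # default device before any .to() call

    def to(self, device):
        # Mimics torch's Module.to(): moves the model and returns it.
        self.device = device
        return self


class KoGPTInference:
    """Sketch of the constructor fix (names are illustrative)."""

    def __init__(self, model, device="cuda:0"):
        self.device = device
        # Before the fix: `device` was stored but never applied to the model,
        # so the model stayed on its default device regardless of the flag.
        # The fix: explicitly move the model to the requested device.
        self.model = model.to(device)
```

With this change, the model's device always follows the `device` argument passed to the constructor, so the flag and the actual placement can no longer diverge.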