no action after thinking #11

Open
supperdsj opened this issue Jan 22, 2025 · 12 comments

@supperdsj

I've deployed hf.co/bytedance-research/UI-TARS-7B-gguf:latest with Ollama on localhost, but I don't get any action after the thinking and screenshot steps.

vlm provider: ollama
vlm base url: http://localhost:11434/api/generate
vlm model name: hf.co/bytedance-research/UI-TARS-7B-gguf:latest

@zakkor

zakkor commented Jan 22, 2025

Hi, you need to use this for the base url: http://localhost:11434/v1
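
For reference, with that change the settings from the original report would look like this (same fields as above, only the base url changes):

vlm provider: ollama
vlm base url: http://localhost:11434/v1
vlm model name: hf.co/bytedance-research/UI-TARS-7B-gguf:latest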

@zakkor

zakkor commented Jan 22, 2025

That being said, it still doesn't work. It seems totally braindead and just clicks a random spot over and over.

[screenshot]

@wrt-n

wrt-n commented Jan 22, 2025

Had the same issue and this worked for me:

[screenshot]

Also getting the issue mentioned by zakkor, with nulls in the click coordinates (tried the official 16-bit 2B version, as well as quants of the 2B and 7B models).

@SebastianBoehler

I am having the same issue with null coordinates...

@thepok

thepok commented Jan 22, 2025

I have the same problems. I tried the 7B and the 2B model; they seem to work fine in Ollama, and I guess they produce tokens.

They take screenshots and print zeros to the chat log.

@naveenkasturi

Seems to be the same issue for me as well. Did anyone find a way around it?

@DFin

DFin commented Jan 22, 2025

Can confirm the null issue in coordinates. It generates tokens fine with both GGUF models using the Ollama endpoint, but it constantly produces coordinates like (null, 0.xyz, null, 0.xyz). I'm using the Mac desktop client.

@rolandgvc

I'm having the same issues with the huggingface setup

[screenshot]

@tomcharlesosman

Mac desktop client not working for me either.

@doubleLLL3

I'm having the same issues with the huggingface setup

[screenshot]

@rolandgvc Restart the app, and then it works fine.

And I'm also using Hugging Face. Is your URL like this?
https://xxx.endpoints.huggingface.cloud/v1

It needs the /v1 suffix.
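
In case it helps with debugging, one way to check that the /v1 endpoint itself is responding (outside the desktop app) is to send a single OpenAI-style chat completion request to it. This is just a rough sketch; the base url, model name, and token below are placeholders to swap for your own setup, and the Authorization header is only needed for Hugging Face endpoints:

import requests

BASE_URL = "http://localhost:11434/v1"  # or https://xxx.endpoints.huggingface.cloud/v1
MODEL = "hf.co/bytedance-research/UI-TARS-7B-gguf:latest"  # placeholder model name
HEADERS = {"Authorization": "Bearer <hf-token>"}  # omit for a local Ollama server

# One chat completion request against the OpenAI-compatible /v1 endpoint
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers=HEADERS,
    json={"model": MODEL, "messages": [{"role": "user", "content": "hello"}]},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])

If this returns a normal completion, the endpoint and model are reachable and the problem is more likely in the app-side settings.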

@DFin

DFin commented Jan 24, 2025

I am not having these problems when running via Hugging Face. Both UI-TARS-72B-SFT and UI-TARS-7B-SFT worked fine. The problem only happens when using Ollama.

@nivibilla

I also get this when I use vLLM with the 72B model in bf16.
