
Unable to create modal, error message -> neither 'from' or 'files' was specified #194

Open
shxlsp opened this issue Jan 16, 2025 · 5 comments


shxlsp commented Jan 16, 2025

Here is the code I executed.

import ollama from 'ollama'
const modelfile = `
FROM ./LLM/phi-4-q4.gguf
SYSTEM "You are mario from super mario bros."
`
await ollama.create({ model: 'phi4', modelfile: modelfile })

This is the error message I received.

file:///Users/sunhaoxing1/Documents/project/test/node_modules/ollama/dist/shared/ollama.cddbc85b.mjs:70
  throw new ResponseError(message, response.status);
        ^

ResponseError: neither 'from' or 'files' was specified
    at checkOk (file:///Users/sunhaoxing1/Documents/project/test/node_modules/ollama/dist/shared/ollama.cddbc85b.mjs:70:9)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async post (file:///Users/sunhaoxing1/Documents/project/test/node_modules/ollama/dist/shared/ollama.cddbc85b.mjs:123:3)
    at async Ollama.processStreamableRequest (file:///Users/sunhaoxing1/Documents/project/test/node_modules/ollama/dist/shared/ollama.cddbc85b.mjs:257:22)
    at async file:///Users/sunhaoxing1/Documents/project/test/LLM/ollama.mjs:23:1 {
  error: "neither 'from' or 'files' was specified",
  status_code: 400
}

I followed the method shown in the JS documentation, but the server responded that neither 'from' nor 'files' was specified.

After debugging, I found that the parameters in the request sent by the SDK are inconsistent with the input parameters documented in the Ollama API.


I processed the parameters according to the documentation in Ollama, but received another error.


I see that the SHA256 value is being passed to the Ollama service in the code, but I don't understand why it still results in an error.

Please help analyze the reason for the error. Thank you!!
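For context on how the SHA256 value fits in, here is a minimal sketch of the `files` flow against the Ollama REST API directly (endpoints per the Ollama API docs; `createFromGguf` and `blobDigest` are hypothetical helpers, and the server is assumed at the default `http://localhost:11434`):

```javascript
import { createHash } from 'node:crypto'
import { readFile } from 'node:fs/promises'

// Compute the digest string the server expects for a blob.
function blobDigest(data) {
  return 'sha256:' + createHash('sha256').update(data).digest('hex')
}

// Hypothetical helper: upload a local GGUF as a blob, then create a
// model that references it via the new 'files' parameter.
async function createFromGguf(model, path) {
  const data = await readFile(path)
  const digest = blobDigest(data)
  // The blob must exist on the server before create() can reference its digest.
  await fetch(`http://localhost:11434/api/blobs/${digest}`, { method: 'POST', body: data })
  await fetch('http://localhost:11434/api/create', {
    method: 'POST',
    body: JSON.stringify({ model, files: { [path.split('/').pop()]: digest } }),
  })
}
```

If the digest in the `files` map doesn't match a blob already pushed to the server, the create call fails, which may be what is happening here.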

BruceMacD self-assigned this Jan 17, 2025
@Justin-SG

I am getting the same error message after updating to Ollama 0.5.7. Uninstalling Ollama and installing an older version (0.5.1 in my case) fixed it. I assume you can do the same as a temporary workaround.

@Nickmiste

Running into this as well, also when updating ollama (I have ollama running in a Docker container so I didn't even realize it updated at first). From what I can tell, it seems like #192 (ollama-js 0.5.12) updated the API due to changes in ollama 0.5.5, but removed support for modelfiles. I'm downgrading to ollama 0.5.4 and ollama-js 0.5.11 for the time being, but would love to see support for modelfiles added again. If they're still supported and I'm just missing something, please let me know!

@BruceMacD (Collaborator)

Hey everyone, sorry for the confusion here. The create API in Ollama was updated to take the individual fields from a Modelfile rather than the Modelfile itself, so the API has been updated in ollama-js as well.

Here is the Ollama API change for reference:
ollama/ollama#7935

You may see errors from this API when:

  • You are using the current version of the ollama-js SDK with a version of Ollama < v0.5.5
  • You are using an old version of the ollama-js SDK with a version of Ollama >= v0.5.5

I plan on making the experience better here for ollama-js, but have been blocked by problems with streaming multi-part file uploads in Node.js, which are being tracked in #191.
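For anyone landing here, a sketch of the new-style call that should work with Ollama >= v0.5.5 and the current ollama-js (field names taken from the linked API change; `phi4-mario` and `phi4` are example model names):

```javascript
// New-style create() with ollama-js (assumes: import ollama from 'ollama').
// Modelfile directives become request fields: FROM -> 'from', SYSTEM -> 'system',
// replacing the old 'modelfile' string parameter.
const request = {
  model: 'phi4-mario', // example name for the new model
  from: 'phi4',        // base model; must already be available to the server
  system: 'You are mario from super mario bros.',
}
// await ollama.create(request) // needs a running Ollama >= v0.5.5
```

Note that `from` references a model the server already knows; creating directly from a local GGUF path goes through the `files` parameter instead.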

@Nickmiste

@BruceMacD Thanks for the update! Are there any plans to re-add support for modelfiles - possibly by parsing the Modelfile and converting it to JSON before sending the request? Separation between code and configuration is really nice to have, and adding a modelfile.json doesn't seem like the best solution because regular modelfiles are more standard.

I also understand that the modelfile spec is still in development, so it seems acceptable if the answer is "not until things are more standardized".
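The conversion described above can be sketched in a few lines. `modelfileToRequest` is a hypothetical helper, not part of ollama-js, and it only handles single-line FROM/SYSTEM directives:

```javascript
// Sketch: map simple Modelfile directives onto the new create() request fields.
// Only single-line FROM and SYSTEM directives are handled here.
function modelfileToRequest(modelfile, model) {
  const request = { model }
  for (const line of modelfile.split('\n')) {
    const m = line.match(/^\s*(FROM|SYSTEM)\s+(.+)$/i)
    if (!m) continue
    const value = m[2].trim().replace(/^"(.*)"$/, '$1') // strip surrounding quotes
    if (m[1].toUpperCase() === 'FROM') request.from = value
    else request.system = value
  }
  return request
}
```

A full solution would also need to cover TEMPLATE, PARAMETER, multi-line blocks, and the local-file case for FROM (which goes through blob upload and `files` rather than `from`), which is presumably why this hasn't been rolled back into the SDK yet.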

@BruceMacD (Collaborator)

@Nickmiste You raise some good points about Modelfiles and configuration separation. There's some stuff in flux at the moment while the main Ollama repo is working on a new engine, so I'm gonna hold off on adding support for Modelfiles directly back for the time being.
