Your data node, complete with a private, personal AI Large Language Model.
True Native DWeb:
- Connect to your own node from home, web, or mobile (libp2p + WebRTC)
- Store your data to your own device (Tauri)
- Remotely run Large Language Models (LLMs) privately on your own device
The power of a native desktop app:
- Save your Web3 data to your devices or the network
- Plugins of your choice
- Linux
- Windows
- macOS Apple Silicon
- macOS Intel x86_64 🤕 (unfriendly target)
- 🌐 Android (via web browser to your node running at home)
- 🌐 iOS (via web browser to your node running at home)
To build it yourself, first install the ollama executable using just:

```shell
just install_ollama
```
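The `install_ollama` recipe lives in the project's justfile. As a rough sketch of what such a recipe might contain (the project's actual recipe may differ; the install-script URL is the one ollama publishes for Linux):

```just
# hypothetical justfile recipe — the real one in this repo may differ
install_ollama:
    curl -fsSL https://ollama.com/install.sh | sh
```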
If you get a `proc_macro` error, you may need to create a `dist` folder in the root of the project.
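Creating the folder is a one-liner from the project root (`-p` makes it a no-op if the folder already exists):

```shell
# create the dist folder the build expects, fixing the proc_macro error
mkdir -p dist
```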
This command runs Svelte first, which starts the Vite dev server; it then compiles the Rust code and starts the Tauri dev server:

```shell
npm run tauri dev
# or using just.systems:
just tauri
```
- Update the version in the `tauri.conf.json` file.
- Run the following command to merge into the release branch, create a new version tag, and push to the remote repository:

```shell
just release
```
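A `release` recipe matching the steps above might look roughly like this. This is a sketch only: the branch names, tag format, and how the version is read from `tauri.conf.json` are assumptions, not the project's actual recipe:

```just
# hypothetical justfile recipe — branch names and tag format are assumptions
release:
    git checkout release
    git merge main
    git tag "v$(jq -r .version tauri.conf.json)"
    git push origin release --tags
```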