[Blog] [Demo on YouTube] [Paper]
rclip is a command-line photo search tool powered by OpenAI's CLIP neural network.
On Linux, you can install rclip from the Snap Store:
sudo snap install rclip
Alternative options (AppImage and pip)
If your Linux distribution doesn't support snap, you can use one of the alternative installation options:
On Linux x86_64, you can install rclip as a self-contained AppImage executable:
- Download the AppImage from the latest release.
- Execute the following commands:
chmod +x <downloaded AppImage filename>
sudo mv <downloaded AppImage filename> /usr/local/bin/rclip
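After installing, you can do a quick sanity check by printing rclip's help:
# verify that rclip is on PATH and runs
rclip --help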
Alternatively, you can install rclip with pip; the extra index URL points pip at the CPU-only PyTorch wheels, which are much smaller than the default CUDA builds:
pip install --extra-index-url https://download.pytorch.org/whl/cpu rclip
On macOS, you can install rclip with Homebrew:
brew install yurijmikhalevich/tap/rclip
Alternative option (pip)
pip install rclip
On Windows, use the installer:
- Download the "*.msi" from the latest release.
- Install rclip by running the installer.
Alternative option (pip)
pip install rclip
To search for photos, go to the directory with your images and run rclip with a text query:
cd photos && rclip "search query"
When you run rclip for the first time in a particular directory, it will extract features from the photos, which takes time. How long it takes depends on your CPU and the number of pictures being indexed. It took about a day to process 73 thousand photos on my NAS, which runs an old-ish Intel Celeron J3455, 7 minutes to index 50 thousand images on my MacBook with an M1 Max CPU, and three hours to process 1.28 million images on the same MacBook.
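Subsequent searches in the same directory reuse the already-extracted features and complete quickly. You can control how many matches rclip returns with the -t flag, which also appears in the preview examples below; a usage sketch:
# return the top-10 matches for the query
cd photos && rclip -t 10 "search query"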
For a detailed demonstration, watch the video: https://www.youtube.com/watch?v=tAJHXOkHidw.
You can use another image as a query by passing a file path or even a URL to the image file, and rclip will find the images most similar to the one you used as a query. If you are referencing a local image via a relative path, you must prefix it with ./. For example:
cd photos && rclip ./cat.jpg
# or use URL
cd photos && rclip https://raw.githubusercontent.com/yurijmikhalevich/rclip/main/tests/e2e/images/cat.jpg
Check this video out for the image-to-image search demo: https://www.youtube.com/watch?v=1YQZKeCBxWM.
You can add and subtract image and text queries from each other. You can also weight a query's contribution by prefixing it with a multiplier and a colon, as in "2:golden retriever" below. Here are a few usage examples:
cd photos && rclip horse + stripes
cd photos && rclip apple - fruit
cd photos && rclip "./new york city.jpg" + night
cd photos && rclip "2:golden retriever" + "./swimming pool.jpg"
cd photos && rclip "./racing car.jpg" - "2:sports car" + "2:snow"
If you want to see how these queries perform when executed on the 1.28-million-image ImageNet-1k dataset, check out the demo on YouTube: https://www.youtube.com/watch?v=MsTgYdOpgcQ.
If you are using one of iTerm2, Konsole (version 22.04 and higher), wezterm, Mintty, or mlterm, all you need to do is pass the --preview (or -p) argument to rclip:
rclip -p kitty
Using a different terminal or viewer
If you are using any other terminal or want to view the results in your viewer of choice, you can pipe rclip's output to it. For example, on Linux, the command below will open the top-5 results for "kitty" in your default image viewer:
rclip -f -t 5 kitty | xargs -d '\n' -n 1 xdg-open
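If you are on macOS, a similar pipeline should work with the system open command (a sketch; BSD xargs lacks the -d flag, so tr converts the newlines to NUL separators first):
# macOS: open the top-5 results in the default image viewer
rclip -f -t 5 kitty | tr '\n' '\0' | xargs -0 -n 1 open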
The -f (or --filepath-only) parameter makes rclip print only the file paths, without scores or the header, which makes it ideal for use with a custom viewer, as in the examples above.
I prefer to use feh's thumbnail mode to preview multiple results:
rclip -f -t 5 kitty | feh -f - -t
Have a question or want to discuss rclip? Start a discussion: https://github.com/yurijmikhalevich/rclip/discussions/new/choose
This repository follows the Conventional Commits standard.
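In practice, this means commit messages take the type(scope): description form; for example (illustrative messages, the changes named are hypothetical):
# commit messages following the Conventional Commits format
git commit -m "feat: add an option to sort results by file path"
git commit -m "fix(cli): handle queries that contain a colon"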
To run rclip locally from the source code, you must have Python and Poetry installed.
Then do:
# clone the source code repository
git clone [email protected]:yurijmikhalevich/rclip.git
# install dependencies and rclip
cd rclip
poetry install
# activate the new poetry environment
poetry shell
If the poetry environment is active, you can use rclip locally, as described in the Usage section above.
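Alternatively, you can invoke rclip through Poetry without activating the shell (standard Poetry usage, not rclip-specific):
# run rclip inside the project's virtual environment
poetry run rclip "kitty"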
Thanks go to these wonderful people and organizations (emoji key):
- ramayer 💻
- Caphyon 🚇
Thanks to Caphyon and the Advanced Installer team for generously supplying the rclip project with the Professional Advanced Installer license for creating the Windows installer.
This project follows the all-contributors specification. Contributions of any kind are welcome!
MIT