
Commit

update: version 1.0.2
namtranase committed Jan 26, 2024
1 parent b877d29 commit b33cba3
Showing 6 changed files with 142 additions and 84 deletions.
117 changes: 41 additions & 76 deletions README.md
@@ -4,103 +4,76 @@
<img src="assets/logo.png" alt="terminalmind logo" width="400"/>
</p>

**Current Version: v1.0.1**
**Latest Version: v1.0.2**

### What's New in v1.0.1
[![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](https://opensource.org/licenses/MIT)

- **New Feature**: Integration with the `llama.cpp` framework, enhancing the tool's capabilities.
- **Bug Fix**: Removed the prompt input for a better user experience.
- **Documentation Updates**: Release notes added, along with updated instructions for working with llama.cpp.
Enhance your coding and command-line efficiency with `terminalmind`, now known as `temi`. Get relevant code snippets, process text and PDF data without leaving your terminal, and maintain a seamless workflow.

For more detailed information, please see the [release notes](https://github.com/namtranase/terminalmind/releases/tag/v1.0.0).

[![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](https://opensource.org/licenses/MIT)
### 🚀 What's New

- Simplified configuration via `config.json` (see the example below)
- Enhanced helper interface for user commands
- Cleaner and more accessible documentation

Explore the [release notes](https://github.com/namtranase/terminalmind/releases/tag/v1.0.2) for more details.
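
Configuration now lives in a small JSON file that the setup script looks for at `~/.config/temi/temi_config.json`, creating a default if none exists. A minimal sketch of checking what is active:

```bash
# Show the active configuration (a default is created on first run if missing)
cat ~/.config/temi/temi_config.json
# {"model_path": "~/llm_models/model.gguf", "type": "local"}
```

`model_path` must point to a `.gguf` model file; edit it to switch models.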

`terminalmind`, now accessible via the `temi` command, is a command-line tool designed to enhance coding efficiency by suggesting relevant code snippets and facilitating text and PDF data processing directly in the terminal. This approach promotes a seamless development workflow, keeping developers focused within their coding environment.
## 🙏 Acknowledgments

## Acknowledgments
A huge shoutout to [Georgi Gerganov](https://github.com/ggerganov) and all contributors of [llama.cpp](https://github.com/ggerganov/llama.cpp) for their pioneering work.

`terminalmind` builds on the work of the [llama.cpp](https://github.com/ggerganov/llama.cpp) project. Special thanks to [Georgi Gerganov](https://github.com/ggerganov) and contributors for their groundbreaking work in creating a C++ interface for the LLM model. We encourage `terminalmind` users and developers to support and contribute to `llama.cpp`.

## Supported Features

Demo:
<p align="center">
<img src="assets/examples.png" alt="Examples"/>
</p>

- [x] **Q&A in Terminal**: Interactive question-and-answer functionality directly in your command line.
- [x] **PDF Information Retrieval**: Extract and analyze information from both local and web-based PDF files based on keywords.
- [x] **Online Article Summarization**: Quickly summarize the content of online articles with a simple command.
- [x] **temi integrates the llama.cpp framework**: `temi` now supports multiple models provided by the llama.cpp framework, enhancing its functionality.
- [ ] **Command Execution Based on User Input**: *Upcoming feature* to execute relevant terminal commands automatically based on user queries.
- [ ] **GPU Acceleration for Enhanced Performance**: *In development* – leveraging GPU acceleration for faster response times and improved efficiency.
- [ ] **Integration with llama.cpp Server**: Enhance response times and model management by interfacing with llama.cpp's dedicated server.
- [ ] **Interactive App and Web Launcher**: Execute applications and navigate to websites directly through keyword-triggered commands.
- [ ] **Docker Compatibility**: Run temi seamlessly in Docker containers for isolated and consistent development environments.
- [ ] **User-Friendly Settings Interface**: Customize temi with ease using a graphical settings panel.

<details>
<summary>Table of Contents</summary>
<ol>
<li><a href="#description">Description</a></li>
<li><a href="#installation">Installation</a></li>
<li><a href="#usage">Usage</a></li>
<li><a href="#contributing">Contributing</a></li>
<li><a href="#license">License</a></li>
</ol>
</details>

## Description

`terminalmind`, accessible via the `temi` command, offers functionalities like text summarization, content retrieval from PDFs, and more, using an integrated language model. Designed for terminal use, it caters to developers, researchers, and technology enthusiasts.

## Installation
- **Q&A in Terminal**: Get instant answers within your command line.
- **PDF Information Retrieval**: Delve into PDFs for the info you need.
- **Online Article Summarization**: Summarize online articles with a single command (examples below).
- **llama.cpp Integration**: Multiple model support for extended capabilities.
- **User-Friendly Settings**: A simple JSON config caters to all use cases.
- *Upcoming*: Command execution, GPU acceleration, llama.cpp server integration, app and web launcher, and Docker support.
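
The commands behind these features, as listed in the tool's built-in help (the URL, keyword, and PDF path below are placeholders):

```bash
# Q&A straight from the terminal
temi how to make a python package

# Summarize an online article
temi summary https://example.com/article

# Retrieve information from a PDF by keyword
temi retrieve with keyword: test, path/to/file.pdf

# Switch to another local .gguf model
temi change_model
```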


## 📖 Description

`temi` is your go-to CLI companion for quick text summarization, efficient content retrieval, and more. It's an indispensable tool for developers, researchers, and anyone who loves efficiency.

## 🛠 Installation

Clone the repository:
```bash
git clone https://github.com/namtranase/terminalmind.git
cd terminalmind
```

Install the required packages:
```bash
sudo apt install python3-requests python3-bs4 python3-pypdf2
```

Download the model:
```bash
# Install the required packages
sudo apt install python3-requests python3-bs4 python3-pypdf2 jq
# Download the model
./scripts/download_model.sh
```

Install the package using the provided script:
```bash
# Install the package using the provided script
./scripts/install_temi.sh
```

Or build step by step.
Optionally, build and install the Debian package manually:

Build the Debian package:
```bash
dpkg-deb --build temi-packaging temi.deb
```

Install package using `dpkg`:
```bash
sudo dpkg -i temi.deb
```
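
To confirm the install, a quick check (assuming the package placed `temi` on your PATH):

```bash
temi --help
```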

## Usage

After installation, temi first needs to know where the model file is. Start temi for the first time and paste the absolute path to the model you downloaded above. For example:
## 🖥 Usage
The first time you run it, call the helper to get an overview of temi's functions:
```bash
temi Hello
# Model file not found at /llm_models/model.gguf.
# Please enter the absolute path to your .gguf model file:
models/model.gguf
temi --help
```
<p align="center">
<img src="assets/helper.png" alt="Examples"/>
</p>

And now you can use `temi` as the terminal assistant:
Use temi for a variety of tasks:
```bash
temi how to make a python package
# To create a Python package, follow these steps:
@@ -113,11 +86,11 @@ temi how to make a python package
# 6. To distribute, use tools like setuptools, Twine and PyPI to create a distribution package.

```
## Contributing
## 🤝 Contributing

Contributions to `temi` are greatly appreciated, whether they involve submitting pull requests, reporting bugs, or suggesting enhancements.
We welcome contributions!

### Getting Started
### Quick Start for Contributors
For detailed instructions on working with the llama.cpp submodule, including setup and usage, refer to the WORK_WITH_LLAMACPP.md file.

1. Install the repo and submodules:
@@ -135,14 +108,6 @@
pip install -r requirements/dev_requirements.txt
pre-commit install
```

### Making Contributions

- Fork and Clone the Repository: Start by forking the repository and then clone your fork to your local machine.

- Create a New Branch: Work on your changes in a new git branch. This helps to keep your modifications organized and separate from the main codebase.

- Submit a Pull Request: Once your changes are complete, push them to your fork and submit a pull request to the main terminalmind repository.

## License

`terminalmind` is MIT licensed, as found in the [LICENSE](https://github.com/namtranase/terminalmind/LICENSE) file.
Binary file added assets/helper.png
4 changes: 4 additions & 0 deletions config/temi_config.json
@@ -0,0 +1,4 @@
{
"model_path": "~/llm_models/model.gguf",
"type": "local"
}
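
The setup script below copies a user-supplied JSON file over `~/.config/temi/temi_config.json`, so switching models amounts to preparing a file like the one above and applying it (the file name here is only an example):

```bash
temi update_config my_temi_config.json
```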
2 changes: 1 addition & 1 deletion llama.cpp
61 changes: 61 additions & 0 deletions temi-packaging/usr/local/bin/config_setup.sh
@@ -0,0 +1,61 @@
#!/bin/bash

# Define the directory where the script is located
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"

# Path to the configuration directory and default configuration file
CONFIG_DIR="$HOME/.config/temi"
CONFIG_FILE="$CONFIG_DIR/temi_config.json"

# Ensure jq is installed
if ! command -v jq &>/dev/null; then
echo "jq is not installed. Please install jq to use this script."
exit 1
fi

# Function to load the configuration from the JSON file
load_config() {
if [ ! -f "$CONFIG_FILE" ]; then
# Provide a default configuration if the config file is missing
mkdir -p "$CONFIG_DIR"
echo '{"model_path": "~/llm_models/model.gguf", "type": "local"}' > "$CONFIG_FILE"
echo "Default configuration created at '$CONFIG_FILE'."
fi

MODEL_PATH=$(jq -r '.model_path' "$CONFIG_FILE")
MODEL_TYPE=$(jq -r '.type' "$CONFIG_FILE")

# Resolve tilde and parameter expansion
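# e.g. '~/llm_models/model.gguf' becomes '/home/<user>/llm_models/model.gguf'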
MODEL_PATH=$(eval echo "$MODEL_PATH")

# Check if the model path is valid
if [ "$MODEL_TYPE" == "local" ] && [ ! -f "$MODEL_PATH" ]; then
echo "Invalid model path in configuration: '$MODEL_PATH'"
fi
}

# Function to update the configuration file
update_config() {
local new_config_path="$1"

if [ ! -f "$new_config_path" ]; then
echo "The provided configuration file does not exist: '$new_config_path'"
exit 1
fi

# Copy the new configuration file to the standard location
cp "$new_config_path" "$CONFIG_FILE"
echo "Configuration updated successfully."

# Reload the configuration
load_config
}

# Check if the user wants to update the configuration
if [[ "$1" == "update_config" ]]; then
update_config "$2"
exit 0
fi

# Load configuration settings
load_config
42 changes: 35 additions & 7 deletions temi-packaging/usr/local/bin/temi
@@ -3,20 +3,53 @@
# Get the directory of the current script
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"

# Use paths relative to the script directory
CONFIG_FILE="$HOME/.temi_config"
EXTRACT_PDF_SCRIPT="$SCRIPT_DIR/extract_pdf_content.py"
FETCH_ARTICLE_SCRIPT="$SCRIPT_DIR/fetch_article_content.py"
PROMPT_FILE="$SCRIPT_DIR/prompt.txt"
TEMICORE="$SCRIPT_DIR/temicore"

# Source the configuration setup script
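# (Because config_setup.sh is sourced rather than executed, it sees temi's own
# arguments, so `temi update_config FILE` is handled there and exits early.)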
source "$SCRIPT_DIR/config_setup.sh"

# Function to validate if the model path ends with .gguf
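# e.g. a path ending in .gguf passes silently; anything else prints the message below and exits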
validate_model_path() {
if [[ $1 != *.gguf ]]; then
echo "temi only supports .gguf models for now."
exit 1
fi
}

show_help() {
echo -e "\033[1mUsage:\033[0m"
echo " temi [COMMAND] [ARGUMENTS]... [OPTIONS]"
echo
echo -e "\033[1mCommands:\033[0m"
echo " summary|summarize Summarize the content from a given URL."
echo " RAG|retrieve|Retrieve KEYWORD Retrieve information given keyword from a PDF file."
echo " change_model Change the model configuration."
echo " update_config FILE Update the configuration with the given JSON file."
echo
echo -e "\033[1mOptions:\033[0m"
echo " --help Show this help message and exit."
echo
echo -e "\033[1mExamples:\033[0m"
echo " temi summary https://example.com/article"
echo " temi retrieve with keyword: test, path/to/file.pdf"
echo " temi change_model"
echo " temi update_config path/to/config.json"
echo
echo "For more information, visit: https://github.com/namtranase/terminalmind"
echo
echo -e "\033[1mNote:\033[0m Replace [COMMAND] with one of the commands above,"
echo "and [ARGUMENTS]... with the appropriate input for the command."
}

# Check for --help option
if [[ "$1" == "--help" ]]; then
show_help
exit 0
fi

# Function to extract URL from input
extract_url() {
echo "$1" | grep -o 'http[s]*://[^ ]*'
@@ -42,11 +75,6 @@ extract_keyword() {
echo "$keyword"
}

# Load model path from the configuration file if it exists
if [ -f "$CONFIG_FILE" ]; then
source "$CONFIG_FILE"
fi

change_model() {
echo "Please enter the new absolute path to your .gguf model file:"
read -r new_model_path
