diff --git a/_posts/2024-02-12-Simple-Dockerfile-for-Dev-Purposes.md b/_posts/2024-02-12-Simple-Dockerfile-for-Dev-Purposes.md
index e2a4675..7890776 100644
--- a/_posts/2024-02-12-Simple-Dockerfile-for-Dev-Purposes.md
+++ b/_posts/2024-02-12-Simple-Dockerfile-for-Dev-Purposes.md
@@ -17,7 +17,7 @@ Here's a straightforward Dockerfile template that suits various Machine Learning
 - Enables seamless GitHub authentication for source control within the container.
 - Uses "pip install -e ." to install the current directory as a package, allowing users to treat their utilities within the container like any other library.
 
-The next Dockerfile was gotten from our [*llamaindex-learning repository.*](https://github.com/bubl-ai/llamaindex-learning/blob/main/docker/Dockerfile)
+The following Dockerfile comes from our [*llamaindex-project repository*](https://github.com/bubl-ai/llamaindex-project/blob/main/docker/Dockerfile).
 
 ```
 # Use an official base image with Python and Conda
@@ -40,11 +40,11 @@ RUN pip install llama-index
 RUN mkdir -p /root/.ssh && \
     ssh-keyscan github.com >> /root/.ssh/known_hosts
 
-# Copy the current directory contents into the container at /llamaindex-learning
-COPY . /llamaindex-learning
+# Copy the current directory contents into the container at /llamaindex-project
+COPY . /llamaindex-project
 
 # Set the working directory inside the container
-WORKDIR /llamaindex-learning
+WORKDIR /llamaindex-project
 
 # Install Python dependencies using pip
 RUN pip install -e . --no-binary :all:
diff --git a/_posts/2024-02-13-Repo-as-Importable-Package.md b/_posts/2024-02-13-Repo-as-Importable-Package.md
index 37c1e03..840bcd3 100644
--- a/_posts/2024-02-13-Repo-as-Importable-Package.md
+++ b/_posts/2024-02-13-Repo-as-Importable-Package.md
@@ -12,20 +12,20 @@ Now, the presence of a `setup.py` file is your project's golden ticket, especial
 In a nutshell, doing this the right way allows you to treat your code just like any other Python library. 
 The real magic happens when you realize you can seamlessly manage your source control and develop your code within the container. Everything neatly in the same place!
 
-The next `setup.py` file was gotten from our [*llamaindex-learning repository.*](https://github.com/bubl-ai/llamaindex-learning/blob/main/setup.py)
+The following `setup.py` file comes from our [*llamaindex-project repository*](https://github.com/bubl-ai/llamaindex-project/blob/main/setup.py).
 
 ```python
 from setuptools import setup, find_packages
 
 setup(
-    name='llamaindex_project',
+    name='llamaindex-project',
     version='0.1.0',
     author= 'Santiago Olivar',
     author_email='contact.bubl.ai@gmail.com',
     description='',
     long_description=open('README.md').read(),
     long_description_content_type='text/markdown',
-    url='https://github.com/bubl-ai/llamaindex-learning',
+    url='https://github.com/bubl-ai/llamaindex-project',
     license='MIT',
     classifiers=[
-        'Development Status ::Planning',
+        'Development Status :: 1 - Planning',
diff --git a/_posts/2024-02-14-Repo-Structure.md b/_posts/2024-02-14-Repo-Structure.md
index 53e36c1..8c8ae25 100644
--- a/_posts/2024-02-14-Repo-Structure.md
+++ b/_posts/2024-02-14-Repo-Structure.md
@@ -14,7 +14,7 @@ For us, having a fully operational Docker container is pivotal. This container m
 We strongly believe that being equipped with this setup places any developer in an ideal position. It transforms innovation into the primary focus, allowing us to bid farewell to errors associated with a subpar environment setup.
 
-Now, let's delve into what you'll discover in the bubl-ai repositories, such as the [*llamaindex-learning*](https://github.com/bubl-ai/llamaindex-learning):
+Now, let's delve into what you'll discover in the bubl-ai repositories, such as the [*llamaindex-project*](https://github.com/bubl-ai/llamaindex-project):
 
 - **bubls**: Think of this as the source code of our repository, containing all the importable classes and functionalities. 
 These are the fundamental building blocks, or bubls, that will serve as the foundation for crafting new creations!
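The `pip install -e .` step in the Dockerfile works because `setup.py` calls `find_packages()`, which decides what becomes importable. The sketch below illustrates that behavior; the directory names (`bubls`, `notebooks`) mirror the layout these posts describe but are assumptions for illustration, not the exact repository contents.

```python
import os
import tempfile

from setuptools import find_packages

# Minimal sketch: find_packages() only picks up directories that
# contain an __init__.py file, so those are the packages that
# `pip install -e .` makes importable.
with tempfile.TemporaryDirectory() as root:
    for pkg in ("bubls", os.path.join("bubls", "utils")):
        path = os.path.join(root, pkg)
        os.makedirs(path)
        # An __init__.py marks the directory as a package.
        open(os.path.join(path, "__init__.py"), "w").close()

    # A plain directory with no __init__.py is ignored by find_packages().
    os.makedirs(os.path.join(root, "notebooks"))

    found = sorted(find_packages(where=root))

print(found)  # ['bubls', 'bubls.utils']
```

This is also why adding a new subdirectory of utilities only becomes importable inside the container once it carries an `__init__.py`.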