Merge develop #14

Merged
merged 29 commits into from Apr 14, 2024

Changes from 1 commit

Commits (29):
1ba9396
Add ollama, support, memGPT services
lehcode Apr 4, 2024
0ca8af9
feat: Docker services
lehcode Apr 4, 2024
309df69
hotfix: Restore useTranslation()
lehcode Apr 10, 2024
751442b
hotfix: Frontend integration
lehcode Apr 11, 2024
94619eb
hotfix: Backend app service dependencies fix under Conda
lehcode Apr 12, 2024
6fe6b59
feat: Add API startup script
lehcode Apr 12, 2024
ffbad57
feat: Add FastAPI server and Vite dev server logging for debug and li…
lehcode Apr 12, 2024
187ca9d
chore: Cleanup after local rebase
lehcode Apr 12, 2024
a3d6c03
feat: Improve docker compose services integration
lehcode Apr 12, 2024
03a530a
hotfix: Frontend and API integration. Build improvements.
lehcode Apr 14, 2024
e826a5c
feat/poetry-build (#8)
lehcode Apr 14, 2024
0676e94
fix: fix some of the styling to more closely match figma (#927)
Sparkier Apr 12, 2024
dea68c4
Add Italian, Spanish and Português (#1017)
PierrunoYT Apr 12, 2024
a52a495
Add Azure configuration doc (#1035)
enyst Apr 12, 2024
9165ba1
Formatting AZURE_LLM_GUIDE (#1046)
enyst Apr 12, 2024
c1754bf
Feat add agent manager (#904)
iFurySt Apr 12, 2024
1e73863
simplified get (#962)
SmartManoj Apr 12, 2024
8fdc728
Response recognition for weak llms (#523)
namtacs Apr 12, 2024
48efce0
Traffic Control: Add new config MAX_CHARS (#1015)
li-boxuan Apr 12, 2024
ea2abcf
fix: print the wrong ssh port number (#1054)
iFurySt Apr 13, 2024
30c4969
fix(editor): ui enhancements and code refactor (#1069)
akhilvc10 Apr 13, 2024
043ee5a
Add new sandbox type - local (#1029)
foragerr Apr 14, 2024
a53d4af
Auto-close stale issues and PRs (#1032)
rbren Apr 14, 2024
5a8553d
Throw error if an illegal sandbox type is used (#1087)
yimothysu Apr 14, 2024
fb30ad3
Unify linter behaviour across CI and pre-commit-hook (#1071)
li-boxuan Apr 14, 2024
a5051cb
Revamp Exception handling (#1080)
li-boxuan Apr 14, 2024
c5af998
doc: Add supplementary notes for WSL2 users to Local LLM Guide (#1031)
FZFR Apr 14, 2024
784f7ab
added to sudo group (#1091)
SmartManoj Apr 14, 2024
cddc385
chore: Merge .dockerignore
lehcode Apr 14, 2024
Add ollama, support, memGPT services
lehcode committed Apr 14, 2024
commit 1ba939618953382bd3469f57e430b9b270f9f3c6
Empty file added .mitmproxy/.gitkeep
Empty file.
Empty file added .ollama/.gitkeep
Empty file.
33 changes: 33 additions & 0 deletions docker/backends/Dockerfile
@@ -0,0 +1,33 @@
FROM litellm/openai-proxy

#ARG ws_dir
ARG venv_dir
ARG apt_proxy
ARG debug
ARG litellm_port

ENV PATH=${PATH}:${venv_dir}/bin
ENV LITELLM_PORT=${litellm_port}
ENV VENV_DIR=${venv_dir}
ENV DEBUG=${debug}

# Install openai into the venv created here (PATH already includes ${venv_dir}/bin)
RUN --mount=type=cache,target=/var/cache/apt \
if [ -n "${debug}" ]; then set -eux; fi && \
apt update && apt -qy upgrade && \
apt install -y ca-certificates python3 python3-venv python3-pip curl git && \
python3 -m venv --upgrade-deps ${venv_dir} && \
${venv_dir}/bin/pip install openai==0.28.1

ENV PATH=${PATH}:/root/.local/bin

RUN if [ -n "${debug}" ]; then set -eux; fi && \
curl -sSL https://install.python-poetry.org | python3 -

WORKDIR /usr/local/src

RUN if [ -n "${debug}" ]; then set -eux; fi && \
git clone https://github.com/cpacker/MemGPT.git ./MemGPT && \
cd ./MemGPT && \
. ${venv_dir}/bin/activate && \
poetry install -E local

COPY ./docker/backends/entrypoint.sh /
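
For reference, a hypothetical build invocation for this image; every `--build-arg` value below is an illustrative assumption, not something fixed by the PR:

```bash
# Example only -- arg values are assumptions
docker build -f docker/backends/Dockerfile \
  --build-arg venv_dir=/opt/venv \
  --build-arg litellm_port=4000 \
  --build-arg debug=1 \
  -t opendevin-backends .
```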
20 changes: 20 additions & 0 deletions docker/backends/entrypoint.sh
@@ -0,0 +1,20 @@
#!/usr/bin/env bash

[[ -n "${DEBUG}" ]] && set -eux

update-ca-certificates

#curl http://ollama:11434/api/tags
# curl -X POST http://ollama:11434/api/pull -d '{"name": "'${OLLAMA_MODEL}'"}'
# curl http://ollama:11434/api/generate -d '{"model": "'${OLLAMA_MODEL}'"}'

source "${VENV_DIR}/bin/activate"

[[ -n "${DEBUG}" ]] && litellm --help
# Run the proxy in the background so the memgpt agent below can start
litellm --file /etc/litellm_config.yaml --port "${LITELLM_PORT}" &

memgpt run -y --agent devin_memory \
--model "llama2" \
--model-endpoint-type ollama \
--model-endpoint http://ollama:11434 \
--debug
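
The script reads its proxy settings from `/etc/litellm_config.yaml`, so the config file must be mounted there. A hypothetical invocation; the image tag and port value are assumptions, not values defined in this PR:

```bash
# Hypothetical run; image tag and env values are examples only
docker run --rm \
  -v "$PWD/docker/backends/litellm_config.yaml:/etc/litellm_config.yaml:ro" \
  -e DEBUG=1 -e LITELLM_PORT=4000 \
  opendevin-backends /entrypoint.sh
```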
17 changes: 17 additions & 0 deletions docker/backends/litellm_config.yaml
@@ -0,0 +1,17 @@
model_list:
  - model_name: llama2
    litellm_params:
      model: ollama/llama2:70b
      api_base: http://ollama:11434
  - model_name: deepseek-coder
    litellm_params:
      model: ollama/deepseek-coder:6.7b
      api_base: http://ollama:11434
  - model_name: vicuna
    litellm_params:
      model: ollama/vicuna:7b-16k
      api_base: http://ollama:11434
  - model_name: mistral
    litellm_params:
      model: ollama/mistral:7b
      api_base: http://ollama:11434
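
Each `model_name` above becomes routable through the proxy's OpenAI-compatible API. A minimal smoke test, assuming the proxy is reachable as `litellm` on port 4000 inside the compose network (both are assumptions):

```bash
# Hypothetical smoke test; host and port depend on your compose wiring
curl http://litellm:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama2", "messages": [{"role": "user", "content": "Hello"}]}'
```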
56 changes: 56 additions & 0 deletions docker/devin/Dockerfile
@@ -0,0 +1,56 @@
FROM nvidia/cuda:12.3.2-devel-ubuntu22.04 as build-sandbox

LABEL org.opencontainers.image.description="Devin server with CUDA support"

ARG venv_dir
ARG debug
ARG conda_dir
ARG ws_dir
ARG node_version

ENV DEBUG=${debug}

RUN --mount=type=cache,target=/var/cache/apt \
if [ -n "${debug}" ]; then set -eux; fi && \
apt update && apt -qy upgrade

WORKDIR ${venv_dir}

COPY environment.yml .
COPY requirements.txt .

COPY opendevin ./opendevin

ENV PATH=${PATH}:${conda_dir}/bin

SHELL ["/bin/bash", "-c"]

RUN --mount=type=cache,target=/var/cache/apt \
if [ -n "${debug}" ]; then set -eux; fi && \
apt update && apt -qy upgrade && \
apt install -qy --no-install-recommends wget ca-certificates build-essential && \
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-$(uname -m).sh -O /tmp/miniconda.sh && \
bash /tmp/miniconda.sh -b -u -p ${conda_dir} && \
sed -i -E 's|^prefix:.*|prefix: '${venv_dir}'|' ./environment.yml && \
conda env create -f environment.yml

SHELL ["${conda_dir}/bin/conda", "run", "--no-capture-output", "-n", "od", "python3", "-m", "pip", "install", "-r", "/tmp/requirements.txt"]
SHELL ["${conda_dir}/bin/conda", "run", "--no-capture-output", "-n", "od", "python3", "./opendevin/sandbox/sandbox.py", "-d", "${ws_dir}" ]

FROM node:18.20.1-alpine as build-front

ARG venv_dir

WORKDIR ${venv_dir}
COPY frontend ./frontend
WORKDIR ${venv_dir}/frontend
#COPY --from=build-sandbox ${venv_dir}/frontend ${venv_dir}/frontend

RUN --mount=type=cache,target=/root/.local/share/pnpm/store \
set -eux && \
apk update && apk add git && \
npm install -g pnpm && \
pnpm install
# pnpm run start
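
Because this file is multi-stage, individual stages can be built in isolation with `--target`. A sketch, with the stage name taken from the file and the arg value an assumption:

```bash
# Build just the frontend stage; the venv_dir value is an example only
docker build -f docker/devin/Dockerfile --target build-front \
  --build-arg venv_dir=/opt/venv \
  -t opendevin-frontend .
```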
19 changes: 19 additions & 0 deletions docs/docker-compose.md
@@ -0,0 +1,19 @@
# Run Devin with `docker compose`

Start a Devin instance with support for Ollama models and a MemGPT backend.

This runs the Devin server with its UI and all the backends it depends on, automatically 💻:
```bash
git clone https://github.com/OpenDevin/OpenDevin.git
cd OpenDevin
docker compose up devin
```

## Services

- [LiteLLM Proxy Service](https://litellm.vercel.app/docs/): call 100+ LLMs using the same input/output format.
- [MemGPT](https://memgpt.readme.io/docs/index): lets you build LLM agents with self-editing memory.
  Note: generating MemGPT-compatible output is a harder task for an LLM than regular text output, so we **strongly advise** users **not to use models below Q5 quantization**; as the model gets worse, the errors you encounter while using MemGPT increase dramatically (MemGPT will fail to send messages or edit memory properly, etc. [Read more](https://memgpt.readme.io/docs/local_llm)).
- [mitmproxy](https://docs.mitmproxy.org/stable/): a free and open-source interactive HTTPS proxy; useful for inspecting traffic between the services.
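
For orientation, here is a minimal, hypothetical sketch of how these services might be wired together. All service names, images, ports, and volume paths below are assumptions for illustration; the repository's `docker-compose.yml` is the authoritative definition.

```yaml
# Hypothetical sketch only -- consult the repository's docker-compose.yml
services:
  ollama:
    image: ollama/ollama            # local model server on 11434
    volumes:
      - ./.ollama:/root/.ollama     # matches the .ollama/ dir added in this PR
  litellm:
    build: ./docker/backends        # the litellm/openai-proxy image above
    depends_on:
      - ollama
  devin:
    build: ./docker/devin           # CUDA sandbox + frontend stages
    depends_on:
      - litellm
    ports:
      - "3000:3000"                 # assumed UI port
```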
