Merge remote-tracking branch 'origin/upstream' into upstream
lehcode committed Apr 17, 2024
2 parents 0474c7a + b0899c8 commit 12d1f9d
Showing 114 changed files with 10,921 additions and 2,233 deletions.
3 changes: 3 additions & 0 deletions .github/ISSUE_TEMPLATE/bug_report.md
@@ -16,11 +16,14 @@ assignees: ''
```bash
```

* Operating System:

<!-- tell us everything about your environment -->
**My config.toml and environment vars** (be sure to redact API keys):
```toml
```


**My model and agent** (you can see these settings in the UI):
* Model:
* Agent:
16 changes: 16 additions & 0 deletions .github/ISSUE_TEMPLATE/question.md
@@ -0,0 +1,16 @@
---
name: Question
about: Use this template to ask a question regarding the project.
title: ''
labels: question
assignees: ''

---

## Describe your question

<!--A clear and concise description of what you want to know.-->

## Additional context

<!--Add any other context about the question here, like what you've tried so far.-->
1 change: 1 addition & 0 deletions .github/workflows/dogfood.yml
@@ -13,6 +13,7 @@ jobs:
  open-devin:
    if: github.event.label.name == 'dogfood-this'
    runs-on: ubuntu-latest
    environment: OpenAI

    steps:
      - name: Checkout Repository
12 changes: 4 additions & 8 deletions .github/workflows/lint.yml
@@ -32,11 +32,7 @@ jobs:
        uses: actions/setup-python@v5
        with:
          python-version: 3.11
      - name: Create mypy cache directory
        run: mkdir -p .mypy_cache
      - name: Install dependencies
        run: pip install ruff mypy==1.9.0 types-PyYAML types-toml
      - name: Run mypy
        run: python -m mypy --install-types --non-interactive --config-file dev_config/python/mypy.ini opendevin/ agenthub/
      - name: Run ruff
        run: ruff check --config dev_config/python/ruff.toml opendevin/ agenthub/
      - name: Install pre-commit
        run: pip install pre-commit==3.7.0
      - name: Run pre-commit hooks
        run: pre-commit run --files opendevin/**/* agenthub/**/* --show-diff-on-failure --config ./dev_config/python/.pre-commit-config.yaml
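
For reference, the new lint step can be reproduced locally before pushing. A minimal sketch, assuming it is run from the repository root with the same pinned pre-commit version as CI:

```bash
# Install the same pre-commit version that CI pins
pip install pre-commit==3.7.0

# Run the repository's hooks against the backend and agent sources
pre-commit run --files opendevin/**/* agenthub/**/* \
  --show-diff-on-failure \
  --config ./dev_config/python/.pre-commit-config.yaml
```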
29 changes: 29 additions & 0 deletions .github/workflows/stale.yml
@@ -0,0 +1,29 @@
name: 'Close stale issues'
on:
  schedule:
    - cron: '30 1 * * *'

jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v9
        with:
          # Aggressively close issues that have been explicitly labeled `age-out`
          any-of-labels: age-out
          stale-issue-message: 'This issue is stale because it has been open for 7 days with no activity. Remove stale label or comment or this will be closed in 1 day.'
          close-issue-message: 'This issue was closed because it has been stalled for over 7 days with no activity.'
          stale-pr-message: 'This PR is stale because it has been open for 7 days with no activity. Remove stale label or comment or this will be closed in 1 day.'
          close-pr-message: 'This PR was closed because it has been stalled for over 7 days with no activity.'
          days-before-stale: 7
          days-before-close: 1

      - uses: actions/stale@v9
        with:
          # Be more lenient with other issues
          stale-issue-message: 'This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.'
          close-issue-message: 'This issue was closed because it has been stalled for over 30 days with no activity.'
          stale-pr-message: 'This PR is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.'
          close-pr-message: 'This PR was closed because it has been stalled for over 30 days with no activity.'
          days-before-stale: 30
          days-before-close: 7
66 changes: 66 additions & 0 deletions Development.md
@@ -0,0 +1,66 @@
# Development Guide
This guide is for people working on OpenDevin and editing the source code.

## Start the server for development

### 1. Requirements
* Linux, Mac OS, or [WSL on Windows](https://learn.microsoft.com/en-us/windows/wsl/install)
* [Docker](https://docs.docker.com/engine/install/) (for macOS users, make sure to allow the default Docker socket to be used from the advanced settings!)
* [Python](https://www.python.org/downloads/) >= 3.11
* [NodeJS](https://nodejs.org/en/download/package-manager) >= 18.17.1
* [Poetry](https://python-poetry.org/docs/#installing-with-the-official-installer) >= 1.8

Make sure you have all these dependencies installed before moving on to `make build`.
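
A quick sanity check of the toolchain before building (a minimal sketch, not part of the repository):

```bash
# Confirm the prerequisites listed above are installed and recent enough
docker --version     # any recent Docker Engine / Docker Desktop
python3 --version    # should report >= 3.11
node --version       # should report >= 18.17.1
poetry --version     # should report >= 1.8
```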

### 2. Build and Set Up the Environment

- **Build the Project:** Begin by building the project, which includes setting up the environment and installing dependencies. This step ensures that OpenDevin is ready to run smoothly on your system.
```bash
make build
```

### 3. Configuring the Language Model

OpenDevin supports a diverse array of Language Models (LMs) through the powerful [litellm](https://docs.litellm.ai) library. By default, we've chosen the mighty GPT-4 from OpenAI as our go-to model, but the world is your oyster! You can unleash the potential of Anthropic's suave Claude, the enigmatic Llama, or any other LM that piques your interest.

To configure the LM of your choice, follow these steps:

1. **Using the Makefile: The Effortless Approach**
With a single command, you can have a smooth LM setup for your OpenDevin experience. Simply run:
```bash
make setup-config
```
This command will prompt you to enter the LLM API key and model name, ensuring that OpenDevin is tailored to your specific needs.
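
If you prefer to skip the interactive prompt, you can also write the file by hand. A minimal sketch of what the resulting `config.toml` might contain; the key names `LLM_MODEL` and `LLM_API_KEY` are an assumption here, so check the prompt output or the repository's config template for the exact ones:

```bash
# Hypothetical example only: key names and values may differ in your checkout
cat > config.toml <<'EOF'
LLM_MODEL="gpt-4"
LLM_API_KEY="<your API key here, redact it in bug reports>"
EOF
```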

**Note on Alternative Models:**
Some alternative models may prove more challenging to tame than others. Fear not, brave adventurer! We shall soon unveil LLM-specific documentation to guide you on your quest. And if you've already mastered the art of wielding a model other than OpenAI's GPT, we encourage you to [share your setup instructions with us](https://github.com/OpenDevin/OpenDevin/issues/417).

For a full list of the LM providers and models available, please consult the [litellm documentation](https://docs.litellm.ai/docs/providers).

There is also [documentation for running with local models using ollama](./docs/documentation/LOCAL_LLM_GUIDE.md).

### 4. Run the Application

- **Run the Application:** Once the setup is complete, launching OpenDevin is as simple as running a single command. This command starts both the backend and frontend servers seamlessly, allowing you to interact with OpenDevin without any hassle.
```bash
make run
```

### 5. Individual Server Startup

- **Start the Backend Server:** If you prefer, you can start the backend server independently to focus on backend-related tasks or configurations.
```bash
make start-backend
```

- **Start the Frontend Server:** Similarly, you can start the frontend server on its own to work on frontend-related components or interface enhancements.
```bash
make start-frontend
```

### 6. Help

- **Get Some Help:** Need assistance or information on available targets and commands? The help command provides all the necessary guidance to ensure a smooth experience with OpenDevin.
```bash
make help
```
144 changes: 0 additions & 144 deletions Makefile

This file was deleted.
