
Commit

Merge pull request #80 from dbpunk-labs/feat/rename
Feat/rename
imotai authored Sep 27, 2023
2 parents 71bdac1 + cae947e commit fe46b65
Showing 57 changed files with 199 additions and 289 deletions.
2 changes: 1 addition & 1 deletion LICENSE
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
#MIT License
#
#Copyright (c) 2023 dbpunk.xyz
#Copyright (c) 2023 octogen.dev
#
#Permission is hereby granted, free of charge, to any person obtaining a copy
#of this software and associated documentation files (the "Software"), to deal
33 changes: 16 additions & 17 deletions README.md
@@ -1,11 +1,10 @@
<p align="center">
<img width="100px" src="https://github.com/dbpunk-labs/octopus/assets/8623385/6c60cb2b-415f-4979-9dc2-b8ce1958e17a" align="center"/>

![GitHub Workflow Status (with event)](https://img.shields.io/github/actions/workflow/status/dbpunk-labs/octopus/ci.yml?branch=main&style=flat-square)
![GitHub Workflow Status (with event)](https://img.shields.io/github/actions/workflow/status/dbpunk-labs/octogen/ci.yml?branch=main&style=flat-square)
[![Discord](https://badgen.net/badge/icon/discord?icon=discord&label)](https://discord.gg/UjSHsjaz66)
[![Twitter Follow](https://img.shields.io/twitter/follow/OCopilot7817?style=flat-square)](https://twitter.com/OCopilot7817)
[![PyPI - Version](https://img.shields.io/pypi/v/octopus_chat)](https://pypi.org/project/octopus-chat/)
![PyPI - Downloads](https://img.shields.io/pypi/dm/octopus_chat?logo=pypi)
[![PyPI - Version](https://img.shields.io/pypi/v/og_chat)](https://pypi.org/project/og-chat/)
![PyPI - Downloads](https://img.shields.io/pypi/dm/og_chat?logo=pypi)

[中文](./README_zh_cn.md)

@@ -28,10 +27,10 @@ Requirement
* pip
* docker 24.0.0 and above

> To deploy Octopus, the user needs permission to run Docker commands.
> To deploy Octogen, the user needs permission to run Docker commands.
> To use codellama, your host must have at least 8 CPUs and 16 GB of RAM.
Install the octopus on your local computer
Install the octogen on your local computer

1. Install og_up

@@ -46,9 +45,9 @@ pip install og_up
og_up
```

> You can choose the openai, azure openai, codellama and octopus agent sevice
> Ocotopus will download codellama from huggingface.co if you choose codellama
> If the installation of the Octopus Terminal CLI takes a long time, consider changing the pip mirror.
> You can choose the openai, azure openai, codellama and octogen agent sevice
> Octogen will download codellama from huggingface.co if you choose codellama
> If the installation of the Octogen Terminal CLI takes a long time, consider changing the pip mirror.
3. Open your terminal and execute the command `og`, you will see the following output

@@ -63,19 +62,19 @@ You can use /help to look for help

|name|type|status| installation|
|----|-----|----------------|---|
|[Openai GPT 3.5/4](https://openai.com/product#made-for-developers) |LLM| ✅ fully supported|use `octopus_up` then choose the `OpenAI`|
|[Azure Openai GPT 3.5/4](https://azure.microsoft.com/en-us/products/ai-services/openai-service) |LLM| ✅ fully supported|use `octopus_up` then choose the `Azure OpenAI`|
|[LLama.cpp Server](https://github.com/ggerganov/llama.cpp/tree/master/examples/server) |LLM| ✔️ supported | use `octopus_up` then choose the `CodeLlama` |
|[Octopus Agent Service](https://dbpunk.xyz) |Code Interpreter| ✅ supported | use `octopus_up` then choose the `Octopus` |
|[Openai GPT 3.5/4](https://openai.com/product#made-for-developers) |LLM| ✅ fully supported|use `og_up` then choose the `OpenAI`|
|[Azure Openai GPT 3.5/4](https://azure.microsoft.com/en-us/products/ai-services/openai-service) |LLM| ✅ fully supported|use `og_up` then choose the `Azure OpenAI`|
|[LLama.cpp Server](https://github.com/ggerganov/llama.cpp/tree/master/examples/server) |LLM| ✔️ supported | use `og_up` then choose the `CodeLlama` |
|[Octopus Agent Service](https://dbpunk.xyz) |Code Interpreter| ✅ supported | use `og_up` then choose the `Octogen` |


## The internal of Octopus

![octopus_simple](https://github.com/dbpunk-labs/octopus/assets/8623385/e5bfb3fb-74a5-4c60-8842-a81ee54fcb9d)

* Octopus Kernel: The code execution engine, based on notebook kernels.
* Octopus Agent: Manages client requests, uses ReAct to process complex tasks, and stores user-assembled applications.
* Octopus Terminal Cli: Accepts user requests, sends them to the Agent, and renders rich results. Currently supports Discord, iTerm2, and Kitty terminals.
* Octogen Kernel: The code execution engine, based on notebook kernels.
* Octogen Agent: Manages client requests, uses ReAct to process complex tasks, and stores user-assembled applications.
* Octogen Terminal Cli: Accepts user requests, sends them to the Agent, and renders rich results. Currently supports Discord, iTerm2, and Kitty terminals.

## Demo

@@ -94,6 +93,6 @@ if you have any feature suggestion. please create a discuession to talk about it

## Roadmap

* [roadmap for v0.5.0](https://github.com/dbpunk-labs/octopus/issues/64)
* [roadmap for v0.5.0](https://github.com/dbpunk-labs/octogen/issues/64)


18 changes: 9 additions & 9 deletions agent/setup.py
@@ -18,24 +18,24 @@
from setuptools import setup, find_packages

setup(
name="octopus_agent",
name="og_agent",
version="0.3.6",
description="Open source code interpreter agent",
author="imotai",
author_email="[email protected]",
url="https://github.com/dbpunk-labs/octopus",
url="https://github.com/dbpunk-labs/octogen",
long_description=open("README.md").read(),
long_description_content_type="text/markdown",
packages=[
"octopus_agent",
"og_agent",
],
package_dir={
"octopus_agent": "src/octopus_agent",
"og_agent": "src/og_agent",
},
install_requires=[
"octopus_proto",
"octopus_kernel",
"octopus_sdk",
"og_proto",
"og_kernel",
"og_sdk",
"grpcio-tools>=1.57.0",
"grpc-google-iam-v1>=0.12.6",
"aiofiles",
@@ -48,8 +48,8 @@
package_data={"octopus_agent": ["*.bnf"]},
entry_points={
"console_scripts": [
"octopus_agent_rpc_server = octopus_agent.agent_server:server_main",
"octopus_agent_setup = octopus_agent.agent_setup:setup",
"og_agent_rpc_server = og_agent.agent_server:server_main",
"og_agent_setup = og_agent.agent_setup:setup",
]
},
)
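The console-script entries renamed above use setuptools' `name = module:attr` syntax. As an illustration only (not part of this commit), a minimal sketch of how such a spec string resolves — the spec shown is taken from the diff, the helper function is hypothetical:

```python
def resolve_entry_point(spec: str):
    """Split a setuptools console_scripts spec such as
    'og_agent_rpc_server = og_agent.agent_server:server_main'
    into (script name, module path, attribute name)."""
    name, _, target = (part.strip() for part in spec.partition("="))
    module_path, _, attr = target.partition(":")
    return name, module_path, attr

name, module_path, attr = resolve_entry_point(
    "og_agent_rpc_server = og_agent.agent_server:server_main"
)
print(name, module_path, attr)
# → og_agent_rpc_server og_agent.agent_server server_main
# At install time, setuptools generates a launcher that effectively runs
#   getattr(importlib.import_module(module_path), attr)()
```

This is why the rename touches `entry_points` as well as `packages` and `package_dir`: the right-hand side of each spec must point at the new `og_agent` module path.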
49 changes: 0 additions & 49 deletions agent/src/octopus_agent/mock_tools.py

This file was deleted.

File renamed without changes.
@@ -16,7 +16,7 @@

""" """
import json
from .prompt import OCTOPUS_FUNCTION_SYSTEM, OCTOPUS_CODELLAMA_SYSTEM
from .prompt import OCTOGEN_FUNCTION_SYSTEM, OCTOGEN_CODELLAMA_SYSTEM
from .codellama_agent import CodellamaAgent
from .openai_agent import OpenaiAgent
from .codellama_client import CodellamaClient
@@ -31,7 +31,7 @@ def build_codellama_agent(endpoint, key, sdk, grammer_path):
grammar = fd.read()

client = CodellamaClient(
endpoint, key, OCTOPUS_CODELLAMA_SYSTEM, "Octopus", "User", grammar
endpoint, key, OCTOGEN_CODELLAMA_SYSTEM, "Octogen", "User", grammar
)

# init the agent
@@ -43,7 +43,7 @@ def build_openai_agent(sdk, model_name, is_azure=True):
# TODO a data dir per user
# init the agent

agent = OpenaiAgent(model_name, OCTOPUS_FUNCTION_SYSTEM, sdk, is_azure=is_azure)
agent = OpenaiAgent(model_name, OCTOGEN_FUNCTION_SYSTEM, sdk, is_azure=is_azure)
return agent


File renamed without changes.
File renamed without changes.
@@ -17,7 +17,7 @@
""" """
import click
import asyncio
from octopus_sdk.agent_sdk import AgentSDK
from og_sdk.agent_sdk import AgentSDK


async def add_kernel(endpoint, api_key, kernel_endpoint, kernel_api_key):
@@ -20,9 +20,9 @@
import logging
from typing import Any, Dict, List, Optional, Sequence, Union, Type
from pydantic import BaseModel, Field
from octopus_proto.kernel_server_pb2 import ExecuteResponse
from octopus_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from octopus_sdk.utils import parse_image_filename, process_char_stream
from og_proto.kernel_server_pb2 import ExecuteResponse
from og_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from og_sdk.utils import parse_image_filename, process_char_stream

logger = logging.getLogger(__name__)

File renamed without changes.
@@ -20,7 +20,7 @@
import logging
import io
from .codellama_client import CodellamaClient
from octopus_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from og_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from .base_agent import BaseAgent, TypingState
from .tokenizer import tokenize

@@ -253,7 +253,7 @@ async def arun(self, question, queue, context, max_iteration=5):
)
)
history.append("User:%s" % current_question)
history.append("Octopus:%s\n" % ("".join(response)))
history.append("Octogen:%s\n" % ("".join(response)))
ins = "Check if the following output meets the goal. If it does, explain it and stop respond. Otherwise, try a new solution."
# TODO limit the output size
if function_result.has_result:
File renamed without changes.
File renamed without changes.
@@ -18,15 +18,15 @@

import logging
from .base_agent import BaseAgent, TypingState
from octopus_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from og_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from .tokenizer import tokenize

logger = logging.getLogger(__name__)


class MockAgent(BaseAgent):
"""
a test agent for octopus
a test agent for octogen
"""

def __init__(self, messages, sdk):
@@ -20,13 +20,13 @@
import json
import logging
from pydantic import BaseModel, Field
from octopus_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from og_proto.agent_server_pb2 import OnAgentAction, TaskRespond, OnAgentActionEnd, FinalRespond
from .base_agent import BaseAgent, TypingState
from .tokenizer import tokenize

logger = logging.getLogger(__name__)

OCTOPUS_FUNCTIONS = [
OCTOGEN_FUNCTIONS = [
{
"name": "execute_python_code",
"description": "Safely execute arbitrary Python code and return the result, stdout, and stderr.",
@@ -139,7 +139,7 @@ async def call_openai(self, messages, queue, context):
engine=self.model,
messages=messages,
temperature=0,
functions=OCTOPUS_FUNCTIONS,
functions=OCTOGEN_FUNCTIONS,
function_call="auto",
stream=True,
)
@@ -148,7 +148,7 @@
model=self.model,
messages=messages,
temperature=0,
functions=OCTOPUS_FUNCTIONS,
functions=OCTOGEN_FUNCTIONS,
function_call="auto",
stream=True,
)
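The `functions=OCTOGEN_FUNCTIONS` argument in the hunks above passes JSON-Schema tool descriptions to the OpenAI chat completion call. A rough, hypothetical sketch of the shape such an entry takes — the `name` and `description` mirror the `execute_python_code` entry shown earlier in this diff, while the parameter fields beyond `code` are illustrative assumptions, not the repository's actual schema:

```python
import json

# Illustrative schema in the style of an OCTOGEN_FUNCTIONS entry.
# "parameters" is standard JSON Schema; the model replies with a
# function_call whose "arguments" is a JSON string matching it.
EXECUTE_PYTHON_CODE = {
    "name": "execute_python_code",
    "description": (
        "Safely execute arbitrary Python code and return the result, "
        "stdout, and stderr."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            # only "code" is taken from the diff; "explanation" is assumed
            "code": {"type": "string", "description": "Python code to run"},
            "explanation": {"type": "string", "description": "why this step"},
        },
        "required": ["code"],
    },
}

# The streamed function_call arguments arrive as JSON text to decode:
raw_arguments = '{"code": "print(1 + 1)"}'
call_args = json.loads(raw_arguments)
print(call_args["code"])
# → print(1 + 1)
```

With `stream=True` (as in the diff), the `arguments` string is accumulated across chunks before it can be decoded like this.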
Original file line number Diff line number Diff line change
@@ -15,7 +15,7 @@
# limitations under the License.


OCTOPUS_FUNCTION_SYSTEM = """Firstly,You are the Programming Copilot called **Octopus**, a large language model designed to complete any goal by **executing code**
OCTOGEN_FUNCTION_SYSTEM = """Firstly,You are the Programming Copilot called **Octogen**, a large language model designed to complete any goal by **executing code**
Secondly, Being an expert in programming, you must follow the rules
* To complete the goal, You must write a plan and execute it step by step, the followings are examples
@@ -42,7 +42,7 @@
* wikipedia: a Python library that makes it easy to access and parse data from Wikipedia
"""

OCTOPUS_CODELLAMA_SYSTEM = """Firstly,You are the Programming Copilot called **Octopus**, a large language model designed to complete any goal by **executing code**
OCTOGEN_CODELLAMA_SYSTEM = """Firstly,You are the Programming Copilot called **Octogen**, a large language model designed to complete any goal by **executing code**
Secondly, Being an expert in programming, you must follow the rules
* To complete the goal, You must write a plan and execute it step by step, the followings are examples
File renamed without changes.
22 changes: 11 additions & 11 deletions chat/setup.py
@@ -18,24 +18,24 @@
from setuptools import setup, find_packages

setup(
name="octopus_chat",
name="og_chat",
version="0.3.6",
description="the chat client for open source code interpreter octopus",
author="imotai",
author_email="wangtaize@dbpunk.com",
url="https://github.com/dbpunk-labs/octopus",
author_email="codego.me@gmail.com",
url="https://github.com/dbpunk-labs/octogen",
long_description=open("README.md").read(),
long_description_content_type="text/markdown",
packages=[
"octopus_discord",
"octopus_terminal",
"og_discord",
"og_terminal",
],
package_dir={
"octopus_discord": "src/octopus_discord",
"octopus_terminal": "src/octopus_terminal",
"og_discord": "src/og_discord",
"og_terminal": "src/og_terminal",
},
install_requires=[
"octopus_sdk>=0.1.0",
"og_sdk>=0.1.0",
"rich>=13.5.2",
"prompt_toolkit>=3.0.0",
"click>=8.0.0",
@@ -46,9 +46,9 @@
],
entry_points={
"console_scripts": [
"octopus = octopus_terminal.terminal_chat:app",
"octopus_ping = octopus_terminal.ping:app",
"octopus_discord_bot = octopus_discord.discord_chat:run_app",
"og = og_terminal.terminal_chat:app",
"og_ping = og_terminal.ping:app",
"og_discord_bot = og_discord.discord_chat:run_app",
]
},
)
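Because this commit renames every distribution and import package (`octopus_*` to `og_*`), downstream code pinned to the old names breaks on upgrade. One generic way to bridge such a rename is a try-first import fallback; the sketch below is a hypothetical helper, demonstrated with stdlib names since neither `og_sdk` nor `octopus_sdk` ships with Python:

```python
import importlib

def import_first_available(*names):
    """Return the first importable module among the candidate names,
    e.g. the new package name followed by the legacy one."""
    for name in names:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError(f"none of {names!r} could be imported")

# Stand-in demo with stdlib names; against the real packages this
# would be import_first_available("og_sdk", "octopus_sdk").
mod = import_first_available("no_such_module_xyz", "json")
print(mod.__name__)
# → json
```

The same pattern would let a caller of `AgentSDK` survive both sides of the rename during a transition period.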
File renamed without changes.