From 870e4691eed036e29e3f3822d0c5c8c44036e62f Mon Sep 17 00:00:00 2001
From: yodamaster726
Date: Tue, 10 Dec 2024 12:45:17 -0600
Subject: [PATCH 1/5] chore: add how to start up chat ui

---
 docs/docs/guides/local-development.md | 20 +-
 docs/docs/quickstart copy.md | 314 ++++++++++++++++++++++++++
 docs/docs/quickstart.md | 38 +++-
 3 files changed, 365 insertions(+), 7 deletions(-)
 create mode 100644 docs/docs/quickstart copy.md

diff --git a/docs/docs/guides/local-development.md b/docs/docs/guides/local-development.md
index ec3929a944..3d872a5525 100644
--- a/docs/docs/guides/local-development.md
+++ b/docs/docs/guides/local-development.md
@@ -12,7 +12,7 @@ Before you begin, ensure you have:
 
 ```bash
 # Required
-Node.js 22+
+Node.js 23+
 pnpm
 Git
 
@@ -94,6 +94,24 @@
 pnpm run test:watch # Run tests in watch mode
 pnpm run lint # Lint code
 ```
+### Direct Client Chat UI
+
+```
+# Open a terminal and start with a specific character
+pnpm run dev --characters="characters/my-character.json"
+```
+```
+# Open a 2nd terminal and go to the client directory
+cd client
+pnpm install
+pnpm run dev
+```
+
+Look for the message:
+` ➜ Local: http://localhost:5173/`
+Click that link, or open that address in a browser. The chat interface will connect to the system, and you can start interacting with your character.
+ + ## Database Development ### SQLite (Recommended for Development) diff --git a/docs/docs/quickstart copy.md b/docs/docs/quickstart copy.md new file mode 100644 index 0000000000..ec50d2b90b --- /dev/null +++ b/docs/docs/quickstart copy.md @@ -0,0 +1,314 @@ +--- +sidebar_position: 2 +--- + +# Quickstart Guide + +## Prerequisites + +Before getting started with Eliza, ensure you have: + +- [Python 2.7+](https://www.python.org/downloads/) +- [Node.js 23+](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) +- [pnpm 9+](https://pnpm.io/installation) +- Git for version control +- A code editor ([VS Code](https://code.visualstudio.com/) or [VSCodium](https://vscodium.com) recommended) +- [CUDA Toolkit](https://developer.nvidia.com/cuda-toolkit) (optional, for GPU acceleration) + +## Installation + +1. **Clone and Install** + + Please be sure to check what the [latest available stable version tag](https://github.com/ai16z/eliza/tags) is. + + Clone the repository + + ```bash + git clone https://github.com/ai16z/eliza.git + ``` + + Enter directory + + ```bash + cd eliza + ``` + + Switch to latest tagged release + + ```bash + # Checkout the latest release + # This project iterates fast, so we recommend checking out the latest release + git checkout $(git describe --tags --abbrev=0) + ``` + + Install dependencies + + ```bash + pnpm install + ``` + + Build the local libraries + + ```bash + pnpm build + ``` + +2. 
**Configure Environment** + + Copy example environment file + + ```bash + cp .env.example .env + ``` + + Edit `.env` and add your values: + + ```bash + # Suggested quickstart environment variables + DISCORD_APPLICATION_ID= # For Discord integration + DISCORD_API_TOKEN= # Bot token + HEURIST_API_KEY= # Heurist API key for LLM and image generation + OPENAI_API_KEY= # OpenAI API key + GROK_API_KEY= # Grok API key + ELEVENLABS_XI_API_KEY= # API key from elevenlabs (for voice) + ``` + +## Choose Your Model + +Eliza supports multiple AI models: + +- **Heurist**: Set `modelProvider: "heurist"` in your character file. Most models are uncensored. + - LLM: Select available LLMs [here](https://docs.heurist.ai/dev-guide/supported-models#large-language-models-llms) and configure `SMALL_HEURIST_MODEL`,`MEDIUM_HEURIST_MODEL`,`LARGE_HEURIST_MODEL` + - Image Generation: Select available Stable Diffusion or Flux models [here](https://docs.heurist.ai/dev-guide/supported-models#image-generation-models) and configure `HEURIST_IMAGE_MODEL` (default is FLUX.1-dev) +- **Llama**: Set `XAI_MODEL=meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo` +- **Grok**: Set `XAI_MODEL=grok-beta` +- **OpenAI**: Set `XAI_MODEL=gpt-4o-mini` or `gpt-4o` + +You set which model to use inside the character JSON file + +### Local inference + + #### For llama_local inference: + + 1. Set `XAI_MODEL` to your chosen model + 2. Leave `X_SERVER_URL` and `XAI_API_KEY` blank + 3. The system will automatically download the model from Hugging Face + 4. `LOCAL_LLAMA_PROVIDER` can be blank + + Note: llama_local requires a GPU, it currently will not work with CPU inference + + #### For Ollama inference: + + - If `OLLAMA_SERVER_URL` is left blank, it defaults to `localhost:11434` + - If `OLLAMA_EMBEDDING_MODE` is left blank, it defaults to `mxbai-embed-large` + +## Create Your First Agent + +1. 
**Create a Character File** + + Check out `characters/trump.character.json` or `characters/tate.character.json` as a template you can use to copy and customize your agent's personality and behavior. + Additionally you can read `core/src/core/defaultCharacter.ts` (in 0.0.10 but post-refactor will be in `packages/core/src/defaultCharacter.ts`) + + 📝 [Character Documentation](./core/characterfile.md) + +2. **Start the Agent** + + Inform it which character you want to run: + + ```bash + pnpm start --character="characters/trump.character.json" + ``` + + You can also load multiple characters with the characters option with a comma separated list: + + ```bash + pnpm start --characters="characters/trump.character.json,characters/tate.character.json" + ``` + +3. **Interact with the Agent** + + Now you're ready to start a conversation with your agent! Follow these steps: + +1. Open a new terminal window +2. Navigate to the client directory: + ```bash + cd client + ``` +3. Install dependencies (first time only): + ```bash + pnpm install + ``` +4. Start the UI client: + ```bash + pnpm run dev + ``` + + Once the client is running, you'll see a message like this: +``` +➜ Local: http://localhost:5173/ +``` + + Simply click the link or open your browser to `http://localhost:5173/`. You'll see the chat interface connect to the system, and you can begin interacting with your character. + +## Platform Integration + +### Discord Bot Setup + +1. Create a new application at [Discord Developer Portal](https://discord.com/developers/applications) +2. Create a bot and get your token +3. Add bot to your server using OAuth2 URL generator +4. 
Set `DISCORD_API_TOKEN` and `DISCORD_APPLICATION_ID` in your `.env` + +### Twitter Integration + +Add to your `.env`: + +```bash +TWITTER_USERNAME= # Account username +TWITTER_PASSWORD= # Account password +TWITTER_EMAIL= # Account email +TWITTER_COOKIES= # Account cookies (auth_token and CT0) +``` + +Example for TWITTER_COOKIES + +The TWITTER_COOKIES variable should be a JSON string containing the necessary cookies. You can find these cookies in your web browser's developer tools. Here is an example format: + +```bash +TWITTER_COOKIES='[{"key":"auth_token","value":"your token","domain":".twitter.com"}, + {"key":"ct0","value":"your ct0","domain":".twitter.com"}, + {"key":"guest_id","value":"your guest_id","domain":".twitter.com"}]' +``` + +### Telegram Bot + +1. Create a bot +2. Add your bot token to `.env`: + +```bash +TELEGRAM_BOT_TOKEN=your_token_here +``` + +## Optional: GPU Acceleration + +If you have an NVIDIA GPU: + +```bash +# Install CUDA support +npx --no node-llama-cpp source download --gpu cuda + +# Ensure CUDA Toolkit, cuDNN, and cuBLAS are installed +``` + +## Basic Usage Examples + +### Chat with Your Agent + +```bash +# Start chat interface +pnpm start +``` + +### Run Multiple Agents + +```bash +pnpm start --characters="characters/trump.character.json,characters/tate.character.json" +``` + +## Common Issues & Solutions + +1. **Node.js Version** + + - Ensure Node.js 23.3.0 is installed + - Use `node -v` to check version + - Consider using [nvm](https://github.com/nvm-sh/nvm) to manage Node versions + +2. **Sharp Installation** + If you see Sharp-related errors: + + ```bash + pnpm install --include=optional sharp + ``` + +3. **CUDA Setup** + + - Verify CUDA Toolkit installation + - Check GPU compatibility with toolkit + - Ensure proper environment variables are set + +4. 
**Exit Status 1** + If you see + + ``` + triggerUncaughtException( + ^ + [Object: null prototype] { + [Symbol(nodejs.util.inspect.custom)]: [Function: [nodejs.util.inspect.custom]] + } + ``` + + You can try these steps, which aim to add `@types/node` to various parts of the project + + ``` + # Add dependencies to workspace root + pnpm add -w -D ts-node typescript @types/node + + # Add dependencies to the agent package specifically + pnpm add -D ts-node typescript @types/node --filter "@ai16z/agent" + + # Also add to the core package since it's needed there too + pnpm add -D ts-node typescript @types/node --filter "@ai16z/eliza" + + # First clean everything + pnpm clean + + # Install all dependencies recursively + pnpm install -r + + # Build the project + pnpm build + + # Then try to start + pnpm start + ``` + +5. **Better sqlite3 was compiled against a different Node.js version** + If you see + + ``` + Error starting agents: Error: The module '.../eliza-agents/dv/eliza/node_modules/better-sqlite3/build/Release/better_sqlite3.node' + was compiled against a different Node.js version using + NODE_MODULE_VERSION 131. This version of Node.js requires + NODE_MODULE_VERSION 127. Please try re-compiling or re-installing + ``` + + You can try this, which will attempt to rebuild better-sqlite3. + + ```bash + pnpm rebuild better-sqlite3 + ``` + + If that doesn't work, try clearing your node_modules in the root folder + + ```bash + rm -fr node_modules; pnpm store prune + ``` + + Then reinstall the requirements + + ```bash + pnpm i + ``` + +## Next Steps + +Once you have your agent running, explore: + +1. 🤖 [Understand Agents](./core/agents.md) +2. 📝 [Create Custom Characters](./core/characterfile.md) +3. ⚡ [Add Custom Actions](./core/actions.md) +4. 🔧 [Advanced Configuration](./guides/configuration.md) + +For detailed API documentation, troubleshooting, and advanced features, check out our [full documentation](https://ai16z.github.io/eliza/). 
+ +Join our [Discord community](https://discord.gg/ai16z) for support and updates! diff --git a/docs/docs/quickstart.md b/docs/docs/quickstart.md index f797b268f9..ec50d2b90b 100644 --- a/docs/docs/quickstart.md +++ b/docs/docs/quickstart.md @@ -8,8 +8,9 @@ sidebar_position: 2 Before getting started with Eliza, ensure you have: -- [Node.js 23.3.0](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) -- [pnpm](https://pnpm.io/installation) +- [Python 2.7+](https://www.python.org/downloads/) +- [Node.js 23+](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) +- [pnpm 9+](https://pnpm.io/installation) - Git for version control - A code editor ([VS Code](https://code.visualstudio.com/) or [VSCodium](https://vscodium.com) recommended) - [CUDA Toolkit](https://developer.nvidia.com/cuda-toolkit) (optional, for GPU acceleration) @@ -35,7 +36,9 @@ Before getting started with Eliza, ensure you have: Switch to latest tagged release ```bash - git checkout v0.0.10 + # Checkout the latest release + # This project iterates fast, so we recommend checking out the latest release + git checkout $(git describe --tags --abbrev=0) ``` Install dependencies @@ -75,7 +78,7 @@ Before getting started with Eliza, ensure you have: Eliza supports multiple AI models: - **Heurist**: Set `modelProvider: "heurist"` in your character file. Most models are uncensored. 
- - LLM: Select available LLMs [here](https://docs.heurist.ai/dev-guide/supported-models#large-language-models-llms) and configure `SMALL_HEURIST_LANGUAGE_MODEL`,`MEDIUM_HEURIST_LANGUAGE_MODEL`,`LARGE_HEURIST_LANGUAGE_MODEL` + - LLM: Select available LLMs [here](https://docs.heurist.ai/dev-guide/supported-models#large-language-models-llms) and configure `SMALL_HEURIST_MODEL`,`MEDIUM_HEURIST_MODEL`,`LARGE_HEURIST_MODEL` - Image Generation: Select available Stable Diffusion or Flux models [here](https://docs.heurist.ai/dev-guide/supported-models#image-generation-models) and configure `HEURIST_IMAGE_MODEL` (default is FLUX.1-dev) - **Llama**: Set `XAI_MODEL=meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo` - **Grok**: Set `XAI_MODEL=grok-beta` @@ -122,6 +125,31 @@ You set which model to use inside the character JSON file pnpm start --characters="characters/trump.character.json,characters/tate.character.json" ``` +3. **Interact with the Agent** + + Now you're ready to start a conversation with your agent! Follow these steps: + +1. Open a new terminal window +2. Navigate to the client directory: + ```bash + cd client + ``` +3. Install dependencies (first time only): + ```bash + pnpm install + ``` +4. Start the UI client: + ```bash + pnpm run dev + ``` + + Once the client is running, you'll see a message like this: +``` +➜ Local: http://localhost:5173/ +``` + + Simply click the link or open your browser to `http://localhost:5173/`. You'll see the chat interface connect to the system, and you can begin interacting with your character. + ## Platform Integration ### Discord Bot Setup @@ -152,8 +180,6 @@ TWITTER_COOKIES='[{"key":"auth_token","value":"your token","domain":".twitter.co {"key":"guest_id","value":"your guest_id","domain":".twitter.com"}]' ``` -Using TWITTER_COOKIES makes providing TWITTER_PASSWORD and TWITTER_EMAIL unnecessary. TWITTER_USERNAME is still required. - ### Telegram Bot 1. 
Create a bot From d97bf2179d65f8f1820f8459d995c35b41feee47 Mon Sep 17 00:00:00 2001 From: yodamaster726 Date: Tue, 10 Dec 2024 13:00:34 -0600 Subject: [PATCH 2/5] fix: remove copy file --- docs/docs/quickstart copy.md | 314 ----------------------------------- 1 file changed, 314 deletions(-) delete mode 100644 docs/docs/quickstart copy.md diff --git a/docs/docs/quickstart copy.md b/docs/docs/quickstart copy.md deleted file mode 100644 index ec50d2b90b..0000000000 --- a/docs/docs/quickstart copy.md +++ /dev/null @@ -1,314 +0,0 @@ ---- -sidebar_position: 2 ---- - -# Quickstart Guide - -## Prerequisites - -Before getting started with Eliza, ensure you have: - -- [Python 2.7+](https://www.python.org/downloads/) -- [Node.js 23+](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) -- [pnpm 9+](https://pnpm.io/installation) -- Git for version control -- A code editor ([VS Code](https://code.visualstudio.com/) or [VSCodium](https://vscodium.com) recommended) -- [CUDA Toolkit](https://developer.nvidia.com/cuda-toolkit) (optional, for GPU acceleration) - -## Installation - -1. **Clone and Install** - - Please be sure to check what the [latest available stable version tag](https://github.com/ai16z/eliza/tags) is. - - Clone the repository - - ```bash - git clone https://github.com/ai16z/eliza.git - ``` - - Enter directory - - ```bash - cd eliza - ``` - - Switch to latest tagged release - - ```bash - # Checkout the latest release - # This project iterates fast, so we recommend checking out the latest release - git checkout $(git describe --tags --abbrev=0) - ``` - - Install dependencies - - ```bash - pnpm install - ``` - - Build the local libraries - - ```bash - pnpm build - ``` - -2. 
**Configure Environment** - - Copy example environment file - - ```bash - cp .env.example .env - ``` - - Edit `.env` and add your values: - - ```bash - # Suggested quickstart environment variables - DISCORD_APPLICATION_ID= # For Discord integration - DISCORD_API_TOKEN= # Bot token - HEURIST_API_KEY= # Heurist API key for LLM and image generation - OPENAI_API_KEY= # OpenAI API key - GROK_API_KEY= # Grok API key - ELEVENLABS_XI_API_KEY= # API key from elevenlabs (for voice) - ``` - -## Choose Your Model - -Eliza supports multiple AI models: - -- **Heurist**: Set `modelProvider: "heurist"` in your character file. Most models are uncensored. - - LLM: Select available LLMs [here](https://docs.heurist.ai/dev-guide/supported-models#large-language-models-llms) and configure `SMALL_HEURIST_MODEL`,`MEDIUM_HEURIST_MODEL`,`LARGE_HEURIST_MODEL` - - Image Generation: Select available Stable Diffusion or Flux models [here](https://docs.heurist.ai/dev-guide/supported-models#image-generation-models) and configure `HEURIST_IMAGE_MODEL` (default is FLUX.1-dev) -- **Llama**: Set `XAI_MODEL=meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo` -- **Grok**: Set `XAI_MODEL=grok-beta` -- **OpenAI**: Set `XAI_MODEL=gpt-4o-mini` or `gpt-4o` - -You set which model to use inside the character JSON file - -### Local inference - - #### For llama_local inference: - - 1. Set `XAI_MODEL` to your chosen model - 2. Leave `X_SERVER_URL` and `XAI_API_KEY` blank - 3. The system will automatically download the model from Hugging Face - 4. `LOCAL_LLAMA_PROVIDER` can be blank - - Note: llama_local requires a GPU, it currently will not work with CPU inference - - #### For Ollama inference: - - - If `OLLAMA_SERVER_URL` is left blank, it defaults to `localhost:11434` - - If `OLLAMA_EMBEDDING_MODE` is left blank, it defaults to `mxbai-embed-large` - -## Create Your First Agent - -1. 
**Create a Character File** - - Check out `characters/trump.character.json` or `characters/tate.character.json` as a template you can use to copy and customize your agent's personality and behavior. - Additionally you can read `core/src/core/defaultCharacter.ts` (in 0.0.10 but post-refactor will be in `packages/core/src/defaultCharacter.ts`) - - 📝 [Character Documentation](./core/characterfile.md) - -2. **Start the Agent** - - Inform it which character you want to run: - - ```bash - pnpm start --character="characters/trump.character.json" - ``` - - You can also load multiple characters with the characters option with a comma separated list: - - ```bash - pnpm start --characters="characters/trump.character.json,characters/tate.character.json" - ``` - -3. **Interact with the Agent** - - Now you're ready to start a conversation with your agent! Follow these steps: - -1. Open a new terminal window -2. Navigate to the client directory: - ```bash - cd client - ``` -3. Install dependencies (first time only): - ```bash - pnpm install - ``` -4. Start the UI client: - ```bash - pnpm run dev - ``` - - Once the client is running, you'll see a message like this: -``` -➜ Local: http://localhost:5173/ -``` - - Simply click the link or open your browser to `http://localhost:5173/`. You'll see the chat interface connect to the system, and you can begin interacting with your character. - -## Platform Integration - -### Discord Bot Setup - -1. Create a new application at [Discord Developer Portal](https://discord.com/developers/applications) -2. Create a bot and get your token -3. Add bot to your server using OAuth2 URL generator -4. 
Set `DISCORD_API_TOKEN` and `DISCORD_APPLICATION_ID` in your `.env` - -### Twitter Integration - -Add to your `.env`: - -```bash -TWITTER_USERNAME= # Account username -TWITTER_PASSWORD= # Account password -TWITTER_EMAIL= # Account email -TWITTER_COOKIES= # Account cookies (auth_token and CT0) -``` - -Example for TWITTER_COOKIES - -The TWITTER_COOKIES variable should be a JSON string containing the necessary cookies. You can find these cookies in your web browser's developer tools. Here is an example format: - -```bash -TWITTER_COOKIES='[{"key":"auth_token","value":"your token","domain":".twitter.com"}, - {"key":"ct0","value":"your ct0","domain":".twitter.com"}, - {"key":"guest_id","value":"your guest_id","domain":".twitter.com"}]' -``` - -### Telegram Bot - -1. Create a bot -2. Add your bot token to `.env`: - -```bash -TELEGRAM_BOT_TOKEN=your_token_here -``` - -## Optional: GPU Acceleration - -If you have an NVIDIA GPU: - -```bash -# Install CUDA support -npx --no node-llama-cpp source download --gpu cuda - -# Ensure CUDA Toolkit, cuDNN, and cuBLAS are installed -``` - -## Basic Usage Examples - -### Chat with Your Agent - -```bash -# Start chat interface -pnpm start -``` - -### Run Multiple Agents - -```bash -pnpm start --characters="characters/trump.character.json,characters/tate.character.json" -``` - -## Common Issues & Solutions - -1. **Node.js Version** - - - Ensure Node.js 23.3.0 is installed - - Use `node -v` to check version - - Consider using [nvm](https://github.com/nvm-sh/nvm) to manage Node versions - -2. **Sharp Installation** - If you see Sharp-related errors: - - ```bash - pnpm install --include=optional sharp - ``` - -3. **CUDA Setup** - - - Verify CUDA Toolkit installation - - Check GPU compatibility with toolkit - - Ensure proper environment variables are set - -4. 
**Exit Status 1** - If you see - - ``` - triggerUncaughtException( - ^ - [Object: null prototype] { - [Symbol(nodejs.util.inspect.custom)]: [Function: [nodejs.util.inspect.custom]] - } - ``` - - You can try these steps, which aim to add `@types/node` to various parts of the project - - ``` - # Add dependencies to workspace root - pnpm add -w -D ts-node typescript @types/node - - # Add dependencies to the agent package specifically - pnpm add -D ts-node typescript @types/node --filter "@ai16z/agent" - - # Also add to the core package since it's needed there too - pnpm add -D ts-node typescript @types/node --filter "@ai16z/eliza" - - # First clean everything - pnpm clean - - # Install all dependencies recursively - pnpm install -r - - # Build the project - pnpm build - - # Then try to start - pnpm start - ``` - -5. **Better sqlite3 was compiled against a different Node.js version** - If you see - - ``` - Error starting agents: Error: The module '.../eliza-agents/dv/eliza/node_modules/better-sqlite3/build/Release/better_sqlite3.node' - was compiled against a different Node.js version using - NODE_MODULE_VERSION 131. This version of Node.js requires - NODE_MODULE_VERSION 127. Please try re-compiling or re-installing - ``` - - You can try this, which will attempt to rebuild better-sqlite3. - - ```bash - pnpm rebuild better-sqlite3 - ``` - - If that doesn't work, try clearing your node_modules in the root folder - - ```bash - rm -fr node_modules; pnpm store prune - ``` - - Then reinstall the requirements - - ```bash - pnpm i - ``` - -## Next Steps - -Once you have your agent running, explore: - -1. 🤖 [Understand Agents](./core/agents.md) -2. 📝 [Create Custom Characters](./core/characterfile.md) -3. ⚡ [Add Custom Actions](./core/actions.md) -4. 🔧 [Advanced Configuration](./guides/configuration.md) - -For detailed API documentation, troubleshooting, and advanced features, check out our [full documentation](https://ai16z.github.io/eliza/). 
Join our [Discord community](https://discord.gg/ai16z) for support and updates!

From 55c1843de95c8ae2045044cbefdc453bcedd920c Mon Sep 17 00:00:00 2001
From: yodamaster726
Date: Tue, 10 Dec 2024 21:21:40 -0600
Subject: [PATCH 3/5] fix: remove python reference

---
 docs/docs/quickstart.md | 2 --
 1 file changed, 2 deletions(-)

diff --git a/docs/docs/quickstart.md b/docs/docs/quickstart.md
index ec50d2b90b..687d0fcec8 100644
--- a/docs/docs/quickstart.md
+++ b/docs/docs/quickstart.md
@@ -7,8 +7,6 @@ sidebar_position: 2
 ## Prerequisites
 
 Before getting started with Eliza, ensure you have:
-
-- [Python 2.7+](https://www.python.org/downloads/)
 - [Node.js 23+](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)
 - [pnpm 9+](https://pnpm.io/installation)
 - Git for version control

From 030fe4cb7615f57f952f354e4077b98990f80c24 Mon Sep 17 00:00:00 2001
From: yodamaster726
Date: Tue, 10 Dec 2024 21:52:13 -0600
Subject: [PATCH 4/5] chore: update readme to show how to run client ui and
 update package.json to start ui client

---
 docs/docs/guides/local-development.md | 4 +---
 docs/docs/quickstart.md | 15 +++------------
 package.json | 2 +-
 3 files changed, 5 insertions(+), 16 deletions(-)

diff --git a/docs/docs/guides/local-development.md b/docs/docs/guides/local-development.md
index 3d872a5525..0f7eec829a 100644
--- a/docs/docs/guides/local-development.md
+++ b/docs/docs/guides/local-development.md
@@ -102,9 +102,7 @@ pnpm run dev --characters="characters/my-character.json"
 ```
 ```
 # Open a 2nd terminal and go to the client directory
-cd client
-pnpm install
-pnpm run dev
+pnpm start:client
 ```
 
 Look for the message:
diff --git a/docs/docs/quickstart.md b/docs/docs/quickstart.md
index 687d0fcec8..206e148602 100644
--- a/docs/docs/quickstart.md
+++ b/docs/docs/quickstart.md
@@ -125,20 +125,11 @@
 3. **Interact with the Agent**
 
-    Now you're ready to start a conversation with your agent! 
Follow these steps: + Now you're ready to start a conversation with your agent! + Open a new terminal window -1. Open a new terminal window -2. Navigate to the client directory: ```bash - cd client - ``` -3. Install dependencies (first time only): - ```bash - pnpm install - ``` -4. Start the UI client: - ```bash - pnpm run dev + pnpm start:client ``` Once the client is running, you'll see a message like this: diff --git a/package.json b/package.json index fef7142766..c0bf2289ed 100644 --- a/package.json +++ b/package.json @@ -4,7 +4,7 @@ "preinstall": "npx only-allow pnpm", "build": "turbo run build", "start": "pnpm --filter \"@ai16z/agent\" start --isRoot", - "start:client": "pnpm --dir client start --isRoot", + "start:client": "pnpm --dir client dev", "start:debug": "cross-env NODE_ENV=development VERBOSE=true DEBUG=eliza:* pnpm --filter \"@ai16z/agent\" start --isRoot", "dev": "turbo check-types dev --concurrency 25", "lint": "bash ./scripts/lint.sh", From 8a5c1a5e0d2d7bc388f542c46492fbc117864d97 Mon Sep 17 00:00:00 2001 From: yodamaster726 Date: Wed, 11 Dec 2024 08:22:31 -0600 Subject: [PATCH 5/5] fix: update text --- docs/docs/guides/local-development.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/docs/guides/local-development.md b/docs/docs/guides/local-development.md index 0f7eec829a..26f66dec59 100644 --- a/docs/docs/guides/local-development.md +++ b/docs/docs/guides/local-development.md @@ -101,7 +101,7 @@ pnpm run lint # Lint code pnpm run dev --characters="characters/my-character.json" ``` ``` -# Open a 2nd terminal and go to the client directory +# Open a 2nd terminal and start the client pnpm start:client ```
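
Both the quickstart and local-development patches tell the reader to watch the client's startup output for the `➜ Local: http://localhost:5173/` line. For scripted setups, that URL can be pulled out of the banner instead of eyeballed. The sketch below assumes a POSIX shell and that the client is a standard Vite dev server; the `vite_local_url` helper is illustrative, not something shipped in the Eliza repo:

```shell
#!/bin/sh
# Sketch: extract the "Local:" URL from Vite's startup banner so a
# wrapper script can open the chat UI automatically.
# `vite_local_url` is a hypothetical helper, not part of the Eliza repo.
vite_local_url() {
  # Read banner lines on stdin; print the first http URL after "Local:".
  sed -n 's/.*Local:[[:space:]]*\(http[^ ]*\).*/\1/p' | head -n 1
}

# Feed it the line the docs tell you to look for:
printf '  ➜  Local:   http://localhost:5173/\n' | vite_local_url
# prints: http://localhost:5173/
```

A wrapper could pipe `pnpm start:client` output through this and hand the result to `open`/`xdg-open`; note 5173 is only Vite's default port, so the URL will differ if the client configures `server.port`.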