core[minor]: Make LLMs and chat models always stream when invoked within streamEvents #33
Annotations
7 errors and 1 warning
Build
@langchain/core#build: command (/home/runner/work/langchainjs/langchainjs/langchain-core) /tmp/xfs-c315dc7b/yarn run build exited (1)
Build
'FakeStreamingLLM' is declared but its value is never read.
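This TS6133 error suggests the test file imports FakeStreamingLLM without exercising it. One way to clear it is to actually drive the fake through streamEvents, roughly as in the sketch below; the `responses` constructor option and the v1 event payload shape are assumptions, not taken from the failing file.

```ts
import { FakeStreamingLLM } from "@langchain/core/utils/testing";

// Sketch only: the constructor options and event schema here are assumptions.
const llm = new FakeStreamingLLM({ responses: ["Hello world"] });

const streamedChunks: unknown[] = [];
for await (const event of llm.streamEvents("Hi", { version: "v1" })) {
  // With this change, the LLM should emit on_llm_stream events even when it
  // is invoked (rather than streamed) inside streamEvents.
  if (event.event === "on_llm_stream") {
    streamedChunks.push(event.data.chunk);
  }
}

console.log(`received ${streamedChunks.length} streamed chunks`);
```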
Build
All imports in import declaration are unused.
Build
'CommaSeparatedListOutputParser' is declared but its value is never read.
Build
'ChatPromptValue' is declared but its value is never read.
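The unused-import errors above are all the same class of problem: import declarations that nothing in the file references, which tsc reports as TS6133/TS6192 when unused-local checks are enabled. The usual fix is to delete the dead declarations, or to keep a symbol only if the test actually uses it. A rough sketch with hypothetical file contents; the entrypoint paths are assumptions based on @langchain/core's exports.

```ts
// Hypothetical "before": declarations left behind by a refactor, which trip
// the unused-import checks as soon as nothing references them.
//
//   import { CommaSeparatedListOutputParser } from "@langchain/core/output_parsers";
//   import { ChatPromptValue } from "@langchain/core/prompt_values";
//
// Either delete those lines, or keep a symbol and actually use it:
import { CommaSeparatedListOutputParser } from "@langchain/core/output_parsers";

const parser = new CommaSeparatedListOutputParser();
// parse() splits a comma-separated string into a string[].
console.log(await parser.parse("a, b, c"));
```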
Build
@langchain/core#build:internal: command (/home/runner/work/langchainjs/langchainjs/langchain-core) /tmp/xfs-5fd7d77b/yarn run build:internal exited (1)
Build
Process completed with exit code 1.
Build
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/setup-node@v3. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.