[pull] main from lobehub:main #133

Merged (5 commits) on Dec 26, 2024
25 changes: 25 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,31 @@

# Changelog

### [Version 1.40.2](https://github.com/lobehub/lobe-chat/compare/v1.40.1...v1.40.2)

<sup>Released on **2024-12-26**</sup>

#### ♻ Code Refactoring

- **misc**: Refactor `tokens` to `contextWindowTokens`.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### Code refactoring

- **misc**: Refactor `tokens` to `contextWindowTokens`, closes [#5185](https://github.com/lobehub/lobe-chat/issues/5185) ([a2aa99a](https://github.com/lobehub/lobe-chat/commit/a2aa99a))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>
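
The rename is mechanical but meaningful: `tokens` was ambiguous (context window? max output?), while `contextWindowTokens` names the quantity precisely. The migration can be pictured as a pure field rename; a minimal sketch, using a trimmed-down card shape (the real `ModelProviderCard` type in `src/types/llm` carries many more fields):

```typescript
// Trimmed-down card shapes; stand-ins for the real types in src/types/llm.
interface LegacyChatModelCard {
  displayName: string;
  tokens?: number; // ambiguous name this release retires
}

interface ChatModelCard {
  displayName: string;
  contextWindowTokens?: number; // explicit: size of the model's context window
}

// Migrating a card is a pure field rename.
const migrateCard = ({ tokens, ...rest }: LegacyChatModelCard): ChatModelCard => ({
  ...rest,
  contextWindowTokens: tokens,
});

const legacy: LegacyChatModelCard = { displayName: 'Jamba 1.5 Mini', tokens: 256_000 };
const migrated = migrateCard(legacy);
// migrated → { displayName: 'Jamba 1.5 Mini', contextWindowTokens: 256000 }
```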

### [Version 1.40.1](https://github.com/lobehub/lobe-chat/compare/v1.40.0...v1.40.1)

<sup>Released on **2024-12-26**</sup>

7 changes: 7 additions & 0 deletions changelog/v1.json

@@ -1,4 +1,11 @@
[
+ {
+   "children": {
+     "improvements": ["Refactor tokens to contextWindowTokens."]
+   },
+   "date": "2024-12-26",
+   "version": "1.40.2"
+ },
{
"children": {
"fixes": ["Fix o1Models list."]

2 changes: 1 addition & 1 deletion package.json

@@ -1,6 +1,6 @@
{
"name": "@lobehub/chat",
"version": "1.40.1",
"version": "1.40.2",
"description": "Lobe Chat - an open-source, high-performance chatbot framework that supports speech synthesis, multimodal, and extensible Function Call plugin system. Supports one-click free deployment of your private ChatGPT/LLM web application.",
"keywords": [
"framework",

4 changes: 3 additions & 1 deletion src/app/(main)/changelog/page.tsx

@@ -38,6 +38,8 @@ const Page = async () => {
const changelogService = new ChangelogService();
const data = await changelogService.getChangelogIndex();

+ if (!data) return notFound();

const ld = ldModule.generate({
description: t('changelog.description', { appName: BRANDING_NAME }),
title: t('changelog.title', { appName: BRANDING_NAME }),
@@ -48,7 +50,7 @@ const Page = async () => {
<>
<StructuredData ld={ld} />
<Flexbox gap={mobile ? 16 : 48}>
- {data.map((item) => (
+ {data?.map((item) => (
<Fragment key={item.id}>
<Suspense
fallback={
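
Both changelog pages get the same defensive pair: an early `notFound()` bail-out when the index fails to load, plus optional chaining on the render path. A self-contained sketch of the pattern (the fetcher and item shape are stand-ins, not the actual `ChangelogService` API):

```typescript
type ChangelogItem = { id: string; versionRange: string };

// Stand-in fetcher: resolves undefined when the upstream index is unreachable.
const getChangelogIndex = async (
  available: boolean,
): Promise<ChangelogItem[] | undefined> =>
  available ? [{ id: 'v1.40.2', versionRange: 'v1.40.1...v1.40.2' }] : undefined;

const renderChangelog = async (available: boolean): Promise<string[] | '404'> => {
  const data = await getChangelogIndex(available);
  if (!data) return '404'; // mirrors `return notFound()` in the page component
  // `data` is narrowed to ChangelogItem[] here; `data?.map` in the JSX is belt-and-braces.
  return data.map((item) => item.id);
};
```

The guard keeps the page from crashing at build or request time when the remote changelog index is down, which is cheaper than wrapping the whole render in a try/catch.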

@@ -31,7 +31,7 @@ const ModelActions = memo<ModelActionsProps>(({ identifier, providerData, data }
tags: (
<ModelFeatureTags
functionCall={data.meta.functionCall}
- tokens={data.meta.tokens}
+ tokens={data.meta.contextWindowTokens}
vision={data.meta.vision}
/>
),

@@ -75,7 +75,7 @@ const Header = memo<HeaderProps>(({ identifier, data, mobile }) => {
{data.meta.description && <div>{t(`${identifier}.description`, { ns: 'models' })}</div>}
<ModelFeatureTags
functionCall={data.meta.functionCall}
- tokens={data.meta.tokens}
+ tokens={data.meta.contextWindowTokens}
vision={data.meta.vision}
/>
</Flexbox>

@@ -45,7 +45,7 @@ export interface SuggestionItemProps
FlexboxProps {}

const SuggestionItem = memo<SuggestionItemProps>(({ className, meta, identifier, ...rest }) => {
- const { title, description, tokens, vision, functionCall } = meta;
+ const { title, description, contextWindowTokens, vision, functionCall } = meta;
const { t } = useTranslation('models');
const { cx, styles } = useStyles();

@@ -67,7 +67,7 @@ const SuggestionItem = memo<SuggestionItemProps>(({ className, meta, identifier,
{t(`${identifier}.description`)}
</Paragraph>
)}
- <ModelFeatureTags functionCall={functionCall} tokens={tokens} vision={vision} />
+ <ModelFeatureTags functionCall={functionCall} tokens={contextWindowTokens} vision={vision} />
</Flexbox>
);
});

@@ -45,7 +45,7 @@ const ProviderItem = memo<ProviderItemProps>(({ mobile, modelId, identifier }) =
const items: StatisticProps[] = [
{
title: t('models.contentLength'),
- value: model?.tokens ? formatTokenNumber(model.tokens) : '--',
+ value: model?.contextWindowTokens ? formatTokenNumber(model.contextWindowTokens) : '--',
},
{
title: t('models.providerInfo.maxOutput'),

@@ -47,7 +47,7 @@ export interface SuggestionItemProps
}

const ModelItem = memo<SuggestionItemProps>(({ mobile, meta, identifier }) => {
- const { title, tokens, vision, functionCall } = meta;
+ const { title, contextWindowTokens, vision, functionCall } = meta;
const { xl = true } = useResponsive();
const { t } = useTranslation('discover');
const { styles, theme } = useStyles();
@@ -57,7 +57,7 @@ const ModelItem = memo<SuggestionItemProps>(({ mobile, meta, identifier }) => {
const items: StatisticProps[] = [
{
title: t('models.contentLength'),
- value: meta?.tokens ? formatTokenNumber(meta.tokens) : '--',
+ value: meta?.contextWindowTokens ? formatTokenNumber(meta.contextWindowTokens) : '--',
},
{
title: t('models.providerInfo.maxOutput'),
@@ -98,7 +98,7 @@ const ModelItem = memo<SuggestionItemProps>(({ mobile, meta, identifier }) => {
</Flexbox>
</Flexbox>
</Link>
- <ModelFeatureTags functionCall={functionCall} tokens={tokens} vision={vision} />
+ <ModelFeatureTags functionCall={functionCall} tokens={contextWindowTokens} vision={vision} />
</Flexbox>
);


8 changes: 6 additions & 2 deletions src/app/(main)/discover/(list)/models/features/Card.tsx

@@ -72,7 +72,7 @@ export interface ModelCardProps extends DiscoverModelItem, FlexboxProps {
}

const ModelCard = memo<ModelCardProps>(({ className, meta, identifier, ...rest }) => {
- const { description, title, functionCall, vision, tokens } = meta;
+ const { description, title, functionCall, vision, contextWindowTokens } = meta;
const { t } = useTranslation('models');
const { cx, styles } = useStyles();

@@ -107,7 +107,11 @@ const ModelCard = memo<ModelCardProps>(({ className, meta, identifier, ...rest }
</Paragraph>
)}

- <ModelFeatureTags functionCall={functionCall} tokens={tokens} vision={vision} />
+ <ModelFeatureTags
+   functionCall={functionCall}
+   tokens={contextWindowTokens}
+   vision={vision}
+ />
</Flexbox>
</Flexbox>
);

@@ -29,11 +29,7 @@ export const useCloudflareProvider = (): ProviderItem => {
name: [KeyVaultsConfigKey, providerKey, 'apiKey'],
},
{
- children: (
-   <Input
-     placeholder={t(`${providerKey}.baseURLOrAccountID.placeholder`)}
-   />
- ),
+ children: <Input placeholder={t(`${providerKey}.baseURLOrAccountID.placeholder`)} />,
desc: t(`${providerKey}.baseURLOrAccountID.desc`),
label: t(`${providerKey}.baseURLOrAccountID.title`),
name: [KeyVaultsConfigKey, providerKey, 'baseURLOrAccountID'],

@@ -80,8 +80,8 @@ const ModelFetcher = memo<ModelFetcherProps>(({ provider }) => {
title={
latestFetchTime
? t('llm.fetcher.latestTime', {
time: dayjs(latestFetchTime).format('YYYY-MM-DD HH:mm:ss'),
})
time: dayjs(latestFetchTime).format('YYYY-MM-DD HH:mm:ss'),
})
: t('llm.fetcher.noLatestTime')
}
>

4 changes: 3 additions & 1 deletion src/app/@modal/(.)changelog/modal/page.tsx

@@ -20,9 +20,11 @@ const Page = async () => {
const changelogService = new ChangelogService();
const data = await changelogService.getChangelogIndex();

+ if (!data) return notFound();

return (
<>
- {data.map((item) => (
+ {data?.map((item) => (
<Suspense fallback={<Loading />} key={item.id}>
<Post locale={locale} mobile={mobile} {...item} />
</Suspense>

11 changes: 7 additions & 4 deletions src/components/ModelSelect/index.tsx

@@ -102,19 +102,22 @@ export const ModelInfoTags = memo<ModelInfoTagsProps>(
</div>
</Tooltip>
)}
- {model.tokens !== undefined && (
+ {model.contextWindowTokens !== undefined && (
<Tooltip
overlayStyle={{ maxWidth: 'unset', pointerEvents: 'none' }}
placement={placement}
title={t('ModelSelect.featureTag.tokens', {
- tokens: model.tokens === 0 ? '∞' : numeral(model.tokens).format('0,0'),
+ tokens:
+   model.contextWindowTokens === 0
+     ? '∞'
+     : numeral(model.contextWindowTokens).format('0,0'),
})}
>
<Center className={styles.token} title="">
- {model.tokens === 0 ? (
+ {model.contextWindowTokens === 0 ? (
<Infinity size={17} strokeWidth={1.6} />
) : (
- formatTokenNumber(model.tokens)
+ formatTokenNumber(model.contextWindowTokens)
)}
</Center>
</Tooltip>
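
In `ModelInfoTags`, a `contextWindowTokens` of `0` is rendered as an infinity glyph (no fixed window), `undefined` hides the tag entirely, and any other value is formatted for display. A sketch of that display rule with a stand-in formatter (the real code uses `numeral` and the shared `formatTokenNumber` helper, whose exact rounding is an assumption here):

```typescript
// Stand-in for the shared formatTokenNumber helper (assumed behavior).
const formatTokenNumber = (tokens: number): string =>
  tokens >= 1024 ? `${Math.round(tokens / 1024)}K` : String(tokens);

const contextWindowLabel = (contextWindowTokens?: number): string | undefined => {
  if (contextWindowTokens === undefined) return undefined; // tag not rendered
  if (contextWindowTokens === 0) return '∞'; // zero encodes "unlimited"
  return formatTokenNumber(contextWindowTokens);
};

// contextWindowLabel(8192) → '8K'; contextWindowLabel(0) → '∞'
```

Encoding "unlimited" as `0` keeps the field a plain number, but it is exactly the kind of overloaded meaning the `tokens` → `contextWindowTokens` rename is meant to reduce.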

8 changes: 6 additions & 2 deletions src/config/__tests__/app.test.ts

@@ -24,7 +24,9 @@ describe('getServerConfig', () => {
describe('index url', () => {
it('should return default URLs when no environment variables are set', () => {
const config = getAppConfig();
- expect(config.AGENTS_INDEX_URL).toBe('https://registry.npmmirror.com/@lobehub/agents-index/v1/files/public');
+ expect(config.AGENTS_INDEX_URL).toBe(
+   'https://registry.npmmirror.com/@lobehub/agents-index/v1/files/public',
+ );
expect(config.PLUGINS_INDEX_URL).toBe('https://chat-plugins.lobehub.com');
});

@@ -41,7 +43,9 @@
process.env.PLUGINS_INDEX_URL = '';

const config = getAppConfig();
- expect(config.AGENTS_INDEX_URL).toBe('https://registry.npmmirror.com/@lobehub/agents-index/v1/files/public');
+ expect(config.AGENTS_INDEX_URL).toBe(
+   'https://registry.npmmirror.com/@lobehub/agents-index/v1/files/public',
+ );
expect(config.PLUGINS_INDEX_URL).toBe('https://chat-plugins.lobehub.com');
});
});

3 changes: 1 addition & 2 deletions src/config/app.ts

@@ -23,8 +23,7 @@ if (typeof window === 'undefined' && isServerMode && !APP_URL) {
throw new Error('`APP_URL` is required in server mode');
}

- const ASSISTANT_INDEX_URL =
-   'https://registry.npmmirror.com/@lobehub/agents-index/v1/files/public';
+ const ASSISTANT_INDEX_URL = 'https://registry.npmmirror.com/@lobehub/agents-index/v1/files/public';

const PLUGINS_INDEX_URL = 'https://chat-plugins.lobehub.com';


4 changes: 2 additions & 2 deletions src/config/modelProviders/ai21.ts

@@ -4,6 +4,7 @@ import { ModelProviderCard } from '@/types/llm';
const Ai21: ModelProviderCard = {
chatModels: [
{
+ contextWindowTokens: 256_000,
displayName: 'Jamba 1.5 Mini',
enabled: true,
functionCall: true,
@@ -12,9 +13,9 @@
input: 0.2,
output: 0.4,
},
- tokens: 256_000,
},
{
+ contextWindowTokens: 256_000,
displayName: 'Jamba 1.5 Large',
enabled: true,
functionCall: true,
@@ -23,7 +24,6 @@
input: 2,
output: 8,
},
- tokens: 256_000,
},
],
checkModel: 'jamba-1.5-mini',

8 changes: 4 additions & 4 deletions src/config/modelProviders/ai360.ts

@@ -4,6 +4,7 @@ import { ModelProviderCard } from '@/types/llm';
const Ai360: ModelProviderCard = {
chatModels: [
{
+ contextWindowTokens: 8192,
description:
'360GPT2 Pro 是 360 公司推出的高级自然语言处理模型,具备卓越的文本生成和理解能力,尤其在生成与创作领域表现出色,能够处理复杂的语言转换和角色演绎任务。',
displayName: '360GPT2 Pro',
@@ -15,9 +16,9 @@
input: 5,
output: 5,
},
- tokens: 8192,
},
{
+ contextWindowTokens: 8192,
description:
'360GPT Pro 作为 360 AI 模型系列的重要成员,以高效的文本处理能力满足多样化的自然语言应用场景,支持长文本理解和多轮对话等功能。',
displayName: '360GPT Pro',
@@ -30,9 +31,9 @@
input: 5,
output: 5,
},
- tokens: 8192,
},
{
+ contextWindowTokens: 8192,
description:
'360GPT Turbo 提供强大的计算和对话能力,具备出色的语义理解和生成效率,是企业和开发者理想的智能助理解决方案。',
displayName: '360GPT Turbo',
@@ -44,9 +45,9 @@
input: 2,
output: 2,
},
- tokens: 8192,
},
{
+ contextWindowTokens: 8192,
description:
'360GPT Turbo Responsibility 8K 强调语义安全和责任导向,专为对内容安全有高度要求的应用场景设计,确保用户体验的准确性与稳健性。',
displayName: '360GPT Turbo Responsibility 8K',
@@ -58,7 +59,6 @@
input: 2,
output: 2,
},
- tokens: 8192,
},
],
checkModel: '360gpt-turbo',