diff --git a/README-EN.md b/README-EN.md
deleted file mode 100644
index e5c223d..0000000
--- a/README-EN.md
+++ /dev/null
@@ -1,186 +0,0 @@
-![Maven Central](https://img.shields.io/maven-central/v/io.github.lambdua/service?color=blue)
-
-> ⚠️ This project is a fork of [openai Java](https://github.com/TheoKanning/openai-java). The original author appears to have stopped maintaining it, and it no longer meets my needs, so I have decided to continue maintaining the project and adding new features.
-
-# OpenAI-Java
-Java libraries for using OpenAI's GPT APIs. Supports GPT-3, ChatGPT, and GPT-4.
-
-Includes the following artifacts:
-- `api`: request/response POJOs for the GPT APIs.
-- `client`: a basic Retrofit client for the GPT endpoints; includes the `api` module.
-- `service`: a basic service class that creates and calls the client. This is the easiest way to get started.
-
-as well as an example project using the service.
-
-## Supported APIs
-- [Models](https://platform.openai.com/docs/api-reference/models)
-- [Completions](https://platform.openai.com/docs/api-reference/completions)
-- [Chat Completions](https://platform.openai.com/docs/api-reference/chat/create)
-- [Edits](https://platform.openai.com/docs/api-reference/edits)
-- [Embeddings](https://platform.openai.com/docs/api-reference/embeddings)
-- [Audio](https://platform.openai.com/docs/api-reference/audio)
-- [Files](https://platform.openai.com/docs/api-reference/files)
-- [Fine-tuning](https://platform.openai.com/docs/api-reference/fine-tuning)
-- [Images](https://platform.openai.com/docs/api-reference/images)
-- [Moderations](https://platform.openai.com/docs/api-reference/moderations)
-- [Assistants](https://platform.openai.com/docs/api-reference/assistants)
-
-## Importing
-
-### Gradle
-`implementation 'io.github.lambdua:{api|client|service}:version'`
-
-### Maven
-```xml
-<dependency>
-    <groupId>io.github.lambdua</groupId>
-    <artifactId>{api|client|service}</artifactId>
-    <version>version</version>
-</dependency>
-```
-
-## Usage
-### Data classes only
-If you want to make your own client, just import the POJOs
from the `api` module.
-Your client will need to use snake case to work with the OpenAI API.
-
-### Retrofit client
-If you're using Retrofit, you can import the `client` module and use the [OpenAiApi](client/src/main/java/com/theokanning/openai/OpenAiApi.java).
-You'll have to add your auth token as a header (see [AuthenticationInterceptor](client/src/main/java/com/theokanning/openai/AuthenticationInterceptor.java))
-and set your converter factory to use snake case and only include non-null fields.
-
-### OpenAiService
-If you're looking for the fastest solution, import the `service` module and use [OpenAiService](service/src/main/java/com/theokanning/openai/service/OpenAiService.java).
-
-```java
-OpenAiService service = new OpenAiService("your_token");
-CompletionRequest completionRequest = CompletionRequest.builder()
-        .prompt("Somebody once told me the world is gonna roll me")
-        .model("babbage-002")
-        .echo(true)
-        .build();
-service.createCompletion(completionRequest).getChoices().forEach(System.out::println);
-```
-
-### Customizing OpenAiService
-If you need to customize OpenAiService, create your own Retrofit client and pass it in to the constructor.
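As an aside on the snake-case requirement mentioned under "Data classes only" above: the OpenAI API expects camelCase POJO field names to appear as snake_case JSON keys. A minimal, self-contained sketch of that naming mapping (illustrative only — a real client would configure this on its Jackson converter factory rather than converting names by hand):

```java
public class SnakeCase {
    // Convert a camelCase field name to the snake_case key the OpenAI API expects,
    // e.g. "maxTokens" -> "max_tokens"
    static String toSnakeCase(String camel) {
        StringBuilder sb = new StringBuilder();
        for (char c : camel.toCharArray()) {
            if (Character.isUpperCase(c)) {
                sb.append('_').append(Character.toLowerCase(c));
            } else {
                sb.append(c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toSnakeCase("maxTokens")); // prints "max_tokens"
        System.out.println(toSnakeCase("topP"));      // prints "top_p"
    }
}
```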
-For example, do the following to add request logging (after adding the logging Gradle dependency):
-
-```java
-ObjectMapper mapper = defaultObjectMapper();
-OkHttpClient client = defaultClient(token, timeout)
-        .newBuilder()
-        .addInterceptor(new HttpLoggingInterceptor())
-        .build();
-Retrofit retrofit = defaultRetrofit(client, mapper);
-
-OpenAiApi api = retrofit.create(OpenAiApi.class);
-OpenAiService service = new OpenAiService(api);
-```
-
-### Adding a Proxy
-To use a proxy, modify the OkHttp client as shown below:
-```java
-ObjectMapper mapper = defaultObjectMapper();
-Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress(host, port));
-OkHttpClient client = defaultClient(token, timeout)
-        .newBuilder()
-        .proxy(proxy)
-        .build();
-Retrofit retrofit = defaultRetrofit(client, mapper);
-OpenAiApi api = retrofit.create(OpenAiApi.class);
-OpenAiService service = new OpenAiService(api);
-```
-
-### Functions
-You can easily declare your functions and their executors using the ChatFunction class, along with your own classes that define their available parameters. The FunctionExecutor helper then runs them and converts the results for you.
-
-First we declare our function parameters:
-```java
-public class Weather {
-    @JsonPropertyDescription("City and state, for example: León, Guanajuato")
-    public String location;
-    @JsonPropertyDescription("The temperature unit, can be 'celsius' or 'fahrenheit'")
-    @JsonProperty(required = true)
-    public WeatherUnit unit;
-}
-
-public enum WeatherUnit {
-    CELSIUS, FAHRENHEIT;
-}
-
-public static class WeatherResponse {
-    public String location;
-    public WeatherUnit unit;
-    public int temperature;
-    public String description;
-
-    // constructor
-}
-```
-
-Next, we declare the function itself and associate it with an executor; in this example we fake a response from some API:
-```java
-ChatFunction.builder()
-        .name("get_weather")
-        .description("Get the current weather of a location")
-        .executor(Weather.class, w -> new WeatherResponse(w.location, w.unit, new Random().nextInt(50), "sunny"))
-        .build()
-```
-
-Then, we use the FunctionExecutor object from the `service` module to execute the call and transform the result into a message that is ready for the conversation:
-```java
-List<ChatFunction> functionList = // list with functions
-FunctionExecutor functionExecutor = new FunctionExecutor(functionList);
-
-List<ChatMessage> messages = new ArrayList<>();
-ChatMessage userMessage = new ChatMessage(ChatMessageRole.USER.value(), "Tell me the weather in Barcelona.");
-messages.add(userMessage);
-ChatCompletionRequest chatCompletionRequest = ChatCompletionRequest
-        .builder()
-        .model("gpt-3.5-turbo-0613")
-        .messages(messages)
-        .functions(functionExecutor.getFunctions())
-        .functionCall(new ChatCompletionRequestFunctionCall("auto"))
-        .maxTokens(256)
-        .build();
-
-ChatMessage responseMessage = service.createChatCompletion(chatCompletionRequest).getChoices().get(0).getMessage();
-ChatFunctionCall functionCall = responseMessage.getFunctionCall(); // might be null, but in this case it is certainly a call to our 'get_weather' function.
-
-ChatMessage functionResponseMessage = functionExecutor.executeAndConvertToMessageHandlingExceptions(functionCall);
-messages.add(functionResponseMessage);
-```
-> **Note:** The `FunctionExecutor` class is part of the `service` module.
-
-You can also create your own function executor; `ChatFunctionCall.getArguments()` returns a JsonNode for simplicity, which should help you do that.
-
-For a more in-depth look, refer to the conversational example that employs functions in [OpenAiApiFunctionsExample.java](example/src/main/java/example/OpenAiApiFunctionsExample.java),
-or to an example using functions and streaming: [OpenAiApiFunctionsWithStreamExample.java](example/src/main/java/example/OpenAiApiFunctionsWithStreamExample.java)
-
-### Streaming thread shutdown
-If you want to shut down your process immediately after streaming responses, call `OpenAiService.shutdownExecutor()`.
-This is not necessary for non-streaming calls.
-
-## Running the example project
-All the [example](example/src/main/java/example/OpenAiApiExample.java) project requires is your OpenAI API token.
-
-## FAQ
-### Does this support GPT-4?
-Yes! GPT-4 uses the ChatCompletion API, and you can see the latest model options [here](https://platform.openai.com/docs/models/gpt-4).
-GPT-4 is currently in a limited beta (as of 4/1/23), so make sure you have access before trying to use it.
-
-### Does this support functions?
-Absolutely! It is very easy to use your own functions without worrying about doing the dirty work.
-As mentioned above, you can refer to [OpenAiApiFunctionsExample.java](example/src/main/java/example/OpenAiApiFunctionsExample.java) or
-[OpenAiApiFunctionsWithStreamExample.java](example/src/main/java/example/OpenAiApiFunctionsWithStreamExample.java) for an example.
-
-### Why am I getting connection timeouts?
-Make sure that OpenAI is available in your country.
-
-### Why doesn't OpenAiService support x configuration option?
-Many projects use OpenAiService, and to support them best I've kept it extremely simple.
-You can create your own OpenAiApi instance to customize headers, timeouts, base URLs, etc.
-If you want features like retry logic and async calls, you'll have to make an `OpenAiApi` instance and call it directly instead of using `OpenAiService`.
-
-## License
-Published under the MIT License
diff --git a/README.md b/README.md
index 0d9fe57..14a3a74 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,6 @@
 > ⚠️ This project is a fork of [openai Java](https://github.com/TheoKanning/openai-java). The original author appears to have stopped maintaining it, and it no longer meets my needs, so I decided to continue maintaining the project and adding new features.
 > [Release notes](https://github.com/Lambdua/openai4j/releases)
-[english doc](README-EN.md)
 # OpenAI-Java
 Java libraries for using OpenAI's GPT APIs. Supports all OpenAI models, including the latest gpt-4-turbo vision model.