Releases · Ironclad/rivet
Rivet IDE v1.7.6
Bug Fixes
- Fix regression with rendering objects as strings causing crashes or hanging UI
Rivet Libraries v1.14.1
Bug Fixes & Improvements
- Fix Destructure Node wrapping single-value results in arrays. If you use `$.path` and it only matches one thing, you will get `"result"` out instead of `["result"]` (see the sketch after this list).
- Validate HTTP Call Node URLs before calling them
- Improve Object Node type inference - if the type being returned is an array, it will correctly return `object[]` instead of `object` to improve type coercion in later nodes.
- Improve type coercion for `type: 'object'` values. This is now treated more or less the same as an `any` (the actual type is inferred from the value's type). This improves coercion to string for arrays.
- Fix `isSplitSequential` being serialized to the project file.
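The Destructure Node change is easiest to see in code. Below is a minimal sketch using the jsonpath-plus package purely for illustration; it is not Rivet's internal implementation.

```typescript
// Illustration of the single-match unwrapping described above.
// Assumes the jsonpath-plus package; Rivet's internals may differ.
import { JSONPath } from 'jsonpath-plus';

const data = { path: 'result' };

// JSONPath returns an array of matches.
const matches = JSONPath({ path: '$.path', json: data }) as unknown[];

// New behavior: a single match is unwrapped rather than returned as an array.
const output = matches.length === 1 ? matches[0] : matches;

console.log(output); // "result" rather than ["result"]
```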
Rivet IDE v1.7.5
Bug Fixes & Improvements
- Persist last debugger connection URI
- Fix flickering of nodes while dragging them
- Fix `fsPath` error when opening a new project
- Fix hotkeys on Windows
- Fix LM Studio auto-configure path
- Limit Object Node body rendering length so that it doesn't expand infinitely
- Fix errors when running Rivet in a browser
- Fix constant memory usage increase on Windows
- Fix high CPU usage when Rivet is idle
- Fix console error when dragging wire around
- Fix Destructure Node wrapping its outputs in an array, when there is only one match for the output.
- Validate URLs for the HTTP Call Node
- Object Node is now more intelligent about its output data type. You can use arrays in an Object Node and they will be interpreted correctly when coercing in subsequent nodes.
- Fix coercion from object to string when the object is really an array. Object and `any` now function more or less the same (by analyzing the contents to see what type the value is)
- Fix saving the "split sequential" toggle
Rivet Libraries v1.14.0
New Features
- Google Gemini Plugin.
Bug Fixes
- Packages updated.
- AssemblyAI plugin and package updated.
- Fix concurrent chat node with local LLM (#249)
- Fix creating objects with `type` and `value` properties (see the sketch after this list).
- Fix abort/subscribe memory leak.
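For context on the `type`/`value` fix above: Rivet represents values internally as `{ type, value }` wrappers (e.g. `{ type: 'string', value: 'hello' }`), so a user-built object that uses those same keys is easy to misread as a wrapper. The snippet below is only a hypothetical illustration of such an object, not a statement of what the bug actually was.

```typescript
// Hypothetical example of an object whose keys collide with Rivet's
// internal { type, value } data-value shape.
const userObject = {
  type: 'invoice', // plain user data, not a Rivet data type
  value: 129.99,   // plain user data, not a wrapped value
};
```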
Rivet IDE v1.7.4
New Features
- Added Google plugin! Use Google Gemini.
Changes & Fixes
- Plugin UI: Updated image thumbnails for Autoevals, Python, and FS plugins.
- Fix JavaScript example in Code Node.
- Fix input/output names in the Code Node crashing the render.
- Packages updated.
- AssemblyAI plugin and package updated.
- Fix concurrent chat node with local LLM (#249)
- Fix creating objects with type and value properties.
- Fix abort/subscribe memory leak.
Rivet Libraries v1.13.4
Bug Fixes
- Fix circular import with ESM importing (might be esbuild specific)
Rivet Libraries v1.13.3
Fixes
- Fix base64 encoding when executing in Node.js, which fixes GPT-4 Vision being used there (see the sketch below)
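A generic sketch of the cross-environment encoding issue behind this fix; it shows the shape of the problem, not Rivet's actual code.

```typescript
// Encoding image bytes to base64 differs between Node.js and the browser.
function toBase64(bytes: Uint8Array): string {
  if (typeof Buffer !== 'undefined') {
    // Node.js: Buffer handles arbitrary binary data directly.
    return Buffer.from(bytes).toString('base64');
  }
  // Browser: btoa expects a binary string, so convert byte by byte.
  let binary = '';
  for (const byte of bytes) {
    binary += String.fromCharCode(byte);
  }
  return btoa(binary);
}
```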
Rivet IDE v1.7.3
New Features
- Added Ollama plugin! Use Ollama to run local LLM models.
Changes & Fixes
- OpenAI Plugin: Allow more values (better coercion) to be passed into the `messages` input of the Run Thread node.
- OpenAI Plugin: Fix data type for the `functions` input of the Run Thread node
- OpenAI Plugin: Include the `message` input to the "On Message Creation" subgraph like the helper text says
- OpenAI Plugin: Fix error message when a function call isn't mapped in the handlers
- Added helper tooltip to all input port toggle switches
- Fixed base64 encoding images for GPT-4 Vision when using Node executor
Rivet Libraries v1.13.2
Tweaks & Fixes
- Fixed `NodeDatasetProvider` not preserving options such as `save` when it does not find a data file.
- Added `requireFile` option to `NodeDatasetProvider`, so that it can error when it does not find a dataset file (see the usage sketch after this list).
- Better coercion of the messages input for the Run Thread node.
- Fix a Run Thread node error message.
- Correct `functions` input data type for the Run Thread node.
- Run Thread passes a `message` input to the "on message creation" subgraph like the docs say
Rivet Libraries v1.13.1
Tweaks & Bug Fixes
- Fixed request token count being 0 when the number of choices is unset (defaults to 1)
- Account for GPT functions in token counting by approximating what OpenAI does to calculate their token counts.