Releases · IBM/prompt-declaration-language
Version 0.6.1
What's Changed
- feat: update rust python support to pull in python stdlib by @starpit in #877
- feat: port rust model pull logic to use rust AST by @starpit in #875
- fix: avoid openssl in rust app for now by @starpit in #880
- gsm8k multi-plan, tree-of-thought, tree-of-thought with few shots by @vazirim in #881
- feat: rust interpreter support for modelResponse, ollama-rs tooling calling, and no-stream by @starpit in #883
- chore: minor code cleanups to rust interpreter by @starpit in #884
- test: update github action rust interpreter test to remove ollama pull by @starpit in #885
- fix: add kind tags to rust ast blocks by @starpit in #887
- fix: update rust interpreter to avoid global emit/scope state by @starpit in #888
- fix: rust pull logic may not always pull by @starpit in #889
- feat: remove tauri cli support for running python interpreter by @starpit in #890
- chore: bump rust dependencies to resolve alert by @starpit in #893
- test: tauri github actions test should apt install the deb by @starpit in #892
- feat: support for --data and --data-file in rust interpreter by @starpit in #894
- fix: begin phasing in Metadata (common defs, etc. attrs) into rust AST by @starpit in #895
- chore: update granite-io dependency by @mandel in #896
- refactor: move def attr into Metadata (rust interpreter) by @starpit in #897
- feat: introduce Expr typing and apply it to IfBlock.condition by @starpit in #899
- feat: update rust Call AST to use Expr for condition attr by @starpit in #901
- chore: bump to rust 2024 edition by @starpit in #902
- feat: continue to flesh out block metadata structure in rust by @starpit in #898
- refactor: add metadata attr to remaining rust block asts by @starpit in #903
- feat: update rust Repeat AST to use Expr for `for` attr by @mandel in #904
- refactor: introduce Advanced enum to rust AST by @starpit in #906
- refactor: refactor rust ast to place metadata in common struct by @starpit in #909
- fix: improve deserialization of python-generated model block traces by @starpit in #910
- fix: in rust ast, allow ModelBlock model to be an expr by @starpit in #911
- feat: initial pdl__id and --trace support for rust interpreter by @starpit in #912
- fix: update rust interpreter to create Data blocks for expr eval, and model_input trace field by @starpit in #914
- fix: populate trace context field in rust interpreter by @starpit in #915
- Update stop sequences in parameters by @jgchn in #861
- refactor: extract platform-generic logic from run_ollama_model() handler by @starpit in #916
- fix: rust interpreter was not handling pdl__context for re-runs of traces by @starpit in #917
- feat: improve support for importing stdlib in python code blocks by @starpit in #918
- skeleton-of-thought example by @vazirim in #919
- Bump litellm and openai versions by @vazirim in #920
- fix: improve support for rust interpreter python imports from venv by @starpit in #922
- chore: bump tauri and npm dependencies by @starpit in #923
- chore: bump ui to 0.6.1 by @starpit in #921
Full Changelog: v0.6.0...v0.6.1
Version 0.6.0
This new release has two major changes:
- `pdl-lint`, a linter for PDL programs (thanks to @vite-falcon!)
- messages in the context with the same role are not automatically merged anymore.
For example, the following program generates 3 messages:

```yaml
lastOf:
- role: user
  text:
  - hello
  - "\n"
  - world
- ${pdl_context}
```

result:

```json
[{"role": "user", "content": "hello", "defsite": "lastOf.text.0"}, {"role": "user", "content": "\n", "defsite": "lastOf.text.1"}, {"role": "user", "content": "world", "defsite": "lastOf.text.2"}]
```
To generate only one message, you have to use a `message` block:

```yaml
lastOf:
- role: user
  content:
    text:
    - hello
    - "\n"
    - world
- ${pdl_context}
```

result:

```json
[{"role": "user", "content": "hello\nworld", "defsite": "lastOf.message"}]
```
What's Changed
- Switch to granite-io version 0.2 by @mandel in #818
- feat: update beeai compiler to support compiling directly from python source by @starpit in #834
- Examples restructuring, tutorial changes by @vazirim in #836
- Bug fixes for setting default parameters by @vazirim in #838
- fix: pdl view trace.json fixes by @starpit in #843
- fix: Bug in pdl_schema_error_analyzer that raises exception during analysis by @vite-falcon in #851
- Clean up run examples and automate result updating via GH Actions by @jgchn in #853
- feat: do not merge messages with same role by @mandel in #846
- fixes to react examples by @vazirim in #859
- feat: `message` blocks contribute the message instead of the content to the context by @mandel in #862
- feat: do not stringify messages content by @mandel in #858
- feat: rust interpreter by @starpit in #857
- gsm8k plan with few-shots by @vazirim in #870
- feat: add pdl-lint tool that can be configured via pyproject.toml by @vite-falcon in #864
New Contributors
- @vite-falcon made their first contribution in #851
Full Changelog: v0.5.1...v0.6.0
Version 0.5.1
What's Changed
- Fix parsing of localized expressions by @mandel in #804
- fix: update ui to support pulling interpreter from local dir by @starpit in #807
- fix: interpreter squashes PdlRuntimeErrors by @starpit in #800
- fix: interpreter fails when passed parameters with null/None values by @starpit in #809
- Rename `expr` field of `LocalizedExpression` into `pdl__expr` by @mandel in #806
- feat: compile bee to pdl by @starpit in #817
- fix: rerun in ui bombs by @starpit in #819
- fix: async model call prints should only occur if PDL_VERBOSE_ASYNC env is set by @starpit in #820
- fix: skip litellm 1.63.14 by @starpit in #827
- fix: pin openai==1.61.0 pip due to breaking change on their side by @starpit in #829
Full Changelog: v0.5.0...v0.5.1
Version 0.5.0
What's Changed
- feat: add model token usage stats to trace by @starpit in #762
- feat: add support for CodeBlocks with argv by @starpit in #756
- feat: add Usage tab to detail drawer for model blocks by @starpit in #778
- Systematically used localized expressions in the trace and add `pdl__result` in them by @mandel in #760
- feat: add Result tab to detail drawer by @starpit in #782
- doc: update readme ui screenshot to show usage metrics by @starpit in #793
- Copy the schema on the website by @mandel in #797
Full Changelog: v0.4.2...v0.5.0
Version 0.4.2
What's Changed
- feat: in UI allow pty execution to be canceled by @starpit in #673
- feat: switch ui tile run menu to split action dropdown by @starpit in #675
- default parameters for ollama_chat models by @vazirim in #689
- Change examples to ollama_chat by @jgchn in #691
- Grade School Math example by @esnible in #694
- Docstrings and examples for PDL concepts by @esnible in #693
- feat: ui temperature stability by @starpit in #697
- feat: add Pagination to UI by @starpit in #704
- demo hallucination trace by @vazirim in #705
- Structured decoding bug fix for watsonx, ollama + traceback for python code blocks by @vazirim in #708
- feat: interpreter should report (stderr) call start/end and timing by @starpit in #711
- feat: update interpreter to print out model response messages in async mode by @starpit in #717
- Update parse_str interface by @mandel in #741
- Support dev version of granite-io by @mandel in #742
- feat: pull models in rust by @starpit in #743
- feat: update auto-pull logic to run in parallel with pip install by @starpit in #744
- feat: use shasum as cache key for venv by @starpit in #748
- feat: add gsm8k demo to ui (demo9) by @starpit in #753
- Contributed values are expressions by @mandel in #754
- Use `${ pdl_context }` as default value for the `input` field of a model block by @mandel in #757 (see the sketch after this list)
- added litellm param to ignore structure decoding param in tools example by @vazirim in #750
- gsm8k with planning by @vazirim in #761
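To illustrate the new `${ pdl_context }` default for `input` (#757), here is a minimal sketch; the model name is only an assumption for illustration:

```yaml
# A model block with its `input` written out explicitly; with #757,
# omitting `input` behaves the same, since it now defaults to ${ pdl_context }.
text:
- "Hello"
- model: ollama_chat/granite3.2:2b  # assumed model name, any chat model works
  input: ${ pdl_context }
```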
Full Changelog: v0.4.1...v0.4.2
Version 0.4.1
Version 0.4.0
What's Changed
- gsm8k examples and results by @vazirim in #322
- feat: in react ui Context tab, use breadcrumb ui to link to def site by @starpit in #380
- Add json-repair by @claudiosv in #279
- feat: in react ui, add Masonry view to replace Transcript view by @starpit in #395
- Asynchronous LLMs calls by @mandel in #284
- feat: in react UI, add Run button to support re-running program by @starpit in #431
- Add a `max_iterations` field to the `repeat/until` loop (and make `until` optional) and remove `repeat/num_iterations` loops by @mandel in #434 (see the loop sketch after this list)
- Granite 3.1 chat template fix by @vazirim in #433
- feat: add homebrew formula by @starpit in #447
- Unify `repeat` and `for` loops and add `while` by @mandel in #444
- feat: in react ui re-enable Replay/Run button and functionality by @starpit in #451
- intrinsics hallucination example by @vazirim in #464
- doc: update react ui build for production documentation by @starpit in #476
- feat: syntax highlight code in markdown blocks by @starpit in #480
- feat: in react ui, use pty for Run by @starpit in #492
- feat: add final result 'output of program' to react ui masonry by @starpit in #497
- removed old viewer by @vazirim in #510
- feat: add Platform field to model detail Summary tab by @starpit in #516
- modules implementation by @vazirim in #507
- New RAG example by @esnible in #427
- feat: initial support for showing structured model responses in detail ui by @starpit in #527
- updated PDL quick reference by @hirzel in #545
- feat: add opengraph meta info to ui by @starpit in #549
- Migrate some examples from Replicate to Ollama by @jgchn in #522
- Added Granite 3.2 chat template, template that works with ollama by @vazirim in #557
- `model` blocks can use the granite-io platform by @mandel in #529
- Join loop iterations as objects by @mandel in #580
- Add flexibility on the case of the values for the fields `lang`, `parser`, and `mode` by @mandel in #612
- Incorrect endpoint in Telemetry doc by @esnible in #619
- Deploy Ollama to run examples by @jgchn in #629
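As a rough sketch of the loop changes in #434 and #444, the snippets below show a `for`-driven loop and a bounded `until` loop; the exact spelling and placement of the fields is an assumption based on the PR titles:

```yaml
# Sketch 1: iterate over a list with for/repeat.
for:
  number: [1, 2, 3, 4]
repeat:
  "${ number }\n"
---
# Sketch 2: repeat with a stopping condition and an iteration cap
# (assumes a boolean variable `done` is defined elsewhere).
repeat:
  "flip a coin\n"
until: ${ done }
max_iterations: 5
```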
New Contributors
Full Changelog: v0.3.0...v0.4.0
Version 0.3.0
What's Changed
- Port pdl-live to react+patternfly+vite by @starpit in #253
- Embed PDL interpreter into react (tauri) app builds by @starpit in #350
- Remove BAM support by @mandel in #265
- Config options for model default parameters by @esnible in #266
- Add a `match` block by @mandel in #252 (see the sketch after this list)
- Don't crash without log on keyboard exit by @esnible in #346
- lowered lower bound of litellm to 1.57.3 by @vazirim in #365
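For the new `match` block (#252), a minimal sketch; the field names below are assumptions for illustration and may differ from the released syntax:

```yaml
# Sketch: compare a value against a list of cases and take the first
# branch that matches (case/then spellings assumed).
defs:
  answer: "yes"
match: ${ answer }
with:
- case: "yes"
  then: "Great!"
- case: "no"
  then: "Too bad."
- then: "I did not understand."
```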
New Contributors
Full Changelog: v0.2.0...v0.3.0
Version 0.2.0
Warning: This release introduces a non-backward compatible change: `call` blocks take closures as arguments instead of function names. In practice, a function call like `call: f` must be replaced by `call: ${f}`.
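For example (a minimal sketch, assuming a function `f` defined with `function`/`return`):

```yaml
# Define a function f, then call it; note the closure expression ${ f }
# in the call block rather than the bare name f.
defs:
  f:
    function:
      name: str
    return: "Hello ${ name }!"
text:
- call: ${ f }
  args:
    name: world
```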
What's Changed
- The argument of `call:` is a closure, not a name by @mandel in #242
- Initial OpenTelemetry support by @esnible in #254
- Removed upper bound on LiteLLM version by @vazirim in #255
- Add comments to examples to make them more obvious by @esnible in #259
Full Changelog: v0.1.1...v0.2.0
Version 0.1.1
What's Changed
- remove None valued parameters when calling LLMs by @vazirim in #212
- Flush the output after each print in streaming mode by @mandel in #213
- support for expressions in function call args by @vazirim in #221
- Show the model name, and request info if possible, on error by @esnible in #226
New Contributors
- @trsudarshan made their first contribution in #209
- @esnible made their first contribution in #222
Full Changelog: v0.1.0...v0.1.1