I have found a bug in the TC that I believe lies somewhere in the async runtime.
It is not urgent for Madrid but it is quite significant for our work with dynamic properties.
Reproduce:
The bug can be reproduced by running the defer example program on the main branch (fbfb769) with the async runtime: cargo run -- --input-file examples/defer.input examples/defer.lola
Expected output:
The expected output can be found by running the queuing runtime: cargo run -- --input-file examples/defer.input examples/defer.lola --runtime queuing
There are two bugs when running with the async runtime.
The program does not terminate as expected when the input stream is emptied (hence the ^C in the actual output).
The produced outputs are slightly wrong. Notice the difference between the two outputs at z[1]. Also, notice that z[15] is present in the async output.
From the async output, it is as if the z-stream uses the x[0] value, despite having already produced an Unknown value at that timestep.
Further digging:
For the first bug, I have identified that it was introduced in commit be5c41b.
This is the commit where we started using stream! instead of unfold (see references below).
However, even with that commit, the second bug is present.
One difference between the two implementations is that the unfold version yields until the prop_stream terminates, whereas the stream! implementation yields from the eval_output_stream. (I believe this is technically a small bug in the old implementation.)
I tried implementing a version with the following while-let loop to make the two more similar: while let (Some(eval_res), Some(_)) = join!(eval_output_stream.next(), prop_stream.next()) { ... }
This terminates the program as expected, but oddly enough it makes z contain only 14 samples at termination instead of 16.
I.e., one too few compared to the queuing runtime instead of one too many.
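The effect of the loop shape on the sample count can be illustrated with a plain-iterator sketch. This is only an analogy for the termination conditions, not the actual async code, and the stream lengths here are made up:

```rust
// Illustrative sketch: three loop shapes over a "prop" source and an
// "eval" source, counting how many samples each shape yields before
// it terminates. (Hypothetical lengths: prop has 2 items, eval has 4.)
fn yields(shape: u8) -> usize {
    let mut prop = vec![1, 2].into_iter();
    let mut eval = vec![10, 20, 30, 40].into_iter();
    let mut yielded = 0;
    match shape {
        // stream!-like shape: driven by the eval side alone;
        // ignores when prop ends.
        0 => {
            while eval.next().is_some() {
                yielded += 1;
            }
        }
        // unfold-like shape: driven by the prop side, pulling one
        // eval value per step.
        1 => {
            while prop.next().is_some() {
                let _ = eval.next();
                yielded += 1;
            }
        }
        // joined shape: requires both sides to produce a value,
        // stopping as soon as either ends.
        _ => {
            while let (Some(_), Some(_)) = (eval.next(), prop.next()) {
                yielded += 1;
            }
        }
    }
    yielded
}

fn main() {
    assert_eq!(yields(0), 4); // keeps going after prop is exhausted
    assert_eq!(yields(1), 2); // stops with prop
    assert_eq!(yields(2), 2); // stops with the shorter side
    println!("ok");
}
```

Note that in the joined shape the last pulled eval value is consumed but dropped when the prop side runs out, which is one plausible source of an off-by-one in the "one too few" direction.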
References:
stream! implementation:
```rust
pub fn defer(
    ctx: &dyn StreamContext<Value>,
    mut prop_stream: OutputStream<Value>,
    history_length: usize,
) -> OutputStream<Value> {
    /* Subcontext with current values only */
    let subcontext = ctx.subcontext(history_length);
    Box::pin(stream! {
        let mut eval_output_stream: Option<OutputStream<Value>> = None;
        // Yield Unknown until we have a value to evaluate, then evaluate it
        while let Some(current) = prop_stream.next().await {
            match current {
                Value::Str(defer_s) => {
                    // We have a string to evaluate so do so
                    let defer_parse = &mut defer_s.as_str();
                    let expr = lola_expression
                        .parse_next(defer_parse)
                        .expect("Invalid eval str");
                    eval_output_stream =
                        Some(UntimedLolaSemantics::to_async_stream(expr, subcontext.deref()));
                    break;
                }
                Value::Unknown => {
                    // Consume a sample from the subcontext but return Unknown (aka. Waiting)
                    subcontext.advance();
                    yield Value::Unknown;
                }
                _ => panic!("Invalid defer property type {:?}", current),
            }
        }
        let mut eval_output_stream = eval_output_stream.expect("No eval stream");
        // Yield the saved value until the inner stream is done
        while let Some(eval_res) = eval_output_stream.next().await {
            subcontext.advance();
            yield eval_res;
        }
    })
}
```
unfold implementation:
```rust
pub fn defer(
    ctx: &dyn StreamContext<Value>,
    prop_stream: OutputStream<Value>,
    history_length: usize,
) -> OutputStream<Value> {
    /* Subcontext with current values only */
    let subcontext = ctx.subcontext(history_length);
    /* unfold() creates a Stream from a seed value. */
    Box::pin(stream::unfold(
        (subcontext, prop_stream, None::<Value>),
        |(subcontext, mut x, saved)| async move {
            /* x.next() returns None if we are done unfolding. Return in that case. */
            let current = x.next().await?;
            /* If we have a saved state then use that, otherwise use current */
            let defer_str = saved.unwrap_or_else(|| current);
            match defer_str {
                Value::Str(defer_s) => {
                    let defer_parse = &mut defer_s.as_str();
                    let expr = match lola_expression.parse_next(defer_parse) {
                        Ok(expr) => expr,
                        Err(_) => unimplemented!("Invalid eval str"),
                    };
                    let mut es = UntimedLolaSemantics::to_async_stream(expr, subcontext.deref());
                    let eval_res = es.next().await?;
                    subcontext.advance();
                    return Some((eval_res, (subcontext, x, Some(Value::Str(defer_s)))));
                }
                Value::Unknown => {
                    // Consume a sample from the subcontext but return Unknown (aka. Waiting)
                    subcontext.advance();
                    Some((Value::Unknown, (subcontext, x, None)))
                }
                _ => panic!("We did not have memory and defer_str was not a Str"),
            }
        },
    ))
}
```
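For anyone unfamiliar with the unfold pattern, it has a synchronous analogue in std::iter::from_fn: state is threaded through each step, and returning None ends the sequence, just as x.next().await? does above. A minimal sketch (hypothetical state and values, unrelated to the actual defer() logic):

```rust
use std::iter;

// Synchronous analogue of unfold-style iteration: a closure owns its
// state (here a counter) and yields Some(value) per step, or None to
// terminate the sequence.
fn counted_squares(limit: u32) -> Vec<u32> {
    let mut state = 0u32; // seed, analogous to unfold's seed tuple
    iter::from_fn(move || {
        if state >= limit {
            None // ending the sequence, like x.next().await? returning None
        } else {
            state += 1;
            Some(state * state) // yield a value, keep the updated state
        }
    })
    .collect()
}

fn main() {
    assert_eq!(counted_squares(4), vec![1, 4, 9, 16]);
    println!("ok");
}
```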