
Bi streaming no debugger stuff #219

Merged — 50 commits, Jul 31, 2021

Conversation

goodboy (Owner) commented Jul 5, 2021

This is a first draft of a bidirectional streaming API as discussed in #53.

This PR is a clone of the original #209 but with the debugger changes factored out, to see if we can get a clean CI run as well as a smaller patch set for reviewers. A follow-up PR #220 includes the debugger improvements that now use and rely on this new API.

Critique and suggestions very welcome from lurkers 😎.


continued task list from #209:

  • more extensive tests for closing down either side of the stream early
  • port _debug.py remote tty locking to context api.
  • tests where the consumer task uses `async for msg in stream` while the sender is running independently in another trio task (see the sketch after this list)
  • [ ] should we add broadcasting support here for "We need broadcast channels, stat." #204? let's keep it separate.
    (cf. anyio's StapledObjectStream — obviously we won't use the *Object* variant since we're msgpack type constrained). Actual code is here.
  • example in the main readme — I'm thinking that should be a big boon when compared to projects like faust and ray which (I think) are unidirectional only?
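
The consumer/sender test case above might look roughly like the sketch below. This is a hedged sketch only, not the tests added here: it assumes the caller already holds an open bidirectional `stream` of the new `MsgStream` type and that the remote context task echoes every value back.

```python
import trio


async def _send_values(stream, count: int = 10) -> None:
    # sender runs independently of the consumer task below
    for i in range(count):
        await stream.send(i)


async def consume_while_sending(stream) -> None:
    # `stream` is assumed to be an already-open bidir MsgStream whose
    # remote end echoes each value back to us
    async with trio.open_nursery() as n:
        n.start_soon(_send_values, stream)

        received = []
        async for msg in stream:
            received.append(msg)
            if len(received) == 10:
                break

        assert received == list(range(10))
```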

The following were deferred to #223:

  • docs on the new apis
  • possibly a better internal design for tracking bidir vs unidir context usage, to avoid the hacky `if not self._ctx._portal` check inside `ReceiveMsgStream.aclose()`
  • from "Ems to bidir streaming" pikers/piker#190:

    it would be nice if in tractor we could require either a `ctx` arg, or a named arg with `ctx` in it and a type annotation of `tractor.Context`, instead of strictly requiring a `ctx` arg.

  • move to a split SendMsgStream / ReceiveMsgStream type set and staple them together using a channel/messaging equivalent of trio.StapledStream (see the sketch after this list)? I'm thinking we'll make all streams a MsgStream and just disallow send on receive-only streams (which I'd imagine will eventually be the minority use case).
  • The main question was more about the cancellation race conditions that can arise where the local channel is killed after it has sent the stop, and whether or not we should wait on / shield the mem chan until the msg is processed later (also presumably this is all before the sub-actor is killed).

    • this really has to do with whether or not we want a channel teardown transaction eventually. I personally think right now it's not a requirement (and it makes facing the two generals problem more immediate). The saner thing to address first is the naive discovery issues we have as per "Multi-root discovery: pragmatic, simple consensus" #216.
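
For the SendMsgStream / ReceiveMsgStream stapling idea above, a minimal sketch of a trio.StapledStream-style wrapper might look like the following. All names here are hypothetical; the item itself leans toward a single MsgStream type that simply disallows send() on receive-only streams instead.

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class StapledMsgStream:
    # hypothetical split halves from the task item above
    send_stream: Any      # would be a SendMsgStream
    receive_stream: Any   # would be a ReceiveMsgStream

    async def send(self, msg: Any) -> None:
        await self.send_stream.send(msg)

    async def receive(self) -> Any:
        return await self.receive_stream.receive()

    def __aiter__(self):
        # iteration just delegates to the receive half
        return self.receive_stream.__aiter__()

    async def aclose(self) -> None:
        # close both halves, mirroring trio.StapledStream.aclose()
        await self.send_stream.aclose()
        await self.receive_stream.aclose()
```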

Outstanding Windows CI hang work, carried over from #209:

  • [ ] a comment from #209 suggests it might have been the multi-task producer task, but removing that on 83dd8e4 shows it's still not working... it was some wacky logging thing 🙄

@goodboy force-pushed the bi_streaming_no_debugger_stuff branch from e084925 to d130f2c on July 5, 2021 19:04
@goodboy mentioned this pull request Jul 5, 2021
goodboy (Owner, Author) commented Jul 5, 2021

@chrizzFTD if you've got a second I would sure love to know why these tests hang 😂

@goodboy force-pushed the bi_streaming_no_debugger_stuff branch from 25938fe to 83dd8e4 on July 6, 2021 11:31
@goodboy mentioned this pull request Jul 6, 2021
Base automatically changed from transport_cleaning to master July 6, 2021 12:20
goodboy added 20 commits July 6, 2021 08:23
This mostly adds the api described in
#53 (comment)

The first draft summary:
- formalize bidir streaming using the `trio.Channel` style interface
  which we derive as a `MsgStream` type.
- add `Portal.open_context()` which provides a `trio.Nursery.start()`
  remote task invocation style for setting up and tearing down task
  contexts in remote actors.
- add a distinct `'started'` message to the ipc protocol to facilitate
  `Context.start()` with a first return value.
- for our `ReceiveMsgStream` type, don't cancel the remote task in
  `.aclose()`; this is now done explicitly by the surrounding `Context`
   usage: `Context.cancel()`.
- streams in either direction still use a `'yield'` message keeping the
  proto mostly symmetric without having to worry about which side is the
  caller / portal opener.
- subtlety: only allow sending a `'stop'` message during a 2-way
  streaming context from `ReceiveStream.aclose()`, detailed comment
  with explanation is included.

Relates to #53
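
For a rough feel of the callee side of the API summarized above, a sketch might look like the following. `ctx.started()` and `ctx.open_stream()` are assumed helper names — the summary only commits to the new `'started'` message and the `MsgStream` type, so the final spellings may differ.

```python
import tractor


@tractor.context
async def counter(
    ctx,               # a tractor.Context injected for 2-way streaming
    maxval: int = 5,
) -> int:
    # deliver a first value back to the caller's open_context() block
    # via the new 'started' ipc message
    await ctx.started(0)

    # open the bidir stream; both directions still use 'yield' messages
    async with ctx.open_stream() as stream:
        for i in range(1, maxval + 1):
            await stream.send(i)

    # the final result the caller can await via Context.result()
    return maxval
```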
Add clear teardown semantics for `Context` such that the remote side
cancellation propagation happens only on error or if client code
explicitly requests it (either by exit flag to `Portal.open_context()`
or by manually calling `Context.cancel()`).  Add `Context.result()`
to wait on and capture the final result from a remote context function;
any lingering msg sequence will be consumed/discarded.

Changes in order to make this possible:
- pass the runtime msg loop's feeder receive channel in to the context
  on the calling (portal opening) side such that a final 'return' msg
  can be waited upon using `Context.result()` which delivers the final
  return value from the callee side `@tractor.context` async function.
- always await a final result from the target context function in
  `Portal.open_context()`'s `__aexit__()` if the context has not
  been (requested to be) cancelled by client code on block exit.
- add an internal `Context._cancel_called` for context "cancel
  requested" tracking (much like `trio`'s cancel scope).
- allow flagging a stream as terminated using an internal
  `._eoc` flag which will mark the stream as stopped for iteration.
- drop `StopAsyncIteration` catching in `.receive()`; it does
  nothing.
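
And the caller side, under the same naming assumptions (the actor-spawning calls here just follow tractor's usual example style and aren't specified by this commit): open the remote context, consume the stream, and only cancel the remote side explicitly; otherwise `__aexit__()` awaits the final result.

```python
import trio
import tractor


async def main() -> None:
    async with tractor.open_nursery() as an:
        portal = await an.start_actor('counts', enable_modules=[__name__])

        # `counter` is the @tractor.context function from the previous sketch
        async with portal.open_context(counter, maxval=5) as (ctx, first):
            assert first == 0

            async with ctx.open_stream() as stream:
                async for msg in stream:
                    print('got', msg)

            # remote cancellation is only requested explicitly:
            # await ctx.cancel()
            # otherwise the final 'return' msg is awaited on block exit
            result = await ctx.result()
            assert result == 5

        await portal.cancel_actor()


if __name__ == '__main__':
    trio.run(main)
```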
goodboy (Owner, Author) commented Jul 6, 2021

Looking at this last run it appears as though it's the discovery tests causing the hang now?

I think we need a manual tester to come in and check it out.

@goodboy force-pushed the bi_streaming_no_debugger_stuff branch from f5af130 to 4157d98 on July 6, 2021 16:54
@goodboy force-pushed the bi_streaming_no_debugger_stuff branch 2 times, most recently from fbcd253 to 929b6dc on July 6, 2021 17:54
goodboy (Owner, Author) commented Jul 6, 2021

Ok going to increment in from #220 onto the CI_increment_for_windows_bidirstreaming branch to see if we can bisect the solution 😂

goodboy (Owner, Author) commented Jul 7, 2021

As per whatever dark magic is going on in this patch set, this should solve the Windows CI hangs?

goodboy (Owner, Author) commented Jul 7, 2021

Lol true 🤯 right there.

goodboy (Owner, Author) commented Jul 8, 2021

Aight, gonna consider this a clean run since the one flaky test is just nondeterminism that'll be removed in #220 😎

Last couple things I'm hoping to get in here:

@goodboy mentioned this pull request Jul 31, 2021
@goodboy force-pushed the bi_streaming_no_debugger_stuff branch from 47b0d32 to a35b938 on July 31, 2021 15:28
@goodboy force-pushed the bi_streaming_no_debugger_stuff branch from a35b938 to 240f591 on July 31, 2021 16:10
@@ -378,10 +473,24 @@ async def _stream_handler(
        try:
            await self._process_messages(chan)
        finally:

            # channel cleanup sequence
goodboy (Owner, Author):
Ahh dang I guess we can drop this?

goodboy (Owner, Author):

Oh wait, no this is a reminder comment of sorts.

@goodboy merged commit 54d8c93 into master Jul 31, 2021
@goodboy deleted the bi_streaming_no_debugger_stuff branch July 31, 2021 16:27