
allow using streaming query rows from promise-connection #822


Closed

Conversation

jimmyolo

https://github.com/mysqljs/mysql#streaming-query-rows
Sometimes the streaming results interface is useful. This adds streamingQuery to PromiseConnection, directly exposing the original connection.query so the streaming interface can be accessed.
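
For context, a rough sketch of what this enables, assuming the promise wrapper keeps exposing the underlying callback connection as .connection and that query events work as in the linked mysqljs/mysql README (host, credentials and table name are made up):

   const mysql = require('mysql2/promise');

   async function main() {
     const conn = await mysql.createConnection({ host: 'localhost', user: 'root', database: 'test' });
     // drop down to the underlying callback connection to stream rows one by one
     conn.connection.query('SELECT * FROM big_table')
       .on('error', (err) => console.error(err))
       .on('result', (row) => console.log(row)) // one row at a time, nothing buffered
       .on('end', () => conn.end());
   }
   main();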

@sidorares
Owner

I was thinking of having an async iterator instead of a Node stream-like API; see the example here: #677 (comment)

@jimmyolo
Author

Looks great!! Closing this.

@jimmyolo jimmyolo closed this Jul 31, 2018
@sidorares
Owner

I'll put some notes here:

The whole point of streaming is to save resources (memory and CPU). We should be able to avoid buffering rows we don't need, and potentially skip parsing the rest of the rows if we exit early from the iterator loop.

https://jakearchibald.com/2017/async-iterators-and-generators/

The way the fetch() API works is that it has two stages: the first await gives you the headers, and the second await starts reading the rest of the body. We could model querying rows similarly:
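
For comparison, the two-stage fetch() pattern looks like this (standard Fetch API, placeholder URL):

   const response = await fetch('https://example.com/users.json');
   // first await: status and headers are already available
   console.log(response.status, response.headers.get('content-type'));
   const body = await response.json(); // second await: read and parse the body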

non-streaming version:

   const response = await connection.fetchQuery('sql');
   // response.fields, response.insertId etc. are already available here
   const rows = await response.getRows();

streaming version:

   const response = await connection.fetchQuery('sql');
   // response.fields, response.insertId etc. are already available here
   for await (const row of response.rows) {
      if (row.id === '123') {
         doSomethingWith(row); // obviously not a great example, because we could do the same with WHERE id=123 in SQL, but I hope it demonstrates the point
         break; // after this we can even skip parsing row packets. We still have to receive all the packets because of the way the protocol works, but we can optimise by not de-serialising the packets we won't need
      }
   }
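
A minimal sketch of how fetchQuery could sit on top of the existing streaming machinery (fetchQuery is the proposed name, not an existing API; this assumes query.stream() returns an object-mode Readable as in mysqljs/mysql, and Node 10+, where Readables are async-iterable):

   function fetchQuery(conn, sql) {
     return new Promise((resolve, reject) => {
       const query = conn.query(sql);
       const rows = query.stream(); // object-mode Readable over the result rows
       query.once('error', reject);
       query.once('fields', (fields) => {
         // headers-first: resolve with the metadata, rows keep streaming lazily
         resolve({ fields, rows });
       });
     });
   }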

I'm still not sure if this is an API I like. Alternatively, since we already have the buffering "get all rows" API, we could model the streaming API to stream only rows:

   const rows = conn.streamQuery('sql');
   for await (const row of rows) {
     console.log(row);
   }

  • slightly shorter but less flexible. Also, in the first example, I'm not sure if I like the 'fetchQuery' name, but I don't see anything better yet. @gajus do you use async iterators / streams in your helper wrappers? Any suggestions on a potential async streaming API?
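
Under the same assumptions, streamQuery could be a thin wrapper (again a hypothetical name):

   function streamQuery(conn, sql) {
     // the object-mode Readable returned by stream() is itself the async iterable
     return conn.query(sql).stream();
   }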

@jimmyolo
Author

jimmyolo commented Aug 1, 2018

@sidorares
Owner

@victor0801x Node streams and async iterators are quite different; we need to communicate the difference clearly, and ideally have an API similar to other async streaming libraries (in my example it's fetch(), but I'd like to see more).
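
To illustrate the difference, a sketch contrasting the two consumption styles (assumes the query().stream() Readable from above and Node 10+, where any Readable is also async-iterable; pick one style per stream, so the push-based variant is commented out):

   const stream = connection.query('SELECT * FROM users').stream();

   // node stream style: push-based, rows are delivered as fast as they arrive
   // stream.on('data', (row) => console.log(row));

   // async iterator style: pull-based, the consumer controls the pacing
   for await (const row of stream) {
     console.log(row);
   }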
