@Alex-ley if you are talking about scatter plots, I'll shamelessly point to my Medium article: https://towardsdatascience.com/how-to-create-fast-and-accurate-scatter-plots-with-lots-of-data-in-python-a1d3f578e551
I have a use case where I want to plot millions of data points, and it's painfully slow with matplotlib, so I was wondering if this package/approach could be used to parallelize it. As I understand it, you generate N distinct (sub)plots/figures and then stitch them together once they're all done. Do you think it would be possible to instead plot the same axes N times (say N=4, i.e. 4 copies), split the data into N chunks based on their values (top left, top right, bottom left, bottom right), which should be pretty performant, plot each chunk onto its own copy of the axes, then cut those images down to their populated chunk and stitch the chunks back together into a final image? 🤔 Seems plausible, but I don't know if it's worth it. It might be better to just sample the data 😂 Anyway, just thought I'd ask if you'd ever considered something like this?
Edit: Transparent background images might also be possible and simpler than cutting up the parallel images 🤔
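To make the idea concrete, here's a minimal sketch of what the quadrant approach could look like with plain matplotlib, multiprocessing, and Pillow (not this package's API). The data, axis limits, chunking rule, and figure settings below are all made up for illustration, and each worker renders its chunk on a transparent background so the images can simply be alpha-composited rather than cut up and stitched.

```python
# Rough sketch: split points into 4 spatial chunks, render each chunk in a
# separate process onto identical axes with a transparent background, then
# alpha-composite the renders into one image. Everything here is illustrative.
import io
from multiprocessing import Pool

import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so rendering works in worker processes
import matplotlib.pyplot as plt
from PIL import Image

# Shared axis limits and figure settings so every worker draws pixel-identical axes.
XLIM, YLIM = (0.0, 1.0), (0.0, 1.0)
FIGSIZE, DPI = (8, 8), 100


def render_chunk(chunk):
    """Plot one chunk of points on a transparent figure and return PNG bytes."""
    x, y = chunk
    fig, ax = plt.subplots(figsize=FIGSIZE, dpi=DPI)
    ax.set_xlim(*XLIM)
    ax.set_ylim(*YLIM)
    ax.scatter(x, y, s=1, alpha=0.5)
    buf = io.BytesIO()
    # transparent=True keeps the background transparent, so the per-chunk
    # images can be composited directly instead of being cut up.
    fig.savefig(buf, format="png", transparent=True)
    plt.close(fig)
    return buf.getvalue()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.random(4_000_000)
    y = rng.random(4_000_000)

    # Split the data into 4 spatial chunks: bottom-left, top-left, bottom-right, top-right.
    left, bottom = x < 0.5, y < 0.5
    chunks = [
        (x[m], y[m])
        for m in (left & bottom, left & ~bottom, ~left & bottom, ~left & ~bottom)
    ]

    with Pool(4) as pool:
        pngs = pool.map(render_chunk, chunks)

    # Alpha-composite the four transparent renders into one final image.
    final = Image.open(io.BytesIO(pngs[0])).convert("RGBA")
    for png in pngs[1:]:
        final = Image.alpha_composite(final, Image.open(io.BytesIO(png)).convert("RGBA"))
    final.save("scatter_combined.png")
```

Whether something like this actually beats a single scatter call would depend on how much of the time goes into drawing the points versus per-figure overhead, so it would need profiling against simply sampling the data.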