Interactive, online analysis tool (devname: w_ipython) #18
@ajoshpratt It's an interesting idea, and certainly with newer versions of IPython (>5.0) that have multi-line editing, it could be helpful. I know, however, that I tend to do most of my analysis in a Jupyter notebook when possible. This usually involves moving a copy of the data (or some relevant intermediate result) to my local machine, so I can see the advantage of having something that can be run remotely from the command line. Something I've never explored, but that might be relevant, is running a remote Jupyter notebook kernel and then attaching a local browser to it so you get the best of both worlds: http://jupyter-notebook.readthedocs.io/en/latest/public_server.html Again, I've never done this, so there might be some major limitations, but maybe it's worth looking at so users could potentially leverage all of the niceties of the notebook and also have full-fledged plotting capabilities.

Also, I wanted to note from a workflow standpoint that I'd discourage you from having a generic [...]

But more generally, I think the WESTPA team should have a well-defined workflow for adding features. Other big projects spell them out in the docs: http://scikit-learn.org/stable/developers/contributing.html I know this is diverging from the main topic of the issue, but to keep the code maintainable over the long term, I think it behooves us to have a well-defined process that includes automated test running and pull requests.
@synapticarbors, thanks for the workflow suggestion. I agree; we don't have a well-defined workflow, so it's easy to stumble into a development situation where changes and fixes end up getting built on top of each other without getting merged. For what it's worth, I'd been thinking about breaking development of this off into another branch to keep this one focused on changes to the kinetics code, but hadn't decided if it was worth it. Development sins aside, though, I wanted feedback before making any more changes. I'll be opening another topic on the kinetics changes soon, once I can work through the writeup and document why the changes are necessary (as well as cleaning up the code).

Anyway, the suggestion about leveraging Jupyter is worth looking into; the information about other people's workflows is also nice to hear. We could consider creating 'easy to import' modules (and sample notebooks) that could work with a Jupyter notebook but greatly simplify analysis for new users (exposing the same sort of data we're doing here). Actually, that's probably pretty straightforward with this tool; when the object is created, it does all the work necessary to prepare the various datasets. I suppose you'd really only have to:
Which is something I hadn't really thought of before. There are some 'convenience' plotting functions that take advantage of matplotlib that would already work reasonably well here.

Adam
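To sketch what that might look like from inside a notebook (the import path, class name, and constructor below are placeholders for whatever this tool ends up exposing, not a settled API):

```python
# Hypothetical sketch only: the import path, class name, and constructor are
# placeholders, not an existing API.
import os

# The environment normally set up by $WEST_ROOT/bin/west (WEST_SIM_ROOT, etc.)
# would need to be in place before importing any of the WESTPA machinery.
os.environ.setdefault('WEST_SIM_ROOT', os.getcwd())

from w_ipython import WIPython   # placeholder name

# Creating the object would do the same preparation it does on the command line:
# read west.cfg, open west.h5, and run the requested analyses.
w = WIPython(rcfile='west.cfg')

# From here the notebook has the same 'w' object the interactive prompt provides,
# plus inline matplotlib plotting via the existing convenience functions.
```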
Hi Josh,

Thanks for bringing up these ideas of yours again about the workflow.

Best,
Lillian T. Chong
Looking around, it seems it's a little difficult (if not impossible) to cleanly launch an interface-agnostic IPython notebook from a script*, and impossible to attach a notebook to an already-running kernel.
There may be a magic command, but it's probably much easier to simply modify the west script in $WEST_ROOT/bin to accept a '--notebook' flag that launches a Jupyter notebook. The user could then create a notebook, import the module (we could provide examples of how to do this), and work with the convenience functions in w_ipython, if they wanted. On the user end, this takes care of all the variable setting that is required to launch a WESTPA script. On our end, it's not that difficult, either: the WEST script already accepts flags (strace, etc.) that aren't passed on to the python binary, so the framework is there, so to speak. Seems to work well enough.

The user could then launch w_ipython --notebook to start a Jupyter notebook, or just w_ipython to drop into an interactive prompt. Still thinking of a good name for this. Also, you can tell that I started from w_kinavg as a base for this, given that it's still named Kinetics. Hah.
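A rough sketch of how that dispatch could look on the Python side (illustrative only; the real flag handling would live in the shell wrapper under $WEST_ROOT/bin, and none of the names here are settled):

```python
# Illustrative sketch only: flag names and behavior here are assumptions, not the
# implemented interface.
import argparse
import subprocess
import sys


def build_analysis_object():
    """Placeholder for the west.cfg/west.h5 setup the tool actually performs."""
    return object()


def main():
    parser = argparse.ArgumentParser(prog='w_ipython')
    parser.add_argument('--notebook', action='store_true',
                        help='launch a Jupyter notebook server instead of an '
                             'interactive prompt')
    args, remaining = parser.parse_known_args()

    if args.notebook:
        # Hand off to Jupyter; the user then imports the analysis module inside a
        # notebook, as sketched earlier in the thread.
        sys.exit(subprocess.call(['jupyter', 'notebook'] + remaining))
    else:
        # Drop into IPython with the prepared object visible in the local namespace.
        import IPython
        w = build_analysis_object()  # noqa: F841 (exposed to the embedded shell)
        IPython.embed(header="Simulation data loaded into 'w'.")


if __name__ == '__main__':
    main()
```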
Hi all,
Inside of the DEVELOPMENT branch of west_tools, I've started work on an interactive tool to ease interactive and automated analysis; the current name is w_ipython. I'm totally up for a different name.
The idea stemmed from the fact that we routinely needed to access the raw h5 data, either to debug a simulation or to analyze it in a way that the tools don't currently (and probably won't ever) support. It would have been nice, I figured, to have a script that would just load up the main h5 file (typically west.h5) and avoid having to import numpy and h5py, load up the iterations, and so on, and maybe throw in a few convenience functions.
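For reference, the sort of boilerplate I mean looks roughly like this (the group and dataset names reflect the usual west.h5 layout as I understand it; worth double-checking against your own file):

```python
# The by-hand boilerplate the tool is meant to absorb. Group/dataset names follow
# the usual west.h5 layout; check with h5ls if your file differs.
import h5py
import numpy as np

n_iter = 10
with h5py.File('west.h5', 'r') as f:
    iter_group = f['iterations/iter_{:08d}'.format(n_iter)]
    pcoord = iter_group['pcoord'][...]        # (n_segs, n_timepoints, pcoord_dim)
    seg_index = iter_group['seg_index'][...]  # per-segment weight, parent, status, ...
    weights = seg_index['weight']

print('iteration {}: {} segments, total weight {:.6f}'.format(
    n_iter, pcoord.shape[0], np.sum(weights)))
```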
It sort of grew from there. It currently looks through the main configuration file (west.cfg), pulls in analysis parameters, runs whatever analysis functions it needs to, and drops you at an IPython prompt with a 'w' object that contains all the information from your simulation and the analyses you've selected to run.
The initial and current development goals, as well as their implementation, are as follows.
Some issues that would need to be ironed out before release:
It's calling functionality from other code whenever it can, for the most part, so it should be easy enough to maintain.
A few screenshots or configuration options, for the unbelievers:
Inside my west.cfg:
Startup, selecting iteration, and what's available in the current iteration:
Plotting from state 0 to 1 from the reweighting code:
Output from a trace. Easily plotted with pyplot, if one chose to do so:
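To give a flavor of the pyplot route: assuming a trace comes back as per-iteration arrays of progress-coordinate values and walker weights (stand-in data below; the exact output format is an assumption on my part), plotting it takes only a few lines:

```python
# Stand-in data shaped like a trace: one pcoord value and one weight per iteration.
# The actual structure of the trace output may differ; this only shows how little
# pyplot code is involved.
import numpy as np
import matplotlib.pyplot as plt

iterations = np.arange(1, 51)
pcoord = 5.0 + np.cumsum(np.random.normal(0.0, 0.1, size=iterations.size))
weights = np.full(iterations.size, 1e-3)

fig, (ax0, ax1) = plt.subplots(2, 1, sharex=True)
ax0.plot(iterations, pcoord)
ax0.set_ylabel('progress coordinate')
ax1.semilogy(iterations, weights)
ax1.set_ylabel('walker weight')
ax1.set_xlabel('iteration')
plt.show()
```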
Comments, suggestions, criticisms, design ideas, usability concerns, etc., are all appreciated. It's worth noting that all the tools have been updated so that they can run according to a particular 'analysis scheme' (in addition to their normal functionality), so it should be easy to integrate into an existing workflow. One can also pass the 'analyze only' flag to just run everything and call it a day.
Adam