All the intro examples I looked at seem to run just one benchmark, e.g.

```
run: simulate * analyze * score
```

but in practice I see definitions like this:

```
run:
  default: data * sim_gaussian * get_sumstats * ((susie_z, susie_oracle) * (score_susie, plot_susie), dap_z * (score_dap, plot_dap), finemap * (score_finemap, plot_finemap))
  null: data * sim_gaussian_null * get_sumstats * ((susie_z, susie_oracle) * (score_susie, plot_susie), finemap * (score_finemap, plot_finemap))
```

presumably to make it easier to run more than one "benchmark" (collection of pipelines). Some questions:

- Is this documented somewhere?
- Do you have to name them if you define more than one benchmark?
- Do we have guidance on best practices here? It seems like it could get complicated.
- Does DSC automatically share results across different benchmarks like this when it can?
> Do you have to name them if you define more than one benchmark?

No, it is optional. See the other examples on that page that involve multiple pipelines yet do not name them.
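If I follow, the unnamed form would look something like this (a sketch only; the module names are made up, and I am assuming comma-separated pipelines are accepted directly under `run:`):

```
run:
  data * method_a * score,
  data * method_b * score
```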
> Does DSC automatically share results across different benchmarks like this when it can?

DSC always builds one DAG by consolidating all pipelines, so it does figure out the possible "sharing" and avoids rerunning or re-queuing those steps. Is this your question?
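To illustrate the idea (this is a toy sketch of prefix-sharing in a consolidated DAG, not DSC's actual code): if a node is identified by its full chain of upstream steps, then pipelines that share a prefix collapse onto the same nodes, and only genuinely distinct work remains.

```python
# Toy illustration of pipeline consolidation (not DSC's implementation):
# a node is identified by the step plus everything upstream of it, so
# shared prefixes across pipelines collapse into a single node.

def consolidate(pipelines):
    """Return the set of unique DAG nodes across all pipelines."""
    nodes = set()
    for pipeline in pipelines:
        for i in range(len(pipeline)):
            # Node identity = the step's full ancestry chain.
            nodes.add(tuple(pipeline[: i + 1]))
    return nodes

# Module names borrowed from the run definitions above:
default = ["data", "sim_gaussian", "get_sumstats", "susie_z"]
null = ["data", "sim_gaussian_null", "get_sumstats", "susie_z"]

dag = consolidate([default, null])

# Only "data" is shared; "get_sumstats" appears twice in the DAG because
# its upstream inputs differ, so it genuinely has to run twice.
print(len(dag))  # 7 nodes instead of 8
```

Note the design point this captures: sharing happens at the level of identical upstream chains, so a step reused with different inputs is still rerun.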
> Do we have guidance on best practices here? It seems like it could get complicated

I'm not sure; things in DSC can be "my practice", not necessarily the best practice. I'm always open to suggestions, but the current setup seems to work for my purposes.