Unify naming of the inference session on the backend.
ondrej33 committed Dec 10, 2024
1 parent 16c2cae commit 0303825
Showing 30 changed files with 212 additions and 211 deletions.
10 changes: 5 additions & 5 deletions project-docs/ARCHITECTURE.md
@@ -28,8 +28,8 @@ We first list the most important components of the project repository, mainly va
- `main.ts` | The entry point for all JavaScript code. Later we'll probably want different entry points for different windows, but for now it's just a single file.
- `aeon_events.ts` and `aeon_state.ts`| The front-end part of the event-based communication. The first one defines the event processing mechanisms in general, the second defines particular structure with event wrapper API.
- `html/window.html` | The default HTML "wrapper" that is extended by individual windows. Later, we'll probably need to add other wrappers for things like dialog windows.
- `html/component-editor` | A directory with HTML files of self-contained components for sketch editor workflow.
- `html/component-analysis` | A directory with HTML files of self-contained components for analysis workflow.
- `html/component-editor` | A directory with HTML files of self-contained components for sketch editor workflows.
- `html/component-analysis` | A directory with HTML files of self-contained components for analysis workflows.
- `html/util` | A directory with various (TypeScript) utilities, data interfaces, and so on.
- `html` | Other HTML content goes here. For now, this includes various windows.
- `assets` | Any images/icons/whatever.
@@ -58,12 +58,12 @@ Note that unit tests are mostly defined within the same file as the tested funct
- `eval_dynamic` | Algorithms and wrappers for evaluation of dynamic properties.
- `eval_static` | Algorithms and wrappers for evaluation of static properties.
- `fo_logic` | Parsing and evaluation for the FOL formulas.
- `analysis` | Module to handle the state and high-level computation for the inference analysis session.
- `inference` | Module to handle the state and high-level computation for the inference session.
- `_test_inference` | Tests for the whole inference computation pipeline.
- `app` | Module defining the core structures and traits behind the application's architecture (like sessions, events, undo-redo stack).
- `state` | Structures for managing application state and event handling.
- `analysis` | Skeleton of the analysis session.
- `editor` | Skeleton of the editor session.
- `inference` | Skeleton of the top-level inference session structure.
- `editor` | Skeleton of the top-level editor session structure.
- `bin` | Sources for additional binaries that can be run from CLI.
- `run_inference.rs` | Program for running the inference on the given sketch from CLI.
- `sketchbook` | Module to handle the state of the sketch and the editor session.
30 changes: 15 additions & 15 deletions project-docs/events-structure.md
@@ -308,33 +308,33 @@ Events are structured in a hierarchical manner to simplify navigation and unders


---
## Analysis Workflow Events
## Inference Session Events

#### State Related Events
- **Path**: `['analysis', 'refresh_sketch']`
- **Path**: `['inference', 'refresh_sketch']`
- **Description**: Request the current sketch data from the backend.
- **Payload**: None

#### Inference Computation Events
- **Path**: `['analysis', 'start_full_inference']`
- **Path**: `['inference', 'start_full_inference']`
- **Description**: Start a full inference analysis.
- **Payload**: None
- **Path**: `['analysis', 'start_static_inference']`
- **Path**: `['inference', 'start_static_inference']`
- **Description**: Start an inference analysis using static properties only.
- **Payload**: None
- **Path**: `['analysis', 'start_dynamic_inference']`
- **Path**: `['inference', 'start_dynamic_inference']`
- **Description**: Start an inference analysis using dynamic properties only.
- **Payload**: None
- **Path**: `['analysis', 'reset_analysis']`
- **Description**: Reset the current analysis and start again using the same sketch.
- **Path**: `['inference', 'reset_inference']`
- **Description**: Reset the current inference and start again using the same sketch.
- **Payload**: None
- **Path**: `['analysis', 'ping_for_results']`
- **Path**: `['inference', 'ping_for_results']`
- **Description**: Check if inference results are ready.
- **Payload**: None

#### Inference Results Events
- **Path**: `['analysis', 'sample_networks']`
- **Description**: Sample Boolean networks from the analysis results.
- **Path**: `['inference', 'sample_networks']`
- **Description**: Sample Boolean networks from the inference results.
- **Payload**:
```json
{
@@ -343,14 +343,14 @@
"path": "string"
}
```
- **Path**: `['analysis', 'dump_results']`
- **Description**: Save the analysis results to a specified path, including the sketch and other related data.
- **Path**: `['inference', 'dump_results']`
- **Description**: Save the inference results to a specified path, including the sketch and other related data.
- **Payload**: `{ "path": "string" }`
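
The event paths above map directly onto the path-plus-payload scheme used on the backend. Below is a minimal, self-contained illustration of that addressing; the `Event` struct is a simplified stand-in for the application's own event type (which is built the same way via its `Event::build(path, payload)` helper, visible later in this diff), and the payload value is only an example.

```rust
// Simplified stand-in for the backend event type (not the project's real struct).
#[derive(Debug)]
struct Event {
    path: Vec<String>,
    payload: Option<String>,
}

impl Event {
    // Mirrors the shape of the backend helper `Event::build(path, payload)`.
    fn build(path: &[&str], payload: Option<&str>) -> Event {
        Event {
            path: path.iter().map(|s| s.to_string()).collect(),
            payload: payload.map(|s| s.to_string()),
        }
    }
}

fn main() {
    // `['inference', 'dump_results']` carries a `{ "path": "string" }` payload.
    let dump = Event::build(
        &["inference", "dump_results"],
        Some(r#"{ "path": "./results.zip" }"#),
    );

    // Computation events such as `['inference', 'start_full_inference']` carry no payload.
    let start = Event::build(&["inference", "start_full_inference"], None);

    println!("{dump:?}\n{start:?}");
}
```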


---
## Error Events
- **Path**: `['error', 'generic']`
- **Path**: `['error']`
- **Description**: Receive a generic error message from the backend.
- **Payload**:
@@ -362,8 +362,8 @@

---
## New Session Events
- **Path**: `['new_session', 'create']`
- **Description**: Create a new analysis session.
- **Path**: `['new-inference-session']`
- **Description**: Create a new inference session.
- **Payload**: None

---
4 changes: 0 additions & 4 deletions src-tauri/src/app/state/analysis/mod.rs

This file was deleted.

4 changes: 2 additions & 2 deletions src-tauri/src/app/state/editor/_state_editor_session.rs
@@ -9,7 +9,7 @@ use crate::sketchbook::Sketch;
/// The state of one editor session.
///
/// An editor session is the "main" app session where a model is created/edited and from which
/// different analysis sessions can be started.
/// other sessions can be started.
pub struct EditorSession {
id: String,
undo_stack: UndoStack,
@@ -41,7 +41,7 @@ impl StackSession for EditorSession {
// todo: make this `mut` when we have some cases here that could mutate state
let reset_stack = false;

// request from new Analysis session for sending a sketch
// request from new Inference session for sending a sketch
let result = if path == vec!["send_sketch".to_string()] {
let sketch_string = self.sketch.to_custom_json();
let response_msg = SessionMessage {
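
For orientation, this is roughly the exchange the code above takes part in, reduced to a self-contained sketch with simplified stand-in types (the real code uses `SessionMessage`, `Sketch::to_custom_json`/`from_custom_json`, and the session structs shown in this diff): a newly created inference session asks the editor for the sketch via `send_sketch`, and the editor replies with `sketch_sent` carrying the serialized sketch.

```rust
// Simplified stand-ins, not the project's real types.
#[derive(Clone, Debug)]
struct SessionMessage {
    path: Vec<String>,
    payload: Option<String>,
}

struct EditorSession {
    sketch_json: String, // stands in for `self.sketch.to_custom_json()`
}

struct InferenceSession {
    sketch_json: Option<String>,
}

impl EditorSession {
    fn process_message(&self, message: &SessionMessage) -> Option<SessionMessage> {
        if message.path == ["send_sketch"] {
            // Reply so the new inference session can load the current sketch.
            Some(SessionMessage {
                path: vec!["sketch_sent".to_string()],
                payload: Some(self.sketch_json.clone()),
            })
        } else {
            None
        }
    }
}

impl InferenceSession {
    fn process_message(&mut self, message: &SessionMessage) {
        if message.path == ["sketch_sent"] {
            // The real session parses this payload into a `Sketch` and then
            // emits an `["inference", "get_sketch"]` refresh event.
            self.sketch_json = message.payload.clone();
        }
    }
}

fn main() {
    let editor = EditorSession { sketch_json: r#"{"model": "..."}"#.to_string() };
    let mut inference = InferenceSession { sketch_json: None };

    let request = SessionMessage { path: vec!["send_sketch".to_string()], payload: None };
    if let Some(reply) = editor.process_message(&request) {
        inference.process_message(&reply);
    }
    assert!(inference.sketch_json.is_some());
}
```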
@@ -1,32 +1,32 @@
use crate::analysis::analysis_state::AnalysisState;
use crate::app::event::{Event, SessionMessage, StateChange};
use crate::app::state::_undo_stack::UndoStack;
use crate::app::state::{Consumed, SessionHelper, SessionState, StackSession};
use crate::app::{AeonError, DynError};
use crate::debug;
use crate::inference::inference_state::InferenceState;
use crate::sketchbook::data_structs::SketchData;
use crate::sketchbook::{JsonSerde, Sketch};

/// The state of one editor session.
///
/// An analysis session is the session where the process of the inference is run on a given model.
pub struct AnalysisSession {
/// An inference session is the session where the process of the inference is run on a given model.
pub struct InferenceSession {
id: String,
undo_stack: UndoStack,
analysis_state: AnalysisState,
inference_state: InferenceState,
}

impl AnalysisSession {
pub fn new(id: &str) -> AnalysisSession {
AnalysisSession {
impl InferenceSession {
pub fn new(id: &str) -> InferenceSession {
InferenceSession {
id: id.to_string(),
undo_stack: UndoStack::default(),
analysis_state: AnalysisState::new_empty(),
inference_state: InferenceState::new_empty(),
}
}
}

impl StackSession for AnalysisSession {
impl StackSession for InferenceSession {
fn process_message(
&mut self,
message: &SessionMessage,
@@ -42,21 +42,21 @@ impl StackSession for AnalysisSession {
if let Some(sketch_payload) = message.message.payload.clone() {
let sketch = Sketch::from_custom_json(&sketch_payload)?;
reset_stack = true;
self.analysis_state.set_sketch(sketch);
self.inference_state.set_sketch(sketch);
} else {
panic!("Message `sketch_sent` must always carry a payload.")
}

// no backend response is expected, but we must send refresh event to inform frontend
// about the state change
let sketch_data = SketchData::new_from_sketch(self.analysis_state.get_sketch());
let sketch_data = SketchData::new_from_sketch(self.inference_state.get_sketch());
let payload = sketch_data.to_json_str();
let state_change = StateChange {
events: vec![Event::build(&["analysis", "get_sketch"], Some(&payload))],
events: vec![Event::build(&["inference", "get_sketch"], Some(&payload))],
};
Ok((None, Some(state_change)))
} else {
let error_msg = format!("`AnalysisSession` cannot process path {:?}.", path);
let error_msg = format!("`InferenceSession` cannot process path {:?}.", path);
AeonError::throw(error_msg)
};

@@ -83,14 +83,14 @@
}
}

impl SessionHelper for AnalysisSession {}
impl SessionHelper for InferenceSession {}

impl SessionState for AnalysisSession {
impl SessionState for InferenceSession {
fn perform_event(&mut self, event: &Event, at_path: &[&str]) -> Result<Consumed, DynError> {
if let Some(at_path) = Self::starts_with("undo_stack", at_path) {
self.undo_stack.perform_event(event, at_path)
} else if let Some(at_path) = Self::starts_with("analysis", at_path) {
self.analysis_state.perform_event(event, at_path)
} else if let Some(at_path) = Self::starts_with("inference", at_path) {
self.inference_state.perform_event(event, at_path)
} else {
Self::invalid_path_error_generic(at_path)
}
@@ -99,8 +99,8 @@
fn refresh(&self, full_path: &[String], at_path: &[&str]) -> Result<Event, DynError> {
if let Some(at_path) = Self::starts_with("undo_stack", at_path) {
self.undo_stack.refresh(full_path, at_path)
} else if let Some(at_path) = Self::starts_with("analysis", at_path) {
self.analysis_state.refresh(full_path, at_path)
} else if let Some(at_path) = Self::starts_with("inference", at_path) {
self.inference_state.refresh(full_path, at_path)
} else {
Self::invalid_path_error_generic(at_path)
}
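
The dispatch in `perform_event` and `refresh` above routes each event by the first segment of its path. A minimal, self-contained illustration of that pattern follows; the plain `starts_with` function here is a simplified stand-in for the project's `SessionHelper::starts_with`.

```rust
// Strip a known prefix from an event path; `None` means the path is not ours.
fn starts_with<'a>(prefix: &str, at_path: &'a [&'a str]) -> Option<&'a [&'a str]> {
    match at_path.split_first() {
        Some((head, rest)) if *head == prefix => Some(rest),
        _ => None,
    }
}

// Route an event path to the sub-state that should handle it, in the same
// order as `InferenceSession::perform_event` above.
fn route(at_path: &[&str]) -> &'static str {
    if starts_with("undo_stack", at_path).is_some() {
        "undo_stack"
    } else if starts_with("inference", at_path).is_some() {
        "inference_state"
    } else {
        "invalid path"
    }
}

fn main() {
    assert_eq!(route(&["inference", "start_full_inference"]), "inference_state");
    assert_eq!(route(&["undo_stack", "undo"]), "undo_stack");
    assert_eq!(route(&["unknown", "event"]), "invalid path");
}
```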
4 changes: 4 additions & 0 deletions src-tauri/src/app/state/inference/mod.rs
@@ -0,0 +1,4 @@
/// Declares [InferenceSession]: the root state object of the inference session.
mod _state_inference_session;

pub use _state_inference_session::InferenceSession;
6 changes: 3 additions & 3 deletions src-tauri/src/app/state/mod.rs
@@ -7,10 +7,10 @@ mod _state_atomic;
mod _state_map;
pub mod _undo_stack;

/// Declares state objects that are unique to the sketchbook analysis window.
pub mod analysis;
/// Declares state objects that are unique to the sketchbook editor window.
/// Declares top-level state objects that are unique to the sketchbook editor session.
pub mod editor;
/// Declares top-level state objects that are unique to the sketchbook inference session.
pub mod inference;

use crate::app::state::_undo_stack::UndoStack;
use crate::debug;
6 changes: 3 additions & 3 deletions src-tauri/src/bin/run_inference.rs
@@ -1,6 +1,6 @@
use biodivine_sketchbook::analysis::inference_results::InferenceResults;
use biodivine_sketchbook::analysis::inference_solver::InferenceSolver;
use biodivine_sketchbook::analysis::inference_type::InferenceType;
use biodivine_sketchbook::inference::inference_results::InferenceResults;
use biodivine_sketchbook::inference::inference_solver::InferenceSolver;
use biodivine_sketchbook::inference::inference_type::InferenceType;
use biodivine_sketchbook::logging;
use biodivine_sketchbook::sketchbook::Sketch;

@@ -1,5 +1,5 @@
use super::utils::load_test_model;
use crate::analysis::_test_inference::utils::add_dyn_prop_and_infer;
use crate::inference::_test_inference::utils::add_dyn_prop_and_infer;
use crate::sketchbook::properties::shortcuts::*;
use crate::sketchbook::properties::DynProperty;

@@ -1,4 +1,4 @@
use crate::analysis::_test_inference::utils::apply_event_fully;
use crate::inference::_test_inference::utils::apply_event_fully;
use crate::sketchbook::event_utils::mk_model_event;
use crate::sketchbook::model::Monotonicity;
use crate::sketchbook::JsonSerde;
@@ -1,5 +1,5 @@
use super::utils::load_test_model;
use crate::analysis::_test_inference::utils::add_stat_prop_and_infer;
use crate::inference::_test_inference::utils::add_stat_prop_and_infer;
use crate::sketchbook::model::{Essentiality, Monotonicity};
use crate::sketchbook::properties::shortcuts::*;
use crate::sketchbook::properties::StatProperty;
@@ -1,8 +1,8 @@
use crate::analysis::inference_results::InferenceResults;
use crate::analysis::inference_solver::InferenceSolver;
use crate::analysis::inference_type::InferenceType;
use crate::app::event::Event;
use crate::app::state::{Consumed, SessionState};
use crate::inference::inference_results::InferenceResults;
use crate::inference::inference_solver::InferenceSolver;
use crate::inference::inference_type::InferenceType;
use crate::sketchbook::properties::{DynProperty, StatProperty};
use crate::sketchbook::Sketch;
use std::fs::File;
@@ -1,6 +1,6 @@
use crate::analysis::inference_status::InferenceStatusReport;
use crate::analysis::inference_type::InferenceType;
use crate::analysis::update_fn_details::MAX_UPDATE_FN_COUNT;
use crate::inference::inference_status::InferenceStatusReport;
use crate::inference::inference_type::InferenceType;
use crate::inference::update_fn_details::MAX_UPDATE_FN_COUNT;
use crate::sketchbook::JsonSerde;
use serde::{Deserialize, Serialize};
use std::{collections::HashMap, time::Duration};
@@ -99,9 +99,9 @@ mod tests {
use std::collections::HashMap;
use std::time::Duration;

use crate::analysis::inference_results::InferenceResults;
use crate::analysis::inference_status::{InferenceStatus, InferenceStatusReport};
use crate::analysis::inference_type::InferenceType;
use crate::inference::inference_results::InferenceResults;
use crate::inference::inference_status::{InferenceStatus, InferenceStatusReport};
use crate::inference::inference_type::InferenceType;

#[test]
fn test_inference_results_summary_and_report() {
@@ -5,10 +5,10 @@ use crate::algorithms::eval_static::eval::eval_static_prop;
use crate::algorithms::eval_static::prepare_graph::prepare_graph_for_static_fol;
use crate::algorithms::eval_static::processed_props::{process_static_props, ProcessedStatProp};
use crate::algorithms::fo_logic::utils::get_implicit_function_name;
use crate::analysis::inference_results::InferenceResults;
use crate::analysis::inference_status::InferenceStatus;
use crate::analysis::inference_type::InferenceType;
use crate::debug;
use crate::inference::inference_results::InferenceResults;
use crate::inference::inference_status::InferenceStatus;
use crate::inference::inference_type::InferenceType;
use crate::sketchbook::{JsonSerde, Sketch};
use biodivine_lib_param_bn::symbolic_async_graph::{
GraphColoredVertices, GraphColors, SymbolicAsyncGraph,
@@ -172,10 +172,10 @@ impl InferenceSolver {
}
}

/// Update the status of the solver, and send a progress message to the AnalysisState
/// Update the status of the solver, and send a progress message to the InferenceState
/// instance (that started this solver).
///
/// If the channel for progress updates no longer exists (because analysis is supposed to
/// If the channel for progress updates no longer exists (because inference is supposed to
/// be reset, the window was closed, or some other reason), we instead forcibly stop the
/// computation. Destroying the channel can thus actually be used as another way to stop the
/// asynchronous computation, since one does not need to acquire lock over the whole solver.
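
The comment above describes progress reporting and cancellation in terms of a channel whose receiving end may disappear. Below is a minimal, self-contained sketch of that idea using `std::sync::mpsc` (it is not the project's actual API): the worker keeps sending progress updates, and a failed `send`, meaning the receiver was dropped, is treated as the signal to stop.

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

fn main() {
    let (progress_tx, progress_rx) = mpsc::channel::<String>();

    // "Solver" thread: runs until the progress channel is closed.
    let solver = thread::spawn(move || {
        let mut step = 0_u64;
        // A failed `send` means nobody is listening any more, so stop the work.
        while progress_tx.send(format!("finished step {step}")).is_ok() {
            step += 1;
            thread::sleep(Duration::from_millis(1));
        }
        step // number of steps completed before the cancellation
    });

    // Consume a few progress updates, then drop the receiver to cancel the run.
    for _ in 0..3 {
        println!("{}", progress_rx.recv().unwrap());
    }
    drop(progress_rx);

    let steps_done = solver.join().unwrap();
    println!("solver stopped after {steps_done} steps (cancelled via dropped channel)");
}
```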
@@ -339,11 +339,11 @@ impl InferenceSolver {
/// Run the prototype version of the inference using the given solver.
/// This wraps the [Self::run_inference_modular] to also log potential errors.
///
/// The argument `analysis_type` specifies which kind of inference should be used.
/// The argument `inference_type` specifies which kind of inference should be used.
/// Currently, we support full inference with all properties, and partial inferences with only
/// static or only dynamic properties.
///
/// The results are saved to sepcific fields of the provided solver and can be retrieved later.
/// The results are saved to specific fields of the provided solver and can be retrieved later.
/// They are also returned, which is now used for logging later.
pub async fn run_inference_async(
solver: Arc<RwLock<InferenceSolver>>,
@@ -484,7 +484,7 @@ impl InferenceSolver {
/// For example, you can only consider static properties, only dynamic properties, or all.
pub fn run_inference_modular(
&mut self,
analysis_type: InferenceType,
inference_type: InferenceType,
sketch: Sketch,
use_static: bool,
use_dynamic: bool,
@@ -600,7 +600,7 @@ impl InferenceSolver {
num_update_fn_variants_per_var(self.final_sat_colors()?, self.bn()?);
let total_time = self.total_duration().unwrap();
let results = InferenceResults::new(
analysis_type,
inference_type,
num_sat_networks,
total_time,
&summary_msg,