Phrasing suggestions by @jwallwork23
Co-authored-by: Joe Wallwork <[email protected]>
jatkinson1000 and jwallwork23 authored Feb 21, 2025
1 parent 3c6ae78 commit aeadefd
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions paper/paper.md
@@ -62,7 +62,7 @@ A central tenet of its design, in contrast to other approaches, is
that FTorch removes dependence on the Python runtime (and virtual environments).
By building on the `LibTorch` backend (written in C++ and accessible via an API) it
allows users to run ML models on both
-CPU and GPU architectures without the need for porting code to device-specific languages.
+CPU and GPU architectures without needing to port code to device-specific languages.


# Statement of need
@@ -118,7 +118,7 @@ maximise efficiency by reducing data-transfer during coupling^[i.e. the same
data in memory is used by both `LibTorch` and Fortran without creating a copy.]
and avoids any use of Python at runtime.
PyTorch types are represented through derived types in `FTorch`, with Tensors supported
-across a range of data types and ranks by using the `fypp` preprocessor [@fypp].
+across a range of data types and ranks using the `fypp` preprocessor [@fypp].
Fortran code quality is enforced using fortitude [@fortitude], alongside other tools.

We utilise the existing support in `LibTorch` for
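For readers unfamiliar with `fypp`, the templating approach referenced in the hunk above works roughly as follows. This is a hypothetical sketch rather than FTorch's actual source: the `report_value` routine and the kind list are invented for illustration, and FTorch applies the same idea across tensor ranks as well as kinds.

```fortran
#! Illustrative fypp template (hypothetical, not taken from FTorch):
#! one specific procedure is generated per real kind when fypp runs,
#! so a single template covers several data types.
#:set KINDS = ['real32', 'real64']
#:for KIND in KINDS
subroutine report_value_${KIND}$(x)
  use, intrinsic :: iso_fortran_env, only: ${KIND}$
  implicit none
  real(${KIND}$), intent(in) :: x
  print *, "kind ${KIND}$ value:", x
end subroutine report_value_${KIND}$
#:endfor
```

Running `fypp` over such a template emits one concrete subroutine per kind, which can then be collected behind a generic interface; the same loop structure extends naturally to ranks.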
@@ -155,7 +155,7 @@ call torch_delete(model_outputs)
...
```

-Included with `FTorch` is a directory of examples covering an extensive range of use
+`FTorch` includes a directory of examples covering an extensive range of use
cases.
Each guides users through a complete workflow from Python to Fortran.
These examples underpin integration testing alongside unit testing with
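For context, the `call torch_delete(model_outputs)` line visible in the final hunk sits at the end of the paper's Fortran usage listing. A minimal sketch of such a workflow is given below, assuming interface names from the FTorch examples (`torch_model_load`, `torch_tensor_from_array`, `torch_model_forward`); exact argument lists vary between FTorch releases, so treat this as illustrative rather than the paper's exact listing.

```fortran
! Minimal sketch of an FTorch inference workflow (illustrative only;
! names follow the FTorch examples, but argument lists may differ
! between releases).
program ftorch_inference_sketch
  use ftorch
  implicit none

  type(torch_model)  :: model
  type(torch_tensor) :: model_inputs(1), model_outputs(1)
  real, dimension(10), target :: in_data, out_data
  integer, parameter :: layout(1) = [1]

  in_data = 1.0

  ! Load a TorchScript model previously saved from Python (path is illustrative).
  call torch_model_load(model, "saved_model.pt", torch_kCPU)

  ! Wrap existing Fortran arrays as Torch tensors without copying the data.
  call torch_tensor_from_array(model_inputs(1), in_data, layout, torch_kCPU)
  call torch_tensor_from_array(model_outputs(1), out_data, layout, torch_kCPU)

  ! Run inference, then release the underlying LibTorch objects.
  call torch_model_forward(model, model_inputs, model_outputs)
  call torch_delete(model_inputs)
  call torch_delete(model_outputs)
  call torch_delete(model)
end program ftorch_inference_sketch
```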
