Release version 0.3.1
Co-authored-by: Thomas Hoffmann <[email protected]>
Co-authored-by: Dimitri Kartsaklis <[email protected]>
Co-authored-by: Charles London <[email protected]>
4 people committed Apr 14, 2023
1 parent 54b5bfa commit 7027843
Showing 24 changed files with 533 additions and 316 deletions.
19 changes: 18 additions & 1 deletion .github/workflows/build_test.yml
@@ -11,6 +11,7 @@ on:
env:
SRC_DIR: lambeq
TEST_DIR: tests
DOCS_DIR: docs

jobs:
lint:
@@ -76,7 +77,7 @@ jobs:
--doctest-modules
--durations=50
--ignore=${{ env.TEST_DIR }}/text2diagram/test_depccg_parser.py
--ignore=docs/extract_code_cells.py
--ignore=${{ env.DOCS_DIR }}/extract_code_cells.py
- name: Determine if depccg tests should be run
# only test depccg if it is explicitly changed, since it is very slow
# tests are also disabled on Python 3.11
@@ -108,6 +109,22 @@ jobs:
if: steps.depccg-enabled.outcome == 'success'
continue-on-error: true
run: coverage run --append --source=${{ env.SRC_DIR }} -m pytest -k test_depccg_parser.py
- name: Preparation for notebook testing
run: pip install nbmake
- name: Test example notebooks
env:
TEST_NOTEBOOKS: 1
run: >
pytest --nbmake ${{ env.DOCS_DIR }}/examples/
--nbmake-timeout=60
- name: Test tutorial notebooks
env:
TEST_NOTEBOOKS: 1
run: >
pytest --nbmake ${{ env.DOCS_DIR }}/tutorials/
--nbmake-timeout=60
--ignore ${{ env.DOCS_DIR }}/tutorials/trainer_hybrid.ipynb
--ignore ${{ env.DOCS_DIR }}/tutorials/code
- name: Coverage report
run: coverage report -m
type_check:
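The new notebook-testing steps export `TEST_NOTEBOOKS=1`, and the example notebooks read that variable to shrink their workload under `pytest --nbmake`. A minimal standalone sketch of the guard pattern (the `train_data` and `EPOCHS` values here are hypothetical):

```python
import os

# Simulate the CI environment, where the workflow exports TEST_NOTEBOOKS=1.
os.environ["TEST_NOTEBOOKS"] = "1"

# The guard used by the notebooks: '0' (or unset) means a full run.
TESTING = int(os.environ.get("TEST_NOTEBOOKS", "0"))

train_data = list(range(100))  # hypothetical dataset
EPOCHS = 30                    # hypothetical full training length

if TESTING:
    # Keep the run fast enough to fit inside --nbmake-timeout=60.
    train_data = train_data[:2]
    EPOCHS = 1

print(len(train_data), EPOCHS)
```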
Binary file added docs/_static/images/Quantinuum_logo.png
35 changes: 35 additions & 0 deletions docs/clean_notebooks.py
@@ -0,0 +1,35 @@
from pathlib import Path
from itertools import chain
import nbformat as nbf


print("Cleaning notebooks...")

nbs_path = Path("examples")
tut_path = Path("tutorials")
useful_metadata = ["nbsphinx", "raw_mimetype"]

for file in chain(nbs_path.iterdir(), tut_path.iterdir()):
if not (file.is_file() and file.suffix == ".ipynb"):
continue

ntbk = nbf.read(file, nbf.NO_CONVERT)

for cell in ntbk.cells:
# Delete cell ID if it's there
cell.pop("id", None)

# Keep only useful metadata
new_metadata = {x: cell.metadata[x]
for x in useful_metadata
if x in cell.metadata}
cell.metadata = new_metadata

ntbk.metadata = {"language_info": {"name": "python"}}

# We need the version of nbformat to be x.4, otherwise cell IDs
# are regenerated automatically
ntbk.nbformat = 4
ntbk.nbformat_minor = 4

nbf.write(ntbk, file, version=nbf.NO_CONVERT)
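The effect of the ID and metadata stripping in `clean_notebooks.py` can be illustrated on a plain dict, since nbformat cells support the same `pop` and mapping operations (the cell contents below are hypothetical):

```python
useful_metadata = ["nbsphinx", "raw_mimetype"]

cell = {
    "id": "7f3a2b",  # auto-generated cell ID (hypothetical)
    "cell_type": "code",
    "metadata": {"nbsphinx": "hidden", "collapsed": True},
    "source": "print('hello')",
}

# Delete the cell ID if it's there.
cell.pop("id", None)

# Keep only the useful metadata keys.
cell["metadata"] = {k: v for k, v in cell["metadata"].items()
                    if k in useful_metadata}

print(cell["metadata"])  # only 'nbsphinx' survives; 'collapsed' is dropped
```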
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -46,7 +46,7 @@
]

intersphinx_mapping = {
'discopy': ("https://discopy.readthedocs.io/en/0.5/", None),
'discopy': ("https://docs.discopy.org/en/0.5.1.1/", None),
'pennylane': ("https://pennylane.readthedocs.io/en/stable/", None),
}

9 changes: 3 additions & 6 deletions docs/discopy.rst
@@ -3,14 +3,11 @@
DisCoPy
=======

While the :ref:`parser <sec-parsing>` provides ``lambeq``'s input, *DisCoPy* [#f1]_ [FTC2020]_ is ``lambeq``'s underlying engine, the component where all the low-level processing takes place. At its core, DisCoPy is a Python library that allows computation with :term:`monoidal categories <monoidal category>`. The main data structure is that of a *monoidal diagram*, or :ref:`string diagram <sec-string-diagrams>`, which is the format that ``lambeq`` uses internally to encode a sentence (:py:class:`discopy.rigid.Diagram`). DisCoPy makes this easy, by offering many language-related features, such as support for :term:`pregroup grammars <pregroup grammar>` and :term:`functors <functor>` for implementing :term:`compositional models <compositional model>` such as :term:`DisCoCat`. Furthermore, from a quantum computing perspective, DisCoPy provides abstractions for creating all standard :term:`quantum gates <quantum gate>` and building :term:`quantum circuits <quantum circuit>`, which are used by ``lambeq`` in the final stages of the :ref:`pipeline <sec-pipeline>`.
While the :ref:`parser <sec-parsing>` provides ``lambeq``'s input, `DisCoPy <https://discopy.org>`_ [FTC2020]_ is ``lambeq``'s underlying engine, the component where all the low-level processing takes place. At its core, DisCoPy is a Python library that allows computation with :term:`monoidal categories <monoidal category>`. The main data structure is that of a *monoidal diagram*, or :ref:`string diagram <sec-string-diagrams>`, which is the format that ``lambeq`` uses internally to encode a sentence (:py:class:`discopy.rigid.Diagram`). DisCoPy makes this easy, by offering many language-related features, such as support for :term:`pregroup grammars <pregroup grammar>` and :term:`functors <functor>` for implementing :term:`compositional models <compositional model>` such as :term:`DisCoCat`. Furthermore, from a quantum computing perspective, DisCoPy provides abstractions for creating all standard :term:`quantum gates <quantum gate>` and building :term:`quantum circuits <quantum circuit>`, which are used by ``lambeq`` in the final stages of the :ref:`pipeline <sec-pipeline>`.

Thus, it is not a surprise that the advanced use of ``lambeq``, involving extending the toolkit with new :term:`compositional models <compositional model>` and :term:`ansätze <ansatz (plural: ansätze)>`, requires some familiarity with DisCoPy. For this, you can use the following resources:

- For a gentle introduction to basic DisCoPy concepts, start with ``lambeq``'s tutorial :ref:`sec-advanced`.
- The `basic example notebooks <https://discopy.readthedocs.io/en/main/notebooks.basics.html>`_ in DisCoPy documentation provide another good starting point.
- The `advanced tutorials <https://discopy.readthedocs.io/en/main/notebooks.advanced.html>`_ in DisCoPy documentation can help you to delve further into DisCoPy.
- The `basic example notebooks <https://docs.discopy.org/en/0.5.1.1/notebooks.basics.html>`_ in DisCoPy documentation provide another good starting point.
- The `advanced tutorials <https://docs.discopy.org/en/0.5.1.1/notebooks.advanced.html>`_ in DisCoPy documentation can help you to delve further into DisCoPy.

.. rubric:: Footnotes

.. [#f1] https://github.com/oxford-quantum-group/discopy
19 changes: 18 additions & 1 deletion docs/examples/classical_pipeline.ipynb
@@ -59,6 +59,23 @@
"test_labels, test_data = read_data('datasets/mc_test_data.txt')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"nbsphinx": "hidden"
},
"outputs": [],
"source": [
"TESTING = int(os.environ.get('TEST_NOTEBOOKS', '0'))\n",
"\n",
"if TESTING:\n",
" train_labels, train_data = train_labels[:2], train_data[:2]\n",
" dev_labels, dev_data = dev_labels[:2], dev_data[:2]\n",
" test_labels, test_data = test_labels[:2], test_data[:2]\n",
" EPOCHS = 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -306,5 +323,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 5
"nbformat_minor": 4
}
2 changes: 1 addition & 1 deletion docs/examples/parser.ipynb
@@ -64,5 +64,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 5
"nbformat_minor": 4
}
171 changes: 118 additions & 53 deletions docs/examples/pennylane.ipynb

Large diffs are not rendered by default.

19 changes: 18 additions & 1 deletion docs/examples/quantum_pipeline.ipynb
@@ -61,6 +61,23 @@
"test_labels, test_data = read_data('datasets/mc_test_data.txt')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"nbsphinx": "hidden"
},
"outputs": [],
"source": [
"TESTING = int(os.environ.get('TEST_NOTEBOOKS', '0'))\n",
"\n",
"if TESTING:\n",
" train_labels, train_data = train_labels[:2], train_data[:2]\n",
" dev_labels, dev_data = dev_labels[:2], dev_data[:2]\n",
" test_labels, test_data = test_labels[:2], test_data[:2]\n",
" EPOCHS = 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -356,5 +373,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}
19 changes: 18 additions & 1 deletion docs/examples/quantum_pipeline_jax.ipynb
@@ -64,6 +64,23 @@
"test_labels, test_data = read_data('datasets/mc_test_data.txt')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"nbsphinx": "hidden"
},
"outputs": [],
"source": [
"TESTING = int(os.environ.get('TEST_NOTEBOOKS', '0'))\n",
"\n",
"if TESTING:\n",
" train_labels, train_data = train_labels[:2], train_data[:2]\n",
" dev_labels, dev_data = dev_labels[:2], dev_data[:2]\n",
" test_labels, test_data = test_labels[:2], test_data[:2]\n",
" EPOCHS = 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -360,5 +377,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}
2 changes: 1 addition & 1 deletion docs/examples/reader.ipynb
@@ -137,5 +137,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 5
"nbformat_minor": 4
}
2 changes: 1 addition & 1 deletion docs/examples/rewrite.ipynb
@@ -816,5 +816,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 5
"nbformat_minor": 4
}
2 changes: 1 addition & 1 deletion docs/examples/tensor.ipynb
@@ -154,5 +154,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 5
"nbformat_minor": 4
}
2 changes: 1 addition & 1 deletion docs/examples/tree_reader.ipynb
@@ -101,5 +101,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}
6 changes: 3 additions & 3 deletions docs/index.rst
@@ -1,11 +1,11 @@
lambeq
======

.. image:: _static/images/CQ-logo.png
:width: 120px
.. image:: _static/images/Quantinuum_logo.png
:width: 240px
:align: right

``lambeq`` is an open-source, modular, extensible high-level Python library for experimental :term:`Quantum Natural Language Processing <quantum NLP (QNLP)>` (QNLP), created by `Cambridge Quantum <https://cambridgequantum.com>`_'s QNLP team. At a high level, the library allows the conversion of any sentence to a :term:`quantum circuit`, based on a given :term:`compositional model` and certain parameterisation and choices of :term:`ansätze <ansatz (plural: ansätze)>`, and facilitates :ref:`training <sec-training>` for both quantum and classical NLP experiments. The notes for the latest release can be found :ref:`here <sec-release_notes>`.
``lambeq`` is an open-source, modular, extensible high-level Python library for experimental :term:`Quantum Natural Language Processing <quantum NLP (QNLP)>` (QNLP), created by `Quantinuum <https://www.quantinuum.com>`_'s QNLP team. At a high level, the library allows the conversion of any sentence to a :term:`quantum circuit`, based on a given :term:`compositional model` and certain parameterisation and choices of :term:`ansätze <ansatz (plural: ansätze)>`, and facilitates :ref:`training <sec-training>` for both quantum and classical NLP experiments. The notes for the latest release can be found :ref:`here <sec-release_notes>`.

``lambeq`` is available for Python 3.8 and higher, on Linux, macOS and Windows. To install, type:

16 changes: 16 additions & 0 deletions docs/release_notes.rst
@@ -4,6 +4,22 @@ Release notes
=============


.. _rel-0.3.1:

`0.3.1 <https://github.com/CQCL/lambeq/releases/tag/0.3.1>`_
------------------------------------------------------------

Changed:

- Added example and tutorial notebooks to tests.
- Dependencies: pinned the maximum version of Jax and Jaxlib to 0.4.6 to avoid a JIT-compilation error when using the :py:class:`~lambeq.NumpyModel`.

Fixed:

- Documentation: fixed broken DisCoPy links.
- Fixed PyTorch datatype errors in example and tutorial notebooks.
- Updated custom :term:`ansätze <ansatz (plural: ansätze)>` in tutorial notebook to match new structure of :py:class:`~lambeq.CircuitAnsatz` and :py:class:`~lambeq.TensorAnsatz`.

.. _rel-0.3.0:

`0.3.0 <https://github.com/CQCL/lambeq/releases/tag/0.3.0>`_
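The Jax/Jaxlib pin in the release notes corresponds to an installation constraint along these lines (a sketch; the authoritative specifier lives in the package metadata, not here):

```shell
# Cap Jax and Jaxlib at 0.4.6 to avoid the NumpyModel JIT-compilation error.
pip install "jax<=0.4.6" "jaxlib<=0.4.6"
```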