From acdf80e8906fe13830f694036793e41f51a84658 Mon Sep 17 00:00:00 2001
From: Lilja77
Date: Thu, 16 Nov 2023 09:04:07 +0100
Subject: [PATCH] Update index.md

---
 content/pages/events/seminars/2023-11-15/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/content/pages/events/seminars/2023-11-15/index.md b/content/pages/events/seminars/2023-11-15/index.md
index 2d66444d1..0eb872f38 100644
--- a/content/pages/events/seminars/2023-11-15/index.md
+++ b/content/pages/events/seminars/2023-11-15/index.md
@@ -8,5 +8,5 @@
 venue: "Gothenburg and online"
 slides: "Staffan Larsson 15.11.2023.pdf"
 ---
 
-Abstract
+Abstract: The goal of the work presented here is to provide a hybrid of formal and neural semantics for natural language. To this end, we consider how the kind of formal semantic objects used in TTR (a theory of types with records, Cooper, 2023) might be related to the vector representations used in Eliasmith (2013). An advantage of doing this is that it would immediately give us a neural representation for TTR objects, as Eliasmith relates vectors to neural activity in his semantic pointer architecture (SPA). This would be an alternative, using convolution, to the suggestions made by Cooper (2019a) based on the phasing of neural activity. The project seems promising, since all complex TTR objects are constructed from labelled sets (essentially sets of ordered pairs consisting of labels and values), which might be seen as corresponding to the representation of structured objects that Eliasmith achieves using superposition and circular convolution.
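
For readers unfamiliar with the superposition-and-circular-convolution encoding the abstract refers to, the sketch below shows how a labelled set (label/value pairs, as in a TTR record) can be packed into a single vector by binding each label to its value with circular convolution and summing the bound pairs. This is an editorial illustration, not part of the patch and not the authors' implementation; the NumPy setup, the dimensionality, and all names are illustrative assumptions.

import numpy as np

def bind(a, b):
    # Circular convolution via FFT: binds a label vector to a value vector.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def inverse(a):
    # Approximate inverse under circular convolution (the HRR-style involution).
    return np.concatenate(([a[0]], a[:0:-1]))

def unit(v):
    return v / np.linalg.norm(v)

rng = np.random.default_rng(0)
dim = 512

# Random unit vectors standing in for two labels and two values of a record.
label_x, label_e, val_x, val_e = (unit(v) for v in rng.standard_normal((4, dim)))

# A labelled set {x = val_x, e = val_e} encoded as a superposition of bound pairs.
record = bind(label_x, val_x) + bind(label_e, val_e)

# Query the record for the value stored under label "x": convolve with the
# label's approximate inverse, then compare against candidate values.
retrieved = bind(record, inverse(label_x))
print(np.dot(retrieved, val_x))   # high similarity: val_x is (noisily) recovered
print(np.dot(retrieved, val_e))   # near zero: val_e is not stored under "x"

The attraction of this encoding is that the record vector has the same dimensionality as its components, so nested labelled sets can be encoded the same way, at the cost of retrieval being approximate and noisy.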