diff --git a/articles/editor.md b/articles/editor.md
index 93a48fe9..adf1af64 100644
--- a/articles/editor.md
+++ b/articles/editor.md
@@ -48,14 +48,14 @@ The most useful action to learn your way around the `Workflow` panel is right-cl
If only one node is selected, the `Output` menu item will display the type of the elements emitted by that operator sequence.
> [!Tip]
-> If the output of an operator is a complex type, you can inspect its public data members. Clicking on any of the sub-items will automatically place a new [`MemberSelector`](xref:Bonsai.Expressions.MemberSelectorBuilder) operator to pick the specified data member from the output of the node.
+> If the output of an operator is a complex type, you can inspect its public data members. Clicking on any of the sub-items will automatically place a new [`MemberSelector`] operator to pick the specified data member from the output of the node.
The context menu also allows you to externalize public properties of the operator as explicit nodes in the workflow using the `Externalize Property` drop-down menu. Once a property is externalized, you can connect other nodes in the workflow to it so you can change the value of the property dynamically (see the [Property Mapping](xref:property-mapping) section for more information).
-Finally, it is possible to group nodes, both for organizing large workflows, and to define [higher-order operators](xref:higher-order). The most basic grouping is the [`GroupWorkflow`](xref:Bonsai.Expressions.GroupWorkflowBuilder) which allows you to encapsulate a workflow fragment inside a single node. Any group can be assigned a `Name` for ease of reference and a `Description` for documentation. Any named properties which are externalized from nodes inside the group will be shown as properties of the group node itself on the top-level workflow.
+Finally, it is possible to group nodes, both for organizing large workflows, and to define [higher-order operators](xref:higher-order). The most basic grouping is the [`GroupWorkflow`] which allows you to encapsulate a workflow fragment inside a single node. Any group can be assigned a `Name` for ease of reference and a `Description` for documentation. Any named properties which are externalized from nodes inside the group will be shown as properties of the group node itself on the top-level workflow.
> [!Note]
-> You can use `GroupWorkflow` nodes to document your workflow by adding names and descriptions inline with operator chains. These can help readability of a workflow and no additional processing cost is incurred by the use of `GroupWorkflow` nodes.
+> You can use [`GroupWorkflow`] nodes to document your workflow by adding names and descriptions inline with operator chains. These can improve the readability of a workflow, and no additional processing cost is incurred by the use of [`GroupWorkflow`] nodes.
### Type Visualizers
@@ -82,12 +82,12 @@ If you leave one or more visualizers open when you stop the workflow, the editor
You can create and save workflow extensions by selecting one or more nodes and clicking the `Save Workflow As...` button in the context menu.
-Workflow extensions are a powerful way to reuse common workflow patterns across a large project. When you save a new extension it will immediately show up in the `Toolbox` panel for placement. Placing a workflow extension will create a new [`IncludeWorkflow`](xref:Bonsai.Expressions.IncludeWorkflowBuilder) operator pointing to the saved workflow. You can place an extension multiple times in the same workflow.
+Workflow extensions are a powerful way to reuse common workflow patterns across a large project. When you save a new extension it will immediately show up in the `Toolbox` panel for placement. Placing a workflow extension will create a new [`IncludeWorkflow`] operator pointing to the saved workflow. You can place an extension multiple times in the same workflow.
> [!Tip]
-> Like other groups, any named properties which are externalized from nodes inside the `IncludeWorkflow` will be shown as properties of the include node itself. These properties can have different values across different instances of the same workflow extension, and will be saved as part of the top-level workflow.
+> Like other groups, any named properties which are externalized from nodes inside the [`IncludeWorkflow`] will be shown as properties of the include node itself. These properties can have different values across different instances of the same workflow extension, and will be saved as part of the top-level workflow.
-All included workflow extensions are read-only, meaning that you cannot change the internal structure of the extension once it is loaded into the workflow, only its properties. If you want to change the implementation of the extension you need to first `Ungroup` the `IncludeWorkflow` operator. This will make a copy of the included workflow and place it inside a [`GroupWorkflow`](xref:Bonsai.Expressions.GroupWorkflowBuilder). From there you will be able to modify the internal implementation at will. After you have changed the structure, you can save the extension again using `Save Workflow As...`.
+All included workflow extensions are read-only, meaning that you cannot change the internal structure of the extension once it is loaded into the workflow, only its properties. If you want to change the implementation of the extension you need to first `Ungroup` the [`IncludeWorkflow`] operator. This will make a copy of the included workflow and place it inside a [`GroupWorkflow`]. From there you will be able to modify the internal implementation at will. After you have changed the structure, you can save the extension again using `Save Workflow As...`.
> [!Warning]
> When you change the structure of an included workflow and save it over the original file, all references to that workflow extension will be automatically reloaded and updated. This ensures that all references to the same extension remain consistent throughout.
@@ -117,7 +117,7 @@ You can take advantage of tabs, windows, breadcrumbs and docked panels to naviga

-Right-clicking on a nested node such as a [`GroupWorkflow`](xref:Bonsai.Expressions.GroupWorkflowBuilder) will bring up the context menu, where you can select the `Open in New Tab` or `Open in New Window` commands. You can also access these commands by right-clicking on the tab header or window title bar.
+Right-clicking on a nested node such as a [`GroupWorkflow`] will bring up the context menu, where you can select the `Open in New Tab` or `Open in New Window` commands. You can also access these commands by right-clicking on the tab header or window title bar.
Each tab or window displays a breadcrumb trail at the top, indicating the location of the current view within the nested workflows. Clicking a breadcrumb switches the view to the corresponding workflow, allowing you to navigate between levels.
@@ -130,14 +130,14 @@ You can further organize tabs and windows by rearranging them into docked panels
The `Explorer` panel also supports workflow navigation by providing a hierarchical tree view, similar to a file browser. Each level in the tree corresponds to a nested node. Selecting a node will update the `Workflow` panel view to display the corresponding nested workflow. You can also navigate the tree by using the keyboard arrow keys and pressing Enter to update the view. To open the node in a new tab or window, right-click on the node label and select one of the options. To expand or collapse the tree view at any level, click on the `+` or `-` icon to the left of the node label, or double-click the label itself. Icons adjacent to each label indicate the status of the corresponding workflow:
- ✏️ Editable workflow
-- 🔒 Locked workflow (`IncludeWorkflow`)
+- 🔒 Locked workflow ([`IncludeWorkflow`])
- ⛔ Workflow contains errors
## Properties
{width=300}
-Each operator exposes a set of configuration properties that parameterize the operator's behaviour (e.g., the [`Timer`](xref:Bonsai.Reactive.Timer) operator exposes the period between generated values, whereas an image [`Threshold`](xref:Bonsai.Vision.Threshold) exposes the brightness cutoff value applied to individual pixels).
+Each operator exposes a set of configuration properties that parameterize the operator's behaviour (e.g., the [`Timer`] operator exposes the period between generated values, whereas an image [`Threshold`] exposes the brightness cutoff value applied to individual pixels).
The `Properties` panel will display all the configuration properties which are available for the currently selected operator. A summary description of the currently selected property can be found in the textbox at the bottom of the panel. Similarly, a description of the behaviour of the currently selected operator itself is shown at the top of the panel.
@@ -214,4 +214,11 @@ Below is a summary of the most used actions and shortcuts in the workflow editor
| Find visualizer source | With the visualizer highlighted: Ctrl+Backspace |
| Open context menu | Shift+F10 |
| View help | F1 |
-| Go to definition | F12 |
\ No newline at end of file
+| Go to definition | F12 |
+
+
+[`GroupWorkflow`]: xref:Bonsai.Expressions.GroupWorkflowBuilder
+[`IncludeWorkflow`]: xref:Bonsai.Expressions.IncludeWorkflowBuilder
+[`MemberSelector`]: xref:Bonsai.Expressions.MemberSelectorBuilder
+[`Threshold`]: xref:Bonsai.Vision.Threshold
+[`Timer`]: xref:Bonsai.Reactive.Timer
\ No newline at end of file
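Bonsai itself is a visual language, but the effect of a `MemberSelector` step is easy to sketch in ordinary Python: it maps an accessor for one public data member over every element of a sequence. This is an analogy only, not Bonsai code; the `Frame` type and its members are hypothetical.

```python
from dataclasses import dataclass
from operator import attrgetter

@dataclass
class Frame:
    """A hypothetical complex element type emitted by an operator."""
    index: int
    timestamp: float

# A "sequence" of complex values, as an upstream operator might emit them.
frames = [Frame(0, 0.0), Frame(1, 0.04), Frame(2, 0.08)]

# A MemberSelector-like step: pick one public data member from each element.
select_timestamp = attrgetter("timestamp")
timestamps = [select_timestamp(f) for f in frames]
print(timestamps)  # [0.0, 0.04, 0.08]
```

In the editor, clicking a sub-item of the `Output` menu places this selection step for you, so the downstream sequence carries only the chosen member.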
diff --git a/articles/expressions-externalizedmapping.md b/articles/expressions-externalizedmapping.md
index b8f43213..b20d2d92 100644
--- a/articles/expressions-externalizedmapping.md
+++ b/articles/expressions-externalizedmapping.md
@@ -4,6 +4,6 @@ title: "ExternalizedMapping"
---
> [!Warning]
-> In any one workflow, it is not possible to have more than one externalized property with the same name. When externalizing multiple conflicting properties, you can use the [`DisplayName`](xref:Bonsai.Expressions.ExternalizedMapping.DisplayName) property of the externalized mapping to provide distinct unique names for each property. It is also possible to specify different category or description strings to the externalized property for documentation purposes.
+> In any one workflow, it is not possible to have more than one externalized property with the same name. When externalizing multiple conflicting properties, you can use the `DisplayName` property of the externalized mapping to provide distinct names for each property. It is also possible to specify different category or description strings for the externalized property for documentation purposes.
When externalized properties are nested inside an operator group, for example inside a [`GroupWorkflow`](xref:Bonsai.Expressions.GroupWorkflowBuilder), they will be exposed as member properties of the node group itself. This means that when the group node is selected, all named externalized properties will show up in the `Properties` panel.
\ No newline at end of file
diff --git a/articles/expressions-inputmapping.md b/articles/expressions-inputmapping.md
index 342bb68d..c1b0e32c 100644
--- a/articles/expressions-inputmapping.md
+++ b/articles/expressions-inputmapping.md
@@ -3,6 +3,6 @@ uid: expressions-inputmapping
title: "InputMapping"
---
-Fundamentally, the `InputMapping` operator works exactly the same way as [`PropertyMapping`](xref:Bonsai.Expressions.PropertyMappingBuilder), but now the connection from the mapping operator to its target node is done through the upstream sources. In this case, only values from the source sequence can be used to map properties in the target node. However, it is possible to specify which specific member of the original data source will be selected as input to the target node by setting the [Selector](xref:Bonsai.Expressions.InputMappingBuilder.Selector) property.
+Fundamentally, the [`InputMapping`](xref:Bonsai.Expressions.InputMappingBuilder) operator works exactly the same way as [`PropertyMapping`](xref:Bonsai.Expressions.PropertyMappingBuilder), but the connection from the mapping operator to its target node is now made through the upstream sources. In this case, only values from the source sequence can be used to map properties in the target node. However, it is possible to specify which member of the original data source will be selected as input to the target node by setting the `Selector` property.
Whenever the original input sequence sends out a new data item, all the specified property mappings will be updated at the same time before this item is finally allowed to go through and notify the target. In this way, you can be sure that no property changes are performed between upstream notifications.
\ No newline at end of file
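The guarantee described above -- every mapped property is updated before the item is allowed through to notify the target -- can be sketched outside of Bonsai. The following Python generator is an analogy only, not Bonsai code; the `Resampler` target and its `rate` property are hypothetical.

```python
class Resampler:
    """Hypothetical target operator with a configurable property."""
    def __init__(self):
        self.rate = 1.0

    def process(self, item):
        return item * self.rate

def input_mapping(source, target, selector, mappings):
    """Sketch of InputMapping semantics: for each incoming item, first
    apply every property mapping, then forward the selected member."""
    for item in source:
        for prop, member_selector in mappings.items():
            # Update all mapped properties before the notification goes through.
            setattr(target, prop, member_selector(item))
        # Only then is the target notified with the selected input member.
        yield target.process(selector(item))

target = Resampler()
# Each notification carries both a value and the rate to apply to it.
source = [(10, 0.5), (10, 2.0)]
out = list(input_mapping(source, target,
                         selector=lambda t: t[0],
                         mappings={"rate": lambda t: t[1]}))
print(out)  # [5.0, 20.0]
```

Because the property update and the notification happen in lockstep, no property change can occur between upstream notifications, which is exactly the synchronization guarantee the text describes.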
diff --git a/articles/expressions-propertymapping-multiple.md b/articles/expressions-propertymapping-multiple.md
index cacae705..f2106593 100644
--- a/articles/expressions-propertymapping-multiple.md
+++ b/articles/expressions-propertymapping-multiple.md
@@ -3,7 +3,7 @@ uid: expressions-propertymapping-multiple
title: "PropertyMappingMultiple"
---
-Multiple properties can be mapped simultaneously from the same source sequence when using `PropertyMapping`. You can select which properties to map by using the editors available in the property grid. For each mapped property you must specify a source selector, i.e. an expression specifying which members of the input data type are used to assign values to the mapped property.
+Multiple properties can be mapped simultaneously from the same source sequence when using [`PropertyMapping`](xref:Bonsai.Expressions.PropertyMappingBuilder). You can select which properties to map by using the editors available in the property grid. For each mapped property you must specify a source selector, i.e. an expression specifying which members of the input data type are used to assign values to the mapped property.
> [!Note]
> If the type of the selected member does not match the type of the property, a conversion is attempted. If no compatible conversion is available, the compiler checks whether it is possible to construct the corresponding data type from the selected members. For example, it would be possible to map to a [`Point`](xref:OpenCV.Net.Point) type by selecting two numeric values from the source sequence. In this case, the values would be used to construct a new point instance by assigning them to the X and Y parameters of the type constructor.
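The note above describes a constructor-based fallback when member and property types do not match. As a rough analogy in Python (not the Bonsai compiler's actual mechanism; the `readings` data and member names are made up), two numeric members selected from each source item can be combined by calling the target type's constructor:

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

# Each source item exposes two numeric members...
readings = [{"cx": 3, "cy": 4}, {"cx": 6, "cy": 8}]

# ...which the mapping combines by calling the target type's constructor,
# mirroring how a Point can be built from two selected values.
points = [Point(r["cx"], r["cy"]) for r in readings]
print(points[0])  # Point(x=3, y=4)
```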
diff --git a/articles/higher-order.md b/articles/higher-order.md
index 93df36e5..4ddf54f0 100644
--- a/articles/higher-order.md
+++ b/articles/higher-order.md
@@ -6,7 +6,7 @@ uid: higher-order
When building simple reactive programs, it is usually enough to place a source for every device or every file you are accessing, and explicitly replicate the chain of transforms, sinks and combinators representing all the operations you need to perform on the data.
-However, sometimes you may need to build systems that deal with an unknown number of sources. For example, imagine you wanted to create a workflow to merge together several video files. If you knew beforehand how many files you will need to combine and where they are exactly located, you might use the [`Concat`](xref:Bonsai.Reactive.Concat) operator to design a workflow like the following:
+However, sometimes you may need to build systems that deal with an unknown number of sources. For example, imagine you wanted to create a workflow to merge together several video files. If you knew beforehand how many files you would need to combine and exactly where they are located, you might use the [`Concat`] operator to design a workflow like the following:
:::workflow

@@ -14,13 +14,13 @@ However, sometimes you may need to build systems that deal with an unknown numbe
But what if you did not know beforehand how many video files you would need to combine, and you wanted to merge all these videos without having to manually place a source node for every file?
-Suppose all you had to get started was the [`EnumerateFiles`](xref:Bonsai.IO.EnumerateFiles) source. This operator creates an observable sequence that will emit all the file names in a folder, one after the other.
+Suppose all you had to get started was the [`EnumerateFiles`] source. This operator creates an observable sequence that will emit all the file names in a folder, one after the other.
-In order to merge all the frames from these files in a single sequence you would need to create a different [`FileCapture`](xref:Bonsai.Vision.FileCapture) source for every file name emitted by this sequence, and pass all these sources to the [`Concat`](xref:Bonsai.Reactive.Concat) operator to generate a single sequence of frames. In other words, you want to create a sequence of frames for every file name in the folder, and then combine all these sequences into a single video file.
+In order to merge all the frames from these files into a single sequence you would need to create a different [`FileCapture`] source for every file name emitted by this sequence, and pass all these sources to the [`Concat`] operator to generate a single sequence of frames. In other words, you want to create a sequence of frames for every file name in the folder, and then combine all these sequences into a single video file.
Whenever an operator receives or emits a sequence of sequences, we call it a higher-order operator. These operators play a particularly powerful role in the Bonsai programming language so it is useful to describe them in some detail.
@@ -32,18 +32,18 @@ For example, the video concatenation workflow can be implemented in Bonsai as fo

-The behaviour of the [`CreateObservable`](xref:Bonsai.Reactive.CreateObservable) operator is specified by the floating node group. Each time a new file name is emitted by the [`EnumerateFiles`](xref:Bonsai.IO.EnumerateFiles) source, the `CreateObservable` operator creates a new observable sequence controlled by the operators inside the group.
+The behaviour of the [`CreateObservable`] operator is specified by the floating node group. Each time a new file name is emitted by the [`EnumerateFiles`] source, the [`CreateObservable`] operator creates a new observable sequence controlled by the operators inside the group.
-The input to the node group -- represented by the `Source1` operator -- is a sequence containing the individual items received by `CreateObservable`. In this case, it is a sequence with a single item that returns the file name emitted by the `EnumerateFiles` source. We use an externalized property to assign this value to the [`FileName`](xref:Bonsai.Vision.FileCapture.FileName) property of the `FileCapture` node, so that the correct video is accessed. Finally, the output of the node group determines the type and timing of the items emitted by the created sequence.
+The input to the node group -- represented by the `Source1` operator -- is a sequence containing the individual items received by [`CreateObservable`]. In this case, it is a sequence with a single item that returns the file name emitted by the [`EnumerateFiles`] source. We use an externalized property to assign this value to the `FileName` property of the [`FileCapture`] node, so that the correct video is accessed. Finally, the output of the node group determines the type and timing of the items emitted by the created sequence.
> [!Note]
-> The `CreateObservable` operator creates new sequences for every input notification. However, it does not automatically subscribe to them -- they are latent. No data actually flows through the operators in the node group until some other higher-order operator -- in this case `Concat` -- actually takes these sequences and subscribes to them.
+> The [`CreateObservable`] operator creates new sequences for every input notification. However, it does not automatically subscribe to them -- they are latent. No data actually flows through the operators in the node group until some other higher-order operator -- in this case [`Concat`] -- actually takes these sequences and subscribes to them.
## Marble diagrams for higher-order operators
Marble diagrams can also be extended to describe the behaviour of higher-order operators. Emitted sequences are represented by diagonal timelines branching off the main operator timeline. The start of the branching sequence represents the time at which that sequence was emitted.
-For example, the [`CreateObservable`](xref:Bonsai.Reactive.CreateObservable) operator used to convert file names into sequences of video frames is described below:
+For example, the [`CreateObservable`] operator used to convert file names into sequences of video frames is described below:
\ No newline at end of file
+ style="max-height:250px;padding:1em 0" />
+
+
+[`Concat`]: xref:Bonsai.Reactive.Concat
+[`CreateObservable`]: xref:Bonsai.Reactive.CreateObservable
+[`EnumerateFiles`]: xref:Bonsai.IO.EnumerateFiles
+[`FileCapture`]: xref:Bonsai.Vision.FileCapture
\ No newline at end of file
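A loose analogy for this pattern in plain Python, with generators standing in for observable sequences (the file names and frame contents below are made up): one latent frame sequence is created per emitted file name, and a `Concat`-like step subscribes to each in turn, flattening them into a single ordered stream.

```python
from itertools import chain

def enumerate_files():
    # Stands in for EnumerateFiles: emits file names one after the other.
    yield from ["a.avi", "b.avi"]

def file_capture(name):
    # Stands in for FileCapture: a *latent* sequence of frames for one file.
    # Nothing runs until the sequence is iterated (i.e. subscribed to).
    for i in range(2):
        yield f"{name}:frame{i}"

# CreateObservable-like step: one latent sequence per file name...
sequences = (file_capture(name) for name in enumerate_files())

# ...and a Concat-like step that consumes each sequence in order.
frames = list(chain.from_iterable(sequences))
print(frames)  # ['a.avi:frame0', 'a.avi:frame1', 'b.avi:frame0', 'b.avi:frame1']
```

As in the workflow, the inner sequences here do no work until the flattening step pulls from them, which mirrors the latency of `CreateObservable` outputs before a higher-order operator subscribes.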
diff --git a/articles/observables.md b/articles/observables.md
index 0a9f1b87..898b0278 100644
--- a/articles/observables.md
+++ b/articles/observables.md
@@ -26,7 +26,7 @@ In reactive programming, we compose operations on sequences (generation, filteri
Arrows entering the box indicate that the operator is receiving notifications from the observable sequence that it is subscribed to. Arrows leaving the box show items that are emitted by the operator itself. If no `subscribe` arrow is explicitly indicated in the diagram, it is assumed to be placed at the start of the source sequence.
-In this case we can see from the diagram that the [`Condition`](xref:Bonsai.Reactive.Condition) operator is filtering the input notifications from the source sequence: only notifications with a specific shape are sent out in the result sequence.
+In this case we can see from the diagram that the [`Condition`] operator is filtering the input notifications from the source sequence: only notifications with a specific shape are sent out in the result sequence.
## Bonsai Workflows
@@ -43,17 +43,17 @@ By chaining networks of observable sequences in this way, it becomes possible to
> [!Warning]
> Do not confuse a *workflow* with the marble diagrams described above. A marble diagram describes the dynamic behaviour over time of an observable sequence or reactive operator, whereas a Bonsai workflow describes only which different operators subscribe to each other.
-We often combine marble diagrams with the workflow representation to better understand the behaviour of a Bonsai program. In the example above, we can see from the workflow that the [`Sample`](xref:Bonsai.Reactive.Sample) operator subscribes to the sequences generated by two other operators: [`Grayscale`](xref:Bonsai.Vision.Grayscale) and [`KeyDown`](xref:Bonsai.Windows.Input.KeyDown). `Grayscale` sends out images periodically, following the camera. However, `KeyDown` sends out a notification only when there is a key press, which can happen at any moment, even in between camera images.
+We often combine marble diagrams with the workflow representation to better understand the behaviour of a Bonsai program. In the example above, we can see from the workflow that the [`Sample`] operator subscribes to the sequences generated by two other operators: [`Grayscale`] and [`KeyDown`]. [`Grayscale`] sends out images periodically, following the camera. However, [`KeyDown`] sends out a notification only when there is a key press, which can happen at any moment, even in between camera images.
-What exactly will be the result sequence coming out of `Sample` in this case? Below is the marble diagram for the `Sample` operator, where the first sequence is the "source" sequence (`Grayscale`), and the second sequence is the "sampler" (`KeyDown`).
+What exactly will be the result sequence coming out of [`Sample`] in this case? Below is the marble diagram for the [`Sample`] operator, where the first sequence is the "source" sequence ([`Grayscale`]), and the second sequence is the "sampler" ([`KeyDown`]).
-From the marble diagram the behaviour of `Sample` is clear: it sends out the latest image that was received from `Grayscale` whenever there was a new key press. Marble diagrams can be an extremely useful tool to convey graphically the intuition of what a reactive operator is doing and are used extensively throughout the documentation.
+From the marble diagram the behaviour of [`Sample`] is clear: it sends out the latest image that was received from [`Grayscale`] whenever there was a new key press. Marble diagrams can be an extremely useful tool to convey graphically the intuition of what a reactive operator is doing and are used extensively throughout the documentation.
## Hot versus cold observable sequences {#temperature}
-One of the most important aspects for understanding the behaviour of observable sequences is to clarify the side-effects of subscription. For example, when an image processing operator like `Grayscale` subscribes to a sequence of images from a camera for the first time, the camera is turned on and an acquisition loop starts streaming live frames. If instead we have the `Grayscale` operator subscribe to a sequence of images from a pre-recorded video, the movie file is opened and frames begin to be decoded from disk into memory.
+One of the most important aspects of understanding the behaviour of observable sequences is clarifying the side-effects of subscription. For example, when an image processing operator like [`Grayscale`] subscribes to a sequence of images from a camera for the first time, the camera is turned on and an acquisition loop starts streaming live frames. If instead we have the [`Grayscale`] operator subscribe to a sequence of images from a pre-recorded video, the movie file is opened and frames begin to be decoded from disk into memory.
@@ -67,12 +67,20 @@ A video file also generates a sequence of images, but in contrast to the camera,
Understanding the *temperature* of an observable sequence is particularly important when that sequence is shared between multiple operators. It can help to understand whether those operators will see the same data items, and what the effect of subscribing to the shared sequence at different times is going to be.
-It is also possible to change the temperature of observable sequences using reactive operators. The [Replay](xref:Bonsai.Reactive.Replay) operator can be used to subscribe to the camera and start recording all incoming images. Every time a downstream observer subscribes to the result sequence, it will then replay all images on-demand, even if they subscribe late. The originally *hot* sequence has been turned into a *cold* observable by the replay behaviour.
+It is also possible to change the temperature of observable sequences using reactive operators. The [`Replay`] operator can be used to subscribe to the camera and start recording all incoming images. Every time a downstream observer subscribes to the result sequence, it will then replay all images on-demand, even if they subscribe late. The originally *hot* sequence has been turned into a *cold* observable by the replay behaviour.
-Conversely, the [Publish](xref:Bonsai.Reactive.Publish) operator can be used to share a single subscription to a video file when sending images to downstream observers. In this case, instead of requesting a new subscription to the video for each new observer, the publish behaviour will always share only the images coming from the original subscription, no matter at what point the video is in. The original sequence has been turned from *cold* to *hot*.
+Conversely, the [`Publish`] operator can be used to share a single subscription to a video file when sending images to downstream observers. In this case, instead of requesting a new subscription to the video for each new observer, the publish behaviour will always share only the images coming from the original subscription, regardless of the video's current playback position. The original sequence has been turned from *cold* to *hot*.
-In the Bonsai visual language, whenever two operators receive data from the same source, i.e. whenever there are branches in the workflow, subscriptions use the `Publish` behaviour. This means that the default sharing behaviour of Bonsai sequences is *hot*. It is possible to change this by using specialized sharing operators, called [Subjects](xref:subjects).
\ No newline at end of file
+In the Bonsai visual language, whenever two operators receive data from the same source, i.e. whenever there are branches in the workflow, subscriptions use the `Publish` behaviour. This means that the default sharing behaviour of Bonsai sequences is *hot*. It is possible to change this by using specialized sharing operators, called [Subjects](xref:subjects).
+
+
+[`Condition`]: xref:Bonsai.Reactive.Condition
+[`Grayscale`]: xref:Bonsai.Vision.Grayscale
+[`KeyDown`]: xref:Bonsai.Windows.Input.KeyDown
+[`Publish`]: xref:Bonsai.Reactive.Publish
+[`Replay`]: xref:Bonsai.Reactive.Replay
+[`Sample`]: xref:Bonsai.Reactive.Sample
\ No newline at end of file
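The cold-versus-hot distinction can be sketched in plain Python, with iteration standing in for subscription (an analogy only, not Bonsai or Rx code; the frame contents are made up). Each iteration of a cold sequence restarts it from the beginning, whereas a `Publish`-like multicast shares one underlying iteration among all observers.

```python
def video_frames():
    # A *cold* sequence: each new iteration restarts from the first frame,
    # just as each subscription to a video file reopens it from the start.
    yield from ["frame0", "frame1", "frame2"]

# Two independent "subscriptions" to a cold sequence each see every item.
first_pass, second_pass = list(video_frames()), list(video_frames())
assert first_pass == second_pass == ["frame0", "frame1", "frame2"]

class Publish:
    """Sketch of Publish-like multicast: one shared underlying iteration."""
    def __init__(self, source):
        self.source = iter(source)   # single underlying subscription
        self.observers = []

    def subscribe(self, observer):
        self.observers.append(observer)

    def run(self):
        for item in self.source:
            for observer in self.observers:
                observer(item)       # every observer sees the same item once

seen = []
published = Publish(video_frames())
published.subscribe(lambda x: seen.append(("first", x)))
published.subscribe(lambda x: seen.append(("second", x)))
published.run()
print(seen[:2])  # [('first', 'frame0'), ('second', 'frame0')]
```

Both observers share the same pass through the frames, so neither can restart the source -- the cold sequence has effectively been made hot for its subscribers.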
diff --git a/articles/property-mapping.md b/articles/property-mapping.md
index 1a6112b5..c2ec5426 100644
--- a/articles/property-mapping.md
+++ b/articles/property-mapping.md
@@ -12,7 +12,7 @@ As an example, imagine you wanted to continuously playback a sound WAV file to t

:::
-Using the [`ConvertScale`](xref:Bonsai.Dsp.ConvertScale) operator you could set the volume manually by changing its [`Scale`](xref:Bonsai.Dsp.ConvertScale.Scale) parameter.
+Using the [`ConvertScale`] operator you could set the volume manually by changing its `Scale` parameter.
Now consider a variation of this workflow where the playback volume needs to be modulated continuously depending on some other variable, for example the horizontal position of the mouse cursor as it moves across the screen.
@@ -22,7 +22,7 @@ A simple way to compute the desired scale value would be to rescale the X coordi

:::
-However, how would you now connect the sequence of scale values computed from the mouse position to changes in the [`Scale`](xref:Bonsai.Dsp.ConvertScale.Scale) property of the [`ConvertScale`](xref:Bonsai.Dsp.ConvertScale) node?
+However, how would you now connect the sequence of scale values computed from the mouse position to changes in the `Scale` property of the [`ConvertScale`] node?
Property mapping operators allow you to do exactly this. They are operators that take a single input sequence and react to notifications from that sequence by changing the values of the specified properties in the *subsequent* node. There are three types of property mapping operators, described below.
@@ -34,7 +34,7 @@ Property mapping operators allow you to do exactly this. They are operators that
## Externalized properties
-The [ExternalizedMapping](xref:Bonsai.Expressions.ExternalizedMappingBuilder) operator allows you to create externalized properties. The easiest way to initialize the mapping is from the right-click context menu when a single node is selected. Selecting a property from this menu will create or update the externalized mapping node. Multiple properties can be externalized from the same node.
+The [`ExternalizedMapping`] operator allows you to create externalized properties. The easiest way to initialize the mapping is from the right-click context menu when a single node is selected. Selecting a property from this menu will create or update the externalized mapping node. Multiple properties can be externalized from the same node.

@@ -42,7 +42,7 @@ The [ExternalizedMapping](xref:Bonsai.Expressions.ExternalizedMappingBuilder) op
## Mapping a sequence to a property
-After an operator property has been externalized, you can connect any sequence which is compatible with the data type of the property to the mapping node. When a connection to a source sequence is established, the externalized property will be promoted to a [`PropertyMapping`](xref:Bonsai.Expressions.PropertyMappingBuilder) operator.
+After an operator property has been externalized, you can connect any sequence which is compatible with the data type of the property to the mapping node. When a connection to a source sequence is established, the externalized property will be promoted to a [`PropertyMapping`] operator.
Now every time the source sequence emits a new notification, the mapping operator will react by changing the target property to the incoming value.
@@ -60,6 +60,12 @@ Now every time the source sequence emits a new notification, the mapping operato
Sometimes you need to synchronize property updates with the data flow, i.e. you do not want the property mapping operator to change the property values outside of notifications emitted by the source sequence.
-For example, imagine a transform operator which is converting a source sequence from one format to another, where the format specification is given by a set of operator properties. You may need the target format to change dynamically from time to time, but you may also need to guarantee that parts of the format specification do not change while the operator was converting some other input. The [`InputMapping`](xref:Bonsai.Expressions.InputMappingBuilder) operator allows you to do this by synchronizing property updates with input notifications.
+For example, imagine a transform operator which is converting a source sequence from one format to another, where the format specification is given by a set of operator properties. You may need the target format to change dynamically from time to time, but you may also need to guarantee that parts of the format specification do not change while the operator is converting some other input. The [`InputMapping`] operator allows you to do this by synchronizing property updates with input notifications.
-[!include[InputMapping](~/articles/expressions-inputmapping.md)]
\ No newline at end of file
+[!include[InputMapping](~/articles/expressions-inputmapping.md)]
+
+
+[`ConvertScale`]: xref:Bonsai.Dsp.ConvertScale
+[`ExternalizedMapping`]: xref:Bonsai.Expressions.ExternalizedMappingBuilder
+[`InputMapping`]: xref:Bonsai.Expressions.InputMappingBuilder
+[`PropertyMapping`]: xref:Bonsai.Expressions.PropertyMappingBuilder
\ No newline at end of file
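The difference between the two mapping operators described above can be sketched in plain Python. This is an illustrative model only, not Bonsai's implementation; the `ScaleTransform` class, its `scale` property, and the method names are all hypothetical:

```python
# Illustrative model of PropertyMapping vs. InputMapping on a transform
# with a "scale" property (class and names are hypothetical).
class ScaleTransform:
    def __init__(self):
        self.scale = 1          # current property value
        self._pending = None    # value staged by an InputMapping

    def map_property(self, value):
        self.scale = value      # PropertyMapping: applied immediately

    def map_input(self, value):
        self._pending = value   # InputMapping: deferred to the next input

    def process(self, x):
        # Pending updates are applied together with the next notification,
        # so the property never changes mid-conversion.
        if self._pending is not None:
            self.scale, self._pending = self._pending, None
        return x * self.scale

node = ScaleTransform()
node.map_property(10)
assert node.process(2) == 20    # property changed right away

node.map_input(100)
assert node.scale == 10         # update deferred until the next notification
assert node.process(2) == 200   # applied together with the input
```

The key point of the sketch is that `map_input` stages the value without touching the live property, which mirrors how `InputMapping` keeps a format specification stable while an item is being converted.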
diff --git a/articles/subject-async.md b/articles/subject-async.md
index ea903039..8935132b 100644
--- a/articles/subject-async.md
+++ b/articles/subject-async.md
@@ -5,9 +5,13 @@ title: "AsyncSubject"

-`AsyncSubject` stores and passes the last value (and only the last value) emitted by the source sequence to each subscribed observer. The value is also only sent out after the source sequence terminates. If the source sequence does not emit any value, `AsyncSubject` will also terminate without emitting any values.
+[`AsyncSubject`] stores and passes the last value (and only the last value) emitted by the source sequence to each subscribed observer. The value is also only sent out after the source sequence terminates. If the source sequence does not emit any value, [`AsyncSubject`] will also terminate without emitting any values.
> [!Tip]
-> You can use the [`Take`](xref:Bonsai.Reactive.Take) operator before `AsyncSubject` to store the first value from an infinite sequence.
+> You can use the [`Take`] operator before [`AsyncSubject`] to store the first value from an infinite sequence.
-Any observers which subscribe after the source sequence terminates will immediately receive the stored value. If the source sequence terminates with an error, `AsyncSubject` will not emit any values but will pass along the error notification to all observers.
+Any observers which subscribe after the source sequence terminates will immediately receive the stored value. If the source sequence terminates with an error, [`AsyncSubject`] will not emit any values but will pass along the error notification to all observers.
+
+
+[`AsyncSubject`]: xref:Bonsai.Reactive.AsyncSubject
+[`Take`]: xref:Bonsai.Reactive.Take
\ No newline at end of file
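The store-last-then-emit-on-completion behaviour described above can be sketched in plain Python. This is an illustrative model of the semantics only, not the Rx implementation used by Bonsai:

```python
# Minimal sketch of AsyncSubject semantics: observers receive only the
# final value, and only after the source completes.
class AsyncSubjectSketch:
    def __init__(self):
        self._observers = []
        self._value = None
        self._has_value = False
        self._done = False

    def subscribe(self, on_next):
        if self._done:
            # Late subscribers immediately receive the stored value.
            if self._has_value:
                on_next(self._value)
        else:
            self._observers.append(on_next)

    def on_next(self, value):
        # Only the last value is kept; nothing is emitted yet.
        self._value, self._has_value = value, True

    def on_completed(self):
        self._done = True
        if self._has_value:
            for observer in self._observers:
                observer(self._value)

received = []
subject = AsyncSubjectSketch()
subject.subscribe(received.append)
subject.on_next(1)
subject.on_next(2)
subject.on_next(3)
assert received == []      # nothing emitted before completion
subject.on_completed()
assert received == [3]     # only the last value is delivered
late = []
subject.subscribe(late.append)
assert late == [3]         # late subscribers get the stored value
```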
diff --git a/articles/subject-behavior.md b/articles/subject-behavior.md
index 926ec917..163b0dce 100644
--- a/articles/subject-behavior.md
+++ b/articles/subject-behavior.md
@@ -5,9 +5,12 @@ title: "BehaviorSubject"

-`BehaviorSubject` stores and passes the latest value emitted by the source sequence to each subscribed observer, and then continues to emit any subsequent values.
+[`BehaviorSubject`] stores and passes the latest value emitted by the source sequence to each subscribed observer, and then continues to emit any subsequent values.
-Any observers which subscribe later will immediately receive the latest stored value. However, if the source sequence terminates with an error, `BehaviorSubject` will not emit any values but will pass along the error notification to all subsequent observers.
+Any observers which subscribe later will immediately receive the latest stored value. However, if the source sequence terminates with an error, [`BehaviorSubject`] will not emit any values but will pass along the error notification to all subsequent observers.
> [!Warning]
-> `BehaviorSubject` is designed to multicast and share state updates from multiple sources, like a global variable. Because of this, even if one of the source sequences emitting values to `BehaviorSubject` terminates successfully, the `BehaviorSubject` will not send a termination message to any subscribed observers, but will remain active until the enclosing workflow scope is terminated to allow other sources to update the shared state.
\ No newline at end of file
+> [`BehaviorSubject`] is designed to multicast and share state updates from multiple sources, like a global variable. Because of this, even if one of the source sequences emitting values to [`BehaviorSubject`] terminates successfully, the [`BehaviorSubject`] will not send a termination message to any subscribed observers. Instead, it remains active until the enclosing workflow scope is terminated, so that other sources can continue to update the shared state.
+
+
+[`BehaviorSubject`]: xref:Bonsai.Reactive.BehaviorSubject
\ No newline at end of file
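The latest-value replay behaviour described above can be sketched in plain Python. This is an illustrative model of the semantics only, not the Rx implementation used by Bonsai:

```python
# Minimal sketch of BehaviorSubject semantics: new subscribers immediately
# receive the latest stored value, then all subsequent values.
class BehaviorSubjectSketch:
    _EMPTY = object()

    def __init__(self):
        self._observers = []
        self._value = self._EMPTY

    def subscribe(self, on_next):
        self._observers.append(on_next)
        if self._value is not self._EMPTY:
            on_next(self._value)    # replay the latest stored value

    def on_next(self, value):
        self._value = value
        for observer in self._observers:
            observer(value)

subject = BehaviorSubjectSketch()
a = []
subject.subscribe(a.append)     # nothing stored yet, nothing replayed
subject.on_next(1)
b = []
subject.subscribe(b.append)     # immediately receives the latest value, 1
subject.on_next(2)
assert a == [1, 2]
assert b == [1, 2]
```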
diff --git a/articles/subject-multicast.md b/articles/subject-multicast.md
index 2d385146..c956688c 100644
--- a/articles/subject-multicast.md
+++ b/articles/subject-multicast.md
@@ -3,8 +3,11 @@ uid: subject-multicast
title: "MulticastSubject"
---
-The `MulticastSubject` operator works like a sink which accesses the subject with the specified name, at the same scope level or above, and forwards any values emitted by the source sequence to the shared subject. Depending on the behavior of the subject, these values will then be passed to any operators subscribed to the subject, including any termination and error notifications.
+The [`MulticastSubject`] operator works like a sink which accesses the subject with the specified name, at the same scope level or above, and forwards any values emitted by the source sequence to the shared subject. Depending on the behavior of the subject, these values will then be passed to any operators subscribed to the subject, including any termination and error notifications.
:::workflow

:::
+
+
+[`MulticastSubject`]: xref:Bonsai.Expressions.MulticastSubject
\ No newline at end of file
diff --git a/articles/subject-publish.md b/articles/subject-publish.md
index 10b27013..52fdec97 100644
--- a/articles/subject-publish.md
+++ b/articles/subject-publish.md
@@ -5,8 +5,13 @@ title: "PublishSubject"

-`PublishSubject` passes to each subscribed observer only the values from the source sequence which were emitted after the time of subscription.
+[`PublishSubject`] passes to each subscribed observer only the values from the source sequence which were emitted after the time of subscription.
-This fire-and-forget behavior means that any observers which subscribe late might lose one or more items emitted between the time that `PublishSubject` was created and the time that the observer subscribed to it. If you require guaranteed delivery of all values from the source sequence, you need to ensure that all observers subscribe immediately upon workflow initialization. If this is not possible, you should consider switching to an [`AsyncSubject`](xref:Bonsai.Reactive.AsyncSubject) if the sequence contains a single value, or a [`ReplaySubject`](xref:Bonsai.Reactive.ReplaySubject) if the sequence contains multiple values.
+This fire-and-forget behavior means that any observers which subscribe late might lose one or more items emitted between the time that [`PublishSubject`] was created and the time that the observer subscribed to it. If you require guaranteed delivery of all values from the source sequence, you need to ensure that all observers subscribe immediately upon workflow initialization. If this is not possible, you should consider switching to an [`AsyncSubject`] if the sequence contains a single value, or a [`ReplaySubject`] if the sequence contains multiple values.
-If the source sequence terminates with an error, `PublishSubject` will not emit any items to subsequent observers, but will pass along the terminating error.
\ No newline at end of file
+If the source sequence terminates with an error, [`PublishSubject`] will not emit any items to subsequent observers, but will pass along the terminating error.
+
+
+[`AsyncSubject`]: xref:Bonsai.Reactive.AsyncSubject
+[`PublishSubject`]: xref:Bonsai.Reactive.PublishSubject
+[`ReplaySubject`]: xref:Bonsai.Reactive.ReplaySubject
\ No newline at end of file
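The fire-and-forget behaviour described above can be sketched in plain Python. This is an illustrative model of the semantics only, not the Rx implementation used by Bonsai:

```python
# Minimal sketch of PublishSubject semantics: no replay, so observers only
# see values emitted after they subscribe.
class PublishSubjectSketch:
    def __init__(self):
        self._observers = []

    def subscribe(self, on_next):
        self._observers.append(on_next)   # fire-and-forget: nothing replayed

    def on_next(self, value):
        for observer in list(self._observers):
            observer(value)

subject = PublishSubjectSketch()
early, late = [], []
subject.subscribe(early.append)
subject.on_next("a")               # only the early observer sees this
subject.subscribe(late.append)
subject.on_next("b")
assert early == ["a", "b"]
assert late == ["b"]               # the late subscriber missed "a"
```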
diff --git a/articles/subject-replay.md b/articles/subject-replay.md
index f342d1b6..9641bc9b 100644
--- a/articles/subject-replay.md
+++ b/articles/subject-replay.md
@@ -5,6 +5,9 @@ title: "ReplaySubject"

-`ReplaySubject` passes to each subscribed observer all the values from the source sequence, regardless of when the observer subscribes.
+[`ReplaySubject`] passes to each subscribed observer all the values from the source sequence, regardless of when the observer subscribes.
-Any observers which subscribe late will immediately receive all values which were sent out between the time that `ReplaySubject` was created and the time that the observer subscribed to it. It is also possible to parameterize the `ReplaySubject` to throw away old values after a certain period of time, or after a specified buffer size is exceeded.
+Any observers which subscribe late will immediately receive all values which were sent out between the time that [`ReplaySubject`] was created and the time that the observer subscribed to it. It is also possible to parameterize the [`ReplaySubject`] to throw away old values after a certain period of time, or after a specified buffer size is exceeded.
+
+
+[`ReplaySubject`]: xref:Bonsai.Reactive.ReplaySubject
\ No newline at end of file
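The replay-with-bounded-buffer behaviour described above can be sketched in plain Python. This is an illustrative model of the semantics only, not the Rx implementation used by Bonsai:

```python
from collections import deque

# Minimal sketch of ReplaySubject semantics with a bounded buffer: late
# subscribers replay up to buffer_size of the most recent values.
class ReplaySubjectSketch:
    def __init__(self, buffer_size=None):
        self._observers = []
        self._buffer = deque(maxlen=buffer_size)

    def subscribe(self, on_next):
        for value in self._buffer:     # replay buffered values first
            on_next(value)
        self._observers.append(on_next)

    def on_next(self, value):
        self._buffer.append(value)     # old values fall off when full
        for observer in self._observers:
            observer(value)

subject = ReplaySubjectSketch(buffer_size=2)
subject.on_next(1)
subject.on_next(2)
subject.on_next(3)
late = []
subject.subscribe(late.append)
assert late == [2, 3]   # only the last two values were retained
```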
diff --git a/articles/subject-resource.md b/articles/subject-resource.md
index 17f1fe51..495b50dc 100644
--- a/articles/subject-resource.md
+++ b/articles/subject-resource.md
@@ -5,9 +5,12 @@ title: "ResourceSubject"

-`ResourceSubject` stores and passes the single last value emitted by the source sequence to each subscribed observer. The value is also only sent out after the source sequence terminates. If the source sequence does not emit any values, `ResourceSubject` will also complete without emitting any values.
+[`ResourceSubject`] stores and passes the last value (and only the last value) emitted by the source sequence to each subscribed observer. The value is also only sent out after the source sequence terminates. If the source sequence does not emit any values, [`ResourceSubject`] will also complete without emitting any values.
> [!Warning]
-> The type of the stored value must be [IDisposable](xref:System.IDisposable). When the enclosing workflow scope is terminated, the value will be disposed to free any allocated resources, such as file or memory handles.
+> The type of the stored value must be [`IDisposable`](xref:System.IDisposable). When the enclosing workflow scope is terminated, the value will be disposed to free any allocated resources, such as file or memory handles.
-Any observers which subscribe after the source sequence terminates will immediately receive the stored value. If the source sequence terminates with an error, `ResourceSubject` will not emit any values but will pass along the error notification to all observers.
\ No newline at end of file
+Any observers which subscribe after the source sequence terminates will immediately receive the stored value. If the source sequence terminates with an error, [`ResourceSubject`] will not emit any values but will pass along the error notification to all observers.
+
+
+[`ResourceSubject`]: xref:Bonsai.Reactive.ResourceSubject
\ No newline at end of file
diff --git a/articles/subject-subscribe.md b/articles/subject-subscribe.md
index 852430a0..fce0a0d0 100644
--- a/articles/subject-subscribe.md
+++ b/articles/subject-subscribe.md
@@ -3,11 +3,14 @@ uid: subject-subscribe
title: "SubscribeSubject"
---
-The `SubscribeSubject` operator is essentially a source which accesses a subject with the specified name, at the same scope level or above, and subscribes to it. The behavior of `SubscribeSubject` is defined by the type of the subject which is accessed, and values from the shared underlying sequence will then be passed to any operators downstream from `SubscribeSubject`, as if these operators were connected to the subject directly.
+The [`SubscribeSubject`] operator is essentially a source which accesses a subject with the specified name, at the same scope level or above, and subscribes to it. The behavior of [`SubscribeSubject`] is defined by the type of the subject which is accessed, and values from the shared underlying sequence will then be passed to any operators downstream from [`SubscribeSubject`], as if these operators were connected to the subject directly.
:::workflow

:::
> [!Note]
-> If the definition of the underlying subject changes, there is no need to change the `SubscribeSubject` as long as the name remains the same.
\ No newline at end of file
+> If the definition of the underlying subject changes, there is no need to change the [`SubscribeSubject`] as long as the name remains the same.
+
+
+[`SubscribeSubject`]: xref:Bonsai.Expressions.SubscribeSubject
\ No newline at end of file
diff --git a/articles/workflow-guidelines.md b/articles/workflow-guidelines.md
index 57e3af9e..a361ee89 100644
--- a/articles/workflow-guidelines.md
+++ b/articles/workflow-guidelines.md
@@ -9,7 +9,7 @@ This section offers guidelines and design patterns to consider when developing w
## Workflow Organization
:::do
-use `GroupWorkflow` nodes to separate independent functionality (e.g. acquisition, visualization and processing).
+use [`GroupWorkflow`] nodes to separate independent functionality (e.g. acquisition, visualization and processing).
:::
:::avoid
@@ -25,11 +25,11 @@ prefer using subjects over branches when sharing sequences across independent se
:::
:::do
-use a [`BehaviorSubject`](xref:Bonsai.Reactive.BehaviorSubject) to share global state which can be accessed by multiple consumers and modified by multiple producers.
+use a [`BehaviorSubject`] to share global state which can be accessed by multiple consumers and modified by multiple producers.
:::
:::avoid
-using [`MulticastSubject`](xref:Bonsai.Expressions.MulticastSubject) on variables which are not declared as [`BehaviorSubject`](xref:Bonsai.Reactive.BehaviorSubject). This will prevent accidental termination of the subject sequence if a producer terminates prematurely.
+using [`MulticastSubject`] on variables which are not declared as [`BehaviorSubject`]. This will prevent accidental termination of the subject sequence if a producer terminates prematurely.
:::
:::consider
@@ -42,20 +42,29 @@ moving all subject declarations to the top of the workflow. This will make sure
## Nested Operators
-Several reactive operators require specification of a nested workflow, e.g. [`SelectMany`](xref:Bonsai.Reactive.SelectMany) or [`CreateObservable`](xref:Bonsai.Reactive.CreateObservable). The operator itself will control when the nested workflow is initialized and subscribed to. If it is possible for a nested workflow to be executed multiple times, potentially in parallel, we call the operator *reentrant*. Some care is necessary to understand how to manage shared state and properties inside a reentrant nested operator.
+Several reactive operators require specification of a nested workflow, e.g. [`SelectMany`] or [`CreateObservable`]. The operator itself will control when the nested workflow is initialized and subscribed to. If it is possible for a nested workflow to be executed multiple times, potentially in parallel, we call the operator *reentrant*. Some care is necessary to understand how to manage shared state and properties inside a reentrant nested operator.
:::do
-use an [`AsyncSubject`](xref:Bonsai.Reactive.AsyncSubject) to share workflow input data inside a nested operator.
+use an [`AsyncSubject`] to share workflow input data inside a nested operator.
:::
:::avoid
-using [`PropertyMapping`](xref:Bonsai.Expressions.PropertyMappingBuilder) nodes inside reentrant nested operators.
+using [`PropertyMapping`] nodes inside reentrant nested operators.
:::
## Property Initialization
:::donot
-branch a source sequence to share the same value across different [`PropertyMapping`](xref:Bonsai.Expressions.PropertyMappingBuilder) nodes. This can introduce a race condition for operators that use property values at subscribe time.
+branch a source sequence to share the same value across different [`PropertyMapping`] nodes. This can introduce a race condition for operators that use property values at subscribe time.
:::
-Alternatively, you can either share the value using a subject, or branch after the `PropertyMapping` node (if both the value to share and the name of the property in each node are identical).
+Alternatively, you can either share the value using a subject, or branch after the [`PropertyMapping`] node (if both the value to share and the name of the property in each node are identical).
+
+
+[`AsyncSubject`]: xref:Bonsai.Reactive.AsyncSubject
+[`BehaviorSubject`]: xref:Bonsai.Reactive.BehaviorSubject
+[`CreateObservable`]: xref:Bonsai.Reactive.CreateObservable
+[`GroupWorkflow`]: xref:Bonsai.Expressions.GroupWorkflowBuilder
+[`MulticastSubject`]: xref:Bonsai.Expressions.MulticastSubject
+[`PropertyMapping`]: xref:Bonsai.Expressions.PropertyMappingBuilder
+[`SelectMany`]: xref:Bonsai.Reactive.SelectMany
\ No newline at end of file
diff --git a/docfx.json b/docfx.json
index b4b69afa..88654648 100644
--- a/docfx.json
+++ b/docfx.json
@@ -109,7 +109,10 @@
"https://horizongir.github.io/opencv.net/xrefmap.yml",
"https://horizongir.github.io/ZedGraph/xrefmap.yml",
"https://horizongir.github.io/opentk/xrefmap.yml",
- "https://horizongir.github.io/reactive/xrefmap.yml"
+ "https://horizongir.github.io/reactive/xrefmap.yml",
+ "https://bonsai-rx.org/ironpython-scripting/xrefmap.yml",
+ "https://bonsai-rx.org/ephys/xrefmap.yml",
+ "https://bonsai-rx.org/numerics/xrefmap.yml"
]
}
}
\ No newline at end of file
diff --git a/tutorials/acquisition.md b/tutorials/acquisition.md
index 62656d58..68802e35 100644
--- a/tutorials/acquisition.md
+++ b/tutorials/acquisition.md
@@ -13,9 +13,9 @@ Bonsai can be used to acquire and record data from many different devices. The e

:::
-- Insert a [`CameraCapture`](xref:Bonsai.Vision.CameraCapture) source.
-- Insert a [`VideoWriter`](xref:Bonsai.Vision.VideoWriter) sink.
-- Configure the `FileName` property of the `VideoWriter` operator with a file name ending in `.avi`.
+- Insert a [`CameraCapture`] source.
+- Insert a [`VideoWriter`] sink.
+- Configure the `FileName` property of the [`VideoWriter`] operator with a file name ending in `.avi`.
- Run the workflow and check that it generates a valid video file.
### **Exercise 2:** Saving a grayscale video
@@ -24,7 +24,7 @@ Bonsai can be used to acquire and record data from many different devices. The e

:::
-- Insert a [`Grayscale`](xref:Bonsai.Vision.Grayscale) transform between `CameraCapture` and `VideoWriter`.
+- Insert a [`Grayscale`] transform between [`CameraCapture`] and [`VideoWriter`].
- Run the workflow. The output should now be a grayscale movie.
- How would you modify the workflow to record **both** a colour and a grayscale movie?
@@ -38,10 +38,10 @@ Audio data is captured at much higher temporal sampling frequencies than video.

:::
-- Insert an [`AudioCapture`](xref:Bonsai.Audio.AudioCapture) source.
-- Insert an [`AudioWriter`](xref:Bonsai.Audio.AudioWriter) sink.
-- Configure the `FileName` property of the `AudioWriter` operator with a file name ending in `.wav`.
-- Make sure that the [`SampleRate`](xref:Bonsai.Audio.AudioWriter.SampleRate) property of the `AudioWriter` matches the frequency of audio capture.
+- Insert an [`AudioCapture`] source.
+- Insert an [`AudioWriter`] sink.
+- Configure the `FileName` property of the [`AudioWriter`] operator with a file name ending in `.wav`.
+- Make sure that the `SampleRate` property of the [`AudioWriter`] matches the frequency of audio capture.
- Run the workflow for a few seconds. Play back the file in your favorite media player to check that it is a valid audio file.
### **Exercise 4:** Saving raw binary waveform data
@@ -50,8 +50,8 @@ Audio data is captured at much higher temporal sampling frequencies than video.

:::
-- Replace the `AudioWriter` operator with a [`MatrixWriter`](xref:Bonsai.Dsp.MatrixWriter) sink.
-- Configure the `Path` property of the `MatrixWriter` operator with a file name ending in `.bin`.
+- Replace the [`AudioWriter`] operator with a [`MatrixWriter`] sink.
+- Configure the `Path` property of the [`MatrixWriter`] operator with a file name ending in `.bin`.
- Run the workflow for a few seconds.
- Open the resulting binary file in MATLAB/Python/R and make a time series plot of the raw waveform samples.
- **MATLAB:** Use the [`fread`](https://www.mathworks.com/help/matlab/ref/fread.html) function to read the binary file. The source data must be set to `int16`.
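For the Python route, the raw `int16` samples can be read back with the standard library alone. The sketch below writes a few synthetic samples in the same flat binary layout and reads them back; the file name is hypothetical, and `numpy.fromfile` with `dtype=np.int16` would work equally well:

```python
import array
import os
import tempfile

# Write a few synthetic int16 samples as flat binary, the same layout
# produced by a raw matrix writer (file name is just for this demo).
samples = array.array("h", [0, 1000, -1000, 32767, -32768])
path = os.path.join(tempfile.gettempdir(), "samples.bin")
with open(path, "wb") as f:
    samples.tofile(f)

# Read the samples back as int16 ("h" typecode) for plotting/analysis.
data = array.array("h")
with open(path, "rb") as f:
    data.fromfile(f, len(samples))
assert list(data) == [0, 1000, -1000, 32767, -32768]
```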
@@ -63,20 +63,20 @@ Audio data is captured at much higher temporal sampling frequencies than video.

:::
-- Insert an [`AudioReader`](xref:Bonsai.Audio.AudioReader) source.
-- Configure the [`FileName`](xref:Bonsai.Audio.AudioReader.FileName) property to point to the audio file you recorded in _Exercise 3_.
-- Insert an [`AudioPlayback`](xref:Bonsai.Audio.AudioPlayback) sink.
+- Insert an [`AudioReader`] source.
+- Configure the `FileName` property to point to the audio file you recorded in _Exercise 3_.
+- Insert an [`AudioPlayback`] sink.
- Run the workflow and check the sound is played correctly.
:::workflow

:::
-- Insert a [`KeyDown`](xref:Bonsai.Windows.Input.KeyDown) source.
-- Set the [`BufferLength`](xref:Bonsai.Audio.AudioReader.BufferLength) property of the `AudioReader` to zero, so that all audio data is read into a single buffer.
-- Combine the key press with the audio data using the [`WithLatestFrom`](xref:Bonsai.Reactive.WithLatestFrom) combinator.
-- Right-click the `WithLatestFrom` operator. Select the `Tuple` > `Item2` member from the context menu.
-- Move the `AudioPlayback` sink so that it follows the selected `Item2` member.
+- Insert a [`KeyDown`] source.
+- Set the `BufferLength` property of the [`AudioReader`] to zero, so that all audio data is read into a single buffer.
+- Combine the key press with the audio data using the [`WithLatestFrom`] combinator.
+- Right-click the [`WithLatestFrom`] operator. Select the `Tuple` > `Item2` member from the context menu.
+- Move the [`AudioPlayback`] sink so that it follows the selected `Item2` member.
- Run the workflow and press a key. What happens if you press the key several times?
## Arduino Acquisition
@@ -94,12 +94,12 @@ In order to communicate and interact with an Arduino using Bonsai, you must prog

:::
-- Insert an [`AnalogInput`](xref:Bonsai.Arduino.AnalogInput) source.
-- Configure the [`PortName`](xref:Bonsai.Arduino.AnalogInput.PortName) property to point to the correct serial port where the Arduino is connected.
+- Insert an [`AnalogInput`] source.
+- Configure the `PortName` property to point to the correct serial port where the Arduino is connected.
- Run the workflow and visualize the output of the analog source. What do you see?
- **Optional:** Connect a sensor to the analog input pin, e.g. a potentiometer or a button.
-- Insert a [`CsvWriter`](xref:Bonsai.IO.CsvWriter) sink. This operator records input data into a text file.
-- Configure the [`FileName`](xref:Bonsai.IO.CsvWriter.FileName) property of the `CsvWriter` operator with a file name ending in `.csv`.
+- Insert a [`CsvWriter`] sink. This operator records input data into a text file.
+- Configure the `FileName` property of the [`CsvWriter`] operator with a file name ending in `.csv`.
- Run the workflow, record some interesting signal, and then open the resulting text data file.
### **Exercise 7:** Control an LED
@@ -108,12 +108,12 @@ In order to communicate and interact with an Arduino using Bonsai, you must prog

:::
-- Insert a [`Boolean`](xref:Bonsai.Expressions.BooleanProperty) source.
-- Insert a [`DigitalOutput`](xref:Bonsai.Arduino.DigitalOutput) sink.
-- Set the [`Pin`](xref:Bonsai.Arduino.DigitalOutput.Pin) property of the `DigitalOutput` operator to 13.
-- Configure the [`PortName`](xref:Bonsai.Arduino.DigitalOutput.PortName) property.
-- Run the workflow and change the `Value` property of the `Boolean` operator.
-- **Optional:** Use your mouse to control the LED! Replace the `Boolean` operator by a [`MouseMove`](xref:Bonsai.Windows.Input.MouseMove) source (hint: use [`GreaterThan`](xref:Bonsai.Expressions.GreaterThanBuilder), [`LessThan`](xref:Bonsai.Expressions.LessThanBuilder), or equivalent operators to connect one of the mouse axis to `DigitalOutput`).
+- Insert a [`Boolean`] source.
+- Insert a [`DigitalOutput`] sink.
+- Set the `Pin` property of the [`DigitalOutput`] operator to 13.
+- Configure the `PortName` property.
+- Run the workflow and change the `Value` property of the [`Boolean`] operator.
+- **Optional:** Use your mouse to control the LED! Replace the [`Boolean`] operator by a [`MouseMove`] source (hint: use [`GreaterThan`], [`LessThan`], or equivalent operators to connect one of the mouse axes to [`DigitalOutput`]).
### **Exercise 8:** Control a servo motor
@@ -121,13 +121,13 @@ In order to communicate and interact with an Arduino using Bonsai, you must prog

:::
-- Insert a [`Timer`](xref:Bonsai.Reactive.Timer) source. Set its [`Period`](xref:Bonsai.Reactive.Timer.Period) property to 500 ms.
-- Insert a [`Take`](xref:Bonsai.Reactive.Take) operator. Set its [`Count`](xref:Bonsai.Reactive.Take.Count) property to 10.
-- Insert a [`Rescale`](xref:Bonsai.Dsp.Rescale) operator. Set its [`Max`](xref:Bonsai.Dsp.Rescale.Max) property to 10, and its [`RangeMax`](xref:Bonsai.Dsp.Rescale.RangeMax) property to 180.
-- Insert a [`Repeat`](xref:Bonsai.Reactive.Repeat) operator.
-- Insert a [`ServoOutput`](xref:Bonsai.Arduino.ServoOutput) sink.
-- Set the [`Pin`](xref:Bonsai.Arduino.ServoOutput.Pin) property of the `ServoOutput` operator to 9.
-- Configure the [`PortName`](xref:Bonsai.Arduino.ServoOutput.PortName) property.
+- Insert a [`Timer`] source. Set its `Period` property to 500 ms.
+- Insert a [`Take`] operator. Set its `Count` property to 10.
+- Insert a [`Rescale`] operator. Set its `Max` property to 10, and its `RangeMax` property to 180.
+- Insert a [`Repeat`] operator.
+- Insert a [`ServoOutput`] sink.
+- Set the `Pin` property of the [`ServoOutput`] operator to 9.
+- Configure the `PortName` property.
- Connect a servo motor to the Arduino pin 9 and run the workflow. Can you explain the behaviour of the servo?
- **Optional:** Make the servo sweep back and forth.
@@ -141,10 +141,10 @@ Bonsai allows processing captured raw video data to extract real-time measures o

:::
-- Insert a [`CameraCapture`](xref:Bonsai.Vision.CameraCapture) source.
-- Insert a [`RangeThreshold`](xref:Bonsai.Vision.RangeThreshold) transform.
-- Open the visualizer for the `RangeThreshold` operator.
-- Configure the [`Lower`](xref:Bonsai.Vision.RangeThreshold.Lower) and [`Upper`](xref:Bonsai.Vision.RangeThreshold.Upper) properties of the `RangeThreshold` to isolate your coloured object (hint: click the small arrow to the left of each property to expand their individual values).
+- Insert a [`CameraCapture`] source.
+- Insert a [`RangeThreshold`] transform.
+- Open the visualizer for the [`RangeThreshold`] operator.
+- Configure the `Lower` and `Upper` properties of the [`RangeThreshold`] to isolate your coloured object (hint: click the small arrow to the left of each property to expand their individual values).
This method segments coloured objects by setting boundaries directly on the BGR colour space. This colour space is considered a poor choice for colour segmentation. Can you see why?
@@ -152,9 +152,9 @@ This method segments coloured objects by setting boundaries directly on the BGR

:::
-- Replace the `RangeThreshold` operator by a [`ConvertColor`](xref:Bonsai.Vision.ConvertColor) transform. This node converts the image from the BGR colour space to the [Hue-Saturation-Value (HSV) colour space](https://en.wikipedia.org/wiki/HSL_and_HSV).
-- Insert an [`HsvThreshold`](xref:Bonsai.Vision.HsvThreshold) transform.
-- Configure the [`Lower`](xref:Bonsai.Vision.HsvThreshold.Lower) and [`Upper`](xref:Bonsai.Vision.HsvThreshold.Upper) properties of the `HsvThreshold` to isolate the object.
+- Replace the [`RangeThreshold`] operator by a [`ConvertColor`] transform. This node converts the image from the BGR colour space to the [Hue-Saturation-Value (HSV) colour space](https://en.wikipedia.org/wiki/HSL_and_HSV).
+- Insert an [`HsvThreshold`] transform.
+- Configure the `Lower` and `Upper` properties of the [`HsvThreshold`] to isolate the object.
- Test the resulting tracking under different illumination conditions.
### **Exercise 10:** Real-time position tracking
@@ -163,11 +163,11 @@ This method segments coloured objects by setting boundaries directly on the BGR

:::
-- Starting with the workflow from the previous exercise, insert a [`FindContours`](xref:Bonsai.Vision.FindContours) transform. This operator traces the contours of all the objects in a black-and-white image. An _object_ is defined as a region of connected white pixels.
-- Insert a [`BinaryRegionAnalysis`](xref:Bonsai.Vision.BinaryRegionAnalysis) transform. This node calculates the area, center of mass, and orientation for all the detected contours.
-- Insert a [`LargestBinaryRegion`](xref:Bonsai.Vision.LargestBinaryRegion) transform to extract the largest detected object in the image.
+- Starting with the workflow from the previous exercise, insert a [`FindContours`] transform. This operator traces the contours of all the objects in a black-and-white image. An _object_ is defined as a region of connected white pixels.
+- Insert a [`BinaryRegionAnalysis`] transform. This node calculates the area, center of mass, and orientation for all the detected contours.
+- Insert a [`LargestBinaryRegion`] transform to extract the largest detected object in the image.
- Select the `ConnectedComponent` > `Centroid` field of the largest binary region using the context menu.
-- Record the position of the centroid using a [`CsvWriter`](xref:Bonsai.IO.CsvWriter) sink.
+- Record the position of the centroid using a [`CsvWriter`] sink.
- **Optional:** Open the CSV file in Excel/Python/MATLAB/R and plot the trajectory of the object.
### **Exercise 11:** Background subtraction and motion segmentation
@@ -177,11 +177,11 @@ This method segments coloured objects by setting boundaries directly on the BGR
:::
- Create a grayscale video workflow similar to _Exercise 2_.
-- Insert a [`Skip`](xref:Bonsai.Reactive.Skip) operator. Set its `Count` property to 1.
-- In a new branch, insert a [`Take`](xref:Bonsai.Reactive.Take) operator. Set its `Count` property to 1.
-- Combine the images from both branches using the [`CombineLatest`](xref:Bonsai.Reactive.CombineLatest) combinator.
-- Insert the [`AbsoluteDifference`](xref:Bonsai.Dsp.AbsoluteDifference) transform after `CombineLatest`.
-- Insert a [`Threshold`](xref:Bonsai.Vision.Threshold) transform. Visualize the node output and adjust the [`ThresholdValue`](xref:Bonsai.Vision.Threshold.ThresholdValue) property.
+- Insert a [`Skip`] operator. Set its `Count` property to 1.
+- In a new branch, insert a [`Take`] operator. Set its `Count` property to 1.
+- Combine the images from both branches using the [`CombineLatest`] combinator.
+- Insert the [`AbsoluteDifference`] transform after [`CombineLatest`].
+- Insert a [`Threshold`] transform. Visualize the node output and adjust the `ThresholdValue` property.
_Describe in your own words what the above workflow is doing._
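The first-frame differencing built from `Take`, `Skip`, `CombineLatest`, `AbsoluteDifference` and `Threshold` can be sketched in plain Python, treating each image as a flat list of grayscale pixel values (the frames and threshold below are made up for illustration):

```python
# Illustrative frames as flat lists of grayscale pixel values.
first = [10, 10, 200, 10]        # Take(1): the fixed reference frame
current = [12, 90, 205, 10]      # Skip(1): every subsequent frame

def segment(reference, frame, threshold=50):
    # CombineLatest pairs each new frame with the latest (here: only)
    # reference; AbsoluteDifference and Threshold then mark what changed.
    diff = [abs(a - b) for a, b in zip(frame, reference)]
    return [255 if d > threshold else 0 for d in diff]

print(segment(first, current))  # [0, 255, 0, 0]
```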
@@ -189,8 +189,8 @@ _Describe in your own words what the above workflow is doing._

:::
-- Replace the `CombineLatest` operator with the [`Zip`](xref:Bonsai.Reactive.Zip) combinator.
-- Delete the `Take` operator.
+- Replace the [`CombineLatest`] operator with the [`Zip`] combinator.
+- Delete the [`Take`] operator.
_Describe in your own words what the above modified workflow is doing._
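The modified pipeline can be sketched the same way: with `Zip` and the `Take` removed, each frame is paired with the frame immediately before it rather than with a fixed reference (frames below are illustrative):

```python
# Consecutive frame differencing: Zip pairs the stream with a one-frame
# delayed copy of itself, so the reference is always the previous frame.
frames = [[10, 10, 200], [12, 90, 205], [12, 91, 40]]

def consecutive_differences(frames, threshold=50):
    # zip(frames, frames[1:]) mirrors Zip applied to the stream and the
    # same stream after Skip(1).
    for previous, current in zip(frames, frames[1:]):
        diff = [abs(a - b) for a, b in zip(current, previous)]
        yield [255 if d > threshold else 0 for d in diff]

print(list(consecutive_differences(frames)))  # [[0, 255, 0], [0, 0, 255]]
```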
@@ -201,12 +201,49 @@ _Describe in your own words what the above modified workflow is doing._
:::
- Create a grayscale video stream similar to _Exercise 2_.
-- Insert a [`BackgroundSubtraction`](xref:Bonsai.Vision.BackgroundSubtraction) transform. Set its `AdaptationRate` property to 1.
-- Insert a [`Sum`](xref:Bonsai.Dsp.Sum) operator. This operator will sum the values of all the pixels in the image.
-- Run the workflow, point the camera at a moving object and visualize the output of the `Sum` operator. Compare small movements to big movements. What happens to the signal when the object holds perfectly still?
-- Right-click the `Sum` operator. Select the `Scalar` > `Val0` member from the context menu.
+- Insert a [`BackgroundSubtraction`] transform. Set its `AdaptationRate` property to 1.
+- Insert a [`Sum`] operator. This operator will sum the values of all the pixels in the image.
+- Run the workflow, point the camera at a moving object and visualize the output of the [`Sum`] operator. Compare small movements to big movements. What happens to the signal when the object holds perfectly still?
+- Right-click the [`Sum`] operator. Select the `Scalar` > `Val0` member from the context menu.
> [!Note]
-> The `Sum` operator sums the pixel values across all image colour channels. However, in the case of grayscale binary images, there is only one active channel and its sum is stored in the [`Val0`](xref:OpenCV.Net.Scalar.Val0) field.
-
-- Record the motion of an object using a [`CsvWriter`](xref:Bonsai.IO.CsvWriter) sink.
+> The [`Sum`] operator sums the pixel values across all image colour channels. However, in the case of grayscale binary images, there is only one active channel and its sum is stored in the [`Val0`](xref:OpenCV.Net.Scalar.Val0) field.
+
+- Record the motion of an object using a [`CsvWriter`] sink.
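The "motion energy" measure produced by this workflow can be sketched as a toy version in which the background is simply a fixed reference frame (how `AdaptationRate` shapes the real background model is glossed over here; frames and threshold are illustrative):

```python
# Toy motion-energy measure: subtract a fixed background frame, mark the
# changed pixels, and sum them, mirroring BackgroundSubtraction followed by
# Sum and the Val0 member selector.
background = [10, 10, 10, 10]
frames = [[10, 10, 10, 10], [10, 60, 80, 10], [10, 11, 10, 10]]

def motion_energy(frame, background, threshold=20):
    # Pixels close to the background become 0, the rest 255.
    mask = [255 if abs(p - b) > threshold else 0
            for p, b in zip(frame, background)]
    return sum(mask)  # Val0: the sum over the single grayscale channel

print([motion_energy(f, background) for f in frames])  # [0, 510, 0]
```

Large movements light up many pixels and produce a large sum; a perfectly still object produces zero.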
+
+
+[`AbsoluteDifference`]: xref:Bonsai.Dsp.AbsoluteDifference
+[`AnalogInput`]: xref:Bonsai.Arduino.AnalogInput
+[`AudioCapture`]: xref:Bonsai.Audio.AudioCapture
+[`AudioPlayback`]: xref:Bonsai.Audio.AudioPlayback
+[`AudioReader`]: xref:Bonsai.Audio.AudioReader
+[`AudioWriter`]: xref:Bonsai.Audio.AudioWriter
+[`BackgroundSubtraction`]: xref:Bonsai.Vision.BackgroundSubtraction
+[`BinaryRegionAnalysis`]: xref:Bonsai.Vision.BinaryRegionAnalysis
+[`Boolean`]: xref:Bonsai.Expressions.BooleanProperty
+[`CameraCapture`]: xref:Bonsai.Vision.CameraCapture
+[`CombineLatest`]: xref:Bonsai.Reactive.CombineLatest
+[`ConvertColor`]: xref:Bonsai.Vision.ConvertColor
+[`CsvWriter`]: xref:Bonsai.IO.CsvWriter
+[`DigitalOutput`]: xref:Bonsai.Arduino.DigitalOutput
+[`FindContours`]: xref:Bonsai.Vision.FindContours
+[`Grayscale`]: xref:Bonsai.Vision.Grayscale
+[`GreaterThan`]: xref:Bonsai.Expressions.GreaterThanBuilder
+[`HsvThreshold`]: xref:Bonsai.Vision.HsvThreshold
+[`KeyDown`]: xref:Bonsai.Windows.Input.KeyDown
+[`LargestBinaryRegion`]: xref:Bonsai.Vision.LargestBinaryRegion
+[`LessThan`]: xref:Bonsai.Expressions.LessThanBuilder
+[`MatrixWriter`]: xref:Bonsai.Dsp.MatrixWriter
+[`MouseMove`]: xref:Bonsai.Windows.Input.MouseMove
+[`RangeThreshold`]: xref:Bonsai.Vision.RangeThreshold
+[`Repeat`]: xref:Bonsai.Reactive.Repeat
+[`Rescale`]: xref:Bonsai.Dsp.Rescale
+[`ServoOutput`]: xref:Bonsai.Arduino.ServoOutput
+[`Skip`]: xref:Bonsai.Reactive.Skip
+[`Sum`]: xref:Bonsai.Dsp.Sum
+[`Take`]: xref:Bonsai.Reactive.Take
+[`Threshold`]: xref:Bonsai.Vision.Threshold
+[`Timer`]: xref:Bonsai.Reactive.Timer
+[`VideoWriter`]: xref:Bonsai.Vision.VideoWriter
+[`WithLatestFrom`]: xref:Bonsai.Reactive.WithLatestFrom
+[`Zip`]: xref:Bonsai.Reactive.Zip
\ No newline at end of file
diff --git a/tutorials/closed-loop.md b/tutorials/closed-loop.md
index 1e191633..d65245b5 100644
--- a/tutorials/closed-loop.md
+++ b/tutorials/closed-loop.md
@@ -18,14 +18,14 @@ The easiest way to measure the latency of a closed-loop system is to use a digit
:::
- Connect the digital pin 8 on the Arduino to digital pin 13 using a jumper wire.
-- Insert a [`DigitalInput`](xref:Bonsai.Arduino.DigitalInput) source and set its `Pin` property to 8.
-- Insert a [`BitwiseNot`](xref:Bonsai.Expressions.BitwiseNotBuilder) transform.
-- Insert a [`DigitalOutput`](xref:Bonsai.Arduino.DigitalOutput) sink and configure its `Pin` property to pin 13.
-- Insert a [`TimeInterval`](xref:Bonsai.Reactive.TimeInterval) operator.
-- Right-click on the [`TimeInterval`](xref:Bonsai.Reactive.TimeInterval) operator and select `Output` > `Interval` > `TotalMilliseconds`.
+- Insert a [`DigitalInput`] source and set its `Pin` property to 8.
+- Insert a [`BitwiseNot`] transform.
+- Insert a [`DigitalOutput`] sink and configure its `Pin` property to pin 13.
+- Insert a [`TimeInterval`] operator.
+- Right-click on the [`TimeInterval`] operator and select `Output` > `Interval` > `TotalMilliseconds`.
> [!Note]
-> The [`TimeInterval`](xref:Bonsai.Reactive.TimeInterval) operator measures the interval between consecutive events in an observable sequence using the [high-precision event timer (HPET)](https://en.wikipedia.org/wiki/High_Precision_Event_Timer) in the computer. The HPET has a frequency of at least 10MHz, allowing us to accurately time intervals with sub-microsecond precision.
+> The [`TimeInterval`] operator measures the interval between consecutive events in an observable sequence using the [high-precision event timer (HPET)](https://en.wikipedia.org/wiki/High_Precision_Event_Timer) in the computer. The HPET has a frequency of at least 10 MHz, allowing us to accurately time intervals with sub-microsecond precision.
- Run the workflow and measure the round-trip time between digital input messages.
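What `TimeInterval` does can be sketched with a high-resolution clock in plain Python: timestamp consecutive events and report the elapsed interval in milliseconds (the simulated event stream below is illustrative):

```python
import time

def time_intervals(events):
    # Sketch of TimeInterval: measure the gap between consecutive events
    # using a high-resolution monotonic clock.
    last = None
    for event in events:
        now = time.perf_counter()
        if last is not None:
            yield (now - last) * 1000.0  # TotalMilliseconds
        last = now

def event_source():
    # Two simulated "digital input" messages roughly 5 ms apart.
    yield "HIGH"
    time.sleep(0.005)
    yield "LOW"

intervals = list(time_intervals(event_source()))
print(intervals)  # one interval of roughly 5 ms
```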
@@ -36,28 +36,28 @@ The easiest way to measure the latency of a closed-loop system is to use a digit
:::
- Connect a red LED to Arduino digital pin 13.
-- Insert a [`CameraCapture`](xref:Bonsai.Vision.CameraCapture) source.
-- Insert a [`Crop`](xref:Bonsai.Vision.Crop) transform.
+- Insert a [`CameraCapture`] source.
+- Insert a [`Crop`] transform.
- Run the workflow and set the `RegionOfInterest` property to a small area around the LED.
> [!Tip]
-> You can use the visual editor for an easier calibration. While the workflow is running, right-click on the [`Crop`](xref:Bonsai.Vision.Crop) transform and select `Show Default Editor` from the context menu or click in the small button with ellipsis that appears when you select the `RegionOfInterest` property.
+> You can use the visual editor for an easier calibration. While the workflow is running, right-click on the [`Crop`] transform and select `Show Default Editor` from the context menu, or click the small ellipsis button that appears when you select the `RegionOfInterest` property.
-- Insert a [`Sum`](xref:Bonsai.Dsp.Sum) transform and select the `Val2` field from the output.
+- Insert a [`Sum`] transform and select the `Val2` field from the output.
> [!Note]
-> The [`Sum`](xref:Bonsai.Dsp.Sum) operator adds the value of all the pixels in the image together, across all the color channels. Assuming the default BGR format, the result of summing all the pixels in the Red channel of the image will be stored in `Val2`. `Val0` and `Val1` would store the Blue and Green values, respectively. If you are using an LED with a color other than Red, please select the output field accordingly.
+> The [`Sum`] operator adds the value of all the pixels in the image together, across all the color channels. Assuming the default BGR format, the result of summing all the pixels in the Red channel of the image will be stored in `Val2`. `Val0` and `Val1` would store the Blue and Green values, respectively. If you are using an LED with a color other than Red, please select the output field accordingly.
-- Insert a [`GreaterThan`](xref:Bonsai.Expressions.GreaterThanBuilder) transform.
-- Insert a [`BitwiseNot`](xref:Bonsai.Expressions.BitwiseNotBuilder) transform.
-- Insert a [`DigitalOutput`](xref:Bonsai.Arduino.DigitalOutput) sink and configure its `Pin` property to pin 13.
-- Run the workflow and use the visualizer of the `Sum` operator to choose an appropriate threshold for [`GreaterThan`](xref:Bonsai.Expressions.GreaterThanBuilder). When connected to pin 13, the LED should flash a couple of times when the Arduino is first connected.
-- Insert a [`DistinctUntilChanged`](xref:Bonsai.Reactive.DistinctUntilChanged) operator after the [`BitwiseNot`](xref:Bonsai.Expressions.BitwiseNotBuilder) transform.
+- Insert a [`GreaterThan`] transform.
+- Insert a [`BitwiseNot`] transform.
+- Insert a [`DigitalOutput`] sink and configure its `Pin` property to pin 13.
+- Run the workflow and use the visualizer of the [`Sum`] operator to choose an appropriate threshold for [`GreaterThan`]. When connected to pin 13, the LED should flash a couple of times when the Arduino is first connected.
+- Insert a [`DistinctUntilChanged`] operator after the [`BitwiseNot`] transform.
> [!Note]
-> The `DistinctUntilChanged` operator filters consecutive duplicate items from an observable sequence. In this case, we want to change the value of the LED only when the threshold output changes from `LOW` to `HIGH`, or vice-versa. This will let us measure correctly the latency between detecting a change in the input and measuring the response to that change.
+> The [`DistinctUntilChanged`] operator filters consecutive duplicate items from an observable sequence. In this case, we want to change the value of the LED only when the threshold output changes from `LOW` to `HIGH`, or vice-versa. This will let us correctly measure the latency between detecting a change in the input and measuring the response to that change.
-- Insert the [`TimeInterval`](xref:Bonsai.Reactive.TimeInterval) operator and select `Output` > `Interval` > `TotalMilliseconds`.
+- Insert the [`TimeInterval`] operator and select `Output` > `Interval` > `TotalMilliseconds`.
- Run the workflow and measure the round-trip time between LED triggers.
_Given the measurements obtained in Exercise 2, what would you estimate is the **input** latency for video acquisition?_
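The behaviour of `DistinctUntilChanged` is easy to state in plain Python: drop an item whenever it equals the item that came immediately before it.

```python
def distinct_until_changed(source):
    # Filter consecutive duplicates from a sequence, as DistinctUntilChanged
    # does for an observable stream.
    sentinel = object()
    previous = sentinel
    for item in source:
        if item != previous:
            yield item
        previous = item

# Thresholded LED state per frame: only the LOW<->HIGH transitions survive.
states = [False, False, True, True, True, False, True]
print(list(distinct_until_changed(states)))  # [False, True, False, True]
```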
@@ -70,20 +70,20 @@ _Given the measurements obtained in Exercise 2, what would you estimate is the *

:::
-- Insert a [`CameraCapture`](xref:Bonsai.Vision.CameraCapture) source.
-- Insert a [`Crop`](xref:Bonsai.Vision.Crop) transform.
+- Insert a [`CameraCapture`] source.
+- Insert a [`Crop`] transform.
- Run the workflow and use the `RegionOfInterest` property to specify the desired area.
-- Insert a [`Grayscale`](xref:Bonsai.Vision.Grayscale) and a [`Threshold`](xref:Bonsai.Vision.Threshold) transform (or the color segmentation operators).
-- Insert a [`Sum`](xref:Bonsai.Dsp.Sum) transform, and select the `Val0` field from the output.
-- Insert a [`GreaterThan`](xref:Bonsai.Expressions.GreaterThanBuilder) transform and configure the `Value` property to an appropriate threshold. Remember you can use the visualizers to see what values are coming through the `Sum` and what the result of the [`GreaterThan`](xref:Bonsai.Expressions.GreaterThanBuilder) operator is.
-- Insert the Arduino [`DigitalOutput`](xref:Bonsai.Arduino.DigitalOutput) sink.
-- Set the `Pin` property of the [`DigitalOutput`](xref:Bonsai.Arduino.DigitalOutput) operator to 13.
+- Insert a [`Grayscale`] and a [`Threshold`] transform (or the color segmentation operators).
+- Insert a [`Sum`] transform, and select the `Val0` field from the output.
+- Insert a [`GreaterThan`] transform and configure the `Value` property to an appropriate threshold. Remember you can use the visualizers to see what values are coming through the [`Sum`] operator and what the result of the [`GreaterThan`] operator is.
+- Insert the Arduino [`DigitalOutput`] sink.
+- Set the `Pin` property of the [`DigitalOutput`] operator to 13.
- Configure the `PortName` property.
- Run the workflow and verify that entering the region of interest triggers the Arduino LED.
-- **Optional:** Replace the [`Crop`](xref:Bonsai.Vision.Crop) transform by a [`CropPolygon`](xref:Bonsai.Vision.CropPolygon) to allow for non-rectangular regions.
+- **Optional:** Replace the [`Crop`] transform with a [`CropPolygon`] to allow for non-rectangular regions.
> [!Note]
-> The [`CropPolygon`](xref:Bonsai.Vision.CropPolygon) operator uses the `Regions` property to define multiple, possibly non-rectangular regions. The visual editor is similar to [`Crop`](xref:Bonsai.Vision.Crop), where you draw a rectangular box. However, in [`CropPolygon`](xref:Bonsai.Vision.CropPolygon) you can move the corners of the box by right-clicking _inside_ the box and dragging the cursor to the new position. You can add new points by double-clicking with the left mouse button, and delete points by double-clicking with the right mouse button. You can delete regions by pressing the `Del` key and cycle through selected regions by pressing the `Tab` key.
+> The [`CropPolygon`] operator uses the `Regions` property to define multiple, possibly non-rectangular regions. The visual editor is similar to [`Crop`], where you draw a rectangular box. However, in [`CropPolygon`] you can move the corners of the box by right-clicking _inside_ the box and dragging the cursor to the new position. You can add new points by double-clicking with the left mouse button, and delete points by double-clicking with the right mouse button. You can delete regions by pressing the `Del` key and cycle through selected regions by pressing the `Tab` key.
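The trigger logic of this exercise can be reduced to its essence in plain Python: count the active pixels inside a region of interest and compare against a threshold to drive the digital output (the sparse `image` representation below is an illustrative simplification):

```python
def roi_trigger(image, roi, threshold):
    # image: dict mapping (x, y) -> pixel value after Threshold (0 or 255);
    # roi: (x, y, width, height), as in the Crop RegionOfInterest property.
    x0, y0, w, h = roi
    total = sum(v for (x, y), v in image.items()
                if x0 <= x < x0 + w and y0 <= y < y0 + h)  # Sum over the ROI
    return total > threshold  # GreaterThan -> DigitalOutput

# Two thresholded pixels inside the ROI, one far outside it.
image = {(2, 2): 255, (3, 2): 255, (9, 9): 255}
print(roi_trigger(image, (0, 0, 5, 5), 300))  # True: 510 > 300
```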
### **Exercise 4:** Modulating stimulus intensity based on distance to a point
@@ -91,35 +91,35 @@ _Given the measurements obtained in Exercise 2, what would you estimate is the *

:::
-- Insert a [`FunctionGenerator`](xref:Bonsai.Dsp.FunctionGenerator) source.
+- Insert a [`FunctionGenerator`] source.
- Set the `Amplitude` property to 500 and the `Frequency` property to 200.
-- Insert an [`AudioPlayback`](xref:Bonsai.Audio.AudioPlayback) sink.
-- Externalize the `Amplitude` property of the `FunctionGenerator` using the right-click context menu.
+- Insert an [`AudioPlayback`] sink.
+- Externalize the `Amplitude` property of the [`FunctionGenerator`] using the right-click context menu.
-If you run the workflow, you should hear a pure tone coming through the speakers. The `FunctionGenerator` periodically emits buffered waveforms with values ranging between 0 and `Amplitude`, the shape of which changes the properties of the tone. For example, by changing the value of `Amplitude` you can make the sound loud or soft. The next step is to modulate the `Amplitude` property dynamically based on the distance of the object to a target.
+If you run the workflow, you should hear a pure tone coming through the speakers. The [`FunctionGenerator`] periodically emits buffered waveforms with values ranging between 0 and `Amplitude`; the shape of the waveform determines the properties of the tone. For example, by changing the value of `Amplitude` you can make the sound loud or soft. The next step is to modulate the `Amplitude` property dynamically based on the distance of the object to a target.
:::workflow

:::
-- Create a video tracking workflow using `ConvertColor`, `HsvThreshold`, and the `Centroid` operator to directly compute the centre of mass of a colored object.
-- Insert a `Subtract` transform and configure the `Value` property to be some target coordinate in the image.
+- Create a video tracking workflow using [`ConvertColor`], [`HsvThreshold`], and the [`Centroid`] operator to directly compute the centre of mass of a colored object.
+- Insert a [`Subtract`] transform and configure the `Value` property to be some target coordinate in the image.
-The result of the `Subtract` operator will be a vector pointing from the target to the centroid of the largest object. The desired distance from the centroid to the target would be the length of that vector.
+The result of the [`Subtract`] operator will be a vector pointing from the target to the centroid of the largest object. The distance from the centroid to the target is then the length of that vector.
-- Insert an [`ExpressionTransform`](xref:Bonsai.Scripting.Expressions.ExpressionTransform) operator. This node allows you to write small mathematical and logical expressions to transform input values.
-- Right-click on the `ExpressionTransform` operator and select `Show Default Editor`. Set the expression to `Math.Sqrt(X*X + Y*Y)`.
+- Insert an [`ExpressionTransform`] operator. This node allows you to write small mathematical and logical expressions to transform input values.
+- Right-click on the [`ExpressionTransform`] operator and select `Show Default Editor`. Set the expression to `Math.Sqrt(X*X + Y*Y)`.
> [!Note]
> Inside the `Expression` editor you can access any field of the input by name. In this case `X` and `Y` represent the corresponding fields of the [`Point2f`](xref:OpenCV.Net.Point2f) data type. You can check which fields are available by right-clicking the previous node. You can use all the normal arithmetical and logical operators, as well as the mathematical functions available in the [`Math`](xref:System.Math) type. The default expression `it` means "input" and represents the input value itself.
-- Connect the `ExpressionTransform` operator to the externalized `Amplitude` property.
+- Connect the [`ExpressionTransform`] operator to the externalized `Amplitude` property.
- Run the workflow and verify that stimulus intensity is modulated by the distance of the object to the target point.
- **Optional:** Modulate the `Frequency` property instead of `Amplitude`.
-- **Optional:** Use the [`Rescale`](xref:Bonsai.Dsp.Rescale) operator to adjust the gain of the modulation by configuring the `Min`, `Max`, `RangeMax` and `RangeMin` properties. Set the `RescaleType` property to `Clamp` to restrict the output values to an allowed range.
+- **Optional:** Use the [`Rescale`] operator to adjust the gain of the modulation by configuring the `Min`, `Max`, `RangeMax` and `RangeMin` properties. Set the `RescaleType` property to `Clamp` to restrict the output values to an allowed range.
> [!Note]
-> You can specify inverse relationships using `Rescale` if you set the _maximum_ input value to the `Min` property, and the _minimum_ input value to the `Max` property. In this case, a small distance will generate a large output, and a large distance will produce a small output.
+> You can specify inverse relationships using [`Rescale`] if you set the _maximum_ input value to the `Min` property, and the _minimum_ input value to the `Max` property. In this case, a small distance will generate a large output, and a large distance will produce a small output.
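The distance computation and the clamped inverse rescale can be written out in plain Python. The specific target point, ranges and gain below are illustrative choices, not values from the tutorial:

```python
import math

def distance(centroid, target):
    # The ExpressionTransform step: Math.Sqrt(X*X + Y*Y) on the Subtract output.
    dx, dy = centroid[0] - target[0], centroid[1] - target[1]
    return math.sqrt(dx * dx + dy * dy)

def rescale(value, lo, hi, range_lo, range_hi):
    # Linear map [lo, hi] -> [range_lo, range_hi], clamped to the output
    # range, as with RescaleType set to Clamp.
    t = (value - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)
    return range_lo + t * (range_hi - range_lo)

d = distance((320.0, 390.0), (320.0, 240.0))   # 150 pixels from the target
# Inverse mapping: Min holds the maximum distance and Max the minimum, so a
# small distance yields a large amplitude.
print(rescale(d, 300.0, 0.0, 0.0, 500.0))      # 250.0
```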
### **Exercise 5:** Triggering a digital line based on distance between objects
@@ -127,12 +127,12 @@ The result of the `Subtract` operator will be a vector pointing from the target

:::
-- Reproduce the above object tracking workflow using `FindContours` and `BinaryRegionAnalysis`.
-- Insert a `SortBinaryRegions` transform. This operator will sort the list of objects by area, in order of largest to smallest.
+- Reproduce the above object tracking workflow using [`FindContours`] and [`BinaryRegionAnalysis`].
+- Insert a [`SortBinaryRegions`] transform. This operator will sort the list of objects by area, in order of largest to smallest.
To calculate the distance between the two largest objects in every frame, you will need to account for some special cases. Specifically, no object may be detected at all, or the two objects may be touching each other and thus be detected as a single object. You can develop a new operator to perform this calculation.
-- Insert a `PythonTransform` operator. Change the `Script` property to the following code:
+- Insert a [`PythonTransform`] operator. Change the `Script` property to the following code:
```python
from math import sqrt
@@ -155,8 +155,8 @@ def process(value):
return sqrt(d.X * d.X + d.Y * d.Y)
```
-- Insert a `LessThan` transform and configure the `Value` property to an appropriate threshold.
-- Connect the boolean output to Arduino pin 13 using a [`DigitalOutput`](xref:Bonsai.Arduino.DigitalOutput) sink.
+- Insert a [`LessThan`] transform and configure the `Value` property to an appropriate threshold.
+- Connect the boolean output to Arduino pin 13 using a [`DigitalOutput`] sink.
- Run the workflow and verify that the Arduino LED is triggered when the two objects are close together.
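The `PythonTransform` script is only partially visible above. A plain-Python sketch of the distance logic, with the special cases handled, might look like the following; the fallback values for the degenerate cases are an assumption, not the tutorial's own script:

```python
from math import sqrt

def pairwise_distance(centroids):
    # centroids: region centroids sorted largest-first, as produced by
    # SortBinaryRegions. Handle the degenerate cases described above.
    if len(centroids) == 0:
        return float('inf')   # no object detected: treat as "far apart"
    if len(centroids) == 1:
        return 0.0            # touching objects merge into a single region
    (x0, y0), (x1, y1) = centroids[0], centroids[1]
    dx, dy = x0 - x1, y0 - y1
    return sqrt(dx * dx + dy * dy)

print(pairwise_distance([(0.0, 0.0), (3.0, 4.0)]))  # 5.0
```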
### **Exercise 6:** Centring the video on a tracked object
@@ -165,33 +165,33 @@ def process(value):

:::
-- Insert a [`CameraCapture`](xref:Bonsai.Vision.CameraCapture) source.
-- Insert a `WarpAffine` transform. This node applies affine transformations on the input defined by the `Transform` matrix.
-- Externalize the `Transform` property of the `WarpAffine` operator using the right-click context menu.
-- Create an `AffineTransform` source and connect it to the externalized property.
-- Run the workflow and change the values of the `Translation` property while visualizing the output of `WarpAffine`. Notice that the transformation induces a translation in the input image controlled by the values in the property.
+- Insert a [`CameraCapture`] source.
+- Insert a [`WarpAffine`] transform. This node applies affine transformations on the input defined by the `Transform` matrix.
+- Externalize the `Transform` property of the [`WarpAffine`] operator using the right-click context menu.
+- Create an [`AffineTransform`] source and connect it to the externalized property.
+- Run the workflow and change the values of the `Translation` property while visualizing the output of [`WarpAffine`]. Notice that the transformation induces a translation in the input image controlled by the values in the property.
:::workflow

:::
-- In a new branch, create a video tracking pipeline using `ConvertColor`, `HsvThreshold`, and the `Centroid` operator to directly compute the centre of mass of a colored object.
-- Insert a `Negate` transform. This will make the X and Y coordinates of the centroid negative.
+- In a new branch, create a video tracking pipeline using [`ConvertColor`], [`HsvThreshold`], and the [`Centroid`] operator to directly compute the centre of mass of a colored object.
+- Insert a [`Negate`] transform. This will make the X and Y coordinates of the centroid negative.
-We now want to map our negative centroid to the `Translation` property of `AffineTransform`, so that we dynamically translate each frame using the negative position of the object. You can do this by using [property mapping operators](../articles/property-mapping.md).
+We now want to map our negative centroid to the `Translation` property of [`AffineTransform`], so that we dynamically translate each frame using the negative position of the object. You can do this by using [property mapping operators](../articles/property-mapping.md).
-- Insert an `InputMapping` operator.
-- Connect the `InputMapping` to the `AffineTransform` operator.
+- Insert an [`InputMapping`] operator.
+- Connect the [`InputMapping`] to the [`AffineTransform`] operator.
- Open the `PropertyMappings` editor and add a new mapping to the `Translation` property.
- Run the workflow. Verify the object is always placed at position (0,0). What is the problem?
> [!Note]
> Generally for image coordinates, (0,0) is at the top-left corner, and the center will be at coordinates (width/2, height/2), usually (320,240) for images with 640 x 480 resolution.
-- Insert an `Add` transform. This will add a fixed offset to the point. Configure the `Value` property with an offset that will place the object at the image centre, e.g. (320,240).
-- Run the workflow, and verify the output of `WarpAffine` is now a video which is always centred on the tracked object.
-- **Optional**: Insert a [`Crop`](xref:Bonsai.Vision.Crop) transform after `WarpAffine` to select a bounded region around the object.
-- **Optional**: Modify the object tracking workflow to use `FindContours` and `BinaryRegionAnalysis`.
+- Insert an [`Add`] transform. This will add a fixed offset to the point. Configure the `Value` property with an offset that will place the object at the image centre, e.g. (320,240).
+- Run the workflow, and verify the output of [`WarpAffine`] is now a video which is always centred on the tracked object.
+- **Optional**: Insert a [`Crop`] transform after [`WarpAffine`] to select a bounded region around the object.
+- **Optional**: Modify the object tracking workflow to use [`FindContours`] and [`BinaryRegionAnalysis`].
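The centring logic amounts to simple coordinate arithmetic: translate by the negated centroid (`Negate`), then add the image-centre offset (`Add`). A plain-Python check of that arithmetic:

```python
def centring_translation(centroid, image_size=(640, 480)):
    # Negate the centroid and add the image-centre offset (e.g. (320, 240)).
    cx, cy = image_size[0] / 2, image_size[1] / 2
    return (-centroid[0] + cx, -centroid[1] + cy)

def apply_translation(point, translation):
    # What the affine translation does to any pixel coordinate.
    return (point[0] + translation[0], point[1] + translation[1])

t = centring_translation((100.0, 400.0))
print(apply_translation((100.0, 400.0), t))  # (320.0, 240.0): the centre
```

Applying this translation with `WarpAffine` moves the tracked object to the image centre in every frame.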
### **Exercise 7:** Make a robotic camera follow a tracked object
@@ -201,14 +201,14 @@ On this exercise we will use the Pan and Tilt servo motor assembly to make the c

:::
-- Insert a [`CameraCapture`](xref:Bonsai.Vision.CameraCapture) source.
-- Insert nodes to complete a video tracking workflow using `ConvertColor`, `HsvThreshold`, and the `Centroid` operator.
+- Insert a [`CameraCapture`] source.
+- Insert nodes to complete a video tracking workflow using [`ConvertColor`], [`HsvThreshold`], and the [`Centroid`] operator.
- Run the workflow and calibrate the threshold to make sure the colored object is perfectly segmented.
To make the Pan and Tilt servo motors correct the position of the camera, we now need to transform the X and Y values of the centroid, which are in image coordinates, to servo motor commands in degrees. For each frame we will have an incremental error depending on the observed location of the object, i.e. the deviation from the image centre.
-- Right-click the `Centroid` and select `Output` > `X`.
-- Insert a [`Rescale`](xref:Bonsai.Dsp.Rescale) transform and set the `Max` property to 640 (the image width), and the `RangeMin` and `RangeMax` properties to 1 and -1, respectively.
+- Right-click the [`Centroid`] and select `Output` > `X`.
+- Insert a [`Rescale`] transform and set the `Max` property to 640 (the image width), and the `RangeMin` and `RangeMax` properties to 1 and -1, respectively.
The output of this workflow will be a relative error signal indicating how far from the centre, and in which direction, the motor should turn. However, commands to the servo are absolute motor positions in degrees, so we will need to integrate the relative error signals to get the actual position where the servo should be. We also need to respect the servo's operational range (0 to 180 degrees) to avoid damaging the motors. To accomplish this, we will develop a new operator to compute the error-corrected integration before sending the final command to the servos.
@@ -216,7 +216,7 @@ The output of this workflow will be a relative error signal indicating how much

:::
-- Insert a `PythonTransform` operator after `Rescale`. Change the `Script` property to the following code:
+- Insert a [`PythonTransform`] operator after [`Rescale`]. Change the `Script` property to the following code:
```python
position = 90.0
@@ -232,7 +232,7 @@ def process(value):
return position
```
-- Insert a `ServoOutput` sink.
+- Insert a [`ServoOutput`] sink.
- Set the `Pin` property to the Arduino pin where the horizontal Pan motor is connected.
- Configure the `PortName` to the Arduino port where the micro-controller is connected.
- Run the workflow and validate the horizontal position of the motor is adjusted to keep the object in the middle.
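The body of the `PythonTransform` script is elided above; the error-corrected integration it describes can be sketched as follows. The starting position and gain are illustrative assumptions, not the tutorial's own values:

```python
class ServoIntegrator:
    # Accumulate per-frame error into an absolute servo position, clamped
    # to the servo's 0-180 degree operational range.
    def __init__(self, start=90.0, gain=2.0):
        self.position = start  # assumed mid-range starting position
        self.gain = gain       # assumed correction speed

    def process(self, error):
        # error in [-1, 1], as produced by the Rescale step.
        self.position += self.gain * error
        self.position = min(max(self.position, 0.0), 180.0)
        return self.position

servo = ServoIntegrator()
positions = [servo.process(e) for e in [1.0, 1.0, -0.5]]
print(positions)  # [92.0, 94.0, 93.0]
```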
@@ -241,9 +241,42 @@ def process(value):

:::
-- Right-click the `Centroid` and select `Output` > `Y` to create a new branch for the vertical Tilt motor.
-- Insert a [`Rescale`](xref:Bonsai.Dsp.Rescale) transform and set the `Max` property to 480 (the image height), and the `RangeMin` and `RangeMax` properties to -1 and 1, respectively (note these values are swapped from before because in image coordinates zero is at the image top).
-- Copy and paste the `PythonTransform` script from the previous branch.
-- Insert a `ServoOutput` sink and set the `Pin` property to the Arduino pin where the vertical Tilt motor is connected.
+- Right-click the [`Centroid`] and select `Output` > `Y` to create a new branch for the vertical Tilt motor.
+- Insert a [`Rescale`] transform and set the `Max` property to 480 (the image height), and the `RangeMin` and `RangeMax` properties to -1 and 1, respectively (note these values are swapped from before because in image coordinates zero is at the image top).
+- Copy and paste the [`PythonTransform`] script from the previous branch.
+- Insert a [`ServoOutput`] sink and set the `Pin` property to the Arduino pin where the vertical Tilt motor is connected.
- Configure the `PortName` property.
- Run the workflow and validate the camera is tracking the object and keeping it in the centre of the image.
+
+
+[`Add`]: xref:Bonsai.Expressions.AddBuilder
+[`AffineTransform`]: xref:Bonsai.Vision.AffineTransform
+[`AudioPlayback`]: xref:Bonsai.Audio.AudioPlayback
+[`BinaryRegionAnalysis`]: xref:Bonsai.Vision.BinaryRegionAnalysis
+[`BitwiseNot`]: xref:Bonsai.Expressions.BitwiseNotBuilder
+[`CameraCapture`]: xref:Bonsai.Vision.CameraCapture
+[`Centroid`]: xref:Bonsai.Vision.Centroid
+[`ConvertColor`]: xref:Bonsai.Vision.ConvertColor
+[`Crop`]: xref:Bonsai.Vision.Crop
+[`CropPolygon`]: xref:Bonsai.Vision.CropPolygon
+[`DigitalInput`]: xref:Bonsai.Arduino.DigitalInput
+[`DigitalOutput`]: xref:Bonsai.Arduino.DigitalOutput
+[`DistinctUntilChanged`]: xref:Bonsai.Reactive.DistinctUntilChanged
+[`ExpressionTransform`]: xref:Bonsai.Scripting.Expressions.ExpressionTransform
+[`Grayscale`]: xref:Bonsai.Vision.Grayscale
+[`GreaterThan`]: xref:Bonsai.Expressions.GreaterThanBuilder
+[`FindContours`]: xref:Bonsai.Vision.FindContours
+[`FunctionGenerator`]: xref:Bonsai.Dsp.FunctionGenerator
+[`HsvThreshold`]: xref:Bonsai.Vision.HsvThreshold
+[`InputMapping`]: xref:Bonsai.Expressions.InputMappingBuilder
+[`LessThan`]: xref:Bonsai.Expressions.LessThanBuilder
+[`Negate`]: xref:Bonsai.Expressions.NegateBuilder
+[`PythonTransform`]: xref:Bonsai.Scripting.IronPython.PythonTransform
+[`Rescale`]: xref:Bonsai.Dsp.Rescale
+[`ServoOutput`]: xref:Bonsai.Arduino.ServoOutput
+[`SortBinaryRegions`]: xref:Bonsai.Vision.SortBinaryRegions
+[`Subtract`]: xref:Bonsai.Expressions.SubtractBuilder
+[`Sum`]: xref:Bonsai.Dsp.Sum
+[`Threshold`]: xref:Bonsai.Vision.Threshold
+[`TimeInterval`]: xref:Bonsai.Reactive.TimeInterval
+[`WarpAffine`]: xref:Bonsai.Vision.WarpAffine
\ No newline at end of file
diff --git a/tutorials/networking.md b/tutorials/networking.md
index 5f52e668..ccd3af3e 100644
--- a/tutorials/networking.md
+++ b/tutorials/networking.md
@@ -14,9 +14,9 @@ We will start by implementing a direct peer-to-peer communication link between t
:::
- Set up the above workflow.
-- Set the `Name` property of the `CreateUdpClient` source to `Emitter`.
+- Set the `Name` property of the [`CreateUdpClient`] source to `Emitter`.
- Set the `RemotePort` to 2342.
-- Set the `Connection` property of the `SendMessage` sink to `Emitter`.
+- Set the `Connection` property of the [`SendMessage`] sink to `Emitter`.
Open a new Bonsai window and set up the following workflow:
@@ -24,10 +24,10 @@ Open a new Bonsai window and setup the following workflow:

:::
-- Set the `Name` property of the `CreateUdpClient` source to `Receiver`.
+- Set the `Name` property of the [`CreateUdpClient`] source to `Receiver`.
- Set the `Port` property to 2342.
-- Set the `Connection` property of the `ReceiveMessage` source to `Receiver`.
-- Run the workflow and visualize the output of the `ReceiveMessage` source. Note the characters displayed in the `TypeTag`. Now change the `TypeTag` property of the `ReceiveMessage` source to `i`. This will make the source interpret the contents of the OSC message as a 32-bit integer. You can string multiple characters to describe complex messages.
+- Set the `Connection` property of the [`ReceiveMessage`] source to `Receiver`.
+- Run the workflow and visualize the output of the [`ReceiveMessage`] source. Note the characters displayed in the `TypeTag`. Now change the `TypeTag` property of the [`ReceiveMessage`] source to `i`. This will make the source interpret the contents of the OSC message as a 32-bit integer. You can string together multiple type tag characters to describe more complex messages.
- If you have access to two computers over a shared network, you can try to set up one of them to be the `Emitter` and the other to be the `Receiver`. In this case, make sure to set the `RemoteHostName` property of the `Emitter` to match the IP address of the receiver computer.
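To make the `TypeTag` mechanics concrete, the wire layout of an OSC message carrying a single 32-bit integer (type tag `i`) can be sketched in a few lines of code. This is a minimal sketch following the OSC 1.0 encoding (address and type tag strings are null-terminated and padded to 4-byte boundaries; integers are big-endian); the function names are illustrative, not part of any Bonsai API:

```python
import struct

def osc_pad(s: bytes) -> bytes:
    """Null-terminate and pad a string to a 4-byte boundary, per OSC 1.0."""
    s += b"\x00"
    return s + b"\x00" * (-len(s) % 4)

def encode_int_message(address: str, value: int) -> bytes:
    """Encode an OSC message with type tag 'i' (one big-endian int32)."""
    return osc_pad(address.encode()) + osc_pad(b",i") + struct.pack(">i", value)

def decode_int_message(data: bytes) -> int:
    """Read the int32 payload back out of the message (assumes tag ',i')."""
    return struct.unpack(">i", data[-4:])[0]

msg = encode_int_message("/value", 42)
print(decode_int_message(msg))  # 42
```

Setting `TypeTag` to `i` in the receiver tells it to decode the 4 payload bytes exactly as `struct.unpack(">i", ...)` does here.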
### **Exercise 2:** Client/Server TCP communication
@@ -38,10 +38,10 @@ Next we will implement a responsive TCP server with support to accept multiple c

:::
-- Setup the above workflow (identical to the previous exercise but using `CreateTcpServer`).
-- Set the `Name` property of the `CreateTcpServer` source to `Emitter`.
+- Set up the above workflow (identical to the previous exercise but using [`CreateTcpServer`]).
+- Set the `Name` property of the [`CreateTcpServer`] source to `Emitter`.
- Set the `Port` property to 2342.
-- Set the `Connection` property of the `SendMessage` sink to `Emitter`.
+- Set the `Connection` property of the [`SendMessage`] sink to `Emitter`.
Open a new Bonsai window and set up the following workflow:
@@ -49,10 +49,10 @@ Open a new Bonsai window and setup the following workflow:

:::
-- Set the `Name` property of the `CreateTcpClient` source to `Receiver`.
+- Set the `Name` property of the [`CreateTcpClient`] source to `Receiver`.
- Set the `Port` property to 2342.
-- Set the `Connection` property of the `ReceiveMessage` source to `Receiver`.
-- Run the workflow and visualize the output of the `ReceiveMessage` source, optionally setting the `TypeTag` property to `i`. Try opening multiple copies of the receiver workflow and running them simultaneously. Verify that data is streamed to all instances successfully.
+- Set the `Connection` property of the [`ReceiveMessage`] source to `Receiver`.
+- Run the workflow and visualize the output of the [`ReceiveMessage`] source, optionally setting the `TypeTag` property to `i`. Try opening multiple copies of the receiver workflow and running them simultaneously. Verify that data is streamed to all instances successfully.
- If you have access to two or more computers over a shared network, you can try to set up multiple remote data listeners. In this case, make sure to set the `HostName` property of the `Receiver` node to match the IP address of the emitter (server) computer.
### **Exercise 3:** Streaming image data
@@ -64,10 +64,10 @@ It is possible to share multiple data streams of different types simultaneously
:::
- Start from the previous emitter workflow.
-- Set the `Address` property of the `SendMessage` sink to `/cursor`.
-- Add a `CameraCapture` source.
-- Add a `ConvertToArray` transform to convert the image into an array of bytes.
-- Add a new `SendMessage` node with the `Address` property set to `/image`.
+- Set the `Address` property of the [`SendMessage`] sink to `/cursor`.
+- Add a [`CameraCapture`] source.
+- Add a [`ConvertToArray`] transform to convert the image into an array of bytes.
+- Add a new [`SendMessage`] node with the `Address` property set to `/image`.
- Ensure the `Connection` property of the new node is set to `Emitter`.
Open a new Bonsai window and set up the following workflow:
@@ -77,12 +77,24 @@ Open a new Bonsai window and setup the following workflow:
:::
- Start from the previous receiver workflow.
-- Set the `Address` property of the `ReceiveMessage` source to `/cursor`.
-- Add a new `ReceiveMessage` source with the `Address` property set to `/image`.
+- Set the `Address` property of the [`ReceiveMessage`] source to `/cursor`.
+- Add a new [`ReceiveMessage`] source with the `Address` property set to `/image`.
- Run the emitter workflow and the receiver workflow and verify that you can receive both data streams.
-- Set the `TypeTag` property on the new `ReceiveMessage` node to `b` for byte array.
-- Add a `ConvertFromArray` transform following the `ReceiveMessage` source.
-- Add a `Reshape` transform.
+- Set the `TypeTag` property on the new [`ReceiveMessage`] node to `b` for byte array.
+- Add a [`ConvertFromArray`] transform following the [`ReceiveMessage`] source.
+- Add a [`Reshape`] transform.
- Set the `Channels` property to 3 (color image) and the `Rows` property to 480 (or your camera image height).
-- Add a `ConvertToImage` transform to interpret the resulting buffer as an image.
+- Add a [`ConvertToImage`] transform to interpret the resulting buffer as an image.
- Run both the emitter and the receiver workflow and verify you can successfully receive and decode both data streams. If you have access to two or more computers over a shared network, you can try to set up multiple remote data listeners, each listening to one or both data streams.
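The decoding step above can be sketched in ordinary code. Assuming the received byte array is a raw interleaved color buffer, the missing dimension (image width) is recovered from the buffer length, the row count, and the channel count — which is exactly why only `Rows` and `Channels` need to be set on `Reshape`. The function name below is illustrative:

```python
def infer_width(buffer_length: int, rows: int, channels: int) -> int:
    """Recover the image width from the length of a flat pixel buffer."""
    if buffer_length % (rows * channels) != 0:
        raise ValueError("buffer length is not divisible by rows * channels")
    return buffer_length // (rows * channels)

# A 640x480 color frame arrives as a flat run of 480 * 640 * 3 bytes.
print(infer_width(480 * 640 * 3, rows=480, channels=3))  # 640
```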
+
+
+[`CameraCapture`]: xref:Bonsai.Vision.CameraCapture
+[`ConvertToImage`]: xref:Bonsai.Vision.ConvertToImage
+[`ConvertFromArray`]: xref:Bonsai.Dsp.ConvertFromArray
+[`ConvertToArray`]: xref:Bonsai.Dsp.ConvertToArray
+[`CreateTcpServer`]: xref:Bonsai.Osc.CreateTcpServer
+[`CreateTcpClient`]: xref:Bonsai.Osc.CreateTcpClient
+[`CreateUdpClient`]: xref:Bonsai.Osc.CreateUdpClient
+[`Reshape`]: xref:Bonsai.Dsp.Reshape
+[`ReceiveMessage`]: xref:Bonsai.Osc.ReceiveMessage
+[`SendMessage`]: xref:Bonsai.Osc.SendMessage
\ No newline at end of file
diff --git a/tutorials/scripting.md b/tutorials/scripting.md
index d951ca42..13305774 100644
--- a/tutorials/scripting.md
+++ b/tutorials/scripting.md
@@ -27,7 +27,7 @@ The following set of exercises are to be developed in a single workflow, so do n
### **Exercise 1:** Dot Field
-Implement the common BonVision render pipeline shown below. The `Draw` operator should be implemented as a `PublishSubject`.
+Implement the common BonVision render pipeline shown below. The `Draw` operator should be implemented as a [`PublishSubject`].
:::workflow

@@ -42,7 +42,7 @@ Next we will create the workflow that will initialize and update the state of th
> [!WARNING]
> Save the workflow before adding the `CSharpTransform` operator or you might run into this [issue](https://github.com/bonsai-rx/bonsai/issues/1834).
-- Set the `Name` property of the `BehaviorSubject` to `DotField`.
+- Set the `Name` property of the [`BehaviorSubject`] to `DotField`.
- Double-click the `CSharpTransform` operator and follow the instructions to generate a new script file. When prompted, name the script `RandomDotKinematogram`.
- When inside the Visual Studio Code project, look for a pop-up in the bottom-right corner asking about "Reload Extensions". Click the button as soon as it shows up. If you miss the chance you can also click on the small bell on the bottom-right corner of the VS Code window (in the status bar). This will load all necessary dependencies for the script into Visual Studio Code so it can assist you in writing the C# script.
@@ -93,13 +93,13 @@ public class RandomDotKinematogram
This small script simply generates a field of random dots uniformly distributed inside the unit circle every frame. After the script is saved in Visual Studio Code, you can go back to Bonsai and select the menu option `Tools` > `Reload Extensions` to recompile the scripts for your workflow. You will have to do this step every time you change something about your script that you would like to test in the Bonsai workflow.
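The sampling logic the script implements can also be sketched standalone (this is an illustrative Python version, not the C# extension script itself). Note the square root applied to the radius, which is what makes the distribution uniform over the disk rather than clustered at the centre:

```python
import math
import random

def random_dot_field(count, rng):
    """Sample `count` points uniformly distributed inside the unit circle."""
    dots = []
    for _ in range(count):
        r = math.sqrt(rng.random())         # sqrt corrects the area density
        theta = 2 * math.pi * rng.random()  # uniform angle
        dots.append((r * math.cos(theta), r * math.sin(theta)))
    return dots

field = random_dot_field(100, random.Random(0))
assert all(x * x + y * y <= 1.0 for x, y in field)
```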
-To visualize the dot field we are generating, we can use the `DrawCircleArray` operator from BonVision.
+To visualize the dot field we are generating, we can use the [`DrawCircleArray`] operator from BonVision.
:::workflow

:::
-Make sure to set the `PositionData` property to match the name of the `BehaviorSubject` we defined in the previous step (i.e. `DotField`).
+Make sure to set the `PositionData` property to match the name of the [`BehaviorSubject`] we defined in the previous step (i.e. `DotField`).
### **Exercise 2:** Kinematogram Parameters
@@ -332,3 +332,9 @@ Finally, we want to make sure only a specified proportion of the dots are moving
```
+
+
+[`BehaviorSubject`]: xref:Bonsai.Reactive.BehaviorSubject
+[`PublishSubject`]: xref:Bonsai.Reactive.PublishSubject
+
+[`DrawCircleArray`]: https://bonvision.github.io/
\ No newline at end of file
diff --git a/tutorials/state-machines.md b/tutorials/state-machines.md
index 686e5dff..285532c4 100644
--- a/tutorials/state-machines.md
+++ b/tutorials/state-machines.md
@@ -32,17 +32,17 @@ In this worksheet, we will be using an Arduino or a camera as an interface to de
:::
- Connect a digital sensor (e.g. beam-break, button, TTL) into Arduino pin 8.
-- Insert a `DigitalInput` source and set it to Arduino pin 8.
-- Insert a `PublishSubject` operator and set its `Name` property to `Response`.
-- Insert a `Timestamp` operator.
-- Insert a `CsvWriter` sink and configure its `FileName` property with a file name ending in `.csv`.
+- Insert a [`DigitalInput`] source and set it to Arduino pin 8.
+- Insert a [`PublishSubject`] operator and set its `Name` property to `Response`.
+- Insert a [`Timestamp`] operator.
+- Insert a [`CsvWriter`] sink and configure its `FileName` property with a file name ending in `.csv`.
- Run the workflow and activate the digital sensor a couple of times. Stop the workflow and confirm that the events were successfully timestamped and logged in the `.csv` file.
> [!Note]
> In order to avoid hardware side-effects, it is highly recommended to declare all hardware connections at the top-level of the workflow, and interface all trial logic using subject variables. This will have the added benefit of allowing for very easy and centralized replacement of the rig hardware: as long as the new inputs and configurations are compatible with the logical subjects, no code inside the task logic will have to be changed at all.
-- Right-click the `DigitalInput` source, select `Create Source (bool)` > `BehaviorSubject`, and set its `Name` property to `Led`.
-- Insert a `DigitalOutput` sink and set it to Arduino pin 13.
+- Right-click the [`DigitalInput`] source, select `Create Source (bool)` > [`BehaviorSubject`], and set its `Name` property to `Led`.
+- Insert a [`DigitalOutput`] sink and set it to Arduino pin 13.
### **Exercise 2:** Inter-trial interval and stimulus presentation
@@ -52,19 +52,19 @@ Translating a state machine diagram into a Bonsai workflow begins by identifying

:::
-- Insert a `Timer` source and set its `DueTime` property to be about 3 seconds.
-- Insert a `Sink` operator and set its `Name` property to `StimOn`.
-- Double-click on the `Sink` node to open up its internal specification.
+- Insert a [`Timer`] source and set its `DueTime` property to be about 3 seconds.
+- Insert a [`Sink`] operator and set its `Name` property to `StimOn`.
+- Double-click on the [`Sink`] node to open up its internal specification.
> [!Note]
-> The `Sink` operator allows you to specify arbitrary processing side-effects without affecting the original flow of events. It is often used to trigger and control stimulus presentation in response to events in the task. Inside the nested specification, `Source1` represents input events arriving at the sink. In the specific case of `Sink` operators, the `WorkflowOutput` node can be safely ignored.
+> The [`Sink`] operator allows you to specify arbitrary processing side-effects without affecting the original flow of events. It is often used to trigger and control stimulus presentation in response to events in the task. Inside the nested specification, `Source1` represents input events arriving at the sink. In the specific case of [`Sink`] operators, the [`WorkflowOutput`] node can be safely ignored.
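Outside Bonsai, the behaviour of a `Sink` can be sketched as a pass-through over a sequence that runs an action on every element without altering what flows downstream. This is a rough analogy only — Bonsai sinks operate on reactive sequences, not Python iterables:

```python
def sink(sequence, action):
    """Yield each element unchanged, running `action` on it as a side effect."""
    for item in sequence:
        action(item)
        yield item

log = []
passed_through = list(sink([1, 2, 3], log.append))
print(passed_through, log)  # [1, 2, 3] [1, 2, 3]
```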
**`StimOn`**:
:::workflow

:::
-- Insert a `Boolean` operator following `Source1` and set its `Value` property to `True`.
+- Insert a [`Boolean`] operator following `Source1` and set its `Value` property to `True`.
- Find and right-click the `Led` subject in the toolbox and select the option `Multicast`.
- Run the workflow a couple of times and verify that the sequence of events is progressing correctly.
@@ -75,14 +75,14 @@ Translating a state machine diagram into a Bonsai workflow begins by identifying

:::
-- In the main top-level workflow, insert a `Delay` operator and set its `DueTime` property to a couple of seconds.
-- Copy the `StimOn` operator and insert it after the `Delay` (you can either copy-paste or recreate it from scratch).
+- In the main top-level workflow, insert a [`Delay`] operator and set its `DueTime` property to a couple of seconds.
+- Copy the `StimOn` operator and insert it after the [`Delay`] (you can either copy-paste or recreate it from scratch).
- Rename the new operator to `StimOff` and double-click it to open up its internal representation.
-- Set the `Value` property of the `Boolean` operator to `False`.
+- Set the `Value` property of the [`Boolean`] operator to `False`.
- Run the workflow a couple of times. Is it behaving as you would expect?
-- Insert a `Repeat` operator after the `StimOff`.
+- Insert a [`Repeat`] operator after the `StimOff`.
- Run the workflow. Can you describe in your own words what is happening?
-- **Optional:** Draw a marble diagram for `Timer`, `StimOn`, `Delay`, and `Repeat`.
+- **Optional:** Draw a marble diagram for [`Timer`], `StimOn`, [`Delay`], and [`Repeat`].
### **Exercise 3:** Driving state transitions with external behaviour events
@@ -90,12 +90,12 @@ Translating a state machine diagram into a Bonsai workflow begins by identifying

:::
-- Delete the `Delay` operator.
-- Insert a `SelectMany` operator after `StimOn`, and set its `Name` property to `Response`.
-- Double-click on the `SelectMany` node to open up its internal specification.
+- Delete the [`Delay`] operator.
+- Insert a [`SelectMany`] operator after `StimOn`, and set its `Name` property to `Response`.
+- Double-click on the [`SelectMany`] node to open up its internal specification.
> [!Note]
-> The `SelectMany` operator is used here to create a new state for every input event. `Source1` represents the input event that created the state, and `WorkflowOutput` will be used to report the end result from the state (e.g. whether the response was a success or failure).
+> The [`SelectMany`] operator is used here to create a new state for every input event. `Source1` represents the input event that created the state, and [`WorkflowOutput`] will be used to report the end result from the state (e.g. whether the response was a success or failure).
**`Response`**:
:::workflow
@@ -103,10 +103,10 @@ Translating a state machine diagram into a Bonsai workflow begins by identifying
:::
- Subscribe to the `Response` subject in the toolbox.
-- Insert a `Boolean` operator and set its `Value` property to `True`.
-- Insert a `Take` operator and set its `Count` property to 1.
+- Insert a [`Boolean`] operator and set its `Value` property to `True`.
+- Insert a [`Take`] operator and set its `Count` property to 1.
- Delete the `Source1` operator.
-- Connect the `Boolean` operator to `WorkflowOutput`.
+- Connect the [`Boolean`] operator to [`WorkflowOutput`].
- Run the workflow a couple of times and validate the state machine is responding to the button press.
### **Exercise 4:** Timeout and choice
@@ -116,10 +116,10 @@ Translating a state machine diagram into a Bonsai workflow begins by identifying

:::
-- Inside the `Response` node, insert a `Timer` source and set its `DueTime` property to be about 1 second.
-- Insert a `Boolean` operator and set its `Value` property to `False`.
-- Join both `Boolean` operators with a `Merge` combinator.
-- Connect the output of `Take` to `WorkflowOutput`.
+- Inside the `Response` node, insert a [`Timer`] source and set its `DueTime` property to be about 1 second.
+- Insert a [`Boolean`] operator and set its `Value` property to `False`.
+- Join both [`Boolean`] operators with a [`Merge`] combinator.
+- Connect the output of [`Take`] to [`WorkflowOutput`].
- Run the workflow a couple of times, opening the visualizer of the `Response` node.
_Describe in your own words what the above modified workflow is doing._
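The structure of this state amounts to a race: the first value to arrive, either the response (`True`) or the timeout (`False`), ends the state. A hypothetical sketch using `asyncio` in place of Bonsai's `Merge` and `Take` (the press is simulated with a sleep):

```python
import asyncio

async def response_state(press_delay, timeout):
    """Return True if the (simulated) press arrives before the timeout."""
    async def press():
        await asyncio.sleep(press_delay)
        return True

    try:
        # Equivalent to Merge(response, timer) followed by Take(1).
        return await asyncio.wait_for(press(), timeout)
    except asyncio.TimeoutError:
        return False

print(asyncio.run(response_state(press_delay=0.01, timeout=1.0)))   # True
print(asyncio.run(response_state(press_delay=1.0, timeout=0.05)))   # False
```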
@@ -130,23 +130,23 @@ _Describe in your own words what the above modified workflow is doing._

:::
-- Insert a `Condition` operator after the `StimOff` node, and set its `Name` property to `Success`.
-- In a new branch from `StimOff`, insert another `Condition`, and set its `Name` property to `Miss`.
-- Double-click on the `Condition` operator to open up its internal specification.
+- Insert a [`Condition`] operator after the `StimOff` node, and set its `Name` property to `Success`.
+- In a new branch from `StimOff`, insert another [`Condition`], and set its `Name` property to `Miss`.
+- Double-click on the [`Condition`] operator to open up its internal specification.
> [!Note]
-> The `Condition` operator allows you to specify arbitrary rules for accepting or rejecting inputs. Only inputs which pass the filter specified inside the `Condition` are allowed to proceed. It is often used to represent choice points in the task. Inside the nested specification, `Source1` represents input events to be tested. The `WorkflowOutput` node always needs to be specified with a `bool` input, the result of whether the input is accepted (`True`) or rejected (`False`). Usually you can use operators such as `Equal`,`NotEqual`,`GreaterThan`, etc for specifying such tests.
+> The [`Condition`] operator allows you to specify arbitrary rules for accepting or rejecting inputs. Only inputs which pass the filter specified inside the [`Condition`] are allowed to proceed. It is often used to represent choice points in the task. Inside the nested specification, `Source1` represents input events to be tested. The [`WorkflowOutput`] node always needs to be specified with a `bool` input, the result of whether the input is accepted (`True`) or rejected (`False`). Usually you can use operators such as [`Equal`], [`NotEqual`], [`GreaterThan`], etc for specifying such tests.
**`Miss`**:
:::workflow

:::
-- Insert a `BitwiseNot` operator after `Source1`.
+- Insert a [`BitwiseNot`] operator after `Source1`.
_Why did we not need to specify anything for the `Success` condition?_
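In ordinary code, the `Success` and `Miss` conditions amount to a pair of predicate filters over the stream of boolean trial outcomes, with `Miss` negating its input first — a plain-Python analogy, not Bonsai API:

```python
def condition(sequence, predicate):
    """Let through only the elements for which `predicate` returns True."""
    return [item for item in sequence if predicate(item)]

outcomes = [True, False, True, False, False]
success = condition(outcomes, lambda hit: hit)    # boolean input is its own test
miss = condition(outcomes, lambda hit: not hit)   # BitwiseNot before the test
print(len(success), len(miss))  # 2 3
```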
-- In the top-level workflow, insert a `SelectMany` operator after the `Success` condition and change its `Name` property to `Reward`.
+- In the top-level workflow, insert a [`SelectMany`] operator after the `Success` condition and change its `Name` property to `Reward`.
- Inside the `Reward` node you can specify your own logic to signal the trial was successful. For example, you can make the LED blink three times in rapid succession:
**`Reward`**:
@@ -154,19 +154,19 @@ _Why did we not need to specify anything for the `Success` condition?_

:::
-- Insert a `Timer` node and set both the `DueTime` and the `Period` properties to 100ms.
-- Insert a `Mod` operator and set the `Value` property to 2.
-- Insert the `Equal` operator and leave its `Value` property at 0.
+- Insert a [`Timer`] node and set both the `DueTime` and the `Period` properties to 100ms.
+- Insert a [`Mod`] operator and set the `Value` property to 2.
+- Insert the [`Equal`] operator and leave its `Value` property at 0.
- Find and right-click the `Led` subject in the toolbox and select the option `Multicast`.
-- Insert a `Take` operator and set the `Count` property to 6.
-- Insert the `Last` operator.
+- Insert a [`Take`] operator and set the `Count` property to 6.
+- Insert the [`Last`] operator.
_Try out your state machine and check whether you understand the behavior of the reward signal._
- Copy the `Reward` node, paste it after the `Miss` condition, and change its `Name` property to `Fail`.
- **Optional**: Modify the `Fail` state in some way to signal a different trial outcome (e.g. make the LED blink more times, or move a motor).
-- In the top-level workflow, insert a `Merge` operator and connect to it the outputs of both conditional branches and before the `Repeat` node.
+- In the top-level workflow, insert a [`Merge`] operator before the [`Repeat`] node and connect the outputs of both conditional branches to it.
_Try out your state machine and introduce variations to the task behavior and conditions._
@@ -199,7 +199,7 @@ stateDiagram-v2
> 
> :::
-- Record a timestamped chronological log of trial types and rewards into a CSV file using a `BehaviorSubject`.
+- Record a timestamped chronological log of trial types and rewards into a CSV file using a [`BehaviorSubject`].
### **Exercise 7:** Conditioned place preference
@@ -214,4 +214,31 @@ stateDiagram-v2
```
> [!Tip]
-> There are several ways to implement ROI activation, so feel free to explore different ideas. Consider using either `Crop`, `RoiActivity`, or `ContainsPoint` as part of different strategies to implement the `enter` and `leave` events.
+> There are several ways to implement ROI activation, so feel free to explore different ideas. Consider using either [`Crop`], [`RoiActivity`], or [`ContainsPoint`] as part of different strategies to implement the `enter` and `leave` events.
+
+
+[`BehaviorSubject`]: xref:Bonsai.Reactive.BehaviorSubject
+[`BitwiseNot`]: xref:Bonsai.Expressions.BitwiseNotBuilder
+[`Boolean`]: xref:Bonsai.Expressions.BooleanProperty
+[`Condition`]: xref:Bonsai.Reactive.Condition
+[`ContainsPoint`]: xref:Bonsai.Vision.ContainsPoint
+[`Crop`]: xref:Bonsai.Vision.Crop
+[`CsvWriter`]: xref:Bonsai.IO.CsvWriter
+[`Delay`]: xref:Bonsai.Reactive.Delay
+[`DigitalInput`]: xref:Bonsai.Arduino.DigitalInput
+[`DigitalOutput`]: xref:Bonsai.Arduino.DigitalOutput
+[`Equal`]: xref:Bonsai.Expressions.EqualBuilder
+[`GreaterThan`]: xref:Bonsai.Expressions.GreaterThanBuilder
+[`Last`]: xref:Bonsai.Reactive.Last
+[`Merge`]: xref:Bonsai.Reactive.Merge
+[`Mod`]: xref:Bonsai.Expressions.ModBuilder
+[`NotEqual`]: xref:Bonsai.Expressions.NotEqualBuilder
+[`PublishSubject`]: xref:Bonsai.Reactive.PublishSubject
+[`Repeat`]: xref:Bonsai.Reactive.Repeat
+[`RoiActivity`]: xref:Bonsai.Vision.RoiActivity
+[`SelectMany`]: xref:Bonsai.Reactive.SelectMany
+[`Sink`]: xref:Bonsai.Reactive.Sink
+[`Take`]: xref:Bonsai.Reactive.Take
+[`Timestamp`]: xref:Bonsai.Reactive.Timestamp
+[`Timer`]: xref:Bonsai.Reactive.Timer
+[`WorkflowOutput`]: xref:Bonsai.Expressions.WorkflowOutputBuilder
\ No newline at end of file
diff --git a/tutorials/synching-ephys.md b/tutorials/synching-ephys.md
index d5e61dc2..2a750ee9 100644
--- a/tutorials/synching-ephys.md
+++ b/tutorials/synching-ephys.md
@@ -13,14 +13,14 @@ The general approach when synchronizing two independent data acquisition clocks

:::
-- Insert a `KeyDown` source.
-- Insert an `Equal` transform and set its `Value` to one of the keys. The output of this operator will toggle between `True` and `False` depending on whether the key press matches the specified key.
-- Insert a `DigitalOutput` sink and connect it to Arduino pin 13.
+- Insert a [`KeyDown`] source.
+- Insert an [`Equal`] transform and set its `Value` to one of the keys. The output of this operator will toggle between `True` and `False` depending on whether the key press matches the specified key.
+- Insert a [`DigitalOutput`] sink and connect it to Arduino pin 13.
- Connect the Arduino pin 13 to OpenEphys analog input 1.
-- Insert an `Rhd2000EvalBoard` source.
+- Insert an [`Rhd2000EvalBoard`] source.
- Select the `Rhd2000DataFrame` > `BoardAdcData` field from the source output using the context menu.
-- Insert a `SelectChannels` transform and set the `Channels` property to 0. This will select only the first analog input channel.
-- Insert a `MatrixWriter` sink and configure its `Path` property with a file name ending in `.bin`.
+- Insert a [`SelectChannels`] transform and set the `Channels` property to 0. This will select only the first analog input channel.
+- Insert a [`MatrixWriter`] sink and configure its `Path` property with a file name ending in `.bin`.
- Run the workflow and alternate pressing the selected key and some other key. Repeat this a couple of times to make the LED change state.
- Open the binary file in MATLAB/Python/R and plot the raw data. What can you conclude from it?
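The analysis step can be sketched in Python. Assuming `MatrixWriter` saved the sample buffers back-to-back with no header, the file can be read as a flat array; the sample type used below (32-bit float, typecode `"f"`) is an assumption you should check against your own configuration:

```python
import os
import tempfile
from array import array

def read_flat_binary(path, typecode="f"):
    """Read a headerless binary file of samples into a flat array.

    The typecode is an assumption about the recorded sample format.
    """
    samples = array(typecode)
    with open(path, "rb") as f:
        samples.frombytes(f.read())
    return samples

# Synthetic stand-in for the recorded .bin file:
with tempfile.NamedTemporaryFile(suffix=".bin", delete=False) as f:
    f.write(array("f", [0.0, 0.0, 1.0, 1.0, 0.0]).tobytes())
print(list(read_flat_binary(f.name)))  # [0.0, 0.0, 1.0, 1.0, 0.0]
os.remove(f.name)
```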
@@ -30,15 +30,15 @@ The general approach when synchronizing two independent data acquisition clocks

:::
-- Using the workflow from the previous exercise, insert a `CameraCapture` source and point the camera such that you can see clearly both the LED and the computer keyboard.
-- Insert a `VideoWriter` sink and configure the `FileName` with a path ending in `.avi`.
-- Insert a `Crop` transform and set the `RegionOfInterest` property to a small area around the LED.
-- Insert a `Grayscale` transform.
-- Insert a `Sum (Dsp)` transform. This operator will sum the brightness values of all the pixels in the input image.
+- Using the workflow from the previous exercise, insert a [`CameraCapture`] source and point the camera such that you can see clearly both the LED and the computer keyboard.
+- Insert a [`VideoWriter`] sink and configure the `FileName` with a path ending in `.avi`.
+- Insert a [`Crop`] transform and set the `RegionOfInterest` property to a small area around the LED.
+- Insert a [`Grayscale`] transform.
+- Insert a [`Sum (Dsp)`] transform. This operator will sum the brightness values of all the pixels in the input image.
- Select the `Scalar` > `Val0` field from the right-click context menu.
-- Record the output in a text file using a `CsvWriter` sink.
+- Record the output in a text file using a [`CsvWriter`] sink.
- Open both the text file and the binary file in MATLAB/Python/R and check that you have detected an equal number of key presses in both files. What can you conclude from these two pieces of data?
-- **Optional:** Repeat the exercise, replacing the `KeyDown` source with a periodic `Timer`. Can you point out some of the limitations of synchronizing a video stream with ephys using this method?
+- **Optional:** Repeat the exercise, replacing the [`KeyDown`] source with a periodic [`Timer`]. Can you point out some of the limitations of synchronizing a video stream with ephys using this method?
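Checking that both files contain the same number of key presses comes down to counting rising edges — samples where a thresholded signal goes from low to high. A minimal sketch (the threshold value is an assumption to tune against your own recording):

```python
def count_rising_edges(samples, threshold):
    """Count low-to-high threshold crossings in a sampled signal."""
    above = [s > threshold for s in samples]
    return sum(1 for prev, cur in zip(above, above[1:]) if cur and not prev)

signal = [0, 0, 5, 5, 0, 0, 5, 0, 5, 5]
print(count_rising_edges(signal, threshold=2.5))  # 3
```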
### **Exercise 3:** Synchronizing video with ephys using GPIO
@@ -49,7 +49,7 @@ By connecting this strobe signal to the ephys system and counting the number of
- Connect one of the output GPIO camera pins to the OpenEphys analog input 1.
- Configure the camera output as _strobe_.
- Insert a `FlyCapture` source or other industrial-grade camera capture source.
-- Record the embedded hardware frame counter into a text file using `CsvWriter`.
+- Record the embedded hardware frame counter into a text file using [`CsvWriter`].
- Record the OpenEphys analog input and verify that you can recover individual camera pulses.
- Point out some of the remaining difficulties of this approach and how you would address them.
@@ -65,18 +65,18 @@ In this exercise you will track the display of a very simple visual stimulus: a

:::
-- Insert a `SolidColor` source and set its `Size` property to a positive value, e.g. 100,100.
-- Insert a `Timer` source and set the `Period` to one second.
-- Insert a `Mod` transform and set its `Value` property to 2.
-- Insert a `Multiply` transform and set its `Value` property to 255.
+- Insert a [`SolidColor`] source and set its `Size` property to a positive value, e.g. 100,100.
+- Insert a [`Timer`] source and set the `Period` to one second.
+- Insert a [`Mod`] transform and set its `Value` property to 2.
+- Insert a [`Multiply`] transform and set its `Value` property to 255.
> [!Note]
-> The output of `Timer` is a growing count of the number of ticks. The `Mod` operator computes the remainder of the integer division of a number by another. Because every integer number in the sequence is alternately even or odd, the remainder of the division of the clock ticks by two will constantly oscillate between 0 and 1. Together with the `Multiply` operator, this is an easy way to make a periodic toggle between 0 and some value.
+> The output of [`Timer`] is a growing count of the number of ticks. The [`Mod`] operator computes the remainder of the integer division of a number by another. Because every integer number in the sequence is alternately even or odd, the remainder of the division of the clock ticks by two will constantly oscillate between 0 and 1. Together with the [`Multiply`] operator, this is an easy way to make a periodic toggle between 0 and some value.
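The `Mod`/`Multiply` pipeline in the note reduces to simple arithmetic on the tick count, which can be checked directly:

```python
# Each tick t maps to (t % 2) * 255: a periodic toggle between 0 and 255.
toggle = [(t % 2) * 255 for t in range(6)]
print(toggle)  # [0, 255, 0, 255, 0, 255]
```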
-- Insert an `InputMapping` operator and connect it to the `SolidColor` source.
+- Insert an [`InputMapping`] operator and connect it to the [`SolidColor`] source.
- Edit the `PropertyMappings` and add a mapping to the `Color` property. You will have to select the input four times to fill in all four components of the `Color` scalar.
-- Run the workflow and verify that the output of `SolidColor` oscillates between black and white.
-- Insert an `Rhd2000EvalBoard` source.
+- Run the workflow and verify that the output of [`SolidColor`] oscillates between black and white.
+- Insert an [`Rhd2000EvalBoard`] source.
- Select the `Rhd2000DataFrame` > `BoardAdcData` and either save or visualize its output.
- Connect a photodiode, or a photoresistor, to the ephys analog input and hold it flat against the screen, on top of the visualizer window.
- Verify that you can capture the transitions between black and white in the ephys data using the photodiode.
@@ -91,3 +91,22 @@ To do this, you can use the photodiode technique described in the previous exerc
- Assuming a DLP projector, how would you design the optical trigger for a camera system that ensures a single pulse is generated for each projected frame (hint: In a DLP projector, each colour of a BGR frame is projected sequentially: first the Blue channel, then the Green, and finally the Red channel, in quick succession)?
- **Optional:** Synchronize a camera with a projector using the GPIO trigger system outlined above.
+
+
+[`CameraCapture`]: xref:Bonsai.Vision.CameraCapture
+[`Crop`]: xref:Bonsai.Vision.Crop
+[`CsvWriter`]: xref:Bonsai.IO.CsvWriter
+[`DigitalOutput`]: xref:Bonsai.Arduino.DigitalOutput
+[`Equal`]: xref:Bonsai.Expressions.EqualBuilder
+[`Grayscale`]: xref:Bonsai.Vision.Grayscale
+[`InputMapping`]: xref:Bonsai.Expressions.InputMappingBuilder
+[`KeyDown`]: xref:Bonsai.Windows.Input.KeyDown
+[`MatrixWriter`]: xref:Bonsai.Dsp.MatrixWriter
+[`Mod`]: xref:Bonsai.Expressions.ModBuilder
+[`Multiply`]: xref:Bonsai.Expressions.MultiplyBuilder
+[`Rhd2000EvalBoard`]: xref:Bonsai.Ephys.Rhd2000EvalBoard
+[`SelectChannels`]: xref:Bonsai.Dsp.SelectChannels
+[`SolidColor`]: xref:Bonsai.Vision.SolidColor
+[`Sum (Dsp)`]: xref:Bonsai.Dsp.Sum
+[`Timer`]: xref:Bonsai.Reactive.Timer
+[`VideoWriter`]: xref:Bonsai.Vision.VideoWriter
\ No newline at end of file
diff --git a/tutorials/synching.md b/tutorials/synching.md
index a805ec87..c37ed50a 100644
--- a/tutorials/synching.md
+++ b/tutorials/synching.md
@@ -11,16 +11,16 @@ Synchronizing behaviour and other experimental events with stimulation or record

:::
-- Insert a `CameraCapture` source and set it to index 0.
-- Insert another `CameraCapture` source and set it to index 1.
-- Combine both sources using a `WithLatestFrom` combinator.
-- Insert a `Concat (Dsp)` operator and set its `Axis` property to 1.
-- Insert a `VideoWriter` sink and record a small segment of video.
+- Insert a [`CameraCapture`] source and set it to index 0.
+- Insert another [`CameraCapture`] source and set it to index 1.
+- Combine both sources using a [`WithLatestFrom`] combinator.
+- Insert a [`Concat (Dsp)`] operator and set its `Axis` property to 1.
+- Insert a [`VideoWriter`] sink and record a small segment of video.
_How would you test the synchronization between the two video streams?_
> [!Note]
-> You can use the `FileCapture` source to inspect the video frame by frame by setting the `Playing` property to `False`. After setting the `FileName` property to match your recorded video, run the workflow, open the source visualizer, and then right-clicking on top of the video frame to open up the seek bar at the bottom. You can use the arrow keys to move forward and back on individual frames.
+> You can use the [`FileCapture`] source to inspect the video frame by frame by setting the `Playing` property to `False`. After setting the `FileName` property to match your recorded video, run the workflow, open the source visualizer, and then right-click on top of the video frame to open the seek bar at the bottom. You can use the arrow keys to step forward and back through individual frames.
## Reaction Time
@@ -44,7 +44,7 @@ We will start by using a fixed-interval blinking LED as our stimulus.

:::
-- To configure the Arduino analog sampling rate, insert a `CreateArduino` source.
+- To configure the Arduino analog sampling rate, insert a [`CreateArduino`] source.
- Configure the `PortName` to the Arduino port where the microcontroller is connected.
- Configure the `SamplingInterval` property to 10 ms.
@@ -52,14 +52,14 @@ We will start by using a fixed-interval blinking LED as our stimulus.

:::
-- Insert a `Timer` source and set its `DueTime` property to 1 second.
-- Insert a `Boolean` source and set its `Value` property to `True`.
-- Insert a `DigitalOutput` sink and set its `Pin` property to the Arduino pin where the LED is connected.
+- Insert a [`Timer`] source and set its `DueTime` property to 1 second.
+- Insert a [`Boolean`] source and set its `Value` property to `True`.
+- Insert a [`DigitalOutput`] sink and set its `Pin` property to the Arduino pin where the LED is connected.
- Configure the `PortName` to the Arduino port where the microcontroller is connected.
-- Insert a `Delay` operator and set its `DueTime` property to 200 milliseconds.
-- Insert a `Boolean` source and set its `Value` property to `False`.
-- Insert a `DigitalOutput` sink configured to the same `Pin` and `PortName`.
-- Insert a `Repeat` operator.
+- Insert a [`Delay`] operator and set its `DueTime` property to 200 milliseconds.
+- Insert a [`Boolean`] source and set its `Value` property to `False`.
+- Insert a [`DigitalOutput`] sink configured to the same `Pin` and `PortName`.
+- Insert a [`Repeat`] operator.
### **Exercise 3:** Measuring reaction time
@@ -67,13 +67,13 @@ We will start by using a fixed-interval blinking LED as our stimulus.

:::
-- Insert an `AnalogInput` source.
+- Insert an [`AnalogInput`] source.
- Set the `Pin` property to the analog pin number where the duplicate LED wire is connected.
-- Insert a second `AnalogInput` source.
+- Insert a second [`AnalogInput`] source.
- Set the `Pin` property to the analog pin number where the button is connected.
-- Connect both inputs to a `Zip` operator.
-- Insert a `CsvWriter` sink and configure the `FileName` property.
-- Insert a `RollingGraph` visualizer and set its `Capacity` property to 1000.
+- Connect both inputs to a [`Zip`] operator.
+- Insert a [`CsvWriter`] sink and configure the `FileName` property.
+- Insert a [`RollingGraph`] visualizer and set its `Capacity` property to 1000.
- Run the workflow, and verify that both the stimulus and the button are correctly recorded.
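Once the recording is complete, reaction times can be extracted offline from the [`CsvWriter`] output by finding each stimulus onset and the first button press that follows it. Below is a minimal Python sketch of this analysis; it assumes the [`Zip`] output was saved as two columns (LED voltage, then button voltage) and uses a placeholder 2.5 V threshold, both of which you would adjust to your own wiring:

```python
import csv

SAMPLE_INTERVAL_MS = 10   # Arduino SamplingInterval configured in Exercise 1
THRESHOLD = 2.5           # volts; pick a level between the off and on voltages

def reaction_times(samples, interval_ms=SAMPLE_INTERVAL_MS, threshold=THRESHOLD):
    """Return one reaction time (ms) per stimulus onset, given (led, button) samples."""
    led = [l > threshold for l, _ in samples]
    button = [b > threshold for _, b in samples]
    times = []
    for i in range(1, len(led)):
        if led[i] and not led[i - 1]:              # rising edge of the LED
            press = next((j for j in range(i, len(button)) if button[j]), None)
            if press is not None:                  # first press after the onset
                times.append((press - i) * interval_ms)
    return times

def load_samples(path):
    """Parse the CsvWriter output: one '<led>,<button>' row per sample."""
    with open(path) as f:
        return [(float(r[0]), float(r[1])) for r in csv.reader(f) if r]
```

For example, `reaction_times(load_samples("reaction.csv"))` would yield one latency per trial, which you can then average or histogram.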
### **Exercise 4:** Synchronizing video with a visual stimulus
@@ -84,35 +84,35 @@ To analyze movement dynamics in the reaction time task, you will need to align i

:::
-- Starting from the workflow in the previous exercise, insert a `CameraCapture` source and position the camera such that you can see clearly both the LED and the computer keyboard.
-- Insert a `VideoWriter` sink and configure the `FileName` with a path ending in `.avi`.
-- Insert a `Crop` transform and set the `RegionOfInterest` property to a small area around the LED.
-- Insert a `Grayscale` transform.
-- Insert a `Sum (Dsp)` transform. This operator will sum the brightness values of all the pixels in the input image.
+- Starting from the workflow in the previous exercise, insert a [`CameraCapture`] source and position the camera such that you can see clearly both the LED and the computer keyboard.
+- Insert a [`VideoWriter`] sink and configure the `FileName` with a path ending in `.avi`.
+- Insert a [`Crop`] transform and set the `RegionOfInterest` property to a small area around the LED.
+- Insert a [`Grayscale`] transform.
+- Insert a [`Sum (Dsp)`] transform. This operator will sum the brightness values of all the pixels in the input image.
- Select the `Scalar` > `Val0` field from the right-click context menu.
-- Record the output in a text file using a `CsvWriter` sink.
+- Record the output in a text file using a [`CsvWriter`] sink.
- Open both the text file containing the Arduino data and the text file containing the video data, and verify that you have detected an equal number of stimuli in both files. What can you conclude from these two pieces of data?
- **Optional:** Open the raw video file and find the exact frame where the stimulus came on. If you compare different trials you might notice that the brightness of the LED in that first frame across two different trials is different. Why is that?
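The onset counts in the two files can be compared with a few lines of analysis: count threshold crossings in the Arduino LED trace and in the [`Sum (Dsp)`] brightness trace, then check they match. A short Python sketch, where both threshold values are placeholders you would tune to your own recordings:

```python
def count_onsets(values, threshold):
    """Count rising edges: transitions from below to above the threshold."""
    above = [v > threshold for v in values]
    return sum(1 for i in range(1, len(above)) if above[i] and not above[i - 1])

# Hypothetical usage: led_voltages parsed from the Arduino CSV, pixel_sums
# parsed from the video CSV; equal counts suggest the camera missed no stimuli.
# count_onsets(led_voltages, 2.5) == count_onsets(pixel_sums, 200000.0)
```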
### **Exercise 5:** Trigger a visual stimulus using a button
-To make our task more interesting, we will now trigger the stimulus manually using a button press and learn more about `SelectMany` along the way!
+To make our task more interesting, we will now trigger the stimulus manually using a button press and learn more about [`SelectMany`] along the way!
:::workflow

:::
- Connect a new push button component into one of the Arduino digital inputs.
-- Insert a `DigitalInput` source and set its `Pin` property to the Arduino pin where the new button is connected.
+- Insert a [`DigitalInput`] source and set its `Pin` property to the Arduino pin where the new button is connected.
- Configure the `PortName` to the Arduino port where the microcontroller is connected.
-- Insert a `Condition` operator.
-- Insert a `SelectMany` operator and move the stimulus generation logic inside the nested node:
+- Insert a [`Condition`] operator.
+- Insert a [`SelectMany`] operator and move the stimulus generation logic inside the nested node:
:::workflow

:::
-_Why do we need to remove the `Repeat` operator?_
+_Why do we need to remove the [`Repeat`] operator?_
- Ask a friend to test your reaction time!
- **Optional:** In the current workflow, what happens if you press the stimulus button twice in succession? Can you fix the current behaviour by using one of the higher-order operators?
@@ -123,14 +123,14 @@ _Why do we need to remove the `Repeat` operator?_

:::
-- Starting from the previous workflow, insert another `AnalogInput` source with the `Pin` property set to the button press pin number.
-- Insert a `GreaterThan` operator.
-- Insert a `DistinctUntilChanged` operator.
-- Insert a `Condition` operator.
-- In a new branch coming off the `VideoWriter`, insert a `Delay` operator.
-- Set the `DueTime` property of the `Delay` operator to 1 second.
-- Insert a `TriggeredWindow` operator, and set its `Count` property to 100.
-- Insert a `SelectMany` operator and inside the nested node create the below workflow:
+- Starting from the previous workflow, insert another [`AnalogInput`] source with the `Pin` property set to the button press pin number.
+- Insert a [`GreaterThan`] operator.
+- Insert a [`DistinctUntilChanged`] operator.
+- Insert a [`Condition`] operator.
+- In a new branch coming off the [`VideoWriter`], insert a [`Delay`] operator.
+- Set the `DueTime` property of the [`Delay`] operator to 1 second.
+- Insert a [`WindowTrigger`] operator, and set its `Count` property to 100.
+- Insert a [`SelectMany`] operator and inside the nested node create the below workflow:
:::workflow

@@ -139,3 +139,29 @@ _Why do we need to remove the `Repeat` operator?_
- Run the workflow and record a few videos triggered on the button press.
- Inspect the videos frame by frame and check whether the response LED comes ON at exactly the same frame number across different trials.
- If it does not, why would this happen? And how would you fix it?
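One way to run this check systematically is to extract, for each triggered clip, the index of the first frame where the LED region brightens, and test whether that index is constant across trials. A minimal Python sketch, assuming you have already reduced each clip to a per-frame brightness trace (e.g. with the same crop-and-sum approach from Exercise 4) and using an arbitrary threshold:

```python
def onset_frame(brightness, threshold):
    """Index of the first frame whose summed brightness exceeds the threshold,
    or None if the LED never comes on in the clip."""
    return next((i for i, v in enumerate(brightness) if v > threshold), None)

def consistent_onsets(trials, threshold):
    """True when every triggered clip shows the LED on the same frame index."""
    return len({onset_frame(t, threshold) for t in trials}) == 1
```

If `consistent_onsets` returns `False`, the spread of onset indices tells you how many frames of jitter your triggering introduces.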
+
+
+[`AnalogInput`]: xref:Bonsai.Arduino.AnalogInput
+[`Boolean`]: xref:Bonsai.Expressions.BooleanProperty
+[`CameraCapture`]: xref:Bonsai.Vision.CameraCapture
+[`Concat (Dsp)`]: xref:Bonsai.Dsp.Concat
+[`Condition`]: xref:Bonsai.Reactive.Condition
+[`CreateArduino`]: xref:Bonsai.Arduino.CreateArduino
+[`Crop`]: xref:Bonsai.Vision.Crop
+[`CsvWriter`]: xref:Bonsai.IO.CsvWriter
+[`Delay`]: xref:Bonsai.Reactive.Delay
+[`DigitalInput`]: xref:Bonsai.Arduino.DigitalInput
+[`DigitalOutput`]: xref:Bonsai.Arduino.DigitalOutput
+[`DistinctUntilChanged`]: xref:Bonsai.Reactive.DistinctUntilChanged
+[`FileCapture`]: xref:Bonsai.Vision.FileCapture
+[`Grayscale`]: xref:Bonsai.Vision.Grayscale
+[`GreaterThan`]: xref:Bonsai.Expressions.GreaterThanBuilder
+[`Repeat`]: xref:Bonsai.Reactive.Repeat
+[`RollingGraph`]: xref:Bonsai.Design.Visualizers.RollingGraphBuilder
+[`SelectMany`]: xref:Bonsai.Reactive.SelectMany
+[`Sum (Dsp)`]: xref:Bonsai.Dsp.Sum
+[`Timer`]: xref:Bonsai.Reactive.Timer
+[`VideoWriter`]: xref:Bonsai.Vision.VideoWriter
+[`WindowTrigger`]: xref:Bonsai.Reactive.WindowTrigger
+[`WithLatestFrom`]: xref:Bonsai.Reactive.WithLatestFrom
+[`Zip`]: xref:Bonsai.Reactive.Zip
\ No newline at end of file
diff --git a/tutorials/vision-psychophysics.md b/tutorials/vision-psychophysics.md
index 8d12fea4..b232a68e 100644
--- a/tutorials/vision-psychophysics.md
+++ b/tutorials/vision-psychophysics.md
@@ -6,7 +6,7 @@
## Getting Started
1. Install the **BonVision** package from the Bonsai Community feed in the package manager. 
-2. Go through the [basic stimuli tutorial](https://bonvision.github.io/pages/03-Creating-Basic-Stimuli){target="\_blank"} at the [BonVision](https://bonvision.github.io/){target="\_blank"} website.
+2. Go through the [basic stimuli tutorial](https://bonvision.github.io/pages/03-Creating-Basic-Stimuli) at the [BonVision](https://bonvision.github.io/) website.
> [!Warning]
> Make sure the latest version of the BonVision package is installed for this worksheet.
@@ -25,12 +25,12 @@ To allow sharing screen calibration for all displayed task elements, we start by

:::
-- Insert a `CreateWindow` source and set the `ClearColor` property to `Gray`.
-- Insert the `BonVisionResources` and `LoadResources` operators to preload all built-in BonVision shaders.
-- Insert the `WorkflowOutput` operator after `LoadResources` to ensure the workflow terminates when the shader window is closed.
-- Insert a `RenderFrame` source. This source will emit a notification when it is time for a new frame to be drawn on the screen.
-- Insert a `NormalizedView` operator. This will specify that our stimulus dimensions are resolution independent, aspect ratio corrected, and normalized to the range [-1,1].
-- Insert a `PublishSubject` operator and set its `Name` property to `Draw`. We will use these events whenever we need to draw any element on the screen.
+- Insert a [`CreateWindow`] source and set the `ClearColor` property to `Gray`.
+- Insert the [`BonVisionResources`] and [`LoadResources`] operators to preload all built-in BonVision shaders.
+- Insert the [`WorkflowOutput`] operator after [`LoadResources`] to ensure the workflow terminates when the shader window is closed.
+- Insert a [`RenderFrame`] source. This source will emit a notification when it is time for a new frame to be drawn on the screen.
+- Insert a [`NormalizedView`] operator. This will specify that our stimulus dimensions are resolution independent, aspect ratio corrected, and normalized to the range [-1,1].
+- Insert a [`PublishSubject`] operator and set its `Name` property to `Draw`. We will use these events whenever we need to draw any element on the screen.
The first step in developing our task will be to display a grating in the center of the screen at a random orientation for a specified period of time, and store the value of the orientation, so we can use it later to test the participant.
@@ -38,18 +38,18 @@ The first step in developing our task will be to display a grating in the center

:::
-- Insert a `CreateRandom` source.
-- Insert a `CreateContinuousUniform` and set its `Lower` and `Upper` properties to -1 and 1, respectively.
-- Insert an `AsyncSubject` and set its name property to `AngleDistribution`.
+- Insert a [`CreateRandom`] source.
+- Insert a [`CreateContinuousUniform`] and set its `Lower` and `Upper` properties to -1 and 1, respectively.
+- Insert an [`AsyncSubject`] and set its `Name` property to `AngleDistribution`.
For now, we start by displaying a repeating sequence of random orientation gratings.
-- Insert a `Timer (Shaders)` source and set its `DueTime` property to 2 seconds.
-- Insert a `SelectMany` operator and set its name to `ReferenceGrating`.
-- Insert a `Repeat` operator.
+- Insert a [`Timer (Shaders)`] source and set its `DueTime` property to 2 seconds.
+- Insert a [`SelectMany`] operator and set its `Name` property to `ReferenceGrating`.
+- Insert a [`Repeat`] operator.
> [!Note]
-> The `Timer (Shaders)` source works exactly like the default `Timer (Reactive)` source, but it counts the time by using the screen refresh time, rather than the operating system time. This can be important for precise timing of screen stimuli, as it avoid clock drift and jitter when synchronizing multiple visual elements, and should be in general preferred when specifying the various intervals used to control elements in the BonVision or Shaders packages.
+> The [`Timer (Shaders)`] source works exactly like the default [`Timer (Reactive)`] source, but it counts time using the screen refresh clock rather than the operating system clock. This can be important for precise timing of screen stimuli, as it avoids clock drift and jitter when synchronizing multiple visual elements, and should generally be preferred when specifying the various intervals used to control elements in the BonVision or Shaders packages.
To implement the `ReferenceGrating` state, we will need to sample a random angle from the angle distribution, use it to initialize the angle property of the gratings, and present the gratings for a specified period of time. At the end, we need to send out as a result the value of the random orientation which was generated.
@@ -58,27 +58,27 @@ To implement the `ReferenceGrating` state, we will need to sample a random angle

:::
-- Use the `Sample (Numerics)` operator to sample a random orientation value from the `AngleDistribution` subject and store it in a new `AsyncSubject` named `Angle`. This will allow us to reuse the sampled value when drawing the gratings later.
-- Subscribe to the `Draw` subject we defined previously and insert a `DrawGratings` operator.
-- Externalize the `Angle` property from the `DrawGratings` node and connect the `Angle` subject we created to it.
-- Insert a `Timer` operator and set its `DueTime` property to 1 second.
-- Insert a `TakeUntil` operator and connect the `DrawGratings` node as the source, and the `Timer` as the trigger.
-- Insert a `Last` operator. This will ensure we will get a notification whenever the `Timer` stops the presentation of the stimulus.
-- Insert a `Sample` operator following the `Angle` declaration, and connect the `Last` operator as a trigger. This will store the sampled angle value until it is time to return.
-- Insert a `WorkflowOutput` operator to specify the final output of the state.
+- Use the [`Sample (Numerics)`] operator to sample a random orientation value from the `AngleDistribution` subject and store it in a new [`AsyncSubject`] named `Angle`. This will allow us to reuse the sampled value when drawing the gratings later.
+- Subscribe to the `Draw` subject we defined previously and insert a [`DrawGratings`] operator.
+- Externalize the `Angle` property from the [`DrawGratings`] node and connect the `Angle` subject we created to it.
+- Insert a [`Timer`] operator and set its `DueTime` property to 1 second.
+- Insert a [`TakeUntil`] operator and connect the [`DrawGratings`] node as the source, and the [`Timer`] as the trigger.
+- Insert a [`Last`] operator. This will ensure we will get a notification whenever the [`Timer`] stops the presentation of the stimulus.
+- Insert a [`Sample`] operator following the `Angle` declaration, and connect the [`Last`] operator as a trigger. This will store the sampled angle value until it is time to return.
+- Insert a [`WorkflowOutput`] operator to specify the final output of the state.
_Run the workflow and verify whether the behaviour of the system is correct. Are different orientation values being used for each subsequent presentation of the gratings?_
### **Exercise 2:** Reusing stimulus definitions
-The second step in defining the contrast discrimination task is to display a second randomly oriented grating in each trial, with a small blank (or masking) period in between. To do this, we want to avoid repeating the entire workflow we designed for our reference grating, so we will make use of the `IncludeWorkflow` operator to reuse our stimulus presentation logic.
+The second step in defining the contrast discrimination task is to display a second randomly oriented grating in each trial, with a small blank (or masking) period in between. To do this, we want to avoid repeating the entire workflow we designed for our reference grating, so we will make use of the [`IncludeWorkflow`] operator to reuse our stimulus presentation logic.
**`ReferenceGrating`**:
:::workflow

:::
-- Inside the `ReferenceGrating` state, select all nodes before `WorkflowOutput`, right-click the selection, and choose the `Save as Workflow` option. Choose `RandomOrientationGrating` as the name for the extension.
+- Inside the `ReferenceGrating` state, select all nodes before [`WorkflowOutput`], right-click the selection, and choose the `Save as Workflow` option. Choose `RandomOrientationGrating` as the name for the extension.
After we have our new reusable operator, we can extend the workflow to include the blank period and the second grating stimulus.
@@ -86,9 +86,9 @@ After we have our new reusable operator, we can extend the workflow to include t

:::
-- Insert a `SelectMany` operator after the `ReferenceGrating` state and set its `Name` property to `Blank`.
-- Insert another `SelectMany` operator after `Blank` with the name `TestGrating`.
-- Insert a `Repeat` operator.
+- Insert a [`SelectMany`] operator after the `ReferenceGrating` state and set its `Name` property to `Blank`.
+- Insert another [`SelectMany`] operator after `Blank` with the name `TestGrating`.
+- Insert a [`Repeat`] operator.
For the `Blank` state we will use a simple gap interval where nothing is drawn on the screen. We can do this easily by delaying the transmission of the result of the previous state, before we move on to the next state.
@@ -97,10 +97,10 @@ For the `Blank` state we will use a simple gap interval where nothing is drawn o

:::
-- Insert a `Delay (Shaders)` operator between the input and the output of the state workflow.
+- Insert a [`Delay (Shaders)`] operator between the input and the output of the state workflow.
> [!Note]
-> Similar to `Timer (Shaders)`, the `Delay (Shaders)` operator works exactly like the `Delay (Reactive)` operator, but using the screen refresh clock instead of the operating system clock. This also ensures that any delayed notifications are resynchronized with the render loop, in case they were emitted from other external devices.
+> Similar to [`Timer (Shaders)`], the [`Delay (Shaders)`] operator works exactly like the [`Delay (Reactive)`] operator, but using the screen refresh clock instead of the operating system clock. This also ensures that any delayed notifications are resynchronized with the render loop, in case they were emitted from other external devices.
To implement the `TestGrating` state, we want to reuse our previous `RandomOrientationGrating` extension workflow and simply combine the random generated angle with the angle from the reference grating.
@@ -109,7 +109,7 @@ To implement the `TestGrating` state, we want to reuse our previous `RandomOrien

:::
-- Insert a new `RandomOrientationGrating` operator from the toolbox and combine it with the input by using the `Zip` combinator. This will generate a pair where the first value is the random angle from the first reference grating, and the second value is the random angle for this test grating.
+- Insert a new `RandomOrientationGrating` operator from the toolbox and combine it with the input by using the [`Zip`] combinator. This will generate a pair where the first value is the random angle from the first reference grating, and the second value is the random angle for this test grating.
_Run the workflow and validate that the random angle pairs are distinct and valid from trial to trial._
@@ -121,7 +121,7 @@ Now that we have our two randomly generated gratings, we need to gather the resp

:::
-- Insert a new `Response` state after the `TestGrating` state using the `SelectMany` operator.
+- Insert a new `Response` state after the `TestGrating` state using the [`SelectMany`] operator.
To implement the response gathering state we will use key presses from the participant. We will use the left and right arrow keys to indicate which stimulus had the most clockwise orientation and compare the response with whether or not the first stimulus was more clockwise than the second stimulus.
@@ -130,20 +130,20 @@ To implement the response gathering state we will use key presses from the parti

:::
-- Connect the `Draw` subject from the toolbox to a new `DrawText` operator and set its `Text` property to a suggestive question (e.g. `A or B?`). Also edit the `Font` property and make sure the size is at least 72pt for readability.
-- Insert a `DelaySubscription (Shaders)` operator and set its `DueTime` property to 1 second.
+- Connect the `Draw` subject from the toolbox to a new [`DrawText`] operator and set its `Text` property to a suggestive question (e.g. `A or B?`). Also edit the `Font` property and make sure the size is at least 72pt for readability.
+- Insert a [`DelaySubscription (Shaders)`] operator and set its `DueTime` property to 1 second.
> [!Note]
-> As before, the difference with `DelaySubscription (Reactive)` is that `DelaySubscription (Shaders)` will use the screen refresh time and make sure that all effects of subscription are synchronized with the render loop.
+> As before, the difference with [`DelaySubscription (Reactive)`] is that [`DelaySubscription (Shaders)`] will use the screen refresh time and make sure that all effects of subscription are synchronized with the render loop.
-- Insert a `LessThan` operator after the input source node. This will compare the value of the randomly sampled angles for the first and second gratings, respectively, and will return true if the first grating is more clockwise than the second grating (i.e. its angle in radians is smaller than the second grating).
-- Insert a `KeyDown (Shaders)` source and set its `Key` property to `Left`.
-- Insert a `KeyDown (Shaders)` source and set its `Key` property to `Right`.
-- Insert a `Boolean` operator after each of the key press sources and set the `Value` property to `True` for the operator following the left key press.
-- Combine the results of both key presses with a `Merge` operator.
-- Insert a `First` operator since we are only interested in the first response from the participant.
-- Combine the comparison from `LessThan` with the response from the participant using the `Zip` combinator.
-- Insert an `Equal` operator to check whether or not the response matches the true angle comparison. This will be the result of the `Response` state and after it is reported, all other effects of the state will be determined (i.e. the question display).
+- Insert a [`LessThan`] operator after the input source node. This will compare the values of the randomly sampled angles for the first and second gratings, respectively, and will return true if the first grating is more clockwise than the second grating (i.e. its angle in radians is smaller than that of the second grating).
+- Insert a [`KeyDown (Shaders)`] source and set its `Key` property to `Left`.
+- Insert a [`KeyDown (Shaders)`] source and set its `Key` property to `Right`.
+- Insert a [`Boolean`] operator after each of the key press sources and set the `Value` property to `True` for the operator following the left key press.
+- Combine the results of both key presses with a [`Merge`] operator.
+- Insert a [`First`] operator since we are only interested in the first response from the participant.
+- Combine the comparison from [`LessThan`] with the response from the participant using the [`Zip`] combinator.
+- Insert an [`Equal`] operator to check whether or not the response matches the true angle comparison. This will be the result of the `Response` state and after it is reported, all other effects of the state will be determined (i.e. the question display).
### **Exercise 4:** Present trial outcome feedback to participants
@@ -153,7 +153,7 @@ The only step left for finishing our experimental prototype is to report the fee

:::
-- Insert a new `Feedback` state after the `Response` state using the `SelectMany` operator.
+- Insert a new `Feedback` state after the `Response` state using the [`SelectMany`] operator.
This final state will simply display a quad for a certain period of time, where the color will be modulated by the trial outcome value. We want to store this value until the end of the trial so we can report it for subsequent processing.
@@ -162,26 +162,64 @@ This final state will simply display a quad for a certain period of time, where

:::
-- Insert an `AsyncSubject` operator and set its `Name` property to `Result`. This will store the trial outcome result so it can be used to compute the color value of the quad.
-- Subscribe to the `Draw` subject and insert a `DrawQuad` operator.
-- Externalize the `ColorR`, `ColorG`, and `ColorB` properties from the `DrawQuad` node.
-- Subscribe to the `Result` subject and create a new `ExpressionTransform` operator.
+- Insert an [`AsyncSubject`] operator and set its `Name` property to `Result`. This will store the trial outcome result so it can be used to compute the color value of the quad.
+- Subscribe to the `Draw` subject and insert a [`DrawQuad`] operator.
+- Externalize the `ColorR`, `ColorG`, and `ColorB` properties from the [`DrawQuad`] node.
+- Subscribe to the `Result` subject and create a new [`ExpressionTransform`] operator.
-In the `Expression` property of the `ExpressionTransform` operator, create a structure holding the RGB color value using the following script:
+In the `Expression` property of the [`ExpressionTransform`] operator, create a structure holding the RGB color value using the following script:
```c#
it ? new(0 as R, 1 as G, 0 as B) : new(1 as R, 0 as G, 0 as B)
```
-- Connect the `ExpressionTransform` to the externalized properties.
-- Insert a `Timer` operator and set its `DueTime` property to 1 second.
-- Insert a `TakeUntil` operator and connect the `DrawQuad` node as the source, and the `Timer` as the trigger.
-- Insert a `Last` operator. This will ensure we will get a notification whenever the `Timer` stops the feedback presentation.
-- Insert a `Sample` operator following the `Result` declaration, and connect the `Last` operator as a trigger. This will store the trial outcome value until it is time to return.
-- Connect it to the `WorkflowOutput` operator to specify the final output of the state and trial.
+- Connect the [`ExpressionTransform`] to the externalized properties.
+- Insert a [`Timer`] operator and set its `DueTime` property to 1 second.
+- Insert a [`TakeUntil`] operator and connect the [`DrawQuad`] node as the source, and the [`Timer`] as the trigger.
+- Insert a [`Last`] operator. This will ensure we will get a notification whenever the [`Timer`] stops the feedback presentation.
+- Insert a [`Sample`] operator following the `Result` declaration, and connect the [`Last`] operator as a trigger. This will store the trial outcome value until it is time to return.
+- Connect it to the [`WorkflowOutput`] operator to specify the final output of the state and trial.
_Run the workflow and verify the visual feedback indeed matches the perceived results from each trial._
### **Exercise 5 (Optional):** Measure psychometric data
What is the minimal discrimination threshold for humans in this task? How would you extend the previous workflow in order to assess this?
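One simple way to approach this, assuming you log the absolute orientation difference and the [`Equal`] outcome for every trial, is to bin trials by difficulty, compute the proportion correct per bin, and report the smallest difference whose accuracy reaches a criterion. The Python sketch below uses a 75% criterion and a bin width of 5 (both arbitrary choices); a fuller analysis would fit a psychometric function such as a logistic curve instead:

```python
from collections import defaultdict

def discrimination_threshold(deltas, correct, bin_width=5, criterion=0.75):
    """Smallest orientation difference whose binned accuracy reaches the criterion.

    deltas:  absolute orientation difference per trial (here in degrees)
    correct: per-trial outcome, e.g. the Equal result from Exercise 3
    """
    bins = defaultdict(list)
    for delta, outcome in zip(deltas, correct):
        bins[round(delta / bin_width) * bin_width].append(outcome)
    for delta in sorted(bins):
        outcomes = bins[delta]
        if sum(outcomes) / len(outcomes) >= criterion:
            return delta
    return None  # no bin reached criterion; collect more trials
```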
+
+
+[`AsyncSubject`]: xref:Bonsai.Reactive.AsyncSubject
+[`Boolean`]: xref:Bonsai.Expressions.BooleanProperty
+[`CreateContinuousUniform`]: xref:Bonsai.Numerics.Distributions.CreateContinuousUniform
+[`CreateRandom`]: xref:Bonsai.Numerics.CreateRandom
+[`CreateWindow`]: xref:Bonsai.Shaders.CreateWindow
+[`Delay (Reactive)`]: xref:Bonsai.Reactive.Delay
+[`Delay (Shaders)`]: xref:Bonsai.Shaders.Delay
+[`DelaySubscription (Reactive)`]: xref:Bonsai.Reactive.DelaySubscription
+[`DelaySubscription (Shaders)`]: xref:Bonsai.Shaders.DelaySubscription
+[`Equal`]: xref:Bonsai.Expressions.EqualBuilder
+[`ExpressionTransform`]: xref:Bonsai.Scripting.Expressions.ExpressionTransform
+[`First`]: xref:Bonsai.Reactive.First
+[`IncludeWorkflow`]: xref:Bonsai.Expressions.IncludeWorkflowBuilder
+[`KeyDown (Shaders)`]: xref:Bonsai.Shaders.Input.KeyDown
+[`Last`]: xref:Bonsai.Reactive.Last
+[`LessThan`]: xref:Bonsai.Expressions.LessThanBuilder
+[`LoadResources`]: xref:Bonsai.Resources.LoadResources
+[`Merge`]: xref:Bonsai.Reactive.Merge
+[`PublishSubject`]: xref:Bonsai.Reactive.PublishSubject
+[`RenderFrame`]: xref:Bonsai.Shaders.RenderFrame
+[`Repeat`]: xref:Bonsai.Reactive.Repeat
+[`Sample`]: xref:Bonsai.Reactive.Sample
+[`Sample (Numerics)`]: xref:Bonsai.Numerics.Distributions.Sample
+[`SelectMany`]: xref:Bonsai.Reactive.SelectMany
+[`TakeUntil`]: xref:Bonsai.Reactive.TakeUntil
+[`Timer`]: xref:Bonsai.Shaders.Timer
+[`Timer (Reactive)`]: xref:Bonsai.Reactive.Timer
+[`Timer (Shaders)`]: xref:Bonsai.Shaders.Timer
+[`WorkflowOutput`]: xref:Bonsai.Expressions.WorkflowOutputBuilder
+[`Zip`]: xref:Bonsai.Reactive.Zip
+
+[`BonVisionResources`]: https://bonvision.github.io/docs/BonVisionResources/
+[`DrawGratings`]: https://bonvision.github.io/docs/DrawGratings/
+[`DrawQuad`]: https://bonvision.github.io/docs/DrawQuad/
+[`DrawText`]: https://bonvision.github.io/docs/DrawText/
+[`NormalizedView`]: https://bonvision.github.io/docs/NormalizedViewport/
\ No newline at end of file