
Commit 3ce996a

Update xref for operators from packages with new docfx websites

1 parent 2b83afa

7 files changed (+22 lines, -17 lines)

articles/editor.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -117,7 +117,7 @@ You can take advantage of tabs, windows, breadcrumbs and docked panels to naviga
 
 ![Editor Dock Panel](../images/editor-dockpanel.png)
 
-Right-clicking on a nested node such as a [`GroupWorkflow`](xref:Bonsai.Expressions.GroupWorkflowBuilder) will bring up the context menu, where you can select the `Open in New Tab` or `Open in New Window` commands. You can also access these commands by right-clicking on the tab header or window title bar.
+Right-clicking on a nested node such as a [`GroupWorkflow`] will bring up the context menu, where you can select the `Open in New Tab` or `Open in New Window` commands. You can also access these commands by right-clicking on the tab header or window title bar.
 
 Each tab or window displays a breadcrumb trail at the top, indicating the location of the current view within the nested workflows. Clicking a breadcrumb switches the view to the corresponding workflow, allowing you to navigate between levels.
 
@@ -130,7 +130,7 @@ You can further organize tabs and windows by rearranging them into docked panels
 The `Explorer` panel also supports workflow navigation by providing a hierarchical tree view, similar to a file browser. Each level in the tree corresponds to a nested node. Selecting a node will update the `Workflow` panel view to display the corresponding nested workflow. You can also navigate the tree by using the keyboard arrow keys and pressing <kbd>Enter</kbd> to update the view. To open the node in a new tab or window, right-click on the node label and select one of the options. To expand or collapse the tree view at any level, click on the `+` or `-` icon to the left of the node label, or double-click the label itself. Icons adjacent to each label indicate the status of the corresponding workflow:
 
 - ✏️ Editable workflow
-- 🔒 Locked workflow (`IncludeWorkflow`)
+- 🔒 Locked workflow ([`IncludeWorkflow`])
 - ⛔ Workflow contains errors
 
 ## Properties
```

articles/observables.md

Lines changed: 2 additions & 1 deletion
```diff
@@ -71,7 +71,7 @@ It is also possible to change the temperature of observable sequences using reac
 
 <img alt="Replay operator" src="~/images/reactive-replay.svg" style="max-height:250px;padding:1em 0" />
 
-Conversely, the [`Publish`](xref:Bonsai.Reactive.Publish) operator can be used to share a single subscription to a video file when sending images to downstream observers. In this case, instead of requesting a new subscription to the video for each new observer, the publish behaviour will always share only the images coming from the original subscription, no matter at what point the video is in. The original sequence has been turned from *cold* to *hot*.
+Conversely, the [`Publish`] operator can be used to share a single subscription to a video file when sending images to downstream observers. In this case, instead of requesting a new subscription to the video for each new observer, the publish behaviour will always share only the images coming from the original subscription, no matter at what point the video is in. The original sequence has been turned from *cold* to *hot*.
 
 <img alt="Publish operator" src="~/images/reactive-publish.svg" style="max-height:250px;padding:1em 0" />
 
@@ -81,5 +81,6 @@ In the Bonsai visual language, whenever two operators receive data from the same
 [`Condition`]: xref:Bonsai.Reactive.Condition
 [`Grayscale`]: xref:Bonsai.Vision.Grayscale
 [`KeyDown`]: xref:Bonsai.Windows.Input.KeyDown
+[`Publish`]: xref:Bonsai.Reactive.Publish
 [`Replay`]: xref:Bonsai.Reactive.Replay
 [`Sample`]: xref:Bonsai.Reactive.Sample
```

articles/subject-multicast.md

Lines changed: 4 additions & 1 deletion
```diff
@@ -3,8 +3,11 @@ uid: subject-multicast
 title: "MulticastSubject"
 ---
 
-The [`MulticastSubject`](xref:Bonsai.Expressions.MulticastSubject) operator works like a sink which accesses the subject with the specified name, at the same scope level or above, and forwards any values emitted by the source sequence to the shared subject. Depending on the behavior of the subject, these values will then be passed to any operators subscribed to the subject, including any termination and error notifications.
+The [`MulticastSubject`] operator works like a sink which accesses the subject with the specified name, at the same scope level or above, and forwards any values emitted by the source sequence to the shared subject. Depending on the behavior of the subject, these values will then be passed to any operators subscribed to the subject, including any termination and error notifications.
 
 :::workflow
 ![MulticastSubject workflow](~/workflows/language-subject-multicast.bonsai)
 :::
+
+<!-- Reference-style links -->
+[`MulticastSubject`]: xref:Bonsai.Expressions.MulticastSubject
```
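All of the Markdown edits in this commit follow the same pattern: an inline xref link (or, in the tutorials, a bare NuGet package URL) is replaced by a short reference-style link whose definition resolves the operator through its xref UID. A minimal before/after sketch of that pattern, using only names taken from the `MulticastSubject` diff above:

```markdown
<!-- before: inline xref link repeated at every mention -->
The [`MulticastSubject`](xref:Bonsai.Expressions.MulticastSubject) operator works like a sink...

<!-- after: short reference-style link in the prose -->
The [`MulticastSubject`] operator works like a sink...

<!-- with the xref target declared once at the bottom of the page -->
[`MulticastSubject`]: xref:Bonsai.Expressions.MulticastSubject
```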

docfx.json

Lines changed: 4 additions & 1 deletion
```diff
@@ -109,7 +109,10 @@
       "https://horizongir.github.io/opencv.net/xrefmap.yml",
       "https://horizongir.github.io/ZedGraph/xrefmap.yml",
       "https://horizongir.github.io/opentk/xrefmap.yml",
-      "https://horizongir.github.io/reactive/xrefmap.yml"
+      "https://horizongir.github.io/reactive/xrefmap.yml",
+      "https://bonsai-rx.org/ironpython-scripting/xrefmap.yml",
+      "https://bonsai-rx.org/ephys/xrefmap.yml",
+      "https://bonsai-rx.org/numerics/xrefmap.yml"
     ]
   }
 }
```
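The hunk above only shows the tail of the cross-reference list, so the enclosing property name is not visible in the diff; in DocFX projects such a list of `xrefmap.yml` URLs conventionally sits under the `xref` key of the `build` section, which is an assumption here rather than something confirmed by this commit. A sketch of how the updated block would read in that context:

```json
{
  "build": {
    "xref": [
      "https://horizongir.github.io/opencv.net/xrefmap.yml",
      "https://horizongir.github.io/ZedGraph/xrefmap.yml",
      "https://horizongir.github.io/opentk/xrefmap.yml",
      "https://horizongir.github.io/reactive/xrefmap.yml",
      "https://bonsai-rx.org/ironpython-scripting/xrefmap.yml",
      "https://bonsai-rx.org/ephys/xrefmap.yml",
      "https://bonsai-rx.org/numerics/xrefmap.yml"
    ]
  }
}
```

Only the URL list appears in the diff; the `build`/`xref` nesting is the assumed surrounding structure. These maps are what allow the new `xref:Bonsai.Scripting.IronPython.*`, `xref:Bonsai.Ephys.*`, and `xref:Bonsai.Numerics.*` targets in the tutorials below to resolve.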

tutorials/closed-loop.md

Lines changed: 2 additions & 3 deletions
```diff
@@ -271,13 +271,12 @@ def process(value):
 [`InputMapping`]: xref:Bonsai.Expressions.InputMappingBuilder
 [`LessThan`]: xref:Bonsai.Expressions.LessThanBuilder
 [`Negate`]: xref:Bonsai.Expressions.NegateBuilder
+[`PythonTransform`]: xref:Bonsai.Scripting.IronPython.PythonTransform
 [`Rescale`]: xref:Bonsai.Dsp.Rescale
 [`ServoOutput`]: xref:Bonsai.Arduino.ServoOutput
 [`SortBinaryRegions`]: xref:Bonsai.Vision.SortBinaryRegions
 [`Subtract`]: xref:Bonsai.Expressions.SubtractBuilder
 [`Sum`]: xref:Bonsai.Dsp.Sum
 [`Threshold`]: xref:Bonsai.Vision.Threshold
 [`TimeInterval`]: xref:Bonsai.Reactive.TimeInterval
-[`WarpAffine`]: xref:Bonsai.Vision.WarpAffine
-
-[`PythonTransform`]: https://www.nuget.org/packages/Bonsai.Scripting.IronPython
+[`WarpAffine`]: xref:Bonsai.Vision.WarpAffine
```

tutorials/synching-ephys.md

Lines changed: 4 additions & 4 deletions
```diff
@@ -30,7 +30,7 @@ The general approach when synchronizing two independent data acquisition clocks
 ![Synching ephys with video](~/workflows/synching-ephys-video.bonsai)
 :::
 
-- Using the workflow from the previous exercise, insert a `CameraCapture` source and point the camera such that you can see clearly both the LED and the computer keyboard.
+- Using the workflow from the previous exercise, insert a [`CameraCapture`] source and point the camera such that you can see clearly both the LED and the computer keyboard.
 - Insert a [`VideoWriter`] sink and configure the `FileName` with a path ending in `.avi`.
 - Insert a [`Crop`] transform and set the `RegionOfInterest` property to a small area around the LED.
 - Insert a [`Grayscale`] transform.
@@ -93,6 +93,7 @@ To do this, you can use the photodiode technique described in the previous exerc
 - **Optional:** Synchronize a camera with a projector using the GPIO trigger system outlined above.
 
 <!-- Reference-style links -->
+[`CameraCapture`]: xref: Bonsai.Vision.CameraCapture
 [`Crop`]: xref:Bonsai.Vision.Crop
 [`CsvWriter`]: xref:Bonsai.IO.CsvWriter
 [`DigitalOutput`]: xref:Bonsai.Arduino.DigitalOutput
@@ -103,10 +104,9 @@ To do this, you can use the photodiode technique described in the previous exerc
 [`MatrixWriter`]: xref:Bonsai.Dsp.MatrixWriter
 [`Mod`]: xref:Bonsai.Expressions.ModBuilder
 [`Multiply`]: xref:Bonsai.Expressions.MultiplyBuilder
+[`Rhd2000EvalBoard`]: xref:Bonsai.Ephys.Rhd2000EvalBoard
 [`SelectChannels`]: xref:Bonsai.Dsp.SelectChannels
 [`SolidColor`]: xref:Bonsai.Vision.SolidColor
 [`Sum (Dsp)`]: xref:Bonsai.Dsp.Sum
 [`Timer`]: xref:Bonsai.Reactive.Timer
-[`VideoWriter`]: xref:Bonsai.Vision.VideoWriter
-
-[`Rhd2000EvalBoard`]: https://www.nuget.org/packages/Bonsai.Ephys/
+[`VideoWriter`]: xref:Bonsai.Vision.VideoWriter
```

tutorials/vision-psychophysics.md

Lines changed: 4 additions & 5 deletions
```diff
@@ -190,6 +190,8 @@ What is the minimal discrimination threshold for humans in this task? How would
 [`AsyncSubject`]: xref:Bonsai.Reactive.AsyncSubject
 [`Boolean`]: xref:Bonsai.Expressions.BooleanProperty
 [`CreateWindow`]: xref:Bonsai.Shaders.CreateWindow
+[`CreateContinuousUniform`]: xref:Bonsai.Numerics.Distributions.CreateContinuousUniform
+[`CreateRandom`]: xref:Bonsai.Numerics.CreateRandom
 [`Delay (Shaders)`]: xref:Bonsai.Shaders.Delay
 [`Delay (Reactive)`]: xref:Bonsai.Reactive.Delay
 [`DelaySubscription (Shaders)`]: xref:Bonsai.Shaders.DelaySubscription
@@ -205,6 +207,7 @@ What is the minimal discrimination threshold for humans in this task? How would
 [`PublishSubject`]: xref:Bonsai.Reactive.PublishSubject
 [`RenderFrame`]: xref:Bonsai.Shaders.RenderFrame
 [`Repeat`]: xref:Bonsai.Reactive.Repeat
+[`Sample (Numerics)`]: xref:Bonsai.Numerics.Distributions.Sample
 [`SelectMany`]: xref:Bonsai.Reactive.SelectMany
 [`Last`]: xref:Bonsai.Reactive.Last
 [`Sample`]: xref:Bonsai.Reactive.Sample
@@ -219,8 +222,4 @@ What is the minimal discrimination threshold for humans in this task? How would
 [`DrawGratings`]: https://bonvision.github.io/docs/DrawGratings/
 [`DrawQuad`]: https://bonvision.github.io/docs/DrawQuad/
 [`DrawText`]: https://bonvision.github.io/docs/DrawText/
-[`NormalizedView`]: https://bonvision.github.io/docs/NormalizedViewport/
-
-[`CreateContinuousUniform`]: https://www.nuget.org/packages/Bonsai.Numerics
-[`CreateRandom`]: https://www.nuget.org/packages/Bonsai.Numerics
-[`Sample (Numerics)`]: https://www.nuget.org/packages/Bonsai.Numerics
+[`NormalizedView`]: https://bonvision.github.io/docs/NormalizedViewport/
```
