Concepts#
Warning
This document is under construction.
Pipeline#
A pipeline is a sequence of five stages through which data flows during a vortex acquisition or session. The vortex pipeline is flexible enough to accommodate most OCT use cases. If a specific use case does not fit this structure, users can assemble their own pipeline from individual vortex components, usually at the C++ level to meet real-time requirements.
Generate
Calculate scan waveforms and other signals from the scan pattern. This stage corresponds to components in the Scan module.
Acquire
Collect data from digitizers or frame grabbers. Read/write synchronized I/O signals. This stage corresponds to components in the Acquire and IO modules.
Process
Perform OCT processing, including de-averaging, resampling, complex filtering, FFT, and log normalization. Perform non-OCT processing using linear transformations. This stage corresponds to components in the Process module.
Format
Organize data into useful formats with inactive segment removal, assembly of volumes, or rectification of non-rectangular patterns. This stage corresponds to components in the Format module.
Store
Maintain data on host or GPU memory. Deliver data to disk for persistent storage. This stage corresponds to components in the Endpoint and Storage modules.
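The five stages above can be thought of as a chain of transformations, each consuming the previous stage's output. The following minimal Python sketch models that chaining; all names and stage bodies are illustrative stand-ins, not the vortex API.

```python
# Illustrative model of a five-stage pipeline; names are hypothetical,
# NOT the vortex API.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Pipeline:
    """Chains stages: generate -> acquire -> process -> format -> store."""
    stages: List[Callable] = field(default_factory=list)

    def add(self, stage: Callable) -> "Pipeline":
        self.stages.append(stage)
        return self

    def run(self, data):
        # Each stage consumes the previous stage's output.
        for stage in self.stages:
            data = stage(data)
        return data

# Hypothetical stage implementations operating on a list of samples.
generate = lambda pattern: list(pattern)      # scan waveform from pattern
acquire  = lambda wf: [2 * x for x in wf]     # stand-in for a digitizer read
process  = lambda raw: [x - 1 for x in raw]   # stand-in for OCT processing
fmt      = lambda spectra: [spectra]          # assemble samples into a "volume"
store    = lambda volume: volume              # deliver to an in-memory endpoint

pipeline = Pipeline().add(generate).add(acquire).add(process).add(fmt).add(store)
result = pipeline.run(range(3))
# result == [[-1, 1, 3]]
```

In the real engine the stages run concurrently and stream blocks of data rather than passing a single value, but the dependency order is the same.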
System#
A system is a vortex pipeline that has been configured to accomplish a specific imaging task.
Each pipeline stage is populated with components related to that imaging task.
The whole system is managed by the Engine, which coordinates the flow of data through the pipeline in real time.
The EngineConfig encodes the pipeline configuration as well as the data management and flow control required for the system.
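Conceptually, the configuration describes a directed graph: each component declares which downstream components receive its output. The sketch below models that idea only; the class and method names here are hypothetical and do not reproduce the actual vortex EngineConfig API.

```python
# Hypothetical sketch of a pipeline configuration as a directed graph of
# components; names are illustrative, NOT the vortex EngineConfig API.
class EngineConfig:
    def __init__(self):
        # component name -> list of downstream component names
        self.links = {}

    def connect(self, upstream, downstreams):
        """Route the output of `upstream` to each component in `downstreams`."""
        self.links.setdefault(upstream, []).extend(downstreams)

config = EngineConfig()
config.connect("acquisition", ["processor"])
config.connect("processor", ["formatter"])
config.connect("formatter", ["endpoint"])
# config.links == {"acquisition": ["processor"],
#                  "processor": ["formatter"],
#                  "formatter": ["endpoint"]}
```

A graph (rather than a flat list) is the natural representation because one acquisition can feed multiple processors, and one formatter can deliver to multiple endpoints.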
Application#
An application is a system that is paired with user-facing features, such as graphical display or event-driven changes in scan patterns.
An application interacts with the engine through the ScanQueue, which receives scan patterns to execute, and through endpoints, which asynchronously issue callbacks.
Using callbacks, the application can chain custom processing onto that already performed in the pipeline.
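The callback mechanism can be sketched as an endpoint that invokes subscriber functions as data arrives. This is a minimal illustration of the pattern only, assuming a hypothetical `Endpoint` class; it is not the vortex endpoint API, and a real engine would invoke callbacks from its own worker threads.

```python
# Illustrative callback chaining; names are hypothetical, NOT the vortex API.
import threading

class Endpoint:
    """Invokes registered callbacks asynchronously as data is delivered."""
    def __init__(self):
        self._callbacks = []

    def subscribe(self, fn):
        self._callbacks.append(fn)

    def deliver(self, volume):
        # A Thread stands in for the engine's worker thread here.
        t = threading.Thread(
            target=lambda: [fn(volume) for fn in self._callbacks])
        t.start()
        t.join()

results = []
ep = Endpoint()
# Chain a custom post-processing step onto the pipeline's output.
ep.subscribe(lambda v: results.append(max(v)))
ep.deliver([3, 1, 2])
# results == [3]
```

Because callbacks fire on engine threads rather than the application's main thread, real applications typically hand the data off to a queue or GUI event loop instead of doing heavy work inside the callback.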