2. Core Program Structure
At the root of all things is the DSP Kernel (see 5. DSP Kernel). This is the component that computes all audio into a top-level buffer.
The DSP kernel buffer is sent to the speakers in realtime by the Audio Render Thread (see 7. Audio Render Thread).
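To make the kernel/render-thread split concrete, here is a minimal sketch of the pattern described above. The function names, block layout, and callback signature are illustrative assumptions, not Monolith's actual interface; that interface is defined in 5. DSP Kernel and 7. Audio Render Thread.

```c
#include <math.h>

#define SR       44100   /* sample rate (assumption) */
#define CHANNELS 2       /* interleaved stereo output (assumption) */

static const double two_pi = 6.283185307179586;

/* Render one block of interleaved audio into the top-level buffer.
 * In the real kernel this is where the whole signal chain is evaluated;
 * a 440 Hz test tone stands in for it here. */
static void kernel_compute(float *buf, unsigned int nframes)
{
    static double phs = 0.0;
    unsigned int n;

    for (n = 0; n < nframes; n++) {
        float s = 0.2f * (float)sin(two_pi * phs);
        phs += 440.0 / SR;
        if (phs >= 1.0) phs -= 1.0;
        buf[CHANNELS * n]     = s;   /* left */
        buf[CHANNELS * n + 1] = s;   /* right */
    }
}

/* The audio render thread calls something like this once per block; the
 * actual callback signature is dictated by the audio API in use. */
void render_callback(float *out, unsigned int nframes, void *user_data)
{
    (void)user_data;
    kernel_compute(out, nframes);
}
```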
All DSP is done with Graforge and Soundpipe (see 6. Graforge and Soundpipe). Graforge is a library used to create block-based signal chains in a modular way. The DSP algorithms themselves come from Soundpipe.
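For reference, here is roughly what a stand-alone Soundpipe program looks like, separate from Graforge and Monolith: create the global sp_data, set up a module (a table-lookup sine oscillator), and let Soundpipe's offline driver run a per-sample callback and write the result to a WAV file. How modules like this get wrapped into Graforge nodes is covered in 6. Graforge and Soundpipe.

```c
#include "soundpipe.h"

/* One sample per call: sp_process() runs this sp->len times and writes
 * sp->out[0] to a WAV file. */
static void process(sp_data *sp, void *ud)
{
    sp_osc *osc = ud;
    SPFLOAT out = 0;
    sp_osc_compute(sp, osc, NULL, &out);
    sp->out[0] = out;
}

int main(void)
{
    sp_data *sp;
    sp_ftbl *ft;
    sp_osc *osc;

    sp_create(&sp);                 /* global Soundpipe state */
    sp_ftbl_create(sp, &ft, 8192);  /* wavetable for the oscillator */
    sp_gen_sine(sp, ft);            /* fill it with one cycle of a sine */
    sp_osc_create(&osc);
    sp_osc_init(sp, osc, ft, 0);
    osc->freq = 440;
    osc->amp = 0.5;

    sp->len = sp->sr * 4;           /* four seconds of audio */
    sp_process(sp, osc, process);

    sp_osc_destroy(&osc);
    sp_ftbl_destroy(&ft);
    sp_destroy(&sp);
    return 0;
}
```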
Signal chains are populated using the stack-based language Runt (see 9. Runt).
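The stack-based idea can be illustrated with a toy example in C. This is not Runt or Graforge code, only the general discipline: each word pops its inputs off a stack and pushes its result back on, so a chain is written in postfix order, e.g. "440 sine 0.5 mul".

```c
#include <stdio.h>

#define STACK_SIZE 16

static double stack[STACK_SIZE];
static int top = 0;

static void push(double x) { stack[top++] = x; }
static double pop(void)    { return stack[--top]; }

/* "sine" word: pops a frequency, pushes a stand-in for a sine signal */
static void w_sine(void)
{
    double freq = pop();
    push(freq);   /* a real system would push a signal node here */
}

/* "mul" word: pops two values, pushes their product */
static void w_mul(void)
{
    double b = pop();
    double a = pop();
    push(a * b);
}

int main(void)
{
    /* equivalent of the postfix program: 440 sine 0.5 mul */
    push(440); w_sine(); push(0.5); w_mul();
    printf("top of stack: %g\n", pop());
    return 0;
}
```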
Monolith has a few core peripherals that can be used to control sounds in realtime. A Virtual Interface Layer separates the physical hardware components from the DSP kernel (see 8. Virtual Interface Layer). Hardware messages are polled and conveyed to the virtual interface layer via the Hardware Listener (see 12. Hardware Listener).
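The shape of that split can be sketched as follows. This is a hypothetical illustration, not Monolith's actual data structures: the hardware listener writes incoming events into a plain virtual-interface struct, and the DSP side only ever reads from that struct, so the kernel never touches device APIs directly.

```c
#include <stdbool.h>

#define GRID_W 16
#define GRID_H 8

typedef struct {
    bool pressed[GRID_H][GRID_W]; /* current grid button states */
    int knob_delta;               /* accumulated knob movement since last read */
} virtual_interface;

/* Called by the hardware listener when a device reports an event. */
void vil_grid_press(virtual_interface *vi, int x, int y, bool down)
{
    vi->pressed[y][x] = down;
}

void vil_knob_turn(virtual_interface *vi, int delta)
{
    vi->knob_delta += delta;
}

/* Called from the DSP side: consume the accumulated knob movement. */
int vil_knob_read(virtual_interface *vi)
{
    int d = vi->knob_delta;
    vi->knob_delta = 0;
    return d;
}
```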
The monome hardware (arc, grid) is managed in Monome Hardware and Libmonome (see 13. Monome Hardware (Arc, Grid) and Libmonome).
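Libmonome itself is a small C library. The sketch below follows the pattern of libmonome's example programs (the device path is only an example): open the device, register button handlers that light and clear the pressed key, and run the blocking event loop. Monolith's wrapper around this is in 13. Monome Hardware (Arc, Grid) and Libmonome.

```c
#include <monome.h>

/* Light a key while it is held down. */
static void on_press(const monome_event_t *e, void *user_data)
{
    (void)user_data;
    monome_led_on(e->monome, e->grid.x, e->grid.y);
}

static void on_lift(const monome_event_t *e, void *user_data)
{
    (void)user_data;
    monome_led_off(e->monome, e->grid.x, e->grid.y);
}

int main(void)
{
    monome_t *m;

    /* device path depends on the system; this is just an example */
    if (!(m = monome_open("/dev/ttyUSB0"))) return 1;

    monome_led_all(m, 0);
    monome_register_handler(m, MONOME_BUTTON_DOWN, on_press, NULL);
    monome_register_handler(m, MONOME_BUTTON_UP, on_lift, NULL);

    monome_event_loop(m);   /* blocks, dispatching events to the handlers */

    monome_close(m);
    return 0;
}
```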
The griffin knob hardware is managed in Griffin Hardware (see 14. Griffin Hardware).
Hardware peripherals can be used to control sounds via Pages (see 11. Pages).
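Conceptually, a page bundles the hardware callbacks for one instrument or screen, and switching pages swaps which bundle receives events. The struct and function names below are hypothetical, intended only to illustrate the idea; the real interface is in 11. Pages.

```c
#include <stddef.h>

/* Hypothetical page: a bundle of hardware callbacks plus per-page state. */
typedef struct page {
    const char *name;
    void (*grid_press)(struct page *p, int x, int y, int state);
    void (*knob_turn)(struct page *p, int delta);
    void (*redraw)(struct page *p);
    void *data;   /* per-page state, e.g. a synth voice */
} page;

static page *current_page = NULL;

/* The hardware listener forwards events to whichever page is active. */
void pages_grid_press(int x, int y, int state)
{
    if (current_page != NULL && current_page->grid_press != NULL) {
        current_page->grid_press(current_page, x, y, state);
    }
}

/* Switching pages swaps the active bundle and redraws the hardware. */
void pages_select(page *p)
{
    current_page = p;
    if (p->redraw != NULL) p->redraw(p);
}
```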
Monolith is largely controlled using a Scheme REPL (see 10. Scheme).
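Control of this sort typically works by exposing C functions to the embedded Scheme interpreter, so that REPL input can poke at the running system. The sketch below assumes an embeddable s7-style Scheme and a made-up set_frequency() binding; the actual bindings are described in 10. Scheme.

```c
#include <stdio.h>
#include "s7.h"

/* Stand-in for a real control function (hypothetical). */
static void set_frequency(double hz)
{
    printf("freq -> %g\n", hz);
}

/* C function exposed to Scheme as (set-freq! hz). */
static s7_pointer scm_set_freq(s7_scheme *sc, s7_pointer args)
{
    set_frequency(s7_real(s7_car(args)));
    return s7_car(args);
}

int main(void)
{
    s7_scheme *sc = s7_init();
    s7_define_function(sc, "set-freq!", scm_set_freq, 1, 0, 0, "(set-freq! hz)");
    /* what typing (set-freq! 440.0) at the REPL would do */
    s7_eval_c_string(sc, "(set-freq! 440.0)");
    return 0;
}
```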