[xkcd #1190: "Time"](http://xkcd.com/1190/)
# Programming and time
Computation is an inherently temporal medium: it comprises information processes that can be described abstractly but which unfold in actuality over time. Sound is likewise inherently temporal, as is any other perceptual or interactive medium.
But what is time? How do we represent and reason with it? How do we experience and create with it? Are there parallels in the treatment of time between art and computing? Are there differences we can learn from?
## Static and dynamic (unity and change)
The first division we can make separates that which changes from that which does not. Any time-based art (music, film, etc.) can be broken down into "vertical" and "horizontal" structure. Similarly, computations can be broken down into unchanging "static" and variable "dynamic" components.
### Unity and change in time-based arts
**Vertical**: The vertical structure provides unity: that which remains relatively constant throughout, and thus encompasses the qualities of the whole. Since it influences all parts, vertical structure is often largely outlined in early stages of a work.
- Medium, macroform, frame, style
- Materials, technologies, techniques, constraints, rules
- Composed by associative, metaphorical, normative and hierarchical relations
- Semantics, intentions. Vertical elements of unity may be chosen to best convey the idea, feel, atmosphere, message; or as an experiment to liberate new creativity.
**Horizontal** structure refers to the temporal form of change: difference, movement, repetition, contrast, affinity, resolution.
- Composed of progressions: beginning as one thing and becoming another.
- Also reflections, recollections, repetitions.
- In parallel mixtures of rates and proportions (fast and slow, faster than and slower than).
- Bifurcating and coalescing.
- Continuous (gradual) or discrete (sudden).
- Quantitative (change in size, extent, proportion, measurable, numeric) vs. qualitative (change in kind, nature, tendency, individuality) aspects.
- Positive or negative, attractive or repulsive, convergent (affinity) or divergent (contrast).
- Effects, causes, intentions, story, destiny, chance.
- Negative space of change includes those aspects which endure; identity. The most enduring are the vertical aspects.
Eisenstein believed that “art is always conflict”, the opposition of forces that motivates and shapes action. The opposed forces are dissonance/consonance or tension/release. The premise of many audio-visual aesthetics is that the resolution of tension moves us through time, whether this is narrative/logical, visual or sonic/musical tension. We can also understand change perceptually in these terms. Contrast tends to increase visual intensity while affinity attenuates it. The temporal changes of any element can be described in this way. The components that construct tension may be based upon known perceptual principles, or principles established during the progress of the work.

Contrast and affinity can also be understood in terms of the ‘wrongness’ or visual dissonance, and ‘rightness’ or visual consonance, of an image. A dissonant image is a visually active moment of tension. Progressing from dissonance to consonance, tension to release, is a natural progression akin to musical resolution or cadence. As in music, most of the intensity and energy is usually devoted to the tension and particularly the climactic portions before release. Using these notions, we can move dynamically and musically through time. Complexity can be seen as a multitude of contrast (and simplicity the lack of it); dynamism as the amount of contrast over time, etc.
Story deserves special treatment. Narrative in film theory usually refers to a temporal macro-structure (the highest level of unity and contrast), the elements and driving forces, objectively characterized as:
- Exposition (aka beginning)
    - establishing the themes and necessary facts, setting the micro-world into motion
- Conflict and climax (aka middle)
    - increasing intensity; revealing/focusing the question of the work; leading up to the peak conflict in the subject
    - Man vs. Beast, Race against Time, Brain vs. Brawn, Resist or Comply, Last Chance, Find the Killer, Solve the Riddle, ...
- Resolution (aka end)
    - a return to the 'real world' from the micro-world of the work
    - completion, wrapping up incomplete elements
    - giving space for recovery, leaving trailing thoughts, etc.
Each component and subsection of a work may also mirror this structure.

Of course, structure need not be so linearly defined; we may find sub-plots, unanswered questions. A complete work is a holistic experience whose dynamics may be artfully managed in such a way that, for example, what would normally be the least intense by itself becomes the most intense in the complete composition. A plain white screen may become the most powerful image of all, with the correct preparation, just as silence can become the most intense sound in a composition.
In art the distinction between horizontal and vertical, and between contrast and affinity, need not be hard and fast. Principles of unity in a work may be guidelines fit to be broken when needed, or a work may pass through phases of different unities or even thread multiple unities into a story. Excessive contrast leads to irritation or numbness. All of these aspects relate to how we perceive in time.
### Static and dynamic components in computing
Programming also involves a division of static and dynamic components. The rules of a programming language are usually **static**: they are not expected to change during the run-time of the program. On the other hand, the flow of control is partly determined as a program runs, and values in memory can change ("variables") or be allocated and freed as it goes; these are examples of **dynamic** components. Computing inherits from mathematical logic a rigorous attitude toward definitions, thus static and dynamic divisions tend to be more sharply discriminated and pedantically adhered to.
However, there are times when exceptions are desirable or necessary.


From its industrial lineage computing inherited a linear task-oriented character: a program's job is well defined in advance. Turing used the analogy of a recipe: a series of instructions to perform. From mathematics and cryptography computing inherited the notion that a program's job is to compute an answer. One of the first principles in the theory of computation regards whether a program will or will not terminate with an answer; irrespective of how long that might actually take.
As mainframe computing became established in the 1950s, this evolved into "batch"-oriented computing, in which control programs sequentially dispatch to other task-oriented "job" programs, for reliability and efficiency. Dumb terminals (in modern terminology, "thin clients") are used to input data for new jobs and to monitor output. Terminals present a pseudo-conversation, prompting the user for input when required by the running job, or waiting for commands to start a new job.

Gradually the terminal evolved into a complex interactive interface. The UNIX terminal (still present in macOS and Linux operating systems) allows jobs to be created in which the output of one is fed to the input of another (via "pipes"), and more complex jobs with conditional components to be defined by writing scripts. Many modern programming languages also offer a terminal-like interface, called a Read-Eval-Print Loop (REPL).
### Programming in time
Most programs that we consciously interact with today are not linear, task-oriented procedures invoked to compute an answer; instead they create a working environment in which we can switch between multiple concurrent tasks and levels of attention, running for as long as we remain interested in them. An operating system or a web server may continue running for days or years. It might serve various different users during this time, possibly many at once, and might even have parts of itself replaced while it runs.
#### Real-time
Nevertheless, computing history has often neglected time; it is sometimes easier to theorize without it. Unfortunately, for audio and multimedia, the practical constraints of real-time are unavoidable. (In theory, there's no significant difference between theory and practice; but in practice there usually is.)
To be **timely**, an operation must produce results in less time than their playback requires: rendering one frame of 30 fps video, for example, must complete in less than 1/30th of a second. An operation takes place in real-time only if it can *always* perform its task in a timely manner with respect to the larger dynamical system of which it is a part. Any failure to do so results in a break in the output. "Time waits for no man."
The amount of time it takes for an input trigger to pass through the computing system and cause experienceable output is the **latency**. Interactive software requires low latency (fractions of a second) to feel natural. Gaming and especially musical applications require especially low latency.
#### Run-time
Furthermore, in the conventional view, software development occurs before and after a program runs but not during. But with server programming, in-app scripting, shell scripting, in-game development, live coding etc. this assumption breaks down. Programming for real-time performance, and supporting run-time programming, are both difficult; but the potential benefits for human-centered experience are profound.
#### With-time
To truly program media in time is to architect, orchestrate and choreograph a sequence of transformations and computations with as much precision as desired. The design of the language interface should incorporate a natural representation of temporal flow, which many general-purpose languages and systems lack. The computer music community has been especially active in elevating time to a first-class citizen in programming. Further discussion here:
- [Computing needs Time](http://www.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-30.pdf)
- [Programming With Time](http://impromptu.moso.com.au/extras/sorensen_ow_2010.pdf)
## The representation of time
### Linear, cyclic, non-linear, finite or infinite
A spatial metaphor of time forms a line from past to future. The line could be **ordinal** (a sequence or list, such as the script for a play) or **metric**, in which each event has a numeric position and each duration a measurable length. Finite linear time has a definite beginning (zero time) and end; infinite linear time has no definite beginning or end. For example, a pre-recorded DVD encodes finite linear time, while the real-time video stream from a CCTV camera has no definite end. Cyclic or circular time represents a period that repeats, such as a clock face. Linear and cyclic time can be combined by representation as a spiral. Linearizing time suggests the ability to navigate around it: rewinding, fast-forwarding, skipping, scrubbing, scratching.
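As a small sketch in plain Lua (the 12-hour period and the sample values are arbitrary illustrations), cyclic time can be derived from linear metric time with the modulo operation:

```lua
-- Map linear, metric time onto cyclic (clock-face) time with modulo.
local function cyclic(t, period)
    return t % period
end

print(cyclic(13.5, 12))  -- 1.5 (half past one on a 12-hour clock)
print(cyclic(25, 12))    -- 1   (one full day plus one hour)
```

The same idea underlies the phase of an oscillator: a linearly increasing time wrapped into a repeating cycle.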
### Continuous vs. discrete
A **continuous** representation of time is one in which any period can be further subdivided into smaller periods, ad infinitum. No matter how short the duration, smaller durations can be described within it. This is the representation of time used in understanding analog systems, and the calculus of differential equations (such as function derivatives and integrals). For example, the function **sin(t)** is continuous (where **t** represents a real-valued variable such as time).
A **discrete** representation of time has a lowest temporal resolution below which shorter durations cannot be represented. It describes a **time series**, a sequence of discrete values. Discrete series are sometimes indicated using square brackets, such as **f[n]** (where **f** is an arbitrary function and **n** is the discrete integer series). It requires a different branch of mathematics: the calculus of difference equations and finite differences.
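A minimal Lua sketch of the relationship between the two representations (the sample rate **sr** here is an arbitrary choice): sampling the continuous function **sin(t)** at a fixed rate produces a discrete series **f[n]**:

```lua
-- Discretize the continuous function sin(t) into a time series f[n].
local sr = 8      -- samples per unit of time (arbitrary for illustration)
local f = {}
for n = 0, sr do
    local t = n / sr                    -- map integer index n back to continuous time
    f[n] = math.sin(2 * math.pi * t)    -- one full cycle across the series
end
-- Between f[n] and f[n+1], no shorter duration can be represented.
```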
We do not know whether nature is at root temporally continuous or discrete; the question has been debated since at least the early Greek philosophers. We do know that if time is discrete, it is so at a scale vastly smaller than anything we can perceive, so for practical purposes it may be treated as continuous.
## The action of time
### Control flow
0. Halt / exit
1. Unconditional jump (label & goto)
2. Choice: conditional jump / if / switch
3. Loops: Repeat / while / for / foreach / break / continue
4. Non-local: subroutines / functions / callbacks / coroutines / continuations / exceptions
5. Dynamic load / dynamic code (Eval)
6. Parallelism: interrupts / signals / multi-threading
#### 0. Halt / exit
An exit can be desirable (the program has done its job, or the user has finished with it) or undesirable (the program has reached an unexpected error). A program may also fail to halt at the appropriate time (see the [Halting Problem](http://en.wikipedia.org/wiki/Halting_problem)), for example by falling into an infinite loop. For a real-time performance, an interactive art installation, a server application, or an operating system, the program is not expected to halt by itself.

#### 1. Unconditional jump (label & goto)
Many of the following control flows build upon the ability to jump location in code, however the raw ability by itself has been [considered harmful](http://www.cs.utexas.edu/users/EWD/ewd02xx/EWD215.PDF) and is absent from many languages.
#### 2. Choice: conditional jump / if / switch
Introduces the interplay between data- and control-flow, i.e. this is where things start to get interesting.
#### 3. Loops: Repeat / while / for / foreach / break / continue
All loops depend on the ability of conditionals to cause jumps *backward* to earlier points in the code. Loops provide an abstraction that avoids spelling out repetitive actions in code. A static loop repeats a number of times fixed by values at compile time, rather like a macro. A dynamic loop depends on run-time values: it may be unpredictable, and may even fall into an infinite loop.
The combination of loop and switch gives the basic framework for an interpreter: a stream of input *tokens* (characters, words, events, notes) causes dispatch to different sections of code (implementing the semantic behavior according to the token), for as long as new input tokens continue to be available.
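A minimal sketch of that framework in Lua, with an invented token vocabulary: the loop consumes tokens for as long as input remains, and an if/elseif chain plays the role of the switch, dispatching each token to its semantic action:

```lua
-- A tiny interpreter skeleton: loop over an input token stream,
-- dispatching each token to the code implementing its meaning.
local function interpret(tokens)
    local total = 0
    for _, token in ipairs(tokens) do  -- loop: while input tokens remain
        if token == "inc" then         -- "switch": dispatch on the token
            total = total + 1
        elseif token == "dec" then
            total = total - 1
        elseif token == "reset" then
            total = 0
        end
    end
    return total
end

print(interpret({"inc", "inc", "dec", "inc"}))  -- 2
```

Real interpreters dispatch to far richer actions, but the loop-plus-switch skeleton is the same.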
#### 4. Non-local: subroutines / functions / callbacks / coroutines / continuations / exceptions
The non-local code abstraction allows the same instructions to be invoked from multiple locations, while retaining the ability to "jump back" to wherever the invocation started. So, after jumping into a function body, control returns to the site where the function was called. This provides another form of abstraction that minimizes code repetition.
It also allows *callbacks*, a form of inversion of control, in which new code fragments can be inserted into an already-running program at pre-designed locations. For example, audio plugins depend on the use of callbacks to redefine how an audio program processes sound.
It relies on jump labels being dynamic. This can be reified further into a *continuation*: an object representing the "remaining work to be done". Coroutines use this to give the impression of a function that can be "paused" (returning to the call site), and then later "resumed" (returning to the pause site).
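Lua's built-in coroutines (used heavily later in this document) illustrate this pause/resume behavior directly: **yield** returns control to the call site, and **resume** returns control to the pause site, optionally passing values in each direction:

```lua
-- Pause and resume with plain Lua coroutines:
-- yield() jumps back to the caller; resume() jumps back to the pause site.
local co = coroutine.create(function(x)
    print("started with", x)
    local y = coroutine.yield(x + 1)  -- pause, handing x + 1 back to the caller
    print("resumed with", y)
    return y * 2
end)

local ok, a = coroutine.resume(co, 10)  -- runs until the yield; a is 11
local ok2, b = coroutine.resume(co, 5)  -- continues from the yield; b is 10
print(a, b)  -- 11  10
```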
Note that many other control flows can be restated in terms of functions. (In functional programming, one basic formulation of a function is called a *lambda*, and a famous series of papers, "Lambda: The Ultimate...", showed how lambda can subsume each of these control structures.)
#### 5. Dynamic load / dynamic code (Eval)
This is often not included as a form of control flow; however, it is probably the most powerful. The ability to load and execute code at run-time allows a program to change itself as it runs. The simplest level is to dynamically load pre-compiled code, but it is more expressive still to dynamically evaluate and run new code (*eval*). This opens up the scope for run-time metaprogramming.
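In Lua this capability is built in: a string of source code can be compiled at run-time into a callable function. A minimal sketch:

```lua
-- Run-time evaluation: compile a string of source code into a function.
-- loadstring is the Lua 5.1 name; load accepts strings directly in Lua 5.2+.
local src = "return 6 * 7"
local chunk = (loadstring or load)(src)
print(chunk())  -- 42
-- A program could construct such strings as it runs and thereby modify
-- its own behavior: the basis of run-time metaprogramming.
```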
#### 6. Parallelism: interrupts / signals / multi-threading
Coroutines can give the impression of multiple parallel contexts of execution, but in reality only one is active at a time. Signals (interrupts) are a low level mechanism that operates in a similar manner, typically responding to low level events to interrupt current execution and jump to a new location. They are rarely used in high level code.
Multi-threading uses CPU-level parallelism to enable truly parallel execution. This can be very difficult to code for, as even an operation as simple as **i = i + 1** can have unexpected results if the variable **i** is also modified by another thread running in parallel.
----
## audio.go(), audio.wait(), etc.
The **audio** module comes equipped with a coroutine scheduler derived from [LuaAV](http://lua-av.mat.ucsb.edu/blog/?p=137). It allows us to schedule functions that can be paused and resumed in the process of generating audio. It is thus **strongly timed** in a similar manner to the ChucK live-coding language.
The Lua language itself has no means of controlling time; the **audio** module adds this capability. These functions are very useful for building up musical structures because of their temporal accuracy. They include:
```
-- return the current scheduler time (in seconds):
audio.now()
-- launch a function in the scheduler:
audio.go()
-- pause a scheduled function (either for a duration or until a named event occurs):
audio.wait()
-- resume functions waiting for a named event:
audio.event()
```
If you will use these often, it may be worth caching them into local variables:
```
local now, go, wait, event = audio.now, audio.go, audio.wait, audio.event
```
### now()
Calling **now()** in a new script returns the logical time (in seconds) since the script was loaded. Until we start scheduling with time, all script actions occur immediately, so **now()** will return 0.
```
print(audio.now()) -- prints 0
```
### go()
The function **go()** will take a function and arguments, create a *coroutine* based on them, and add this coroutine to the scheduler. An optional first argument can specify the time (in seconds) to wait before this function is run:
```
audio.go(print, "life")
audio.go(2, print, "and everything")
audio.go(1, print, "the universe")
-- prints: "life"
-- after 1 second, prints: "the universe"
-- after 1 second more, prints: "and everything"
```
Of course you can put your own function instead of using **print**!
### wait()
So far we can use **audio.go** to precisely choose when a function starts. This could be enough to create a sequencer, for example.
But since the functions are run as *coroutines*, we can also pause them in the middle, and resume them again later, using **wait()**. The **wait()** function allows us to pause the execution of a function for a number of seconds, after which it will continue:
```
audio.go(function()
    print("life")
    audio.wait(1)
    print("the universe")
    audio.wait(1)
    print("and everything")
end)
```
What makes this more powerful is that it can be combined with other kinds of control flow such as for loops, while loops, nested function calls, and so on. Here's a simple example of an infinite process that prints "tick" every 1 second:
```
audio.go(function()
    while true do -- loop forever
        print("tick")
        audio.wait(1)
    end
end)
```
Combine this with **now()** to create a clock:
```
audio.go(function()
    while true do -- loop forever
        print("tick at", audio.now())
        audio.wait(1)
    end
end)
```
### Parallelism
Coroutines are Lua's way of providing parallelism within a script. One way of thinking about a coroutine is as a parallel function or script state; another is as a function that can be paused in mid-execution, while Lua goes off to execute some other code, and later resumed at the point at which it paused (yielded). Our software adds more power to coroutines by connecting them with the audio scheduler.
While a coroutine is paused, other coroutines and audio processes can continue to occur. So we can launch multiple coroutines to create parallel processes, like multiple players in an ensemble, voices in a drum machine, and so on.
Here's a very simple example; it prints out a tick every 1 second, and a TOCK every 4 seconds:
```
function clockprinter(name, period)
    while true do
        print(audio.now(), name)
        audio.wait(period)
    end
end
audio.go(clockprinter, "TOCK!", 4)
audio.go(clockprinter, "tick", 1)
```
Now we can create many parallel copies of the same function that can be scheduled alongside each other, each with potentially distinct timing, but without losing deterministic accuracy.
In this way you can easily create musical patterns like Steve Reich's [Clapping Music](http://www.youtube.com/watch?v=lzkOFJMI5i8) or the phasing patterns of his [Drumming](http://www.youtube.com/watch?v=YH9n6pwpK0A&list=PL1G8x4dgz5wN--kHkJ66eahWhPEMTD4Pd) for example.
Remember, the **go()** function can also take an optional first argument (delay in seconds), which allows us to schedule it to occur at some point in the future:
```
audio.go(2, clockprinter, "clackety", 1) -- will start 2 seconds later
```
Note that even if the delay is 0, or is not given, the coroutine will not run immediately; **go()** simply adds the coroutine to the internal scheduler. (Lua is single-threaded by design, which means that only one actual function is executing at any time.)
### Scheduling with events
Sometimes we want to schedule activity to occur not at a given time, but when a given situation occurs. To support this, the **go()** and **wait()** functions can also take a string argument in place of a duration. The string represents the name of a unique event.
```
-- schedule a callback for the "foo" event:
audio.go("foo", function() print("the foo happened") end)
-- trigger it:
audio.event("foo")
```
The **event()** function can then be used to resume ALL coroutines that were scheduled against or waiting upon a particular event. A classic use-case of this is to make sure that graphical rendering commands only execute during a window’s draw() method. Another use case is to schedule sequences to arbitrary rhythmic patterns.
```
-- launch a process to trigger "beat" events with a 1/0.5/0.5 pattern:
function rhythm()
    while true do
        audio.event("beat")
        audio.wait(1)
        audio.event("beat")
        audio.wait(0.5)
        audio.event("beat")
        audio.wait(0.5)
    end
end
audio.go(rhythm)
-- launch another process to respond to these events by alternating AAA and BBB:
function printer()
    while true do
        print("AAA")
        audio.wait("beat")
        print("BBB")
        audio.wait("beat")
    end
end
audio.go(printer)
```
### Nested coroutines
We can also bounce between functions within each parallel process, allowing us to decompose a complex pattern into smaller functions. We can also launch new coroutines from within another. Putting these together, here's an implementation of Steve Reich's Clapping Music:
```
--[[
An attempt to implement Steve Reich's "Clapping Music"
--]]
local audio = require "audio"
local buffer = require "audio.buffer"
local go, wait, now, event = audio.go, audio.wait, audio.now, audio.event
-- load up some clap sounds:
local p1 = buffer.load("paddle1.wav")
local p2 = buffer.load("paddle2.wav")
local dur = 1/8
-- clap N times:
function claps(sound, n)
    for i = 1, n do
        -- humanize:
        local jitter = math.random() * 0.01
        -- run the clap sound as another sub-process independent of main time:
        go(jitter, audio.play, sound)
        -- note length:
        wait(dur)
    end
end
function rest()
    wait(dur)
end
-- this is the main pattern that is repeated over and over by each player:
function pattern(sound)
    claps(sound, 3)
    rest()
    claps(sound, 2)
    rest()
    claps(sound, 1)
    rest()
    claps(sound, 2)
    rest()
end
-- the process of each player:
function clapper(sound, shift)
    while true do
        for i = 1, 4 do
            pattern(sound)
        end
        if shift then rest() end
    end
end
-- player 1 does not shift:
go(clapper, p1, false)
-- player 2 shifts:
go(clapper, p2, true)
-- turn audio on:
audio.start()
```
----
[Delightful Paper Pop-Ups by Jenny Chen](http://www.thisiscolossal.com/2012/05/delightful-paper-pop-ups-by-jenny-chen/)