spatialising field recordings

16/4/2016

small waves raised by the evening combines components of a field recording of bullfrogs with spectrally frozen sections, created using Eric Lyon’s thresher~ object to highlight the most amplitudinally prominent frequencies in the source recording. Spatial depth was emulated by separating sonic features in different frequency bands into multiple channels, using a series of frequency-tracking patches developed by the composer in MaxMSP. As no actual spatial data could be derived from the stereo field recording, the channels were then spatialised according to their frequency (azimuth) and amplitude (distance) using the ambipanning~ object.
Lindsay Vickery small waves raised by the evening [2016] (excerpt).
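The frequency-to-azimuth, amplitude-to-distance mapping can be sketched as follows — a minimal Python illustration, not the actual MaxMSP patch; the frequency range, the distance range and the equal-power panning law (standing in for ambipanning~) are all my assumptions:

```python
import numpy as np

def freq_to_azimuth(freq, f_min=50.0, f_max=8000.0):
    """Map a band's centre frequency to an azimuth angle (radians).
    Log-frequency position is spread across the frontal 180 degrees."""
    pos = (np.log(freq) - np.log(f_min)) / (np.log(f_max) - np.log(f_min))
    return (pos - 0.5) * np.pi  # -pi/2 (hard left) .. +pi/2 (hard right)

def amp_to_distance(amp, d_min=1.0, d_max=10.0):
    """Map normalised amplitude (0..1) to emulated distance: louder = closer."""
    return d_max - amp * (d_max - d_min)

def pan_gains(azimuth):
    """Stereo gains for an azimuth, using equal-power panning
    (a simple stand-in for ambisonic panning)."""
    theta = (azimuth + np.pi / 2) / 2  # 0..pi/2
    return np.cos(theta), np.sin(theta)
```

Each frequency-tracked channel would then be scaled by its distance and written to the output bus with these gains.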
A number of approaches to frequency tracking were explored: manual frequency tracking, annotating the spectrogram with multiple function objects and then retrieving the data – by inputting the position of the audiofile as reported by snapshot~ – to control the centre frequency of bandpass filters (Figure below); automated frequency tracking, controlling the bandpass filters’ centre frequency with frequencies derived from spectral analysis conducted using the sigmund~ object; and “ecological niche” tracking, drawing on Krause’s theory that “animal and insect vocalisations tend to occupy small bands of frequencies leaving ‘spectral niches’ (bands of little or no energy) into which the vocalisations (fundamental and formants) of other animals, birds or insects can fit” (Wrightson, K. 2000. “Introduction to Acoustic Ecology”. Soundscape: The Journal of Acoustic Ecology, 1(1), pp. 10-13. p. 11).
Manual frequency tracking by annotating the spectrogram with multiple function objects to automate the centre frequency of bandpass filters.
Audacity’s “Plot Spectrum” function.
This theory suggests that band-passing at particular frequencies will tend to capture divergent features of the sonic environment. Audacity’s Plot Spectrum function (Figure to the left) was used to determine the niche frequency bands in the recording, and bandpass filters were then applied separately to each band.
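The niche-band separation might be sketched like this in Python — a toy stand-in for the Plot Spectrum/bandpass workflow, with hypothetical band edges and a synthetic two-voice “environment”:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def split_into_niches(signal, sr, bands):
    """Split a recording into 'spectral niche' bands with bandpass
    filters, one output channel per band. `bands` is a list of
    (low_hz, high_hz) tuples found e.g. by inspecting the spectrum."""
    channels = []
    for low, high in bands:
        sos = butter(4, [low, high], btype="bandpass", fs=sr, output="sos")
        channels.append(sosfilt(sos, signal))
    return np.stack(channels)

sr = 44100
t = np.arange(sr) / sr
# toy 'environment': a 200 Hz croak plus a quieter 3 kHz birdcall
sig = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)
chans = split_into_niches(sig, sr, [(100, 400), (2000, 4000)])
```

Each channel isolates one niche: the first keeps the croak, the second keeps the birdcall.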
These frequency-tracking processes are similar to conventional audio expansion; however, the complexity of most field recordings, and the uniform spread of amplitudes from increasingly distant sources, yield poor results from conventional expansion.
Perhaps unsurprisingly, then, the manual method of frequency tracking proved to be the most effective of those explored.
It could be argued that manual designation of the bandpass frequencies adds a ‘human’ layer to the process, in contradiction to the eco-structural aim of deriving all data from the environment itself. However, the process is no less of an intervention than choosing sonic features to be emulated by acoustic instruments and indeed, in this case, was achieved through similar means: visual detection of features from a spectrogram. The approach is perhaps analogous to the ‘Cocktail Party Effect’ (Pollack, I. and J. Pickett. 1957. “Cocktail Party Effect”. Journal of the Acoustical Society of America 29, p. 1262), a feature of human auditory perception in which conscious auditory attention allows for pre-semantic attenuation of signals in a complex environment.
Reading Spectrographic scores with Lyrebird... some evaluation.

16/4/2016

The figures below explore the effectiveness of these objectives by comparing spectrograms of the source field recordings with performances by a number of musicians. The spectrograms depict comparable frequency ranges, although the performer spectrograms are uniformly an octave lower than those of the field recording. The titles of the recordings are those given by the original recordist, Philip Kenworthy.
The Figure to the right compares the field recording Bullfrogs and Rainstorm with a pianist’s performance. The performer’s relationship to Lyrebird is the most “score-like”, in that the pitch, rhythmic and dynamic contours of the bullfrog croaks from the field recording are adhered to with a great deal of precision. ​
Bullfrogs and Rainstorm: comparison between (a) field recording 339-899Hz and (b) piano performance 161-457Hz.
Kookaburra Sunrise: comparison between (a) field recording 279-7558Hz and (b) piano performance 129-3752Hz. 
The task is perhaps simplified because the pitch range of the croaks is limited to about 3 semitones, however the spectrogram indicates that this method of synchronisation of the recording and the performance is effective in this instance.
In a case where a more complex sound environment comprising a range of birdsongs is used, such as the figure on the right, the task is considerably more complex. Birdsongs are often too high pitched and rapid to be emulated entirely accurately. Lyrebird aims to provide information at a human scale showing the contour of extremely rapid birdcalls rather than precise detail.
Here the same pianist aims to portray three bands of activity in different ranges and is successful until the bands have simultaneous activity (about half-way through the figure), at which point activity in the highest band is ignored. 

In a complex environment such as the recording Whistlers and Crickets (Figure to the right) there is too much data for the performer to emulate the complete environment. Again the performer, this time on bass clarinet, chooses specific features to emulate. A potential solution to the emulation of complex environments is the use of networked score-players displaying activity in different bands to a number of performers.
Lyrebird does not represent the sonic events on a grid to indicate frequency, and it is evident that the performer here “overshoots” the high frequency events, performing them about an octave too high in relation to the lower pitched layers.
Whistlers and Crickets: comparison between (a) field recording 376-4785Hz and (b) bass clarinet performance 183-2088Hz. 
Whistlers and Crickets: comparison between (a) field 1447-15896Hz recording and (b) percussion performance 720-7939Hz.
In the Figure to the left, the performer’s aim was to interact with the recording through both emulation and improvisation. The field recording spectrogram shows the gradual fading of repetitive cricket sounds, followed by a prominent birdcall.
The percussionist emulates only the birdcall, and then mimics the cricket sounds once they have ceased. The passage suggests that Lyrebird can provide an effective representation of the sonic environment that allows the performer to interact by taking over features from the recording.
Lyrebird allows for interaction with pre-recorded non-anthropogenic sound environments. The degree to which an interaction is meaningful is self-evidently dependent on the performers’ abilities. Examples of this work can be heard at https://lindsayvickery.bandcamp.com/album/works-for-instruments-and-field-recordings. However, unlike many musical experiences, the potential for precise synchronisation of a performer with seemingly indeterminate sonic events arguably has an intrinsically interesting quality. The evaluation of the accuracy of performer emulation is something of an end in itself, encompassing both the degree of acoustic reproduction of the sounds (as demonstrated in the figures above) and the performer’s ability to “enter into” the soundworld of the recording through improvisation.

nature forms II

27/11/2015

 
Nature Forms II (2015) explores the possibility of recursive re-interrogation of a field recording through visualization and resonification/resynthesis by machine and performative means. In Nature Forms II a range of forms of representation are used, including semantic graphical notation, percussion notation and a hybrid form of sonogram notation. The most literal form of representation is the process I have previously termed the “spectral trace”, in which notation is drawn directly onto the spectrogram to represent features of the field recording to be performed by an ensemble of flute (orange), clarinet (red), viola (green) and cello (blue).
“Spectral Trace” notation from Nature Forms II
The graphical symbols are presented on a proportional staff, created to address the fact that the spectral trace is proportional in both dimensions: horizontally it is temporally proportional – precise performance can be achieved through a scrolling score – and vertically each pitch occupies a distinct spatial position.
Vertically proportional stave showing a chromatic scale from E3 to F4.
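The vertically proportional pitch layout amounts to a linear mapping from semitone to vertical position. A minimal sketch, assuming the E3–F4 range shown in the figure and an arbitrary staff height:

```python
def note_to_y(midi_note, staff_height=260.0, low=52, high=65):
    """Vertical position for a pitch on a vertically proportional staff.
    Hypothetical layout: E3 (MIDI 52) at the bottom, F4 (MIDI 65) at the
    top, each semitone occupying an equal vertical step.
    y grows downward, as in most screen coordinate systems."""
    step = staff_height / (high - low)
    return staff_height - (midi_note - low) * step
```

Unlike the traditional stave, where line/space position is diatonic, every chromatic step here gets the same vertical increment — hence the added coloured lines for the non-natural notes.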
The system was developed for the work here, apparently, there was time for everything [2015], and attempts to more-or-less retain the topographical layout of the traditional stave while adding coloured lines to accommodate non-natural notes. A second approach draws on the concepts and techniques developed in EVP (2011) and Lyrebird: Environment Player (2014).
The amplitude of the single strongest detected sinusoidal peak in the recording is represented by the size of the rectangles drawn on a scrolling LCD object (in this case jit.lcd). The “lyrebird” generative score process was also employed in Nature Forms II to create high, middle and low frequency visualisations of the field recording.
Detail from the high frequency “lyrebird” score of Nature Forms II [2015] (excerpt).
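The peak-to-rectangle process could be approximated offline like this — a crude FFT stand-in for sigmund~, not the jit.lcd patch itself; the frame length and size scaling are assumptions:

```python
import numpy as np

def strongest_peak(frame, sr):
    """Return (frequency, amplitude) of the strongest sinusoidal
    component in one analysis frame (a rough stand-in for sigmund~)."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    k = int(np.argmax(spectrum))
    freq = k * sr / len(frame)
    amp = spectrum[k] / (len(frame) / 4)  # rough normalisation for a Hann window
    return freq, amp

def rect_size(amp, max_px=40):
    """Rectangle size in pixels on the scrolling display, scaled by amplitude."""
    return max(1, int(round(min(amp, 1.0) * max_px)))
```

In the score player, frequency would set the rectangle’s vertical position and amplitude its size, frame by frame, as the display scrolls.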
Percussion part for Nature Forms II created using the generative score software “Semantics of Redaction” (excerpt).
The final form of visualization in Nature Forms II is derived from the work Semantics of Redaction (2014), in which notation is generated in real time from accents detected in a speech recording: graphical symbols of varying vertical position, size and colour are produced, determined by the frequency, amplitude and timbre of the recording at the accent point. In Nature Forms II the software detects accents created by sharp attacks, such as the chirping of crickets.
A number of approaches to sonification/resynthesis are used in Nature Forms II. The first is an additive synthesis approach using a patch called Sinereader to resonify greyscale spectrogram images.
Spectrogram of additive synthesis from Nature Forms II (excerpt).
Spectrogram of Ring Modulation synthesis from Nature Forms II (excerpt).
Processes were also developed to resynthesise sounds using spectral analysis. In the first process, the strongest sinusoidal component detected in each 40ms of the recording was resynthesized with a sinewave, ring-modulated according to the currently detected brightness of the recording. The aim was to retain the amplitude of sonic features of the field recording while maintaining an equivalent brightness.
This method is also used in portions of the performance to ring modulate the instruments via live processing, mimicking the parametric brightness of the field recording: the brightness of the live instruments is subtracted from that of the field recording to derive a ring modulation value.
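My reading of the ring-modulation resynthesis, as a rough offline sketch — the 40 ms frame size comes from the text; using the spectral centroid as the “brightness” measure is my assumption:

```python
import numpy as np

def centroid(frame, sr):
    """Spectral centroid as a simple 'brightness' measure."""
    spec = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1 / sr)
    return float(np.sum(freqs * spec) / (np.sum(spec) + 1e-12))

def ringmod_resynth(signal, sr, hop_s=0.04):
    """Per 40 ms frame: resynthesise the strongest sinusoid and
    ring-modulate it at the frame's spectral centroid."""
    hop = int(sr * hop_s)
    out = np.zeros(len(signal))
    for start in range(0, len(signal) - hop + 1, hop):
        frame = signal[start:start + hop]
        spec = np.abs(np.fft.rfft(frame))
        k = int(np.argmax(spec))
        f0 = k * sr / hop                  # strongest component's frequency
        amp = spec[k] / (hop / 2)          # its (rough) amplitude
        t = (start + np.arange(hop)) / sr
        carrier = amp * np.sin(2 * np.pi * f0 * t)
        mod = np.sin(2 * np.pi * centroid(frame, sr) * t)
        out[start:start + hop] = carrier * mod  # ring modulation = multiplication
    return out
```
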
Spectrogram of subtractive synthesis from Nature Forms II (excerpt).
Subtractive synthesis was also employed, using frequencies and amplitudes detected in the recording to bandpass white noise. At the opening of Nature Forms II, “coloured noise” performed by the instrumentalists is gradually shaped into the sonic structure of the field recording using subtractive synthesis, and then cross-faded with the source recording.
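The bandpassed-noise process might be sketched as follows — a hedged illustration with an assumed filter bandwidth (Q), not the patch used in the piece:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def noise_shaped(freq, amp, sr, dur_s, q=20.0):
    """White noise band-passed around a detected frequency and scaled
    by a detected amplitude — shaping noise toward the recording's
    sonic structure. Q (and therefore bandwidth) is an assumption."""
    noise = np.random.default_rng(0).standard_normal(int(sr * dur_s))
    bw = freq / q
    sos = butter(2, [freq - bw / 2, freq + bw / 2], btype="bandpass",
                 fs=sr, output="sos")
    return amp * sosfilt(sos, noise)

# e.g. a half-second band of noise centred on a detected 1 kHz feature
band = noise_shaped(1000.0, 0.5, 44100, 0.5)
```

Summing one such band per detected feature, frame by frame, gradually sculpts broadband noise into the spectral profile of the field recording.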
Nature Forms II explores the notion of eco-structuralism, maintaining what Opie and Brown (2006) term the “primary rules” of “environmentally-based musical composition”: that “structures must be derived from natural sound sources” and that “structural data must remain in series”.
The structure of the original work is conserved using the approach discussed in the miracle of the rose, where the temporal proportionality of the recording is retained by aligning multiple notation and resynthesis versions of the recording in visual representations that can be alternated or combined in the creation process of the score, processing and fixed media.
The work also uses “spectral freezing” of components of the field recording to create spectrally derived chords from features of the recording (bird sounds and a rusty gate), which are then transcribed into notation for the instrumentalists, and temporal manipulation of the recording to allow complex bird calls to be emulated at a human time-scale.
Visual representation of temporally proportional alignment of multiple resynthesis (a.–e.) and notation (e.–h.) versions of the recording in Nature Forms II (excerpt): a. field recording spectrogram; b. additive synthesis spectrogram; c. ring modulation resynthesis spectrogram; d. subtractive synthesis spectrogram; e. spectral “freeze” sonogram/score; f. “lyrebird” score; g. “semantics of redaction” score; and h. “spectral trace” score.
Recording generated with data from the source field recording using additive, subtractive and ring modulation resynthesis.

two rhizomes

21/10/2015

In Ubahn c. 1985: the Rosenberg Variations [2012] a rhizomatic, rather than single-plane, score model was introduced to the Decibel Scoreplayer, in which interlinked staves (or paths) can be freely traversed by the score-reader. Two new works, trash vortex [2015] and detritus [2015], feature scores comprising three layers: a graphical score, a rhizomatic path and an image collage (left). The “path” layer periodically (and indeterminately) becomes transparent, causing the graphical score to appear to “submerge” into the background collage – making it more challenging for the performers to read. The scores (in the Decibel Scoreplayer) send messages about the “path” layer’s status via OSC to a MaxMSP audio processing patch, which in turn alters the audio processing of the live instruments to reflect the state of the score.
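The OSC messages from the Scoreplayer to the Max patch would look something like the following — here built by hand with the Python stdlib to show the packet layout; the address pattern /score/pathOpacity is hypothetical (the actual Scoreplayer messages are not documented here):

```python
import struct

def osc_message(address, value):
    """Build a minimal OSC packet: a null-terminated address pattern,
    a type-tag string (',f' = one float), and one big-endian float32,
    each field zero-padded to a multiple of 4 bytes per the OSC spec."""
    def pad(b):
        return b + b"\x00" * ((4 - len(b) % 4) % 4)
    addr = pad(address.encode() + b"\x00")
    tags = pad(b",f\x00")
    return addr + tags + struct.pack(">f", value)

# e.g. path-layer opacity: 0.0 (fully submerged) .. 1.0 (fully visible)
packet = osc_message("/score/pathOpacity", 0.25)
```

A UDP socket (or a library such as python-osc) would then deliver the packet to the Max patch, where it could be mapped to processing depth.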
Layered arrangement of Graphical score, Rhizomatic Path and Background Image Collage in trash vortex.
detritus "audience view".
The player supports two forms of “view”: the “audience view”, a projection for the audience of the rhizomatic network map with dots representing each performer’s position in the score; and the “performer view”, in which each performer reads the layered score itself, with the path fading in and out.

detritus "performer view" (path faded in)
trash vortex "performer view" (path faded out).

Audio processing from sound and image in ...with the fishes...

21/10/2015

The score for …with the fishes… [2015] was built as a series of tableaux: oil rigs, flood, nuclear leak, deep sea, jellyfish/methane, submerged city, trash vortex. The tableaux were joined into a long image, and a score for viola, cello and double bass was then added on another layer. The score contains several references to musical objects: a recurring ship’s bell and several passages from Debussy’s La Mer. At an early stage of the work’s development the instruments were recorded separately performing the notation, and also performing passages with only the visual images from the tableaux. Once all of these elements had been created and had fixed temporal positions – the position of the scrolling score and the position of the recorded performances – they could be further developed through a range of interactions.
(above) (a) Spectrogram of the ship’s bell and (b) notation of the ship’s bell in the score of …with the fishes…
(above) The MaxMSP function object mapped to contours of the score background image, used to control spectral manipulation of the live performers and to pitchbend a spectrally “frozen” (with resent~) loop of Debussy’s La Mer in …with the fishes…
The temporally fixed notation and audio allowed the process to proceed in a manner akin to manipulating audio in a DAW: processing strategies could be auditioned; data could be derived from the score and used to control audio processing, pre-processed audio could be added and so forth.
At the simplest level, the sound of a ship’s bell was aligned with notation derived from a spectrogram of the same ship’s bell, and short processed passages from Roger Désormière’s classic 1950 recording of La Mer were aligned with the short quotations from the work.
The processing included convolution between sonifications of the score image and the performance of the same image by the musicians, producing hybrid sounds that combine machine and performer realisations of the images. The original image (a.) and the spectrogram of the convolution between the sonification of the image and its performance by the musicians (b.) are shown to the left.
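Offline, the convolution step reduces to something like this sketch — scipy’s fftconvolve standing in for whatever convolution tool was actually used:

```python
import numpy as np
from scipy.signal import fftconvolve

def hybridise(sonification, performance):
    """Convolve the machine sonification of an image with the players'
    performance of the same image, yielding a hybrid that carries the
    spectral colour of both (peak-normalised to avoid clipping)."""
    hybrid = fftconvolve(sonification, performance, mode="full")
    peak = np.abs(hybrid).max()
    return hybrid / peak if peak > 0 else hybrid
```

Because convolution multiplies the two spectra, only frequencies present in both the sonification and the performance survive strongly in the hybrid.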
(Right) (a) Image of the Fukushima radioactive leak and (b) spectrogram of the convolution of an instrumental reading of the image and a sonification of the image in …with the fishes…

residual drift [2015]

23/8/2015

To commemorate the 50th anniversary of Varèse’s death, a work was created using the composer’s percussion work Ionisation as a source. Apart from alluding to the origin of the piece in the residue of a Varèse work, the title uses terms from particle physics which perhaps bear the same “fantastic” qualities in our own time that the term ionisation did in 1931.
In theory residual drift could be performed in conjunction with a performance of the source work. A sonogram of a recording of Ionisation was used to generate a score and accompanying pre-recorded electronics. The image of the sonogram was rendered in Illustrator, focussing on the range of the bass flute (C2 to around C4). Because sonograms are probably least effective at visualizing percussive works, features such as snare drum rolls appear as continuous pitches. This deficit was put to use in transforming Ionisation into a score for a solo instrument that makes its sound through “continuous” (as opposed to discrete) actions. The first few seconds of the Ionisation sonogram and the score of residual drift are shown below (note the “piano roll” style pitch indication is used as a “playhead” for the scrolling score to orientate the performer).
The sonogram was processed in Photoshop and then Illustrator to further "smooth" the features and to reduce hue variation. The bass flute graphical score was rendered in broadly three hues, corresponding to varied tonal qualities of the bass flute:  normal tone, diffuse tone and noise/breath. In performance the method of rendering these qualities is determined by the performer. The instrument is amplified to accentuate these tonal variations.
A sonogram of the sonification rendered by Photoshop. 
A sonogram of the sonification rendered by Illustrator. 
A sonogram of the false harmonics sonification.
The electronic component was rendered by resonifying the sonogram using processes first developed for my work 
unhörbares wird hörbar [2013] and discussed here and here. Originally a long file (24294px) was used, generating 38 minutes of audio that was compressed to 8 minutes, but finally a short 4859px version was sonified, creating an audio file of roughly 8 minutes without compression. The change was made to avoid the continuous “wow and flutter”-like effects created by compressing the file by a factor of five.
The final version of the electronics part incorporates three versions of the sonification: one processed by Photoshop, one processed by Illustrator with added false harmonics, and a version featuring a sonogram of just the upper harmonics of the recording (which were excluded from the image used to make the score). The high harmonics version prominently features the imprint of Ionisation’s characteristic sirens.
Because the resonification software I built renders the audio as a mono file, the audio was also spatialised using a process similar to that used in my work in nomine tenebris [2014]: pitches detected in the audio file were used to spatialize the mono track. Detected pitches were also used to generate “false harmonics” in one of the recordings, by ring modulating the audio multiple times to add higher but related upper frequencies.
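The “false harmonics” idea — repeated ring modulation at the detected pitch — can be illustrated as follows; the number of passes and the mix level are my assumptions:

```python
import numpy as np

def false_harmonics(signal, sr, f0, passes=3, mix=0.5):
    """Add 'false harmonics' by repeatedly ring-modulating the signal
    with a sinusoid at its detected pitch f0: each pass shifts energy
    to sum and difference frequencies, stacking higher, pitch-related
    partials onto the original."""
    t = np.arange(len(signal)) / sr
    mod = np.sin(2 * np.pi * f0 * t)
    out = signal.astype(float)
    layer = signal.astype(float)
    for _ in range(passes):
        layer = layer * mod          # ring modulation
        out = out + mix * layer      # mix each new layer in
    return out
```

For a 440 Hz input modulated at its own pitch, successive passes add components at 880 Hz, 1320 Hz, and so on — higher but harmonically related to the source.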
The three audio files were mixed together with some EQ and audio compression to create the final electronic part. The scrolling score and audio file are both played in the Decibel Scoreplayer. The work was written for and is dedicated to Cat Hope.

evolution of the score for between-states

12/8/2015

Twelve transformations of acoustic bass flute and bass clarinet recordings were created using various audio processing ideas I’ve been exploring. The transformations were then edited, leaving the most interesting outcomes in their original temporal positions – and therefore all relating back to the source acoustic recording (see right). The audio was mixed down and a score was created from a sonogram of the resulting track. The initial idea was to construct a notated score from the sonogram. The process involved annotating the sonogram as a MIDI layer in Sonic Visualiser and then exporting the MIDI file and audio to Finale notation software for score editing.
patchwork of recordings of audio transformations
between-states audio sonogram (detail)
score created in Finale from MIDI-annotated sonogram (detail) 
Unfortunately, in this case I was unable to find a satisfactory accommodation between the degree of detail in the score and the scroll-rate required to read it: if it was over-detailed it had to move too quickly to be read accurately. For the first performance a simplified sonogram score (below) was used, with a pitch-guide fixed at the left of the page. The pitch-guide replaced the single-line “playhead” that is used in most scrolling scores in the Decibel Scoreplayer. I had used the same approach to read Percy Grainger’s Free Music in the Mac version of the Scoreplayer (this was later coded into the iOS version by Aaron Wyatt). The pitch-guide had the advantage of providing the performers with more “morphological” information about the sonic shapes, and could also be scrolled more slowly than a conventional score. However, in this version each player was allowed to choose which shapes they would render with their instrument.
simplified sonogram score (detail)
The (possibly) final version of the score identifies the sounds to be performed by each performer (bass flute – red, bass clarinet – green) and contains text annotations and hue variations to represent different timbres. Since the parts are more defined, the full (unsimplified) sonogram can be re-added, giving the performers a greater indication of the context in which their sounds are heard.
annotated sonogram score (detail) "pitch-guide" to the left

new recording of web of indra [1993]

6/8/2015

Web of Indra [1993], originally written for Magnetic Pig, got a dust-off from the trio Mixt. In this acoustic version of the work percussionist Paul Tanner also manages to work some magic emulating the cello part. This is their rough mix of the recording. Thanks guys!

my spread in metal mag

3/8/2015

Graphic scores by Jaap Blonk and Lindsay Vickery in the fashion mag METAL.

trash vortex or ...with the fishes...

2/8/2015

Forty percent of the summer Arctic sea ice melts, and here we’re literally watching the death throes of the planet, and these corporations, like Shell, look at it as a business opportunity.   (Hedges, C. (2013) Rise Up or Die. Truthdig)
"The skeptics and denyers were hanged by their feet - heads submerged beneath the rising tide".
In 1979, a military team arrived to gather up contaminated soil and debris, mixing it with cement and piling the sludge into a 350-foot-wide blast crater on Runit Island in the atoll's east. When the mound reached 25 feet high, army engineers covered it with a saucer-shaped concrete cap. A 2008 field survey of the Cactus Dome noted that 219 of its 357 concrete panels contained defects such as cracks, chips, and vegetation taking root in joints. (Atlas Obscura (2013). Cactus Dome: A Concrete Cap for a Nuclear Crater. Slate)
Other contributors to the jellyfish boom are the “dead zones” created by what scientists call “eutrophication.” That’s when farming pesticides and sewage pumped into rivers meet the ocean. Japan’s now-annual bloom of Nomura jellyfish, which each grow to be the size of a large refrigerator, capsized and sank a 10-ton trawler when the fishermen tried to haul up a net full of them. (Guilford, G. (2013). Jellyfish are taking over the seas, and it might be too late to stop them. QZ)
Our first observations of elevated methane levels, about ten times higher than in background seawater, were documented . . . we discovered over 100 new methane seep sites… If even a small fraction of Arctic sea floor carbon is released to the atmosphere, we’re f’d. A feedback loop where warming seas release methane that causes warming that releases more methane that causes more warming, on and on until the planet is incompatible with human life. (Jason Box)
Richardson, J. H. (2015) When the End of Human Civilization Is Your Day Job. Esquire.   
Captain Moore had wandered into a sump where nearly everything that blows into the water from half the Pacific Rim eventually ends up, spiraling slowly toward a widening horror of industrial excretion. For a week, Moore and his crew found themselves crossing a sea the size of a small continent, covered with floating refuse. It was not unlike an Arctic vessel pushing through chunks of brash ice, except what was bobbing around them was a fright of cups, bottle caps, tangles of fish netting and monofilament line, bits of polystyrene packaging, six-pack rings, spent balloons, filmy scraps of sandwich wrap, and limp plastic bags that defied counting. p. 121
There is more plastic by weight than plankton on the ocean's surface. p. 124
By 2005, Moore was referring to the gyrating Pacific dump as 10 million square miles—nearly the size of Africa. p. 125
"Except for a small amount that's been incinerated," says Tony Andrady the oracle, "every bit of plastic manufactured in the world for the last 50 years or so still remains." p. 126 (Weisman, A. (2007). The World Without Us. Thomas Dunne: New York)
