Music


On the Disquiet Junto Slack, one of our members posted that they’d had a dream:

I had a dream about a piece of gear last night. I wouldn’t say that it was “dream gear,” though it was still cool. It was a small black metal box, about the size of three DVD cases stacked on top of each other. There were a few knobs and sliders, a small 3-inch speaker, headphone out, and a telescoping antenna, so it kinda looked like a little radio at first. The antenna was there for radio reception but there was other stuff going on. It was intended to be used as a meditation/sleep aid/ASMR machine. There were sliders for a four-band EQ and a tuning knob for the radio. The tuning knob had a secondary function that tuned a drone sound (kinda sounded like a triangle wave fed through a wavefolder/resonance thinger). The other feature of this box was something like a numbers stations generator. Another slider was for the mix between the drone and a woman’s voice speaking random numbers and letters from the NATO alphabet in a Google Assistant-/Alexa-/Siri-type voice but with far less inflection. The four-band EQ was to be used like a mixer as well in that it was how a person could adjust how much of the radio signal was audible over the drone/numbers by using the output gain of the EQ. There was also a switch that fed the drone/numbers signal into the EQ as well. The EQ was intentionally low-quality so that when you took it above 0dB, it would distort.

The Disquiet Junto Slack, #gear channel

Now, what was weird was that I’d been doing something like this in AUM: I had a quiet ambient Dorian sequence driven by ZOA on several instances of KQ Dixie (a DX7 emulator), and was using Radio Unit (a radio streaming AU) to layer in some birdsong. I realized I could mostly emulate the dream box if I added another Radio Unit to pull in some random stations, but generating the “numbers station” audio was more of a challenge – until I remembered that OS X has the say command, which lets you use the built-in speech synthesizers to pronounce text from the command line.

I sat down, and after some fiddling (and looking up “how to add arbitrary pauses” so the rhythm was right), I created NATO::Synth to create the strings I wanted and pass them to say. It has a few nice little tweaks, like caching the strings created so it can decide to repeat itself, and properly inflecting the start and end of each “sentence”.
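NATO::Synth itself isn’t reproduced here, but the core trick is small enough to sketch. This is a minimal Python sketch of the idea, not the actual script: build a random string of digits and NATO words, join them with the speech synthesizer’s `[[slnc 500]]` embedded command (500 ms of silence, the “arbitrary pauses” trick), and hand the result to say.

```python
import random
import subprocess
import sys

NATO = ("Alfa Bravo Charlie Delta Echo Foxtrot Golf Hotel India Juliett "
        "Kilo Lima Mike November Oscar Papa Quebec Romeo Sierra Tango "
        "Uniform Victor Whiskey Xray Yankee Zulu").split()

def numbers_phrase(length=5):
    """One 'numbers station' phrase: a mix of digits and NATO words,
    joined with [[slnc 500]] embedded commands (500 ms of silence each)."""
    tokens = [random.choice(NATO) if random.random() < 0.5
              else str(random.randrange(10))
              for _ in range(length)]
    return " [[slnc 500]] ".join(tokens)

if __name__ == "__main__":
    phrase = numbers_phrase()
    print(phrase)
    if sys.platform == "darwin":        # say(1) only exists on macOS
        subprocess.run(["say", phrase])
```

The real script also caches generated strings so it can repeat itself and adjusts inflection at sentence boundaries; this sketch covers only the string generation and the pause trick.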

I saved the generated audio (recorded with Audio Hijack) to iCloud, loaded it into AUM, and then recorded the results. Very pleased with it!

My last performance was not as smooth as I hoped, so this time I decided that I would find a way to streamline it even further.

I decided to go further in the direction I’d taken with the Wizard of Hz show, and strip down even more. I decided to try to perform as much as possible of the set on the iPad, and use the laptop solely for streaming and Second Life. This freed me from hassles in switching setups in VCVRack, Live, and the other software I’d been using, but it also meant that I wouldn’t be using either of my favorite synths for this performance (the Arturia 2600 and Music Easel).

Having had some time between performances to really experiment with AUM, I felt comfortable using it to lay out my performance. I decided that I wanted to keep Scape as my background/comping program, and that I’d set up a series of light-handed scapes to give me a through-line. I then sat down with MIRack and Ripplemaker to create multiple Krell textures that I could bring in and out, and also discovered a couple of lovely lead patches for Ripplemaker that I paired with a Kosmonaut looper. I also brought in a couple of public-domain samples from old sci-fi movies, heavily processed with Kosmonaut again, and felt like I had enough material to do an hour’s performance.

I used the iConnect Audio4+, which I now finally have the hang of, and set it up so that I had two stereo channels from the iPad and one mono channel routed to the iPad through Kosmonaut (again!) for some subtle reverb when I was doing my intro and outro. With the setup I used, the iConnect kept the iPad fully charged through the whole set.

I used Loopback to connect the multiple outs from the iConnect to the stereo ins on my Mac, and monitored on headphones. I pulled up Audio Hijack, entered the stream setup, and was ready to broadcast.

I got up early on the day, started up AUM, and ran a soundcheck to make sure everything was working. All sounded good, and I was good to go.

Mostly.

I didn’t stop AUM, and as a result, it ran for several hours before I tried to start using it. This apparently triggered some kind of memory shortage, and when I started streaming, I was completely mute. Fortunately, I’d cued up a prerecorded VCVRack texture, and started that while I was trying to figure out what was wrong. I gave up and restarted the iPad, and AUM came up like a champ.

After that it was pretty smooth. I was able to fade the various patches in and out, play the sci-fi samples, and improvise over the Scape-provided background. Once it was off the ground, the performance was very easy to do. I did forget and leave the audio feed from Second Life enabled, so as a result this was a very sparse performance, but the sparseness worked out very well.

Overall this was a great way to do a performance and I plan to refine this further. Of particular note is that AUM saves things so well that it will be trivially easy to do this performance again, should I decide to; this is probably the first time I’ve had a performance setup I felt was robust enough to say that!

Last time I did a live streaming performance for an audience, it did not go well. I had long pauses, the mic didn’t work, and miscommunication over Slack to the remote venue resulted in my getting cut off before my set was finished. And this was even after a good bit of practice.

So when I signed up for the Wizard of Hz concert on RadioSpiral, I decided that I needed to have as much backstop as possible in place, so that no matter how tangled up I got mentally, I’d have a fallback to something that sounded good and made a nicely navigable arc from point A to point B. Ideally, I should have something that would sound great even if I got called away for the entire set!

My go-to process for this is Scape. I’ve had it since it first came out, and it meshes very well with what I enjoy hearing and enjoy playing. I started off with the Scape playlist that I often use to relax and get to sleep; this is a seven-scene playlist, with the transition time at max, with the per-scene time adjusted to be just a bit over an hour. This gives me a fallback for the whole hour; I can pull everything else back and lean on Scape while I decide what the next section should be.

In addition, Scape provides a very nice backdrop to improvise over, so I can be playing something while Scape gives me a framework.

I then put together a couple of Ableton Live sets: one built on the Arturia ARP 2600 and Buchla Music Easel emulations, and another built on Live’s really nice grand piano and the open-source OB-Xa emulator, the OB-Xd. I finally figured out how to change patches on the OB-Xd about 20 minutes before showtime.

I had set up a piano with a nice looping effect from Valhalla Supermassive (Supermassive and Eventide Blackhole figured heavily in the effects), but ended up not using it, and doing a small Launchpad set instead using the Neon Lights soundpack.

I was also able to open and close with the large singing bowl, played live and processed through the Vortex, which was a nice real analog performance touch.

Overall, I strove for a set that sounded played-through, but that had enough breathing room that I could fall back on Scape while making changes (switching Live sets, etc.), and I think I achieved that.

I did have Audio Hijack recording the set, so if it sounds OK, I’ll be releasing it on Bandcamp. (Followup: it came out pretty well! Definitely at least an EP.)

The only real issue was a partially-shorted cable between my iPhone and the mixer, which I didn’t figure out until most of the way through the set.

The Disquiet Junto is doing an alternate tunings prompt for week 0440 (very apropos!).

I’ve done several pieces before using Balinese slendro and pelog tuning, most notably Pemungkah, for which this site is named. I wanted to do something different this time, using Terry Riley’s tuning from The Harp of New Albion, using Logic Pro’s project tuning option.

The original version was a retuning of a Bösendorfer grand to a modified 5-limit tuning:

However, Logic’s tuning feature needs two things to use a tuning with it:

  • Logic’s tuning needs to be based on C, not C#.
  • The tuning has to be expressed as cents of detuning from the equal-tempered equivalent note.

This means doing quite a few calculations to get the tuning into a format Logic will accept.
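The arithmetic for the second requirement is straightforward: a just-intonation ratio sits 1200·log2(ratio) cents above the root, and the detuning Logic wants is that value minus 100 cents for each equal-tempered semitone. A small Python sketch (the 5/4 third here is just an illustration, not a row from Riley’s actual table):

```python
import math

def cents_offset(ratio, semitones):
    """Detuning, in cents, of a just-intonation ratio relative to the
    equal-tempered note the same number of semitones above the root."""
    return 1200 * math.log2(ratio) - 100 * semitones

# A pure major third (5/4) against four equal-tempered semitones
# comes out roughly 13.7 cents flat.
print(round(cents_offset(5 / 4, 4), 1))
```

Rebasing from C# to C is then just a matter of rotating the scale so that the interval originally assigned to C# is applied starting from C before computing the offsets.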


I’m still learning the ins and outs of VCVRack; there are so many interesting modules available, and so many different possible directions to go in!

I’m starting to lean toward something in the middle ground between Berlin School sequencing and complete wacked-out crazy, searching for an ambient location somewhere in that space. Jim Frye posted a video of his beautiful “Clouds of Europa” patch on the VCVRack forums yesterday, and I transcribed it from the video to see how it works. After some experimentation, I tweaked the settings of the macro oscillators and added a fourth one, put envelopes on them to add some more air, added some LFO action to vary the sound a bit, and lengthened the delay time to add some more texture to the bass.

I will probably revisit this patch and change over the Caudal to the Turing Machine and see what I can do with that as the source of randomness to feed Riemann, but I’m very happy with the result so far.

My original iPad finally bit the dust in August, just before I could get a final good backup of it. Most of the stuff on it was already backed up elsewhere (GMail, Dropbox, iCloud), but Scape was the exception.

Scape isn’t (at least not yet) able to back up its files to the cloud, so there wasn’t anyplace else to restore from — except that I had taken advantage of the fact that under iOS 5, the files in the app were still directly readable using Macroplant’s iExplorer, so I had actually grabbed all the raw Scape files and even the Scape internal resources. Sometime I’ll write up what I’ve figured out about Scape from those files…

The Scape files themselves are just text files that tell Scape what to put on the screen and play, so the files themselves were no problem; they don’t include checksums or anything that would make them hard to work with.


Version:0.20
Mood:7
Date:20121113025954
Separation:0.50
HarmonicComplexity:0.50
Mystery:0.50
Title:Scape 117
Steam Factory,0.50,0.50,1.0000
Spirit Sine Dry,0.23,0.31,3.1529
Spirit Sine Dry,0.40,0.36,3.4062
Spirit Sine Dry,0.64,0.19,3.9375
Spirit Sine Dry,0.55,0.49,1.0065
Spirit Sine Dry,0.26,0.67,3.5039
Spirit Sine Dry,0.76,0.54,3.1211
Spirit Sine Dry,0.49,0.79,3.8789
Spirit Sine Dry,0.46,0.17,3.9766
Spirit Sine Dry,0.85,0.27,2.0732
Spirit Sine Dry,0.90,0.53,1.5154
Spirit Sine Dry,0.66,0.72,3.6680
Spirit Sine Dry,0.15,0.55,2.2527
Spirit Sine Dry,0.11,0.80,1.9320
Spirit Sine Dry,0.32,0.88,4.1289
Spirit Sine Dry,0.18,0.14,3.2779
Spirit Sine Dry,0.81,0.11,3.0752
Spirit Sine Dry,0.49,0.56,1.7528
Spirit Sine Dry,0.82,0.80,3.3783
Bass Pum,0.53,0.46,1.8761
Thirds Organ Pulsar Rhythm,0.50,0.50,1.0000
End
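Because the format is that simple, pulling a scape file apart takes only a few lines. Here’s a hedged Python sketch: header lines are Key:Value pairs, element lines are a name plus three numeric fields (the first two look like screen position; the meaning of the last is my guess), and a bare End line closes the file.

```python
def parse_scape(text):
    """Split a Scape file into header fields and element placements.
    Header lines look like 'Key:Value'; element lines look like
    'Name,x,y,param' (the last field's meaning is a guess); a bare
    'End' line terminates the file."""
    header, elements = {}, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line == "End":
            continue
        if "," in line:
            # rsplit from the right so commas in element names survive
            name, x, y, param = line.rsplit(",", 3)
            elements.append((name, float(x), float(y), float(param)))
        else:
            key, _, value = line.partition(":")
            header[key] = value
    return header, elements
```

Running this over the file above yields a header dict (Version, Mood, Title, and so on) and one tuple per placed element.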

I wrote to Peter Chilvers, who is a mensch, and asked if there was any way to just import these text files. He replied that there unfortunately wasn’t, but suggested that if I still had access to a device that had the scapes on it, I could use the share feature and mail them one by one to my new iPad, where I could tap them in Mail to open them in Scape and then save them.

At first I thought I was seriously out of luck, but then I figured, why not share one from the new iPad and see what was in the mail? I did, and found it was just an attachment of the text file, with a few hints to iOS as to what app wanted to consume them:


Content-Type: application/scape; name="Scape 10";x-apple-part-url=Scape 10ar; name="Scape 10ar.scape"
Content-Disposition: inline; filename="Scape 10ar.scape"
Content-Transfer-Encoding: base64

Fab, so all I have to do is look through five or six folders containing bunches of scape files that may or may not be duplicates, build emails, and…this sounds like work. Time to write some scripts. First, I used this script to ferret through the directories, find the scapes, and bring them together.


use strict;
use warnings;
use File::Find::Rule;

# Find all *_scape.txt files, skipping the Scape.app bundle itself.
my $finder = File::Find::Rule->new;
my $scapes = $finder->or(
    $finder->new
           ->directory
           ->name('Scape.app')
           ->prune
           ->discard,
    $finder->new
           ->name('*_scape.txt')
);

my $seq = "a";
for my $scape ($scapes->in('.')) {
    (my $base = $scape) =~ s/_scape\.txt//;

    # Pull the scape's title out of the file to use as its new name.
    my $title;
    open my $fh, "<", $scape or die "can't open $scape: $!";
    while (<$fh>) {
        chomp;
        next unless /Title:(.*)$/;
        $title = $1;
        last;
    }
    $title =~ s[/][\\/]g;    # keep slashes in titles from breaking the path
    if (-e "$title.scape") {
        # Duplicate title: tack on a distinguishing letter.
        $title = "$title$seq";
        $seq++;
        die if $seq gt "z";
    }
    system qq(mv "$scape" "$title.scape");
    system qq(mv "$base.jpg" "$title.jpg");
}

I decided it was easier to do a visual sort using the .jpg thumbnails to spot the duplicates and filter them out; I probably could have more easily done it by checksumming the files and eliminating all the duplicates, but I wanted to cull a bit as well.
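For completeness, the checksum route would have been simple too. A short Python sketch (directory layout here is illustrative) that keeps one file per unique content digest:

```python
import hashlib
from pathlib import Path

def unique_scapes(directory):
    """Group .scape files by content hash and keep one path per digest,
    dropping byte-for-byte duplicates."""
    seen = {}
    for path in sorted(Path(directory).glob("*.scape")):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        seen.setdefault(digest, path)   # first occurrence wins
    return list(seen.values())
```

The tradeoff is exactly the one mentioned above: hashing only catches byte-for-byte duplicates, while eyeballing the thumbnails also let me cull scapes I simply didn’t want to keep.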

So now I’ve got these, and I need to get them to my iPad. Time for another script to build me the mail I need:

#!/usr/bin/env perl

=head1 NAME

bulk_scapes.pl - recover scape files in bulk

=head1 SYNOPSIS

    MAIL_USER=gmail.sendername@gmail.com \
    MAIL_PASSWORD='seekrit' \
    RECIPIENT='icloud_user@me.com' \
    bulk_scapes

=head1 DESCRIPTION

C<bulk_scapes.pl> will collect up all the C<.scape> files in a directory
and mail them to an iCloud user. That user can then open the mail on their
iPad and tap the attachments to restore them to Scape.

This script assumes you'll be using GMail to send the files; create an app
password in your Google account to use this script to send the mail.

=cut

use strict;
use warnings;
use Email::Sender::Simple qw(sendmail);
use Email::Sender::Transport::SMTP;
use MIME::Entity;

my $top = MIME::Entity->build(
    Type    => "multipart/mixed",
    From    => $ENV{MAIL_USER},
    To      => $ENV{RECIPIENT},
    Subject => "recovered scapes",
);

# Loop over files and attach. MIME type is 'application/scape'.
my $n = 1;
for my $file (glob "*.scape *.playlist") {
    my ($part, undef) = split /\./, $file;

    # Use the scape's title as the attachment name, if it has one.
    open my $fh, "<", $file or die "Can't open $file: $!\n";
    my $name;
    while (<$fh>) {
        chomp;
        next unless /Title/;
        (undef, $name) = split /:/;
        last;
    }
    close $fh;
    unless ($name) {
        $name = "Untitled $n";
        $n++;
    }
    $top->attach(
        Path => $file,
        Type => qq(application/scape; name="$name";x-apple-part-url=$part),
    );
}

my $transport = Email::Sender::Transport::SMTP->new(
    host          => 'smtp.gmail.com',
    port          => 587,
    ssl           => 'starttls',
    sasl_username => $ENV{MAIL_USER},
    sasl_password => $ENV{MAIL_PASSWORD},
);

sendmail($top, { transport => $transport });

I was able to receive this on my iPad, tap on the attachments, and have them open in Scape. Since there were a lot of these, it took several sessions over a week to get them all loaded, listened to, saved, and renamed using Scape’s edit function (the titles did not transfer, unfortunately).

So now I have all my Scapes back, and I’m working through the program, trying to get to the point where I have all the objects enabled again. I haven’t played with it in a while, and I’m glad to be rediscovering what a gem this app is.

Radio Free Krakatau
Composed and performed entirely in VCVRack.

Based on a picture of a VCVRack setup I saw on Facebook. I was able to figure out some of the connections and settings, but not all of them; this is a record of my explorations of that set of modules as I increased the complexity of the interconnections.

Sadly, the VCVRack savefiles were lost, so this is the only record of this performance.

This zip file (NotStraightLines-0.6.0-mac) is a Mac version of the NotStraightLines plugin. To make it super clear, this is all Andrew Belt’s work; I just built a binary!

The sound design for the Rachel Rising project, a multimedia presentation of a song cycle as poetry, video, and music. This project had been stalled for some time, as I was trying to figure out how to get the effect I wanted for an installation; I think I’ve finally got a couple of good ideas that revolve around using the internals of the Scape app to build a custom set of soundscapes from my own samples and visuals. More on that in the Scape posts to come.

Jamming as part of Shojo Blue – Shojo Blue kind of fell apart, as we were only three people, our guitarist left, and then I was overwhelmed by work. Greg Hurley has moved into his house as of last weekend, so perhaps we’ll start working on Western Skies sometime soon.

Considering whether the StillStream iOS app needs an iPad-specific version, and looking over the new interfaces in iOS 6. The Facebook API looks interesting, and it’s probably time to convert to storyboards and ARC. Given more recent reports of hiccups and hangs, I think I’m going to have to move forward at least to iOS 5, and I should probably go to 6 (perhaps as StillStream Radio Plus, re-releasing the old app as StillStream Radio Classic).

My studio’s been half-assembled for the last six months (other things have been taking up my free time). I’ve decided that today I’ll try to get it reassembled and back to the setup I had when I was playing at Different Skies. That lets me use everything together again, gets the Vortex back in business, and lets me get some music recorded again. I have gotten back to the point where MIDI into the laptop works, but a hard disk crash on one of my server disks has caused me to temporarily disconnect the studio machine to get that sorted out again. I do have the Firebox working for MIDI and audio input again, but don’t yet have the Vortex reintegrated.

Surprisingly, I’m finding that more and more I prefer the virtual instruments I have on the laptop to the internal sounds in the SD-1. Those are still good, mind you – I’ve still not got as playable a sax, flute, violin, and cello anywhere else – but the GarageBand pianos feel better than the internal ones, and they sound right. I’ve just started to scratch the surface on Live at this point, and am considering whether I should spring for the Suite upgrade now so I get the free Live 9 upgrade when it comes out.

I’m still working on the portable setup, and actually have played out live with the iPad and my computer speakers (it all fit in the backpack except for the subwoofer, which is pretty good). Interesting new iPad instruments keep coming out and are quite impressive; I am wondering whether a Mini would do as a performance device if I stuck to MIDI; the compacted display is already a little tight for the apps that have on-screen keyboards, and I can’t see that being better. I am starting to see things evolving away from my iPad 1 at this point – Scape won’t run in the background there, and Borderlands flat out says it’s not good on the iPad 1.
