For this week’s assignment, I used a combination of a set and a list to transform song lyric source texts. Songs are usually repetitive, but by putting the lyrics into a set, I could strip out every repetition, with interesting effects. By casting the resulting set back into a list, I could then shuffle the words and join them into a final poem. To add some rhythmic variation, I had the program replace commas and end-of-sentence punctuation with line breaks.
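The whole pipeline can be sketched in a few lines of Python. This is a minimal reconstruction of the process described above, not the actual program; the snippet of lyrics and the variable names are placeholders.

```python
import random

# Placeholder source text standing in for the full song lyrics.
lyrics = "He hit me and it felt like a kiss. He hit me, but it didn't hurt me."

# A set keeps only one copy of each word, stripping the song's repetition.
unique_words = list(set(lyrics.lower().split()))

# Casting back to a list makes the words shuffleable.
random.shuffle(unique_words)
poem = " ".join(unique_words)

# For rhythmic variation, turn commas and sentence-ending punctuation
# into line breaks.
for mark in (",", ".", "!", "?"):
    poem = poem.replace(mark, "\n")

print(poem)
```

Because the shuffle is random, every run yields a different poem from the same set of unique words.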
tenderness ’cause hurt arms a untrue told feel you all someone me say stay but when hear if like for in there stand he new was baby
couldn’t took and kiss is felt to it never loved mad won’t glad knew him didn’t could the had care been hit i made have his that with
The source text was the lyrics of “He Hit Me (And It Felt Like A Kiss),” written by Carole King and Gerry Goffin and originally released by The Crystals in 1963. In selecting this text (and the others I also ran through the program – “Wishin’ and Hopin'” and “Wives and Lovers” by Burt Bacharach and Hal David, and “Run For Your Life” by John Lennon and Paul McCartney), I chose old pop songs with lyrical themes of misogyny and/or sexism. In rearranging the words, I hoped to create a healthier, more positive context and render unrecognizable the lyrics’ original elements of violence and inequality.
Source code here.
For this assignment, I tried a number of different source texts. My favorite output came from using this Brownstoner blog post about a piece of Brooklyn real estate. You can find the source code on my GitHub page.
If you like modern, you might dig this muthafuckin’ streamlined one-family warehouse-to-loft conversion at 90 Wyckoff Street in Boerum Hill. The straight up 72-foot-long open living area on the dirty ass first floor gets light from a weak ass skylight in the dirty ass rear extension as well as another skylight over the dirty ass stairs going all the dirty ass way up to the dirty ass third floor, although there are no side windows.
Upstairs, the weak ass two bedroom floors have a muthafuckin’ more conventional layout; altogether there are four bedrooms, 3.5 baths, and rooms for storage or a fuckin’ office. Oh, and there’s parking!
It’s been on the wack ass market at least since May, when it was asking $7,990,000. The muthafuckin’ price recently dropped to $7,200,000, as Curbed pointed out earlier this wack ass week. That’s nowhere near the wack ass $8,880,000 someone recently paid for Michelle Williams’ and Heath Ledger’s old pad at 126 Hoyt Street, but it’s still one of the wack ass most expensive properties in the wack ass area. What do you think it’s worth?
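Judging from the output above, the program seems to drop a random intensifier in after articles and determiners. Here is a guess at that technique in Python; the word lists, trigger words, probability, and function name are my own placeholders, not the author’s code.

```python
import random

# Hypothetical list of intensifiers matching the sample output above.
INTENSIFIERS = ["muthafuckin'", "dirty ass", "weak ass", "wack ass"]

def gangstafy(text, chance=0.5):
    """After each article/determiner, maybe insert a random intensifier."""
    out = []
    for word in text.split():
        out.append(word)
        if word.lower() in ("the", "a", "this") and random.random() < chance:
            out.append(random.choice(INTENSIFIERS))
    return " ".join(out)

print(gangstafy("The price recently dropped, as Curbed pointed out earlier this week."))
```

Tuning `chance` controls how saturated the result gets; the sample above reads like a fairly high setting.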
Here is my concept for a generative acid house music sequencer interface:
My first attempt at rigging a 3D model.
Attempted to do something funky in Maya.
For this week’s homework, I wanted to expand upon some of the concepts I worked on in my previous assignment, a melodic sequencer that I created in Processing. This time around, I took MIDI and notes out of the equation and instead concentrated on rhythm with the intention of creating a generative drum machine concept that I had been thinking about for a while.
To accomplish this, I again turned to Processing and utilized the Beads library to handle timekeeping duties, and the Minim library’s AudioPlayer to trigger samples of the classic Roland TR-707 drum machine, which I had in .wav format. Using the class example for creating a clock in Beads was a good starting place, and getting the samples to play back was fairly straightforward thanks to Minim’s documentation.
CODE IS HERE: https://github.com/devincurry/TheCodeOfMusic/tree/master/ED707v01
Instead of making a programmable step sequencer in the mold of a normal drum machine, I wanted to make a sequencer that played generative rhythms. However, I also wanted the rhythms to be musical, not randomly chaotic. To accomplish this, I first created a 16-step sequencer for playback. For each of the 6 different drum sounds, I hardcoded which steps could possibly play. Then, I used a weighted random to determine whether the sound would play during each of its eligible steps.
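The pattern-generation logic above can be sketched as follows. The actual ED-707 is a Processing sketch; this is a Python illustration of the same idea, and the eligible-step masks and probabilities are illustrative values, not the ones from the real project.

```python
import random

STEPS = 16

# For each drum sound: which of the 16 steps *may* trigger
# (1 = eligible, 0 = never plays), plus the probability that an
# eligible step actually fires. Values here are placeholders.
drums = {
    "kick":  {"eligible": [1,0,0,0, 1,0,0,0, 1,0,0,0, 1,0,0,1], "prob": 0.9},
    "snare": {"eligible": [0,0,0,0, 1,0,0,0, 0,0,0,0, 1,0,1,0], "prob": 0.8},
    "hat":   {"eligible": [1,0,1,0] * 4,                        "prob": 0.6},
}

def generate_bar(drums):
    """Roll the weighted random once per eligible step for each sound."""
    pattern = {}
    for name, d in drums.items():
        pattern[name] = [
            1 if d["eligible"][s] and random.random() < d["prob"] else 0
            for s in range(STEPS)
        ]
    return pattern

# Print one generated bar as a simple x/. grid.
for name, steps in generate_bar(drums).items():
    print(f"{name:>6}: " + "".join("x" if s else "." for s in steps))
```

Because hits can only land on hand-picked eligible steps, the output stays groove-like rather than chaotic, while the per-step probability keeps each bar varied.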
Once I got this working, I added some user interface elements: the keyboard keys 1-6 toggle each of the 6 sounds on and off. Red “lights” in the sketch provide visual feedback about which instruments are active, and red text updates with the current step number. I also added a brief written explanation of the interface so that anyone can hopefully use it on their own. Lastly, I included the “OCP” logo (from the original RoboCop movies) and the name of the instrument, ED-707 (a combination of the combat robot ED-209 from RoboCop and the TR-707 drum machine).
For the homework assignment this week, I worked off of the sample Processing code shown in class and continued to use the MidiBus library. My main goal was to create an interface that would allow for controlling the steps in a 16-step sequence.
In my prototype, I used boolean flags to determine whether a particular step would play. Each active step plays a randomly chosen note from an array containing the MIDI values of the C-major scale. I sent the MIDI signal into a Logic soft synth for playback. In the GUI, which I created using Processing’s 2-D primitives (and which features a color scheme referencing the classic Roland TR-808 drum machine), an active step “button” lights up when it is about to play.
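The step logic described above can be sketched briefly. The prototype itself is in Processing; this Python sketch shows the same idea, where the MIDI note numbers are the real C-major values from C4 to C5 but the on/off pattern is a placeholder.

```python
import random

# MIDI note numbers for the C-major scale, C4 (60) through C5 (72).
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]

# Placeholder 16-step on/off pattern (the boolean flags per step).
steps_on = [True, False, True, True] * 4

def notes_for_bar(steps):
    """Return the MIDI note to send on each step, or None if the step is off."""
    return [random.choice(C_MAJOR) if on else None for on in steps]

print(notes_for_bar(steps_on))
```

In the Processing version, each non-`None` value would be sent out over MIDI to the Logic soft synth on its step.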
I have several goals for taking this project further. I want to add mouse functionality so that each step can be toggled on and off by clicking its respective button graphic. I would also like to implement the Beads library for a more reliable clock and for added tempo control. Lastly, I’d like to clean up my code by using object-oriented programming and making each button its own class.
The code as currently constructed is available on my GitHub: