ICM Final: The Impulse and the Response

For my final project in Intro to Computational Media, I decided to focus on the p5 Sound library’s convolution reverb functionality. Reference for that object can be found here. Technically, I have been interested in p5’s ability to handle sound in the browser. Conceptually, I have been interested in convolution reverb as a medium for artistic messaging. In terms of my broader practice, I was excited to not only produce work in a collaborative manner but also create a tool for artists to use.

 

The result is a two-part final. The first is a piece titled “The Impulse and the Response”. The second is a browser-based tool to be given to artists so they can create more pieces of a similar nature.

“The Impulse and the Response”

This piece consists of two interactive sound pieces to be played in the browser alongside explanatory text. Two artists, Édgar J. Ulloa and Daniela Benitez, provided audio performances. A slider is available to change the amount of reverb that affects the audio that is playing. The reverb is a re-creation of the sonic qualities of the inside of the Statue of Liberty. In collaboration, my prompt to the artists was, “If you could do a sound performance inside of the Statue of Liberty, what would it be?”

Screenshot of “The Impulse and the Response”

Traditionally, convolution reverb is used to easily recreate the reverbs of sonically pleasing spaces. Even experimentally, non-traditional use of convolution reverb focuses on innovative sound design results. I was interested in its potential for political or social commentary. Both of my collaborators describe themselves as affected by the United States’ immigration policy, social attitudes towards immigration policy, and the general climate of xenophobia. They wished to make pieces addressing these issues.

Giving them the opportunity to virtually inhabit this symbolic space not only opens the potential to critique the space and its multi-dimensional symbolism, but uses the act of recreating and virtualizing the space to mirror the physical, psychological, and political state of flux that many immigrants face.

Challenges

The artistic metaphor relies completely on a technical understanding of how a convolution reverb works. However, most official definitions of convolution reverb err on the technical side and as a result are not always helpful to the average person. With this in mind, I decided that a brief but thorough artist’s statement explaining convolution reverb was essential. I will link to the piece here so that you can read about convolution reverb and its relevance to the piece in detail.

The Impulse and the Response

First approaches

My initial idea was about the general potential of convolution reverb as an artistic message. I had thought less about recreating spaces and more about putting audio “through” another sound. Since convolution reverb works by loading other sounds, called impulse responses, to create reverberations, you can put any sound into it for unexpected results. The sound of the Arctic shelf cracking and falling into the ocean could be used as an impulse response in a sound piece about global warming, for example.
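Since the piece leans so heavily on how convolution actually works, here is a minimal, conceptual sketch of the underlying math in plain JavaScript. This is illustration only; p5’s Convolver does the real work inside the Web Audio engine.

```javascript
// Conceptual sketch of convolution: the output is the dry signal
// convolved with the impulse response. Each input sample triggers a
// scaled copy of the impulse response, and the copies sum together.
function convolve(signal, impulse) {
  const out = new Array(signal.length + impulse.length - 1).fill(0);
  for (let i = 0; i < signal.length; i++) {
    for (let j = 0; j < impulse.length; j++) {
      out[i + j] += signal[i] * impulse[j];
    }
  }
  return out;
}

// A single click (1) followed by silence simply "plays back" the
// impulse response, which is why a recorded clap captures a room:
convolve([1, 0, 0], [0.5, 0.25]); // → [0.5, 0.25, 0, 0]
```

This is why recording a loud pop inside the Statue of Liberty lets you put any other audio “inside” that space.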

My first presentation on the idea revolved around police shootings. Loud pops and bangs are used to create traditional impulse responses, and on the professional level may be done with starter pistols. I had mocked up an interface for a piece that would use audio from the shooting of Keith Lamont Scott as the impulse response for a convolution reverb. The user would be prompted to read news articles about the shooting.

The general approach was to use convolution reverb to achieve a kind of empathy that may not be available in other forms of media. Having your voice brought into this “space” created by the reverb may more easily prompt you to think about yourself in that space. After discussing the general concept with Allison Parrish, it was apparent that there was the potential for very problematic usage of this approach: not only the appropriation involved in procuring the media, but the general feeling of a serious and fatal issue being boiled down into a “sound toy” that you play with.

Allison’s advice to hold the technology and the message in separate spaces while meditating on potential concepts was invaluable, and led me to the current incarnation of the project. I feel that what I have is a much more positive approach that still carries a strong social message and is empowering to artists. This revolves around the second part of my project.

Convolution Reverb Online

At the bottom of the piece is a link where you can create similar pieces of your own. The broader vision for this is that I can provide this link to Édgar, Daniela, and other potential collaborators so that they can create their own versions of this kind of piece without having to rely on sound engineers or programmers.

Here is a screenshot of an initial prototype:

In trying to achieve a user-friendly experience, I learned much about JavaScript, p5, and browser capabilities in general. This has been very enlightening, and I am very glad I could work on practical JavaScript usage in this project. Some decisions, while not practical, offered good practice, like trying to program as much of the UI as I could in JavaScript instead of pre-made HTML. Other learning points were great research: iOS does not allow JavaScript access to the microphone. Not just Safari, but iOS as a system. And I wound up writing some code I imagine myself coming back to over and over, like file uploads that work through system menus as well as drag and drop.

Screenshot of Convolution Reverb Online tool

Source Code for Convolution Reverb Online

The Experience

While there are things to revisit on this project in terms of design, layout, and code cleanliness, I am really happy with the results. It has been a while since I tried to use solely JavaScript for interactive work, and being able to immerse myself in p5 has been a great re-introduction to what the browser is capable of. More so, the ethos of p5 was fully felt: I was able to program something novel and expressive without the syntax and technical aspects getting in the way. My classmates’ wonderful feedback and Allison Parrish’s guidance were so important and appreciated. I’m proud of the end result, and am motivated to continue exploring programming as an artistic skill. It also seems that this project will have a life after this class, which I am very excited about developing further.

ICM Final Ideas

My concept for the ICM final project is an installation piece that uses the noises of important, controversial, political, and/or emotional issues to create reverberations (echoes). A user would be prompted to read about these issues into a microphone, and then listen via headphones to their own words, put through the reverberations of these issues.

Interface mockups

final_mockup

final_mockup_screen2

Initial proof of concept code

http://alpha.editor.p5js.org/projects/SygbdEbZg

Goals

Use charged or confrontational news stories in an attempt to create a different kind of empathy, using emotionally charged impulse responses in a convolution reverb system. The user will be affected, aurally, by the topic. Being forced to hear one’s self through the effect of the topic may perhaps create a different kind of connection. However, a sense of distance is still maintained. The user speaks with their own voice, accompanied by the echoes of something serious.

“Political tools”?

Is a reverb supposed to be accurate? Is a tool not supposed to have an agenda? Is there a place for an opinionated tool?

Sound as data, data as agenda.

I am interested in data auralization, as opposed to data visualization, as a method of conveying data. While there are more utilitarian and straightforward aspects of this technique, I see no reason not to make a statement with the sound data and present it in a manner that unapologetically conveys a message.

“Hard to talk”

Depending on the impulse response in question, it could be difficult to hear or read the story given. However, these are difficult stories to hear or read without any audio effects at all.

 

Issues

Education

The conceptual core of appreciating the work is as much about communicating technically what a convolution reverb is as it is about the metaphor of using it. Educating the user about convolution reverb before the artistic confrontation is necessary, but presents its own UX challenge.

Copyright

I am not entirely knowledgeable on the specific details of how sampled/remixed/referenced media is ethically and legally implemented in media art on a professional level. Time needs to be allocated towards getting permission to use works and researching publicly licensed media.

Topics

Are there topics that may be inappropriate for use? Would certain topics allow for easier media collection? Are there topics that may lessen the potential for harmful artistic appropriation? (Climate change, for example, could be less appropriative when focused on animals or objects.)

 

I had some great feedback after presenting in class, and will definitely try to refine the messaging and specific UX flow over the next few days in order to get started.

ICM Homework 8: Loading Media

This week’s homework was about loading and manipulating media. There were bonus points for not doing a video mirror or a sound board, so I decided I was going to avoid those.

To completely remove the temptation, I wanted to load video clips instead of relying on webcam footage. Using the techniques Allison outlined in class, I wanted to address each pixel in these loaded videos to either display them or use their information to inform functionality.

I spent a lot, a LOT of time figuring out what videos would play nice with p5.js. Many, many trips back and forth with Windows Movie Maker: cutting the length, shrinking the size, reducing the bitrate. I had a feeling that this might have been an issue with the online editor specifically, but in any case I was ready to deal with that limitation. After getting the video to play normally, I was ready to start analyzing and drawing each frame.

crash_after_a_while

I was getting some inconsistent crashes with errors like these: “Exiting potential infinite loop at line 36”. It seemed to take issue with the for loops that were going through the frames, but its behavior was making me wonder if p5 was simply stalling out when faced with too large of a task. And now that I type this out, I realize that I haven’t had these errors on my desktop PC, only my laptop.

Being able to analyze the contents of the pixels and make decisions accordingly was of great interest. In playing around with what was possible, I wound up making somewhat of a ‘bonus’ sketch just to see if I could:

http://alpha.editor.p5js.org/projects/rktNzvLge

It isn’t pretty, it isn’t smooth, but it is technically a functional green screen.
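At its core, a green screen is just a per-pixel decision. Here is a hypothetical, simplified version of that test in plain JavaScript, operating on a flat RGBA array laid out like p5’s pixels[] (4 values per pixel); the function name and threshold are my own illustration, not the sketch’s actual code:

```javascript
// A pixel counts as "green screen" if its green channel clearly
// dominates both red and blue; keyed pixels get the background's color.
function chromaKey(pixels, background, threshold) {
  const out = pixels.slice();
  for (let i = 0; i < pixels.length; i += 4) {
    const r = pixels[i], g = pixels[i + 1], b = pixels[i + 2];
    if (g - Math.max(r, b) > threshold) {
      // replace the keyed pixel's color with the background's pixel
      out[i] = background[i];
      out[i + 1] = background[i + 1];
      out[i + 2] = background[i + 2];
    }
  }
  return out;
}
```

In a real sketch you would run this over video.pixels each frame, which is exactly where those “potential infinite loop” warnings start to bite.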

With my meme break out of the way, I used my newfound pixel hunting abilities towards a more serious piece.

 

I have some NASA footage of a spaceship-level view of the earth. It needed to be downsampled a great amount in order to find its way into p5, but it wound up being OK for my purposes. This sketch loads the video, then displays it as a lower-res field of dots. If you move your mouse, you can adjust the shape. As all sketches start with the mouse at 0,0, it can look like nothing at first and then dramatically stretch into view upon interaction.

Pixels in the video are analyzed and placed as circles. If the program detects that a given pixel is above a certain threshold of whiteness (in this case, the clouds), that circle is drawn larger than the ones that are not white. Then there is a “scan line”: an area defined in the sketch that looks for these larger white dots. If a larger white dot is detected in this area, it is highlighted with a green circle, and a droning note is played with a pitch that depends on its location on the X axis.
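The checks described above can be sketched as small plain-JavaScript helpers (names and thresholds are illustrative, not the sketch’s actual code):

```javascript
// Is a pixel "white enough"? Average the r, g, b channels of a flat
// RGBA array (4 values per pixel) as a rough brightness measure.
function isWhite(pixels, i, threshold) {
  const brightness = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
  return brightness > threshold;
}

// Does a dot fall inside the scan line's x-range?
function inScanLine(x, scanStart, scanEnd) {
  return x >= scanStart && x < scanEnd;
}

// Mapping a dot's x position to a playback rate, roughly like p5's
// map(): x in 0..width scales linearly into lowRate..highRate.
function pitchForX(x, width, lowRate, highRate) {
  return lowRate + (x / width) * (highRate - lowRate);
}
```

The real sketch would then call rate() with that value before play(), which is where the timing bug below comes in.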

http://alpha.editor.p5js.org/projects/rJ1peULee

There are still some bugs. As I moved development to my desktop, upon returning to the laptop it seems I cannot run the sketches. This is why I have added videos of the sketches running, just in case they cannot be seen on the presentation laptop. Also, when running smoothly, the NASA video sketch doesn’t always play the notes at the correct pitches. It appears that if you call rate() and play() too rapidly in a row, sometimes the rate() command is ignored and the sound file is played at its original speed.

However, despite these setbacks, I am happy with the conceptual end result. I am interested in different ways to communicate data, and the idea of data “sonicalization” as opposed to visualization seems like an intriguing pursuit. Something like this could measure weather patterns, the health of a forest, or air quality conditions. The stylized output, visually and aurally, might create new opportunities to motivate people to digest these findings.

ICM Homework 7: Loading data, JSON, APIs

Our assignment for this week was to work with external data: loading JSON files with data sets and using them to visualize the information within. Getting familiar with using APIs lets you pull JSON files from popular websites and web services.

The example videos from Shiffman seemed to make sense, but following along wasn’t necessarily smooth. It seems like the New York Times API documentation and services work in a slightly different way than when he used them, which meant that certain code didn’t quite work.

Instead of providing a solid example URL, you are given the following code:
var url = "https://api.nytimes.com/svc/search/v2/articlesearch.json";

url += '?' + $.param({
  'api-key': "a74472e8eca541b2b2577c690b887abe",
  'begin_date': "20161025",
  'end_date': "20161025"
});

$.ajax({
  url: url,
  method: 'GET',
}).done(function(result) {
  console.log(result);
}).fail(function(err) {
  throw err;
});

Using the $ didn’t seem to play nice with p5, so I needed to piece the puzzle together and write the proper URL “by hand” to see if I could get a proper JSON response. Fun fact: it seems like queries that are malformed in certain ways don’t just refuse to work. When I was looking for article headlines by date, my malformed query would respond with today’s headlines. Of course, I used today’s date as my initial test to see if I could get information from the API, so it wasn’t until later, when I was playing around with requests for different days, that I noticed nothing I put in was working.
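For anyone hitting the same wall, hand-building the query boils down to URL-encoding each key/value pair and joining them with & after a ?. A rough sketch (with a placeholder API key):

```javascript
// Building the query string "by hand", without jQuery's $.param().
// The parameter names match the NYT article search API; the key is
// a placeholder.
function buildQuery(base, params) {
  const pairs = [];
  for (const key in params) {
    pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
  }
  return base + '?' + pairs.join('&');
}

buildQuery('https://api.nytimes.com/svc/search/v2/articlesearch.json', {
  'api-key': 'YOUR-KEY-HERE',
  'begin_date': '20161025',
  'end_date': '20161025'
});
```

The resulting URL can be passed straight to p5’s loadJSON().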

I was a little distracted by that because I had spent a good chunk of time trying to figure out the spacing of my project. And because of all the GIFs. What GIFs, you say?

https://alpha.editor.p5js.org/projects/rykND901x

My idea was to use not one, but two APIs. The user can enter a date to get a headline from the New York Times on that day, and then each word is run through the Giphy API. Each search returns a GIF that is placed on the DOM, creating a chain of GIFs “translated” from English. Perhaps this technique can help revitalize the print media industry.

api_hmwk_screenshot

The main issue was the timing of the different API calls. There is still a bug where the GIFs aren’t necessarily placed in proper order. Clicking the button again can remedy this, and the bug doesn’t always show up. It seems that simply the time required to start a new API call for each word and load the GIF messes with the sequential nature of the loading. I spent lots and lots of time trying to figure out how to avoid this and make sure that the images would always be in the right order. Still haven’t managed to figure it out, but in general the proof of concept is functional enough and fun to play around with.
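One common fix for this ordering bug, sketched here under my own naming, is to reserve a result slot per word up front and write each response into its slot by index when it arrives, rather than appending in arrival order:

```javascript
// Keep out-of-order responses in order: results[i] is reserved for
// words[i], no matter when its callback fires. fetchGif is a stand-in
// for the real Giphy call.
function collectInOrder(words, fetchGif, done) {
  const results = new Array(words.length);
  let remaining = words.length;
  words.forEach(function (word, i) {
    fetchGif(word, function (gifUrl) {
      results[i] = gifUrl;              // slot by index, not arrival order
      if (--remaining === 0) done(results);
    });
  });
}
```

With Promises this is exactly what Promise.all gives you for free, but the callback version maps more directly onto the loadJSON-style callbacks p5 uses.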

ICM Homework 6: DOM interfacing and manipulation

We’re breaking out of canvas land and into the DOM! Our homework this week involved using the HTML DOM with our p5 programs, getting and sending values on a webpage (outside of the usual canvas element we’ve been using so far).

I’m still having fun exploring the p5 sound library, and DOM control was a great way to logically add control and feedback to a music based program. I’m really impressed with p5’s stretching of sound, and decided to make a Vaporwave themed music player.

http://alpha.editor.p5js.org/projects/rkk0bvS1e

(Embedding this one was a bit fussy. Maybe because of all the DOM trickery? Worth asking about in class.)

Drag a sound file that is on your computer onto the indicated area and the file will load. When the file is successfully loaded, press the play button.

Certain aspects of layout are just so much easier in HTML; it was a great reminder to get back into it. Though there were some issues with centering some of my DOM elements, the advantages of really working the DOM are apparent for larger-scale and multi-page p5 programs. I was able to implement drag and drop for file loading, change the color of elements based on the amplitude of the music, and create sliders and buttons quickly and easily. Even fun stuff I hadn’t thought of was available: depending on the amplitude of the sound, the title of the sound file being displayed will switch from lowercase to uppercase (the CSS style ‘text-transform: uppercase’, for the curious).

Knowing that I can CSS style everything is even that much better, keeping in mind future edits and changes that might occur in a program. In this case, I used some Google Fonts to spruce up the Vaporwavey feel of the design. And all of this isn’t at the expense of the canvas sketch itself, which shows a waveform of the sounds currently playing. This gets embedded in the background thanks to a z-index property. In the requisite pink and blue, of course.

I started out with more ambitious musical goals involving the p5 sound library, which were thwarted at nearly every turn. But my little Vapor-Player (…um…working title…) has captured my heart, and I will be revisiting it as I get more adept at sound manipulation in the browser.

The only question I have left is… what happened to shrug emoji’s right arm?!!?

shrug_emoji_hurt

Shrug emoji!! What did I do? Oh GOD I’m a monster!!!!

Editors note: When asked for comment about the incident shrug emoji offered no statement, but did shrug.

ICM Homework 5: Objects, constructors…and Synthesis!

We are starting to get towards the middle of the semester, and have added more foundational building blocks to the programming repertoire. This time around, it is working with objects and constructors. Getting used to “the Javascript way” has been a bit of a challenge, and I’m glad I get to practice the implementation even if I’m already familiar with the concepts.

This was also an opportunity to use p5’s sound library, which I have been very curious about. Getting my objects and constructors all set allowed me a lot of flexibility in terms of creation and playing around with ideas. I wanted bouncing balls that played sounds when they hit the ground, and the object oriented approach let me easily play around with their properties and behaviors, including the sonic ones.

In this sketch, you click to create a ball. The ball will fall and bounce, making a noise. Press keyboard buttons 1 through 4 to change the kind of ball you want to create.

http://alpha.editor.p5js.org/projects/SkcYoLKR

 

Lots of fun! I love sound interaction that is playful and a little unorthodox. I’m really impressed with p5’s implementation of it, and can’t wait to dig in more.

There were some conceptual questions I had to ask myself. Right now, the balls all more or less behave the same. They could easily be one soundBall class and then have their “type” passed to them to determine behavior. But I had the feeling that I might want to make their behavior drastically different from one another in the future, so I decided for now to make them all separate kinds of objects.

This made my displaying and updating routines in the main sketch a little repetitious. I was wondering if an elegant solution might be an array of arrays, with each “second level” array holding all of one kind of ball. But then I was wondering if it is bad form to have arrays with mixed types of variables inside.

There were also some quirks in regards to getting the sound going, which might be out of the scope of the class for now but which I will definitely want to pursue later. All in all, though, it was a fun time that whet my appetite for making more advanced objects and really working the sound library.

Synthesis

And then there was Synthesis! I’ve gone over this in my PComp blog, but will repost here as well.

This past week we had our “Synthesis” session: the combining of the things we have learned in Physical Computing and Computational Media. In short: getting our P5 sketches to talk to our Arduino. Fun stuff!


I was partnered with Chris Hall, who had an awesomely trippy sketch that we thought would be fun to control physically. Her original program triggered animation on clicking, and we managed to translate that to an Arduino button press.

img_20161007_123117

We were pretty happy with ourselves, pressing away and showing off our work:

We are blowing minds on the floor!

Tom Igoe saw our work, complimented us… and promptly told us to add more. His advice to use p5’s split() function was key information, and saved us a lot of time in figuring out how to implement two inputs. Even then, there were some issues in formatting the serial communication coming out of the Arduino. At first I had done some string concatenation on the Arduino side in order to send out comma-separated values, but it was getting turned into gibberish. Using individual Serial.print() commands, then ending everything with a Serial.println(), seemed to do the trick. We were now able to press the button to start the animation, and then use a potentiometer to change the hue of the colors in the sketch.
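On the p5 side, parsing that comma-separated serial line is where split() earns its keep. A plain-JavaScript sketch of the idea, with String.split standing in for p5’s split() and my own illustrative names:

```javascript
// Parse a serial line like "1,512\r\n" (button state, pot reading)
// into its two values. trim() strips the \r\n that Serial.println()
// appends on the Arduino side.
function parseSensors(line) {
  const parts = line.trim().split(',');
  return {
    button: Number(parts[0]),
    pot: Number(parts[1])
  };
}

parseSensors('1,512\r\n'); // → { button: 1, pot: 512 }
```

The same pattern extends to any number of sensors: print each value with Serial.print(), separate them with commas, and end the line with Serial.println().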

img_20161007_142857

 

ICM Homework 4: Re-working Homework 2

For our fourth homework assignment, we were asked to re-work an old homework assignment by adding functions, objects, and a function that returns the result of a mathematical operation. I decided to also implement the ‘bonus’ goal of using a function inside of an object.

It was really soothing to go back and re-organize everything in this sketch, almost like a spring cleaning. Adding a function for the lines in the middle allowed me to play around with the amount, and making the circles into objects allowed me to tweak all their behavior as I needed without the endless copying and pasting. And revisiting everything allowed me to create an oscillating fading effect instead of having abrupt jumping.

http://alpha.editor.p5js.org/projects/rJw1Igja

I decided I wanted to work with adding functions into an object, since I haven’t had much in-depth experience with Javascript specifically and was curious about how this was going to work.

Speaking of this…

thisthisthisthisthis
So many this-es! Theeses? Thisii? I had to use ‘this’ a lot, which felt strange coming from other programming languages. From example code in the book, it seems like these are necessary, and that ‘this’ perhaps functions as a kind of object-specific local variable. Still, it seemed odd, and redundant, to have to specify ‘this’ all the time. Then I realized I was using it when I might not necessarily have had to. I don’t want to jump ahead in class, but would like to talk about the best ways to use ‘this’, and maybe some ways we can avoid having to type it out so much.
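For context, here is a stripped-down constructor in the style of the sound balls, showing why ‘this’ keeps coming up (a sketch, not the project’s actual code):

```javascript
// 'this' is required because the properties live on each new instance;
// a bare 'x' inside the constructor would be a local variable that
// vanishes when the call returns.
function Ball(x, y) {
  this.x = x;
  this.y = y;
  this.vy = 0;
  this.update = function () {
    this.vy += 0.5;   // gravity
    this.y += this.vy;
  };
}

const b = new Ball(10, 0);
b.update();
// b.y is now 0.5, and every other Ball keeps its own independent state
```

This per-instance bookkeeping is exactly what makes it easy to spawn a new ball on every click.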

 

Looking forward to moving onward into JavaScripting, and to the synthesis class on Friday to start taking these skills into the Arduino realm (and vice versa).

ICM Homework 3: Cat and Mouse Teamwork

For our week three homework, we were asked to create an algorithmic design with simple parameters that could be controlled with a rollover, button, or slider made from scratch. I was paired with Lindsey, and having code experience, I decided that I would first try to code a slider from scratch, from memory. I’m used to having slider code readily available from previous works and code examples, so I thought it might be good to flex those long-dormant brain muscles.

Creating a slider means doing hit detection inside of the “knob” of the slider, and then adjusting the position of the knob while “dragging” it. Is my mouse clicking? Is my mouse inside of the box? If so, move the knob to the position of the mouse. Restrain its movement to the X axis, and prevent it from going outside the bounds of the slider.

sliderclickdetection_quirks

Not so fast!

 

No matter what your level of coding skill, you can always forget something. Which, in turn, the computer never forgets to remind you of. The ultimate game of “Simon Says”. We take our lovely sliders for granted, don’t we? They feel nice and intuitive. But if you only code the logic to say, “…when clicked and inside…”, then the moment you move the mouse outside of the slider knob, it stops moving. This makes you appreciate how fast you move your mouse during average use, and how little your mouse is literally “on” the knob, as opposed to briefly clicking it and then taking off in a direction.

The solution is to explicitly set the state of the knob, assigning a “clicked” or “currently dragging” status. Then the logic will let you move the knob with the mouse, even if the mouse is technically outside of the knob’s graphical boundaries.
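A minimal sketch of that state-based fix in plain JavaScript (the names are illustrative, not my actual sketch code):

```javascript
// Hit-test only on press, then keep dragging from the stored state
// even when the mouse leaves the knob, clamping to the track's x-range.
const slider = { x: 50, min: 0, max: 100, dragging: false };

function onPress(hitKnob) {
  if (hitKnob) slider.dragging = true;   // remember the grab
}

function onMove(mx) {
  if (!slider.dragging) return;
  // follow the mouse on the X axis, constrained to the slider's bounds
  slider.x = Math.min(slider.max, Math.max(slider.min, mx));
}

function onRelease() {
  slider.dragging = false;               // drop the knob
}
```

In a p5 sketch, onPress/onMove/onRelease would map onto mousePressed(), draw() (or mouseDragged()), and mouseReleased().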

After doing this, I then decided to take a look at the example UI code for the slider. It seems that code is all based on the mouseClicked() and mouseReleased() functions, as opposed to mine, which is based on the boolean mouseIsPressed p5.js variable. Need to remember to ask in class about the pros and cons of each approach.

 

Meanwhile, Lindsey was beefing up her skills with functions and for loops. She wanted to stick with the cat and mouse theme she has been working on, so after she made a cat and a bunch of mice, I tied everything together with the slider code and polished things up a bit.

We had a concept, and plugged the slider code in to execute:

https://alpha.editor.p5js.org/projects/Bkb-Lb9p

 

We weren’t quite sure if this counted as truly algorithmic or not. I really wanted to randomly assign the colors of each mouse, but given the limits of the assignment and where we are in the learning material, I was scratching my head trying to find an elegant solution.

I wound up adapting some of the GOTO10 code that had been posted by Dan Shiffman as inspiration. This gave us some fun animation and randomized mouse colors, but perhaps some other trade offs.

https://alpha.editor.p5js.org/projects/HyymhZqT

 

The code wound up being a bit hacky in the end; what Lindsey had made, my first attempts to modify it, implementing logic from the GOTO10 approach, further attempts to modify… but eventually we wound up achieving our vision (more or less).

It is a fun approach, but with certain limitations for sure. I have some questions about how the GOTO10 style can be refined, which I’ll be posting to the GitHub group.

Lindsey was a great teammate, and really grew her skills over the course of the week. It was an honor to be part of the ever-expanding, illustrious Cat and Mouse series.

ICM Homework 2: Circles, Lines, and unexpected mouse positions

Our second assignment for ICM was to include the following three aspects into a p5.js sketch:

  • One element controlled by the mouse.
  • One element that changes over time, independently of the mouse.
  • One element that is different every time you run the sketch.

I wanted to stick with simple shapes and color schemes in order to easily highlight what was interactive and what wasn’t. A minimalist at heart, I decided to go with circles and lines.

The mouse controls the size of the lines and circles, and the color of the circles. The mouse Y position controls the color of the circles (the “inner” circles have the opposite color of the bigger “outer” circle). The X position of the mouse changes the size of the circles; they are paired diagonally opposite in order to avoid the circles overlapping as they grow.

The X position also determines the size of the crossed lines. This was my one allowed “cheat” to use something new: abs(). This returns the absolute value of a number: any negative number turns positive, while any positive number stays positive. With a little bit of tricky math, the result is that the closer the mouse is to the horizontal center of the screen, the larger the lines; the closer it is to the edge of the sketch, the smaller they get.
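The “tricky math” can be sketched like this (illustrative names, not my actual sketch code):

```javascript
// Size peaks at the horizontal center and falls off toward either edge:
// abs() makes the distance from center symmetric on both sides.
function lineSize(mouseX, width, maxSize) {
  const distFromCenter = Math.abs(mouseX - width / 2);
  // at the center, dist is 0 (full size); at an edge, width/2 (size 0)
  return maxSize * (1 - distFromCenter / (width / 2));
}

lineSize(50, 100, 80);  // center → 80
lineSize(0, 100, 80);   // edge   → 0
```

Without abs(), the left and right halves of the canvas would need separate cases.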

The stroke of the circles changes over time.

The background is a different level of darkness each time the sketch is run. There is a rect with half alpha that sits on top of the entire sketch, that chooses a different color each time the sketch is run. Since everything below is gray scale, this produces a new monochromatic color scheme each time.

A fun error with mouse position:

homework-2-error
Gah! Circles, how did you get so big?

I noticed that p5.js was keeping track of the mouse position outside of the canvas itself. This was most dramatic when I was scaling the size of my browser window as the sketch was running, causing the image above. Good to note this behavior, as in the future I’ll have to remember to build in safeguards against unexpected mouse position values (when we get into for loops).

Here is the sketch:

And here is the link:

https://alpha.editor.p5js.org/projects/HkTF0Khh

You can replicate the error described above. Simply hovering your mouse over the sketch keeps all variables within the expected bounds. However, while hovering over the sketch and clicking your mouse, if you hold down the mouse button and go anywhere else on your screen outside the sketch’s bounds, p5.js will still keep track of the mouse position. Good to know!

ICM Homework Post 1; first p5.js sketch, class goals and inspiration

For our first assignment in ICM, we were tasked with making a screen drawing using the basic drawing functions of p5.js. I decided to use the 2D primitives and basic color functions to create my own take on a Moroccan tile pattern (or something like it). These are pretty geometric, so to incorporate the curve() drawing function I decided to “sign” my name at the bottom with my first initial. This also used a different color mode that includes alpha information, giving a kind of “watermark” effect to label my authorship visually.

I have experience with coding in general, and have Processing and Javascript experience. However, despite playing around with Processing for a while, I only ever really “finished” one real project. I haven’t done much serious coding in Javascript in a long time, either. I didn’t have any major pitfalls in creating my homework piece, and was able to figure out everything I needed to do. A great part of this was the web editor, which was very responsive when the “Play” button was left on while I coded. I appreciate the “live coding” environment for visual programs, as it helped me (kind of) wrap my head around some of the trickier shape functions like arc() and curve(). It was smooth sailing for my first assignment, so I didn’t post any issues to github.

 

But it was good to shake off the cobwebs. It seems like no matter how long I have coded, picking up a language I haven’t used in a while requires me to get reacquainted with it. JavaScript seems to have the habit of doing things for you and filling in the gaps on its own instead of stopping everything. This can be great for getting things to “just work”, but can make for some unexpected behaviors.

 

For example, I was coding along, throwing up shapes on the screen. I was looking at the different functions I was asked to incorporate into my assignment, and decided I should change the stroke weight of the next shape. When I did that, all the other shapes changed their stroke width! Why? They already had strokes?

It turns out that setting the stroke color, which gives you a default stroke width of 1, is not the same as explicitly setting the stroke width to 1. On top of that, because draw() runs in a loop and style settings seem to persist from one frame to the next, a style call made only once in your program ends up applying to all the 2D primitive shapes, even the ones drawn “before” it. Simple example here:

[Screenshot: quirky_js_formatting_order]
Notice that all shapes have the same stroke weight, even the ones “before” the stroke weight call.
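The actual sketch is in p5.js, but the gist can be modeled in plain JavaScript: one global “stroke weight” variable read by every shape, redrawn in a loop the way draw() is. On the first pass the early shape gets the default, but the weight set at the end of one pass carries into the start of the next. (This is my loose mental model of the behavior, not p5’s actual internals.)

```javascript
// A toy model of p5's global style state. `weight` stands in for the
// current stroke weight; drawFrame() stands in for draw().
let weight = 1; // the implicit default you get from just setting stroke color
const frames = [];

function drawFrame() {
  const shapes = [];
  shapes.push({ name: "circleA", weight }); // drawn "before" the call
  weight = 5;                               // like calling strokeWeight(5)
  shapes.push({ name: "circleB", weight }); // drawn "after" the call
  frames.push(shapes);
}

drawFrame(); // frame 1: circleA still has the default weight of 1
drawFrame(); // frame 2: the 5 carried over, so now circleA has it too

console.log(frames[0][0].weight); // 1
console.log(frames[1][0].weight); // 5
```

Since the screen is redrawn every frame, the fix is the same as in my sketch: set the style explicitly at the top so every frame starts from a known state.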

I’ve dealt with quirky behaviors like this before, so once I saw it I knew I probably needed to explicitly set that color function at the top of my code to return things to how they looked before. Still, it was confusing for a moment, and a reminder of how unexpected behaviors can blindside you once you are gaining momentum on a program of a certain length. Sometimes the error isn’t in the last few lines you just wrote, but in the first few lines you started with (or didn’t write, in my case).

Here is the code for my sketch, for anyone who wants to look:

https://alpha.editor.p5js.org/projects/rJb0SDQn

And after a bit of playing around in WordPress I figured out how to embed p5.js sketches into a blog from the alpha editor. Without further ado, my homework:

[Embedded p5.js sketch]

In terms of applying computation to my interests, I have a pretty broad faith in technology’s usefulness for artistic expression. I welcome any opportunity to grow my coding skills in the service of creating art of any kind: visual, audio, textual, interactive, physical, etc. As for solid goals this semester, I would like to develop interfaces in p5.js that could be used for live music performance, specifically something that leverages Chrome’s MIDI capabilities. I also have a specific idea for a text/sculpture piece referencing Italo Calvino’s “If on a Winter’s Night a Traveler”. I’m not sure how in-depth this blog post should get, but I can elaborate on the specifics of that idea later if desired.

I also want to be able to teach this kind of creative coding. I’m hoping to learn about that process by taking this class, and to be a resource to my fellow classmates so I can familiarize myself with the different ways people come to learn code.

In terms of the ICM Inspiration Wiki page, I’m always happy to be reminded of Oblique Strategies! The game linked there didn’t seem to function (a server error of some kind), but Oblique Strategies can be found in many different forms and is a great way to get the mental juices flowing. Jer Thorp’s work making a Processing program that helped arrange the names on the 9/11 memorial has always struck me as moving; creative coding doesn’t need to be flashy to be emotionally effective. Ryoji Ikeda’s superposition strikes me as great because it brings the concept of “operators” into a choreographed, performative context. Not quite “the man behind the curtain”, but not an up-front performer at the center of attention, either.

 

My addition to the page was Anna Anthropy, her interactive work Dys4ia, and her book Rise of the Videogame Zinesters. Anna is an indie game developer, but some of her work straddles the divide between game and “interactive narrative” (if there is a distinction to be made between the two). Anna is transgender, and Dys4ia is an autobiographical depiction of her transition process. Rise of the Videogame Zinesters is a kind of manifesto for DIY game development, an endorsement of accessible development tools, and a call for more “personalized” game-making that empowers everyday people. I feel like there are a lot of parallels between this ethos and the intent behind libraries, frameworks, and projects like Arduino, Processing, and p5.js: making it easier for everyone to make digital art. Rise of the Videogame Zinesters is a quick read and a kick in the pants. If it doesn’t make you want to drop the excuses and start pushing pixels around for art’s sake, nothing will.