Shells, Psychophysics, Convolution, and Musique Concrète (WDIL #1)
It's the start of my last quarter of junior year, and I thought I'd start a journal explaining the things I've learned on a weekly basis. My high school chemistry and physics teacher had us write exit entries at the end of each week answering the question: "what did I learn?" Since I'm pursuing an individualized studies major where I'm essentially in charge of my own education, I wanted to bring these weekly entries into my college education.
This week was syllabus week, but I still learned some interesting things. This quarter I'm taking an introductory C/C++ class, a class on hearing sciences, and a class on digital sound processing.
What Questions Do I have this Quarter?
Some questions I hope to answer from these classes include: How do we create audio filters that simulate sound in different spaces? How can I use C/C++, SuperCollider, and spatial filters to make headphones that track a user's movement and adjust output sounds based on that movement? How do humans perceive pitch, and what is the science behind the phenomenon of perfect pitch? What is the science behind "tone deafness?"
What Did I Learn this Week?
Introduction to C/C++:
This week in my computer science class we started learning about shells and the Linux operating system. A shell is essentially a program that provides a command-line interface: we can use a shell to get the computer to "do things." We type commands into the shell, and the shell passes them to the operating system to perform. So far we have learned about working with files: creating directories, moving files between directories, deleting files, listing the files in a directory, etc.
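As a small sketch, the file operations above map onto standard Unix commands like these (the file and directory names are just made-up examples):

```shell
mkdir demo                  # create a directory
echo "hello" > demo/notes.txt   # create a file inside it
mkdir demo/archive          # create a subdirectory
mv demo/notes.txt demo/archive/ # move the file into the subdirectory
ls demo/archive             # list files in the subdirectory
rm demo/archive/notes.txt   # delete the file
rmdir demo/archive demo     # remove the (now empty) directories
```

Each command is a separate program the shell runs on our behalf, which is what makes the shell a "program that runs programs" rather than doing the file work itself.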
Hearing Sciences: Most of what we went over in class this week was review of acoustics, but we were introduced to a new term, "psychoacoustics," which is a sub-area of psychophysics. Acoustics is an area of physics that focuses on the properties of sound. Some examples include the frequency of a sine tone, the amplitude of a signal, how a signal is affected by a space or environment, etc. Psychoacoustics, however, involves subjectivity: it is the study of how living beings perceive sound. Some properties studied in psychoacoustics are pitch and loudness (their objective partner terms being frequency and amplitude). The basic method of psychoacoustics is to change a sound and measure the resulting change in perception.
Another important topic we brushed up on is time-frequency duality and convolution. In the natural world, a sound (an input signal) passes through a system (described by a transfer function or impulse response) and is altered by that system to create a new sound (an output signal). One example is human speech. Let's say you're singing the vowel "aahhh." The input signal would be the vibrations coming from your vocal folds, the system would be your vocal tract (the shape of your oral cavity that creates a specific vowel sound), and the output would be the convolution of those two signals (see graphs in picture).
In the human speech case, the frequency spectrum of the vocal-fold vibration is multiplied by the frequency response of the vocal tract, yielding a vowel sound. This is where time-frequency duality comes in: multiplication in the frequency domain is equivalent to convolution in the time domain. One signal is treated as the impulse response (whose spectrum is the transfer function) and the other as the input to that system.
Digital Sound Processing: In my digital sound class we briefly went over Musique Concrète, a genre of music invented by French composer Pierre Schaeffer. The basis of Musique Concrète is taking real-world sounds we wouldn't normally think of as "musical" (e.g. a train, hitting a casserole dish, etc.) and altering/manipulating those sounds so that they become more abstract, then composing music from them. In the early days of Musique Concrète, pieces in this genre were composed and synthesized using physical tape and tape machines: cutting and splicing the tape with razor blades, changing tape speed to adjust pitch, etc. Nowadays a computer can be used to create these pieces.