
There has been a great amount of progress in artificial intelligence (A.I.) over the last ten years, and it’s beginning to enter the realm of music in several ways. As we get more deeply involved in new technologies and new ways to create, it’s a good idea to think about the possibilities A.I. can offer, the differences between human technique and an artificial one, and where the limits might lie.

It’s no secret that the limits of A.I. are yet to be discovered. However, one of the most debated ideas is that artificial intelligence could go beyond human capabilities, and in a sense this may already be true.

A.I. Plays What You Can’t

When it comes to music, practice can get you very far, but there are some limitations of the human body that we as musicians simply cannot overcome. Of course, the only way to learn about these “impossible” feats was to test them with A.I.

We now know that there are passages a machine can play that our hands simply cannot. Two things can be taken from this proof. First, complicated or hard-to-play music is not necessarily good music; second, the way a musician plays is directly connected to the way a musician creates.

That connection becomes very blurred when it comes to A.I. There have been many attempts to replicate the sort of magic musicians use when they make good music, and there hasn’t been any luck: while the result is technically music, it’s not easy at all on the ears.

So can A.I. today still break some boundaries and help musicians explore new ground in music? Yes.

A.I. Lacks Something

For the time being, A.I. can’t compose hit songs, but it can teach us something about potential musical compositions.

Claire Evans, of the band YACHT, talked in an interview about how the band used A.I. in the making of their record. They had a machine learn their songs, which resulted in strange variations that were hard to play but contained interesting ideas. Evans said:

“AI forced us to come up against patterns that have no relationship to comfort. It gave us the skills to break out of our own habits.”

Live Coding


Another interesting new way to approach music is programming, which has recently taken the form of live coding.

Live coding is a type of performance art in which the performer creates music by programming and reprogramming a synthesizer as the composition plays. The synthesizer code is typically projected onto walls or screens for the audience to inspect as they listen to the unfolding sound. Live coding events are sometimes known as algoraves, and at these events it’s common to see visualizations of the evolving music projected alongside the code. Often, these visualizations are created by a second performer manipulating graphics software in tandem with the live coder.

Erica Snyder

It’s a way to make music without playing instruments, or even without using digital instruments or samples: through code, the music and the graphics come to be, and that is fascinating.
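As a toy illustration of the idea, here is a plain Python sketch (not a real live-coding environment; the pattern functions and MIDI note numbers are invented for the example). A musical pattern can be modeled as a function from beat number to a note, and “live coding” amounts to swapping that function while the loop keeps running:

```python
def minor_arp(beat):
    """A minor arpeggio (MIDI notes A3, C4, E4), cycling every 3 beats."""
    return [57, 60, 64][beat % 3]

def syncopated(beat):
    """A 'reprogrammed' pattern: a rest (None) on every fourth beat."""
    return None if beat % 4 == 3 else [57, 60, 64, 67][beat % 4]

def render(pattern, beats):
    """Evaluate a pattern function over a number of beats."""
    return [pattern(b) for b in range(beats)]

print(render(minor_arp, 6))   # [57, 60, 64, 57, 60, 64]
pattern = syncopated          # the performer swaps in new code mid-set
print(render(pattern, 8))     # [57, 60, 64, None, 57, 60, 64, None]
```

In a real live-coding system the running loop would also schedule the notes to a synthesizer in real time; the essential move, redefining the pattern while playback continues, is the same.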

Renick Bell’s performance was part of Algorithmic Art Assembly, a recent two-day festival in San Francisco dedicated to algorithmic music and art. The afternoons were filled with talks and demonstrations; the nights were filled with music.

Michael Calore

Some of the talks were heavy on mathematics and computer science—music code on the screen is one thing, but Euclidean formulas are something else—but all of them were informative. Adam Florin, creator of the algorithmic audio plug-in Patter, traced the history of generative music from the middle ages, through John Cage and Iannis Xenakis in the mid-20th century, up to the software-dominated present. Musician Jules Litman-Cleper outlined the parallels between the patterns we see in nature and the patterns exhibited by computer systems. Producer Mark Fell, who along with artists like Oval released some pioneering algorithmic dance music in the 1990s, was brought on stage for a Q&A session.

Michael Calore

Embracing A.I. and computers means understanding how connected they really are to what makes us human: in this case, art and music.


Who says that lows are not important, or not heard? In music there are three frequency ranges that have to be balanced for a composition to be heard the best way possible: highs, mids, and lows, or treble, mids, and bass. To achieve this balance, every instrument has to play its part. In electronic music, where levels are set directly, this balance has to be a priority; in other genres the force applied determines the volume, and in others the number of instruments playing does.
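To make the idea of balancing bands concrete, here is a small, self-contained Python sketch (the frequencies, amplitudes, and the `goertzel_power` helper are illustrative choices, not a standard mixing tool). It synthesizes a toy mix of a bass, a mid, and a treble tone and measures how much energy each band contributes, which is the kind of per-band check that balancing a mix is about:

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Power of one frequency component, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * freq / sample_rate)          # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2**2 + s_prev**2 - coeff * s_prev * s_prev2

rate = 8000
t = [i / rate for i in range(8000)]            # one second of audio
# A toy "mix": loud bass (80 Hz), quieter mids (800 Hz), quiet treble (3 kHz)
mix = [1.0  * math.sin(2 * math.pi * 80 * x)
     + 0.5  * math.sin(2 * math.pi * 800 * x)
     + 0.25 * math.sin(2 * math.pi * 3000 * x) for x in t]

for name, f in [("bass", 80), ("mids", 800), ("treble", 3000)]:
    print(name, goertzel_power(mix, rate, f))  # bass > mids > treble here
```

In a real mix you would measure whole bands rather than single tones, but the principle is the same: compare the energy in lows, mids, and highs and adjust the instruments until they balance.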

In other words, every instrument has a job, and each instrument is essential once every job is assigned. You may have heard someone say, “The bass is not that important, it’s just there,” but no matter how simple the bass may be on some occasions, it’s always essential. Of course, jazz and funk give the bass lots of freedom and space to do flashy things, but it’s not always like this.

The Bass is Rhythm and Harmony


In most musical compositions the bass lays the foundation of two essential parts of music: rhythm and harmony. In other words, the bass sits between the percussion and the melodic instruments in terms of its role in the piece.

A lot of music out there is well received because of its rhythm, and this is not achieved by percussion alone, mostly because percussion tends toward high-frequency sounds. A well-composed bass line not only completes the rhythm but also works as a bridge from the percussion to the other instruments.

In terms of harmony, the interesting thing is that listeners sometimes aren’t even sure what part the bass is playing in a specific song: they hear a beautiful harmony, and the ear tricks them into believing the strings or wind instruments are responsible, when in fact the bass is allowing those harmonies to happen with its low-frequency notes.

The bass plays a powerful role in how we hear harmonies. When we hear several notes played at the same time, we hear them all relative to the lowest sounding pitch — the bass note.
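A quick sketch of that point (the note names and semitone values here are my own illustrative mapping, not from the quoted source): the very same upper voices spell different chords depending on which bass note they sit above, because each interval is heard relative to the lowest pitch:

```python
# Pitch classes in semitones above C (illustrative subset).
NOTE = {"C": 0, "E": 4, "G": 7, "A": 9}

def intervals_above_bass(bass, upper):
    """Semitone distance of each upper voice from the bass note."""
    return sorted((NOTE[n] - NOTE[bass]) % 12 for n in upper)

# Same upper voices, different bass, different harmony:
print(intervals_above_bass("C", ["E", "G"]))       # [4, 7]     -> major triad
print(intervals_above_bass("A", ["C", "E", "G"]))  # [3, 7, 10] -> minor seventh
```

Over a C bass, E and G form a plain C major triad; put an A underneath the same notes and the ear now hears an A minor seventh chord. The bass note reframes everything above it.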

The Science Behind It

According to a study published in PNAS, there is a scientific reason why the bass is so important in music. The researchers report the following:

Previous work using electroencephalography (EEG) demonstrated that the auditory cortex encodes pitch more robustly in the higher of two simultaneous tones or melodies, and modeling work indicated that this high-voice superiority for pitch originates in the sensory periphery. Here, we investigated the neural basis of carrying rhythmic timing information in lower-pitched voices. We presented simultaneous high-pitched and low-pitched tones in an isochronous stream and occasionally presented either the higher or the lower tone 50 ms earlier than expected, while leaving the other tone at the expected time. EEG recordings revealed that mismatch negativity responses were larger for timing deviants of the lower tones, indicating better timing encoding for lower-pitched compared with higher-pitch tones at the level of auditory cortex. A behavioral motor task revealed that tapping synchronization was more influenced by the lower-pitched stream. Results from a biologically plausible model of the auditory periphery suggest that nonlinear cochlear dynamics contribute to the observed effect. The low-voice superiority effect for encoding timing explains the widespread musical practice of carrying rhythm in bass-ranged instruments and complements previously established high-voice superiority effects for pitch and melody.

Michael J. Hove, Céline Marie, Ian C. Bruce, and Laurel J. Trainor

In other words, our brain makes sense of music and finds order in it far more easily thanks to the bass and the lower tones, which also aligns with the role of the bass in music.


I once attended an ASTA (American String Teachers Association) convention and went to a workshop on improvisation. You could cut the tension with a knife. The attendees, mostly classical string teachers, seemed by and large to hold the false belief that improvisation means having the guts to screw up in front of others!

Let’s look first at the surprising benefits of improvisation, and then look at what improvisation actually is. I think you’ll agree that by my definitions, teachers as well as students — at all levels — can easily learn and enjoy by doing what I call improvisation.

Perhaps the nicest benefit of improvisation is that it turns off your inner critic. Musicians who are constantly monitoring their playing for errors, and stopping when a mistake is made, are basically training themselves to be obsessively fearful of mistakes, rather than actually playing music. By playing straight through a passage of music, even a short and manageable part such as a phrase, a learner can focus on the continuity of music, and still train themselves to keep mental notes about what’s going well, and what needs improvement. Being saddled with too much inner critique is like breaking up your music with static on the radio.

Brain studies show that when the part of the brain that handles improvisation is turned on, the part of the brain involved in self-critique is turned off (see the article This is your brain on improvisation—and why your creativity depends on it). This indicates that the mere effort to improvise makes you less inhibited and negative. It also suggests that anyone deeply wrapped up in a live performance can go all in and benefit from turning off their inner critic while they perform.

And what exactly is improvisation? What is performance? Many imagine improvisation to involve making up notes, and performance to require an audience…    [···]
