Many of you will probably know this already, but just to be sure we’re on the same page, let’s lay down some terminology. A periodic signal f(t) is a signal that repeats itself after a certain amount of time. The length of one repetition, p, is called the period of the signal, so that f(t) = f(t + p). The frequency of the signal is 1/p, so if p is in units of seconds, the frequency is in Hz.
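To make that concrete, here’s a tiny sketch (the 50 Hz figure is just an example I picked) showing that a sine with period p really does satisfy f(t) = f(t + p):

```python
import math

# A periodic signal: a 50 Hz sine wave (example values, not from any system).
p = 1 / 50          # period in seconds
f = 1 / p           # frequency in Hz

def signal(t):
    return math.sin(2 * math.pi * t / p)

# f(t) == f(t + p), up to floating-point error:
t = 0.0123
assert abs(signal(t) - signal(t + p)) < 1e-9
```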

Since the signal repeats, we can think of it as going around in a circle. The phase of the signal at any given time is how far it has gone around the circle. We can describe phase the same way we describe circles, using degrees: if we start at 0 degrees, then 360 degrees is one full revolution, 90 degrees is a quarter, and so on. If two signals are going around the circle at the same speed (and therefore have the same period), they can still be offset from each other, meaning they have different phase. This gap or offset is the relative phase between the two signals.
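Here’s one way to actually measure that offset, as a sketch: project each signal onto a sine and cosine at the known frequency (a basic I/Q demodulation trick), read off each signal’s angle, and subtract. The two test signals below are made up, with s2 a quarter revolution ahead of s1:

```python
import math

p = 1.0          # shared period (arbitrary units)
N = 1000         # samples taken over exactly one period
dt = p / N

def s1(t):
    return math.sin(2 * math.pi * t / p)

def s2(t):
    # Same period, but a quarter revolution (90 degrees) ahead.
    return math.sin(2 * math.pi * t / p + math.pi / 2)

def est_phase_deg(sig):
    # Project onto cos/sin of the known frequency, then take the angle.
    i = sum(sig(k * dt) * math.cos(2 * math.pi * k * dt / p) for k in range(N))
    q = sum(sig(k * dt) * math.sin(2 * math.pi * k * dt / p) for k in range(N))
    return math.degrees(math.atan2(i, q))

relative_phase = (est_phase_deg(s2) - est_phase_deg(s1)) % 360
print(relative_phase)   # about 90 degrees
```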

Relative phase is the key to drone navigation. It’s how drones know where they are relative to a set of fixed “beacons”, through a process related to triangulation (more precisely, multilateration from arrival-time differences).
Let’s talk about small drones flying indoors, without a human operator. There are systems now that let a drone “know” the layout of a room and navigate it with coordinates, like a very small-scale GPS. (My favorite one is here.)

These small systems don’t need satellites; they rely on smaller “beacons” that communicate with each other and with the drone. The drone sends out a high-frequency (MHz) signal, which the beacons receive at different times. By measuring the difference between when each beacon receives the signal, we can accurately calculate where the signal came from relative to each beacon. Since we also know where the beacons are positioned, we can then calculate the source of the signal, i.e. the position of the drone.
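The arrival-time-difference idea can be sketched in a few lines. Everything here is invented for the demo — the beacon positions, the drone position, and the use of the speed of sound instead of radio (it just keeps the numbers readable) — and the brute-force grid search stands in for the proper solver a real system would use:

```python
import math

# Hypothetical beacon layout (meters) and a made-up drone position.
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
c = 343.0            # m/s; speed of sound, for readable numbers

true_pos = (3.0, 4.0)
dists = [math.dist(true_pos, b) for b in beacons]
# What the system actually measures: arrival-time differences vs. beacon 0.
tdoas = [(d - dists[0]) / c for d in dists]

def locate(tdoas, step=0.05):
    # Grid-search the room for the point whose predicted time differences
    # best match the measured ones (a sketch, not a production solver).
    best, best_err = None, float("inf")
    for i in range(int(10 / step) + 1):
        for j in range(int(10 / step) + 1):
            x, y = i * step, j * step
            ds = [math.dist((x, y), b) for b in beacons]
            err = sum((((d - ds[0]) / c) - t) ** 2
                      for d, t in zip(ds, tdoas))
            if err < best_err:
                best, best_err = (x, y), err
    return best

print(locate(tdoas))   # close to (3.0, 4.0)
```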

These local positioning systems use a high-frequency signal so that the period is short enough to position the drone accurately just by counting how many periods have passed between when each beacon receives the signal. But what if the signal were a lower frequency and the periods longer, so long that maybe only part of a period has passed? You could do the same thing, but instead you would be counting fractions of a period.
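Counting fractions of a period is just a unit conversion from phase back to time. The frequency and phase offset below are made up for illustration:

```python
# Sketch: converting a measured phase offset back into a time delay when the
# signal's period is longer than the delay (all values invented for the demo).
f = 500.0                    # tone frequency, Hz
p = 1 / f                    # period: 2 ms
phase_deg = 72.0             # measured phase offset between two receivers

delay = (phase_deg / 360) * p
print(delay)                 # 0.2 of a period, i.e. about 0.4 ms

# Caveat: phase wraps every 360 degrees, so this pins down the delay only
# modulo one period -- which is fine when less than a period has passed.
```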

That’s exactly how your brain figures out where sound is coming from, with ears instead of beacons and a signal between 20 Hz and 20 kHz. Your brain is constantly doing these phase comparisons by instinct, in real time. Phantom center, sound stage, separation: these are all products of phase. The tricky part is that the beacons only listen for a signal at one known, unchanging frequency, which is not particularly musical. Your ears and brain have to make these phase comparisons across an incredibly wide frequency band.
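A rough sketch of the comparison your brain makes implicitly: estimate the interaural time difference (ITD) as the lag that best aligns the two ear signals. The sample rate, delay, and noise source here are all made-up demo values:

```python
import random

# Two "ear" signals: the right ear hears the same noise a few samples later.
random.seed(0)
rate = 48000                         # samples per second (demo value)
delay_samples = 12                   # true delay: 12/48000 = 0.25 ms
n = 2048

left = [random.uniform(-1.0, 1.0) for _ in range(n)]
right = [0.0] * delay_samples + left[:n - delay_samples]

def best_lag(a, b, max_lag=40):
    # Cross-correlate over a small window of candidate lags and keep the best.
    def corr(lag):
        return sum(a[k] * b[k + lag] for k in range(n - max_lag))
    return max(range(max_lag + 1), key=corr)

itd_seconds = best_lag(left, right) / rate
print(itd_seconds)                   # 12 / 48000 = 0.00025 s
```

Broadband noise makes the correlation peak unambiguous, which echoes the point above: wideband signals carry much better phase cues than a single tone.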

In a live setting, when you have actual sounds coming at you from a car passing on the street, a dog barking outside the window, or a subway mariachi band ruining your commute, you can easily pick out where each sound is coming from. When you take those sounds and play them back on speakers, that positional information can be lost. This happens when your playback system introduces its own phase deviation: the relative phase between your input signal and your output sound, measured across the passband of your speakers. If there is too much phase deviation in your system, it becomes harder and harder to hear phase cues accurately. That’s why eliminating phase deviation is so important in monitors. If the phase between speakers and between frequencies is not correct, you’ve lost information. Accurate phase is necessary for truly accurate sound.
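To see why phase deviation smears cues, here’s a sketch using a one-pole low-pass filter as a stand-in for any speaker or crossover (the 1 kHz corner frequency is an arbitrary choice). Its phase response, the textbook result -atan(f/fc), shifts different frequencies by different amounts, and that frequency-dependent shift is exactly the kind of deviation described above:

```python
import math

fc = 1000.0   # hypothetical filter corner frequency, Hz

def phase_shift_deg(f):
    # Analytic phase response of a one-pole low-pass filter.
    return -math.degrees(math.atan(f / fc))

for f in (100.0, 1000.0, 10000.0):
    print(f, round(phase_shift_deg(f), 1))
# 100 Hz is shifted about -5.7 degrees, 1 kHz by -45, 10 kHz by about -84.3:
# cues at different frequencies no longer line up in time.
```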

How do we get our monitors to have accurate phase? Well…
