Understanding Sound Waves: The Basics of Audio Engineering

Sound is an integral part of our lives, shaping everything from the music we listen to and the conversations we have to the environments we inhabit. At the core of all these auditory experiences are sound waves: complex phenomena that, once understood, can help you make better decisions as an audio engineer.
This blog aims to demystify sound waves, explaining how they travel, how they change over time and distance, and the fundamental concepts involved in measuring sound in the real world.
What are Sound Waves?
Sound waves are mechanical vibrations that travel through a medium. Typically, we think of sound in the air, but it also travels well through liquids and solids. These waves are created when any object vibrates, causing the surrounding molecules to bounce back and forth. This oscillation generates pressure changes that spread outward through the material.
Sound waves are often compared to ripples in a pond, which radiate outward from the disturbance at the center. However, unlike ripples on a pond's flat surface, sound waves travel outward in all three dimensions.
Sound is a mechanical transfer of energy, which means it comes from the energy of movement. When we speak, we use our muscles to push air past our vibrating vocal cords, producing sound. This sound travels through the air and eventually vibrates the listener's eardrum.
At the end of the journey, this vibration is translated into electrical nerve impulses to be decoded by the listener’s brain. If you replace the eardrum with a microphone, we can create an alternating-voltage electrical signal called audio, which will be discussed in a future blog.
The Physics of Sound Waves
How Sound Travels
When sound is produced, the vibrating object creates compressions and rarefactions in the surrounding medium. Compressions are areas where particles are closely packed together, while rarefactions are areas where particles are spread apart. As the sound wave moves through the medium, these areas of compression and rarefaction travel outward from the source.
Think of how a guitar string will vibrate when picked. The string will first slam into air particles nearby, forcing them to bump into the next molecules in line. All this commotion causes the nearby air to be compressed into a high-pressure zone. After the string reaches as far as it can in one direction, it will swing backward and pull the air back with it, creating an opposing low-pressure zone. If the string is under enough tension, it will swing back and forth for several seconds, sending a continuous series of alternating high and low-pressure waves across the room.
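To make that back-and-forth concrete, here is a minimal Python sketch that generates one such train of alternating pressure values as a pure sine tone and writes it to a WAV file (the 440 Hz frequency, one-second duration, and file name are arbitrary choices for illustration):

```python
import math
import struct
import wave

SAMPLE_RATE = 44_100  # samples per second
FREQUENCY = 440.0     # cycles per second (the A above middle C)
DURATION = 1.0        # seconds

# Each sample is the instantaneous "pressure" of the wave:
# positive values correspond to compressions, negative to rarefactions.
samples = (
    math.sin(2 * math.pi * FREQUENCY * n / SAMPLE_RATE)
    for n in range(int(SAMPLE_RATE * DURATION))
)

with wave.open("tone.wav", "wb") as wav:
    wav.setnchannels(1)  # mono
    wav.setsampwidth(2)  # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```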
The speed of sound depends on several factors, including the medium and its temperature. For instance, sound travels faster in water than in air and even faster in solids. At room temperature, sound travels at about 1,125 feet per second (343 meters per second) in air, while in water it reaches roughly 4,856 feet per second (1,480 meters per second).
Since there are 1,000 milliseconds in one second, a handy rule of thumb is that sound travels about 1 foot per millisecond. To be more precise, you could say 1.1 ft/ms (1,125 ft/s ÷ 1,000), but that's harder to remember!
This number becomes valuable to audio engineers who must precisely align (or re-align) sound waves for a listener. Often, large music shows will have multiple locations for their speaker systems, each at a unique distance from the audience. To make them work well together, these systems will be time-delayed to reinforce the original wavefront traveling from the stage to the back of the venue.
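As a rough sketch of that alignment math, using the 1.125 ft/ms figure from above (the 150-foot delay-tower distance is a made-up example):

```python
SPEED_OF_SOUND_FT_PER_MS = 1.125  # ~1,125 ft/s at room temperature

def delay_ms(extra_distance_ft: float) -> float:
    """Delay needed so a nearer speaker stays in step with the
    wavefront still traveling from the stage."""
    return extra_distance_ft / SPEED_OF_SOUND_FT_PER_MS

# A delay tower 150 ft in front of the main PA must be held back
# by roughly the time sound takes to cover those 150 ft:
print(f"{delay_ms(150):.1f} ms")  # ~133.3 ms
```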
Wave Properties
Sound waves possess several fundamental properties that define their characteristics. Let’s look at a few of the important terms:
Frequency
This refers to the number of cycles a sound wave completes in one second, measured in hertz (Hz). The frequency determines the pitch of the sound; higher frequencies correspond to higher pitches, while lower frequencies yield lower pitches. The audible range for humans typically spans about 20 Hz to 20,000 Hz (20 kHz).
Frequency is a critical aspect of sound waves that influences our pitch perception. Audio engineers often work with different frequencies to achieve desired sounds in music and other audio recordings. Listeners perceive low frequencies at the bottom of the mix, nearer the floor, while high frequencies naturally lift towards the ceiling. Most people perceive 1kHz as roughly at eye level.
1) Sub-bass (20-60 Hz)
These low rumbling sounds are often felt rather than heard. They are prominent in genres like electronic and hip-hop music, since typical acoustic instruments produce very little sub-bass.
2) Bass (60-250 Hz)
Fundamental for the rhythm and feel of music, providing a foundation for harmony.
3) Midrange (250-5,000 Hz)
Contains most vocal and instrumental elements, where clarity is essential.
4) Treble (5,000-20,000 Hz)
High-frequency sounds add brightness and clarity to music but can also cause ear fatigue if overemphasized.
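As a small illustration, the bands above are easy to encode; this sketch uses exactly the boundaries listed here, though other sources draw them slightly differently:

```python
# Frequency bands as described above (boundaries in Hz).
BANDS = [
    (20, 60, "Sub-bass"),
    (60, 250, "Bass"),
    (250, 5_000, "Midrange"),
    (5_000, 20_000, "Treble"),
]

def band_of(frequency_hz: float) -> str:
    """Name the band a given frequency falls into."""
    for low, high, name in BANDS:
        if low <= frequency_hz < high:
            return name
    return "outside the typical audible range"

print(band_of(80))     # Bass
print(band_of(1_000))  # Midrange
```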
Amplitude
This measures the maximum extent of a vibration or oscillation, indicating how loud a sound is. Higher amplitudes correspond to louder sounds, while lower amplitudes produce softer sounds. Amplitude is usually measured in decibels (dB), a logarithmic scale that quantifies sound intensity.
Understanding amplitude is crucial for audio engineers to balance sounds in a mix. Louder sounds tend to be perceived as closer to the listener, while quieter sounds fall away into the background.
1) Threshold of Hearing (0 dB)
The faintest sound detectable by the human ear.
2) Moderate Sounds (60-70 dB)
Normal conversation levels.
3) Loud Sounds (85 dB and above)
Prolonged exposure can lead to hearing damage.
4) Pain Threshold (120 dB)
The level at which sound becomes physically painful.
Since the scale is logarithmic, dB changes can seem unusual at first. A useful rule of thumb is that a 6 dB change corresponds to doubling (or halving) the sound pressure. So doubling the pressure of a 100 dB concert would bring it to 106 dB rather than 200 dB.
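That rule of thumb falls straight out of the decibel formula for sound pressure, 20 × log10(ratio), as this sketch shows:

```python
import math

def db_change(pressure_ratio: float) -> float:
    """Decibel change for a given sound-pressure ratio."""
    return 20 * math.log10(pressure_ratio)

print(f"{db_change(2):+.1f} dB")       # doubling the pressure: +6.0 dB
print(f"{db_change(0.5):+.1f} dB")     # halving the pressure:  -6.0 dB
print(f"{100 + db_change(2):.1f} dB")  # the 100 dB concert, doubled: ~106 dB
```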
Wavelength
This is the distance between successive crests (or troughs) of a wave. Wavelength is inversely related to frequency (wavelength = speed of sound ÷ frequency): higher frequencies have shorter wavelengths, while lower frequencies have longer ones. This becomes very important when designing studio acoustics or even building instruments to specific sizes so they resonate in tune with the music.
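Here is a quick sketch of that relationship, assuming 343 m/s in air:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at room temperature

def wavelength_m(frequency_hz: float) -> float:
    """Wavelength in meters of a given frequency in air."""
    return SPEED_OF_SOUND_M_PER_S / frequency_hz

# The audible extremes differ by three orders of magnitude:
print(f"{wavelength_m(20):.2f} m")             # 20 Hz: ~17.15 m
print(f"{wavelength_m(20_000) * 100:.2f} cm")  # 20 kHz: ~1.72 cm
```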
Phase
This describes the position of a point in time on a waveform cycle, measured in degrees. Understanding phase is crucial in audio engineering, especially when dealing with multiple sound sources or microphones.
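To see why phase matters when combining sources, here is a minimal sketch that sums two identical sine waves, first in phase and then 180 degrees apart (the 100 Hz frequency and sample rate are arbitrary):

```python
import math

SAMPLE_RATE = 48_000
FREQUENCY = 100.0  # Hz

def sine(phase_deg: float, n: int) -> float:
    """Sample n of a sine wave with the given phase offset."""
    angle = 2 * math.pi * FREQUENCY * n / SAMPLE_RATE
    return math.sin(angle + math.radians(phase_deg))

for n in range(3):
    reinforced = sine(0, n) + sine(0, n)   # in phase: doubled amplitude
    cancelled = sine(0, n) + sine(180, n)  # 180° apart: cancels to ~0
    print(f"in phase: {reinforced:+.4f}   opposed: {cancelled:+.4f}")
```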
How Sound Waves Change Over Time and Distance
As sound waves travel, they undergo various changes that can affect their quality and perception. Several factors contribute to how sound waves alter over time and distance, and audio engineers use these changes as part of the mixing process, enabling them to create the illusion of depth and space in a song.
Attenuation
One of the primary factors affecting the propagation is attenuation, which refers to the reduction in amplitude (volume) as sound travels. Most attenuation is the result of either wasted energy (spread across large areas) or absorbed energy.
This happens partly because the sound spreads out in every direction rather than being focused on any particular listener, so the same energy covers an ever-larger area as the wave expands. Low frequencies are omnidirectional (they travel in every direction equally), while higher frequencies tend to be more directional. Volume also drops over distance because energy is lost as the wave pushes itself through the air; the air absorbs high frequencies far more readily than low ones, so they fade first.
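The geometric spreading portion follows the inverse-square law, which works out to about a 6 dB drop per doubling of distance; this sketch assumes an idealized point source in free space (real rooms behave differently):

```python
import math

def spreading_loss_db(distance_ratio: float) -> float:
    """Level drop from geometric spreading alone (inverse-square law)."""
    return 20 * math.log10(distance_ratio)

# Level at various distances, relative to 100 dB measured at 1 meter:
for meters in (1, 2, 4, 8):
    print(f"{meters} m: {100 - spreading_loss_db(meters):.1f} dB")
# 1 m: 100.0 dB, 2 m: 94.0 dB, 4 m: 88.0 dB, 8 m: 81.9 dB
```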
Diffusion
When sound waves encounter obstacles, they can scatter in different directions. This scattering can lead to a loss of energy and intensity.
Refraction and Reflection
Sound waves can also change direction due to refraction and reflection. Refraction occurs when sound waves travel through media of different densities, causing them to change speed and direction. For example, sound travels faster in warmer air than in cooler air, which can cause the waves to bend.
Reflection happens when sound waves bounce off surfaces. An echo occurs when the sound waves reflect off a distant surface and return to us. When the reflection is nearby, it creates a “First Reflection” echo, which helps your brain echo-locate in space. In fact, if you walk into a new room blindfolded, these early reflections will give you a surprisingly accurate interpretation of your environment. Understanding these phenomena is crucial for audio engineers, especially when designing spaces for optimal sound quality.
Doppler Effect
Another fascinating phenomenon related to sound waves is the Doppler effect. This occurs when there is a relative motion between a sound source and an observer. If the sound source moves towards the observer, the frequency appears higher (resulting in a higher pitch), and if it moves away, the frequency appears lower (resulting in a lower pitch). This effect is commonly experienced in everyday life, such as when an ambulance passes by with its sirens blaring.
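Here is a sketch of the standard Doppler formula for a source moving relative to a stationary listener (the 700 Hz siren and 25 m/s vehicle speed are made-up example values):

```python
SPEED_OF_SOUND = 343.0  # m/s in air

def doppler_hz(source_hz: float, source_speed: float, approaching: bool) -> float:
    """Perceived frequency of a moving source heard by a stationary listener."""
    relative = -source_speed if approaching else source_speed
    return source_hz * SPEED_OF_SOUND / (SPEED_OF_SOUND + relative)

# A 700 Hz siren on a vehicle moving at 25 m/s (~56 mph):
print(f"approaching: {doppler_hz(700, 25, True):.0f} Hz")   # ~755 Hz
print(f"receding:    {doppler_hz(700, 25, False):.0f} Hz")  # ~652 Hz
```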
Practical Applications in Audio Engineering
Understanding sound waves is vital for audio engineers, as it directly impacts their work. Here are a few practical applications:
Microphone Placement
Knowledge of sound wave propagation helps engineers determine optimal microphone placement to capture the desired sound while minimizing unwanted noise.
Room Acoustics
Understanding reflection, absorption, and diffusion aids in designing recording studios or performance spaces that enhance sound quality.
Equalization
By adjusting frequencies, engineers can shape sounds to achieve a balanced mix, highlighting some aspects while suppressing others.
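As one illustration of what "adjusting frequencies" can mean under the hood, this sketch computes coefficients for a standard peaking-EQ biquad filter using the widely used Audio EQ Cookbook formulas; the 1 kHz center, +6 dB gain, and Q of 1.0 are arbitrary example settings:

```python
import math

def peaking_eq_coeffs(fs: float, f0: float, gain_db: float, q: float):
    """Biquad coefficients (b, a) for a peaking EQ, per the Audio EQ Cookbook."""
    a_lin = 10 ** (gain_db / 40)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * a_lin, -2 * math.cos(w0), 1 - alpha * a_lin]
    a = [1 + alpha / a_lin, -2 * math.cos(w0), 1 - alpha / a_lin]
    # Normalize so a[0] == 1, the usual convention for biquads.
    return [x / a[0] for x in b], [x / a[0] for x in a]

# A +6 dB boost centered at 1 kHz with Q = 1.0, at a 48 kHz sample rate:
b, a = peaking_eq_coeffs(48_000, 1_000, 6.0, 1.0)
print("b:", [round(x, 5) for x in b])
print("a:", [round(x, 5) for x in a])
```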
Sound Design
Manipulating sound waves in post-production can create unique audio experiences, from film soundtracks to video game effects.
Conclusion
Understanding sound waves is fundamental to audio engineering and enriches our appreciation of sound in all its forms. By grasping concepts like frequency, amplitude, and how sound changes over time and distance, we can better appreciate the complexities behind the sounds we experience daily.
Whether you’re an aspiring audio engineer or simply a curious listener, this knowledge lays the groundwork for deeper exploration into the fascinating world of sound.
If you are interested in a career in the music industry, Dark Horse Institute's programs in Composition and Songwriting, Audio Engineering, and Music Business are a great way to take things to the next level!