Identifying Flute Notes Using CircuitPython, FFT, and a Budget Microphone

Image of the project in action

The Raspberry Pi Pico never ceases to amaze! I love pushing CircuitPython to its limits. In this article, we use CircuitPython, the Fast Fourier Transform (FFT), and a budget-friendly microphone breakout board to recognize flute notes and display them on an OLED screen.


Video of the project in action.

For the full source code, please visit: https://github.com/code2k13/pipico_flute_notes

What You’ll Need

  • A microcontroller that supports CircuitPython (e.g., Raspberry Pi Pico)
  • An electret microphone breakout board
  • An SSD1306 OLED display
  • A push button
  • Jumper wires

How It Works

This project captures sound from a microphone, processes it using FFT to extract the dominant frequency, and maps the frequency to a musical note. The identified note is displayed on an OLED screen.

Setting Up the Hardware

  1. Connect the microphone’s analog output to an analog input pin on the microcontroller (e.g., A1).
  2. Connect the OLED display to the I2C pins (GP18 for SDA, GP19 for SCL).
  3. Connect a push button to GP21, using a pull-down resistor configuration.

Writing the Code

Initializing Components

The script initializes the OLED display, the microphone, and the push button:

import board
import busio as io
import adafruit_ssd1306
import analogio
import digitalio
import ulab
import gc
import time

# OLED on I2C (SCL = GP19, SDA = GP18)
i2c = io.I2C(board.GP19, board.GP18)
oled = adafruit_ssd1306.SSD1306_I2C(128, 64, i2c)

# Push button on GP21, pulled down (reads True when pressed)
button = digitalio.DigitalInOut(board.GP21)
button.direction = digitalio.Direction.INPUT
button.pull = digitalio.Pull.DOWN

# Microphone on analog input A1
mic = analogio.AnalogIn(board.A1)

Defining the Frequency Table

To identify notes, we define a lookup table with approximate frequency values:

frequency_table = [
    {'center_freq': 384, 'symbol': 'Sa'},
    {'center_freq': 433, 'symbol': 'Re'},
    {'center_freq': 482, 'symbol': 'Ga'},
    {'center_freq': 531, 'symbol': 'Ma'},
    {'center_freq': 580, 'symbol': 'Pa'},
    {'center_freq': 650, 'symbol': 'Dha'},
    {'center_freq': 713, 'symbol': 'Ni'},
    {'center_freq': 769, 'symbol': 'SA'},
]
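To get a feel for how wide the ±5% tolerance used by the matching function later in this article is, the small desktop-side sketch below (plain Python, not meant to run on the Pico) prints each note's accepted window. One thing worth noticing: some adjacent windows overlap (for example, Ga's upper bound of 506.1 Hz lies above Ma's lower bound of 504.45 Hz), so the order of the table matters — the first entry that contains the frequency wins.

```python
# Desktop-side sketch: compute the ±5% match window for each note.
frequency_table = [
    {'center_freq': 384, 'symbol': 'Sa'},
    {'center_freq': 433, 'symbol': 'Re'},
    {'center_freq': 482, 'symbol': 'Ga'},
    {'center_freq': 531, 'symbol': 'Ma'},
    {'center_freq': 580, 'symbol': 'Pa'},
    {'center_freq': 650, 'symbol': 'Dha'},
    {'center_freq': 713, 'symbol': 'Ni'},
    {'center_freq': 769, 'symbol': 'SA'},
]

tolerance = 0.05
bands = {}
for entry in frequency_table:
    lo = entry['center_freq'] * (1 - tolerance)
    hi = entry['center_freq'] * (1 + tolerance)
    bands[entry['symbol']] = (lo, hi)
    print(f"{entry['symbol']:>3}: {lo:.1f} Hz - {hi:.1f} Hz")
```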

Processing the Microphone Input

The function get_condensed_fft() collects audio samples and applies FFT to extract the dominant frequency:

def get_condensed_fft():
    n_samples = 1024
    samples = []
    # Keep every 14th ADC reading to lower the effective sampling rate
    for i in range(n_samples * 14):
        if i % 14 == 0:
            # Convert the raw 16-bit reading to volts, centre it around 0 V and scale it up
            samples.append(10 * ((mic.value * 3.3 / 65536) - 1.65))

    data_array = ulab.numpy.array(samples, dtype=ulab.numpy.float)
    fft_result = ulab.numpy.fft.fft(data_array)
    magnitude = ulab.numpy.abs(fft_result)  # Simplified magnitude calculation
    magnitude[0] = 0.0  # Remove the DC component
    gc.collect()
    # The spectrum of a real signal is symmetric, so keep only the first half
    return magnitude[:len(magnitude) // 2].tolist()

Finding the Dominant Frequency

We locate the peak frequency in the FFT result:

def get_highest_frequency(fft_data, sampling_rate=7182):
    # fft_data holds half the spectrum, so each bin spans
    # sampling_rate / (2 * len(fft_data)) Hz
    bin_size = sampling_rate / len(fft_data) / 2
    max_amplitude_index = fft_data.index(max(fft_data))
    highest_frequency = max_amplitude_index * bin_size
    return highest_frequency
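As a sanity check of the bin-size arithmetic, the sketch below (desktop Python with NumPy, standing in for CircuitPython/ulab) feeds a synthetic 482 Hz tone — 'Ga' in our table — sampled at the assumed 7,182 Hz effective rate through the same peak-finding logic. With 512 half-spectrum bins, each bin spans roughly 7 Hz, so the recovered frequency should land within one bin of the true tone:

```python
import numpy as np

SAMPLING_RATE = 7182  # effective sampling rate assumed in this article
N_SAMPLES = 1024

def get_highest_frequency(fft_data, sampling_rate=SAMPLING_RATE):
    # Same arithmetic as the on-device version: fft_data is half the
    # spectrum, so each bin spans sampling_rate / (2 * len(fft_data)) Hz.
    bin_size = sampling_rate / len(fft_data) / 2
    max_amplitude_index = fft_data.index(max(fft_data))
    return max_amplitude_index * bin_size

# Synthesize a pure 482 Hz tone
t = np.arange(N_SAMPLES) / SAMPLING_RATE
tone = np.sin(2 * np.pi * 482 * t)

# Mimic get_condensed_fft(): magnitude spectrum, DC removed, first half only
magnitude = np.abs(np.fft.fft(tone))
magnitude[0] = 0.0
half = magnitude[:N_SAMPLES // 2].tolist()

peak = get_highest_frequency(half)
print(f"Recovered peak: {peak:.1f} Hz")  # within one bin (~7 Hz) of 482 Hz
```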

Matching the Frequency to a Note

This function checks which note the detected frequency corresponds to:

def find_symbol_for_frequency(highest_frequency, tolerance=0.05):
    # Return the first note whose ±5% window contains the frequency
    for entry in frequency_table:
        center_freq = entry['center_freq']
        symbol = entry['symbol']
        if center_freq * (1 - tolerance) <= highest_frequency <= center_freq * (1 + tolerance):
            return symbol
    return "??"

Displaying the Identified Note

When a button press is detected, the system captures and processes audio in a loop, showing the detected note and frequency on each pass:

while True:
    if button.value:
        oled.fill(0)
        oled.text("Starting...", 5, 10, 1)
        oled.show()
        time.sleep(1)

        # Capture, analyze and display for 500 passes
        for _ in range(500):
            fft_data = get_condensed_fft()
            highest_frequency = get_highest_frequency(fft_data)
            symbol = find_symbol_for_frequency(highest_frequency)
            oled.fill(0)
            oled.text(f"Note: {symbol}", 5, 20, 1)
            oled.text(f"Freq: {highest_frequency:.1f} Hz", 5, 40, 1)
            oled.show()
            gc.collect()

        oled.fill(0)
        oled.text("Done", 5, 20, 1)
        oled.show()
        time.sleep(2)
        oled.fill(0)
        oled.show()
    time.sleep(0.1)

Final Thoughts

With just a microcontroller, a basic microphone, and CircuitPython, you can analyze flute notes in real time! This project not only introduces you to digital signal processing but also provides a practical tool for musicians. You could extend it by improving accuracy, adding more musical notes, or even visualizing sound waves on the OLED.