Gibberish in the beginning of the recording data #442
Comments
I think the gibberish is produced in NRF52ADCChannel::demux. The first buffer in each recording is an old one stored by mic→output. The next few are recorded before, while or shortly after the mic is activated. What settle time does the mic need? The long "tail" of steps is perhaps the StreamNormalizer adjusting the calculated zero offset. Two possible workarounds without changing CODAL:
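The "tail of steps" hypothesis can be illustrated with a toy model of an offset tracker. This is not CODAL's actual StreamNormalizer; the class name, the per-buffer update rule, and the gain value are all assumptions for illustration. The point is that any estimator that corrects a fraction of the error per buffer will settle toward the true DC level in visible per-buffer steps:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical sketch of a zero-offset tracker of the kind the
// StreamNormalizer is suspected to use (the real CODAL code may differ):
// the estimate moves a fraction of the error toward each buffer's mean,
// so a stream that starts far from the true DC level converges in
// "steps", one per buffer.
struct ZeroOffsetTracker {
    float offset = 0.0f;   // current zero-offset estimate
    float gain   = 0.25f;  // fraction of the error corrected per buffer (assumed)

    // Process one buffer: update the offset from the buffer mean,
    // then return the offset-corrected samples.
    std::vector<float> normalize(const std::vector<int16_t>& buf) {
        float mean = 0.0f;
        for (int16_t s : buf) mean += s;
        mean /= static_cast<float>(buf.size());
        offset += gain * (mean - offset);  // step toward the buffer mean
        std::vector<float> out;
        out.reserve(buf.size());
        for (int16_t s : buf) out.push_back(static_cast<float>(s) - offset);
        return out;
    }
};
```

With a constant mic bias, the first corrected buffer still sits far from zero, and later buffers settle close to it, which matches the stepped tail seen in the captures.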
I think this is how MicroBitAudio mic input is operating... As the program starts, the mic is inactive.
The mic ADC channel continues to generate buffers and drive the pipeline, which is blocked in mic→output; the only result is MicroBitAudio calling deactivateMic() twice for each buffer (see SPLITTER_TICK below). No buffers pass through the StreamNormalizer until something pulls on the splitter.

When StreamRecording is started, it simply starts accepting buffers. Though the pull request is prompted by the ADC producing a new buffer, if the pipeline was not flowing, the first buffer pulled is whatever mic→output was holding when it stopped flowing. Pulling that unblocks mic→output, and causes the splitter and rawsplitter to message MicroBitAudio to activate the mic.

Once StreamRecording has ended, deactivateMic() is called repeatedly again because mic→output blocks when no buffers are pulled. uBit.audio.levelSPL can activate the mic and provide a pull on rawsplitter to keep mic→output unblocked and the mic active.

The first block of the first recording after RESET is always slightly off, because it’s the first one through the StreamNormalizer, establishing the zero offset.

Even after uBit.audio.levelSPL→activateForEvents(true), at the end of a StreamRecording, MicroBitAudio sees events 1032, 4 and 10, all of which lead to deactivateMic(). So the mic light briefly flickers off before subsequent events activate it again, which seems to generate some gibberish.

Details... I put a serial number in the first byte of each NRF52ADCChannel::demux buffer, and copied it in StreamNormalizer::pull(). That revealed that the first buffer in a recording was old. DMESGing the first few bytes of each buffer in those places seemed to confirm they haven’t been corrupted.

I don’t understand why the non-blocking mic→output DataStream holds the first buffer it receives until it is pulled, and drops newer buffers. I expected the opposite.
While the ADC is driving the pipeline, rawsplitter and splitter continually send SPLITTER_TICK messages that cause MicroBitAudio to call activateMic() or deactivateMic() twice for each ADC buffer.
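The keep-oldest behaviour described above can be modelled with a one-slot stream. This is not CODAL's actual DataStream implementation; the class and method names here are hypothetical, and the sketch only captures the observed policy: hold the first buffer pushed while no consumer is pulling, and drop newer ones.

```cpp
#include <cstdint>
#include <optional>
#include <vector>

using Buffer = std::vector<int16_t>;

// Minimal model of a one-slot, non-blocking stream matching the
// observed behaviour: it holds the FIRST buffer pushed and drops
// newer ones until a consumer pulls. When the pipeline stalls in
// mic->output, the stale buffer held here is what the next
// recording pulls first.
class OneSlotStream {
    std::optional<Buffer> slot;
public:
    // Producer side: store the buffer only if the slot is empty,
    // otherwise drop it (keep-oldest policy).
    bool push(const Buffer& b) {
        if (slot) return false;  // newer buffer dropped
        slot = b;
        return true;
    }
    // Consumer side: take whatever is held, emptying the slot.
    std::optional<Buffer> pull() {
        auto b = slot;
        slot.reset();
        return b;
    }
};
```

Under this policy, the first pull after a stall always returns a stale buffer, consistent with the serial-number experiment above; a keep-newest policy (replacing the slot on every push) would instead return fresh data.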
When recording data from the microphone, the beginning of the data is clearly not right.
I've taken measurements with a recording and playback sampling rate of ~44.1 kHz, using the StreamRecording and retrieving the data with the SerialStreamer. During sound recording, I've placed my finger on top of the microphone hole to try to minimise background noise, and once the data stabilises, it stays very close to the zero line.

The top graph plots 11k samples, 250 ms at 44.1 kHz. The second graph zooms into the first 2560 samples:
The ADC DMA buffers are 512 bytes, and with 14-bit samples stored as 2 bytes each, we get 256 samples per buffer.
In most captures that I've done it looks like the data is only stable after 2560 samples, which is interestingly exactly 10 DMA buffers.
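The arithmetic can be checked directly. The buffer size, sample width, and buffer count are taken from the observations above; the millisecond figure is derived from them:

```cpp
// Back-of-the-envelope check of the numbers in the report:
// 512-byte DMA buffers, 2 bytes per sample, ~44100 Hz sample rate,
// data stable only after 10 buffers.
constexpr int bufferBytes      = 512;
constexpr int bytesPerSample   = 2;   // 14-bit samples in 16-bit words
constexpr int samplesPerBuffer = bufferBytes / bytesPerSample;  // 256

constexpr int unstableBuffers = 10;
constexpr int unstableSamples = unstableBuffers * samplesPerBuffer;  // 2560

// At 44.1 kHz, 2560 samples is roughly 58 ms of gibberish at the start.
constexpr double sampleRateHz = 44100.0;
constexpr double unstableMs   = 1000.0 * unstableSamples / sampleRateHz;
```

So the unstable region works out to about 58 ms of audio, or 10 full DMA buffer periods, at the start of each recording.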
Source code to replicate
This example code needs this fix in the SerialStreamer to work correctly:
Serial output
I didn't include all of the serial output; I stopped it around the time the data stabilises.
MICROBIT.hex.zip