Hi, I've been using the noise reduction algorithm to standardize the noise before feeding the signal into a deep learning model. The issue I found while doing the noise reduction is a slight frequency shift toward lower values in the signal.
Original: [mel spectrogram image]
Reduced noise: [mel spectrogram image]
The shift becomes evident when looking at the colors near 4096 Hz on the mel spectrogram.
My question is: how can I avoid this shift to lower frequencies? (It happens when reducing both stationary and non-stationary noise; a minimal sketch of the comparison is below.)
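For reference, this is roughly how I produce the comparison (the file name is a placeholder, and I'm assuming noisereduce's v2 reduce_noise signature):

```python
import numpy as np
import librosa
import noisereduce as nr

# Load the clip at its native sample rate (file name is a placeholder).
y, sr = librosa.load("clip.wav", sr=None)

# Reduce stationary noise; stationary=False (non-stationary) shows the same shift.
y_denoised = nr.reduce_noise(y=y, sr=sr, stationary=True)

# Mel spectrograms of both signals for side-by-side inspection.
mel_orig = librosa.feature.melspectrogram(y=y, sr=sr)
mel_denoised = librosa.feature.melspectrogram(y=y_denoised, sr=sr)

# Look at the energy around 4096 Hz, where the shift is most visible.
mel_freqs = librosa.mel_frequencies(n_mels=mel_orig.shape[0], fmax=sr / 2)
bin_4096 = np.argmin(np.abs(mel_freqs - 4096))
print("mean energy near 4096 Hz (original):", mel_orig[bin_4096].mean())
print("mean energy near 4096 Hz (denoised):", mel_denoised[bin_4096].mean())
```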
Thank you
Hi Jose,
Thanks for pointing this out. I'll have to dig into the code to see where
this is happening. Presumably it's either in inverting the STFT or in the
loop of inverting the mel spectrogram back to a linear spectrogram and then
back into the mel spectrogram. I'd guess it's the latter. I'm not looking
at the code right now, but I believe I'm using librosa's built-in mel
filterbank and inversion functions.
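In the meantime, a round trip like the sketch below should show whether the mel inversion alone smears energy downward. These are librosa defaults, not necessarily the exact parameters noisereduce uses, so treat it as a way to test the hypothesis rather than the actual code path:

```python
import numpy as np
import librosa

# Pure tone at 4096 Hz, where the shift was reported.
sr = 22050
t = np.linspace(0, 1.0, sr, endpoint=False)
y = np.sin(2 * np.pi * 4096 * t)

# Linear power spectrogram, then the suspected round trip:
# linear -> mel -> linear (NNLS inversion) -> compare peaks.
S = np.abs(librosa.stft(y)) ** 2
mel = librosa.feature.melspectrogram(S=S, sr=sr)
S_rec = librosa.feature.inverse.mel_to_stft(mel, sr=sr, power=2.0)

# If the inversion smears energy, the reconstructed peak sits below 4096 Hz.
freqs = librosa.fft_frequencies(sr=sr)
print("original peak:      %.1f Hz" % freqs[S.mean(axis=1).argmax()])
print("reconstructed peak: %.1f Hz" % freqs[(S_rec ** 2).mean(axis=1).argmax()])
```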