Hi
I need to normalize audio (scale the sample values into the [-1; 1] interval).
So I used code from the NAudio demo, added a few lines, and got the following:
float[] sampleData;
using (var reader = new MediaFoundationReader("d:\\Projects\\audio.wma"))
{
    ISampleProvider sp;
    int sourceSamples;
    if (reader.WaveFormat.Encoding == WaveFormatEncoding.Pcm)
    {
        if (reader.WaveFormat.BitsPerSample == 16)
        {
            sp = new Pcm16BitToSampleProvider(reader);
            sourceSamples = (int)(reader.Length / 2);
        }
        else if (reader.WaveFormat.BitsPerSample == 24)
        {
            sp = new Pcm24BitToSampleProvider(reader);
            sourceSamples = (int)(reader.Length / 3);
        }
        else
        {
            throw new ArgumentException("Currently only 16 or 24 bit PCM samples are supported");
        }
    }
    else if (reader.WaveFormat.Encoding == WaveFormatEncoding.IeeeFloat)
    {
        sp = new WaveToSampleProvider(reader);
        sourceSamples = (int)(reader.Length / 4);
    }
    else
    {
        throw new ArgumentException("Must be PCM or IEEE float");
    }
    sampleData = new float[sourceSamples];
    int n = sp.Read(sampleData, 0, sourceSamples);
}

// Find the peak positive and negative sample values
float maxValue = 0.0f;
float minValue = 0.0f;
for (var i = 0; i < sampleData.Length; i++)
{
    if (sampleData[i] > maxValue)
    {
        maxValue = sampleData[i];
    }
    else if (sampleData[i] < minValue)
    {
        minValue = sampleData[i];
    }
}

// Scale positive samples up to 1 and negative samples down to -1
var coef1Channel = 1 / maxValue;
var coef2Channel = -1 / minValue;
for (var i = 0; i < sampleData.Length; i++)
{
    sampleData[i] = sampleData[i] > 0 ? sampleData[i] * coef1Channel
                                      : sampleData[i] * coef2Channel;
}
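(For comparison, I also considered scaling by a single coefficient derived from the absolute peak, so both polarities are scaled equally and the waveform shape is preserved; this is just a sketch of that variant, not NAudio API code:)

```csharp
// Alternative: one coefficient from the absolute peak,
// so positive and negative samples are scaled by the same factor.
float peak = 0.0f;
for (var i = 0; i < sampleData.Length; i++)
{
    peak = Math.Max(peak, Math.Abs(sampleData[i]));
}
if (peak > 0)
{
    float coef = 1.0f / peak;
    for (var i = 0; i < sampleData.Length; i++)
    {
        sampleData[i] *= coef;
    }
}
```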
After this, the audio volume is normalized. But I have two questions:
- How can I convert this float array back to an ISampleProvider or an IWaveProvider?
- Is it possible to use MediaFoundationReader with a stream instead of a file?
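To make the first question concrete, this is the direction I imagine: a small custom provider that serves samples from the in-memory array. The class name ArraySampleProvider is my own invention, not an NAudio type; I don't know whether NAudio already ships something equivalent.

```csharp
using System;
using NAudio.Wave;

// Hypothetical wrapper: exposes a float[] as an ISampleProvider.
class ArraySampleProvider : ISampleProvider
{
    private readonly float[] samples;
    private int position;

    public ArraySampleProvider(float[] samples, WaveFormat waveFormat)
    {
        this.samples = samples;
        WaveFormat = waveFormat; // should be an IEEE float WaveFormat
    }

    public WaveFormat WaveFormat { get; }

    public int Read(float[] buffer, int offset, int count)
    {
        int available = Math.Min(count, samples.Length - position);
        Array.Copy(samples, position, buffer, offset, available);
        position += available;
        return available; // returning 0 signals end of stream
    }
}
```

Is something along these lines the expected approach, or is there a built-in way?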