Hi Mark,
Waiting for your reply.
Thanks
Hi, you could try this instead of SampleChannel:

SampleProvider = RawSourceWaveStream.ToSampleProvider();

Unfortunately, that is not working.
ToSampleProvider:
public static ISampleProvider ToSampleProvider(this IWaveProvider waveProvider)
{
return SampleProviderConverters.ConvertWaveProviderIntoSampleProvider(waveProvider);
}
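As an aside, a minimal sketch of the ToSampleProvider route. The raw PCM source file, the 44.1 kHz stereo format, and the variable names here are illustrative assumptions, not details from the thread:

```
using System.IO;
using System.Threading;
using NAudio.Wave;

// Illustrative only: "audio.raw" is assumed to hold raw 16-bit PCM bytes.
byte[] pcmData = File.ReadAllBytes("audio.raw");
var format = new WaveFormat(44100, 16, 2); // rate, bits, channels
var rawStream = new RawSourceWaveStream(new MemoryStream(pcmData), format);

// ToSampleProvider wraps the IWaveProvider in a float ISampleProvider,
// the same conversion SampleChannel performs internally.
ISampleProvider sampleProvider = rawStream.ToSampleProvider();

using (var waveOut = new WaveOutEvent())
{
    waveOut.Init(sampleProvider);
    waveOut.Play();
    while (waveOut.PlaybackState == PlaybackState.Playing)
    {
        Thread.Sleep(100); // block until playback finishes
    }
}
```

Note this only works if the raw stream's WaveFormat is plain PCM or IEEE float; for anything else the converter has no built-in path, which is where the ACM discussion below comes in.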
SampleChannel's constructor:

public SampleChannel(IWaveProvider waveProvider, bool forceStereo)
{
ISampleProvider source = SampleProviderConverters.ConvertWaveProviderIntoSampleProvider(waveProvider);
...
}

Although, this seems to be an ACM error, which indicates that the Audio Compression Manager isn't capable of handling your self-created WaveFormat and fails to convert it in:

_waveOut.Init(_waveProvider);
Yes. And when I try to initialize WaveOut without SampleChannel, I get an error.

You will need to use WaveFormatConversionStream.CreatePcmStream to convert to linear PCM before playing. However, you do need to make sure your WaveFormat matches the one ACM is expecting. You can use the NAudio demo application to scrutinize the details of your ACM codecs, or maybe more simply by using AdpcmWaveFormat.
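To make that suggestion concrete, here is a hedged sketch of the CreatePcmStream route. Note that WaveFormatConversionStream.CreatePcmStream takes a WaveStream, so a bare IWaveProvider generally needs wrapping first; the input file name is a hypothetical ADPCM-encoded WAV, not a file from this thread:

```
using System.Threading;
using NAudio.Wave;

// "input-adpcm.wav" is a hypothetical file whose data chunk is ADPCM-encoded,
// i.e. a format that an installed ACM codec knows how to decode.
using (var reader = new WaveFileReader("input-adpcm.wav"))
using (var pcmStream = WaveFormatConversionStream.CreatePcmStream(reader))
using (var waveOut = new WaveOutEvent())
{
    // pcmStream.WaveFormat is now linear PCM, which WaveOut can play directly
    waveOut.Init(pcmStream);
    waveOut.Play();
    while (waveOut.PlaybackState == PlaybackState.Playing)
    {
        Thread.Sleep(100);
    }
}
```

If CreatePcmStream throws AcmNotPossible here, that usually means no installed ACM codec recognizes the source WaveFormat, which matches the diagnosis above.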
Yeahoo! WaveFormatConversionStream.CreatePcmStream(waveProvider) is working! Thank you!

AdpcmWaveFormat is unusable because it contains protected fields without public accessors. And some classes trying to marshal WaveFormat bypass the Serialize method. =\ Instead, I use WaveFormat.FromFormatChunk. It works fine for me. (:

MemoryStream s = new MemoryStream();
MemoryStream newStream = new MemoryStream();
int length = 0;
byte[] buffer = null;
int read = 0;
MixingSampleProvider mixer2 = new MixingSampleProvider(_samples);
SampleToWaveProvider16 mixer3 = new SampleToWaveProvider16(mixer2);
length = mixer3.WaveFormat.AverageBytesPerSecond*Convert.ToInt32(noisePosition.TotalSeconds);
buffer = new byte[length];
WaveFileWriter waveFileWriter = new WaveFileWriter(new IgnoreDisposeStream(s), mixer3.WaveFormat);
while ((read = mixer3.Read(buffer, 0, buffer.Length)) > 0)
{
waveFileWriter.Write(buffer, 0, read);
}
waveFileWriter.Dispose(); // Dispose flushes and closes, so separate Flush/Close calls are unnecessary
s.WriteTo(newStream);
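One small note on the MemoryStream juggling above (a guess at intent, since the later use of newStream is not shown): MemoryStream.WriteTo copies the whole buffer regardless of the source's position, but it leaves the destination's Position at the end, so a rewind is usually needed before reading it back:

```
s.WriteTo(newStream);   // copies s's entire contents into newStream
newStream.Position = 0; // rewind before e.g. handing it to a WaveFileReader
```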
Here are the details of the Exception:
System.IndexOutOfRangeException was caught
HResult=-2146233080
Message=Index was outside the bounds of the array.
Source=NAudio
StackTrace:
at NAudio.Wave.SampleProviders.Pcm16BitToSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
at NAudio.Wave.SampleProviders.OffsetSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
at NAudio.Wave.SampleProviders.MixingSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
at NAudio.Wave.SampleProviders.OffsetSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
at NAudio.Wave.SampleProviders.MixingSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
at NAudio.Wave.SampleProviders.SampleToWaveProvider16.Read(Byte[] destBuffer, Int32 offset, Int32 numBytes)
at GamedayRadio.HalfInning.Process() in xxxxxx
InnerException:
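A hedged guess at the IndexOutOfRangeException above: the 16-bit conversion chain works in whole samples, so a read count that is not a multiple of the format's BlockAlign (or OffsetSampleProvider offsets/takes that are not whole sample-frames) can push an out-of-range index down the chain. Rounding the buffer size down to a whole number of blocks is a cheap thing to try; mixer3 and noisePosition are the names from the code above:

```
var waveFormat = mixer3.WaveFormat;
int length = waveFormat.AverageBytesPerSecond * Convert.ToInt32(noisePosition.TotalSeconds);
length -= length % waveFormat.BlockAlign; // round down to whole sample-frames
byte[] buffer = new byte[length];
```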
Thanks in advance for any help/ideas/pointers/direction you can provide.

NAudio.Wave.WaveIn sourceStream = null;
NAudio.Wave.WaveIn sourceStream2 = null;
NAudio.Wave.WaveIn sourceStream3 = null;
NAudio.Wave.DirectSoundOut waveOut = null;
NAudio.Wave.DirectSoundOut waveOut2 = null;
NAudio.Wave.DirectSoundOut waveOut3 = null;
NAudio.Wave.WaveFileWriter waveWriter = null;
NAudio.Wave.WaveFileWriter waveWriter2 = null;
NAudio.Wave.WaveFileWriter waveWriter3 = null;
private void button5_Click(object sender, EventArgs e)
{
for (int i = 0; i < NAudio.Wave.WaveIn.DeviceCount; i++)
{
if (sourceStream == null)
{
sourceStream = new NAudio.Wave.WaveIn();
sourceStream.DeviceNumber = i;
sourceStream.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(i).Channels);
sourceStream.DataAvailable += new EventHandler<NAudio.Wave.WaveInEventArgs>(sourceStream_DataAvailable);
waveWriter = new NAudio.Wave.WaveFileWriter(String.Concat("D:/record-", i + 1, ".wav"), sourceStream.WaveFormat);
sourceStream.StartRecording();
continue;
}
if (sourceStream2 == null)
{
sourceStream2 = new NAudio.Wave.WaveIn();
sourceStream2.DeviceNumber = i;
sourceStream2.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(i).Channels);
sourceStream2.DataAvailable += new EventHandler<NAudio.Wave.WaveInEventArgs>(sourceStream2_DataAvailable);
waveWriter2 = new NAudio.Wave.WaveFileWriter(String.Concat("D:/record-", i + 1, ".wav"), sourceStream2.WaveFormat);
sourceStream2.StartRecording();
continue;
}
if (sourceStream3 == null)
{
sourceStream3 = new NAudio.Wave.WaveIn();
sourceStream3.DeviceNumber = i;
sourceStream3.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(i).Channels);
sourceStream3.DataAvailable += new EventHandler<NAudio.Wave.WaveInEventArgs>(sourceStream_DataAvailable); // note: unlike the first two branches, this reuses sourceStream's handler; you probably want a sourceStream3_DataAvailable here
waveWriter3 = new NAudio.Wave.WaveFileWriter(String.Concat("D:/record-", i + 1, ".wav"), sourceStream3.WaveFormat);
sourceStream3.StartRecording();
continue;
}
}
}
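Nothing in the loop above ever stops the recorders or disposes the writers, which leaves the WAV files with unfinalized headers. A hedged sketch of the usual shutdown shape (the method name is mine, not from the post):

```
private void StopRecording(NAudio.Wave.WaveIn source, NAudio.Wave.WaveFileWriter writer)
{
    if (source != null)
    {
        source.StopRecording(); // stop the callbacks before tearing down the writer
        source.Dispose();
    }
    if (writer != null)
    {
        writer.Dispose(); // flushes buffered audio and finalizes the WAV header
    }
}
```

You would call this once per source/writer pair, e.g. from a stop button or the form's closing event.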
The error occurred in this line: sourceStream2.StartRecording();

MmException: AcmNotPossible calling acmStreamConvert
  NAudio.MmException.Try (MmResult result, System.String function)
  NAudio.Wave.Compression.AcmStreamHeader.Convert (Int32 bytesToConvert, System.Int32& sourceBytesConverted)
  NAudio.Wave.Compression.AcmStream.Convert (Int32 bytesToConvert, System.Int32& sourceBytesConverted)
  NAudio.Wave.AcmMp3FrameDecompressor.DecompressFrame (NAudio.Wave.Mp3Frame frame, System.Byte[] dest, Int32 destOffset)

I have tried with different radios online, but I always get that error. I don't know what is happening... Any help?
private IWavePlayer mWaveOutDevice;
private WaveStream mMainOutputStream;
private WaveChannel32 mVolumeStream;
private VolumeWaveProvider16 volumeProvider;
private string m_Url = "http://37.59.32.115:8122/";
enum StreamingPlaybackState
{
Stopped,
Playing,
Buffering,
Paused
}
private volatile StreamingPlaybackState playbackState = StreamingPlaybackState.Stopped;
private bool fullyDownloaded = false;
public bool m_Play = false;
float timer;
void Update()
{
if (m_Play)
{
playbackState = StreamingPlaybackState.Buffering;
StreamMP3(m_Url);
m_Play = false;
}
switch (playbackState)
{
case StreamingPlaybackState.Buffering:
case StreamingPlaybackState.Playing:
StreamMP3(m_Url);
break;
default:
break;
}
}
HttpWebRequest webRequest;
BufferedWaveProvider bufferedWaveProvider = null;
byte[] buffer = new byte[16384 * 4];
private void StreamMP3(string lUrl)
{
this.fullyDownloaded = false;
webRequest = (HttpWebRequest)WebRequest.Create(lUrl);
int metaInt = 0; // icy-metaint: byte interval between SHOUTcast metadata blocks
webRequest.Headers.Clear();
webRequest.Headers.Add("GET", "/ HTTP/1.0");
webRequest.Headers.Add("Icy-MetaData", "1");
webRequest.UserAgent = "WinampMPEG/5.09";
HttpWebResponse resp = null;
try
{
resp = (HttpWebResponse)webRequest.GetResponse();
}
catch(WebException e)
{
if (e.Status != WebExceptionStatus.RequestCanceled)
{
Debug.LogError(e.Message);
}
return;
}
// needs to be big enough to hold a decompressed frame
try
{
// read blocksize to find metadata block
metaInt = Convert.ToInt32(resp.GetResponseHeader("icy-metaint"));
}
catch
{
}
IMp3FrameDecompressor decompressor = null;
try
{
using (var responseStream = resp.GetResponseStream())
{
ReadFullyStream readFullyStream = new ReadFullyStream(responseStream);
//do
{
if (bufferedWaveProvider != null && bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes < bufferedWaveProvider.WaveFormat.AverageBytesPerSecond / 4)
{
Debug.LogError("Buffer getting full, taking a break");
Thread.Sleep(500);
}
else
{
Mp3Frame frame = null;
try
{
frame = Mp3Frame.LoadFromStream(readFullyStream, true);
}
catch (EndOfStreamException)
{
this.fullyDownloaded = true;
Debug.LogError("reached the end of the MP3 file / stream");
// break;
}
catch (WebException)
{
// probably we have aborted download from the GUI thread
// break;
}
if (decompressor == null && frame != null)
{
// don't think these details matter too much - just help ACM select the right codec
// however, the buffered provider doesn't know what sample rate it is working at
// until we have a frame
WaveFormat waveFormat = new Mp3WaveFormat(frame.SampleRate, frame.ChannelMode == ChannelMode.Mono ? 1 : 2, frame.FrameLength, frame.BitRate);
decompressor = new AcmMp3FrameDecompressor(waveFormat);
if(bufferedWaveProvider == null)
{
this.bufferedWaveProvider = new BufferedWaveProvider(decompressor.OutputFormat);
this.bufferedWaveProvider.BufferDuration = TimeSpan.FromSeconds(20); // allow us to get well ahead of ourselves
}
}
// guard against the null frame that follows an EndOfStreamException
if (frame != null && decompressor != null)
{
int decompressed = decompressor.DecompressFrame(frame, buffer, 0);
if (bufferedWaveProvider != null)
{
bufferedWaveProvider.AddSamples(buffer, 0, decompressed);
}
}
}
}
if (this.mWaveOutDevice == null && this.bufferedWaveProvider != null)
{
Debug.Log("Creating WaveOut Device");
this.mWaveOutDevice = new WaveOut();
this.volumeProvider = new VolumeWaveProvider16(bufferedWaveProvider);
this.volumeProvider.Volume = 1.0f; // Volume is a linear multiplier: 1.0f is full volume (100.0f would clip badly)
mWaveOutDevice.Init(volumeProvider);
}
else if (bufferedWaveProvider != null)
{
double bufferedSeconds = bufferedWaveProvider.BufferedDuration.TotalSeconds;
if(bufferedSeconds > 0.2f && playbackState == StreamingPlaybackState.Buffering)
{
Debug.Log("Playing music...");
mWaveOutDevice.Play();
playbackState = StreamingPlaybackState.Playing;
}
}
}
}
finally
{
if (decompressor != null)
{
decompressor.Dispose();
}
}
}
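For completeness, a hedged sketch of the stop path this streaming code will eventually need. The field names (webRequest, fullyDownloaded, mWaveOutDevice, playbackState) come from the post above; the shape mirrors the standard NAudio streaming-playback pattern:

```
private void StopPlayback()
{
    if (playbackState != StreamingPlaybackState.Stopped)
    {
        if (!fullyDownloaded)
        {
            webRequest.Abort(); // unblocks any pending network read with a WebException
        }
        playbackState = StreamingPlaybackState.Stopped;
        if (mWaveOutDevice != null)
        {
            mWaveOutDevice.Stop();
            mWaveOutDevice.Dispose();
            mWaveOutDevice = null;
        }
    }
}
```

The WebException handler in StreamMP3 already swallows the aborted request, so this is safe to call from the GUI thread.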
}