I put the full version in the issue tracker: http://naudio.codeplex.com/workitem/16375
I looked to see if your option is possible, and I continued my research.
I was trying to convert 16-bit PCM files to IEEE float and encountered a problem where the output files grew until hitting the 2GB limit. My code was:
public static void ImportAudio3(string sourceFileName, string targetFileName)
{
    using (var reader = new WaveFileReader(sourceFileName))
    {
        using (var channel = new WaveChannel32(reader))
        {
            WaveFileWriter.CreateWaveFile(targetFileName, channel);
        }
    }
}
Is it intended that WaveChannel32.Read() never returns 0 or is it a bug?
I just realized that I can avoid using WaveChannel32 by using AudioFileReader and passing that directly to WaveFileWriter.CreateWaveFile, but I still thought I'd ask because I'm using WaveChannel32 in other places.
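For reference, a sketch of that workaround; AudioFileReader outputs IEEE float and its Read returns 0 at end of stream, so CreateWaveFile terminates:

using (var reader = new AudioFileReader(sourceFileName))
{
    // AudioFileReader delivers IEEE float samples and reports end of
    // stream correctly, so the output file stops at the end of the input
    WaveFileWriter.CreateWaveFile(targetFileName, reader);
}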
I had already run tests to avoid the latency before the first read, but I have not found another solution.
Set PadWithZeros to false, or preferably use Wave16ToFloatProvider instead. Really I want to obsolete WaveChannel32 and WaveMixerStream, as there are better ways to do mixing, but a lot of people are still using them.
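A minimal sketch of that suggestion, assuming the source file is 16-bit PCM:

using (var reader = new WaveFileReader(sourceFileName))
{
    // Wave16ToFloatProvider converts 16-bit PCM to IEEE float and returns 0
    // from Read at the end of the source, so the output no longer grows forever.
    // (Alternatively, keep WaveChannel32 but set channel.PadWithZeros = false.)
    var floatProvider = new Wave16ToFloatProvider(reader);
    WaveFileWriter.CreateWaveFile(targetFileName, floatProvider);
}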
Hi all,
I have seen this topic, and I have a question similar to the one above. Here is my code:
private WaveIn sourceStream = null;
private WaveOut waveOut = null;
private BufferedWaveProvider bufferedWaveIn = null;

public WaveIn_WaveOut_TimeShift(int deviceNumberIN, int deviceNumberOUT, int sampleRate)
{
    sourceStream = new WaveIn();
    sourceStream.DeviceNumber = deviceNumberIN;
    sourceStream.WaveFormat = new WaveFormat(sampleRate, WaveIn.GetCapabilities(deviceNumberIN).Channels);
    sourceStream.DataAvailable += new EventHandler<WaveInEventArgs>(sourceStream_DataAvailable);
    // BufferedWaveProvider takes a WaveFormat, not the WaveIn itself
    bufferedWaveIn = new BufferedWaveProvider(sourceStream.WaveFormat);
    bufferedWaveIn.BufferDuration = TimeSpan.FromSeconds(30);
    waveOut = new WaveOut();
    waveOut.DeviceNumber = deviceNumberOUT;
    waveOut.Init(bufferedWaveIn);
}

~WaveIn_WaveOut_TimeShift()
{
    Dispose();
}

public void Dispose()
{
    if (sourceStream != null)
    {
        sourceStream.StopRecording();
        sourceStream.Dispose();
        sourceStream = null;
    }
    if (waveOut != null)
    {
        waveOut.Stop();
        waveOut.Dispose();
        waveOut = null;
    }
}

public void Start()
{
    sourceStream.StartRecording();
    waveOut.Play();
}

public void Stop()
{
    sourceStream.StopRecording();
}

private void sourceStream_DataAvailable(object sender, WaveInEventArgs e)
{
    bufferedWaveIn.AddSamples(e.Buffer, 0, e.BytesRecorded);
}
How can I add a delay of, for example, 15 seconds? I am routing waveIn to waveOut in real time and want to introduce a delay of n seconds.
Implement an IWaveProvider that, in its Read method, just returns empty buffers until a total of 15 * WaveFormat.AverageBytesPerSecond bytes have been requested, then starts returning data from the BufferedWaveProvider.
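A rough sketch of that idea (the class and field names here are mine, not part of NAudio):

using System;
using NAudio.Wave;

public class DelayedWaveProvider : IWaveProvider
{
    private readonly BufferedWaveProvider source;
    private long silenceBytesRemaining;

    public DelayedWaveProvider(BufferedWaveProvider source, TimeSpan delay)
    {
        this.source = source;
        // total number of silent bytes to serve before real data starts
        silenceBytesRemaining = (long)(delay.TotalSeconds * source.WaveFormat.AverageBytesPerSecond);
    }

    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(byte[] buffer, int offset, int count)
    {
        if (silenceBytesRemaining > 0)
        {
            // keep returning zeroed (silent) buffers until the delay elapses
            int silence = (int)Math.Min(silenceBytesRemaining, count);
            Array.Clear(buffer, offset, silence);
            silenceBytesRemaining -= silence;
            return silence;
        }
        // after that, pass reads through to the buffered microphone data
        return source.Read(buffer, offset, count);
    }
}

In the class above you would then call waveOut.Init(new DelayedWaveProvider(bufferedWaveIn, TimeSpan.FromSeconds(15))); the BufferDuration must of course be at least as long as the delay.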
Mark
Hi,
Could you really get it to work? I tried many times, but I just hear all of the voice with echo, as if there were no echo canceller at all.
The only difference is that I changed libspeex.dll to libspeexdsp.dll, because the function names could not be found in libspeex.dll.
It would be nice if you could help me.
Thanks
Sorry, I don't think I defined the question well.
I am routing the waveIn input to the waveOut device in real time.
I then want to listen to the last 15 seconds, so I want to roll back 15 seconds and play back that history for 15 seconds (I can lose real-time data while listening to the history).
I think BufferedWaveProvider buffers the desired length of data via its BufferDuration property.
So in this case I only want BufferedWaveProvider to give waveOut a buffer from n seconds in the past instead of the latest data.
Is this possible?
I thought IMFTransform could be used to do conversions.
I found a simple example:
http://code.google.com/p/bitspersampleconv2/wiki/HowToUseResamplerMFT
You'd need to make your own customised version of BufferedWaveProvider in order to do this.
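A hypothetical sketch of such a customised provider (all type and member names here are my own, not part of NAudio): a circular buffer that keeps the most recent history and lets the caller jump the read position back by up to that many seconds.

using System;
using NAudio.Wave;

public class HistoryWaveProvider : IWaveProvider
{
    private readonly byte[] ring;     // circular buffer holding recent audio
    private long writePos;            // total bytes ever written
    private long readPos;             // total bytes ever read

    public HistoryWaveProvider(WaveFormat format, TimeSpan history)
    {
        WaveFormat = format;
        ring = new byte[(int)(history.TotalSeconds * format.AverageBytesPerSecond)];
    }

    public WaveFormat WaveFormat { get; private set; }

    // call this from WaveIn's DataAvailable handler
    public void AddSamples(byte[] buffer, int offset, int count)
    {
        lock (ring)
        {
            for (int i = 0; i < count; i++)
                ring[(int)((writePos + i) % ring.Length)] = buffer[offset + i];
            writePos += count;
            if (readPos < writePos - ring.Length)
                readPos = writePos - ring.Length; // reader fell out of the window
        }
    }

    // jump the playback position back, clamped to the history we still hold
    // (for real use, round 'back' to a WaveFormat.BlockAlign boundary)
    public void Rewind(TimeSpan amount)
    {
        lock (ring)
        {
            long back = (long)(amount.TotalSeconds * WaveFormat.AverageBytesPerSecond);
            readPos = Math.Max(Math.Max(writePos - ring.Length, 0), readPos - back);
        }
    }

    public int Read(byte[] buffer, int offset, int count)
    {
        lock (ring)
        {
            int available = (int)Math.Min(count, writePos - readPos);
            for (int i = 0; i < available; i++)
                buffer[offset + i] = ring[(int)((readPos + i) % ring.Length)];
            readPos += available;
            // pad the rest with silence so WaveOut keeps playing
            Array.Clear(buffer, offset + available, count - available);
            return count;
        }
    }
}

In the class above you would use this in place of the BufferedWaveProvider, init waveOut from it, and call Rewind(TimeSpan.FromSeconds(15)) when you want to hear the history.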
Ok.
Thanks Mark.
I will try this option and see.
For lack of time, I have abandoned this.
Here is a more detailed description:
OS: Windows 7 Professional 64-bit
NAudio: 1.6.0
Current Code:
WaveFileReader reader = new NAudio.Wave.WaveFileReader("a.wav");
WaveStream downsampledStream = new WaveFormatConversionStream(new WaveFormat(8000, reader.WaveFormat.BitsPerSample, reader.WaveFormat.Channels), reader);
WaveStream alawStream = new WaveFormatConversionStream(WaveFormat.CreateALawFormat(downsampledStream.WaveFormat.SampleRate, downsampledStream.WaveFormat.Channels), downsampledStream);
This fails with an exception at the last line (alawStream).
After the last test failed, I changed the input file.
VLC reports: PCM U8 (araw), Mono, 22050 Hz, Bits per Sample: 8
When I debug step by step, the three objects reader, downsampledStream, and alawStream contain the following information:
reader: WaveFormat {8 bit PCM: 22kHz 1 channels} NAudio.Wave.WaveFormat {NAudio.Wave.WaveFormatExtraData}
downsampledStream: WaveFormat {8 bit PCM: 8kHz 1 channels} , TotalTime {00:00:07.9250000}, CurrentTime {00:00:00}
alawStream: Exception NAudio.MmResult.AcmNotPossible {"AcmNotPossible calling acmStreamOpen"}
at NAudio.MmException.Try(MmResult result, String function)
at NAudio.Wave.Compression.AcmStream..ctor(WaveFormat sourceFormat, WaveFormat destFormat)
at NAudio.Wave.WaveFormatConversionStream..ctor(WaveFormat targetFormat, WaveStream sourceStream)
at RTPtest.Form1.Form1_Load(Object sender, EventArgs e) in C:\Users\joka\documents\visual studio 2010\Projects\RTPtest\RTPtest\Form1.cs:line 50
You should be starting with 16-bit audio, not 8-bit. The a-law encoder converts from 16 bit to 8 bit.
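A sketch of that fix for the code above: first convert the 8-bit source to 16-bit PCM, then downsample, then a-law encode, each as a separate ACM step (assuming the ACM PCM converter on the machine supports each step; file names are from the post above):

using (var reader = new WaveFileReader("a.wav"))
// step 1: 8-bit PCM -> 16-bit PCM at the original sample rate
using (var pcm16 = new WaveFormatConversionStream(
    new WaveFormat(reader.WaveFormat.SampleRate, 16, reader.WaveFormat.Channels), reader))
// step 2: downsample the 16-bit PCM to 8 kHz
using (var downsampled = new WaveFormatConversionStream(
    new WaveFormat(8000, 16, pcm16.WaveFormat.Channels), pcm16))
// step 3: 16-bit PCM -> 8-bit a-law
using (var alawStream = new WaveFormatConversionStream(
    WaveFormat.CreateALawFormat(8000, downsampled.WaveFormat.Channels), downsampled))
{
    WaveFileWriter.CreateWaveFile("a-alaw.wav", alawStream);
}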
Thanks a lot. That works.
Is there an overview of which conversions are supported? I assume it depends on the codec?
Every codec has a list of supported input and output formats. You can look at these using the ACM demo in the NAudio Demo project.
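A minimal sketch of enumerating them in code, using the same AcmDriver class the ACM demo is built on (exact property names may differ slightly between NAudio versions):

using System;
using NAudio.Wave.Compression;

// list every installed ACM codec with its format tags and formats
foreach (var driver in AcmDriver.EnumerateAcmDrivers())
{
    Console.WriteLine(driver.LongName);
    driver.Open();
    foreach (var formatTag in driver.FormatTags)
    {
        Console.WriteLine("  {0}", formatTag.FormatDescription);
        foreach (var format in driver.GetFormats(formatTag))
        {
            Console.WriteLine("    {0}", format.FormatDescription);
        }
    }
    driver.Close();
}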