Channel: NAudio

New Post: AudioSubType Complement


I put a full version in the issue tracker: http://naudio.codeplex.com/workitem/16375

I looked to see if your option is possible, and I continued my research.


New Post: WaveChannel32.Read never returns 0


I was trying to convert 16-bit PCM files to IeeeFloat and encountered a problem where the files grew until hitting the 2 GB limit. My code was:

 

public static void ImportAudio3(string sourceFileName, string targetFileName)
{
    using (var reader = new WaveFileReader(sourceFileName))
    using (var channel = new WaveChannel32(reader))
    {
        WaveFileWriter.CreateWaveFile(targetFileName, channel);
    }
}

Is it intended that WaveChannel32.Read() never returns 0 or is it a bug?

I just realized that I can avoid using WaveChannel32 by using AudioFileReader and passing that directly to WaveFileWriter.CreateWaveFile, but I still thought I'd ask because I'm using WaveChannel32 in other places.

New Post: AudioSubType Complement


I had already run tests to avoid the latency of the first read.

But I have not found another solution.

New Post: AudioSubType Complement

That's good. I wanted to make one that implements ISampleProvider, so I'll look at your solution. Another thing I want to add is to let MF handle resampling at the reader stage, so if you are using it with WASAPI (which doesn't automatically resample) you can easily get to the sample rate it wants.


On 29 November 2012 10:16, ManuN <notifications@codeplex.com> wrote:

From: ManuN

I had already run tests to avoid the latency of the first read.

But I have not found another solution.

Read the full discussion online.

To add a post to this discussion, reply to this email (naudio@discussions.codeplex.com)

To start a new discussion for this project, email naudio@discussions.codeplex.com

You are receiving this email because you subscribed to this discussion on CodePlex. You can unsubscribe or change your settings on CodePlex.com.

Please note: Images and attachments will be removed from emails. Any posts to this discussion will also be available online at codeplex.com


New Post: WaveChannel32.Read never returns 0


Set PadWithZeros to false, or preferably use Wave16ToFloatProvider instead. Really I want to obsolete WaveChannel32 and WaveMixerStream, as there are better ways to do mixing, but a lot of people are still using them.
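A minimal sketch of the suggested alternative, assuming NAudio's Wave16ToFloatProvider (the file names here are placeholders). Since WaveFileReader's Read returns 0 at end of file, Wave16ToFloatProvider does too, so CreateWaveFile terminates normally instead of growing toward the 2 GB limit:

```csharp
using NAudio.Wave;

// Convert 16-bit PCM to IEEE float without WaveChannel32.
// "input16.wav" / "output-float.wav" are placeholder file names.
using (var reader = new WaveFileReader("input16.wav"))
{
    var floatProvider = new Wave16ToFloatProvider(reader);
    WaveFileWriter.CreateWaveFile("output-float.wav", floatProvider);
}
```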

New Post: How to code a circular buffer to create a delay and use WaveIn for microphone and output to speaker?


Hi all,

I have seen this topic, and I have a question like the one above. Here is my code:

private WaveIn sourceStream = null;
private WaveOut waveOut = null;
private BufferedWaveProvider bufferedWaveIn = null;

public WaveIn_WaveOut_TimeShift(int deviceNumberIN, int deviceNumberOUT, int sampleRate)
{
    sourceStream = new WaveIn();
    sourceStream.DeviceNumber = deviceNumberIN;
    sourceStream.WaveFormat = new WaveFormat(sampleRate, WaveIn.GetCapabilities(deviceNumberIN).Channels);
    sourceStream.DataAvailable += new EventHandler<WaveInEventArgs>(sourceStream_DataAvailable);

    // BufferedWaveProvider takes a WaveFormat, not the WaveIn itself
    bufferedWaveIn = new BufferedWaveProvider(sourceStream.WaveFormat);
    bufferedWaveIn.BufferDuration = TimeSpan.FromSeconds(30);

    waveOut = new WaveOut();
    waveOut.DeviceNumber = deviceNumberOUT;
    waveOut.Init(bufferedWaveIn); // was Init(waveIn), an undefined name
}

~WaveIn_WaveOut_TimeShift()
{
    Dispose();
}

public void Dispose()
{
    if (sourceStream != null)
    {
        sourceStream.StopRecording();
        sourceStream.Dispose();
        sourceStream = null;
    }
    if (waveOut != null)
    {
        waveOut.Stop();
        waveOut.Dispose();
        waveOut = null;
    }
}

public void Start()
{
    sourceStream.StartRecording();
    waveOut.Play();
}

public void Stop()
{
    sourceStream.StopRecording();
}

private void sourceStream_DataAvailable(object sender, WaveInEventArgs e)
{
    bufferedWaveIn.AddSamples(e.Buffer, 0, e.BytesRecorded);
}

How can I add a delay of 15 seconds, for example? I am routing WaveIn to WaveOut in real time, and I want to create a delay of n seconds on it.

New Post: How to code a circular buffer to create a delay and use WaveIn for microphone and output to speaker?


Implement an IWaveProvider that, in its Read method, just returns empty buffers until a total of 15 * WaveFormat.AverageBytesPerSecond bytes have been requested. Then start returning data from the BufferedWaveProvider.

Mark
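Mark's suggestion could be sketched like this. The class name DelayedWaveProvider is hypothetical (not part of NAudio); it simply serves silence until the requested delay's worth of bytes has been read, then passes through to the wrapped provider:

```csharp
using System;
using NAudio.Wave;

// Hypothetical helper: returns silence until delaySeconds' worth of
// bytes have been requested, then reads from the wrapped source.
public class DelayedWaveProvider : IWaveProvider
{
    private readonly IWaveProvider source;
    private long silenceBytesLeft;

    public DelayedWaveProvider(IWaveProvider source, int delaySeconds)
    {
        this.source = source;
        silenceBytesLeft = (long)delaySeconds * source.WaveFormat.AverageBytesPerSecond;
    }

    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(byte[] buffer, int offset, int count)
    {
        if (silenceBytesLeft > 0)
        {
            int silence = (int)Math.Min(silenceBytesLeft, count);
            Array.Clear(buffer, offset, silence);
            silenceBytesLeft -= silence;
            return silence; // a non-zero count keeps WaveOut playing
        }
        return source.Read(buffer, offset, count);
    }
}
```

Usage would then be something like `waveOut.Init(new DelayedWaveProvider(bufferedWaveIn, 15));` in place of initializing WaveOut with the BufferedWaveProvider directly.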

New Post: Speex Echo Cancellation with NAudio?


Hi,

Did you really get it to work? I tried many times, but I still hear all of the voice with echo, as if no echo canceller were there at all.

The only difference is that I changed libspeex.dll to libspeexdsp.dll, because it couldn't find the function names in libspeex.dll.

It would be nice if you could help me.

Thanks


New Post: How to code a circular buffer to create a delay and use WaveIn for microphone and output to speaker?


Sorry, but I think I did not define the question well.

I am routing WaveIn input to the WaveOut device in real time.

Then I want to listen to 15 seconds of history, so I want to roll back 15 seconds and listen to the history for 15 seconds. (I can lose real-time data while listening to history.)

I think BufferedWaveProvider buffers the desired length of data via its BufferDuration property.

So in this case I only want BufferedWaveProvider to provide a buffer (for the WaveOut device) that is from n seconds of history instead of the latest data.

Is this possible?

New Post: AudioSubType Complement


I thought IMFTransform was used to make conversions.

I found a simple example:

http://code.google.com/p/bitspersampleconv2/wiki/HowToUseResamplerMFT

New Post: How to code a circular buffer to create a delay and use WaveIn for microphone and output to speaker?


You'd need to make your own customised version of BufferedWaveProvider in order to do this.
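One way to sketch such a customisation (the class HistoryWaveProvider is hypothetical, not part of NAudio): keep a circular buffer of the last N seconds of recorded audio and let the read position jump back before playback.

```csharp
using System;
using NAudio.Wave;

// Hypothetical sketch: keeps the most recent 'historySeconds' of audio
// in a circular buffer and can replay from n seconds in the past.
public class HistoryWaveProvider : IWaveProvider
{
    private readonly byte[] ring;
    private long writePos; // total bytes ever written
    private long readPos;  // total bytes ever read

    public HistoryWaveProvider(WaveFormat format, int historySeconds)
    {
        WaveFormat = format;
        ring = new byte[format.AverageBytesPerSecond * historySeconds];
    }

    public WaveFormat WaveFormat { get; private set; }

    // Call from the WaveIn DataAvailable handler
    public void AddSamples(byte[] buffer, int offset, int count)
    {
        for (int i = 0; i < count; i++)
            ring[(writePos + i) % ring.Length] = buffer[offset + i];
        writePos += count;
        if (readPos < writePos - ring.Length)
            readPos = writePos - ring.Length; // reader fell too far behind
    }

    // Jump the play position n seconds into the past
    public void RewindSeconds(int n)
    {
        long target = writePos - (long)n * WaveFormat.AverageBytesPerSecond;
        readPos = Math.Max(Math.Max(target, writePos - ring.Length), 0);
    }

    public int Read(byte[] buffer, int offset, int count)
    {
        int available = (int)Math.Min(count, writePos - readPos);
        for (int i = 0; i < available; i++)
            buffer[offset + i] = ring[(readPos + i) % ring.Length];
        readPos += available;
        // pad with silence so WaveOut keeps playing when the buffer runs dry
        Array.Clear(buffer, offset + available, count - available);
        return count;
    }
}
```

A production version would need locking, since AddSamples and Read run on different threads; this sketch only shows the buffering idea.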

New Post: AudioSubType Complement

Yes, I will be reimplementing the DMO Resampler, but using MFT (it's the same COM object that implements both interfaces). But from what I can see, if you ask the reader for a particular sample rate, it will insert a resampler into the playback pipeline automatically. One advantage of this is that it can automatically flush the resampler buffer when you reposition, which can't be done in the playback device.


On 29 November 2012 12:50, ManuN <notifications@codeplex.com> wrote:

From: ManuN

I thought IMFTransform was used to make conversions.

I found a simple example:

http://code.google.com/p/bitspersampleconv2/wiki/HowToUseResamplerMFT




Source code checked in, #a09115bcae84

adding some new audio format subtypes (thanks ManuN) and reformatting some GUIDs


New Post: AcmNotPossible calling acmStreamOpen


Here is a more detailed description:

OS: Windows 7 Professional 64-bit

NAudio: 1.6.0

Current Code:

WaveFileReader reader = new NAudio.Wave.WaveFileReader("a.wav");
WaveStream downsampledStream = new WaveFormatConversionStream(
    new WaveFormat(8000, reader.WaveFormat.BitsPerSample, reader.WaveFormat.Channels), reader);
WaveStream alawStream = new WaveFormatConversionStream(
    WaveFormat.CreateALawFormat(downsampledStream.WaveFormat.SampleRate,
                                downsampledStream.WaveFormat.Channels), downsampledStream);

 

This fails with an exception at the last line (alawStream).

After the last test failed, I changed the input file.

VLC says: PCM U8 (araw), Mono, 22050 Hz, Bits per Sample: 8

 

When I debug step by step, the three objects reader, downsampledStream and alawStream contain the following information:

reader: WaveFormat    {8 bit PCM: 22kHz 1 channels}    NAudio.Wave.WaveFormat {NAudio.Wave.WaveFormatExtraData}

downsampledStream:  WaveFormat    {8 bit PCM: 8kHz 1 channels} , TotalTime    {00:00:07.9250000}, CurrentTime    {00:00:00}

alawStream: Exception NAudio.MmResult.AcmNotPossible {"AcmNotPossible calling acmStreamOpen"}

   at NAudio.MmException.Try(MmResult result, String function)
   at NAudio.Wave.Compression.AcmStream..ctor(WaveFormat sourceFormat, WaveFormat destFormat)
   at NAudio.Wave.WaveFormatConversionStream..ctor(WaveFormat targetFormat, WaveStream sourceStream)
   at RTPtest.Form1.Form1_Load(Object sender, EventArgs e) in C:\Users\joka\documents\visual studio 2010\Projects\RTPtest\RTPtest\Form1.cs:line 50

New Post: AcmNotPossible calling acmStreamOpen


You should be starting with 16-bit audio, not 8-bit. The a-law encoder converts from 16-bit to 8-bit.
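Applying that advice, the original pipeline could be sketched as below. This assumes the input file ("input16.wav", a placeholder name) is 16-bit PCM, so the intermediate downsampled stream is also 16-bit and the ACM a-law encoder can open it:

```csharp
using NAudio.Wave;

// 16-bit PCM -> 16-bit 8 kHz PCM -> a-law, via ACM
using (var reader = new WaveFileReader("input16.wav"))
using (var downsampled = new WaveFormatConversionStream(
    new WaveFormat(8000, 16, reader.WaveFormat.Channels), reader))
using (var alaw = new WaveFormatConversionStream(
    WaveFormat.CreateALawFormat(downsampled.WaveFormat.SampleRate,
                                downsampled.WaveFormat.Channels), downsampled))
{
    WaveFileWriter.CreateWaveFile("output-alaw.wav", alaw);
}
```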

New Post: AcmNotPossible calling acmStreamOpen


Thanks a lot. That works.

Is there an overview of which conversions are supported? I assume it depends on the codec?

New Post: AcmNotPossible calling acmStreamOpen


Every codec has a list of supported input and output formats. You can look at these using the ACM demo in the NAudio Demo project.
