Channel: NAudio

New Post: How to concatenate two Wave files in memory to play

What is the WaveFormat of the WAV files? The error you are getting suggests that the encoder cannot find a suitable WMA encoding to convert to.
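
One way to check this up front is to probe for a matching media type yourself. A sketch, assuming `reader` is a WaveFileReader opened on one of your sources (SelectMediaType returns null when no suitable encoding exists):

```
var mediaType = MediaFoundationEncoder.SelectMediaType(
    AudioSubtypes.MFAudioFormat_WMAudioV8,
    reader.WaveFormat,
    16000);
if (mediaType == null)
{
    // no WMA encoding is available for this input format / bitrate
}
```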

New Post: developing for win ce 6.0 ??

I'm afraid I don't think it would work. Probably the best thing to try would be to take the NAudio source code and see how much of it you can compile for the Compact Framework.

Commented Unassigned: MediaFoundationReader reads less data on Windows 7 (on Windows 8.1 - OK) [16453]

I'm getting a strange result when using MediaFoundationReader to extract audio on Windows 7 64-bit.

I use MediaFoundationReader to extract the WAVE data to a temp file like this:

```
using (var reader = new MediaFoundationReader(sourceFilename))
{
    WaveFileWriter.CreateWaveFile(copyFilename, reader);
}
```

After that I read the WAVE data back with WaveFileReader:

```
using (var sourceFileReader = new WaveFileReader(sourceFilename))
{
    Console.WriteLine("Length:\t\t{0}", sourceFileReader.Length);
}

using (var copyFileReader = new WaveFileReader(copyFilename))
{
    Console.WriteLine("Length of copy:\t{0}", copyFileReader.Length);
}
```

On Windows 8.1 I get the same data lengths, but on Windows 7 the copy is trimmed at the end.

Is there a bug in NAudio or Media Foundation on Windows 7?

Test project is attached.

![Image](http://auxmic.com/sites/default/files/pictures/naudio_test_win7_x64.png)
![Image](http://auxmic.com/sites/default/files/pictures/naudio_test_win8.1_x64.png)
Comments: Is your source file a WAV? I'm not really sure what you're trying to do here.

New Post: Modify current audio output stream (add effect)

Greetings all,

My goal is to modify the current audio output stream in order to apply audio effects such as echo or reverb. I currently don't have a clue how to achieve this with NAudio, but I think it should be possible.

First I would have to retrieve the current active audio output device, then I'd have to apply an effect. Can anyone show me how this can be done?

Thanks in advance!
Regards

New Post: Understanding DataAvailable event WaveInEventArgs

I just started using NAudio and I have the following code:

```
capture = new WasapiCapture(SelectedDevice);
capture.ShareMode = AudioClientShareMode.Shared;
capture.WaveFormat = new WaveFormat(48000, 32, 2);
SelectedDevice.AudioEndpointVolume.MasterVolumeLevelScalar = 1.0f; // max volume
// subscribe to the events before starting the capture
capture.RecordingStopped += OnRecordingStopped;
capture.DataAvailable += CaptureOnDataAvailable;
capture.StartRecording();
```

and the following function to handle the DataAvailable event:

```
private void CaptureOnDataAvailable(object sender, WaveInEventArgs e)
{
}
```
I want to know what the contents of e.Buffer will be if the bit depth is 32, the sample rate is 48000 Hz, and there are 2 channels.

Is it like this?
  - e.Buffer[0] will contain the least significant byte of channel 1
  - e.Buffer[3] will contain the most significant byte of channel 1
  - e.Buffer[4] will contain the least significant byte of channel 2
  - e.Buffer[7] will contain the most significant byte of channel 2
  - and so on

If not, please explain its format.

New Post: How to concatenate two Wave files in memory to play

Thanks for your time (and for this great library).

The WaveFormat of the WAV files is {16 bit PCM: 44kHz 1 channels}. For the record, I'd like to note the following:
  1. We were able to concatenate both WAV files into a new WAV file, after checking that the sources had the same format
  2. We were able to convert a single WAV file into a new WMA file, using the same MediaType that we're trying to use now
  3. We were able to join WAV files into a new WMA file using a MixingSampleProvider, but this somehow resulted in "cutting" the end of both files before joining them (that's why we're trying a custom provider, as suggested in this thread)
Finally, after posting our question, we carried on testing. We tried using an "implicit" encoder with the single line:

```
MediaFoundationEncoder.EncodeToWma(waveProvider, outputFile, 16000);
```

as well as an "explicit" encoder:

```
MediaType wmaMediaType = MediaFoundationEncoder.SelectMediaType(
    AudioSubtypes.MFAudioFormat_WMAudioV8,
    new WaveFormat(16000, 1),
    16000);

using (MediaFoundationEncoder wmaEncoder = new MediaFoundationEncoder(wmaMediaType))
{
    wmaEncoder.Encode(outputFile, waveProvider);
}
```
The strange result is that, whether explicit or implicit, and whether with the mixing provider or the custom one, if we start the test application and try a join right away, we get that exception. However, if we first convert a single file and then do the join, no exception is thrown.

Both operations - joining two files and saving a single WAV file - share much of the same code.

New Post: Understanding DataAvailable event WaveInEventArgs

Yes, it will be little-endian, with interleaved samples. You can use BitConverter to help read the samples out.

32-bit PCM is a bit unusual; it is more common to use IEEE float (see the WaveFormat.CreateIeeeFloatWaveFormat static method).

In fact, with WASAPI I'd recommend leaving the WaveFormat alone. You can't change the sample rate in shared mode anyway. By default you'll get a WAVEFORMATEXTENSIBLE containing 32-bit floating point (IEEE float), and you can use BitConverter.ToSingle to read the samples out.
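
For example, here is a minimal sketch of reading the samples out in the DataAvailable handler, assuming the default shared-mode mix format of 32-bit IEEE float with 2 channels:

```
private void CaptureOnDataAvailable(object sender, WaveInEventArgs e)
{
    // samples are interleaved (left, right, left, right, ...),
    // each one a little-endian 32-bit float occupying 4 bytes
    for (int offset = 0; offset + 8 <= e.BytesRecorded; offset += 8)
    {
        float left = BitConverter.ToSingle(e.Buffer, offset);
        float right = BitConverter.ToSingle(e.Buffer, offset + 4);
        // process the sample pair here
    }
}
```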

New Post: Creating a mixed file of 4 sounds. How can I achieve that?

Hi, I have tried using SavingWaveProvider and BufferedWaveProvider, as in your post about saving and playing audio. It creates the WAV file, but it contains no audio. Why?

I looked at BufferedWaveProvider, and AddSamples needs to be called on it somewhere. How can I hook this up to the MixingSampleProvider, or to WaveOut? Here is some code:

```
bufferedProviderRec = new BufferedWaveProvider(AudioPlaybackEngine.Instance.Mixer.WaveFormat);
savingRec = new SavingWaveProvider(bufferedProviderRec, "mixedAudio.wav");
outMix = new WaveOut();
outMix.Init(savingRec);
outMix.Play();
AudioPlaybackEngine.Instance.PlaySound(players[0].GetInfos.Path + "/" + players[0].GetInfos.NameWithExtension);
```

Thanks again for being so nice.

New Post: How to concatenate two Wave files in memory to play

Oops, sorry about that, I thought I had shared the link with sufficient permissions! I've now accepted your request in Google Drive.

New Post: How to concatenate two Wave files in memory to play

Double-checking the source code, one difference we can see is the following:
  1. The code for converting a single WAV file into WMA accesses the source WAV through a MediaFoundationReader, which is then fed as an IWaveProvider into the encoding operation
  2. The code for joining WAV files into a single WMA accesses the source WAV files through WaveFileReader, and these readers are then used to build the IWaveProvider that is fed into the encoding operation
Could this be the root cause? We're going to test this right now.

EDIT:

That was it. We substituted every occurrence of WaveFileReader with MediaFoundationReader when working with Media Foundation, and now our custom WaveProvider works great.
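
In sketch form, the change amounts to something like this (the file names are placeholders):

```
// open the source WAVs with MediaFoundationReader rather than WaveFileReader
// before feeding them into the custom concatenating IWaveProvider
using (var first = new MediaFoundationReader("first.wav"))
using (var second = new MediaFoundationReader("second.wav"))
{
    // build the concatenating IWaveProvider from these readers as before,
    // then pass it to the MediaFoundationEncoder
}
```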

Thanks again for your attention

Regards

New Post: How to concatenate two Wave files in memory to play

Hi, I've just had a look at your project. It's because you're not calling MediaFoundationApi.Startup() before using Media Foundation APIs. I think there are some places where I do this automatically for you, but obviously it's not everywhere at the moment.
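
A minimal sketch of the fix, assuming waveProvider and outputFile from the earlier snippets: call Startup() once before any Media Foundation work, and Shutdown() when you're finished.

```
MediaFoundationApi.Startup();
try
{
    // any Media Foundation based work, e.g. the encode from earlier posts
    MediaFoundationEncoder.EncodeToWma(waveProvider, outputFile, 16000);
}
finally
{
    MediaFoundationApi.Shutdown();
}
```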

New Post: Creating a mixed file of 4 sounds. How can I achieve that?

OK, if you are using AudioPlaybackEngine, then SavingWaveProvider needs to be put into the signal chain for that. Look for the call to Init in there, and wrap whatever is being passed in (probably a mixer) in a SavingWaveProvider.
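
A sketch of the idea, assuming AudioPlaybackEngine and SavingWaveProvider are the helper classes from the blog post (the field names here are illustrative):

```
// inside AudioPlaybackEngine's constructor
mixer = new MixingSampleProvider(WaveFormat.CreateIeeeFloatWaveFormat(44100, 2));
mixer.ReadFully = true;

// wrap the mixer so that everything it produces is also written to disk
var saving = new SavingWaveProvider(new SampleToWaveProvider(mixer), "mixedAudio.wav");

outputDevice = new WaveOutEvent();
outputDevice.Init(saving);
outputDevice.Play();
```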

New Post: Bug report: WaveFileReader - Argument out of range exception

So they are WAV files, but you didn't use WaveFileWriter to create them? Is that correct? Can you share a code snippet of creating the files?

New Post: Modify current audio output stream (add effect)

If you want to add effects to audio that you are playing yourself with NAudio, then this is done simply by creating an ISampleProvider that processes the samples as they are played. You would need to create the effect algorithms yourself, or borrow them from elsewhere.

However, if you want to intercept sound being played by other applications, this is not something that the underlying Windows audio infrastructure allows you to do easily, and cannot be done with NAudio.
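
For illustration, here is a minimal sketch of an effect written as an ISampleProvider: a simple one-tap echo. The class and its parameters are my own invention, not part of NAudio:

```
using NAudio.Wave;

class EchoSampleProvider : ISampleProvider
{
    private readonly ISampleProvider source;
    private readonly float[] delayBuffer;
    private int position;

    // delaySamples is a count of float samples, so for interleaved
    // stereo multiply the desired delay in frames by the channel count
    public EchoSampleProvider(ISampleProvider source, int delaySamples)
    {
        this.source = source;
        this.delayBuffer = new float[delaySamples];
    }

    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(float[] buffer, int offset, int count)
    {
        int read = source.Read(buffer, offset, count);
        for (int i = 0; i < read; i++)
        {
            // mix a delayed copy of the signal back in at half volume
            float delayed = delayBuffer[position];
            delayBuffer[position] = buffer[offset + i];
            position = (position + 1) % delayBuffer.Length;
            buffer[offset + i] += 0.5f * delayed;
        }
        return read;
    }
}
```

You would insert it between your source and the output device, e.g. wrap your playback ISampleProvider in an EchoSampleProvider before passing it to Init.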

New Post: Implemented Loopback capturing for NAudio

Yes, use WasapiLoopbackCapture to capture the audio, and WaveFileWriter to save it to disk. Look at the NAudio demo application source code if you want to see examples of how to do this.
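
A minimal sketch of that approach (error handling omitted; the output file name is a placeholder):

```
using NAudio.Wave;

var capture = new WasapiLoopbackCapture();
var writer = new WaveFileWriter("loopback.wav", capture.WaveFormat);

// append each captured buffer to the WAV file as it arrives
capture.DataAvailable += (s, e) => writer.Write(e.Buffer, 0, e.BytesRecorded);
capture.RecordingStopped += (s, e) =>
{
    writer.Dispose();
    capture.Dispose();
};

capture.StartRecording();
// ... and later, when you want to finish:
// capture.StopRecording();
```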

New Post: Modify current audio output stream (add effect)

Windows can do that? Sounds interesting. How?

PS: I'm looking to modify the sound of all running applications, i.e. the output device (speakers).