Channel: NAudio
Viewing all 5831 articles

New Post: Write mp3 file to serial port to play on other side on phone call

Hi Mark,

Waiting for your reply.

Thanks

New Post: Support for G722.2

Hi there,

Just curious: is there native support for the G722.2 codec within NAudio?

New Post: Write mp3 file to serial port to play on other side on phone call

Hi

Please guide me where I am wrong; this code is not working. If this is not possible with this library, or something is missing, please point me in the right direction. I have been scratching my head for the last week and am still stuck at this point.

Regards

Rakesh Kumar

New Post: How to play a wave-file with a custom format?

Freefall wrote:
Hi, you could try this instead of SampleChannel:
SampleProvider = RawSourceWaveStream.ToSampleProvider();
Unfortunately, that is not working.
ToSampleProvider:
public static ISampleProvider ToSampleProvider(this IWaveProvider waveProvider)
{
    return SampleProviderConverters.ConvertWaveProviderIntoSampleProvider(waveProvider);
}
SampleChannel's constructor:
public SampleChannel(IWaveProvider waveProvider, bool forceStereo)
{
    ISampleProvider source = SampleProviderConverters.ConvertWaveProviderIntoSampleProvider(waveProvider);
    ...
Although this seems to be an ACM error, which indicates that the Audio Compression Manager cannot handle your self-created wave format and fails to convert it.
Yes. And when I try to initialize WaveOut without SampleChannel, I get an error:
_waveOut.Init(_waveProvider);
// WaveOut: WaveBadFormat calling waveOutOpen
// DirectSound: (Localized message about non primitive data)
// WasAPI: Not a supported encoding Adpcm

New Post: How to play a wave-file with a custom format?

You will need to use WaveFormatConversionStream.CreatePcmStream to convert to linear PCM before playing. However, you do need to make sure your WaveFormat matches the one ACM is expecting. You can use the NAudio demo application to scrutinize the details of your ACM codecs, or maybe more simply by using AdpcmWaveFormat.
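For anyone landing here later, the suggestion above might look like this in practice. A minimal sketch: the file name and the 8 kHz mono ADPCM format are assumptions, not details from this thread.

```csharp
using System.IO;
using System.Threading;
using NAudio.Wave;

class PcmConversionSketch
{
    static void Main()
    {
        // Wrap headerless ADPCM data in a stream that declares its format.
        // AdpcmWaveFormat(sampleRate, channels) builds a standard MS-ADPCM header.
        using (var raw = new RawSourceWaveStream(
                   File.OpenRead("input.adpcm"),   // hypothetical file
                   new AdpcmWaveFormat(8000, 1)))
        // Ask ACM to convert to linear PCM, which any output device can play.
        using (var pcm = WaveFormatConversionStream.CreatePcmStream(raw))
        using (var waveOut = new WaveOutEvent())
        {
            waveOut.Init(pcm);
            waveOut.Play();
            while (waveOut.PlaybackState == PlaybackState.Playing)
                Thread.Sleep(100);
        }
    }
}
```

The key point from the reply: CreatePcmStream only succeeds when the source WaveFormat exactly matches one an installed ACM codec advertises.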

New Post: How to play a wave-file with a custom format?

markheath wrote:
You will need to use WaveFormatConversionStream.CreatePcmStream to convert to linear PCM before playing. However, you do need to make sure your WaveFormat matches the one ACM is expecting. You can use the NAudio demo application to scrutinize the details of your ACM codecs, or maybe more simply by using AdpcmWaveFormat.
Yeahoo!
WaveFormatConversionStream.CreatePcmStream(waveProvider) is working! Thank you!

P.S. AdpcmWaveFormat was unusable for me because it contains protected fields without public accessors, and some classes that marshal WaveFormat bypass the Serialize method. =\ Instead, I use WaveFormat.FromFormatChunk, which works fine for me. (:
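For reference, WaveFormat.FromFormatChunk parses a RIFF 'fmt ' chunk from a BinaryReader. A hedged sketch: the ADPCM field values below are illustrative assumptions, not taken from the poster's file.

```csharp
using System;
using System.IO;
using NAudio.Wave;

class FromFormatChunkSketch
{
    static void Main()
    {
        // Build an 18-byte 'fmt ' chunk body in memory.
        var ms = new MemoryStream();
        var bw = new BinaryWriter(ms);
        bw.Write((short)2);    // wFormatTag: 2 = MS-ADPCM
        bw.Write((short)1);    // nChannels
        bw.Write(8000);        // nSamplesPerSec
        bw.Write(4096);        // nAvgBytesPerSec (assumed)
        bw.Write((short)256);  // nBlockAlign (assumed)
        bw.Write((short)4);    // wBitsPerSample
        bw.Write((short)0);    // cbSize: no codec-specific extra bytes
        ms.Position = 0;

        // Parse it back into a WaveFormat instance.
        var format = WaveFormat.FromFormatChunk(new BinaryReader(ms), (int)ms.Length);
        Console.WriteLine(format);
    }
}
```

Note that real MS-ADPCM 'fmt ' chunks usually carry 32 extra bytes of coefficient data; cbSize = 0 just keeps the sketch short.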

New Post: Visual Studio C# and USB MIDI Keyboard Alesis Q88...

Hi there,

I have a question about NAudio. Is it possible to catch events/signals from my external MIDI keyboard connected to the PC via a USB cable? I am trying to write my own piano software for my USB MIDI keyboard, but I am stuck right at the beginning: I don't know how to detect and catch signals from the connected keyboard.

Please, any advice or help? Or do you know of any existing sources/tutorials? Maybe someone has done this before?

Greetings.
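In case it helps a later reader: NAudio's MidiIn class (in the NAudio.Midi namespace) raises an event for each incoming message. A minimal sketch, assuming the keyboard is device 0:

```csharp
using System;
using NAudio.Midi;

class MidiMonitorSketch
{
    static void Main()
    {
        // List every MIDI input device Windows knows about.
        for (int i = 0; i < MidiIn.NumberOfDevices; i++)
            Console.WriteLine(i + ": " + MidiIn.DeviceInfo(i).ProductName);

        using (var midiIn = new MidiIn(0)) // first device; adjust as needed
        {
            // NoteOn/NoteOff etc. arrive here as parsed MidiEvent objects.
            midiIn.MessageReceived += (sender, e) =>
                Console.WriteLine(e.MidiEvent);
            midiIn.ErrorReceived += (sender, e) =>
                Console.WriteLine("Error: 0x" + e.RawMessage.ToString("X8"));
            midiIn.Start();

            Console.WriteLine("Listening... press Enter to stop.");
            Console.ReadLine();
            midiIn.Stop();
        }
    }
}
```

MessageReceived fires on a driver callback thread, so a real piano app would marshal back to the UI thread before updating controls.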

New Post: If signal is already stereo, how do I pan left/right?

Is there a description of the steps to perform if my input and expected output are both WaveStreams?

New Post: how to play gsm and dss files using NAudio

Hi guys, can anyone give me sample code for playing .gsm and .dss files?

Thanks in advance.

New Post: WaveStillPlaying calling WaveOutWrite

I often get this error when pausing playback of a file and then resuming, using WaveOutEvent.

I also get these messages in my output window while paused:

WARNING: WaveOutEvent callback event timeout
WARNING: WaveOutEvent callback event timeout
WARNING: WaveOutEvent callback event timeout
WARNING: WaveOutEvent callback event timeout
WARNING: WaveOutEvent callback event timeout
WARNING: WaveOutEvent callback event timeout

Every 300 ms (due to DesiredLatency), at WaveOutEvent.cs line 152.


Any ideas?

New Post: Mute Left Or Right Channel While Playing Audio

Is it possible to mute left or right channel while playing audio by using naudio?
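One common approach (not a built-in NAudio feature, just a sketch): wrap the stereo source in a custom ISampleProvider that zeroes the samples of one channel before passing them on.

```csharp
using NAudio.Wave;

// Mutes the left and/or right channel of an interleaved stereo source.
public class ChannelMuteProvider : ISampleProvider
{
    private readonly ISampleProvider source;
    public bool MuteLeft { get; set; }
    public bool MuteRight { get; set; }

    public ChannelMuteProvider(ISampleProvider source)
    {
        if (source.WaveFormat.Channels != 2)
            throw new System.ArgumentException("Expects a stereo source");
        this.source = source;
    }

    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(float[] buffer, int offset, int count)
    {
        int read = source.Read(buffer, offset, count);
        // Interleaved stereo: even indices = left, odd indices = right.
        for (int n = 0; n < read; n += 2)
        {
            if (MuteLeft) buffer[offset + n] = 0f;
            if (MuteRight && n + 1 < read) buffer[offset + n + 1] = 0f;
        }
        return read;
    }
}
```

Usage would be something like `waveOut.Init(new ChannelMuteProvider(reader) { MuteLeft = true });` where `reader` is any stereo ISampleProvider, such as an AudioFileReader.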

New Post: Information on RawWaveSourceStream?

Hi,

I need to clone a WaveStream object. I've read that RawWaveSourceStream is the way to go, but I can't find the method inside the NAudio namespace, nor any documentation about it online.

I'm using NAudio v1.7 in combination with C# (Visual Studio 2013).

Thanks,

Bastiaan

New Post: Decoding *.spx files for playback

Hello,

I tried to use NSpeex (Link) to play Speex files from disk, which didn't work no matter what I tried (always a band-mode error).

Is there any example of how to use the NAudio NSpeex plugin for file decoding? I'd like to have a WaveStream for this format, but the port seems to only decode packets, not read full files.

Any help would be appreciated.

Cheers.

New Post: Information on RawWaveSourceStream?

Hello,

a WaveStream is a class that NAudio derives from Stream; see:
http://mark-dot-net.blogspot.de/2008/06/naudio-wavestream-in-depth.html

If you want to create a custom WaveStream, inherit from WaveStream and implement the abstract members that Visual Studio generates for you.

A RawSourceWaveStream (I think you misspelled it, which is why you couldn't find it) is just a stream reader for files that contain only wave data. You must tell the RawSourceWaveStream which WaveFormat the data has, and it will read it for you.

Cheers.
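A minimal usage sketch of RawSourceWaveStream (the file name and the 8 kHz/16-bit/mono format are assumptions):

```csharp
using System.IO;
using System.Threading;
using NAudio.Wave;

class RawSourceSketch
{
    static void Main()
    {
        // A headerless file containing nothing but PCM sample data.
        using (var file = File.OpenRead("raw-audio.pcm")) // hypothetical path
        {
            // Tell NAudio what format the raw bytes are in.
            var format = new WaveFormat(8000, 16, 1); // rate, bits, channels
            using (var reader = new RawSourceWaveStream(file, format))
            using (var waveOut = new WaveOutEvent())
            {
                waveOut.Init(reader);
                waveOut.Play();
                while (waveOut.PlaybackState == PlaybackState.Playing)
                    Thread.Sleep(100);
            }
        }
    }
}
```

Because the result is a WaveStream, it can also be passed to WaveFormatConversionStream or WaveFileWriter instead of being played directly.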

New Post: System.IndexOutOfRangeException using WaveFileWriter

Hi. I'm struggling with the exception below when using WaveFileWriter. Code snippet:
MemoryStream s = new MemoryStream();
MemoryStream newStream = new MemoryStream();

int length = 0;
byte[] buffer = null;
int read = 0;   

MixingSampleProvider mixer2 = new MixingSampleProvider(_samples);
SampleToWaveProvider16 mixer3 = new SampleToWaveProvider16(mixer2);

length = mixer3.WaveFormat.AverageBytesPerSecond*Convert.ToInt32(noisePosition.TotalSeconds);
buffer = new byte[length];

WaveFileWriter waveFileWriter = new WaveFileWriter(new IgnoreDisposeStream(s), mixer3.WaveFormat);

while ((read = mixer3.Read(buffer, 0, buffer.Length)) > 0)
{
    waveFileWriter.Write(buffer, 0, read);
}

waveFileWriter.Flush();
waveFileWriter.Close();
waveFileWriter.Dispose();   

s.WriteTo(newStream);
Here are the details of the Exception:

System.IndexOutOfRangeException was caught
  HResult=-2146233080
  Message=Index was outside the bounds of the array.
  Source=NAudio
  StackTrace:
       at NAudio.Wave.SampleProviders.Pcm16BitToSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
       at NAudio.Wave.SampleProviders.OffsetSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
       at NAudio.Wave.SampleProviders.MixingSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
       at NAudio.Wave.SampleProviders.OffsetSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
       at NAudio.Wave.SampleProviders.MixingSampleProvider.Read(Single[] buffer, Int32 offset, Int32 count)
       at NAudio.Wave.SampleProviders.SampleToWaveProvider16.Read(Byte[] destBuffer, Int32 offset, Int32 numBytes)
       at GamedayRadio.HalfInning.Process() in xxxxxx
  InnerException: 
Thanks in advance for any help/ideas/pointers/direction you can provide.

New Post: Record from 2 microphones

I have a laptop with a built-in mic and a webcam with a mic, and I want to capture sound from both devices. But when I try to do this I get an error:

already allocated calling waveInOpen

NAudio found 3 devices, and I wrote the following:
    NAudio.Wave.WaveIn sourceStream = null;
    NAudio.Wave.WaveIn sourceStream2 = null;
    NAudio.Wave.WaveIn sourceStream3 = null;
    NAudio.Wave.DirectSoundOut waveOut = null;
    NAudio.Wave.DirectSoundOut waveOut2 = null;
    NAudio.Wave.DirectSoundOut waveOut3 = null;
    NAudio.Wave.WaveFileWriter waveWriter = null;
    NAudio.Wave.WaveFileWriter waveWriter2 = null;
    NAudio.Wave.WaveFileWriter waveWriter3 = null;

    private void button5_Click(object sender, EventArgs e)
    {
        for (int i = 0; i < NAudio.Wave.WaveIn.DeviceCount; i++)
        {
            if (sourceStream == null)
            {
                sourceStream = new NAudio.Wave.WaveIn();
                sourceStream.DeviceNumber = i;
                sourceStream.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(i).Channels);

                sourceStream.DataAvailable += new EventHandler<NAudio.Wave.WaveInEventArgs>(sourceStream_DataAvailable);
                waveWriter = new NAudio.Wave.WaveFileWriter(String.Concat("D:/record-", i + 1, ".wav"), sourceStream.WaveFormat);

                sourceStream.StartRecording();
                continue;
            }

            if (sourceStream2 == null)
            {
                sourceStream2 = new NAudio.Wave.WaveIn();
                sourceStream2.DeviceNumber = i;
                sourceStream2.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(i).Channels);

                sourceStream2.DataAvailable += new EventHandler<NAudio.Wave.WaveInEventArgs>(sourceStream2_DataAvailable);
                waveWriter2 = new NAudio.Wave.WaveFileWriter(String.Concat("D:/record-", i + 1, ".wav"), sourceStream2.WaveFormat);



                sourceStream2.StartRecording();
                continue;
            }

            if (sourceStream3 == null)
            {
                sourceStream3 = new NAudio.Wave.WaveIn();
                sourceStream3.DeviceNumber = i;
                sourceStream3.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(i).Channels);

                sourceStream3.DataAvailable += new EventHandler<NAudio.Wave.WaveInEventArgs>(sourceStream_DataAvailable);
                waveWriter3 = new NAudio.Wave.WaveFileWriter(String.Concat("D:/record-", i + 1, ".wav"), sourceStream3.WaveFormat);

                sourceStream3.StartRecording();
                continue;
            }
        }
    }



The error occurs at the line: sourceStream2.StartRecording();

New Post: Source code commit for latest release of NAudio

$
0
0
What commit level should I use for downloading the latest "release" of NAudio? We build the NAudio DLL's in our project and would like to make sure we have the latest "release" source code.

Thanks
Paul

New Post: Support for G722.2

Unfortunately, there may be some differences between G722.2 and G722, but I'm looking into the library to determine the best course of action.

Paul

New Post: Unity with Naudio RadioStreaming

I'm working on a script, based on the NAudio demo script, modified to stream a Shoutcast station inside my Unity game.

I have tried to remove the original while loop, using Update from the MonoBehaviour class instead, but I only get some noise, not music, while streaming with this script.

I usually get an error during execution, related to the format:

MmException: AcmNotPossible calling acmStreamConvert
  NAudio.MmException.Try (MmResult result, System.String function)
  NAudio.Wave.Compression.AcmStreamHeader.Convert (Int32 bytesToConvert, System.Int32& sourceBytesConverted)
  NAudio.Wave.Compression.AcmStream.Convert (Int32 bytesToConvert, System.Int32& sourceBytesConverted)
  NAudio.Wave.AcmMp3FrameDecompressor.DecompressFrame (NAudio.Wave.Mp3Frame frame, System.Byte[] dest, Int32 destOffset)
I have tried different online radios, but I always get that error. I don't know what is happening... Any help?

public class NAudioStreamer : MonoBehaviour {
private IWavePlayer mWaveOutDevice;


private WaveStream  mMainOutputStream;
private WaveChannel32 mVolumeStream;
private VolumeWaveProvider16 volumeProvider;



private string m_Url = "http://37.59.32.115:8122/";


enum StreamingPlaybackState
{
    Stopped,
    Playing,
    Buffering,
    Paused
}

private volatile StreamingPlaybackState playbackState = StreamingPlaybackState.Stopped;


private bool fullyDownloaded = false;
public bool m_Play = false;
float timer;

void Update()
{
    if (m_Play)
    {
        playbackState = StreamingPlaybackState.Buffering;
        StreamMP3(m_Url);
        m_Play = false;
    }

    switch (playbackState)
    {
        case StreamingPlaybackState.Buffering:
        case StreamingPlaybackState.Playing:
            StreamMP3(m_Url);
            break;

        default:        
            break;
    }

}

HttpWebRequest webRequest;
BufferedWaveProvider bufferedWaveProvider = null;
byte[] buffer = new byte[16384 * 4];


private void StreamMP3(string lUrl)
{
    this.fullyDownloaded = false;
    webRequest = (HttpWebRequest)WebRequest.Create(lUrl);

    int metaInt = 0; // blocksize of mp3 data

    webRequest.Headers.Clear();
    webRequest.Headers.Add("GET", "/ HTTP/1.0");

    webRequest.Headers.Add("Icy-MetaData", "1");
    webRequest.UserAgent = "WinampMPEG/5.09";

    HttpWebResponse resp = null;
    try
    {
        resp = (HttpWebResponse)webRequest.GetResponse();
    }
    catch(WebException e)
    {
        if (e.Status != WebExceptionStatus.RequestCanceled)
        {
            Debug.LogError(e.Message);
        }
        return;
    }
     // needs to be big enough to hold a decompressed frame

    try
    {
        // read blocksize to find metadata block
        metaInt = Convert.ToInt32(resp.GetResponseHeader("icy-metaint"));

    }
    catch
    {
    }

    IMp3FrameDecompressor decompressor = null;

    try
    {
        using (var responseStream = resp.GetResponseStream())
        {

            ReadFullyStream readFullyStream = new ReadFullyStream(responseStream);
            //do
            {
                if (bufferedWaveProvider != null && bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes < bufferedWaveProvider.WaveFormat.AverageBytesPerSecond / 4)
                {
                    Debug.LogError("Buffer getting full, taking a break");
                    Thread.Sleep(500);
                }
                else
                {
                    Mp3Frame frame = null;
                    try
                    {

                        frame = Mp3Frame.LoadFromStream(readFullyStream, true);



                    }
                    catch (EndOfStreamException)
                    {
                        this.fullyDownloaded = true;
                        Debug.LogError("reached the end of the MP3 file / stream");
                        // reached the end of the MP3 file / stream
// break;
                    }
                    catch (WebException)
                    {
                        // probably we have aborted download from the GUI thread
// break;
                    }
                    if (decompressor == null && frame != null)
                    {
                        // don't think these details matter too much - just help ACM select the right codec
                        // however, the buffered provider doesn't know what sample rate it is working at
                        // until we have a frame
                        WaveFormat waveFormat = new Mp3WaveFormat(frame.SampleRate, frame.ChannelMode == ChannelMode.Mono ? 1 : 2, frame.FrameLength, frame.BitRate);
                        decompressor = new AcmMp3FrameDecompressor(waveFormat);
                        if(bufferedWaveProvider == null)
                        {
                            this.bufferedWaveProvider = new BufferedWaveProvider(decompressor.OutputFormat);
                            this.bufferedWaveProvider.BufferDuration = TimeSpan.FromSeconds(20); // allow us to get well ahead of ourselves
                        }

                    }

                    int decompressed =  decompressor.DecompressFrame(frame, buffer, 0);

                    if(bufferedWaveProvider != null)
                    {
                        bufferedWaveProvider.AddSamples(buffer, 0, decompressed);

                    }
                }
            } 

            if (this.mWaveOutDevice == null && this.bufferedWaveProvider != null)
            {
                Debug.Log("Creating WaveOut Device");
                this.mWaveOutDevice = new  WaveOut();
                this.volumeProvider = new VolumeWaveProvider16(bufferedWaveProvider);
                this.volumeProvider.Volume = 100.0f;
                mWaveOutDevice.Init(volumeProvider);

            }
            else if (bufferedWaveProvider != null)
            {
                double bufferedSeconds = bufferedWaveProvider.BufferedDuration.TotalSeconds;
                if(bufferedSeconds > 0.2f && playbackState == StreamingPlaybackState.Buffering)
                {
                    Debug.Log("PLaying music...");
                    mWaveOutDevice.Play();
                    playbackState = StreamingPlaybackState.Playing;
                }

            }
        }
    }
    finally
    {
        if (decompressor != null)
        {
            decompressor.Dispose();
        }
    }
}
}