For the moment pcbSize = 18 (standard type) or 40 (with extensible format)
Change: add Reader.Dispose, otherwise it plays the same song ;-)

private bool TryOpenInputFile(string file)
{
    bool isValid = false;
    try
    {
        // Add: dispose the previous reader first
        if (reader != null) { reader.Dispose(); reader = null; }

        // change to reader2
        using (var reader2 = new MediaFoundationReader(file))
        {
            DefaultDecompressionFormat = reader2.WaveFormat.ToString();
            inputFile = file;
            isValid = true;
        }
    }
    catch (Exception e)
    {
        MessageBox.Show(String.Format("Not a supported input file ({0})", e.Message));
    }
    return isValid;
}
In MediaFoundationReader.cs (CreateReader...):

//var uri = new Uri(file); // Obsolete
//var uri = new Uri(file, true);
Uri nUrl = null;
string str;
if (Uri.TryCreate(file, UriKind.Absolute, out nUrl))
{
    str = nUrl.AbsoluteUri;
}
else
{
    str = file; // fall back to the raw path; nUrl is null when TryCreate fails
}
MediaFoundationInterop.MFCreateSourceReaderFromURL(str, null, out reader);
Hello,
When I use the Stop() method, it pauses the MP3 but doesn't actually stop it.
I need a way to stop the song so that when I use the Play() method it starts from the beginning of the MP3.
I have implemented this MP3 reader:
http://naudio.codeplex.com/wikipage?title=MP3
But I have not implemented the last part, CloseWaveOut(); I don't know if it is necessary.
thanks a lot.
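For reference, a common pattern with the reader from that wiki page (a sketch; `waveOutDevice` and `mp3Reader` are assumed to be the fields from the wiki example): Stop() halts playback but does not rewind the underlying stream, so you rewind it yourself before the next Play().

```csharp
// Sketch only: assumes the waveOutDevice / mp3Reader fields from the wiki example.
// Stop() halts output; resetting Position makes the next Play() start from the top.
void StopAndRewind()
{
    waveOutDevice.Stop();
    mp3Reader.Position = 0;   // rewind to the beginning of the MP3
}
```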
Hi,
I was always getting a null value for InputFileFormats and an error like this:
Value cannot be null. Parameter name: source
[ImportMany(typeof(IInputFileFormatPlugin))]
public IEnumerable<IInputFileFormatPlugin> InputFileFormats { get; set; }
private IInputFileFormatPlugin GetPluginForFile(string fileName)
{
return (from f in this.InputFileFormats where fileName.EndsWith(f.Extension, StringComparison.OrdinalIgnoreCase) select f).FirstOrDefault();
}
I was using AudioFileReader...
Is there any difference between using WaveStream and AudioFileReader?
Thanks & Regards,
Hinshin
Background
I have a reasonably complex application where an NAudio WaveIn stream is started and seems to run correctly. The input is two channels at 44,100 Hz. But if additional windows in the application are opened, the application falls over with an "Access violation on location 0x00000000". These additional windows have nothing to do with the NAudio code. The OS is Windows 7.
On investigation it appears that the problem only occurs if the waveIn BufferMilliseconds value has been increased to 482 milliseconds or greater, giving a buffer over 85,000 bytes. It is possible the crash may be connected to the garbage collector coming in soon after the additional, and memory hungry, application windows open.
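The ~85,000-byte figure is consistent with the stated format: for two channels at 44,100 Hz, assuming 16-bit PCM capture, a 482 ms buffer works out to just over 85 KB.

```csharp
using System;

class BufferSizeCheck
{
    static void Main()
    {
        int sampleRate = 44100;       // per channel
        int channels = 2;
        int bytesPerSample = 2;       // assumption: 16-bit PCM capture
        int bufferMilliseconds = 482; // the threshold reported above

        int bufferBytes = sampleRate * channels * bytesPerSample * bufferMilliseconds / 1000;
        Console.WriteLine(bufferBytes); // 85024
    }
}
```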
Possible Cause
Suspecting a memory issue I changed the following code in the NAudio file WaveInBuffer.cs so the 'header' value is now 'pinned'. This seems to fix the problem.
//hHeader = GCHandle.Alloc(header);
hHeader = GCHandle.Alloc(header, GCHandleType.Pinned);
I am not certain of the exact difference between GCHandleType.Normal and GCHandleType.Pinned, but it may be worthwhile making this change to NAudio.
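For what it's worth, the difference is that a Normal handle only keeps the object alive, while a Pinned handle also prevents the GC from relocating it during heap compaction. Native code that holds the buffer's address across calls (as waveIn does with the header) needs the latter. A minimal sketch in plain .NET, no NAudio types:

```csharp
using System;
using System.Runtime.InteropServices;

class PinDemo
{
    static void Main()
    {
        var header = new byte[64]; // stand-in for a structure passed to unmanaged code

        // Normal: prevents collection, but the GC may still relocate the object,
        // so a native pointer taken now can dangle after a compaction.
        // Pinned: prevents both collection and relocation, which is what
        // unmanaged code holding the address across calls requires.
        GCHandle hHeader = GCHandle.Alloc(header, GCHandleType.Pinned);
        try
        {
            // AddrOfPinnedObject is only valid on pinned handles; on a Normal
            // handle it throws, which is exactly why the header must be pinned.
            IntPtr stableAddress = hHeader.AddrOfPinnedObject();
            Console.WriteLine(stableAddress != IntPtr.Zero);
        }
        finally
        {
            hHeader.Free(); // release pins promptly; they fragment the GC heap
        }
    }
}
```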
Kind Regards
John C
I stumbled upon an encoding type not listed in WaveFormatEncoding: 0x1610 (from an .m4a file).
So I added a few types to WaveFormatEncoding:
I'll see if I can add HEAACWAVEFORMAT (if necessary)
/// <summary>
/// Advanced Audio Coding (AAC) audio in Audio Data Transport Stream (ADTS) format.
/// The format block is a WAVEFORMATEX structure with wFormatTag equal to WAVE_FORMAT_MPEG_ADTS_AAC.
/// </summary>
/// <remarks>
/// The WAVEFORMATEX structure specifies the core AAC-LC sample rate and number of channels,
/// prior to applying spectral band replication (SBR) or parametric stereo (PS) tools, if present.
/// No additional data is required after the WAVEFORMATEX structure.
/// </remarks>
/// <see>http://msdn.microsoft.com/en-us/library/dd317599%28VS.85%29.aspx</see>
MPEG_ADTS_AAC = 0x1600,
/// <summary></summary>
/// <remarks>Source wmCodec.h</remarks>
MPEG_RAW_AAC = 0x1601,
/// <summary>
/// MPEG-4 audio transport stream with a synchronization layer (LOAS) and a multiplex layer (LATM).
/// The format block is a WAVEFORMATEX structure with wFormatTag equal to WAVE_FORMAT_MPEG_LOAS.
/// </summary>
/// <remarks>
/// The WAVEFORMATEX structure specifies the core AAC-LC sample rate and number of channels,
/// prior to applying spectral SBR or PS tools, if present.
/// No additional data is required after the WAVEFORMATEX structure.
/// </remarks>
/// <see>http://msdn.microsoft.com/en-us/library/dd317599%28VS.85%29.aspx</see>
MPEG_LOAS = 0x1602,
/// <summary></summary>
/// <remarks>Source wmCodec.h</remarks>
NOKIA_MPEG_ADTS_AAC = 0x1608,
/// <summary></summary>
/// <remarks>Source wmCodec.h</remarks>
NOKIA_MPEG_RAW_AAC = 0x1609,
/// <summary></summary>
/// <remarks>Source wmCodec.h</remarks>
VODAFONE_MPEG_ADTS_AAC = 0x160A,
/// <summary></summary>
/// <remarks>Source wmCodec.h</remarks>
VODAFONE_MPEG_RAW_AAC = 0x160B,
/// <summary>
/// High-Efficiency Advanced Audio Coding (HE-AAC) stream.
/// The format block is an HEAACWAVEFORMAT structure.
/// </summary>
/// <see>http://msdn.microsoft.com/en-us/library/dd317599%28VS.85%29.aspx</see>
MPEG_HEAAC = 0x1610,
Hello everyone,
I am a new user of NAudio and I am trying to capture audio from my webcam using WaveIn, which is working fine. But I can't figure out how to achieve simultaneous playback of this captured audio.
Could you please help me to do this? What interface should I use here?
Thanks.
Hey guys I got the answer. We just need to use WaveOut object with the help of WaveInProvider! It was very simple... :-)
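For anyone finding this later, a minimal sketch of that wiring (default devices assumed; in a console app a callback-free player such as WaveOutEvent may be needed, since WaveOut's default callbacks expect a GUI message pump):

```csharp
using System;
using NAudio.Wave;

class Passthrough
{
    static void Main()
    {
        var waveIn = new WaveIn();                       // default capture device
        var waveInProvider = new WaveInProvider(waveIn); // exposes captured audio as IWaveProvider
        var waveOut = new WaveOut();
        waveOut.Init(waveInProvider);

        waveIn.StartRecording();
        waveOut.Play();                                  // plays whatever is being captured

        Console.ReadLine();                              // run until Enter is pressed
        waveOut.Stop();
        waveIn.StopRecording();
    }
}
```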
Hi,
Yes, I am hearing audio, but it is not hitting the OnPostVolumeMeter event.
private IInputFileFormatPlugin GetPluginForFile(string fileName)
{
return (from f in this.InputFileFormats where fileName.EndsWith(f.Extension, StringComparison.OrdinalIgnoreCase) select f).FirstOrDefault();
}
It returns a null value for me...
Hi,
First of all, I just want to say that I've been researching this for a couple of days without any luck. I've browsed and searched through the discussion board without finding an answer.
I was wondering if someone could explain the function "AcmMp3FrameDecompressor.DecompressFrame" for me, what does it actually do?
I have an MP3 stream coming in and I do as follows:
using (var ms = new MemoryStream(streamBuffer))
{
    var frame = Mp3Frame.LoadFromStream(ms);
    var f = new WaveFormat();
    var format = new Mp3WaveFormat(frame.SampleRate,
        frame.ChannelMode == ChannelMode.Mono ? 1 : 2, frame.FrameLength, frame.BitRate);
    if (decompress == null)
        decompress = new AcmMp3FrameDecompressor(f);
    decompressed = decompress.DecompressFrame(frame, buffer, 0);
}
What does DecompressFrame actually give me? I'm after a way to convert the incoming MP3 stream into, for example, A-law format. Is this the correct way? I understand that in order to actually get A-law I have to later convert it using WaveFormat.CreateALawFormat, but first I need to understand what I get from the function mentioned above.
Hello guys,
Now there is one more issue. I wanted to enable/disable audio playback at run time, but without stopping the audio capture. I am stopping the playback using the WaveOut.Stop() method, but after a few seconds my application throws an InvalidOperationException (with the message "Buffer full"). I went through the code and observed that there is no way to stop the WaveInProvider; it goes on adding captured audio samples to the BufferedWaveProvider because WaveIn is still capturing. And once the buffer is full, BufferedWaveProvider throws InvalidOperationException.
Will you please suggest some possible solution for solving this issue?
I was thinking of adding Start and Stop methods to the IWaveProvider interface. Will it be fine?
don't use GetPluginForFile at all. Just make an AudioFileReader. That is all you need to do.
this.fileWaveStream = new AudioFileReader(fileName);
DecompressFrame turns MP3 into PCM. You cannot go directly from MP3 to A-law; you must go to PCM first. Also, A-law is almost always 8 kHz mono, so you would need to resample as well. And are you really sure you want to convert it to A-law? Unless you are integrating with some antiquated telephony hardware, I can think of no reason to want to do this. Any music you use this on will sound horrible.
I have written a detailed article on CodeProject about how to convert between any audio formats which you can access here.
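As a sketch of that PCM-first chain (not the article's exact code; the single-step resample to 8 kHz mono is an assumption, and the ACM resampler may require intermediate conversion steps in practice):

```csharp
using NAudio.Wave;

// Sketch: MP3 -> PCM -> 8 kHz mono PCM -> A-law, via ACM conversions.
class Mp3ToALaw
{
    static void Convert(string mp3Path, string outPath)
    {
        using (var mp3 = new Mp3FileReader(mp3Path))   // DecompressFrame happens in here: MP3 frames -> PCM
        using (var pcm = WaveFormatConversionStream.CreatePcmStream(mp3))
        using (var pcm8k = new WaveFormatConversionStream(new WaveFormat(8000, 16, 1), pcm))
        using (var alaw = new WaveFormatConversionStream(WaveFormat.CreateALawFormat(8000, 1), pcm8k))
        using (var writer = new WaveFileWriter(outPath, alaw.WaveFormat))
        {
            var buffer = new byte[alaw.WaveFormat.AverageBytesPerSecond];
            int read;
            while ((read = alaw.Read(buffer, 0, buffer.Length)) > 0)
            {
                writer.Write(buffer, 0, read);
            }
        }
    }
}
```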
Well you can either stop putting audio into your buffer while you are stopped, or simply set your BufferedWaveProvider up to discard on overflow (set the DiscardOnBufferOverflow property to true).
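For reference, the second option is a one-line change where the BufferedWaveProvider is created (`waveIn` here stands in for whatever capture device feeds the provider):

```csharp
// Assumption: waveIn is the WaveIn instance feeding the provider.
var bufferedProvider = new BufferedWaveProvider(waveIn.WaveFormat)
{
    DiscardOnBufferOverflow = true  // silently drop new audio when the buffer is full,
                                    // instead of throwing InvalidOperationException
};
```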
Thanks. WAVEFORMAT encodings will matter a lot less in the future since Media Foundation uses MediaTypes rather than WaveFormat for everything, but it is still handy to have a good guide to what the possible options are.
Mark
Hi,
My exact code is,
Play button click:
reader = new AudioFileReader(fileName);
player.Init(reader);
player.Play();
ISampleProvider sampleProvider = null;
sampleProvider = CreateInputStream(fileName);
return;
private ISampleProvider CreateInputStream(string fileName)
{
this.fileWaveStream = new AudioFileReader(fileName);
var waveChannel = new SampleChannel(this.fileWaveStream, true);
var postVolumeMeter = new MeteringSampleProvider(waveChannel);
postVolumeMeter.StreamVolume += OnPostVolumeMeter;
return postVolumeMeter;
}
void OnPostVolumeMeter(object sender, StreamVolumeEventArgs e)
{
volumeMeter1.Amplitude = e.MaxSampleValues[0];
volumeMeter2.Amplitude = e.MaxSampleValues[1];
}
But it is not hitting the OnPostVolumeMeter event... Please correct my mistake.
Thanks & Regards,
Hinshin
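A likely cause, judging from the snippet above: `player.Init(reader)` plays the plain AudioFileReader, while the metering chain built by CreateInputStream is never connected to the player, so its StreamVolume event never fires. A sketch of the handler with the meter in the playback path (SampleToWaveProvider used as the adapter; adjust to your NAudio version):

```csharp
// Sketch: feed the metered chain (not the raw reader) to the player.
ISampleProvider sampleProvider = CreateInputStream(fileName); // AudioFileReader -> SampleChannel -> MeteringSampleProvider
player.Init(new SampleToWaveProvider(sampleProvider));        // the meter now sits in the playback path
player.Play();                                                // StreamVolume fires as audio renders
```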