MS WaveIn/Out API Synchronization Algorithm, a waveboard clock problem/discussion
cox
post Jan 1 2004, 14:28
Post #1





Group: Members
Posts: 1
Joined: 1-January 04
From: Rio de Janeiro
Member No.: 10850



Hi,

I am implementing software to play audio data on different computers. The data is captured on one computer and sent over the network to another computer for playback.

The code is generic with respect to audio codecs: you initialize the class with parameters that select a codec, or none. I have already implemented Speex, A-law and µ-law codecs.

It works fine, but I have one problem. When I start playing, in some situations the delay on the playing computer keeps growing, and I have to keep buffering so I do not lose data.

I was reading some articles and realized there is a known problem of dissimilar sound card clocks: no two cards run at exactly the same rate.

I guess that to guarantee the playback rate I will need to drop or duplicate samples in my lpData buffer.

If that is correct, how can I compute this number? I know that my buffer represents one time period. Do I measure the real time the board takes to play the audio data once and then use that reference for the other packets/buffers, or do I have to compute this correction for every packet/buffer? Or am I wrong and this is not a solution at all?

Any help is very welcome! ;)

Sorry for my English mistakes; if you do not understand any part, tell me and I will try to write it again.

Thanks

cox


--------------------
.cox

-- Guilherme Cox
Jasper
post Jan 2 2004, 12:30
Post #2





Group: Members
Posts: 189
Joined: 9-July 02
Member No.: 2536



You could indeed try compensating by measuring how long it really takes to play your buffers. But since you are getting your sound from a live feed, you might be better off letting the recording side pace you: just make sure your playback buffer never grows too large or shrinks too small. That way all computers will play more or less in sync with the recording computer.
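In code, that buffer-watching idea might look like this (a sketch with made-up names; `queued_samples` would be however much audio is currently waiting in your playback queue):

```c
/* Keep the playback queue between two watermarks: return how many
   samples to drop (positive) or pad/duplicate (negative) so the queue
   drifts back into the comfort zone. Illustrative names only. */
static long queue_correction(long queued_samples,
                             long low_watermark,
                             long high_watermark)
{
    if (queued_samples > high_watermark)
        return queued_samples - high_watermark;  /* drop the excess */
    if (queued_samples < low_watermark)
        return queued_samples - low_watermark;   /* negative: pad */
    return 0;                                    /* inside the zone */
}
```

With watermarks of, say, 4000 and 8000 samples, a queue of 12000 samples asks you to drop 4000, a queue of 2000 asks you to pad 2000, and anything in between is left alone.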
wkwai
post Jan 3 2004, 15:12
Post #3


MPEG4 AAC developer


Group: Developer
Posts: 398
Joined: 1-June 03
Member No.: 6943



I am not very sure that playback rates really differ between various sound cards; you will have to measure that!

Anyway, why don't you use the DirectSound APIs? The buffering is taken care of by the library. It is a lot simpler than having to construct two waveOut buffers and swap them.

There are a lot of ways of sending audio data to the sound card, but the DirectSound API is the easiest.
Jasper
post Jan 4 2004, 11:34
Post #4





Group: Members
Posts: 189
Joined: 9-July 02
Member No.: 2536



Of course it is a matter of taste, but I wouldn't want to call DirectSound the easiest way to send audio data to a sound card. It's actually quite easy to work with waveOut by simply using a queue of buffers. With DirectSound you have to set up events or poll the driver for its current read position. And don't forget PortAudio (http://www.portaudio.com/) either; it's not too difficult to work with and even offers a wrapper for blocking reads/writes.
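The queue-of-buffers pattern is easy to model: keep a small ring of buffers, hand each to the driver with waveOutWrite, and whenever the driver marks one WHDR_DONE, refill and resubmit it. Here is a platform-free sketch of just the bookkeeping (the "driver" is simulated and all names are illustrative):

```c
#define NBUF 4  /* small ring of playback buffers */

struct buf { int in_use; int seq; };

static struct buf ring[NBUF];
static int next_seq = 0;

/* Submit every free buffer to the (here: simulated) driver, as a loop
   of waveOutWrite calls would. Returns how many were submitted. */
static int fill_free_buffers(void)
{
    int submitted = 0;
    for (int i = 0; i < NBUF; i++) {
        if (!ring[i].in_use) {
            ring[i].in_use = 1;        /* now owned by the driver */
            ring[i].seq = next_seq++;  /* stamp with stream position */
            submitted++;
        }
    }
    return submitted;
}

/* In the real API this is the WHDR_DONE notification: the driver has
   finished buffer i, so the slot is free to refill. */
static void driver_done(int i) { ring[i].in_use = 0; }
```

The main loop then just alternates: wait for a done notification, refill the freed slot, and resubmit, so the driver always has up to NBUF buffers queued and playback never starves.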