Streaming FLAC over the Internet, Split from "The Future of FLAC"
funkyblue
post Nov 24 2012, 01:10
Post #26





Group: Members
Posts: 322
Joined: 28-November 01
From: South Australia
Member No.: 555



I realise that, but the link posted a bunch of cryptic commands instead of a simple guide, and didn't mention the fact that dBpoweramp has an uncompress mode built in smile.gif
73ChargerFan
post Nov 24 2012, 02:02
Post #27





Group: Members
Posts: 38
Joined: 19-December 06
Member No.: 38813



Seeking - store a seek table in the header, say one entry for every half second. When a user seeks backwards, the player jumps to the half-second mark just before that position and then processes forward. With caching this could work well, and the size of the table would be small in relation to the file size. And if it is a piece of metadata, then tracks with the table wouldn't break older FLAC players.
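For what it's worth, the FLAC format already defines a SEEKTABLE metadata block along these lines (seek points made of a sample number, a byte offset and the sample count of the target frame). A minimal Python sketch of the half-second idea, with made-up function names and assuming the per-frame sample numbers and byte offsets are already known:

CODE
# Sketch of a half-second seek table (illustrative only; not the FLAC API).
SAMPLE_RATE = 44100
SEEK_INTERVAL = SAMPLE_RATE // 2      # one entry per half second

def build_seek_table(frames):
    """frames: (first_sample, byte_offset) tuples in stream order."""
    table, next_target = [], 0
    for first_sample, byte_offset in frames:
        if first_sample >= next_target:
            table.append((first_sample, byte_offset))
            next_target += SEEK_INTERVAL
    return table

def seek(table, target_sample):
    """Last seek point at or before target_sample; decode forward from there."""
    best = table[0]
    for first_sample, byte_offset in table:
        if first_sample > target_sample:
            break
        best = (first_sample, byte_offset)
    return best

For a five-minute track that is only around 600 entries, so the table really is tiny compared with the audio data.
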
Porcus
post Nov 24 2012, 02:20
Post #28





Group: Members
Posts: 1936
Joined: 30-November 06
Member No.: 38207



QUOTE (andrewfg @ Nov 23 2012, 23:49) *
Nevertheless information theory has nothing to say about algorithms that can losslessly compress a data stream at a predetermined number of bytes per second.


I wonder what you mean by «a predetermined number» ...


--------------------
One day in the Year of the Fox came a time remembered well
Soap
post Nov 24 2012, 03:44
Post #29





Group: Members
Posts: 1016
Joined: 19-November 06
Member No.: 37767



QUOTE (andrewfg @ Nov 23 2012, 18:49) *
.... But, statistically this algorithm will center its statistical Gaussian peak on the selected target CBR BPS. And the longer the file, the closer it will get to the target CBR BPS...

If you want to submit some math to prove me wrong, please feel free to do so.


Congrats, you just described an ABR procedure. Nothing about that will provide a Constant Bit Rate unless you pad.

You may know math, but you demonstrably don't know the terminology.


--------------------
Creature of habit.
greynol
post Nov 24 2012, 03:51
Post #30





Group: Super Moderator
Posts: 10079
Joined: 1-April 04
From: San Francisco
Member No.: 13167



You pad and your compression goes out the window. Provide the math to prove me wrong. wink.gif


--------------------
Your eyes cannot hear.
saratoga
post Nov 24 2012, 03:55
Post #31





Group: Members
Posts: 5052
Joined: 2-September 02
Member No.: 3264



I guess in principle you'd have to pad to at least as high a bitrate as your uncompressed PCM, since you can't make a format that compresses every possible stream smaller than the source. Of course, if you're OK with hard-to-encode samples skipping, you could still do fairly well.
greynol
post Nov 24 2012, 04:00
Post #32





Group: Super Moderator
Posts: 10079
Joined: 1-April 04
From: San Francisco
Member No.: 13167



If you could manage to ensure that the only data that would grow is undesirable data, then sure. Good luck with that!


--------------------
Your eyes cannot hear.
Porcus
post Nov 24 2012, 11:36
Post #33





Group: Members
Posts: 1936
Joined: 30-November 06
Member No.: 38207



Well if you are encoding the signal in advance, you could – in principle – get a CBR at a bitrate equal to the peak. That would still save bandwidth, provided that the signal doesn't max out the bitrate anywhere.

If you are recording and streaming «live» (bar the encoding delay), that's a different story.


--------------------
One day in the Year of the Fox came a time remembered well
andrewfg
post Nov 24 2012, 14:04
Post #34





Group: Members
Posts: 84
Joined: 12-May 08
Member No.: 53478



QUOTE (greynol @ Nov 24 2012, 00:58) *
Despite your insistence that you're smarter than the rest of us,


Hey look, I am certainly NOT insisting that I am smarter than the rest of you.

I was merely reacting (badly) to the post of "soap" in which he whacked a wiki URL ("spaghetti monster") at me and thereby claimed that he is smarter than me.

QUOTE (greynol @ Nov 24 2012, 00:58) *
I think I'll wait until I see your idea put into action and take note of what those CBR bitrates actually are, thanks. I won't be holding my breath on the bitrates being interesting, let alone the idea being implemented; in other words, I think we'll chalk this up as a fail on your part after it is all said and done.


I don't think you should chalk any such "fail" up to me. The OP had asked about the future of flac, and I indicated that there is a specific weakness of flac in relation to its use for online streaming that could be detrimental to its future penetration in media servers and renderers. And I have proposed a possible solution. I don't think it is a "fail" on my part if that proposed solution is not implemented. But it would sadly be a "fail" on the part of the flac community if the format were to die out because the community failed to look into this weakness.

It would be nice if others on this thread were a little more humble in listening to criticisms, rather than (trying to) shoot the messenger...



--------------------
AndrewFG (Whitebear -- http://www.whitebear.ch/mediaserver )
Soap
post Nov 24 2012, 14:21
Post #35





Group: Members
Posts: 1016
Joined: 19-November 06
Member No.: 37767



QUOTE (andrewfg @ Nov 24 2012, 09:04) *
I was merely reacting (badly) to the post of "soap" in which he wacked a wiki url ("spaghetti monster") at me and thereby claimed that he is smarter than me.


Well, "andrewfg", I'd love a cite of where I claimed I was smarter than you, but that is beside the point:

QUOTE (andrewfg @ Nov 24 2012, 09:04) *
And I indicated that there is a specific weakness of flac in relation to its use for online streaming that could be detrimental to its future penetration in media servers and renderers.


You mentioned a specific weakness of VBR encodings where the stream length is undetermined, not a weakness in a particular format but (as was pointed out) a weakness in any VBR compressed format. You went on to claim (and this is the meat of the issue) that one could make a lossless compressed format which was CBR. This is the claim which was responded to and quoted by me, and the claim which is indefensible. Curiously it is also the discussion you are currently choosing to ignore.



--------------------
Creature of habit.
andrewfg
post Nov 24 2012, 14:52
Post #36





Group: Members
Posts: 84
Joined: 12-May 08
Member No.: 53478



QUOTE (Soap @ Nov 24 2012, 03:44) *
Congrats, you just described an ABR procedure. Nothing about that will provide a Constant Bit Rate unless you pad.


I stand corrected. ABR would be just fine.


--------------------
AndrewFG (Whitebear -- http://www.whitebear.ch/mediaserver )
greynol
post Nov 24 2012, 15:24
Post #37





Group: Super Moderator
Posts: 10079
Joined: 1-April 04
From: San Francisco
Member No.: 13167



QUOTE (Porcus @ Nov 24 2012, 02:36) *
Well if you are encoding the signal in advance, you could – in principle – get a CBR at a bitrate equal to the peak. That would still save bandwidth, provided that the signal doesn't max out the bitrate anywhere.

This was why I was inquiring about bitrate histograms for lossless encodings.

You might be able to shave off a little with some source material, but what do you do when you find frames with bitrates larger than their uncompressed PCM counterparts?

This post has been edited by greynol: Nov 24 2012, 15:30


--------------------
Your eyes cannot hear.
andrewfg
post Nov 24 2012, 18:22
Post #38





Group: Members
Posts: 84
Joined: 12-May 08
Member No.: 53478



QUOTE (Soap @ Nov 24 2012, 14:21) *
You mentioned a specific weakness of VBR encodings where the stream length is undetermined, not a weakness in a particular format but (as was pointed out) a weakness in any VBR compressed format.


No, that is absolutely NOT what I said. Please read my OP again. I actually said that a weakness of flac is that it does not offer a CBR option.

QUOTE (Soap @ Nov 24 2012, 14:21) *
You went on to claim (and this is the meat of the issue) that one could make a lossless compressed format which was CBR. This is the claim which was responded to and quoted by me, and the claim which is indefensible.


No. This claim is defensible.

Let us start with some basic information theory. If you have a data stream containing maximal information, such that its content appears to be random, then indeed by definition such content cannot be losslessly compressed. However, if you have a data stream containing some amount of redundant and repeating patterns, then that data can be losslessly compressed; essentially this is done by using short codes and lookup tables to substitute for the original, longer repeating patterns. Music contains a lot of redundancy and repeating patterns, and it therefore lends itself well to being losslessly compressed. And flac is a good algorithm for maximally compressing music.

When a lossless compression algorithm processes a data stream, it reads the data block by block, and it creates and writes out the resulting short codes and lookup tables for each block successively. Depending on the redundancy and repetition in the content of each block, the resulting output block will tend to be shorter than the input block. But there are two extremes: a) if a particular block contains no redundancy and repetition (i.e. it is indiscernible from random data), then the output block will at most be the same size as the input block plus some overhead due to the compression algorithm's framing mechanics, and b) if a particular block contains 100% redundancy and repetition, then the output block will at minimum be just that framing overhead. Typically the output block size will fall somewhere between these two limits. In other words, given an input block's size "I" and the framing overhead "M", the output block's size "O" will be as follows:

M <= O <= (I + M)

And the compression factor "V" for each block is defined as:

V := O / I, so that V(minimum) := M / I (a totally redundant block) and V(maximum) := (I + M) / I (an incompressible block)

So indeed V is variable (VBR) within the range V(minimum) to V(maximum), depending on the redundancy and repetition in the content of each block. As the algorithm processes more and more blocks the individual V values for each block will average out to a value such that:

V(minimum) < V(average) < V(maximum)

The actual value of V(average) will be highly specific to the content of the specific data stream. And indeed its value is not predictable in advance.

The flac encoder has 9 compression levels (0 through 8), and on the same data stream each level will deliver a different V(average). Let us call these V(average_0) through V(average_8); since higher levels compress harder, they are ordered so that:

V(minimum) < V(average_8) < V(average_7) < ... < V(average_1) < V(average_0) < V(maximum)

Now let us say we want to target a fixed bit rate over the whole data file, or, as another poster mentioned, a fixed average bit rate. This is equivalent to targeting a fixed (average) compression factor. Let us call this A(target), and let us choose it so that:

V(average_8) < A(target) < V(average_0)

Then the encoder, as it processes each individual block of input data, would start at level 8 compression, and it will probably find that its running average compression factor is lower (i.e. better) than A(target). So for the next few blocks it could back off to, say, level 0 compression, and those blocks would pull the running average back up towards A(target). Then for the next few blocks it could select level 8 again, and so on. The encoder would juggle the compression levels so that the net average compression factor over all processed blocks stays close to A(target). And indeed the longer the file, the closer it will get to A(target).

Now obviously, as we have already seen, the V(average) of any track depends on the degree of redundancy and repetition in the content. So it is indeed possible that, with an unlucky choice of source material, the chosen A(target) does not lie within the range V(average_8) < A(target) < V(average_0) for that particular material. In that case the actual delivered compression factor would end up larger (i.e. worse) than the selected A(target). The trick would obviously be to choose an A(target) that is closer to V(maximum) than to V(minimum), i.e. a conservative target. So in other words there will always be some outlier cases where A(target) is not achieved. But if you are not too aggressive and don't have too much bad luck with your source material, it would work fine.
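To make the "juggling" concrete, here is a minimal Python sketch of such a per-block controller. It is not a real encoder: encode_block() is a hypothetical stand-in for the actual FLAC block encoder, and the greedy level choice is just one way to track the target.

CODE
# Sketch of per-block level switching to track a target average compression
# factor A(target) = total_output / total_input. Not a real FLAC encoder.

def encode_abr(blocks, a_target, encode_block):
    total_in, total_out, output = 0, 0, []
    for block in blocks:
        if total_in == 0:
            level = 8                          # start at the strongest level
        else:
            running_v = total_out / total_in
            # over-delivering compression -> back off; under-delivering -> push
            level = 0 if running_v < a_target else 8
        data = encode_block(block, level)      # hypothetical block encoder
        output.append(data)
        total_in += len(block)
        total_out += len(data)
    return b"".join(output), (total_out / total_in if total_in else 0.0)

Note that if the material is hard to compress, even level 8 cannot pull the running average down to an over-aggressive A(target); that is exactly the outlier case described above.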



--------------------
AndrewFG (Whitebear -- http://www.whitebear.ch/mediaserver )
greynol
post Nov 24 2012, 18:32
Post #39





Group: Super Moderator
Posts: 10079
Joined: 1-April 04
From: San Francisco
Member No.: 13167



QUOTE (andrewfg @ Nov 24 2012, 09:22) *
if a particular block contains no redundancy and repetition, (i.e. it is indiscernible from random data), then the output block will be at maximum the same size as the input block plus some overhead due to the compression algorithm's framing mechanics

Here is where we run into problems with information theory. My suspicion is that problems arise even when dealing with signals that still contain what you casually refer to as redundancy. On top of that you have to figure out how to implement something within real-world constraints: making the signal easy enough to decompress (and compress, for live streaming), plus other considerations that need to be taken into account in order to satisfy playback requirements.

Again, a simple histogram will show the feasibility of creating a codec similar in function to flac that employs a CBR mode.

This post has been edited by greynol: Nov 24 2012, 19:06


--------------------
Your eyes cannot hear.
tuffy
post Nov 24 2012, 18:48
Post #40





Group: Members
Posts: 115
Joined: 20-August 07
Member No.: 46367



QUOTE (andrewfg @ Nov 24 2012, 12:22) *
Then the compression algorithm, as it processes over each individual block in the input data, would start processing at grade 8 compression, and it will probably determine that it is delivering a higher net average compression than A(target). So for the next few blocks it could back off to say grade 0 compression, and those blocks would deliver a lower net average compression than A(target). And then again for the next few blocks it could select grade 8 compression again. And so on. The compression algorithm would juggle the compression grades so that the net-net average compression over all processed blocks would get close to A(target). And indeed the longer the file, the closer the algorithm will get to A(target).

The types of subframes FLAC will search for at compression level 8 are a superset of those it will search for at compression level 0 (or any other lower compression level). That's why it takes longer to encode; FLAC is sorting through more combinations trying to find the smallest frame. For example, level 0 won't attempt to use LPC subframes. But if you're encoding at level 8 and the encoder sees that not using LPC subframes generates the smallest frame, it won't use them.
greynol
post Nov 24 2012, 18:48
Post #41





Group: Super Moderator
Posts: 10079
Joined: 1-April 04
From: San Francisco
Member No.: 13167



@andrewfg:
Read my edit.

This post has been edited by greynol: Nov 24 2012, 18:50


--------------------
Your eyes cannot hear.
Soap
post Nov 24 2012, 19:20
Post #42





Group: Members
Posts: 1016
Joined: 19-November 06
Member No.: 37767



QUOTE (andrewfg @ Nov 24 2012, 12:22) *
No. This claim is defensible.


That is still ABR if you don't pad, and it still fails to address your original concern:

QUOTE (andrewfg @ Nov 22 2012, 06:06) *
Which means that one cannot predict in advance the length of the output file that will be generated for any input stream.


because one cannot predict the maximum needed bitrate until the entire track is encoded.

This post has been edited by Soap: Nov 24 2012, 19:20


--------------------
Creature of habit.
andrewfg
post Nov 25 2012, 01:10
Post #43





Group: Members
Posts: 84
Joined: 12-May 08
Member No.: 53478



QUOTE (Soap @ Nov 24 2012, 19:20) *
That is still ABR if you don't pad, and it still fails to address your original concern:


Actually, ABR would go a long way in addressing my concern. With the method I described, if you calculated the Content-Length based on the target ABR, then you could be sure that the actual delivered content length would always be greater than or equal to the calculated Content-Length.

Whilst it does not guarantee totally accurate byte-range seeks, it does at least eliminate the potential for a seek beyond the end of the file.
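As a back-of-the-envelope illustration of that claim (the function name and numbers below are made up):

CODE
# If the delivered stream is guaranteed to be no shorter than
# duration * target_bitrate, then a byte-range seek within this advertised
# Content-Length can never land past the real end of the file.
def advertised_content_length(duration_seconds, target_bitrate_bps):
    return int(duration_seconds * target_bitrate_bps / 8)

length = advertised_content_length(240, 700_000)   # 4 minutes at 700 kbps -> 21,000,000 bytes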

QUOTE (Soap @ Nov 24 2012, 19:20) *
because one can not predict the maximum needed bitrate until the entire track is encoded.


Perhaps instead of calling it CBR or ABR, we should call it MBR (minimum bit rate). As mentioned already, this would be a significant improvement.


--------------------
AndrewFG (Whitebear -- http://www.whitebear.ch/mediaserver )
Porcus
post Nov 25 2012, 01:15
Post #44





Group: Members
Posts: 1936
Joined: 30-November 06
Member No.: 38207



QUOTE (greynol @ Nov 24 2012, 15:24) *
QUOTE (Porcus @ Nov 24 2012, 02:36) *
Well if you are encoding the signal in advance, you could – in principle – get a CBR at a bitrate equal to the peak. That would still save bandwidth, provided that the signal doesn't max out the bitrate anywhere.

This was why I was inquiring about bitrate histograms for lossless encodings.

You might be able to shave off a little with some source material, but what do you do when you find frames with bitrates larger than their uncompressed PCM counterparts?


Leave that particular file uncompressed? On average, you are likely to be able to produce a proof-of-concept which will be appreciated by the most curious (and maybe, but don't count on it, someone will actually implement it while bandwidth is still an issue).


--------------------
One day in the Year of the Fox came a time remembered well
Soap
post Nov 25 2012, 01:29
Post #45





Group: Members
Posts: 1016
Joined: 19-November 06
Member No.: 37767



QUOTE (andrewfg @ Nov 24 2012, 19:10) *
Actually ABR would go a long way in addressing my concern. With the method I described, if you calculated Content Length based on the target ABR then you could be sure that the actual delivered Content Length would always be greater than or equal to the calculated Content Length.


In that situation, what you are calling "target ABR" must always be the uncompressed bitrate + format overhead. This is the entire point of discussing information theory: you do not know the compressibility of the entire track until you reach the end. And if you're willing to reach the end of the track, you can deliver an accurate Content-Length for VBR content. And if you're willing to send a Content-Length greater than the actual content length, you can, again, live very happily with VBR.


This post has been edited by Soap: Nov 25 2012, 01:31


--------------------
Creature of habit.
greynol
post Nov 25 2012, 05:39
Post #46





Group: Super Moderator
Posts: 10079
Joined: 1-April 04
From: San Francisco
Member No.: 13167



Axon posted a script to generate histograms and included some samples which were not at all surprising to me. AFAIC, it confirms what has already been suggested: peak bitrates are likely going to dash hopes of worthwhile CBR/ABR implementations of lossless compression.
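Not Axon's script, just a minimal Python sketch of the kind of per-frame bitrate histogram being discussed, assuming the size in bytes of each FLAC frame is already known and each frame holds 4096 samples at 44.1 kHz:

CODE
from collections import Counter

SAMPLES_PER_FRAME = 4096
SAMPLE_RATE = 44100

def bitrate_histogram(frame_sizes, bin_kbps=50):
    """Bucket per-frame bitrates (kbps) into bins of bin_kbps width."""
    hist = Counter()
    for size in frame_sizes:
        kbps = size * 8 * SAMPLE_RATE / SAMPLES_PER_FRAME / 1000
        hist[int(kbps // bin_kbps) * bin_kbps] += 1
    return dict(sorted(hist.items()))

The figure that matters for this discussion is how far the top bin (the peak bitrate) sits above the average, since that gap is what a CBR/ABR scheme would have to pay for.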

This post has been edited by greynol: Nov 25 2012, 05:51


--------------------
Your eyes cannot hear.
Axon
post Nov 25 2012, 06:34
Post #47





Group: Members (Donating)
Posts: 1985
Joined: 4-January 04
From: Austin, TX
Member No.: 10933



QUOTE (greynol @ Nov 24 2012, 22:39) *
AFAIC, it confirms what has already been suggested: peak bitrates are likely going to dash hopes of worthwhile CBR/ABR implementations of lossless compression.

Hold that thought for a sec.

The original context of the OP's (poorly communicated) post was specifically about UPnP/DLNA seek compatibility, which demands CBR or well-behaved-ABR behavior of the raw bitstream. Now suppose the following of the streaming server:

  • Hard-define the bitrate of the FLAC stream to be slightly above redbook, like 1440kbps.
  • Configure the web server to gzip-compress the FLAC stream (!).
  • Modify the FLAC stream to insert padding frames between audio frames (assuming this is even possible in the FLAC format in the first place). The padding frames are inserted so as to make the uncompressed FLAC stream bitrate 1440kbps. However, because the padding frames are trivially gzip-compressible, the actual bandwidth usage is only slightly above the original FLAC bitrate.

TL;DR: CBR, seekable FLAC stream that retains VBR, sub-redbook bandwidth usage.

There, I fixed it for you cool.gif
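A rough Python sketch of the bandwidth arithmetic behind that trick; it does not construct real FLAC padding frames, and the sizes are illustrative:

CODE
import gzip, os

TARGET_BYTES_PER_SEC = 1_440_000 // 8      # hard 1440 kbps rate on the wire

def pad_second(flac_bytes_for_one_second):
    """Pad one second of encoded FLAC up to the fixed rate with zero bytes."""
    deficit = TARGET_BYTES_PER_SEC - len(flac_bytes_for_one_second)
    return flac_bytes_for_one_second + b"\x00" * max(deficit, 0)

# Stand-in for one second of real FLAC data (~90 KB, essentially incompressible):
second = os.urandom(90_000)
padded = pad_second(second)                # exactly 180,000 bytes: fixed rate, seekable
shipped = gzip.compress(padded)            # what actually crosses the network
print(len(padded), len(shipped))           # ~180,000 vs ~90,100: the padding vanishes
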
2Bdecided
post Nov 26 2012, 14:09
Post #48


ReplayGain developer


Group: Developer
Posts: 5189
Joined: 5-November 01
From: Yorkshire, UK
Member No.: 409



I knew I should have brought popcorn.