TAK 1.1.0 - Beta release
melomaniac
post Dec 21 2008, 23:11
Post #26








QUOTE (Synthetic Soul @ Dec 21 2008, 21:57) *
I tested TAK 1.1.0b3 as well as FLAC 1.2.1 again.
...

Thank you. Very informative.
I did some tests of my own (also on XP) and they were very similar to yours.
We can see that TAK 1.1.0b3 -p4 now gives nearly the same compression efficiency as TAK 1.0.4 -p5, with a 4x gain in speed!
Thomas, you're a genius.
greynol
post Dec 21 2008, 23:15
Post #27

If the speed has gone up with the compression remaining the same, then the efficiency has gone up as well.



--------------------
Placebophiles: put up or shut up!
buktore
post Dec 22 2008, 07:47
Post #28

Thanks for this, Thomas.

Some remarks (using -p4; I was using -p5 before):

- With loud music: better compression and faster encode/decode times.
- With quiet music: lower compression.
- For the first time ever, I got one corrupted TAK file when I converted my APE album image to separate TAK files. I could "fix" it by converting just that track again, and then it was fine; I need to do more testing on this when I have time.
- Overall, a decoding speed improvement for both old and new TAK files. (AMD Athlon XP 1800+)
Synthetic Soul
post Dec 22 2008, 12:31
Post #29

QUOTE (TBeck @ Dec 21 2008, 21:52) *
Although the results are obviously different from what I expected: decoding got slower on your system!
...
I will try to compensate for this in the next version.

I am still slightly concerned about my test set-up. Perhaps I will re-run both 1.1.0 and 1.0.4 again, to ensure that they both have an equal testing environment. My biggest concern with my comparison is misinforming users, and developers of course!


--------------------
I'm on a horse.
TBeck
post Dec 22 2008, 13:17
Post #30

TAK Developer

QUOTE (buktore @ Dec 22 2008, 07:47) *
- For the first time ever, I got one corrupted TAK file when I converted my APE album image to separate TAK files. I could "fix" it by converting just that track again, and then it was fine; I need to do more testing on this when I have time.

Can you provide more information, please?!

Some questions:

- Which beta did you use?
- From another thread I seem to remember you are using foobar2000?
- What command-line parameters are you using?
- What does "corrupt" mean exactly?
- Any piece of information can help...

Please try to respond quickly!

Thomas

buktore
post Dec 22 2008, 14:52
Post #31

Hmm... I can't reproduce it. I don't have the corrupted file either, since I have already overwritten it.

What happened is: I converted an APE album image to 14 TAK files using foobar2000. When I tried to ReplayGain-scan them, foobar2000 said something like "track 7 is damaged or corrupted" (it stopped at around 30% of the file). I scanned twice, but it still happened. I was busy at the time, so I didn't bother testing further.

The command line is -e -p4 -ihs -sts0 - %d, using the latest beta (beta 3).

AMD Athlon XP 1800+, Windows XP SP3, foobar2000 0.9.6.1 beta.

I'm 100% sure that this is not user error.

It might be my system... but this is, for any format, the first time this kind of thing has happened to me.

TBeck
post Dec 22 2008, 15:24
Post #32

TAK Developer

QUOTE (buktore @ Dec 22 2008, 14:52) *
Hmm... I can't reproduce it. I don't have the corrupted file either, since I have already overwritten it.

Thank you for the fast reply!

Do you still have the APE image file? Surely not...

QUOTE (buktore @ Dec 22 2008, 14:52) *
What happened is: I converted an APE album image to 14 TAK files using foobar2000. When I tried to ReplayGain-scan them, foobar2000 said something like "track 7 is damaged or corrupted" (it stopped at around 30% of the file). I scanned twice, but it still happened. I was busy at the time, so I didn't bother testing further.

In your first post you wrote that it did work with a separate file: "but I can fix it when convert just that track and it's fine, need more testing on this when I have time." Can you please explain exactly what you did?

Thomas
buktore
post Dec 22 2008, 15:43
Post #33

Well... I still have the APE file (actually the APE plus a separate CUE).

EDIT: What I meant was: I still have the ZIP that contains the APE file, NOT the copy I used at the time. I doubt that makes any difference, since I can ReplayGain-scan the APE file I used for the TAK conversion just fine.

I have now tried converting it 5 times, using exactly the same method, and couldn't reproduce the corrupt file.

I didn't really mean that I could "fix" the corrupted file; I just converted "track 7" from the same APE file to TAK again, and it seemed to be fine that time, so that "fixed" the problem for me.

And since I was busy at the time, I hadn't checked whether the APE file itself was corrupted (it's not, according to foo_verifier), or whether converting the whole album again, not just "track 7", would make it happen again (the answer is no). That's why I said I "need more testing on this when I have time".

I hope that makes sense...

TBeck
post Dec 22 2008, 17:48
Post #34

TAK Developer

QUOTE (buktore @ Dec 22 2008, 15:43) *
Well... I still have the APE file (actually the APE plus a separate CUE).

Great!

QUOTE (buktore @ Dec 22 2008, 15:43) *
And since I was busy at the time, I hadn't checked whether the APE file itself was corrupted (it's not, according to foo_verifier), or whether converting the whole album again, not just "track 7", would make it happen again (the answer is no). That's why I said I "need more testing on this when I have time".

Now I am not sure: have you tested the whole album again?

Sorry; although I didn't want to put pressure on you, it might have happened anyway. I am a bit in a panic, because such errors can be a killer for an archiver...

Thomas
buktore
post Dec 22 2008, 18:29
Post #35

QUOTE
Now I am not sure: have you tested the whole album again?

Yes, I did: 5 times, and everything is fine; no corrupted file has shown up so far. Want me to try more?

QUOTE
Sorry; although I didn't want to put pressure on you, it might have happened anyway.

No sweat. Feel free to tell me what you want me to do.

QUOTE
I am a bit in a panic

You sure are. At first I thought it was my system (then again, this had never happened before); now it really looks like you knew this might happen... Do you have any clues?
Alexxander
post Dec 22 2008, 18:37
Post #36

A quick conversion test with 2 albums compressed with FLAC 1.2.1 -8, using foobar2000 v0.9.6:

I ran the same conversion 3 times and post only the best times (others discarded).

Default compression, fb2k parameters -e -ihs - %d:
TAK 1.0.4 took 44 secs and produced 816.771 kBytes
TAK 1.1.0b3 took 47 secs and produced 816.719 kBytes

Maximum compression, fb2k parameters -e -p5m -ihs - %d and -e -p4m -ihs - %d:
TAK 1.0.4 took 3:28 (min:sec) and produced 811.294 kBytes
TAK 1.1.0b3 took 2:41 (min:sec) and produced 811.324 kBytes

With these 2 albums TAK 1.1.0b3 encodes slower at the default level; I tried a 4th time half an hour later, and TAK 1.0.4 at the default level was still faster. The tags are all preserved, so the sizes are comparable: compression is clearly similar at both default and maximum settings.

Edit:
Just encoded 6 other albums (FLAC 1.2.1 -8) twice, at the default level:
TAK 1.0.4 took 2:00 (min:sec) and produced 2.093.069 kBytes
TAK 1.1.0b3 took 2:08 (min:sec) and produced 2.093.002 kBytes

The machine is Vista + SP2 beta with an Intel Core 2 Duo at 2.4 GHz, and this time there were no other CPU-intensive tasks nor hard-disk activity.

Synthetic Soul
post Dec 23 2008, 10:20
Post #37

QUOTE (buktore @ Dec 22 2008, 17:29) *
QUOTE
I am a bit in a panic
You sure are. At first I thought it was my system (then again, this had never happened before); now it really looks like you knew this might happen... Do you have any clues?

No, I think Thomas is, quite rightly, concerned about a report of a corrupted TAK file. However, it seems to me, from everything that you've said, that this can only be an I/O issue. If the same process repeated five times did not result in the same corruption, then it can hardly be a fault in the format itself.


TBeck
post Dec 24 2008, 01:41
Post #38

TAK Developer

QUOTE (Synthetic Soul @ Dec 23 2008, 10:20) *
QUOTE (buktore @ Dec 22 2008, 17:29) *
QUOTE
I am a bit in a panic
You sure are. At first I thought it was my system (then again, this had never happened before); now it really looks like you knew this might happen... Do you have any clues?
No, I think Thomas is, quite rightly, concerned about a report of a corrupted TAK file. However, it seems to me, from everything that you've said, that this can only be an I/O issue. If the same process repeated five times did not result in the same corruption, then it can hardly be a fault in the format itself.

That's my interpretation too. Relieved...

QUOTE (Synthetic Soul @ Dec 22 2008, 12:31) *
QUOTE (TBeck @ Dec 21 2008, 21:52) *
Although the results are obviously different from what I expected: decoding got slower on your system!
...
I will try to compensate for this in the next version.
I am still slightly concerned about my test set-up. Perhaps I will re-run both 1.1.0 and 1.0.4 again, to ensure that they both have an equal testing environment. My biggest concern with my comparison is misinforming users, and developers of course!

Yeah, this attitude is definitely one important reason why I am really happy and thankful that you intend to continue TAK testing!

I hope it isn't too late, but I would like to ask you to hold off on another test.

Actually I wanted to release the final version before Christmas, but since the release has now been delayed anyhow by the uncertainties raised by buktore's report (and hey, buktore, I am really thankful for your report! Even if it hopefully was a false alarm.), I may try to fix the speed issues and release another beta.

QUOTE (Alexxander @ Dec 22 2008, 18:37) *
With these 2 albums TAK 1.1.0b3 encodes slower; I tried a 4th time half an hour later, and TAK 1.0.4 at the default level was still faster. The tags are all preserved, so compression is clearly similar at default and maximum settings.
...
The machine is Vista + SP2 beta with an Intel Core 2 Duo at 2.4 GHz, and there were no other CPU-intensive tasks nor hard-disk activity.

Too bad. Well, things get really difficult if I/O activity has to be taken into account. I could imagine that even a speed-up of the code could result in a worse interaction with the I/O system and an overall performance drop. Thanks for the report!

Thomas
Synthetic Soul
post Dec 27 2008, 22:39
Post #39

QUOTE (Synthetic Soul @ Dec 22 2008, 11:31) *
I am still slightly concerned about my test set-up. Perhaps I will re-run both 1.1.0 and 1.0.4 again, to ensure that they both have an equal testing environment. My biggest concern with my comparison is misinforming users, and developers of course!

OK, I'm glad I did (although I have no idea where this leaves my comparison). These results show 1.1.0 in a more favourable light:

CODE
     |  1.0.4         |  1.1.0
======================================
p0   |  127x    139x  |  130x    141x
p0e  |  105x          |  108x
p0m  |   59x          |   61x
p1   |  105x    138x  |  109x    140x
p1e  |   89x          |   90x
p1m  |   50x          |   51x
p2   |   65x    123x  |   66x    125x
p2e  |   52x          |   53x
p2m  |   32x          |   33x
p3   |   41x    112x  |   38x    111x
p3e  |   33x          |   31x
p3m  |   23x          |   20x
p4   |   28x    107x  |   24x    104x
p4e  |   18x          |   15x
p4m  |   16x          |   14x
p5   |   21x     99x  |
p5e  |   11x          |
p5m  |   11x          |

For the record, here are both the old figures (from my comparison) and the new figures for both versions, to show the differences between test runs. You will see that the 1.1.0 results (thankfully) compare very well; the 1.0.4 results are unsatisfactorily different, though.

CODE
      |  1.0.4 (new/old)       |  1.1.0 (new/old)
======================================================
-p0   |  127/130x    139/147x  |  130/131x    141/141x
-p0e  |  105/108x    138/146x  |  108/109x    142/142x
-p0m  |   59/ 59x    139/146x  |   61/ 61x    141/142x
-p1   |  105/107x    139/145x  |  109/108x    141/141x
-p1e  |   89/ 89x    138/145x  |   90/ 91x    140/141x
-p1m  |   50/ 50x    138/146x  |   51/ 51x    140/142x
-p2   |   65/ 65x    124/128x  |   66/ 66x    125/126x
-p2e  |   52/ 52x    123/128x  |   53/ 53x    125/126x
-p2m  |   32/ 32x    123/128x  |   33/ 32x    125/125x
-p3   |   41/ 41x    113/117x  |   38/ 38x    112/113x
-p3e  |   33/ 33x    113/117x  |   31/ 31x    111/112x
-p3m  |   23/ 23x    112/116x  |   20/ 20x    111/112x
-p4   |   28/ 28x    107/111x  |   24/ 24x    104/103x
-p4e  |   18/ 18x    107/111x  |   15/ 15x    104/104x
-p4m  |   16/ 16x    107/111x  |   14/ 14x    104/104x
-p5   |   21/ 20x     97/ 98x  |
-p5e  |   11/ 11x     99/102x  |
-p5m  |   11/ 11x     99/102x  |

NB: I've just seen Thomas' response above. Well, I am always happy to test some more, Thomas.

Damn, that means I should not have updated my comparison, and I cannot update it again until I find the time to re-test all codecs (which could take some time).


lostintime
post Dec 28 2008, 01:30
Post #40

I've created a 60-second, 48 kHz (i.e. 2,880,000 samples), 16-bit mono WAV file of silence, which I've named "60 seconds of silence.wav". When I run the following command:

CODE
type "60 seconds of silence.wav" | takc -e -pmax - "60 seconds of silence.tak"

I get a 1,052,878-byte file. But if I use TAKC directly on the file instead, i.e. without piping to TAKC, with this command:

CODE
takc -e -pmax "60 seconds of silence.wav"

then I get only a 4,342-byte file. Both the 1,052,878-byte file and the 4,342-byte file decode correctly to the original WAV file. I've tested both version 1.0.4 and version 1.1.0 beta 3 of TAKC, with the same results.

Version 1.2.1 of FLAC doesn't do this; it produces exactly the same output in both situations:

CODE
type "60 seconds of silence.wav" | flac -e -8 - -o "60 seconds of silence.flac"
flac -e -8 "60 seconds of silence.wav" -o "60 seconds of silence_2.flac"

Both commands produce an identical 16,714-byte file. Why is there such a big difference in size with TAKC?

TBeck
post Dec 28 2008, 03:20
Post #41

TAK Developer

QUOTE (Synthetic Soul @ Dec 27 2008, 22:39) *
QUOTE (Synthetic Soul @ Dec 22 2008, 11:31) *
I am still slightly concerned about my test set-up. Perhaps I will re-run both 1.1.0 and 1.0.4 again, to ensure that they both have an equal testing environment. My biggest concern with my comparison is misinforming users, and developers of course!
OK, I'm glad I did (although I have no idea where this leaves my comparison). These results show 1.1.0 in a more favourable light:
...
For the record, here are both the old figures and the new figures for both versions, to show the differences between test runs. You will see that the 1.1.0 results (thankfully) compare very well; the 1.0.4 results are unsatisfactorily different, though.
...

Thank you, that's extremely helpful and will save me a lot of work!

Currently I have only two ideas about what could be causing the different results:

1) The disk I/O system is behaving differently. Possibly smaller or larger I/O sizes in TAK could make a difference.
2) The CPU time is being calculated differently.

My intuition prefers 2).

QUOTE (Synthetic Soul @ Dec 27 2008, 22:39) *
NB: I've just seen Thomas' response above. Well, I am always happy to test some more, Thomas.

Great! For testing my possible modifications, a pass with -p0 will be sufficient.

QUOTE (lostintime @ Dec 28 2008, 01:30) *
I've created a 60-second, 48 kHz (i.e. 2,880,000 samples), 16-bit mono WAV file of silence, which I've named "60 seconds of silence.wav". When I run the following command:

CODE
type "60 seconds of silence.wav" | takc -e -pmax - "60 seconds of silence.tak"

I get a 1,052,878-byte file. But if I use TAKC directly on the file instead, i.e. without piping to TAKC, with this command:

CODE
takc -e -pmax "60 seconds of silence.wav"

then I get only a 4,342-byte file. Both the 1,052,878-byte file and the 4,342-byte file decode correctly to the original WAV file. I've tested both version 1.0.4 and version 1.1.0 beta 3 of TAKC, with the same results.

If pipe encoding is used without the -ihs parameter (which is required for foobar2000, for instance), TAK will try to save the WAV metadata to the file. By default it will reserve 1 MB in the header! That's the default maximum size for both pipe- and file-based encoding.

When performing file-based encoding, TAK can determine the actual metadata size and will reserve only as much space as is required. But with pipe encoding this isn't possible, because there can be a footer at the end of the WAV data, and its size cannot be determined before the end of the pipe has been reached.

You may limit the maximum WAV metadata size with the -wm switch:

-wm46 reserves 46 bytes, enough for a standard WAV header without any additional info.

-wm0 reserves no space for the WAV header. The reconstructed header of the decoded file may then differ from the original file.

By default, a bit of space is also reserved for the seek table. You may disable that with the -sts switch:
-sts0
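As a rough sanity check of the figures in lostintime's report, the gap between the pipe-encoded and file-encoded sizes lines up with the 1 MB reservation described above. This is only a sketch; the commented-out command at the end combines the switches as described in this post and is untested:

```shell
# Difference between lostintime's two encodes of the same silent WAV:
piped=1052878    # bytes: pipe-encoded without -ihs (1 MB metadata reservation)
direct=4342      # bytes: file-based encoding (actual metadata size known)
echo $((piped - direct))   # prints 1048536, within 40 bytes of 1 MiB (1048576)

# Untested sketch: the switches described above should shrink the piped
# output, since -wm46 caps the metadata reservation at a bare 46-byte WAV
# header and -sts0 skips the seek-table reservation:
#   type "60 seconds of silence.wav" | takc -e -pmax -wm46 -sts0 - "60 seconds of silence.tak"
```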
Bugs.Bunny
post Dec 28 2008, 14:29
Post #42

Hi TBeck,
I'm quite impressed by TAK! I did some testing and everything worked flawlessly. For the moment, though, I will stick with Monkey's Audio, due to TAK's lack of support in other software (BASS audio, ffdshow, ...). I'm really looking forward to TAK 2.0, when these restrictions should fall (due to the opening of the source code, if I got that right).
I think TAK has great potential, so thumbs up for the future development!

TBeck
post Dec 29 2008, 06:11
Post #43

TAK Developer

QUOTE (Synthetic Soul @ Dec 27 2008, 22:39) *
Damn, that means I should not have updated my comparison, and I cannot update it again until I find the time to re-test all codecs (which could take some time).

Now I am getting a bit egocentric... If your primary purpose is TAK testing, I would propose this approach:

1) Since I tend to describe TAK's performance in relation to FLAC and Monkey's Audio, I would be very happy to see a new comparison with them. Well, I will release another, slightly faster (and final!) beta soon.

2) You could keep your old comparison (which has become famous beyond TAK testing...) and update the new comparison step by step (codec by codec), whenever you have time and/or a new codec version has been released.

QUOTE (TBeck @ Dec 28 2008, 03:20) *
If pipe encoding is used without the -ihs parameter (which is required for foobar2000, for instance), TAK will try to save the WAV metadata to the file. By default it will reserve 1 MB in the header! That's the default maximum size for both pipe- and file-based encoding.

The next beta release will come with the default reservation for pipe encoding set to 46 bytes instead of 1 MB. Thanks to "lostintime" for reporting the inappropriate default value!

QUOTE (Bugs.Bunny @ Dec 28 2008, 14:29) *
I think TAK has great potential, so thumbs up for the future development!

Thank you very much for your encouragement!

Thomas
halb27
post Dec 29 2008, 09:07
Post #44

No doubt: looking at compression ratio and the all-important decompression speed, TAK is the best player in the lossless field. Congratulations!

Just 2 questions for TBeck:
a) TAK currently yields the best combination together with lossyWAV. Some months ago you announced specific development for use with lossyWAV, to achieve further improvement. What's the current state here?
b) The impressive decompression speed makes TAK highly desirable for use on mobile DAPs (like FLAC, but with a better compression ratio). Has there been any contact with DAP producers to allow them to implement TAK and give them the necessary information? This would be great, especially as the time for lossless codecs on mobile DAPs is yet to come, but we may be close to it right now. It would be great to have TAK be a major player in this field.


--------------------
lame3100m -V1 --insane-factor 0.75
lostintime
post Dec 29 2008, 09:29
Post #45

QUOTE (TBeck @ Dec 29 2008, 17:11) *
The next beta release will come with the default reservation for pipe encoding set to 46 bytes instead of 1 MB. Thanks to "lostintime" for reporting the inappropriate default value!

Could I suggest that you also change the behaviour of TAKC so that it doesn't report an "invalid file extension" whenever you give the output filename an extension other than .tak? FLAC lets you call the output file anything you want, with no extension added if none is present in the output filename.
jcoalson
post Dec 29 2008, 18:11
Post #46

FLAC Developer

QUOTE (halb27 @ Dec 29 2008, 03:07) *
b) The impressive decompression speed makes TAK highly desirable for use with mobile DAPs (like FLAC, but with a better compression ratio).

http://www.hydrogenaudio.org/forums/index....mp;#entry531840
halb27
post Dec 29 2008, 18:43
Post #47

QUOTE (jcoalson @ Dec 29 2008, 19:11) *
QUOTE (halb27 @ Dec 29 2008, 03:07) *
b) The impressive decompression speed makes TAK highly desirable for use on mobile DAPs (like FLAC, but with a better compression ratio).

http://www.hydrogenaudio.org/forums/index....mp;#entry531840

What do you mean? I'm quite aware that decompression speed depends on the format, on implementation details, and on how well they match the specific hardware.
Do you think FLAC can be expected to be significantly faster than TAK on mobile DAPs?


greynol
post Dec 29 2008, 18:46
Post #48

To put it more simply, I think the question should be whether TAK will be fast enough.

The fact that FLAC decompression speeds on a PC aren't compared on an equal basis really shouldn't have anything to do with it. It isn't as though you're suggesting TAK out-performs FLAC in anything except compression ratio.

jcoalson
post Dec 29 2008, 20:12
Post #49

FLAC Developer

I mean just that a comparison of command-line decoding to a file on a PC does not tell us enough about feasibility on common embedded devices. We know many devices are capable of FLAC decoding (all modes) with a C decoder that is not optimized for the target, with some headroom. This comparison appears to show some TAK modes on par with all FLAC modes, except:

- that comparison is not apples-to-apples, because the FLAC time includes MD5 computation, which is significant and not present in playback situations (accounted for here and here)
- we already see variability in TAK timing due to optimization for particular I/O cases on particular hardware

What matters is the complexity of the decode process. My hunch, based on the implementation of the APE modes in Rockbox and what little we know about TAK, is that with an open-source C decoder, Rockbox could probably get most, but not all, TAK modes decoding on most hardware. For other devices (native support), that's the first of many hurdles.

See also: http://www.rockbox.org/twiki/bin/view/Main...manceComparison
TBeck
post Dec 29 2008, 20:31
Post #50

TAK Developer

QUOTE (jcoalson @ Dec 29 2008, 20:12) *
- that comparison is not apples-to-apples, because the FLAC time includes MD5 computation, which is significant and not present in playback situations (accounted for here and here)

Well, in Synthetic Soul's previous comparison TAK -p0 and -p1 were both decoding faster than FLAC -8 without MD5...

QUOTE (jcoalson @ Dec 29 2008, 20:12) *
- we already see variability in TAK timing due to optimization for particular I/O cases on particular hardware

I don't know why you think TAK has been optimized for particular hardware.

- The I/O routines and buffers are always the same. Some time ago I decided to use a very small buffer for the I/O system, which possibly isn't adequate. I may increase its size a bit. Would you call a fixed I/O buffer size, which is always the same independently of the OS or hardware, a specific optimization?

- TAK contains only one set of DSP functions (i386 and MMX), which is used for every CPU.

Thomas