
TAK 2.3.0

Reply #25
I noticed that the encoder fails to work for regular ANSI filenames if the path happens to contain Unicode characters. For example, open a command prompt in a directory called "Pośród kwiatów i cieni" and try to encode a simple "test.wav" there. Usually encoders, even if they don't support Unicode, have no problem doing this.
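
A minimal way to reproduce it (just a sketch; the takc invocation is assumed from the command lines quoted later in this thread):

Code: [Select]
rem hypothetical repro; assumes takc.exe is on the PATH and that -e with no target writes test.tak next to the source
cd "Pośród kwiatów i cieni"
takc -e test.wav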

TAK 2.3.0

Reply #26
Hey, I just wanted to say that TAK is now faster than ever and I love it! My music playlist, which consists of 2300+ songs, is all in TAK because it's much faster than FLAC both when encoding and decoding. (I use Ogg on my DAP, which is a Sansa ClipZip.)

So thanks, TBeck! I wish the best for this format's future.

TAK 2.3.0

Reply #27
This is not by any means a neutral test: it is the least-FLACable piece of music from my collection, and the least flattering for TAK that I have ever come across (it is WavPack-friendly, as long as time is no object...). Consider it a worst-case test.
The music is Merzbow: "I Lead You Towards Glorious Times", track 3 from Venereology (1994). For a listen: http://www.youtube.com/watch?v=gTRZdFqAOGA
Hardware: an old P4 3.06 with a spinning hard drive.

tak.exe (GUI) used. No tweaking of options beyond the standard presets. No firm procedure for repeating; this is not meant to be taken as hard science, but the decoding tests were repeated because they surprised me.

Encoding using “test” (as I guessed writing would be the bottleneck, which definitely looks to be the case) and “compress”; p3, p3e and p3m omitted because I'm lazy, but they all fail to break the 100% threshold.

Code: [Select]
p0 : test=219x encode=131x, size=100.96%
p0e: test=153x encode=89x, size=100.94%
p0m: test=86x encode=69x, size=100.50%
p1 : test=159x encode=83x, size=100.51%
p1e: test=92x encode=69x, size=100.52%
p1m: test=60x encode=52x, size=100.27%
p2 : test=113x encode=79x, size=100.30%
p2e: test=62x encode=51x, size=100.29%
p2m: test=34x encode=31x, size=100.15%
p4 : test=38x encode=36x, size=99.92%
p4e: test=31x encode=29x, size=100.04%
p4m: test=17x encode=17x, size=100.04%

For comparison:
FLAC-8 encoding to file: about 20x, size=98.8%.



Decoding using “test”. This result was so surprising that I repeated the experiment a few times – these are representative figures.
Code: [Select]
Glorious Times-p0.tak  170* Ok
Glorious Times-p0e.tak  134* Ok
Glorious Times-p0m.tak  165* Ok
Glorious Times-p1.tak  92* Ok
Glorious Times-p1e.tak  147* Ok
Glorious Times-p1m.tak  168* Ok
Glorious Times-p2.tak  121* Ok
Glorious Times-p2e.tak  140* Ok
Glorious Times-p2m.tak  177* Ok
Glorious Times-p4.tak  180* Ok
Glorious Times-p4e.tak  173* Ok
Glorious Times-p4m.tak  141* Ok
Duration:      27.26 sec
Speed:        145.17 * real time


Surprise – the more heavily compressed files decode faster!? It would have made sense if there were large differences in file size that had to be read from the drive, but these files are virtually identical in size.


Decoding using “decompress”, writing to file; repeated because the previous result was such a surprise. Somehow I don't get very consistent results here, but these figures are not far off:
Code: [Select]
Glorious Times-p0.tak   83* Ok
Glorious Times-p0e.tak  69* Ok
Glorious Times-p0m.tak  70* Ok
Glorious Times-p1.tak  58* Ok
Glorious Times-p1e.tak  77* Ok
Glorious Times-p1m.tak  58* Ok
Glorious Times-p2.tak  64* Ok
Glorious Times-p2e.tak  73* Ok
Glorious Times-p2m.tak  58* Ok
Glorious Times-p4.tak  48* Ok
Glorious Times-p4e.tak  72* Ok
Glorious Times-p4m.tak  60* Ok
Duration:      61.42 sec
Speed:          64.44 * real time


It seems obvious that the decoding speed is way beyond what a spinning drive can keep up with.

TAK 2.3.0

Reply #28
I noticed that the encoder fails to work for regular ANSI filenames if the path happens to contain Unicode characters. For example, open a command prompt in a directory called "Pośród kwiatów i cieni" and try to encode a simple "test.wav" there. Usually encoders, even if they don't support Unicode, have no problem doing this.

I suppose this fails because TAK always generates a full path specification. In your case it will determine the current directory and add it to the file name. Unfortunately this fails because of TAK's lack of Unicode support. Indeed, the lack of Unicode support is annoying...

Hey, I just wanted to say that TAK is now faster than ever and I love it! My music playlist, which consists of 2300+ songs, is all in TAK because it's much faster than FLAC both when encoding and decoding. (I use Ogg on my DAP, which is a Sansa ClipZip.)

So thanks, TBeck! I wish the best for this format's future.

Thank you very much!

However, I don't expect TAK to decode much faster than FLAC. Your system must be something special.

This is not by any means a neutral test: it is the least-FLACable piece of music from my collection, and the least flattering for TAK that I have ever come across (it is WavPack-friendly, as long as time is no object...). Consider it a worst-case test.

I am always interested in such special files. Could you send me a short snippet?

TAK 2.3.0

Reply #29
I just want to say thank you to TBeck for making such an awesome codec. I just started using TAK yesterday and must say I'm really impressed by it.
Excellent compression and encoding time. With -p4m I can encode on my system at ~108x. I've been using FLAC for over 10 years now for all my music CDs as well as recorded vinyl, and I'm seriously thinking of converting it all to TAK.

I must also congratulate you on the best command-line help I've seen in a CLI tool. You really don't need a manual for it.

Continue the good work!

TAK 2.3.0

Reply #30
Thank you.  That's very encouraging.

Continue the good work!

I will! Well, if time permits... I've just optimized a new compression technique which was encoding far too slowly: 0.1 * realtime on my PC. Now it's more than 250 * realtime and therefore practicable. But its integration would require a format change, and the compression improvement isn't big enough to justify that. So I will have to look for more tuning opportunities. At some point the sum of improvements may be sufficient.

It's simply getting more and more difficult to squeeze out some more compression without significantly affecting the decoding speed.

TAK 2.3.0

Reply #31
I'm trying to get takc to work with CUETools to run some tests, but it always fails with
"Takc.exe has exited prematurely with code 2: The pipe has been ended."

This is the command line I'm using: takc -e -p%M -md5 -overwrite - %O
%M is replaced by the profile and %O by the output file by CUETools. If I run the command on a test file in the command prompt, it works, since I don't use a pipe in that case. WavPack works fine with CUETools and a similar configuration; what am I missing?

I wish Rockbox supported TAK; I'm interested in what battery times I would get with it.

TAK 2.3.0

Reply #32
I have no experience with CUETools, so there will be more knowledgeable people than me to answer.

But possibly the addition of the "-ihs" switch might help. It tells takc to ignore the file size defined in the wave header, which is often 0 if piping is being used.
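
For example, the CUETools command line quoted above would then become something like this (just the same line with the switch added; untested on my side):

Code: [Select]
takc -e -p%M -md5 -ihs -overwrite - %O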

TAK 2.3.0

Reply #33
Argh. It's because the output file contains non-ANSI characters, so it's the dreaded "Unicode not supported" case again >.<
I'm not that knowledgeable about it, but why didn't you include support for Unicode right from the start?

TAK 2.3.0

Reply #34
Try foobar instead? It uses temporary filenames when encoding and then renames them when done, so it really doesn't matter whether the encoders support Unicode or not.

TAK 2.3.0

Reply #35
Try foobar instead? It uses temporary filenames when encoding and then renames them when done, so it really doesn't matter whether the encoders support Unicode or not.

In many cases this works for CLI encoders without Unicode support, because they don't care about or don't use the full path name of those temporary filenames.
However, as reported by Case and confirmed by TBeck in this thread, it will fail for takc if the target directory name contains Unicode characters not present in your locale, since fb2k creates the temporary file in the target directory and takc constructs the full path name from it.

TAK 2.3.0

Reply #36
For some reason, I don't have that problem when running TAK with Wine. It accepts filenames with UTF-8 characters just fine. Or maybe I misunderstood the nature of the problem?

TAK 2.3.0

Reply #37
I will! Well, if time permits... I've just optimized a new compression technique which was encoding far too slowly: 0.1 * realtime on my PC. Now it's more than 250 * realtime and therefore practicable. But its integration would require a format change, and the compression improvement isn't big enough to justify that. So I will have to look for more tuning opportunities. At some point the sum of improvements may be sufficient.

It's simply getting more and more difficult to squeeze out some more compression without significantly affecting the decoding speed.

I want to venture that most TAK users would not have a major issue with a format change, as long as it is accompanied by a major revision number (i.e. TAK 3.xx) and backwards compatibility for decoding previous versions exists.

Of course, I expect you (the developer) already know this. I merely state in public that fears over a 'format change' might be exaggerated (unless *somehow* the reverse-engineered TAK decoder gained a lot of traction :shrug:). Looking at all the other lossless codecs, format changes seem to derive from necessity and evolution via accumulated enhancements.

Whatever may be decided, I will try to stay updated and active with TAK, hopefully as a contributor rather than an encumbrance.
"Something bothering you, Mister Spock?"

TAK 2.3.0

Reply #38
But it does imply upgrading decoders for playing the new files.

TAK 2.3.0

Reply #39
I want to venture that most TAK users would not have a major issue with a format change, as long as it is accompanied by a major revision number (i.e. TAK 3.xx) and backwards compatibility for decoding previous versions exists.

I wholeheartedly agree with this.

TAK 2.3.0

Reply #40
I'm not that knowledgeable about it, but why didn't you include support for Unicode right from the start?

Because my old Delphi 6 from 2001 provides very little Unicode support (none for the GUI), and TAK uses some libraries I wrote long ago which also don't support Unicode. Even implementing Unicode support for the command-line version alone would be a lot of work. Currently I am not sure if I will implement Unicode support before porting TAK to C++.

Because the port will definitely take quite long, and of course I understand how important Unicode support is, I may use a top-down approach for the first step of the port:

- Put the Delphi codec core into a library (DLL) which can be called from C++.
- Port the much less comprehensive application logic to C++ and add Unicode support.

But which road I will take depends on many factors I can't foresee, therefore I can't make a clear statement.

Of course, I expect you (the developer) already know this. I merely state in public that fears over a 'format change' might be exaggerated (unless *somehow* the reverse-engineered TAK decoder gained a lot of traction :shrug:). Looking at all the other lossless codecs, format changes seem to derive from necessity and evolution via accumulated enhancements.

I want to venture that most TAK users would not have a major issue with a format change, as long as it is accompanied by a major revision number (i.e. TAK 3.xx) and backwards compatibility for decoding previous versions exists.

I wholeheartedly agree with this.

I definitely don't intend to remove decoding support for older codec versions. If I had ever thought about it, the latest release of the great Monkey's Audio would have taught me better... But at some point I will remove the assembler optimizations for old versions. This will simplify the work on an open source decoder release. It's quite possible that one of the next TAK releases will come without assembler optimizations for V1.x files. I don't think that's a big issue; decoding will still be quite fast.

unless *somehow* the reverse-engineered TAK decoder gained a lot of traction

That's what I would like. Yes, I have changed my mind... I don't think I will alter the format without releasing an open source decoder. I would like to make it easy for the ffmpeg developers to implement the new format.

But it does imply upgrading decoders for playing the new files.

Of course.

TAK 2.3.0

Reply #41
Try foobar instead? It uses temporary filenames when encoding and then renames them when done, so it really doesn't matter whether the encoders support Unicode or not.


Just to note that I've converted over 60 albums from FLAC with foobar so far and haven't experienced any problems with filenames containing lots of characters like ´ ` ~ ^ ç and so on; it converts everything flawlessly. So maybe it really depends on how the applications pass the original names to the *.tak destination file, or on how they use the pipe for that matter.

TAK 2.3.0

Reply #42
The filename is taken care of by fb2k, so you should have no problem as long as the path to the destination directory doesn't contain Unicode characters not present in your code page. Otherwise it will fail.
If you don't get it, try encoding to C:\❤\☀\ or something.

TAK 2.3.0

Reply #43
The filename is taken care of by fb2k, so you should have no problem as long as the path to the destination directory doesn't contain Unicode characters not present in your code page. Otherwise it will fail.
If you don't get it, try encoding to C:\❤\☀\ or something.


Indeed:

Code: [Select]
1 out of 1 tracks converted with major problems.

Source: "D:\WAV Audio\02 Éàçãô.wav"
  An error occurred while writing to file (The encoder has terminated prematurely with code 2 (0x00000002); please re-check parameters) : "D:\?\02 Éàçãô.tak"
  Additional information:
  Encoder stream format: 44100Hz / 2ch / 16bps
  Command line: "C:\Users\Main\AppData\Roaming\foobar2000\encoders\Takc.exe" -e -p4m -tn4 -ihs  - "temp-A849E32B71D002A5CD61CFDEA5B46919.tak"
  Working folder: D:\?\
  
  Conversion failed: The encoder has terminated prematurely with code 2 (0x00000002); please re-check parameters


But judging by this example I very much doubt I will encounter any problems whatsoever with my files; nonetheless I understand the problem better now. Thanks.

TAK 2.3.0

Reply #44
The filename is taken care of by fb2k, so you should have no problem as long as the path to the destination directory doesn't contain Unicode characters not present in your code page. Otherwise it will fail.
If you don't get it, try encoding to C:\❤\☀\ or something.

Well, I don't know; this is what I'm getting if I try to convert this album:

Code: [Select]
Source: "\\server\music\Sawano Hiroyuki\Shingeki no Kyojin OST\01. ???? - ?t?æk 0N t??tn.flac"
  An error occurred while finalizing the encoding process (Object not found) : "R:\01. ???? - ?t?æk 0N t??tn.tak"


There are no Unicode characters in the destination path, and the conversion runs, but it fails on the rename.

It also fails on more normal filenames, for example:
Code: [Select]
1 out of 1 tracks converted with major problems.

Source: "\\server\music\Sawano Hiroyuki\Shingeki no Kyojin OST\07. Cyua - Vogel im Käfig.flac"
  An error occurred while finalizing the encoding process (Object not found) : "R:\07. Cyua - Vogel im Käfig.tak"
  Conversion failed: Object not found


Only for .tak, though; every other format works.

Edit: R:\ is my RAM disk, with up to 5 GB of space available. Not that it seems to matter; converting to another directory ends with the same error.

TAK 2.3.0

Reply #45
Yes, the example above is quite artificially made up and you might not run into this kind of problem.
However, in my environment (Japanese, CP932), quite a lot of artist / album names (which are naturally used as folder names) actually trigger the problem, because Latin accented letters are missing from our code page and are only available through Unicode.

TAK 2.3.0

Reply #46
It's not made up, though, just irregular: click.
I'm running a Japanese locale myself, and even using filenames that only contain kanji leads to the same error:

Code: [Select]
1 out of 1 tracks converted with major problems.

Source: "\\server\music\01. 戦場ヶ原ひたぎ(斎藤千和) - 二言目.flac"
  An error occurred while finalizing the encoding process (Object not found) : "R:\01. 戦場ヶ原ひたぎ(斎藤千和) - 二言目.tak"
  Conversion failed: Object not found



TAK 2.3.0

Reply #47
ChronoSphere, I'm quite certain your problem is caused by incorrect command-line parameters. If you use pipes with foobar2000 you need to add the -ihs parameter for TAK; otherwise it will remove the encoded file on the assumption that something went wrong when the length doesn't match.
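
In other words, the encoder parameters in the converter setup should end up looking roughly like the working command line quoted earlier in this thread (a sketch only; assuming %d is fb2k's placeholder for the destination file):

Code: [Select]
-e -p4m -ihs - %d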

TAK 2.3.0

Reply #48
Adding -ihs fixes that, indeed. I did read the takc help, but it wasn't clear to me that the -ihs parameter is mandatory when piping.
Why not make it set that parameter automatically? Or is it something specific to the way foobar pipes the file?

CUETools still doesn't work, though, but only with non-ANSI names.

One more thing: is the way TAK works suitable for a (future) GPU implementation? I remember bryant saying that WavPack, for example, isn't.

TAK 2.3.0

Reply #49
Adding -ihs fixes that, indeed. I did read the takc help, but it wasn't clear to me that the -ihs parameter is mandatory when piping.
Why not make it set that parameter automatically? Or is it something specific to the way foobar pipes the file?

Because it's possible that a program writes a valid wave header with valid size data to the pipe. Then you would want to store it in the TAK file to be able to restore the original file with bit-identical metadata. With -ihs applied, TAK saves no header and creates its own on decoding, which can differ from the original.
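
Roughly speaking (a sketch only; the piping source and file names below are placeholders):

Code: [Select]
rem header size unknown or zero (e.g. fb2k piping): tell takc to ignore it
source.exe | takc -e -ihs - out.tak
rem header already carries a valid size: omit -ihs so it is stored and the original can be restored bit-identically
source.exe | takc -e - out.tak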

One more thing: is the way TAK works suitable for a (future) GPU implementation? I remember bryant saying that WavPack, for example, isn't.

At least as well as FLAC. Basically TAK is using the same kind of prediction filter as FLAC and the asymmetric mode of MPEG-4 ALS and LPAC. Possibly it can be implemented more efficiently, because it only requires 16 * 16 bit integer multiplications with a 32-bit accumulator. But I don't know if current GPUs provide appropriate instructions to take advantage of the simpler arithmetic.

But I wouldn't expect a more extensive evaluation of compression parameters to yield a similar compression advantage for TAK as it does for FLAC. In most of my evaluations TAK's fast heuristics came very close to a brute-force approach that tries most of the possible parameter combinations.