HFE

Last week I had the pleasure of attending the 20th annual iPres conference on Digital Preservation in Ghent, Belgium. I enjoyed hearing from many of my respected colleagues on many aspects of preservation, including one of my favorite topics, floppy disks. There were tutorials, lightning talks, and even a workshop, presented by Leontien Talboom, Elizabeth Kata, Chris Knowles, and myself. We titled the workshop “A Guide to Imaging Obscure Floppy Disk Formats“. The workshop was conceived out of a mutual interest in imaging Wang 5.25in word processor disks, but expanded to include imaging of Amstrad 3in disks, 240K Brother typewriter disks, and Macintosh 400/800k disks.

I brought my hand-soldered FluxEngine board and others brought their Greaseweazle boards to show off how imaging obscure and uncommon disks can be done on a budget.

Image taken during workshop on a Mavica FD200 Floppy Disk Camera.

During the conference we talked a bit about the different types of hardware that can be used and the difference between a disk image and a flux image. There is quite an exhaustive list of different disk image file formats, some specific to a platform and others more generic. I recently did a blog post on the formats used by the Applesauce software, which have some unique features.

There are many disk image types which should be researched and added to PRONOM and other format description sites, but today let’s take a look at a generic format used by many tools.

The HxC Floppy Emulator file format, which uses the extension HFE, is a popular format for use with floppy drive emulators. There is a lot of complexity in what is included in many of these image formats: some are simply a raw sector representation of the binary data on a disk, others contain the complete flux readings from a floppy disk. The HFE format contains a little more than a raw image, including a header, a track lookup table, and the bitstreams for each track, all with the purpose of emulating the physical media. The HFE format contains only a single pass over the data, where other formats may contain multiple readings of each track to get more complete data, which can be helpful for damaged or purposely copy-protected disks. You can read more on Ashley’s blog or in the Library of Congress format description.

HFE version list

When using the HxC Floppy Emulator software, you can open and save to many different formats, the main one being their native HFE format, which comes in five versions.

hexdump -C test01.hfe | head
00000000 48 58 43 50 49 43 46 45 00 53 02 00 e8 01 00 00 |HXCPICFE.S......|
00000010 07 01 01 00 ff ff ff ff ff ff ff ff ff ff ff ff |................|
00000020 ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff |................|

Above is a hexdump of the main SDCard HxC Floppy Emulator file format. The format specification shows the 8-byte header “HXCPICFE”. This is a distinctive pattern and should be all we need to make a robust signature for the format, but we do need to take into account the other HFE “versions” and see if they might clash or need to be identified separately.
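The fixed-size header also makes a quick parse easy. Below is a minimal Python sketch; the field names and offsets are my reading of the published picfileformatheader layout in the HxC documentation, so treat them as assumptions rather than a definitive map.

import struct

def read_hfe_header(path):
    # The first 20 bytes cover the fields we care about here.
    with open(path, "rb") as f:
        header = f.read(20)
    if header[:8] != b"HXCPICFE":
        return None
    revision, tracks, sides, encoding = header[8], header[9], header[10], header[11]
    bitrate_kbps, rpm = struct.unpack_from("<HH", header, 12)  # little-endian shorts
    return {"revision": revision, "tracks": tracks, "sides": sides,
            "track_encoding": encoding, "bitrate_kbps": bitrate_kbps, "rpm": rpm}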

hexdump -C test02-a2.hfe | head 
00000000 48 58 43 50 49 43 46 45 00 53 02 00 d0 03 00 00 |HXCPICFE.S......|
00000010 07 01 01 00 ff ff ff ff ff ff ff ff ff ff ff ff |................|
00000020 ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff |................|

The “A2” version of the format has the same header but some different bytes further into the file.

hexdump -C test03-rev2.hfe | head
00000000 48 58 43 50 49 43 46 45 01 53 02 00 00 00 00 00 |HXCPICFE.S......|
00000010 07 01 01 00 ff ff ff ff ff ff ff ff ff ff ff ff |................|
00000020 ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff |................|

The “Rev 2” version also has the same header, but if you look at the 9th byte you can see the value changed from 00 to 01, which, according to the specification, is the revision byte.

hexdump -C test04-rev3.hfe | head 
00000000 48 58 43 48 46 45 56 33 00 53 02 00 e8 01 00 00 |HXCHFEV3.S......|
00000010 07 01 01 00 ff ff ff ff ff ff ff ff ff ff ff ff |................|
00000020 ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff |................|

With “Rev 3” we see a change in the header to “HXCHFEV3”, which appears to be referred to as HFEv3.

hexdump -C test05-stream.hfe | head 
00000000 48 78 43 5f 53 74 72 65 61 6d 5f 49 6d 61 67 65 |HxC_Stream_Image|
00000010 00 00 00 00 00 00 00 00 00 18 00 00 00 02 00 00 |................|
00000020 00 1a 00 00 53 00 00 00 02 00 00 00 40 9c 00 00 |....S.......@...|
00000030 07 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|
00000040 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|

This last format seems to be a special HxC stream image.
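Putting the five dumps together, a throwaway sketch like the one below is enough to sort a file into one of the three header families. The magic strings are taken straight from the hexdumps above; a real PRONOM signature would express the same bytes declaratively.

def classify_hfe(path):
    with open(path, "rb") as f:
        head = f.read(16)
    if head.startswith(b"HXCPICFE"):
        # Byte 8 is the revision byte: 00 in v1 and the "A2" variant, 01 in "Rev 2".
        return "HFE v1 family (revision byte %d)" % head[8]
    if head.startswith(b"HXCHFEV3"):
        return "HFEv3"
    if head.startswith(b"HxC_Stream_Image"):
        return "HxC stream image"
    return "not an HFE file"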

It seems the best option is to make three signatures to identify the three main headers. Additional software can be used to further parse the disk image. If you would like to see some sample images, you can download a bunch here. You can also take a look at my GitHub repository to see additional samples and a proposed set of signatures.

ATRAC

The year was 2001 and I found myself in need of an audio player and recorder. I had been burning CDs for a few years; making mixed CDs was fun and convenient, but I needed more flexibility. After some research I decided on a device that was super popular outside the United States, but was gaining some loyal fans.

The Sony MZ-G750 MiniDisc device I chose could record in a standard high-quality mode through RCA, an optical digital cable, or an optional microphone mini-plug input. This model also had the LP2 and LP4 modes, which compressed the audio more heavily but could record up to 320 minutes on one MD disc.

Sony accomplished this by using a proprietary compression codec called ATRAC, or Adaptive TRansform Acoustic Coding. This compression format was used with the MiniDisc and other Sony devices like the flash memory Walkmans sold later.

I recorded and stored a lot of music on the few discs I purchased over the next year, but as you may have surmised, the iPod came out later that year. I waited a bit but eventually purchased the updated 10GB model, and the MiniDisc was only used to make a few recordings over the next little while.

As good as the MiniDisc was, the model I owned could record in a digital format but lacked the connections to transfer the audio to a computer, unless you used the optical cable and captured in real time to a computer with an optical input. This was by design; even when Sony put USB ports on later models, the software only allowed sending audio to the MiniDisc, not back from the device.

A few years back I heard of some work the community had done to bring MiniDiscs back from the shadows. Now there is a thriving market and some models can cost a pretty penny. With that came some great tools and the ability to copy from the device back to the computer. The only problem: my device lacked a USB port. I kept my eye out for a “good” deal on a NetMD MiniDisc device. It took some time, but I am happy to report I am now the proud owner of an MZ-N420D.

With a new USB-capable NetMD in hand, let’s take a look at the different ATRAC formats!

The most common ATRAC formats are the ATRAC3 versions which generally have the extension OMA or OMG. But let’s start with ATRAC1, the format used on my earlier MiniDisc device when captured in Standard Mode. Using the amazing https://webmd.pro/ tool, I was able to connect my new device and “archive” my disc.

hexdump -C Test1.aea | head
00000000 00 08 00 00 54 65 73 74 31 00 00 00 00 00 00 00 |....Test1.......|
00000010 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|
*
00000100 00 00 00 00 1e 01 00 00 02 00 00 00 00 00 00 00 |................|
00000110 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|
*
00000b50 0c a0 45 57 54 44 32 35 41 44 22 34 32 24 13 23 |..EWTD25AD"42$.#|
00000b60 32 23 22 12 11 11 11 11 76 18 69 75 f8 63 69 a7 |2#".....v.iu.ci.|
00000b70 a4 5d 46 22 45 36 1f 59 55 9d 41 55 19 51 45 17 |.]F"E6.YU.AU.QE.|
00000b80 45 14 55 38 c2 cb 2c b2 88 26 fd b2 17 b3 f0 0f |E.U8..,..&......|

ffprobe -i Test1.aea
[aea @ 0x7fc5e6c04fc0] Estimating duration from bitrate, this may be inaccurate
Input #0, aea, from 'Test1.aea':
Duration: 00:00:01.63, bitrate: 302 kb/s
Stream #0:0: Audio: atrac1, 44100 Hz, stereo, fltp, 292 kb/s

ATRAC1 files can have the AEA extension, which ffmpeg can decode, but MediaInfo doesn’t appear to have added support for the format. According to the decoder source, the magic numbers for the ATRAC1 format are “Magic is ‘00 08 00 00‘ in little-endian”. This pattern matches my files, but the recently added PRONOM fmt/1968 doesn’t match all the samples I have.

The magic numbers are too simple to be the only pattern used in a signature. The track title follows the magic numbers but is not static. Then there are quite a few zero bytes, like a lot. All the samples I have seem to have some data around offset 260, then more zero bytes until around the 2400 to 2800 byte offset range. I scanned all my samples through TrIDScan, and it looks like the only bytes in common are the magic header, lots of zeros, and a few strings.

	<GlobalStrings>
		<String>ED33</String>
		<String>EUD3</String>
		<String>FTDC</String>
		<String>T322</String>
		<String>TC32</String>
		<String>TC43</String>
		<String>UC22</String>
		<String>UED3</String>
		<String>VD33</String>
		<String>VETC</String>
		<String>WEDD</String>
	</GlobalStrings>

The ffmpeg libavformat code does tell us that at byte 264 there will be a 01 or 02, which indicates the number of channels. A 44.1 kHz sample rate is assumed and the bitrate is calculated from a constant multiplied by the number of channels, so there is not much else to identify common patterns. More testing is needed.
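Here is a small sketch of those checks in Python. The magic and the channel byte at offset 264 come from the libavformat probe mentioned above; the idea that the track title occupies the bytes between the magic and offset 260 is my own assumption from the hexdump.

def probe_aea(path):
    with open(path, "rb") as f:
        head = f.read(268)
    if len(head) < 268 or head[:4] != b"\x00\x08\x00\x00":
        return None
    # Title region is an assumption: NUL-terminated text right after the magic.
    title = head[4:260].split(b"\x00", 1)[0].decode("latin-1", errors="replace")
    channels = head[264]
    if channels not in (1, 2):
        return None
    return {"title": title, "channels": channels}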

ATRAC3 is what allowed my original MiniDisc device to record in LP2 and LP4, extending the recording time. This format was also how DRM was added, allowing files to be checked in and out between the device and computer while controlling their use. This was done with desktop software from Sony, originally under the title SonicStage, later incorporating OpenMG to manage the DRM. I used SonicStage to encode some audio into the OMG and OMA formats.

OpenMG format files

These are audio files which have been converted to ATRAC3 format and encrypted in OpenMG format, which is the copyright protection technology for audio contents specific to OpenMG (with the extension .omg).

hexdump -C 01-Untitled.omg | head
00000000 30 80 30 80 06 07 66 6f 70 65 6e 4d 47 02 02 03 |0.0...fopenMG...|
00000010 eb 04 14 01 0f 50 00 00 04 00 00 00 ba d0 90 49 |.....P.........I|
00000020 3d 7f 61 7b 91 c4 30 06 02 67 01 02 02 3f 00 06 |=.a{..0..g...?..|
00000030 02 68 01 02 04 00 59 47 80 02 01 00 02 03 02 03 |.h....YG........|
00000040 a0 02 02 01 80 02 01 00 00 00 04 08 f5 94 79 c9 |..............y.|
00000050 6b 78 75 22 04 84 00 59 5e 30 83 0b 71 39 e3 e8 |kxu"...Y^0..q9..|
00000060 27 29 00 00 00 00 00 00 00 00 26 e2 65 d0 de e0 |')........&.e...|
00000070 69 19 73 45 1c c4 3b 36 8d 02 3b 72 bd eb 84 df |i.sE..;6..;r....|
00000080 cd 20 4e 43 d3 e3 23 8a 3f 9e df 80 f1 86 d1 aa |. NC..#.?.......|
00000090 2b 93 bf 09 59 0d d6 8f 78 5d 45 3a 9f d8 79 8b |+...Y...x]E:..y.|

ffprobe -i /01-Untitled.omg
[oma @ 0x7fed2440e980] Format oma detected only with low score of 1, misdetection possible!
[oma @ 0x7fed2440e980] Couldn't find the EA3 header !
/01-Untitled.omg: Invalid data found when processing input

The good news is there appears to be a standard header for the OMG format, but ffmpeg assumes these are OMA files. It turns out OMG was the original form of the format but was replaced by OMA starting with SonicStage v2.1.

hexdump -C 01-Untitled.oma | head
00000000 65 61 33 03 00 00 00 00 17 76 54 49 54 32 00 00 |ea3......vTIT2..|
00000010 00 17 00 00 02 00 55 00 6e 00 74 00 69 00 74 00 |......U.n.t.i.t.|
00000020 6c 00 65 00 64 00 28 00 31 00 29 54 41 4c 42 00 |l.e.d.(.1.)TALB.|
00000030 00 00 11 00 00 02 00 55 00 6e 00 74 00 69 00 74 |.......U.n.t.i.t|
00000040 00 6c 00 65 00 64 54 58 58 58 00 00 00 17 00 00 |.l.e.dTXXX......|
00000050 02 00 4f 00 4d 00 47 00 5f 00 54 00 52 00 41 00 |..O.M.G._.T.R.A.|
00000060 43 00 4b 00 00 00 31 54 58 58 58 00 00 00 25 00 |C.K...1TXXX...%.|
00000070 00 02 00 4f 00 4d 00 47 00 5f 00 41 00 4c 00 42 |...O.M.G._.A.L.B|
00000080 00 4d 00 53 00 00 00 55 00 6e 00 74 00 69 00 74 |.M.S...U.n.t.i.t|
00000090 00 6c 00 65 00 64 54 58 58 58 00 00 00 23 00 00 |.l.e.dTXXX...#..|
*
00000c00  45 41 33 03 00 60 ff 80  00 00 00 00 01 0f 50 00  |EA3..`........P.|
00000c10  00 04 00 00 00 60 8a 07  e3 0a c9 91 63 46 c6 bc  |.....`......cF..|
00000c20  22 52 03 76 00 05 66 48  00 00 3b 86 00 00 00 00  |"R.v..fH..;.....|
00000c30  00 00 20 30 00 00 00 00  00 00 00 00 00 00 00 00  |.. 0............|
00000c40  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|

ffprobe -i 01-Untitled.oma
Input #0, oma, from '01-Untitled.oma':
Metadata:
title : Untitled(1)
album : Untitled
OMG_TRACK : 1
OMG_ALBMS : Untitled
OMG_ASGTM : 2366000
OMG_TIT2S : Untitled(1)
TLEN : 353000
Duration: N/A, start: 0.000000, bitrate: N/A
Stream #0:0: Audio: atrac3al ([34][0][0][0] / 0x0022), 44100 Hz, stereo, fltp

We learned from trying an OMG file in ffprobe that ffmpeg is looking for an EA3 header, which is found in this OMA file. Both of these formats should have a nice header to work from for a signature. In fact, there has already been a request and signature submitted for the OMA format. Mine are slightly different, but it only takes a small tweak to work with all my samples. Also, it seems the extension AA3 was used for a while before Sony settled on OMA. OMA files can contain a few different codec types:

ffprobe -i 02-Untitled.oma 
[oma @ 0x7fbc7ef047c0] Estimating duration from bitrate, this may be inaccurate
Input #0, oma, from '/Star Trek/02-Untitled.oma':
Metadata:
title : Untitled(2)
album : Star Trek
OMG_TRACK : 2
OMG_ALBMS : Star Trek
OMG_ASGTM : 2366000
OMG_TIT2S : Untitled(2)
TLEN : 27000
Duration: 00:00:27.21, start: 0.000000, bitrate: 193 kb/s
Stream #0:0: Audio: atrac3p ([1][0][0][0] / 0x0001), 44100 Hz, stereo, fltp, 192 kb/s
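As a quick wrap-up, here is a sketch that tells the two containers apart using nothing but the opening bytes seen in the hexdumps above; this is based on my samples, not on a specification.

def sniff_sony_atrac(path):
    with open(path, "rb") as f:
        head = f.read(16)
    if head.startswith(b"ea3\x03"):
        return "OMA / AA3 (starts with an ID3-style 'ea3' tag)"
    if head.startswith(b"\x30\x80\x30\x80") and b"openMG" in head:
        return "OMG (OpenMG-encrypted ATRAC3)"
    return "unknown"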

I’ll leave the technical properties to be handled by tools more suited to parsing the format, like ffmpeg. Maybe MediaInfo could have the formats added, but until then, it might be best to simply identify the main format. I am also aware of some later additions to the ATRAC family, such as ATRAC3plus, ATRAC Advanced Lossless, and ATRAC9 (WAV RIFF). There are other extensions out there, like AT3, which use the ATRAC codec on devices like Sony’s PlayStation and PSP. I will have to keep my eyes out for the even more elusive Hi-MD MiniDisc devices to find out more. For now, take a look at some samples and my proposal for signatures on my GitHub.

ePic

Image compression has been around for a while. It seems everyone took a crack at making better algorithms to improve quality and size. Some chose to invent new ways and others chose to use existing methods but with their own flair. Kodak tried this with their PhotoCD, but there were a couple of other photo processing options that popped up in the 90’s. One was Seattle FilmWorks and another was Konica PC PictureShow. Both used “proprietary” formats to deliver developed film on disk.

Seattle FilmWorks, later called PhotoWorks, used an image format with the extension SFW that was based on BMP and JPG, but with their own twist. The same goes for the format used by Konica’s PC PictureShow.

Konica PC PictureShow Disk

If you took your film in to be developed at one of Konica’s photo labs, you could have those images put on a diskette or, later, a CD-R. The disks came with software to view your photos called PC PictureShow. The images stored on disk were in another proprietary format with the extension KQP. The KQP format was actually licensed from another company called Pegasus Imaging Corporation, later known as Accusoft. They developed their own way to compress a JPEG file, which they called ePic. An SDK called PICTools was offered for many years, but seems not to be available anymore.

ePIC (Proprietary)
  • Supports PIC format compression, replacing the JPEG Huffman encoder with the proprietary ELS entropy encoder for 15% more compression.
  • Can be losslessly converted back to JPEG format using Op_RORE.

A search on the internet for Konica KQP shows quite a few people over the years wondering what to do with their old disks and how to convert the old format to JPG, only to find a lack of information and available tools to do so. One such person used Python to edit the file, making it renderable as a JPG. While the method worked well for their KQP files, it might not work for all of them. Let’s look closer and understand why.

hexdump -C Sample.PIC | head
00000000 42 4d 00 00 00 00 00 00 00 00 42 04 00 00 44 00 |BM........B...D.|
00000010 00 00 34 08 00 00 24 fa ff ff 01 00 18 00 4a 50 |..4...$.......JP|
00000020 45 47 00 00 00 00 00 00 00 00 00 00 00 00 fc 00 |EG..............|
00000030 00 00 ec 00 00 00 2c 00 00 00 18 00 00 00 00 00 |......,.........|
00000040 00 00 02 00 00 00 08 00 00 00 01 00 00 00 01 00 |................|
00000050 00 00 60 00 00 00 00 00 60 00 00 60 00 00 00 00 |..`.....`..`....|
00000060 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|

At first glance the file appears to be a Bitmap (BMP), and it does have a Bitmap header claiming to have JPEG compression, but let’s look a little further into the file.

identify -verbose Sample.PIC   
identify: length and filesize do not match `Sample.PIC' @ error/bmp.c/ReadBMPImage/950.
identify: unrecognized compression `Sample.PIC' @ error/bmp.c/ReadBMPImage/1019.

hexdump -C Sample.PIC
00000000 42 4d 00 00 00 00 00 00 00 00 42 04 00 00 44 00 |BM........B...D.|
00000010 00 00 34 08 00 00 24 fa ff ff 01 00 18 00 4a 50 |..4...$.......JP|
00000020 45 47 00 00 00 00 00 00 00 00 00 00 00 00 fc 00 |EG..............|
00000030 00 00 ec 00 00 00 2c 00 00 00 18 00 00 00 00 00 |......,.........|
00000040 00 00 02 00 00 00 08 00 00 00 01 00 00 00 01 00 |................|
00000050 00 00 60 00 00 00 00 00 60 00 00 60 00 00 00 00 |..`.....`..`....|
00000060 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|
*
00000400 00 00 60 00 00 00 00 00 60 00 00 60 00 00 00 00 |..`.....`..`....|
00000410 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|
*
00000440 00 00 ff d8 ff e0 00 10 4a 46 49 46 00 01 02 02 |........JFIF....|
00000450 00 00 00 00 00 00 ff e1 00 0a 50 49 43 00 01 19 |..........PIC...|
00000460 1e 01 ff c0 00 11 08 05 dc 08 34 03 01 11 00 02 |..........4.....|

We find a JPG marker; in fact, almost the whole JPG file is included, except the quantization tables for luminance and chrominance, which are needed to properly display the image. This is the area the Pegasus company thought they could encode better to further compress the image. Their method was to use a new algorithm called ELS (Entropy Logarithmic-Scale). This method was used by the PICTools software to make a Pegasus PIC file, while Konica used it for their KQP format; the two are identical. By choosing the luminance and chrominance values during compression, you could make a highly compressed image, but it required specific software to render.

Pegasus also made use of a special custom APP marker (PIC) within the JPEG structure of the PIC/KQP, and also within any JPG compressed using their software. This marker, which takes up around 8 bytes, holds the luminance and chrominance values. Take the above sample for instance: it compresses the image with a luminance of 25 and a chrominance of 30; these are integer values, and in hex they would be “19” and “1E” respectively.

hexdump -C Sample.PIC      
00000440 00 00 ff d8 ff e0 00 10 4a 46 49 46 00 01 02 02 |........JFIF....|
00000450 00 00 00 00 00 00 ff e1 00 0a 50 49 43 00 01 19 |..........PIC...|
00000460 1e 01 ff c0 00 11 08 05 dc 08 34 03 01 11 00 02 |..........4.....|
00000470 11 01 03 11 01 ff c4 00 51 00 01 00 03 01 00 00 |........Q.......|

So in theory one could strip out any part of the file before the JPG beginning of file magic bytes (FF D8 FF E0), locate the APP marker, use the values to generate the two quantization tables, insert them in the appropriate spot and save out a JPG file.
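Here is what that theory looks like as a rough Python sketch. Finding the JPEG start bytes and the “PIC” APP marker follows the dump above; the exact position of the two values inside the marker is my reading of these samples, not a documented layout.

def read_pic_marker(path):
    with open(path, "rb") as f:
        data = f.read()
    soi = data.find(b"\xff\xd8\xff\xe0")           # embedded JPEG SOI + JFIF APP0
    if soi < 0:
        return None
    app = data.find(b"\xff\xe1", soi)               # APP marker holding "PIC"
    if app < 0 or data[app + 4:app + 7] != b"PIC":
        return None
    # In the sample above the bytes after "PIC\x00\x01" are 0x19 and 0x1E (25 and 30).
    luminance, chrominance = data[app + 9], data[app + 10]
    return {"jpeg_offset": soi, "luminance": luminance, "chrominance": chrominance}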

This may work for the first few versions of the ePic format, but later versions got more complicated. It seems a “PIC2” version replaced the earlier ones, and this format is a little harder to unpick.

hexdump -C Sample.KQP | head
00000000 50 49 43 32 01 08 00 00 00 64 00 01 00 b9 3e 00 |PIC2.....d....>.|
00000010 00 05 08 00 00 00 4a 50 47 45 03 00 00 00 16 24 |......JPGE.....$|
00000020 00 00 00 43 6f 6d 70 72 65 73 73 69 6f 6e 20 62 |...Compression b|
00000030 79 20 50 65 67 61 73 75 73 20 49 6d 61 67 69 6e |y Pegasus Imagin|
00000040 67 20 43 6f 72 70 2e 06 68 3e 00 00 ff d8 ff e0 |g Corp..h>......|
00000050 00 10 4a 46 49 46 00 01 01 00 00 01 00 01 00 00 |..JFIF..........|
00000060 ff e1 00 16 50 49 43 00 03 00 00 01 00 00 00 00 |....PIC.........|
00000070 00 00 00 00 00 00 00 00 ff db 00 84 00 0f 0a 0a |................|
00000080 0a 0a 06 0f 0a 0a 0a 0f 0f 0f 0f 14 1e 14 14 14 |................|
00000090 14 14 28 1e 1e 19 1e 2d 28 32 32 2d 28 2d 2d 32 |..(....-(22-(--2|

Instead of the Bitmap (BMP) header, a proprietary PIC2 header is used, still containing a JPG in the JFIF format along with the PIC APP marker, but encoded in a way that means the simple method of adding a quantization table may not work. With the original format the JPG and the PIC/KQP were approximately the same size; this new version significantly reduces the size of the PIC/KQP in comparison with the JPG.
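For identification purposes the two flavours can at least be told apart before any conversion is attempted, something like the sketch below. The byte positions mirror the sample dumps above; whether every file follows them is another matter.

def epic_flavour(path):
    with open(path, "rb") as f:
        head = f.read(64)
    if head.startswith(b"BM") and head[30:34] == b"JPEG":
        return "ePic v1 (BMP header with a 'JPEG' compression field)"
    if head.startswith(b"PIC2"):
        return "ePic PIC2"
    return "unknown"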

The ELS compression technology used in the ePic format seems to be patented by Pegasus and Accusoft, but it is not entirely hidden, as the libavcodec library includes an ELS decoder. It might be a fun project to use that code to decode the PIC/KQP formats fully.

In the meantime, a signature identifying the two versions should be added to PRONOM. Check out my proposal on my GitHub. If you need to convert your KQP or PIC files back to JPG here are a few links:

Konica PC PictureShow Version 4 (PIC2)

Accusoft PICTools Apollo Demo (Windows 7 Compatible)

Konica PC PictureShow for Macintosh

Interactive Quicktime

One of my favorite legacy formats to explore is any type of multimedia CD-ROM. The 1990’s and early 2000’s were filled with all sorts of multimedia for CD, web, and television. It is also one of the most difficult formats to try and preserve for the future. Many CD-ROMs are filled with executables and/or Macromedia Director media, later adding Flash content. The operating systems and security needs of today make playback almost impossible. For this reason many have built emulation services to mimic the original operating system and software, allowing the many historic multimedia CD-ROMs to once again interact with the user in a way many current systems still struggle with.

Many CD-ROMs would come as hybrid discs, allowing them to be used on both Windows and Macintosh systems, sometimes providing two different experiences. Then there were CD-Extra or Enhanced CDs, which added a separate session to an Audio CD containing bonus content playable only on a computer.

For fun I took a look back at some of my older Audio CD titles. I came across a couple, one claiming to be a “CD-Extra” and another an “Enhanced CD“. The CD-Extra disc, when queried with cd-info, claimed to have 12 tracks, with the 12th being a data XA track.

Disc mode is listed as: CD-ROM Mixed
CD-ROM Track List (1 - 12)
#: MSF LSN Type Green? Copy? Channels Premphasis?
1: 00:02:00 000000 audio false no 2 no
2: 02:13:66 009891 audio false no 2 no
3: 05:21:28 023953 audio false no 2 no
4: 08:18:19 037219 audio false no 2 no
5: 12:28:37 055987 audio false no 2 no
6: 16:11:58 072733 audio false no 2 no
7: 19:21:56 086981 audio false no 2 no
8: 23:17:49 104674 audio false no 2 no
9: 26:01:17 116942 audio false no 2 no
10: 28:30:02 128102 audio false no 2 no
11: 31:07:70 139945 audio false no 2 no
12: 37:29:46 168571 XA true no
170: 51:35:07 231982 leadout (520 MB raw, 516 MB formatted)
CD Analysis Report
CD-Plus/Extra
session #2 starts at track 12, LSN: 168571

Mounting the 12th track showed a mix of Macromedia Director (.DIR) files and quite a few Quicktime MOV movies. Playback was not possible on my current computer so I had to resort to using an emulator to experience this bonus content, full of band member photos and biographies.

The other disc I pulled out to explore was a bit different. Using cd-info the disc looked very similar:

Disc mode is listed as: CD-ROM Mixed
CD-ROM Track List (1 - 13)
#: MSF LSN Type Green? Copy? Channels Premphasis?
1: 00:02:00 000000 audio false no 2 no
2: 04:20:08 019358 audio false no 2 no
3: 08:04:27 036177 audio false no 2 no
4: 11:15:62 050537 audio false no 2 no
5: 14:54:32 066932 audio false no 2 no
6: 19:57:73 089698 audio false no 2 no
7: 26:12:36 117786 audio false no 2 no
8: 29:51:59 134234 audio false no 2 no
9: 34:44:00 156150 audio false no 2 no
10: 39:36:62 178112 audio false no 2 no
11: 42:06:01 189301 audio false no 2 no
12: 45:42:26 205526 audio false no 2 no
13: 57:10:54 257154 XA true no
170: 72:56:67 328117 leadout (735 MB raw, 730 MB formatted)
CD Analysis Report
CD-Plus/Extra
session #2 starts at track 13, LSN: 257154

The discs, even though they were labeled CD-Extra and Enhanced CD, had the same structure and format. The difference was in the type of multimedia used. There was a simple application which launched Quicktime and loaded a single MOV movie. But this was not your regular Quicktime movie; this was a highly complex interactive Quicktime movie.

The Quicktime movie could only be launched from an older operating system using Quicktime 6, and on the Macintosh, only with a PPC CPU. The movie would launch with an interactive menu, allowing navigation as you might find on a DVD or a Flash website, but all within a single MOV file. When I ran MediaInfo on the MOV file I got back quite a few tracks:

<media ref="/Volumes/VOLCANOECD/ALECD.mov">
<track type="General">
<VideoCount>10</VideoCount>
<AudioCount>1</AudioCount>
<OtherCount>51</OtherCount>
<FileExtension>mov</FileExtension>
<Format>QuickTime</Format>
<Format_Settings>Compressed header</Format_Settings>

Ten video tracks and 51 other tracks. Exploring with Quicktime, I could see the entire list of embedded content:

Quicktime movies, an audio track, dozens of Flash tracks, photos, animations, and sprites, with the possibility of more. These types of Quicktime files had requirements in order to run, with Quicktime 6 being the last version which could play back all the content correctly. Current versions of Quicktime give a warning about the lack of compatibility.

This interactive Quicktime movie proudly claims “Made with LiveStage Pro“, which was an authoring environment for Quicktime made by Totally Hip Software Inc. The company started in 1995, but seemed to disappear after 2004 with no new development, and by 2014 the website went offline.

If you would like to see a couple of simple examples created by Apple, see here.

LiveStage Pro was a very powerful authoring tool in its time; another similar tool called Electrifier competed for the interactive Quicktime market, and Adobe GoLive also competed but offered fewer features. The final Quicktime movie exported from LiveStage Pro was the main deliverable, but the software did save a project format with the extension “LSD”. Versions 2 through 4 of LiveStage Pro have a similar header.

hexdump -C LiveStagePro4-s01.lsd | head
00000000 4c 53 41 46 00 00 00 04 00 00 09 16 00 00 00 00 |LSAF............|
00000010 00 00 00 00 00 00 00 00 00 00 09 0a 73 65 61 6e |............sean|
00000020 00 00 00 01 00 00 00 03 00 00 00 00 00 00 00 18 |................|
00000030 56 53 4e 6e 00 00 00 01 00 00 00 00 00 00 00 00 |VSNn............|
00000040 00 00 00 04 00 00 08 84 4d 50 52 4e 00 00 00 01 |........MPRN....|
00000050 00 00 00 49 00 00 00 00 00 00 00 21 6d 4f 55 54 |...I.......!mOUT|
00000060 00 00 00 01 00 00 00 00 00 00 00 00 55 6e 74 69 |............Unti|
00000070 74 6c 65 64 2e 6d 6f 76 00 00 00 00 18 57 6c 65 |tled.mov.....Wle|
00000080 66 00 00 00 01 00 00 00 00 00 00 00 00 00 00 00 |f...............|
00000090 00 00 00 00 18 57 74 6f 70 00 00 00 01 00 00 00 |.....Wtop.......|

All the samples from versions 2 through 4 have the first four bytes as “LSAF“. It also seems the next four bytes may be version related. Version 1, however, has a different header.

hexdump -C contest.lsd | head
00000000 4c 53 50 72 00 00 00 08 00 00 00 00 00 00 02 80 |LSPr............|
00000010 01 e0 00 00 00 00 02 58 00 00 00 01 00 00 00 01 |.......X........|
00000020 00 00 00 02 00 00 00 00 00 08 00 00 00 00 00 00 |................|
00000030 00 00 08 53 02 d9 ff c9 04 76 02 97 01 00 44 00 |...S.....v....D.|
00000040 0b 02 fb 03 c9 00 00 00 01 00 00 00 01 00 00 00 |................|
00000050 00 07 41 63 74 69 6f 6e 73 00 00 00 00 00 00 00 |..Actions.......|
00000060 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|
00000070 00 00 00 00 00 00 00 00 05 00 00 00 01 50 49 43 |.............PIC|
00000080 54 ff ff 00 00 c1 ff 03 72 65 64 65 6e 6e 41 79 |T.......redennAy|
00000090 98 05 41 77 78 00 00 01 7a 00 10 00 00 31 fc 30 |..Awx...z....1.0|
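Both headers are distinctive enough for a quick check, as in the sketch below; the idea that the four bytes after “LSAF” hold a version number is only a guess from the one version-4 sample.

import struct

def sniff_lsd(path):
    with open(path, "rb") as f:
        head = f.read(8)
    if head.startswith(b"LSAF"):
        maybe_version = struct.unpack(">I", head[4:8])[0]   # 4 in the LiveStage Pro 4 sample
        return "LiveStage Pro project (v2-4 header, possible version %d)" % maybe_version
    if head.startswith(b"LSPr"):
        return "LiveStage Pro v1 project"
    return "unknown"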

Identification of a LiveStage project should be simple enough, but identifying and rendering a Quicktime movie made by this software takes some work. In fact, there are many “Enhanced CD” and CD-Extra titles out there with quite a few system requirements. If we are not careful, many of these little gems might become more difficult to experience or be lost completely.

If you would like to explore the Quicktime movie from the Enhanced CD mentioned here, send me a message. You can also take a look at my signature proposal and sample files on my GitHub for LiveStage.

SDIF

I have used and researched a lot of audio editing software. Some are very simple and straightforward; others are feature-rich and take some time to learn. While looking into a format, I came across some audio software which was nothing like I had used before. At first I was confused; I figured it would be simple to open a certain file format and play the audio. Not so fast.

Max is software which proudly describes itself as an “infinitely flexible space to create your own interactive software”. Created by Cycling ’74, Max has been around for a while, with development starting in the mid 1980’s. It allows the user to make “patches”, stringing together components and effects to accomplish a near-infinite number of options and outcomes.

The software produces simple project files and patch files, but they are just JSON data, at least in the latest version. When working with audio files, though, the software can save to a number of formats.

One of the options is a format called “SDIF”, which stands for “Sound Description Interchange Format“. SDIF was jointly developed by IRCAM and CNMAT, with proposals starting back in the mid-1990’s. Originally written as a Spectral Description, it was later changed to refer to a Sound Description.

The specification states the general idea was to “store information related to signal processing and specifically of sound, in files, according to a common format to all data types. Thus, it is possible to store results or parameters of analyses, syntheses…” So it is not exactly the same as a simple WAVE file you can open and edit; this format was meant to store signal data for analysis.

Each SDIF file consists of a header and then a succession of frames, not unlike the chunks in the IFF/AIFF/RIFF formats, ordered in time. Each frame matrix declares a “Type”, which can be a combination of many options. Let’s take a look at an SDIF file:

hexdump -C test.sdif | head
00000000 53 44 49 46 00 00 00 08 00 00 00 03 00 00 00 01 |SDIF............|
00000010 31 54 52 43 00 00 00 20 00 00 00 00 00 00 00 00 |1TRC... ........|
00000020 00 00 00 01 00 00 00 01 31 54 52 43 00 00 00 04 |........1TRC....|
00000030 00 00 00 00 00 00 00 04 31 54 52 43 00 00 00 c0 |........1TRC....|
00000040 3f 74 7a e1 40 00 00 00 00 00 00 01 00 00 00 01 |?tz.@...........|
00000050 31 54 52 43 00 00 00 04 00 00 00 0a 00 00 00 04 |1TRC............|
00000060 3f 80 00 00 45 95 35 c3 00 00 00 00 00 00 00 00 |?...E.5.........|
00000070 40 00 00 00 46 06 e2 14 00 00 00 00 00 00 00 00 |@...F...........|
00000080 40 40 00 00 45 3b 42 3d 00 00 00 00 00 00 00 00 |@@..E;B=........|
00000090 40 80 00 00 43 5d 94 7b 00 00 00 00 00 00 00 00 |@...C].{........|

This test file has the opening frame “SDIF“, to identify it as an SDIF, then a reference to the type “1TRC“. I would try and explain a Matrix 1TRC Sinusoidal Track, but I have no idea what it means. Something, something sine wave, etc. Someone much smarter than me can make use of this format. Here are a couple of examples of SDIF with other frame types.

hexdump -C angry_cat.part.sdif| head
00000000 53 44 49 46 00 00 00 08 00 00 00 03 00 00 00 01 |SDIF............|
00000010 31 4e 56 54 00 00 00 88 ff ef ff ff ff ff ff ff |1NVT............|
00000020 ff ff ff fd 00 00 00 01 31 4e 56 54 00 00 03 01 |........1NVT....|
00000030 00 00 00 61 00 00 00 01 53 74 72 65 61 6d 49 44 |...a....StreamID|
00000040 09 30 0a 44 61 74 65 09 54 68 75 5f 41 75 67 5f |.0.Date.Thu_Aug_|
00000050 5f 33 5f 32 31 2e 33 32 2e 34 35 5f 32 30 30 30 |_3_21.32.45_2000|
00000060 5f 0a 54 61 62 6c 65 4e 61 6d 65 09 53 69 6e 75 |_.TableName.Sinu|
00000070 73 6f 69 64 61 6c 54 72 61 63 6b 73 0a 57 72 69 |soidalTracks.Wri|
00000080 74 74 65 6e 42 79 09 50 6d 5f 56 65 72 73 69 6f |ttenBy.Pm_Versio|
00000090 6e 5f 31 2e 32 2e 32 0a 00 00 00 00 00 00 00 00 |n_1.2.2.........|

hexdump -C cymbalum-c4.res.sdif| head
00000000 53 44 49 46 00 00 00 08 00 00 00 03 00 00 00 01 |SDIF............|
00000010 31 52 45 53 00 00 0d 20 00 00 00 00 00 00 00 00 |1RES... ........|
00000020 00 00 00 04 00 00 00 01 31 52 45 53 00 00 00 04 |........1RES....|
00000030 00 00 00 d0 00 00 00 04 42 49 27 7a 39 59 fc ab |........BI'z9Y..|
00000040 3d 35 06 c9 00 00 00 00 42 6e 68 68 39 63 99 b1 |=5......Bnhh9c..|
00000050 3e 25 f7 c0 00 00 00 00 42 c6 02 bb 39 8c 31 79 |>%......B...9.1y|
00000060 3f bb 7e 6e 00 00 00 00 43 01 82 96 3a 1d 36 44 |?.~n....C...:.6D|
00000070 3e d9 21 12 00 00 00 00 43 07 35 f0 3a 20 6f 6e |>.!.....C.5.: on|
00000080 3f 02 32 7f 00 00 00 00 43 30 84 0b 39 97 f9 1b |?.2.....C0..9...|
00000090 3e c6 43 c7 00 00 00 00 43 4d e4 e4 39 88 14 90 |>.C.....CM..9...|

Unfortunately, the common tools I use to explore AV formats don’t seem to work on this format. MediaInfo, FFprobe, and ExifTool all give me unknown file warnings. So I had to compile the SDIF software in order to get some details.

querysdif angry_cat.part.sdif 
Header info of file angry_cat.part.sdif:

Format version: 3
Types version: 1

Ascii chunks of file angry_cat.part.sdif:

1NVT
{
StreamID 0;
Date Thu_Aug__3_21.32.45_2000_;
TableName SinusoidalTracks;
WrittenBy Pm_Version_1.2.2;
}

Data in file angry_cat.part.sdif (9504872 bytes):
1933 1TRC frames in stream 0 between time 0.000000 and 5.794875 containing
1933 1TRC matrices with 45 --400 rows, 4 -- 4 columns

An interesting thing is that an SDIF file can be in text form as well.

sdiftotext test.sdif 
SDIF


SDFC

1TRC 1 1 0
1TRC 0x0004 0 4

1TRC 1 1 0.005
1TRC 0x0004 10 4
1 4774.72 0 0
2 8632.52 0 0
3 2996.14 0 0
4 221.58 0 0
5 1943.02 0 0
6 123.951 0 0
7 6705.04 0 0
8 4304.97 0 0
9 3554.29 0 0
10 23.7822 0 0

1TRC 1 1 0.01
1TRC 0x0004 10 4
1 4774.72 0.0353114 2.06098
2 8632.52 0.00442518 0.68795
3 2996.14 0.0238517 -1.42295
4 221.58 0.0089712 -2.44141
5 1943.02 0.00768914 2.64629
6 123.951 0.0397061 -0.17527
7 6705.04 0.0245643 -0.168753
8 4304.97 0.00894803 1.45553
9 3554.29 0.0265175 2.57231
10 23.7822 0.0419019 -2.17731

1TRC 1 1 0.2
1TRC 0x0004 10 4
1 2284.56 0.02781 2.47054
2 4222.62 0.0151738 1.55309
3 31.1554 0.00421461 -0.657285
4 310.99 0.0122306 1.25794
5 215.192 0.0174093 1.25468
6 6253.69 0.000894192 2.21334
7 8533.32 0.0296167 2.07209
8 8044.77 0.0423002 2.54088
9 6087.45 0.0264733 -2.05523
10 7052.7 0.0287347 0.426339

1TRC 1 1 0.205
1TRC 0x0004 10 4
1 2284.56 0 0
2 4222.62 0 0
3 31.1554 0 0
4 310.99 0 0
5 215.192 0 0
6 6253.69 0 0
7 8533.32 0 0
8 8044.77 0 0
9 6087.45 0 0
10 7052.7 0 0

1TRC 1 1 0.21
1TRC 0x0004 0 4

ENDC
ENDF

An interesting format for sure. But wait, there is more!

My initial interest in this format came when I was given access to a set of MUBU files. I was unclear on how they were created at first, and it took me down a long path of learning about SDIF and the Max software from Cycling ’74 and IRCAM. MUBU turns out to be a toolbox for Max which adds more analysis features.

MUBU stands for MUlti-BUffer, which helps overcome some limitations. It is actually a container using the SDIF standard. Let’s take a look.

hexdump -C test.mubu | head
00000000 53 44 49 46 00 00 00 08 00 00 00 03 00 00 00 01 |SDIF............|
00000010 31 4e 56 54 00 00 00 78 ff ef ff ff ff ff ff ff |1NVT...x........|
00000020 ff ff ff fd 00 00 00 01 31 4e 56 54 00 00 03 01 |........1NVT....|
00000030 00 00 00 53 00 00 00 01 4d 75 42 75 2e 43 6f 6e |...S....MuBu.Con|
00000040 74 61 69 6e 65 72 2e 4e 75 6d 54 72 61 63 6b 73 |tainer.NumTracks|
00000050 09 31 0a 4d 75 42 75 2e 43 6f 6e 74 61 69 6e 65 |.1.MuBu.Containe|
00000060 72 2e 56 65 72 73 69 6f 6e 09 31 2e 35 0a 4d 75 |r.Version.1.5.Mu|
00000070 42 75 2e 43 6f 6e 74 61 69 6e 65 72 2e 4e 75 6d |Bu.Container.Num|
00000080 42 75 66 66 65 72 73 09 31 0a 00 00 00 00 00 00 |Buffers.1.......|
00000090 31 4e 56 54 00 00 00 38 ff ef ff ff ff ff ff ff |1NVT...8........|

A MUBU file has the same SDIF frame header, but also includes a “1NVT” frame, which is a Name Value Table. This is where the MuBu container is referenced, giving the file its own structure.

If I query the MuBu file like I did the SDIF, I get the following:

querysdif test.mubu
Header info of file test.mubu:

Format version: 3
Types version: 1

Ascii chunks of file test.mubu:

1NVT
{
MuBu.Container.NumTracks 1;
MuBu.Container.Version 1.5;
MuBu.Container.NumBuffers 1;
}
1NVT
{
MuBu.Buffer.Index 0;
}
1NVT
{
MuBu.Track.MxRows 2;
AudioFile 1;
MuBu.Track.NonNumType 0;
MuBu.Track.MaxSize 93515;
meta_ISFT Lavf60.16.100;
MuBu.Track.Name mytrack;
MuBu.Track.BufferIndex 0;
MuBu.Track.SampleRate 48000;
FileName Wilhelm_Scream.wav;
MuBu.Track.MxVarRows 0;
MuBu.Track.MxCols 1;
meta_MetaDataSource WAV;
MuBu.Track.EndTime 1623.5;
FilePath /;
MuBu.Track.SampleOffset 0;
MuBu.Track.TimeTags 0;
MuBu.Track.Size 77929;
MuBu.Track.Index 0;
}

1TYP
{
1MTD M000 {unnamed}
1FTD M000
{
M000 Track-0-MatrixData;
}
}

Data in file test.mubu (3741392 bytes):
77929 M000 frames in stream 0 between time 0.000000 and 1.623500 containing
77929 M000 matrices with 2 -- 2 rows, 1 -- 1 columns

The MuBu file contains one audio track and one buffer. This is a simple test file, but MuBu files can be quite large with multiple tracks.

Working with the Max software or OpenMusic is not something I found easy to understand. I am sure if I were more musically inclined, and with a little practice, I could make some of this work. For the time being, a signature to identify SDIF and MUBU files will have to do. Check out my GitHub for my proposed signature and a couple of examples.

Shorten

I was recently going through some of my old CD-Rs and came across this 11-year-old fun memory.

I remember going to this 2003 Toad the Wet Sprocket concert in Salt Lake City with some friends. I had seen the band perform before, but this was the first time I was able to get a recording of the show. Normally having a recording of a concert by a well-known band was a little shady, but some bands not only allow recording of their live concerts, they encourage it. There have been a few bands over the years with this philosophy; the one most have heard of is the Grateful Dead, and because of all the tape trading, the band’s numerous concerts will live on forever.

The scene of recording concerts is still alive and well, and if you are into recording and sharing, it is expected you share in a lossless audio format. Lossless audio is definitely a minority interest among all those who listen to music daily. Most of us have been placated with the infinite playlists on services like Apple Music, Spotify, and Amazon Music. Most probably don’t care about owning music anymore, but for the few who consider themselves audiophiles, having a lossless audio file is the only choice.

When it comes to formats, there are a few lossless options to choose from; they all come with some advantages as well as some downsides. WAV files contain the full PCM audio stream, and while internet bandwidth today can handle full uncompressed audio, it can still be beneficial to use some compression for archiving or sharing over the web.

The most common lossless format today is the Free Lossless Audio Codec, or FLAC, but there are also quite a few who like the Apple Lossless Audio Codec. Both offer many advantages, especially with metadata and cuesheets, and both can contain album cover art. But many years ago another lossless format was most often used with bootleg recordings and audio sharing.

Shorten was one of the first lossless audio formats, developed by Tony Robinson in 1993 for SoftSound. It could cut the size of a typical 16-bit WAV file in half. It achieved this by predicting each sample and compactly encoding the small differences that remain. Today FLAC and ALAC have replaced this format and offer improved features and support. Many audio players have dropped support for Shorten, making it difficult to use this old format.

The Shorten format uses the .SHN extension. It is one of the formats listed on the Library of Congress Sustainability of Digital Formats site with the ID fdd000199, although a couple of links don’t appear to work as it hasn’t been updated since 2011. Support for the format has ended, and many of the links found on various websites are broken, usually referencing the etree wiki, much of which is archived on the Internet Archive.

Let’s take a look at what makes up a lossless compressed SHN file. A quick look at a sample header:

hexdump -C test.shn | head
00000000  61 6a 6b 67 02 fb b1 70  09 f9 25 59 52 a4 d1 a8  |ajkg...p..%YR...|
00000010  dd cf 85 5a 01 57 a0 d5  a8 b6 6b 6d d2 41 10 80  |...Z.W....km.A..|
00000020  40 20 10 18 04 0a 01 44  d6 40 20 11 0d 8c 0a 01  |@ .....D.@ .....|
00000030  04 80 44 20 16 4b 0d d2  c3 b8 f8 55 a0 11 80 59  |..D .K.....U...Y|
00000040  98 56 1d b1 79 51 9f 39  f1 12 d2 d3 75 5c cd 08  |.V..yQ.9....u\..|
00000050  06 25 68 6b 52 5e 9f 4c  39 cd c1 32 c4 0d a9 b7  |.%hkR^.L9..2....|
00000060  69 34 56 f0 96 fa 46 89  a2 6e 8c ba d5 d0 58 de  |i4V...F..n....X.|
00000070  f5 44 5b aa 61 82 c7 85  88 37 d6 ee cb ab 4e 44  |.D[.a....7....ND|
00000080  91 19 b7 38 d4 20 ae 98  98 d1 2c 4a 4e 88 dd 3e  |...8. ....,JN..>|
00000090  36 68 1b 59 a8 7d 84 23  76 0a 84 21 a1 cd 80 8e  |6h.Y.}.#v..!....|

The first four bytes seem to be consistent among my samples. It makes me wonder if the ASCII values have something to do with the author, Anthony (Tony) J. Robinson. In the source code for the shorten software, the file shorten.h defines the ASCII string “ajkg” as the magic header for the SHN format; it is also found in current ffmpeg code. The common tools don’t have much to say about these files, though.

mediainfo test.shn 
General
Complete name                            : test.shn
Format                                   : Shorten
Format version                           : 2
File size                                : 3.17 MiB

Audio
Format                                   : Shorten
Compression mode                         : Lossless

ffprobe -i test.shn
Input #0, shn, from 'test.shn':
  Duration: N/A, start: 0.000000, bitrate: N/A
  Stream #0:0: Audio: shorten, 44100 Hz, 2 channels, s16p

Using the older SHNTOOL, we can get more information.

shntool info test.shn  
-------------------------------------------------------------------------------
File name:                    test.shn
Handled by:                   shn format module
Length:                       0:32.23
WAVE format:                  0x0001 (Microsoft PCM)
Channels:                     2
Bits/sample:                  16
Samples/sec:                  44100
Average bytes/sec:            176400
Rate (calculated):            176400
Block align:                  4
Header size:                  44 bytes
Data size:                    5697720 bytes
Chunk size:                   5697756 bytes
Total size (chunk size + 8):  5697764 bytes
Actual file size:             3325489
File is compressed:           yes
Compression ratio:            0.5836
CD-quality properties:
  CD quality:                 yes
  Cut on sector boundary:     no
  Sector misalignment:        1176 bytes
  Long enough to be burned:   yes
WAVE properties:
  Non-canonical header:       no
  Extra RIFF chunks:          no
Possible problems:
  File contains ID3v2 tag:    no
  Data chunk block-aligned:   yes
  Inconsistent header:        no
  File probably truncated:    unknown
  Junk appended to file:      unknown
  Odd data size has pad byte: n/a
Extra shn-specific info:
  Seekable:                   yes

Many Shorten audio files are found out there in archives and on file sharing sites, so even though the format isn’t used to create new files anymore, it will still be around for a while. My GitHub has my signature proposal and a couple of samples.

Canvas

When it comes to design software there have been many options over the years, many released with a lot of hype and others disappearing not long after release. Few lasted long enough to avoid being gobbled up by big names such as Adobe. One of those is Canvas by Deneba Systems.

First released in 1987, it is still available over at Canvas GFX. It’s amazing it was never bought by one of the big names (Adobe, Corel, Aldus, etc.); it remained under Deneba Systems until 2003, when it was bought by ACD Systems, which kept the name Deneba Canvas for a time. The later versions were not popular with everyone, and Mac support was dropped, but the software continued. A while back I was looking through a few of my old ZIP disks and found some software my father used in the mid 1980’s. He had a copy of Canvas version 2 for Macintosh. At that time I was more interested in playing games on our family’s Macintosh 128k than using design software.

Over the years I have come across many Canvas documents. With each version released, changes were made to the file format used to store the drawings and artwork. There were many file format changes, as well as changes to the extensions used with each version. Some are easily identifiable and others have some confusing structures. Let’s look into it.

Version                 Platform        Extension   Description
Canvas 1-3 & artWORKS   Macintosh       none        no strong pattern
Canvas 3.5              Mac & Windows   CVS         Similar to v1-3
Canvas 5                Mac & Windows   CV5         CANVAS5 string
Canvas 6-8              Mac & Windows   CNV         CANVAS6 string
Canvas 9-X              Mac & Windows   CVX         Similar to 6-8
Canvas Draw             Mac             CVD         Different than others
Canvas Image File       -               CVI         DAD5PROX

The first three versions of Canvas were Macintosh only and in those early days there was no extension, just a Type / Creator indicating to the Finder how to open them. Deneba Systems used the Creator codes DAD2, DAD5, through DADX.

The first versions are quite frustrating. I have gathered samples from versions 2, 3, and 3.5 and artWORKS version 1. Even with numerous samples, there are no patterns I can discern from them. I even reached out to the current Canvas X technical support for answers. They wanted to be helpful, but their answers didn’t offer much.

With “CVS” or ‘drw2’ for mac, the header contains ranges inside a structure, and other data like if it was compressed. When we see if it’s a valid file we check the ranges. There is no easy way to determine what hex values would be written because of flipping, Intel vs (PPC or 68K). Unfortunately, the research needed to identify the Hex value will require the original code for version 3.5 which we do not have access to easily. Canvas 3.5 code is 16 bit… this would also be an issue.

Let’s take a look at a couple samples:

hexdump -C Canvas2.1-Sample | head
00000000  00 00 03 06 00 00 3d 9c  00 00 00 2a 00 00 00 0a  |......=....*....|
00000010  00 00 00 76 00 00 00 36  00 00 00 2e 00 00 00 1e  |...v...6........|
00000020  00 00 00 12 00 00 00 42  00 00 00 1a 00 00 00 82  |.......B........|
00000030  00 00 00 3c 00 66 00 01  00 00 3d 9c 00 48 00 00  |...<.f....=..H..|
00000040  40 02 90 00 00 00 00 00  00 00 00 00 00 00 00 00  |@...............|
00000050  00 01 00 00 01 00 00 00  00 20 00 40 00 60 00 80  |......... .@.`..|
00000060  00 c0 01 40 01 80 01 c0  02 40 02 80 00 00 00 00  |...@.....@......|
00000070  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 05  |................|
00000080  00 00 00 00 00 01 00 10  00 00 00 01 00 03 3f fc  |..............?.|
00000090  80 00 00 00 00 00 00 00  00 07 00 01 00 01 00 0b  |................|

hexdump -C Canvas2-s02 | head
00000000  00 00 03 b2 00 00 07 ec  00 00 00 2a 00 00 00 0a  |...........*....|
00000010  00 00 00 76 00 00 00 36  00 00 00 2e 00 00 00 1e  |...v...6........|
00000020  00 00 00 12 00 00 00 42  00 00 00 1a 00 00 00 82  |.......B........|
00000030  00 00 00 3c 00 66 00 01  00 00 07 ec 00 48 00 00  |...<.f.......H..|
00000040  40 02 90 00 00 00 00 00  00 00 00 00 00 00 00 00  |@...............|
00000050  00 01 01 00 01 00 00 00  00 20 00 40 00 60 00 80  |......... .@.`..|
00000060  00 c0 01 40 01 80 01 c0  02 40 02 80 00 00 00 00  |...@.....@......|
00000070  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 05  |................|
00000080  00 00 00 00 00 01 00 10  00 00 00 01 00 03 3f fc  |..............?.|
00000090  80 00 00 00 00 00 00 00  00 07 00 01 00 01 00 0b  |................|

hexdump -C Canvas3.04 | head
00000000  00 00 02 5a 00 00 00 1c  00 00 00 2a 00 00 00 0a  |...Z.......*....|
00000010  00 00 00 76 00 00 00 36  00 00 00 2e 00 00 00 1e  |...v...6........|
00000020  00 00 00 12 00 00 00 42  00 00 00 1a 00 00 00 82  |.......B........|
00000030  00 00 00 3c 00 68 00 02  00 00 00 1c 00 48 00 00  |...<.h.......H..|
00000040  40 02 90 00 00 00 00 00  00 00 00 00 00 00 00 00  |@...............|
00000050  00 01 01 00 01 03 00 00  00 20 00 40 00 60 00 80  |......... .@.`..|
00000060  00 c0 01 40 01 80 01 c0  02 40 02 80 00 00 00 00  |...@.....@......|
00000070  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
00000080  00 01 00 00 00 01 00 10  00 00 00 01 00 03 3f fc  |..............?.|
00000090  80 00 00 00 00 00 00 00  00 07 00 01 00 01 00 0b  |................|

hexdump -C Canvas5-3.5-Sample1.CVS | head
00000000  00 00 01 58 00 00 01 30  00 00 00 2a 00 00 00 00  |...X...0...*....|
00000010  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
*
00000030  00 00 00 00 00 69 00 02  00 00 01 30 00 48 00 00  |.....i.....0.H..|
00000040  40 02 90 00 00 00 00 00  00 00 00 00 00 00 00 00  |@...............|
00000050  00 01 01 01 00 00 00 00  00 20 00 40 00 60 00 80  |......... .@.`..|
00000060  00 c0 01 40 01 80 01 c0  02 40 02 80 00 00 00 00  |...@.....@......|
00000070  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
00000080  00 01 00 00 00 01 00 10  00 00 00 01 00 03 3f fc  |..............?.|
00000090  80 00 00 00 00 00 00 00  00 07 00 01 00 01 00 01  |................|

hexdump -C C3-5-S01.CVS | head
00000000  78 11 00 00 10 00 00 00  2a 00 00 00 0a 00 00 00  |x.......*.......|
00000010  26 00 00 00 26 00 00 00  26 00 00 00 26 00 00 00  |&...&...&...&...|
00000020  96 00 00 00 2a 00 00 00  2e 00 00 00 32 00 00 00  |....*.......2...|
00000030  00 00 00 00 01 6b 01 00  50 14 00 00 28 00 00 00  |.....k..P...(...|
00000040  6e 00 00 00 5b 00 00 00  01 00 04 00 00 00 00 00  |n...[...........|
00000050  e8 13 00 00 12 0b 00 00  12 0b 00 00 00 00 00 00  |................|
00000060  00 00 00 00 00 00 00 00  00 00 80 00 00 80 00 00  |................|
00000070  00 80 80 00 80 00 00 00  80 00 80 00 80 80 00 00  |................|
00000080  c0 c0 c0 00 80 80 80 00  00 00 ff 00 00 ff 00 00  |................|
00000090  00 ff ff 00 ff 00 00 00  ff 00 ff 00 ff ff 00 00  |................|

In the version 2 & 3 samples you can see some patterns, which I thought would allow for proper identification, but looking at more samples I found differences. One pattern I was hopeful might be consistent was the hex values “002000400060008000C00140018001C002400280”, but there are some which don’t match this pattern. If the file is truly compressed, it will be hard to know which values would be consistent among all files. I have over 8,000 samples and have a signature that only excludes around 20, so it will have to do for now.
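One way to test that pattern is a quick scan like the sketch below. In these sample dumps the run sits near offset 0x58, but since some files lack it entirely, this is only a heuristic, not a signature.

CANVAS_RUN = bytes.fromhex("002000400060008000C00140018001C002400280")

def find_canvas_run(path, window=4096):
    # Returns the offset of the run within the first `window` bytes, or -1 if absent.
    with open(path, "rb") as f:
        head = f.read(window)
    return head.find(CANVAS_RUN)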

When we get to version 5 we find more identifiable headers, though there is some oddness with some samples. With an ASCII string like “CANVAS5”, it should be easy, right? Not so fast: in version 5 you can compress the file structure, which removes the easily identifiable “CANVAS5” string. Some compressed files have a small string at the tail end, but others do not.

hexdump -C Canvas5-Sample1.CV5 | head
00000000  02 00 00 80 00 00 00 00  00 00 00 4e 96 00 00 4e  |...........N...N|
00000010  96 18 02 00 00 00 0e a8  da 43 41 4e 56 41 53 35  |.........CANVAS5|
00000020  00 01 00 00 00 00 00 05  03 00 00 00 00 00 00 00  |................|
00000030  00 00 00 00 00 21 00 00  00 21 00 00 00 79 00 00  |.....!...!...y..|
00000040  00 03 00 00 01 6b 00 00  00 03 00 00 00 01 ff ff  |.....k..........|
00000050  ff ff ff ff ff ff ff ff  ff ff ff ff ff ff ff ff  |................|

hexdump -C Canvas5-Sample3-cmp.CV5 | head
00000000  02 00 00 80 00 00 00 00  08 00 00 80 00 00 00 03  |................|
00000010  5c ff ff ff ff 00 00 40  22 00 00 03 50 10 00 89  |\......@"...P...|
00000020  07 60 bd 0f f0 00 00 10  03 04 10 56 00 20 05 00  |.`.........V. ..|
00000030  e0 18 02 10 35 04 30 4e  05 30 72 07 f0 a8 0d a1  |....5.0N.0r.....|
00000040  17 11 81 19 05 50 5c 00  60 0f 00 10 80 02 90 80  |.....P\.`.......|
00000050  03 f0 56 05 50 55 05 b0  75 12 51 29 05 e0 55 05  |..V.PU..u.Q)..U.|

hexdump -C Canvas5-Sample3-cmp.CV5 | tail
00001ff0  00 00 00 01 08 a5 ab c0  00 00 00 00 3f 89 2c 58  |............?.,X|
00002000  00 00 00 00 08 a5 ab 80  00 00 00 00 ff d4 11 e4  |................|
00002010  00 00 00 00 08 a5 ab 90  00 02 3e d8 ff d3 12 cc  |..........>.....|
00002020  00 00 00 00 00 00 00 00  00 02 3e d8 00 01 00 09  |..........>.....|
00002030  00 00 00 00 00 00 00 00  00 00 00 00 08 a5 ab f8  |................|
00002040  00 00 00 00 43 4e 56 35                           |....CNV5|

Canvas 6 uses a new extension but has a similar file structure, with compression as an option. Some of the compressed files made on Windows have a reversed string, “5VNC“. So many compressed Canvas 5 files look identical to compressed Canvas 6 files, complicating identification.

hexdump -C Canvas6-Sample.CNV | head
00000000  01 00 80 00 00 90 07 cd  07 00 80 00 00 00 80 00  |................|
00000010  00 17 01 00 00 59 f5 0e  00 43 41 4e 56 41 53 36  |.....Y...CANVAS6|
00000020  00 01 00 00 00 00 06 00  00 00 00 00 00 00 00 00  |................|
00000030  00 00 00 00 00 21 7a 00  00 00 7a 00 00 00 03 00  |.....!z...z.....|
00000040  00 00 6e 01 00 00 03 00  00 00 01 00 00 00 ff ff  |..n.............|
00000050  ff ff ff ff ff ff ff ff  ff ff ff ff ff ff ff ff  |................|

hexdump -C Canvas6-Sample1-c.CNV | head
00000000  01 00 80 00 00 58 ea 2b  00 c2 1d 00 00 d0 09 00  |.....X.+........|
00000010  00 00 00 0f 2e 00 00 0b  07 00 00 09 c4 10 00 01  |................|
00000020  00 00 03 00 20 04 00 70  ff 00 80 05 00 c0 06 06  |.... ..p........|
00000030  50 20 03 00 0f 06 10 6b  00 a0 12 01 00 48 07 20  |P .....k.....H. |
00000040  6d 07 30 40 06 40 11 06  00 0b 05 00 10 00 10 71  |m.0@.@.........q|
00000050  01 40 21 00 00 59 01 00  0f 05 10 00 00 e1 14 00  |.@!..Y..........|

hexdump -C Canvas6-Sample1-c.CNV | tail
000016a0  00 00 00 12 f6 00 00 c0  f0 12 00 3c d0 80 7c 58  |...........<..|X|
000016b0  2f 14 00 00 00 00 00 bc  f4 8d 00 0f 00 00 00 00  |/...............|
000016c0  f1 12 00 7f 00 00 00 f8  2e 14 00 bc f4 8d 00 1c  |................|
000016d0  f2 12 00 04 f3 12 00 fc  d1 80 7c 09 04 00 00 00  |..........|.....|
000016e0  00 00 40 00 f2 12 00 ff  ff ff ff 00 f1 12 00 1c  |..@.............|
000016f0  f1 12 00 bc f4 8d 00 00  00 00 40 35 56 4e 43     |..........@5VNC|

While most have the “CANVAS6” string near the beginning, quite a few are missing the CNV5/5VNC string at the end. Instead, many have the string “%SI-0200” near the end, which I use in my signature suggestion. This structure remained the same from version 6 to 8.

hexdump -C Canvas8-S01.CNV | head
00000000  02 00 00 80 00 00 12 b8  80 00 00 11 19 00 00 11  |................|
00000010  19 18 02 00 00 00 0e f5  59 43 41 4e 56 41 53 36  |........YCANVAS6|
00000020  00 01 00 00 00 00 00 08  01 00 00 00 00 00 00 00  |................|
00000030  00 00 00 00 00 21 00 00  00 00 00 00 00 00 00 00  |.....!..........|
00000040  00 03 00 00 00 00 00 00  00 03 00 00 00 01 00 00  |................|
00000050  00 01 ff ff ff ff 00 00  00 02 00 00 00 02 00 00  |................|

But… there are plenty without these strings, just the “%SI-0200” near the end.

hexdump -C TELEGRPH.CNV | head
00000000  02 00 00 80 00 00 00 00  08 00 00 80 00 00 00 3d  |...............=|
00000010  f2 ff ff ff ff 00 00 75  76 00 00 3d e6 10 00 ff  |.......uv..=....|
00000020  00 00 b3 0d 90 a9 03 b0  8a 07 f0 98 07 60 80 08  |.............`..|
00000030  d0 35 01 c0 58 01 e0 59  04 80 b8 03 90 38 02 f0  |.5..X..Y.....8..|
00000040  e2 00 20 0b 03 70 1d 03  20 36 0f 30 00 01 80 09  |.. ..p.. 6.0....|

hexdump -C TELEGRPH.CNV | tail
00006850  2b 2c f9 ae 30 00 00 00  20 00 00 00 01 00 00 00  |+,..0... .......|
00006860  0f 00 00 00 10 00 00 00  1e 00 00 00 07 00 00 00  |................|
00006870  64 65 6e 65 62 61 00 00  00 00 01 4c 25 53 49 2d  |deneba.....L%SI-|
00006880  30 32 30 30 6d 61 63 00  00 00 00 00 00 00 00 00  |0200mac.........|
00006890  00 00 00 00                                       |....|

In version 9 and forward the extension changes to CVX, but the format is similar, with the “CANVAS6” string at a slightly different offset. It is still used with the current version of Canvas X.

hexdump -C Canvas9-Sample1.cvx | head
00000000  00 00 00 00 00 00 00 00  00 00 02 00 00 80 00 07  |................|
00000010  d1 84 d0 00 00 80 00 00  00 80 00 18 02 00 00 00  |................|
00000020  0f b7 ef 43 41 4e 56 41  53 36 00 01 00 00 00 00  |...CANVAS6......|
00000030  00 09 00 00 00 03 34 00  00 00 04 00 00 00 00 00  |......4.........|
00000040  00 00 00 3c 42 45 47 49  4e 5f 50 52 45 56 49 45  |...<BEGIN_PREVIE|
00000050  57 5f 54 41 47 3e 21 00  00 00 75 00 00 00 79 00  |W_TAG>!...u...y.|
00000060  00 00 03 00 00 01 6b 00  00 00 03 00 00 00 01 ff  |......k.........|
00000070  ff ff ff ff ff ff ff ff  ff ff ff ff ff ff ff ff  |................|

hexdump -C Canvas9-Sample1-compressed.cvx | tail
00004090  00 00 e0 20 00 57 80 00  00 00 00 00 0a 13 00 09  |... .W..........|
000040a0  00 00 04 00 00 00 00 01  00 00 00 00 bf ff e0 80  |................|
000040b0  bf ff e0 40 01 8c 5e 00  02 4a 22 d0 00 00 01 60  |...@..^..J"....`|
000040c0  bf ff e0 40 00 5c 08 18  00 00 00 00 00 0d 84 80  |...@.\..........|
000040d0  43 61 6e 76 61 73 39 2d  53 61 6d 70 6c 65 31 2d  |Canvas9-Sample1-|
000040e0  63 6f 6d 70 72 65 73 73  65 64 2e 63 76 78 00 18  |compressed.cvx..|
000040f0  bf ff e0 70 0a 12 6a a0  02 43 22 b4 00 0c aa 9c  |...p..j..C".....|
00004100  bf ff e0 80 00 00 00 01  00 00 00 00 00 0d 84 80  |................|
00004110  bf ff e0 b0 43 4e 56 35                           |....CNV5|

hexdump -C CanvasX2019-S01.cvx | head
00000000  00 00 00 00 00 00 00 00  00 00 01 00 80 00 00 00  |................|
00000010  6e ab 03 00 80 00 00 00  80 00 00 17 01 00 00 ef  |n...............|
00000020  b7 0f 00 43 41 4e 56 41  53 36 00 01 00 00 00 00  |...CANVAS6......|
00000030  09 00 00 4d 01 00 00 eb  4c 00 00 41 00 00 00 31  |...M....L..A...1|
00000040  52 45 56 03 00 00 00 01  00 00 00 00 00 00 00 00  |REV.............|
00000050  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|

This collection of file formats is very hard to make sense of: some really consistent patterns across many samples, but with lots of exceptions. Super confusing. The software has had a long run, with the later years staying pretty stagnant in terms of new development. It is worth defining and creating a signature for the consistent patterns now; the variants can be dialed in over time.

The signatures I have built miss about 23 files in versions 1-3 out of the ~9,000 samples I have, and for Canvas 5 only some of the compressed files currently go unidentified. So far all my CNV and CVX files identify correctly, so this is probably good for now.
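
To make that concrete, here is a rough Python sketch of the heuristics described above: a CANVAS5/CANVAS6 string near the start of the file, or one of the trailer strings for the compressed files. It only mirrors what I see in my own samples and is not the actual signature syntax.

from pathlib import Path

def sniff_canvas(path, head_len=64, tail_len=4096):
    """Rough heuristic for Canvas CNV/CVX files, mirroring the patterns above."""
    data = Path(path).read_bytes()
    head, tail = data[:head_len], data[-tail_len:]
    if b"CANVAS5" in head or b"CANVAS6" in head:
        return "Canvas document (CANVAS5/CANVAS6 header string)"
    if tail.endswith(b"CNV5") or tail.endswith(b"5VNC"):
        return "Canvas document, likely compressed (CNV5/5VNC trailer)"
    if b"%SI-0200" in tail:
        return "Canvas document (%SI-0200 trailer only)"
    return None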

Canvas X dropped support for the Macintosh, but an entirely different product called Canvas X Draw was released, which does support the Macintosh. Here is what a CVD file looks like:

hexdump -C CanvasXDraw7-Sample1.cvd | head
00000000  25 43 61 6e 76 61 73 43  56 44 09 31 2e 30 25 bb  |%CanvasCVD.1.0%.|
00000010  54 48 65 61 64 65 72 00  00 00 00 00 00 00 00 00  |THeader.........|
00000020  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
00000030  00 bb 52 4d 61 63 4f 53  56 65 72 73 69 6f 6e 20  |..RMacOSVersion |
00000040  31 30 2e 31 33 2e 36 20  28 42 75 69 6c 64 20 31  |10.13.6 (Build 1|
00000050  37 47 31 34 30 34 32 29  31 30 2e 32 33 30 34 08  |7G14042)10.2304.|
00000060  00 00 00 70 6c 61 74 66  6f 72 6d 0a 73 00 00 00  |...platform.s...|
00000070  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
00000080  00 00 00 00 00 05 00 00  00 02 00 00 00 00 00 00  |................|
00000090  00 08 00 00 00 6f 73 0a  73 00 00 00 00 00 00 00  |.....os.s.......|

There is also the matter of the Canvas Image format, which the User Guide calls proxy images: raster images used as placements within Canvas documents. It should be easy to identify.

hexdump -C Canvas5-Sample1.CVI | head
00000000  00 00 00 01 44 41 44 35  50 52 4f 58 00 00 09 99  |....DAD5PROX....|
00000010  00 00 00 11 00 00 00 2d  00 00 00 03 00 00 00 08  |.......-........|
00000020  00 48 00 00 00 00 00 06  00 03 00 08 00 00 00 11  |.H..............|
00000030  00 00 00 2d 00 03 00 03  00 48 00 00 00 48 00 00  |...-.....H...H..|
00000040  00 00 00 00 00 00 00 00  00 00 00 11 00 00 00 2d  |...............-|
00000050  00 00 00 02 00 00 00 08  00 00 00 01 00 00 00 11  |................|
00000060  00 00 00 2d ff ff ff ff  ff ff ff ff ff ff ff ff  |...-............|
00000070  ff ff ff ff ff ff ff ff  ff ff ff ff ff ff ff ff  |................|
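
The “DAD5PROX” marker at offset 4 makes for an easy check. A tiny sketch based on this one sample (I have not verified how stable the first four bytes are, so I only test the marker itself):

from pathlib import Path

def looks_like_canvas_proxy(path):
    """Check for the 'DAD5PROX' marker at offset 4 seen in the CVI sample."""
    head = Path(path).read_bytes()[:12]
    return head[4:12] == b"DAD5PROX"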

Phew, if you held on for this whole post you must really like confusing file format structures. This format has been on my mind on and off for about six years. Hopefully these signatures will work for the vast majority of the Canvas files found in archives and personal systems. As always, here is my GitHub with the signatures I am proposing and a few samples to get you confused.

MAGIX

There are probably many reasons why a software developer might want to create a proprietary format to store their files in. The software may require special features that don’t fit into an existing format. I would hope a developer would try to use existing formats, or even better open formats, but for many reasons, which probably include profit, they often choose to re-invent the wheel.

MAGIX is a German company which started making software in 1994. In 2001 they developed their first video editing software which was called Movie Edit Pro. The software seems to be well received and is still in use today.

Like most video editing software, project files are used to store all the edits and links to the video files. These are usually small, text-based files, with many applications using XML as the project format. Not MAGIX; they decided to go with a different but well-known format for their project files.

hexdump -C MAGIX15-s01.MVP | head
00000000  52 49 46 46 6c 37 01 00  53 45 4b 44 4d 56 50 48  |RIFFl7..SEKDMVPH|
00000010  08 00 00 00 00 00 00 00  00 00 00 00 4c 49 53 54  |............LIST|
00000020  0c 16 01 00 4d 56 50 4c  4c 49 53 54 00 16 01 00  |....MVPLLIST....|
00000030  56 49 50 4c 53 56 49 50  0c 07 00 00 00 dc 05 00  |VIPLSVIP........|
00000040  00 00 00 00 20 00 00 00  0c 00 00 00 80 bb 00 00  |.... ...........|
00000050  10 00 00 00 29 6b 55 e2  53 f8 3d 40 00 00 f0 42  |....)kU.S.=@...B|
00000060  01 00 00 00 bd 04 ef fe  00 00 01 00 06 00 08 00  |................|
00000070  00 00 01 00 06 00 08 00  00 00 01 00 3f 00 00 00  |............?...|
00000080  28 00 00 00 04 00 04 00  01 00 00 00 00 00 00 00  |(...............|
00000090  00 00 00 00 00 00 00 00  bd 8f 32 01 d0 02 00 00  |..........2.....|

Yes, they used the RIFF container format for their projects. It seems an odd choice, although for video production it is well suited; AVI is another video format which uses the RIFF container. The MVP project file uses the ID SEKD with the format MVPH. Earlier versions of Movie Edit Pro used a different extension.

hexdump -C MAGIXv11-s01.MVD | head
00000000  52 49 46 46 38 57 00 00  53 45 4b 44 53 56 49 50  |RIFF8W..SEKDSVIP|
00000010  70 00 00 00 00 dc 05 00  00 00 00 00 04 00 00 00  |p...............|
00000020  02 00 00 00 80 bb 00 00  10 00 00 00 8e 23 d6 e2  |.............#..|
00000030  53 f8 3d 40 00 00 f0 42  01 00 00 00 bd 04 ef fe  |S.=@...B........|
00000040  00 00 01 00 00 00 06 00  00 00 04 00 00 00 06 00  |................|
00000050  00 00 04 00 3f 00 00 00  28 00 00 00 04 00 04 00  |....?...(.......|
00000060  01 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
00000070  c8 1b 32 01 d0 02 00 00  e0 01 00 00 52 d7 da fb  |..2.........R...|
00000080  54 55 f5 3f 4c 49 53 54  04 00 00 00 70 68 79 73  |TU.?LIST....phys|
00000090  4c 49 53 54 d0 3d 00 00  74 72 6b 73 4c 49 53 54  |LIST.=..trksLIST|

The MVD format used by an earlier version of Movie Edit Pro is also a RIFF with the ID SEKD, but has the format SVIP.

RIFFpad can break down the chunks we see in an MVP file. Each of the LIST chunks has its own subchunks as well; I assume this is how the editing software stores the references to each video and audio track, and so on. So I give MAGIX credit for at least using an understandable container to store their projects.
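
If you don’t have a chunk viewer handy, a few lines of Python will walk the same tree RIFFpad shows. This is a generic RIFF walker, nothing MAGIX specific; it just follows the standard layout of a 4-byte chunk ID followed by a 4-byte little-endian size:

import struct

def walk_riff(path):
    """Print the form type and chunk tree of a RIFF file (rough sketch)."""
    with open(path, "rb") as f:
        data = f.read()
    if data[:4] != b"RIFF":
        raise ValueError("not a RIFF file")
    print("form type:", data[8:12].decode("latin-1"))

    def walk(start, end, depth=1):
        pos = start
        while pos + 8 <= end:
            cid = data[pos:pos + 4].decode("latin-1")
            size = struct.unpack("<I", data[pos + 4:pos + 8])[0]
            if cid == "LIST":
                list_type = data[pos + 8:pos + 12].decode("latin-1")
                print("  " * depth + f"LIST {list_type} ({size} bytes)")
                walk(pos + 12, pos + 8 + size, depth + 1)
            else:
                print("  " * depth + f"{cid} ({size} bytes)")
            pos += 8 + size + (size & 1)  # chunks are padded to even sizes

    walk(12, len(data))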

MAGIX has also used RIFF in many of its supporting formats. So far I have found mfx, afx, ifx, cfx, ctf, tfx, ufx, mmt, mmm, and hdp, each with its own format:

hexdump -C 101_Loud.mfx | head
00000000  52 49 46 46 a8 6f 00 00  53 45 4b 44 4d 41 46 58  |RIFF.o..SEKDMAFX|
00000010  00 00 00 00 4c 49 53 54  94 6f 00 00 41 55 46 58  |....LIST.o..AUFX|
00000020  4c 49 53 54 88 6f 00 00  41 46 58 45 46 58 48 44  |LIST.o..AFXEFXHD|
00000030  20 00 00 00 00 00 25 0d  00 00 00 00 02 00 00 00  | .....%.........|
00000040  01 00 00 00 00 00 00 00  03 18 00 00 00 00 00 00  |................|
00000050  00 00 00 00 4c 49 53 54  54 6f 00 00 41 46 58 44  |....LISTTo..AFXD|
00000060  4c 49 53 54 50 6a 00 00  41 46 58 45 46 58 48 44  |LISTPj..AFXEFXHD|
00000070  20 00 00 00 00 00 25 0d  00 00 00 00 05 00 00 00  | .....%.........|
00000080  01 00 00 00 00 00 00 00  03 18 00 00 00 00 00 00  |................|
00000090  00 00 00 00 4c 49 53 54  1c 6a 00 00 41 46 58 44  |....LIST.j..AFXD|

I am not sure of the best way to manage all of these in terms of identification, as I am not sure what the purpose of each format is. Maybe for now I’ll make a generic signature to catch them all as a MAGIX file; the table below lists what I have found, and a rough sketch of that idea follows it.

Extension   ID     FORMAT
AFX         SEKD   SAFX
CFX         SEKD   SCFX
CTF         SEKD   SVIP
HDP         SEKD   SHDP
IFX         SEKD   SIFX
MFX         SEKD   MAFX
MMM         SEKD   SVIP
MMT         SEKD   SVIP
MVD         SEKD   SVIP
MVP         SEKD   MVPH
MXM         MXMD   mxmi
TFX         SEKD   STFX
UFX         SEKD   SVIP
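
Here is a rough sketch of that generic catch-all idea: read the ID and FORMAT fields from the RIFF header and compare them against the table above. The mapping only covers the samples I have collected, so treat it as a starting point rather than a complete list.

from pathlib import Path

# (ID, FORMAT) pairs from the table above, mapped to the extensions they came from
KNOWN = {
    (b"SEKD", b"SAFX"): "AFX",
    (b"SEKD", b"SCFX"): "CFX",
    (b"SEKD", b"SHDP"): "HDP",
    (b"SEKD", b"SIFX"): "IFX",
    (b"SEKD", b"MAFX"): "MFX",
    (b"SEKD", b"MVPH"): "MVP",
    (b"SEKD", b"STFX"): "TFX",
    (b"SEKD", b"SVIP"): "CTF/MMM/MMT/MVD/UFX",
    (b"MXMD", b"mxmi"): "MXM",
}

def sniff_magix(path):
    """Classify a MAGIX RIFF file by its ID and FORMAT fields (rough sketch)."""
    head = Path(path).read_bytes()[:16]
    if head[:4] != b"RIFF":
        return None
    key = (head[8:12], head[12:16])
    if key in KNOWN:
        return f"MAGIX RIFF ({key[1].decode()}), seen with extension {KNOWN[key]}"
    if head[8:12] == b"SEKD":
        return "MAGIX RIFF (unrecognised SEKD format)"
    return None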

But, when it comes to their proprietary MAGIX Video format, I think they may have pushed things a little too far. Meet the MXV format:

hexdump -C MAGIXv11-s01.mxv | head
00000000  4d 58 52 49 46 46 36 34  9a cb 2b 00 00 00 00 00  |MXRIFF64..+.....|
00000010  4d 58 4a 56 49 44 36 34  4d 58 4a 56 48 32 36 34  |MXJVID64MXJVH264|
00000020  70 00 00 00 00 00 00 00  70 00 00 00 03 00 00 00  |p.......p.......|
00000030  42 93 2b 00 00 00 00 00  f0 00 00 00 00 00 00 00  |B.+.............|
00000040  7b 2e 00 00 4b 00 00 00  01 00 00 00 00 00 00 00  |{...K...........|
00000050  8e 23 d6 e2 53 f8 3d 40  80 02 00 00 e0 01 00 00  |.#..S.=@........|
00000060  80 02 00 00 e0 01 00 00  04 00 00 00 43 15 00 00  |............C...|
00000070  f0 00 00 00 00 00 00 00  28 19 00 00 00 00 00 00  |........(.......|
00000080  55 55 55 55 55 55 f5 3f  00 00 00 00 00 00 00 00  |UUUUUU.?........|
00000090  7f dd 05 00 00 00 00 00  4d 58 4a 56 48 44 36 34  |........MXJVHD64|

I am not sure what I am looking at. Is it a RIFF? Is it a RIFF variant like RF64? MAGIX describes the format as:

This is the MAGIX video format for quicker processing with MAGIX products. It offers very low loss of quality, but it cannot be played via conventional DVD players.

MAGIX Video Pro X6

A look around the internet doesn’t bring much up in reference to this format, just my recent page on the format wiki. A search for MXRIFF64 brings up nothing. But a closer look at other strings within the MXV file reveals we are probably looking at some sort of MPEG format.
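
Without any documentation, the safest first step is simply to look for the 8-byte ASCII tags visible in the dump. A small sketch that lists their offsets; the tag list is only what shows up in this one sample, so it is almost certainly incomplete:

import re
from pathlib import Path

# tags visible in the hexdump above; there are likely more
TAGS = [b"MXRIFF64", b"MXJVID64", b"MXJVH264", b"MXJVHD64"]

def list_mxv_tags(path):
    """Print the offset of every known tag found in an MXV file."""
    data = Path(path).read_bytes()
    for match_tag in TAGS:
        for match in re.finditer(re.escape(match_tag), data):
            print(f"{match.start():#010x}  {match_tag.decode()}")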

I was able to locate a project on GitHub which claims to be able to demux the MXV format. The software is written in Go and appears to indicate the format is chunk-based, with most of the chunks figured out. So if you find yourself stuck with some MXV files and don’t want to use the latest from MAGIX, this might be the tool for you.

This demuxer also has an interesting file you can download. It is called a “GRAMMAR” file and can be loaded into hex viewers like Synalyze It! to show the parts of any file you load. It’s a great way to explore a format!

None of these formats are found in PRONOM. Project files are not usually kept in archives, but it would be good to know about the RIFF files if they do turn up. The video format is definitely something the archival world should know about. MediaInfo is currently not aware of this format either, but adding it seems like it might be an easy task.

As usual, you can see some samples and my proposal signatures on my GitHub.

Melco

I came across another CD-ROM the other day with some fun embroidery formats. It includes the HUS format I recently posted on, plus a few more.

Like I mentioned before, this is a format genre not normally seen in the archival world, but it is fun to take a peek into the world of embroidery formats. The HUS format from Husqvarna was a unique proprietary format, but looking at another format in this set, we see a common container.

filename : 'CH1604.ofm'
filesize : 25600
modified : 2002-04-29T05:58:26-06:00
errors   : 
matches  :
  - ns      : 'pronom'
    id      : 'fmt/111'
    format  : 'OLE2 Compound Document Format'
    version : 
    mime    : 
    class   : 'Text (Structured)'
    basis   : 'byte match at 0, 30'

First, what is an OFM file? It is the native format for Melco branded embroidery machines. Melco has been around since 1972, but I’m sure the format is much newer; the fact that it is in an OLE container would indicate it was created in the mid-1990s.

Looking inside the OLE container:

Path = CH1604.ofm
Type = Compound
Physical Size = 25600
Extension = compound
Cluster Size = 512
Sector Size = 64

   Date      Time    Attr         Size   Compressed  Name
------------------- ----- ------------ ------------  ------------------------
                    .....        19171        19456  EdsIV Object
                    .....         2502         2560  Design Icon
                    .....          130          192  Design Status
------------------- ----- ------------ ------------  ------------------------
                                 21803        22208  3 files

The EdsIV Object seems specific. Looking back at the web archive it looks like EDS IV was software available for the Melco products. In a user manual there are three formats associated with the software:

  • .CND – Condensed Format
  • .EXP – Expanded Format
  • .OFM – Project (Layout format)

The EdsIV Object file is unique and will work well for identification. There also seem to be some common patterns within the file that could further confirm the identification.

hexdump -C "EdsIV Object" | head
00000000  03 00 00 00 03 00 00 00  00 00 00 00 00 00 ff ff  |................|
00000010  0b 00 0c 00 43 50 72 6a  44 65 66 61 75 6c 74 73  |....CPrjDefaults|
00000020  05 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
00000030  00 00 00 00 00 00 f0 3f  28 00 00 00 01 00 00 00  |.......?(.......|
00000040  7f 00 00 00 00 00 00 00  00 00 39 40 00 00 00 00  |..........9@....|
00000050  00 00 10 40 00 00 00 00  00 00 00 00 00 00 00 00  |...@............|
00000060  00 00 00 00 00 00 00 00  00 00 59 40 04 00 00 00  |..........Y@....|
00000070  00 00 00 00 00 00 00 00  00 00 00 00 00 80 51 40  |..............Q@|
00000080  00 00 00 00 00 00 3e 40  00 00 00 00 00 00 2e 40  |......>@.......@|
00000090  00 00 00 00 00 80 56 40  00 00 00 00 00 80 51 40  |......V@......Q@|
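
Since the distinguishing feature is that stream name inside the OLE2 container, the Python olefile library can confirm it quickly. A small sketch, assuming the stream is always named exactly "EdsIV Object" as in the listing above:

import olefile  # pip install olefile

def is_melco_ofm(path):
    """Rough check: an OLE2 container holding an 'EdsIV Object' stream."""
    if not olefile.isOleFile(path):
        return False
    ole = olefile.OleFileIO(path)
    try:
        return ole.exists("EdsIV Object")
    finally:
        ole.close()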

The CND and EXP formats are a different matter. I ran Tridscan across all the CND samples and it could not detect one common pattern among them all.

python tridscan.py *.csd

TrIDScan/Py v2.02 - (C) 2015-2016 By M.Pontello

File(s) to scan found: 60
Scanning for patterns...
Checking file 1/60 './Cf0103.csd'
Checking file 2/60 './Cr0005.csd'
  Pattern(s) found: 11
Checking file 3/60 './Fd0106.csd'
tridscan.py: Error: no patterns found!

Being a condensed format, I gather it might have some compression which makes for a difficult binary file to identify.

The EXP format on the other hand has a short pattern at the beginning:

hexdump -C CF0103.EXP | head
00000000  80 02 00 00 80 02 18 e7  80 02 19 e6 80 02 19 e6  |................|
00000010  80 02 19 e7 80 02 19 e6  80 02 19 e6 80 02 19 e6  |................|
00000020  80 02 19 e7 80 02 19 e6  80 02 19 e6 80 02 18 e7  |................|
00000030  00 00 fc 00 04 00 fc 00  04 ff fc 01 ed 00 ec 00  |................|
00000040  21 21 df de da 01 15 14  15 15 15 15 eb eb eb eb  |!!..............|
00000050  eb eb da 00 17 17 17 17  17 18 17 17 ea e9 e9 e9  |................|
00000060  e9 e8 e9 e9 ed 00 ec 00  18 18 19 19 18 19 19 19  |................|
00000070  18 18 e8 e8 e8 e7 e7 e7  e8 e7 e8 e8 fa 01 20 00  |.............. .|
00000080  21 00 20 01 21 00 20 00  f8 1e f7 1e f7 1f f7 1e  |!. .!. .........|
00000090  da 00 e6 e5 e5 e5 e5 e4  e5 e5 1a 1b 1b 1b 1b 1c  |................|

Currently Melco distributes different software for use with their embroidery machines. Their DesignShop software also works with the OFM format. After downloading the trial of version 11, I got access to a few OFM sample files. Let’s see if they are the same.

hexdump -C BUBBLEBOY1.ofm | head
00000000  52 49 46 46 86 e5 01 00  4f 46 4d 38 76 72 73 6e  |RIFF....OFM8vrsn|
00000010  08 00 00 00 39 00 2e 00  30 00 30 00 6e 6f 74 65  |....9...0.0.note|
00000020  a8 00 00 00 ff fe ff 52  44 00 69 00 67 00 69 00  |.......RD.i.g.i.|
00000030  74 00 69 00 7a 00 65 00  72 00 20 00 3a 00 20 00  |t.i.z.e.r. .:. .|
00000040  41 00 45 00 30 00 38 00  33 00 0d 00 0a 00 46 00  |A.E.0.8.3.....F.|
00000050  61 00 62 00 72 00 69 00  63 00 20 00 3a 00 20 00  |a.b.r.i.c. .:. .|
00000060  54 00 77 00 69 00 6c 00  6c 00 20 00 0d 00 0a 00  |T.w.i.l.l. .....|
00000070  4d 00 45 00 4c 00 43 00  4f 00 20 00 2d 00 20 00  |M.E.L.C.O. .-. .|
00000080  41 00 43 00 54 00 49 00  4f 00 4e 00 20 00 49 00  |A.C.T.I.O.N. .I.|
00000090  4c 00 4c 00 55 00 53 00  54 00 52 00 41 00 54 00  |L.L.U.S.T.R.A.T.|

Well, that is very different from the earlier example. We can see right away this is a different type of file; in fact, the first few bytes tell us this is another container format. The Resource Interchange File Format (RIFF) is used by many file formats, the most popular being WAVE, AVI, and CorelDRAW. It is a chunk-based format and there are a few tools we can use to look closer.

RIFFpad can open the file, but claims there is some extra data at the end. It does see four chunks and it gives us the code “OFM8”, which is what identifies this particular RIFF type.

I was also able to get some samples from version 10 of DesignShop and found they are the same OLE container, with the same “EdsIV Object” inside. There is a small paragraph in the EdsIV user manual that indicates there is some versioning within the OFM format.

If you open an EDS III .OFM file and save it, it will be converted into an EDS IV .OFM file, which is no longer readable in EDS III.
Files saved in this version of EDS IV cannot be read by previous versions of EDS IV.

This version of EDS IV is capable of producing two types of OFM files. Files saved as “Melco Project File (.ofm)” can only be read with this version or higher versions of EDS IV. Files saved as “Melco Version 2.00 (.ofm)” can be read by any EDS IV user that has version 2.00.006 or higher software.

It never ceases to amaze me how many formats use the OLE2 Compound Document container. It seems like more are being documented all the time. For now, I made a signature to identify the OLE and RIFF versions of OFM. I’ll keep my eye out for the older EDS III and other related formats. As always, you can find my signatures and a sample file on my GitHub.

PowerBI

I think when most of us have some data to sort or make sense of, we tend to gravitate toward a spreadsheet, using Excel or LibreOffice, or if you really like to party, OpenRefine. There are plenty of memes out there representing the frustration people have with bugs, features, and limitations of Excel specifically.

There are more tools out there for making sense of data; one that some people have access to is Microsoft’s more advanced Power BI. Marketed as a data visualization tool, it is accessible to many with an Office 365 subscription. It offers more features than Excel and isn’t as limited by row maximums.

Power BI was recently the topic of a Code4Lib editorial issue. The writer of an article for their journal posted two Power BI datasets which a reader later noticed contained private data. After some miscommunications and misunderstandings an open letter was drafted and received some support. Code4Lib did release a statement and lessons were learned.

One statement from the Code4Lib staff caught my eye. “The released files were in a proprietary file format, Microsoft Power BI, with which none of the editors have experience.”

We all use the tools that are most familiar or available to us. No one can be an expert in all file formats. Some of us try, but things change so fast it is impossible. But we can do more in documenting formats and making them identifiable through the tools we use for digital preservation. The File Format Wiki and PRONOM have had no mention of Power BI, so let’s change that.

Microsoft Power BI was released in 2011 and is part of the Microsoft Power Platform. Power BI can gather data from many sources. The software can be accessed in the Office 365 cloud, but also through a desktop application. In the desktop application, all the data sources and connections are stored in a single file with the extension PBIX. But there are other related formats.

filename : 'PowerBI-Test.pbix'
filesize : 401951
modified : 2024-02-22T11:29:41-07:00
errors   : 
matches  :
  - ns      : 'pronom'
    id      : 'x-fmt/263'
    format  : 'ZIP Format'
    version : 
    mime    : 'application/zip'
    class   : 'Aggregate'
    basis   : 'byte match at [[0 4] [401867 3] [401929 4]]'
    warning : 'extension mismatch'

Path = PowerBI-Test.pbix
Type = zip
Physical Size = 401951

   Date      Time    Attr         Size   Compressed  Name
------------------- ----- ------------ ------------  ------------------------
2024-02-22 18:29:40 .....            8           10  Version
2024-02-22 18:29:40 .....          488          230  [Content_Types].xml
2024-02-22 18:29:40 .....       397312       397312  DataModel
2024-02-22 18:29:40 .....         2848          882  Report/Layout
2024-02-22 18:29:40 .....          328          161  Settings
2024-02-22 18:29:40 .....          136          120  Connections
2024-02-22 18:29:40 .....        18972         1733  Report/StaticResources/SharedResources/BaseThemes/CY24SU02.json
2024-02-22 18:29:40 .....          358          357  SecurityBindings
------------------- ----- ------------ ------------  ------------------------
2024-02-22 18:29:40             420450       400805  8 files

Just like many modern Microsoft formats it is a ZIP container with a mixture of XML and JSON. There is also a DataModel file along with Settings and Connections. A quick peek at some of the contents shows us:

hexdump -C PowerBI-Test/Version | head
00000000  31 00 2e 00 32 00 38 00                           |1...2.8.|

hexdump -C PowerBI-Test/DataModel | head
00000000  ff fe 53 00 54 00 52 00  45 00 41 00 4d 00 5f 00  |..S.T.R.E.A.M._.|
00000010  53 00 54 00 4f 00 52 00  41 00 47 00 45 00 5f 00  |S.T.O.R.A.G.E._.|
00000020  53 00 49 00 47 00 4e 00  41 00 54 00 55 00 52 00  |S.I.G.N.A.T.U.R.|
00000030  45 00 5f 00 29 00 21 00  40 00 23 00 24 00 25 00  |E._.).!.@.#.$.%.|
00000040  5e 00 26 00 2a 00 28 00  3c 00 42 00 61 00 63 00  |^.&.*.(.<.B.a.c.|
00000050  6b 00 75 00 70 00 4c 00  6f 00 67 00 3e 00 3c 00  |k.u.p.L.o.g.>.<.|
00000060  42 00 61 00 63 00 6b 00  75 00 70 00 52 00 65 00  |B.a.c.k.u.p.R.e.|
00000070  73 00 74 00 6f 00 72 00  65 00 53 00 79 00 6e 00  |s.t.o.r.e.S.y.n.|
00000080  63 00 56 00 65 00 72 00  73 00 69 00 6f 00 6e 00  |c.V.e.r.s.i.o.n.|
00000090  3e 00 31 00 34 00 30 00  3c 00 2f 00 42 00 61 00  |>.1.4.0.<./.B.a.|

hexdump -C PowerBI-Test/\[Content_Types\].xml | head
00000000  ef bb bf 3c 3f 78 6d 6c  20 76 65 72 73 69 6f 6e  |...<?xml version|
00000010  3d 22 31 2e 30 22 20 65  6e 63 6f 64 69 6e 67 3d  |="1.0" encoding=|
00000020  22 75 74 66 2d 38 22 3f  3e 3c 54 79 70 65 73 20  |"utf-8"?><Types |
00000030  78 6d 6c 6e 73 3d 22 68  74 74 70 3a 2f 2f 73 63  |xmlns="http://sc|
00000040  68 65 6d 61 73 2e 6f 70  65 6e 78 6d 6c 66 6f 72  |hemas.openxmlfor|
00000050  6d 61 74 73 2e 6f 72 67  2f 70 61 63 6b 61 67 65  |mats.org/package|
00000060  2f 32 30 30 36 2f 63 6f  6e 74 65 6e 74 2d 74 79  |/2006/content-ty|
00000070  70 65 73 22 3e 3c 44 65  66 61 75 6c 74 20 45 78  |pes"><Default Ex|
00000080  74 65 6e 73 69 6f 6e 3d  22 6a 73 6f 6e 22 20 43  |tension="json" C|
00000090  6f 6e 74 65 6e 74 54 79  70 65 3d 22 22 20 2f 3e  |ontentType="" />|

So it looks like the ZIP structure follows the standard for OpenXML packages, as it contains a “[Content_Types].xml” file. Using this XML alone would clash with too many other formats. From what I could find, the “DataModel” file is what stores the data and is more unique to this format, even though the name is pretty generic. Using a string within the file would probably make identification more accurate. The “DataModel” file does have Unicode double-byte strings we can use; “STREAM_STORAGE_SIGNATURE” seems like a unique enough string, but it may not be unique to PBIX. It looks like the “DataModel” file is Microsoft’s “MS-XLDM”, the “Spreadsheet Data Model File Format“.

There is a variation of the DataModel file, and I am not sure when the standard form is used versus this variation, which begins with “This backup was created using XPress9 compression”. Not sure if it is down to versioning or how the file is saved, but both seem to function correctly.

hexdump -C DataModel | head
00000000  54 00 68 00 69 00 73 00  20 00 62 00 61 00 63 00  |T.h.i.s. .b.a.c.|
00000010  6b 00 75 00 70 00 20 00  77 00 61 00 73 00 20 00  |k.u.p. .w.a.s. .|
00000020  63 00 72 00 65 00 61 00  74 00 65 00 64 00 20 00  |c.r.e.a.t.e.d. .|
00000030  75 00 73 00 69 00 6e 00  67 00 20 00 58 00 50 00  |u.s.i.n.g. .X.P.|
00000040  72 00 65 00 73 00 73 00  39 00 20 00 63 00 6f 00  |r.e.s.s.9. .c.o.|
00000050  6d 00 70 00 72 00 65 00  73 00 73 00 69 00 6f 00  |m.p.r.e.s.s.i.o.|
00000060  6e 00 2e 00 00 00 00 b0  07 00 76 75 00 00 2a d7  |n.........vu..*.|
00000070  86 4e 00 b0 07 00 ad ab  03 00 2c cb 06 00 00 00  |.N........,.....|
00000080  00 00 f8 6c 86 7f 00 00  00 00 68 01 56 6e 00 00  |...l......h.Vn..|
00000090  20 82 67 49 52 06 00 f6  ab fc fc fe 2d f6 da 8b  | .gIR.......-...|

After a bit of digging it seems the MS-XLDM format can also be found within an XLSX file; I found an example among these datasets. Within an XLSX there can be a file “xl/model/item.data”, which has the same structure as the DataModel within a PBIX.

hexdump -C "Customer Profitability Sample-no-PV/xl/model/item.data" | head
00000000  ff fe 53 00 54 00 52 00  45 00 41 00 4d 00 5f 00  |..S.T.R.E.A.M._.|
00000010  53 00 54 00 4f 00 52 00  41 00 47 00 45 00 5f 00  |S.T.O.R.A.G.E._.|
00000020  53 00 49 00 47 00 4e 00  41 00 54 00 55 00 52 00  |S.I.G.N.A.T.U.R.|
00000030  45 00 5f 00 29 00 21 00  40 00 23 00 24 00 25 00  |E._.).!.@.#.$.%.|
00000040  5e 00 26 00 2a 00 28 00  3c 00 42 00 61 00 63 00  |^.&.*.(.<.B.a.c.|
00000050  6b 00 75 00 70 00 4c 00  6f 00 67 00 3e 00 3c 00  |k.u.p.L.o.g.>.<.|
00000060  42 00 61 00 63 00 6b 00  75 00 70 00 52 00 65 00  |B.a.c.k.u.p.R.e.|
00000070  73 00 74 00 6f 00 72 00  65 00 53 00 79 00 6e 00  |s.t.o.r.e.S.y.n.|
00000080  63 00 56 00 65 00 72 00  73 00 69 00 6f 00 6e 00  |c.V.e.r.s.i.o.n.|
00000090  3e 00 31 00 35 00 30 00  3c 00 2f 00 42 00 61 00  |>.1.5.0.<./.B.a.|

Because this file has a different filename and is in a different path, using “DataModel” should keep identification specific to a PBIX file.
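
Putting that together, here is a rough sketch of the PBIX logic: a ZIP with a top-level DataModel entry whose text contains the STREAM_STORAGE_SIGNATURE marker or the XPress9 backup notice. A real signature will be expressed differently; this is just the reasoning in code form:

import zipfile

def looks_like_pbix(path):
    """Rough heuristic for a Power BI Desktop .pbix file."""
    if not zipfile.is_zipfile(path):
        return False
    with zipfile.ZipFile(path) as z:
        if "DataModel" not in z.namelist():
            return False
        text = z.read("DataModel")[:200].decode("utf-16-le", errors="ignore")
    return "STREAM_STORAGE_SIGNATURE" in text or "XPress9" in text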

Power BI reports also have a template option. This format uses the .PBIT extension and doesn’t contain any data, only a template to use with other data. The structure is roughly the same, but instead of the “DataModel” file it contains “DataModelSchema”, which appears to be a JSON file.

hexdump -C DataModelSchema | head
00000000  7b 00 0d 00 0a 00 20 00  20 00 22 00 6e 00 61 00  |{..... . .".n.a.|
00000010  6d 00 65 00 22 00 3a 00  20 00 22 00 38 00 36 00  |m.e.".:. .".8.6.|
00000020  65 00 34 00 32 00 62 00  33 00 30 00 2d 00 30 00  |e.4.2.b.3.0.-.0.|
00000030  34 00 34 00 33 00 2d 00  34 00 36 00 30 00 63 00  |4.4.3.-.4.6.0.c.|
00000040  2d 00 61 00 36 00 66 00  36 00 2d 00 36 00 66 00  |-.a.6.f.6.-.6.f.|
00000050  34 00 35 00 35 00 66 00  64 00 64 00 31 00 61 00  |4.5.5.f.d.d.1.a.|
00000060  35 00 36 00 22 00 2c 00  0d 00 0a 00 20 00 20 00  |5.6.".,..... . .|
00000070  22 00 63 00 6f 00 6d 00  70 00 61 00 74 00 69 00  |".c.o.m.p.a.t.i.|
00000080  62 00 69 00 6c 00 69 00  74 00 79 00 4c 00 65 00  |b.i.l.i.t.y.L.e.|
00000090  76 00 65 00 6c 00 22 00  3a 00 20 00 31 00 35 00  |v.e.l.".:. .1.5.|

The DataModelSchema JSON has some plain text strings which could be used for identification. Later in the file there is a string, “defaultPowerBIDataSourceVersion“.

000001c0  20 00 20 00 20 00 7d 00  2c 00 0d 00 0a 00 20 00  | . . .}.,..... .|
000001d0  20 00 20 00 20 00 22 00  64 00 65 00 66 00 61 00  | . . .".d.e.f.a.|
000001e0  75 00 6c 00 74 00 50 00  6f 00 77 00 65 00 72 00  |u.l.t.P.o.w.e.r.|
000001f0  42 00 49 00 44 00 61 00  74 00 61 00 53 00 6f 00  |B.I.D.a.t.a.S.o.|
00000200  75 00 72 00 63 00 65 00  56 00 65 00 72 00 73 00  |u.r.c.e.V.e.r.s.|
00000210  69 00 6f 00 6e 00 22 00  3a 00 20 00 22 00 70 00  |i.o.n.".:. .".p.|
00000220  6f 00 77 00 65 00 72 00  42 00 49 00 5f 00 56 00  |o.w.e.r.B.I._.V.|
00000230  33 00 22 00 2c 00 0d 00  0a 00 20 00 20 00 20 00  |3.".,..... . . .|

That string seems like the best identification for the template format.
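
The same approach works for the template: look for DataModelSchema instead of DataModel, and for the defaultPowerBIDataSourceVersion key inside it. Again, just a sketch of the logic, not the signature itself:

import zipfile

def looks_like_pbit(path):
    """Rough heuristic for a Power BI template (.pbit) file."""
    if not zipfile.is_zipfile(path):
        return False
    with zipfile.ZipFile(path) as z:
        if "DataModelSchema" not in z.namelist():
            return False
        text = z.read("DataModelSchema").decode("utf-16-le", errors="ignore")
    return "defaultPowerBIDataSourceVersion" in text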

As usual you can find my signature proposal on my GitHub along with a couple “safe” samples.