XDC (8088 Domination)
- Extension: xdv
- Website: http://x86dc.wordpress.com
XDC is an(other) academic project by Jim Leonard, a.k.a. Trixter. The goal of the project was to play full-screen, full-motion video on an original IBM PC utilizing the PC's CGA graphics mode. The specs of the "original IBM PC" included:
- Intel 8088 CPU running at 4.77 MHz
- CGA graphics mode: 16KB framebuffer, one video page, limited color
- Sound Blaster audio
XDC stands for X86 Delta Compiler, as its method of operation produces executable code for each video frame, as opposed to traditional codecs which output compressed data. By outputting code instead of data, XDC avoids unnecessary CPU processing normally associated with the load-decompress-translate-update tasks of other codecs. Decompression is performed by CALLing the frame data, where it directly alters the contents of the CGA frame buffer.
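The distinction can be illustrated with a hedged Python analogy (all names here are mine, not from the project): a conventional codec emits delta *data* that a decoder loop must interpret, whereas XDC emits something directly executable, so playback is just a CALL into the frame.

```python
# Hypothetical illustration of "frames as code" vs. "frames as data".
# A traditional codec outputs delta data that a decoder must interpret:
def apply_delta_data(framebuffer, delta):
    for offset, value in delta:      # interpretation loop costs CPU time
        framebuffer[offset] = value

# XDC instead outputs code: each frame is itself callable and patches
# the framebuffer directly, with no decode step at playback time.
# (A Python closure stands in for CALLable 8088 machine code.)
def compile_frame(delta):
    def frame(framebuffer):
        for offset, value in delta:
            framebuffer[offset] = value
    return frame

fb = bytearray(16384)                # 16KB CGA framebuffer
frame = compile_frame([(0, 0xAA), (8192, 0x55)])
frame(fb)                            # "CALL the frame data"
```

On the real 8088 the compiled frame is straight-line machine code ending in RET, so even the loop above disappears at playback time.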
An earlier version of this system was the basis of the award-winning demo "8088 Domination" in 2014.
XDV File Format
The header layout is documented in the project's GitHub repository:
- 'XDCV' - 4 bytes, header signature
- numpackets - 2 bytes, word value, number of packets that follow header
- largestpacket - 2 bytes, word value, size in bytes of largest packet in the file
- achunksize - 2 bytes, word value, size of each audio chunk. This is always a fixed value even when compressed audio formats are in use, because each audio compression method produces fixed-size output chunks.
- samprate - 2 bytes, word value, playback sample rate in Hz
- vidmode - 1 byte, 1=160x200x16 composite CGA, 2=640x200x2 (others to follow?)
- numbcols - 1 byte, width of screenmode in byte columns (ie. 80)
- numprows - 1 byte, height of screenmode in pixel rows (ie. 200)
- features - 1 byte, bitfield, reserved for special handling
- padding - rest of header is padded to the first sector boundary (512 bytes)
"Features" are currently unused.
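A minimal Python sketch of parsing these fields, assuming little-endian byte order (the 8088's native order); the function name is mine:

```python
import struct

def read_xdv_header(data):
    """Parse the fixed fields of an XDV header from its first bytes."""
    if data[:4] != b'XDCV':
        raise ValueError('not an XDV file')
    # Four little-endian words follow the signature.
    numpackets, largestpacket, achunksize, samprate = \
        struct.unpack_from('<4H', data, 4)
    # Then four single-byte fields.
    vidmode, numbcols, numprows, features = struct.unpack_from('<4B', data, 12)
    # The remainder of the 512-byte sector is padding.
    return {
        'numpackets': numpackets,
        'largestpacket': largestpacket,
        'achunksize': achunksize,
        'samprate': samprate,
        'vidmode': vidmode,      # 1 = 160x200x16 composite CGA, 2 = 640x200x2
        'numbcols': numbcols,
        'numprows': numprows,
        'features': features,    # currently unused
    }
```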
After the header come one or more audio+video "packets" with the following structure:
[videochunk] [...unknown # of padding bytes...] [audiochunk]
The video chunk sits at the very beginning of the packet and is of variable size. The audio chunk sits at the very end of the packet, and can easily be found by seeking backwards "achunksize" bytes from the end. All packets are aligned to sector boundaries (ie. the nearest 512-byte boundary). The data in the packet between the video and audio chunks is undefined (although likely filled with 0).
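A Python sketch of splitting one packet along those lines (function name is mine). Note that the boundary between the video chunk and the padding cannot be found this way; only the audio chunk has a known size, so the video data and padding are returned together:

```python
def split_packet(packet, achunksize):
    """Split one sector-aligned XDV packet into its two known parts.

    The audio chunk is the final achunksize bytes of the packet.
    Everything before it is the variable-size video chunk followed by
    undefined padding (likely zeros); separating those two would require
    decoding the video chunk itself.
    """
    audio = packet[-achunksize:]
    video_and_padding = packet[:-achunksize]
    return video_and_padding, audio
```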
Video playback framerate is derived from the audio sample rate divided by the audio chunk size. It is possible (in fact, likely) that neither the framerate nor the audio sample rate will be a common/expected value. For example, the sample rate could have been adjusted during creation from 32000 Hz to 32008 Hz to avoid drift, or the framerate could be a non-integral number like 29.97 or 23.976.
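The arithmetic, using hypothetical header values (these particular numbers are illustrative, not taken from any real stream):

```python
# Hypothetical values as they might appear in an XDV header.
samprate = 32008    # Hz, tweaked from 32000 during creation to avoid drift
achunksize = 1600   # bytes of 8-bit PCM per packet

# One packet's audio spans exactly one video frame, so:
fps = samprate / achunksize   # 20.005 -- non-integral framerates are normal
```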
Packets are padded to the nearest sector boundary (512-byte boundary). At the end of an XDV stream is an index that stores the size of each packet in sectors, one value per byte. (For example: a (tiny) stream with numpackets=4 would have, at EOF-4, 4 bytes, one per packet, with each value multiplied by 512 to get the packet length.) Loading the index is as easy as reading [numpackets] from the header, seeking to EOF-[numpackets], and reading [numpackets] bytes. This index must be loaded prior to playback so that the player knows how much data to load from disk for each packet.
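That procedure can be sketched in Python (function name is mine):

```python
import io

def load_index(f, numpackets):
    """Read the packet-size index from the end of an XDV stream.

    The last numpackets bytes of the file hold one size-in-sectors value
    per packet; multiplying each by 512 gives the packet length in bytes.
    """
    f.seek(-numpackets, io.SEEK_END)   # index sits at EOF-numpackets
    sectors = f.read(numpackets)
    return [s * 512 for s in sectors]
```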
Video Format
The video format operates in CGA graphics mode, which is an interlaced framebuffer utilizing 16KB of RAM. Depending on the mode, each pixel occupies 1, 2, or 4 bits (8, 4, or 2 pixels per byte), giving 2, 4, or 16 colors respectively.
The interlaced memory arrangement represents even lines starting at offset 0, then odd lines starting at offset 8192. Only lines 0 through 199 are visible; extra data in the framebuffer outside of the first 200 lines is not displayed.
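The addressing described above can be expressed as a small Python helper (function name is mine; the default of 80 bytes per row matches the numbcols example in the header description):

```python
def cga_row_offset(row, bytes_per_row=80):
    """Byte offset of pixel row `row` in the interlaced CGA framebuffer.

    Even rows are packed consecutively from offset 0, odd rows from
    offset 8192. bytes_per_row corresponds to the header's numbcols
    field (ie. 80).
    """
    return (row & 1) * 8192 + (row >> 1) * bytes_per_row

# row 0 starts at 0, row 1 at 8192, row 2 at 80, and so on up to row 199.
```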
Data Resources
- http://trixter.oldskool.org/2014/06/19/8088-domination-post-mortem-part-1/
- http://en.wikipedia.org/wiki/Color_Graphics_Adapter#The_CGA_color_palette
Audio Format
The audio format used by XDC is simply unsigned, mono, 8-bit PCM.