LivingTomorrow technical description

LivingTomorrow is an archive of video files. These were originally rendered in After Effects 6 using the Animation codec, so as to maintain maximum colour and image quality. The files were then compressed to MPG2 (using ProCoder version 1). ProCoder v2 was also tested but failed to produce usable MPG2 files; it has, however, been found to make good MPG4 files.

The LT archive is accessed via database programs written in PHP and C with MySQL (running on Linux). The archive is organised into three 'channels' or folders: left, right and centre. These correspond to the left, right and centre projectors in the exhibition site, be that a gallery or a public space.

By naming convention, the files carry numeric prefixes, for example 60 or 30, as in 60group1 or 30BD1. These numbers are one of the factors used to determine a file's 'chance' of being played.

Which files are selected to play, in which order, and to which projector is a function of the database program itself, working from the structured organisation of the files (into the three channels/folders) and their naming convention. It works in this way:

The system starts by playing a random file on the centre screen.
Each 'centre' file has been programmed to allocate to every file in the left and right channels a 'weighting': a number between 1 and 10. This weighting is then multiplied by the numeric prefix in the file name to determine that file's chance of being played. For example, in the left channel, '60group1' with a weighting of 5 would have 300 'chances' of being played (60 x 5), whereas another file, '30BD2', with a weighting of 9 would have 270 'chances' (30 x 9). In LivingTomorrow every file gets at least one chance, as the weightings start at 1 for all files. The outcomes are quite unpredictable.

The complication is that no two files are the same length. This makes the system very dynamic, and makes narrative and pattern emergence possible, but it presents another problem: what happens when a 'centre' file finishes while the two files on the left and right are still playing? How is the next centre file determined?

This is solved by the program looking at which centre files gave both of the currently playing left and right files a weighting. For each such centre file, the left and right chances are calculated (as in the 300 and 270 of the examples above) and added together, and the total is 'linked' to the centre file that allocated those weightings. This is repeated for every centre file that weighted both of the playing left and right files. The totals then determine the relative chances of the candidate centre files, and the next centre file is selected from among them, again at random. This is how the system is programmed to allow the beginnings of an order, a narrative, to emerge. It is also how you can begin to order patterning, in terms of composition, across all three screens.

The work is played on at least a Pentium 4 2.8 GHz PC running Linux kernel version 2.6 (it may also work on v2.4), preferably the Gentoo distribution, compiled for optimised use with the hardware. (Other distributions are, for example, Red Hat, Slackware, SuSE and Debian.)

The three video channels are output to three projectors using three MPG2-decoding video cards. The ones used for LivingTomorrow were Hauppauge WinTV-PVR 350s. Older versions of the Optibase card (the 'Optibase VideoPlex XPress') also work on older versions of Linux, and perhaps on newer ones. In principle any card that runs in a PC should also run under Linux, provided it has Linux drivers; the two cards cited here do.

We continually tested and developed the work in the lab using three high-quality monitors. However, in the final installation we had an unexpected technical problem to do with the analogue-to-digital conversion of the MPG2 files. This was solved by inserting a Panasonic MX10 video mixer between the computer card and the projector. This machine laid down a new sync pulse onto the video feed and, in the process, raised the threshold of the system's ability to cope with black in the files (some of the files were quite dark). It appears that the video cards were not able to process the blacks in some of the files, or couldn't adequately distinguish the sync pulse when the files were dark. This led to loss of signal on those files as they went to the projector, which in turn couldn't play them and made them 'jump' all over the place. In future one could test the Optibase cards, as they may have a higher threshold for black; they have a better inter-frame sync pulse, so the signal should not be lost as easily as it was with the other cards on the very dark files.

Linda (LW) asks Wiel Seuskens (WS) from the Montevideo artlab some questions:

LW: The work is ready to stream. What exactly needs to happen to get it streaming?

WS: The way I see it is to convert everything to MPG4 and stream it with Darwin Streaming Server, an open source option. (The MPG4 files could be converted either from the existing MPG2 files or from the original Animation-codec files; the latter should lead to a better quality final result.) Several programs on the Mac can do the QuickTime-to-MPG4 conversion: QuickTime Pro can load MPG2 and save (output) to MPG4; Cleaner, by Discreet Logic, could also do it; or Final Cut Pro, using the Compressor option.

Darwin uses playlists to define the playing order, so a program should calculate a playlist for, let's say, 24 hours. To do this the program has to know the exact length of each fragment. You could use the system as it is to calculate a playlist (i.e. in terms of the randomness and weightings) and then convert this into a playlist Darwin can use. However, you would need to put the exact length of every file into the database (and therefore into the playlists).

LW: What level of network would you need to receive the files, and with what quality loss for each type?

WS: This is a choice you have to make. The higher the quality, the higher the connection speed you need. There will always be some quality loss, but it is hard to describe in words; you have to do tests and see the result.

LW: Where does the 'program' begin and end? Where does the 'database' begin and end, in terms of file naming conventions and organisation? Are the 'program' and the 'database' the same thing? No, says Wiel.

WS: The 'database' is a program that can hold data. MySQL is a database program. PHP is a scripting language which communicates between the database and the user: it 'asks questions', retrieves information and gives commands to the database, acting as the user interface. SQL is the protocol of this and a lot of other database programs; it's a database language but not a program. For example, it was used to get rid of the black spaces between the file changes (or was this C? Need to check again with Wiel.) With other programs you can give commands and ask questions to the database (queries), like "INSERT INTO fragments SET name='30BD1.mpg'" or "SELECT playchance FROM chances WHERE screen='left' AND fragment='30BD1.mpg'". So I wrote two programs that talk to the database: one, with a web interface, in PHP, to insert all the playchances (the program itself detects which fragments are in which directory); and one in C that queries the database, calculates which fragment(s) should be played, and actually plays them. The C program can be started from the web interface I built in PHP.

LW: Does PHP/MySQL run on PCs?

WS: Yes, but playing the fragments through the PVR-350 cards cannot be programmed in C or PHP on a PC; at least, I found no information on the internet about how to do it. (PHP/MySQL is mainly used for interactive works communicating with databases over the internet.)