Fly to Fuji (800 KB MPEG)
Fly to Berlin (2 MB MPEG)
Developing a renderer that visualizes a worldwide distributed database with unlimited geometry and textures in real time (~1280x1024 pixels at ~20-30 fps).
The IRIS Performer toolkit was chosen as the basis for the renderer. It provides the best performance, especially on a multiprocessor Onyx with RealityEngine graphics, but also on other machines, including future ones - and there was no need to worry about gl/OpenGL, quadword alignment, or similar details. Some extensions had to be added to the Performer toolkit, such as database paging and texture paging, because the development environment offers only 512 MB of main memory and 4 MB of texture memory.
The renderer has two main parts: the renderer database client and the render engine. The renderer database client accesses a multi-layered database of practically unlimited size. At the moment it contains surface data (satellite imagery and aerial photographs), elevation data, transparent clouds, CAD models of buildings, and information billboards displaying names and current temperatures of selected cities. The maximum resolution of the different data layers can vary depending on the geographical region. All data comes from a distributed database. The renderer database client is the interface between the Performer scene graph and the distributed earth database. On the Performer side it knows all viewing and flight parameters, such as position and direction. From these it calculates the currently needed data and predicts the data needed along the future flight path. It then requests the data for a specific location at an appropriate resolution. Database access by the renderer is done asynchronously without affecting the frame rate.
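The look-ahead step can be sketched as simple extrapolation of the camera along its current velocity; all names here are illustrative, and the real client's prediction is not described in detail:

```cpp
#include <cassert>

// Hypothetical camera state: position and velocity in some world frame.
struct Vec3 { double x, y, z; };

// Extrapolate the camera lookaheadSec seconds into the future; tiles
// around this predicted position would then be requested ahead of time.
Vec3 predictPosition(const Vec3& pos, const Vec3& velPerSec, double lookaheadSec) {
    return { pos.x + velPerSec.x * lookaheadSec,
             pos.y + velPerSec.y * lookaheadSec,
             pos.z + velPerSec.z * lookaheadSec };
}
```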
If you approach too fast, you will get a coarse image, but the frame rate is not affected. This happens only if you fly around at a few hundred thousand km/h below 50 km altitude. Data no longer needed is removed from the Performer scene graph (if it has not been rendered in the last 30 seconds, or if the memory limit is reached). The database is organized as a quadtree, containing higher levels of detail as you descend down the tree.
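The quadtree organization can be sketched as follows; the equirectangular mapping and the indexing scheme are assumptions for illustration, not the project's actual tile layout:

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical tile address in an earth quadtree: at level L the globe is
// split into 2^L x 2^L tiles, so descending one level quarters each tile.
struct TileAddr { int level, x, y; };

// Map a longitude/latitude (degrees) to the tile containing it at a level.
TileAddr tileFor(double lonDeg, double latDeg, int level) {
    int n = 1 << level;
    double u = (lonDeg + 180.0) / 360.0;   // 0..1 across longitudes
    double v = (latDeg + 90.0) / 180.0;    // 0..1 across latitudes
    int x = std::min(n - 1, (int)(u * n));
    int y = std::min(n - 1, (int)(v * n));
    return {level, x, y};
}

// Coarser parent tile one level up: integer-halve the indices.
TileAddr parent(const TileAddr& t) {
    return {t.level - 1, t.x / 2, t.y / 2};
}
```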
To overcome the very limited size of the texture memory, a texture pager was implemented. The difference compared to GL texture paging is that a particular texture is paged in before it is referenced. If the texture memory is full, the least recently used (LRU) tiles are freed. The paging is implemented with a callback mechanism in the drawing process.
The render engine is a full-featured multiprocess Performer renderer with a still-ugly graphical user interface (this will be fixed in the near future). Besides many render options, you can toggle the different information layers (clouds, CAD models, temperatures, ...). You can also connect various input devices, including an Earthtracker and a 6-degree-of-freedom SpaceMouse. The background color changes from black to blue when you descend into the atmosphere.
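The black-to-blue blend could be as simple as a linear ramp over altitude; the 100 km atmosphere ceiling and the linear interpolation are assumptions for illustration, not the renderer's actual formula:

```cpp
#include <algorithm>
#include <cassert>

struct Color { double r, g, b; };

// Background color as a function of altitude: fully blue at the ground,
// fading to black at an assumed 100 km top of the atmosphere.
Color skyColor(double altitudeKm) {
    const double ceilingKm = 100.0;        // assumed atmosphere ceiling
    double t = std::min(std::max(altitudeKm / ceilingKm, 0.0), 1.0);
    // t = 0 at the ground (blue), t = 1 in space (black).
    return {0.0, 0.0, 1.0 - t};
}
```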
Dynamic Performer geometry is not used; currently every patch has an absolute position on the earth. The prototype ignores the problem of seams between different LODs, but there are several solutions for this problem that are not very computationally expensive. Another problem is that the precision of 32-bit floats is not sufficient to represent very small features in an earth-scale coordinate system. The database uses 64-bit doubles for everything, but as you know, gl can handle only 32-bit floats. There is a solution for this, but it is not implemented yet. Currently the precision effect can be seen on objects requiring a resolution of less than 1 m, especially when entering 3D models of buildings. When mixing transparent layers like the clouds with non-transparent ones like the elevation data, the NON_OCCLUDE flag does not work as expected and there is some occlusion.
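To make the precision problem concrete: at the earth's radius (~6.4e6 m) adjacent 32-bit floats are 0.5 m apart, so sub-metre detail collapses. The unimplemented solution mentioned above is not described; a common remedy, shown here only as a sketch, is to subtract a per-patch double-precision origin before casting coordinates to float:

```cpp
#include <cassert>
#include <cmath>

// Spacing between a float value and the next representable float above it,
// i.e. the coarsest position step available at that coordinate magnitude.
double floatSpacingAt(double x) {
    float fx = (float)x;
    return (double)std::nextafterf(fx, INFINITY) - (double)fx;
}

// Re-origin a global double coordinate against a local patch origin, then
// cast: the small local offset survives the conversion to float intact.
float localCoord(double global, double patchOrigin) {
    return (float)(global - patchOrigin);
}
```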
An SGI Onyx with RealityEngine2 graphics, a bunch of other SGI machines, an ATM network infrastructure, five years of experience in real-time graphics (VR), and some experience in earth visualization.