VISFILES
Vol.34 No.2 May 2000
Real-Time Immersive Performance Visualization and Steering
Bill Hibbard
My productivity as a programmer has increased by an order of magnitude over the last 30 years, due to improved programming tools. Visualization has a big role to play in future improvements, so I have asked Dan Reed and Eric Shaffer to describe their work with the Virtue system for visualizing distributed programs. - Bill Hibbard
Dan Reed and Eric Shaffer
Figure 1: Geographic network display.
Figure 2: Time tunnel graph layout display.
Figure 3: Call graph, where vertex color corresponds to time spent executing that code, and vertex size represents the number of invocations.
Graph Layout

As an illustration of Virtue's generic data mapping capability, we present some images captured from a performance visualization of a large parallel application. The visualization demonstrates a three-tiered hierarchy of network activity, processor-level execution metrics and software structure.

As shown in Figure 1, a geographic network display allows graph vertices to be placed at specific locations on a map, in this case a globe. The appearance of the links can encode specific performance metrics; in this example, color is related to communication latency. Selecting a site allows a user to "drill down" to a process-level view of performance data from a system at that site.

At the selected site, parallel execution on a single system can be presented as a time tunnel. In this graph layout, the timeline of activities in each task is color coded and laid out along the periphery of a cylinder. Chords connecting the timelines in the cylinder interior denote intertask communication. Figure 2 illustrates both types of behavior.

At this level, one can select a specific task and drill down further to examine the structure of the software executing on that task. This final level creates a call graph view of the code, where graph vertices represent specific code functions and links represent the static pattern of function calls. The vertical placement of vertices indicates the direction of the calls. Like all Virtue graphs, this display can be augmented with other data. Figure 3 shows a call graph where vertex color corresponds to time spent executing that code, and vertex size represents the number of invocations.

User Interfaces

Immersive virtual environments present unique demands on user interface design. Typing and 2D mouse interaction, the standard mechanisms for desktop software control, must be replaced by methods that allow users to move freely and select objects in a three-dimensional space. Virtue uses off-the-shelf speech recognition software to support voice commands. These commands allow users to perform common functions such as navigating the graph hierarchy and pausing graph animation of performance data streams.

Complementing voice commands, selection is typically accomplished via a 3D "mouse" that uses ray shooting to pick objects. Once an object is selected, the mouse can be used to move or otherwise activate it. Virtue also supports the Virtual Technologies CyberGlove™ as an alternative to the mouse. In addition to supporting more natural selection mechanisms (such as simply grabbing an object), the CyberGlove provides tactile feedback through micro-vibrators when interacting with objects. Both the 3D mouse and CyberGlove can be used to issue commands via a simple glyph-drawing interface, which can be quite useful in noisy environments where accurate speech recognition is difficult.

To aid data exploration, Virtue includes a set of virtual tools for manipulating and interrogating visual data representations. In keeping with our model of using intuitive representations, all the tools are virtual analogues of common real-world objects. The Magic Lens [1], seen in Figure 3, appears as a magnifying glass. The lens allows a user to see hidden data associated with specific objects. In the context of Virtue's call graph visualization, the lens displays text labels that associate a source code file name with a graph node.
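As a rough illustration of the ray-shooting selection described above, the short Python sketch below intersects a pick ray from a 3D mouse with bounding spheres placed around graph vertices and returns the nearest hit. The names and data structures (Vertex, pick) are hypothetical and do not reflect Virtue's actual implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Vertex:
    name: str
    center: tuple      # (x, y, z) position of the graph vertex
    radius: float      # bounding-sphere radius used for picking

def pick(origin, direction, vertices):
    """Return the vertex whose bounding sphere the ray hits first, or None."""
    # Normalize the ray direction.
    d_len = math.sqrt(sum(c * c for c in direction))
    d = tuple(c / d_len for c in direction)

    best, best_t = None, float("inf")
    for v in vertices:
        # Vector from the ray origin to the sphere center.
        oc = tuple(v.center[i] - origin[i] for i in range(3))
        # Project onto the ray to find the closest approach.
        t_closest = sum(oc[i] * d[i] for i in range(3))
        if t_closest < 0:
            continue  # sphere lies behind the 3D mouse
        # Squared distance from the sphere center to the ray.
        dist_sq = sum(oc[i] * oc[i] for i in range(3)) - t_closest ** 2
        if dist_sq > v.radius ** 2:
            continue  # ray misses the bounding sphere
        # Distance along the ray to the first intersection point.
        t_hit = t_closest - math.sqrt(v.radius ** 2 - dist_sq)
        if t_hit < best_t:
            best, best_t = v, t_hit
    return best

# Example: pick among two call-graph vertices with a ray along +z.
graph = [Vertex("main.c", (0.0, 0.0, 5.0), 0.5),
         Vertex("solver.c", (0.2, 0.0, 9.0), 0.5)]
print(pick((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), graph).name)  # -> main.c
```

Returning the nearest hit along the ray matches the intuition that, when several vertices line up, the user is pointing at the frontmost one.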
Other tools include a cutting plane, which computes metrics on data associated with a set of bisected graph edges (e.g., computing a maximum network latency given the latencies of the bisected network links), and a dipstick for displaying the exact data values associated with a selected object.

Supporting Teamwork

Virtue exports streaming video of the immersive visualization to remote collaborators via the MBONE, along with a Java interface for remote control of the immersive visualization. However, collaboration frequently requires both asynchronous and distributed interaction among team members. For asynchronous interaction, Virtue supports multimedia annotations. By selecting a Virtue object and issuing a command, users can attach an audio and video record of their insights about the displayed data. Visually, annotations are represented by wireframe bounding boxes that enclose the object in question; an example annotation is visible in Figure 3.

The annotation system is based on modified versions of the VIC and VAT videoconferencing tools. When a record command is issued, these tools generate RTP (Real-time Transport Protocol) audio and video files that are stored in an annotation server's database. When the annotation is opened for replay, the audio and video are multicast across the MBONE, and Virtue displays the received video in windows floating in the virtual environment. The same mechanisms used to receive multimedia annotation streams allow Virtue to support live multicast audio and video streams, enabling videoconferencing between the immersive display and remote desktop or palmtop devices.

Real-Time Data Acquisition and Steering

Optimizing distributed applications requires capturing performance data from multiple sites and networks, diverse computer systems and software components written in a variety of languages. Because these distributed applications execute in a network-connected environment with shared resources, their execution context is rarely repeatable. Hence, off-line performance analysis may not enable one to optimize the application for future executions in new contexts. Ideally, performance analysis software should allow users to alter runtime parameters and resource allocations in response to changing conditions during application execution.

As Figure 4 shows, Virtue relies on the Autopilot toolkit [4] for distributed measurement of executing software. Autopilot provides a set of low-overhead sensors, which can be inserted via the SvPablo [5] toolkit either into application code or into performance daemons that periodically gather hardware and software performance data. For real-time performance visualization, sensors report their values directly to Virtue for display.

Much of the power of Virtue derives from its ability to perform closed-loop steering of applications. Application code is controlled via Autopilot actuators. These routines are inserted into application software just like sensors, but as Figure 4 shows, they alter application or system parameters. Virtue provides a set of virtual controls, 3D sliders and buttons, that form an interface to these actuators. Real-time steering of an application becomes as simple as pushing a button to reset a buffer size or sliding a lever to alter a variable. The ability to make adjustments on-line rather than off-line can result in tremendous time savings when optimizing long-running applications.
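The Autopilot and SvPablo interfaces are not reproduced in the article, so the following Python sketch only illustrates the general sensor/actuator pattern described above; the class and method names (Sensor, Actuator, report, apply) are hypothetical and do not correspond to Autopilot's actual API.

```python
class Sensor:
    """Reports a performance metric to a visualization client.

    Hypothetical sketch of the sensor/actuator pattern; these names and
    interfaces are illustrative, not Autopilot's actual API.
    """
    def __init__(self, name, read_fn, send_fn):
        self.name = name          # metric name, e.g. "queue_len"
        self.read_fn = read_fn    # callable returning the current metric value
        self.send_fn = send_fn    # callable that ships (name, value) to the display

    def report(self):
        self.send_fn(self.name, self.read_fn())


class Actuator:
    """Applies a parameter change requested by a remote steering interface."""
    def __init__(self, name, apply_fn):
        self.name = name
        self.apply_fn = apply_fn  # callable that alters an application parameter

    def apply(self, value):
        self.apply_fn(value)


# Application-side state being measured and steered.
app_state = {"buffer_size": 4096, "queue_len": 17}

sensor = Sensor("queue_len",
                read_fn=lambda: app_state["queue_len"],
                send_fn=lambda name, value: print(f"sensor {name} = {value}"))
actuator = Actuator("buffer_size",
                    apply_fn=lambda value: app_state.update(buffer_size=value))

# One iteration of the closed loop: report a metric to the display, then
# accept a new parameter value as if a user had moved a virtual slider.
sensor.report()                                        # -> sensor queue_len = 17
actuator.apply(8192)
print("buffer_size is now", app_state["buffer_size"])  # -> 8192
```

In a real closed loop, the reported metric would drive a decision, either by a user moving a virtual control or by an automated policy, and the actuator call would apply the resulting parameter change to the running application.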
Future Challenges

Although immersive environments typically use large displays, most current systems have resolution no higher than that available on a desktop. Our experiences with Virtue suggest that large-scale performance visualization would benefit from higher resolution displays that fill the user's field of view. Such displays would allow multiple, linked information structures to be viewed simultaneously. We have just completed construction of an 11' x 24' display wall powered by 18 LCD projectors, similar to Li's scalable display wall [7]. The projectors are driven by a cluster of 18 Pentium III-class PCs running Linux, with a Myrinet interconnection network.

Moving Virtue to the display wall opens a rich set of research topics. We plan to explore new information presentation techniques that exploit the increased screen space, as well as the graphics performance gains that can be realized by exploiting the distributed rendering environment. Finally, we are interested in how large-scale displays can best support effective collaboration within a prototype "smart space," an active office environment made possible through pervasive computing.

References
Bill Hibbard's research interests are interaction techniques, data models and distributed architectures for numerical visualization. He leads the SSEC Visualization Project and is primary author of the Vis5D and VisAD systems. He has degrees in mathematics and computer science from the University of Wisconsin - Madison.