UC San Diego

News Release

NSF and Science Honor Scientific Animation Advances at UC San Diego

San Diego, CA, March 8, 2011 -- UC San Diego researchers affiliated with Calit2 and the Jacobs School of Engineering – Jürgen Schulze and Mark Ellisman – are on two of the winning teams in the 2010 International Science & Engineering Visualization Challenge. The Challenge is organized annually by the National Science Foundation (NSF) and the journal Science.

The two Calit2-affiliated winners shared their awards in the non-interactive media category with fellow researchers at UC San Diego and partner institutions. (The Challenge also announced awards in three other categories: photography; illustrations; and informational posters and graphics.)

GlyphSea

Calit2 project scientist Jürgen Schulze (pictured here inside the five-walled StarCAVE VR room) was part of the GlyphSea team recognized by the 2010 International Science & Engineering Visualization Challenge.

Calit2 research scientist Jürgen Schulze, who is also affiliated with the Department of Computer Science and Engineering (CSE) at the UC San Diego Jacobs School of Engineering, was part of the GlyphSea team led by Amit Chourasia of the San Diego Supercomputer Center (SDSC). GlyphSea is a novel way to encode and display vector data that clearly shows magnitude and direction. The project’s name refers to the use of glyph shapes, such as ellipsoids or spheres, which are marked at each end to allow observers to easily identify both the direction and intensity of the movement. This new visualization technique may help seismologists accurately analyze ground movements for an earthquake, measure magnetic turbulence in deep space, or allow medical researchers to study areas such as blood flow and nutrient absorption.

The basic work on GlyphSea was done by Emmett McQuinn, a Master’s student in computer science and graduate research assistant at SDSC, who was advised by Chourasia and Schulze. His thesis committee included Schulze, Chourasia, Jean-Bernard Minster of the Scripps Institution of Oceanography’s Institute of Geophysics and Planetary Physics (IGPP), and Calit2 director Larry Smarr. In the Visualization Challenge, Chourasia, McQuinn, Minster and Schulze were cited for their work on GlyphSea.

As the NSF explained in announcing the award, “For earthquake scientists, predicting when the next ‘big one’ will strike is the million-dollar question. But predicting how much damage it will do is just as important – and almost as uncertain. Knowing exactly how seismic waves transform the landscape could offer clues. Seismologists have made numerous attempts to model seismic waves passing through Earth. But depicting their direction is difficult. Arrows or cones are ambiguous because viewed from the very front or the very back, they have the same shape: a circle.”

GlyphSea employs a novel technique of procedural dipole texturing to encode and display vector data, which shows magnitude and direction in an unambiguous and view-independent manner.

With GlyphSea, the UC San Diego team opted to use simple glyph shapes, such as spheres or ellipsoids, with a white dot on the end moving toward the observer and a black dot on the end moving away. The method can be applied to arbitrarily shaped glyphs. By varying size and color to show magnitude, the method can display any kind of motion intuitively, from a major earthquake on the San Andreas fault to magnetic turbulence in stars millions of light-years from Earth.
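The dipole idea described above can be sketched in a few lines: each point on a glyph's surface is colored by comparing its surface normal with the vector's direction, so the end facing along the vector gets a white cap and the opposite end a black cap, while magnitude maps to glyph size. This is an illustrative re-creation of the concept, not the GlyphSea source code; the function names and the `cap` threshold are invented for the sketch.

```python
import numpy as np

def dipole_color(surface_normal, direction, cap=0.8):
    """Procedural dipole texturing (illustrative sketch):
    white cap on the end pointing along `direction`,
    black cap on the opposite end, gray in between."""
    d = np.dot(surface_normal, direction)
    if d > cap:
        return (1.0, 1.0, 1.0)   # white dot: end moving toward the observer
    if d < -cap:
        return (0.0, 0.0, 0.0)   # black dot: end moving away
    return (0.5, 0.5, 0.5)       # glyph body (color could also encode magnitude)

def glyph_scale(vector, base=1.0):
    """Map vector magnitude to glyph size."""
    return base * np.linalg.norm(vector)

# Example: a displacement vector pointing along +z
v = np.array([0.0, 0.0, 2.0])
v_hat = v / np.linalg.norm(v)
print(dipole_color(np.array([0.0, 0.0, 1.0]), v_hat))   # top of the glyph: white
print(dipole_color(np.array([0.0, 0.0, -1.0]), v_hat))  # bottom: black
print(glyph_scale(v))                                   # size encodes magnitude
```

Because the coloring depends only on the angle between the surface normal and the vector, the white/black caps remain visible from any viewpoint, which is what removes the front/back ambiguity of arrows and cones.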

“We wanted to experiment with immersive and large resolution displays and Calit2 is one such unique place where you not just find hardware, but expertise to accomplish your ideas,” said SDSC’s Chourasia. “The stereo immersive environment in the StarCAVE enables increased depth perception, and increased resolution allows us to view a very large number of glyphs distinctly.”

“My involvement in the GlyphSea project was to be Emmett's research advisor and help guide his work from an academic perspective,” explains Calit2’s Jürgen Schulze, who first met McQuinn when the student took his Introduction to Computer Graphics course (CSE 167) in fall 2008. “Emmett implemented the software for the project, and Amit Chourasia was his day-to-day advisor on the research and also on the technical level.”

Schulze also helped McQuinn adapt his algorithms for use in Calit2’s 360-degree StarCAVE virtual-reality environment.

“The strength of this method is that it is not only simple, but very intuitive,” said Chourasia, in noting that the application could potentially be used across a wide range of science domains. “GlyphSea could also be used to render visualizations to display the magnitude and direction of movement within the human body, such as blood flow. That’s because one of GlyphSea’s key benefits is that it can be used to show features at both the macroscopic and microscopic level. Moreover, the application is interactive, where various parameters could be customized and changes could be viewed in real time.”

Referring specifically to the seismological visual rendering created by the research team, Scripps’ Minster said that GlyphSea’s technique of encoding and displaying orientation information of vector data by using procedural dipole texturing is what makes this application so unique. “This allows seismologists to study the ground motion dynamics at a level of detail not seen before,” said Minster.

Whole Brain Catalog

Another of the four winning teams that tied for second place in the non-interactive media category was cited for a “Visualization of the Whole Brain Catalog.” The Whole Brain Catalog (WBC) project is based in UC San Diego’s Center for Research in Biological Systems (CRBS) and led by its director, UCSD neuroscience and bioengineering professor Mark Ellisman. Ellisman and his WBC team worked with animator (and 2010 MacArthur Fellowship ‘genius’ awardee) Drew Berry to illustrate a journey deep inside the mouse brain. The video brings to life data from the WBC, a massive database of microscopy and other data sets on the mouse brain.

The video opens with a mouse sniffing a camera, then zooms in on the mouse brain, focusing on the hippocampus, the headquarters of scent and memory. From there it isolates the dentate gyrus, the region that recognizes smells and creates new memories. Individual brain cells then start to appear. Finally, a new connection forms between two neurons, representing the creation of a new memory.

“For a memory, you'd have many, many neurons forming, or connections being broken and new patterns being made,” animator Berry told the journal Science, which notes that Berry hopes the video will inspire a sense of wonder at how the brain works.

In addition to Berry and Ellisman, the Visualization Challenge honored François Tétaz, who composed and performed the original music for the WBC video, and the Walter and Eliza Hall Institute of Medical Research, where Berry is an animator for WEHI.TV. Funding for the video came from the Waitt Family Foundation.

According to organizers of the Visualization Challenge, they received 111 entries from 63 countries in the 2010 award year, with an outside panel of experts in scientific visualization reviewing the finalists and selecting the winners.

The deadline for entries in the 2011 Visualization Challenge is September 15, 2011, and the winners will be announced next February in the journal Science.

Media Contacts

Doug Ramsey
Jacobs School of Engineering
858-822-5825
dramsey@ucsd.edu