That’s it, folks: Engineers have developed what’s basically the Total Perspective Vortex, and we can all go home to whimper in front of our puny 60-inch TVs. A team at Stony Brook University in New York has unveiled what it dubbed the “largest resolution immersive visualization facility ever built,” which is a fancy way of saying they’ve created a room tiled with 416 displays that together show a single, unified image. Amusingly — or ominously, considering it’d be perfect for A Clockwork Orange-style therapy — it’s called the Reality Deck.
I’m normally extremely averse to listing off gadget stats, but in this case I don’t think I have a choice: the whole system comprises a 33’x19’x10’ room (including a tiled-display door) packing a total of 1.5 billion pixels. The Stony Brook folks claim it’s the first display to crack the billion-pixel barrier, and that it offers five times the resolution of the next-largest display in the world.
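For a rough sense of scale, the stated numbers work out to an ordinary high-end monitor’s worth of pixels per panel. A quick back-of-the-envelope check (the per-panel figure is my own arithmetic, not a spec from Stony Brook):

```python
# Back-of-the-envelope check of the stated Reality Deck specs.
total_pixels = 1_500_000_000  # "1.5 billion pixels"
num_displays = 416

pixels_per_panel = total_pixels / num_displays
print(f"{pixels_per_panel / 1e6:.1f} megapixels per panel")
```

That comes out to roughly 3.6 megapixels per panel, which is about the pixel count of a 2560x1440 monitor.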
Obviously it takes a lot of computing power to run that many displays, and the Reality Deck doesn’t disappoint. It uses 240 CPU cores delivering 2.3 teraflops of compute, backed by 1.2 terabytes of memory, plus 80 GPUs that together hit 220 teraflops. That’s enough to display massive images, like the 45 gigapixel shot of Dubai shown above, and to run large-scale models with enough headroom left over for real-time interactivity. A recent test, for example, looked at post-Sandy New York and New Jersey via Google Maps.
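To see why that 1.2 terabytes of memory matters, consider what a 45-gigapixel image weighs uncompressed. A minimal sketch, assuming plain 8-bit RGB at 3 bytes per pixel (my assumption, not a detail from Stony Brook):

```python
# Rough memory footprint of a 45-gigapixel image like the Dubai shot,
# assuming uncompressed 8-bit RGB (3 bytes per pixel) -- an assumption,
# not a stated spec.
gigapixels = 45
bytes_per_pixel = 3

raw_bytes = gigapixels * 1_000_000_000 * bytes_per_pixel
print(f"{raw_bytes / 1e9:.0f} GB raw")
```

That’s about 135 GB of raw pixel data, which fits comfortably in the system’s 1.2 terabytes of RAM with room to spare for the rendering pipeline.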
An early model of the Reality Deck, from (of course) StarTrek.com
Having such a massive display fundamentally changes how we deal with data. My first thought goes to editing photos on a laptop, which involves an ungodly amount of zooming in and out, so much that it can be tough to assess an image as a whole. Here, images are in such ridiculously high resolution that you can step back and, say, look at a map of a disaster area as a whole, and then step up close to see minutiae that are perfectly rendered without any manipulation.
Dubai’s skyline, via Stony Brook
I haven’t tried it yet (Stony Brook, let’s hang out), but it sounds like a much more fluid and natural way of processing large-scale images. The potential for wizardry with data visualization is immense; I’d love to see how the insane levels of resolution get put to use in the future. I mean, imagine looking at stitched satellite photos of an entire region, then being able to look closely and see individual buildings. The scope of the whole thing is breathtaking. And, yeah, I know someone wants to know how porn looks on it, but I’ve got a better idea: with four massive walls, who wants to play some Mario Kart?
Follow Derek Mead on Twitter: @derektmead.