BlueShark – Gizmodo
When we think of the future of the military, we think of bigger and better weapons. Laser cannons and the like. But what about the people operating those lasers? How can a behemoth like the Navy ready its future sailors for the high-tech combat of tomorrow? Believe it or not, with an Oculus Rift.
At the University of Southern California’s Institute for Creative Technologies, a mix of real life, augmented reality, and virtual worlds come together to form a project known as BlueShark. It’s an experiment to discover not only how the Navy of the future could work, but how it should work. I went there to train the way midshipmen will a decade from now.
What Is BlueShark?
BlueShark is the sexy codename given to the Enhanced Environment for Communication and Collaboration (aka E2C2). As the true name suggests, despite being largely funded by the U.S. military (a division of the Office of Naval Research known as SwampWorks), the work done here has much broader implications. The project is essentially a collection of technologies and environments (both physical and virtual) that examine how we humans may collaborate in the future, whether we’re in the same room or on the other side of the planet.
As for its Naval involvement, BlueShark’s focus is to ensure that ships with 50-year lifespans and the generations of sailors manning them can work in harmony far into the future. A control room full of dusty analog switches and knobs may be perfectly functional, but will it be intuitive five decades from now? Or would it be like handing a rotary phone to a seven-year-old?
It’s a tall order. But considering that BlueShark’s working on projects for 2025, it’s amazing how far they’ve already gotten.
On our recent trip to the Institute for Creative Technologies at USC, I had the opportunity to run through the BlueShark demo as it currently stands. It essentially mimics the type of training that a new sailor might go through.
The ICT’s Mixed Reality lab practically overflows with gigantic screens, cameras, sensors, goggles, and more fun toys. Definitely not the kind of place you imagine when you think of classic “Navy training.” It looks more like something out of a WarGames remake.
I was led through the demo by Senior Chief Foster, a human-sized virtual avatar displayed on a large flatscreen TV. When you step into your designated position, a camera (just a cheap Logitech webcam) senses you, and Foster snaps to life. He gave me a brief overview, had me tell him my name, and then guided me to the first (and most impressive) station.
The BlueShark Command Center is, as you might have guessed from the name, a simulated command outpost. It features four large screens in front of you, a swivel chair, and a tweaked-out Oculus Rift. The Oculus isn’t just there because it’s in fashion; its founder, Palmer Luckey, was a lab assistant in the Mixed Reality Lab before he launched his Oculus Kickstarter campaign.
Oh, and this also isn’t your regular old Oculus. The head-mounted displays (HMDs) BlueShark uses have, as you might guess, a few extra bells and whistles.
The BlueShark Oculus has been outfitted with several red LEDs. While they appear to be solidly lit to the naked eye, in reality each one is strobing at a different frequency. This enables cameras in the room to tell which LED is which, and thus give accurate positioning data, which in turn informs what you are seeing in your virtual display. If you lean in or out, the imagery matches, making the virtual world feel that much more real. Beyond advanced tracking, the lab is finding ways to create fields of view that go beyond the Oculus’ current capabilities, virtually wrapping scenes around the user. For example, the photo above shows the Fakespace Labs Wide5, an experimental head-mounted display that provides a more immersive 140-degree field of view.
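The lab hasn’t published its tracking code, but the idea behind frequency-tagged LEDs is simple enough to sketch. Assume a camera watches a bright blob over a couple of seconds; a quick Fourier transform of its brightness reveals the strobe rate, which maps back to a specific LED. The LED names, frequencies, and frame rate below are illustrative, not the lab’s actual values.

```python
import numpy as np

# Hypothetical LED strobe frequencies (Hz); each LED blinks at its own rate.
LED_FREQS_HZ = {"head_left": 10.0, "head_right": 15.0, "head_top": 20.0}
FRAME_RATE_HZ = 120.0  # camera must sample faster than twice the fastest strobe

def identify_led(brightness_samples):
    """Return the LED whose known strobe frequency best matches the
    dominant frequency in a blob's brightness-over-time signal."""
    samples = np.asarray(brightness_samples, dtype=float)
    samples -= samples.mean()                            # strip the DC offset
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FRAME_RATE_HZ)
    dominant = freqs[np.argmax(spectrum)]
    return min(LED_FREQS_HZ, key=lambda led: abs(LED_FREQS_HZ[led] - dominant))

# Example: two seconds of a blob blinking on and off 15 times per second.
t = np.arange(240) / FRAME_RATE_HZ
blob = (np.sin(2 * np.pi * 15.0 * t) > 0).astype(float)
print(identify_led(blob))  # → head_right
```

Once each blob in the camera image is labeled this way, the known positions of the LEDs on the headset give the system enough information to solve for where your head is and which way it’s pointed.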
The headset LEDs are matched by LED-covered straps that give accurate positioning data for your hands. It doesn’t register finger-wiggles or anything that precise, but I was able to manipulate objects in the virtual world as easily as I would on a touchscreen.
In the photo at the top of this post, you can see my point of view, looking at a touchscreen with options for Crow’s Nest Cam, Bridge, Combat Decision Center, and UAS Cam. In the real world, there was no interface. It was just a plain, cheap square of plexiglass on a stand. Yet, when I touched the spot where the buttons supposedly were, it registered the input perfectly and switched me to a different room.
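How can a blank square of plexiglass register a button press? Because the buttons exist only in the virtual overlay, a touch is just a hit test: the hand tracker reports where your fingertip is, and the system checks whether that point lands inside a virtual button’s rectangle on the panel. The button layout, coordinates, and threshold below are my own illustrative guesses, not BlueShark’s implementation.

```python
# Hypothetical virtual-touchscreen hit test. Coordinates are centimeters
# measured from the panel's top-left corner, as (x, y, width, height).
BUTTONS = {
    "crows_nest": (0, 0, 10, 5),
    "bridge": (12, 0, 10, 5),
    "combat_center": (0, 7, 10, 5),
    "uas_cam": (12, 7, 10, 5),
}
TOUCH_THRESHOLD_CM = 1.0  # fingertip must be this close to count as a touch

def hit_test(x, y, distance_to_panel):
    """Return the virtual button under a tracked fingertip, or None."""
    if distance_to_panel > TOUCH_THRESHOLD_CM:
        return None                       # hovering in front of the glass
    for name, (bx, by, bw, bh) in BUTTONS.items():
        if bx <= x <= bx + bw and by <= y <= by + bh:
            return name
    return None

print(hit_test(14.0, 2.0, 0.5))  # → bridge
print(hit_test(14.0, 2.0, 5.0))  # → None (finger not on the glass)
```

The plexiglass itself contributes nothing but tactile feedback; all the "screen" logic lives in software, which is exactly why the panels can be rearranged or repurposed at will.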
This has profound implications. Let’s say the Navy decides that a room full of touchscreens is the best way to control a new warship. Today, in order to train sailors on it, you’d have to build physical replicas of that control room. But what if you could just place plain pieces of plastic that approximate where the screens would be, slap an Oculus Rift on your face, and perfectly simulate what will be there? Outside of the VR headset, it’s just a room full of cheap, blank plastic panels, but for the sailor in training, it would look and behave exactly as the real thing would. This would not only save a ton of money, but it would make the Navy far more adaptable.
The most incredible thing about this system, though, is how just a single press of a button can shift your perspective to a whole new vantage point. When I hit that virtual button for Crow’s Nest, I was instantly transported to the top of the ship’s mast. It actually made me want to grab the sides of my chair at first. There I was, floating a hundred feet above the deck of the boat, and yet I still had all of my important screens and controls at my fingertips.
And then came the best magic yet: another press of a button and I was even higher, looking down from a drone’s point of view. I was flying high above the water, looking down at the boat that my physical body would be on, and looking out over enemy ships, too.
The implications here are profound, too. Imagine a drone pilot actually being able to see the entire world from the drone he or she is flying, not just from a single camera angle, but as if they were sitting in the cockpit. You would have so much more situational awareness. You might also have more of a sense that what you are doing is a part of the real world, not just some game, with all the moral weight that carries.
It stunned me silent.
A Flexible Future
Another advantage of this system is its extreme customizability. Maybe you’re left-handed, and you want the ship’s throttle over on the left. You can just drag and drop it to where you need it to be. When the next user logs in, it would reset to his or her preferred settings. In this way you can customize the ship itself to work in the way that makes most sense to you.
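Because the console is pure software, a per-user layout is just a preferences record that gets swapped in at login. Here’s a minimal sketch of that idea; the control names, positions, and class are hypothetical, not anything BlueShark has published.

```python
# Hypothetical per-user console layouts: each sailor's preferred control
# positions are saved under their id, and logging in restores them.
DEFAULT_LAYOUT = {"throttle": "right", "radar": "center", "comms": "left"}

class ConsoleLayouts:
    def __init__(self):
        self._saved = {}                    # user id -> {control: position}
        self.active = dict(DEFAULT_LAYOUT)  # layout currently on screen

    def move(self, user, control, position):
        """Drag-and-drop: reposition a control and remember it for this user."""
        self.active[control] = position
        self._saved.setdefault(user, dict(DEFAULT_LAYOUT))[control] = position

    def login(self, user):
        """Restore the user's saved layout, or the defaults for a new user."""
        self.active = dict(self._saved.get(user, DEFAULT_LAYOUT))

console = ConsoleLayouts()
console.move("ensign_ryan", "throttle", "left")   # a left-handed sailor
console.login("chief_foster")
print(console.active["throttle"])  # → right (defaults restored)
console.login("ensign_ryan")
print(console.active["throttle"])  # → left (their preference returns)
```

The point is that nothing physical has to move: the "throttle" relocates because it was never anywhere in particular to begin with.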
It also allows for fewer personnel to be physically on a ship. As long as the data speed could handle it, you could virtually pipe in an expert on any subject you needed, and they would be able to see the situation exactly as they would if they were on the ship with you. This could be extremely useful in specialized repair situations, or when a crucial translation is needed.
As incredible as BlueShark sounds—and will become—the system is a long way from perfect. There was occasional lag in the display, and when there’s a slight disparity between what you are seeing and what you are physically experiencing, dizziness happens. I don’t easily get motion-sick, but I definitely found myself getting queasy. At one point, when the screen froze completely, I became so disoriented I almost took a nose-dive out of my chair.
But we’re at the very beginning of this technology. That it’s this good at such a rudimentary stage is staggering.
Where It’s Going
What remains to be seen is how the military adopts and implements BlueShark. One of the banner features of the Ford-class aircraft carriers is that they’re made to be modular, so as technology progresses, theoretically entire control rooms could be swapped out. Might we see one that’s just a bunch of lifeless glass panels that come to life when you look at them in virtual reality?
Even if that would make it endlessly customizable for the sailors using it, would the Navy ever trust a 100-percent digital system to pilot its boats? Would it be vulnerable to hacking? We asked the Navy these questions and got an official (and surprisingly candid) response from Lieutenant Commander Brent Olde, ONR Deputy, Human & Bio-Engineered Systems Division:
“Due to rapid advances in unmanned systems capabilities, the military is currently experiencing a paradigm shift in how it commands and controls its assets. With growing confidence and verified reliability in these new control mechanisms, I’d say yes, someday the Navy could have a completely new way of controlling large vessels – as depicted in BlueShark. However combat systems require multiple fail-safe redundancies to be in place in the case of system failures (some systems have quadruple redundancy), so if the Navy ever did replace the entire control system, it would only be after rigorous testing and redundant back-up mechanisms to maintain positive control.”
Emphasis added, but suffice it to say that BlueShark could well be a reality, assuming it proved safe and reliable enough.
There are also manifold applications for this technology outside of the military. You could put together a team of experts from around the world, who could collaborate in a virtual environment (with everything they say being instantly translated into the language of each individual listener) on a project. And, of course, the gaming potential is off the charts.
But ultimately, seeing the world from the perspective of a drone, or turning a blank space into the world’s most high-tech command center just by looking through some glasses: these have huge implications for our military. If we aren’t always ready to adapt, we lose. The more flexible we make ourselves, the more adaptable we’ll be. BlueShark is the future—and it’s closer than you think.