This Month’s Q&A: September 2020

A headshot of a white man with thinning brown hair wearing a thick rain jacket over a plaid button-up.


Director, Emerging Analytics Center, UA Little Rock

THE GEORGE W. DONAGHEY Emerging Analytics Center at UA Little Rock, usually just called the EAC, is a research center focused on virtual and augmented reality, immersive visualization, interactive technology, and cybersecurity. While it’s the kind of place where you’re likely to find undergrads wearing high-tech goggles and imagining they’re walking through a volcano, the facility also produces game-changing tools for utility companies, agribusinesses, and engineering in general. We sat down with the new director to hear about all the cool things they’re doing at the EAC.


Tell me about yourself. How did you make your way from Germany to Little Rock?

I’m originally from a little town called Gotha in Thuringia, but I studied in Weimar—that’s Goethe, Schiller, and the birthplace of the Bauhaus movement. I actually studied computer science there, and I was lucky. After the Wall came down, I basically took advantage of many opportunities. One was that there wasn’t a settled standard curriculum the way there is at UA Little Rock. It was more flexible. Mainly, I was able to do research very early, as an undergrad. This was in 1994 and 1995.

After finishing my studies, I started with one of the premier Virtual Reality groups in Germany at the time, near Bonn. We eventually got incorporated into the Fraunhofer organization, where we basically did third-party funded research. This was a single institute that did work in Virtual Reality, Augmented Reality, and XR—a term covering virtual, augmented, and mixed reality.

It was an interesting place to work because probably half the people were very deeply, technically, scientifically involved, like me. But the other half were actually more at home in the humanities or the arts. We even had a geologist who was interested in earthquakes. Things like that help broaden your horizons. I was in contact with people from the arts, from architecture, from civil engineering, so that’s a normal thing for me.

While I was working at Fraunhofer in the early 2000s, my day job was software development and project support for our in-house VR system. Also, we started doing immersive visualization projects for the geosciences—oil and gas exploration companies. We also did some work for the car companies.

But speaking of Little Rock, it was while I was at Fraunhofer that I first got to know Dr. Carolina Cruz-Neira, who would eventually become the first director of the Emerging Analytics Center at UA Little Rock. I also got to know Dr. Dirk Reiners, who would later be one of my Ph.D. mentors. So I knew both of them for a long time. We talked regularly, even after I left Fraunhofer to return to Weimar for my doctorate, and we would often be at the same conferences.

Eventually, near the finish line of my Ph.D., I received an offer from Carolina, who was then at the University of Louisiana at Lafayette together with Dirk. They were running the Louisiana Immersive Technologies Enterprise, or LITE center. She had a project that had to do with CAVE technology—that’s an acronym for Cave Automatic Virtual Environment, which Carolina and two others had invented at the University of Illinois at Chicago in 1992. In this case we built a CAVE around an omni-directional treadmill, which would allow dismounted soldiers to walk and run in a virtual environment very similar to the real world. Except we could create surroundings without actually having to spend money on building towns. For the actual project, we essentially developed software for non-programmers, such as psychologists, which allowed them to create scenarios and run them repeatedly for user-study experiments.

So this is what we did over three years in Louisiana. Eventually, I left in 2011 to become a university lecturer in the United Kingdom; that’s roughly the equivalent of an assistant professor in the U.S. I did that for about seven years, and then, in late 2017, Dr. Cruz-Neira, who in the meantime had moved to Little Rock and taken over the EAC, called me and said, “They made me chair of the computer science department. Do you want to come and help me rebuild it?”

So I came here to become an associate professor. Then in 2019, Dr. Cruz-Neira was elected to the National Academy of Engineering, basically for inventing the CAVE and for her research in VR. It’s very prestigious, a great honor, and once you reach that level, very interesting offers and opportunities start to come in. The big universities that have the money are looking for stars like that.

Dr. Cruz-Neira departed in January 2020 for a new position at the University of Central Florida. I was asked to become Interim Director of the EAC and in April 2020 I was officially named Director of the EAC.

Well, congratulations. So how do you define “Emerging Analytics”?

Actually, that’s a good question. Analytics basically means we have long-developed notions of representing things in an abstract way, most likely with mathematical symbols. Certain things are not really represented with math but just with collected values, such as temperatures. Then we try to see if there is a pattern, something in there that allows us to make predictions or extrapolations into the future.

So that’s the analysis part. And the emerging part is that we’re trying to work in the engineering realm and analyze highly complex problems using mathematical representations, where tables of numbers or simplistic business graphics will not help. Using emerging technologies like immersive, 3-D, and interactive displays, we can advance understanding in such applications. As they say: a picture is worth a thousand words. And I’d add to that: a video is worth a thousand pictures, but an interactive visualization and simulation is worth more than all of these combined.
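To make that “collected values to extrapolation” idea concrete, here is a minimal sketch: fit a straight-line trend to a handful of temperature readings and extrapolate it forward. The readings and time steps are hypothetical illustration data, not anything from the EAC.

```python
# Minimal sketch: fit a linear trend to collected values, then extrapolate.
# The temperature readings below are hypothetical, for illustration only.

def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hourly temperature readings (hypothetical).
hours = [0, 1, 2, 3, 4]
temps = [20.0, 20.6, 21.1, 21.5, 22.1]

a, b = linear_fit(hours, temps)
forecast = a + b * 6  # extrapolate two hours past the last reading
print(round(forecast, 2))
```

Real analytics problems use far richer models, but the shape is the same: abstract the collected values, find the pattern, project it forward.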

What are some industry applications?

Well, for example, I come from a background where we collaborated with people in the geosciences for oil and gas exploration. If they go and physically do the exploration, it can cost from $500 million to $1 billion per project. That’s just the exploration part, and if, at the end, they can’t really figure out if it’s worth actually drilling and going into production, then they just have to chalk up that exploration as a huge loss.

What they’re looking at is very, very complex. It’s not just a bunch of numbers and tables and then some expert says, “Oh, it’s this one.” They have to look at interconnected things in the collected data. They have to run simulations and create predictive models. So they do seismic surveys of ocean floors, for example. And then, because we know to a certain extent what is in those vast amounts of data, we can start making visual simulations: “Okay, if we take this cavern out slowly, what happens?” Usually, if you take things out, you have to bring in things from the other side, because otherwise the pressure on top gets too high and things crumble. No one wants that.

So that’s the oil and gas industry. They actually have their own software companies, and we collaborate with them. We also work with the military, with the car industry, and with chip production. Current chip generations aren’t just 2-D layers of circuit diagrams, though they look that way. A chip today is layered in 3-D, and not just a few layers but many of them. The layers are interconnected, which makes the interactions really hard to understand, because, on top of everything else, they’re also operating at the quantum level.

So for those kinds of complex things, where the behavior isn’t driven just by what touches or doesn’t touch but also by properties that change over time—thermal reactions and so on—there are all kinds of engineering domains where a visualization can show what is going on. And it doesn’t have to show the real thing. It might just show different colors, so the people working with it can understand what’s happening. On top of that, if the visualization is interactive, allowing you to change parameters on the fly, it will generally be better than just a static picture. Again: words, pictures, videos, interactive visualization. Kind of an evolutionary path.

This is also the reason Dr. Cruz-Neira chose “emerging” to be part of the center’s name. Providing those industries, through software and technology, with incremental updates—a picture now, and later on better pictures, or more pictures in less time—is the core mission of the EAC.

I’m told you have lots of students working in research projects, many of them industry funded. Is that unusual?

Very, especially in the U.S., and especially involving undergrads. But the National Science Foundation encourages people who ask for support to also support student research as an add-on to traditional grants, where the NSF pays for the research education of the undergrads involved. This is a nice way to filter and encourage talent. We use undergrad aid to create a talent pipeline into graduate studies, which might just be a Master’s degree. But a Master’s degree in computer science today is basically a middle manager at a technical level tomorrow. These are people who manage, for example, 10 other people and can make twice the median pay in Arkansas. So involving undergrads in research helps both the research and, in the longer term, the undergrads themselves.

Can you talk about some of the research projects you’re doing?

Sure. And I should say that one of the ideas behind the EAC is to collaborate with local industries in Little Rock or throughout Arkansas.

Right now we have a small project with a company called SolaRid, which provides agriculture businesses with what SolaRid calls “the sustainable solution to insect monitoring.” They have an interesting product based on the Internet of Things. Basically, it’s a solar-powered device with a lot of sensors. It can indicate the temperature. It can figure out where it is on the earth. It has a small electrified grid. It can switch on lights, which attract insects. And when the insects fly into the electrified grid, they get fried and the remains fall into a little bag.

So we can do analysis on top of that: Over a certain amount of time, how many ounces of insects did we get? But we can go further—while we’re frying the insects, we can suck up the fumes and try to figure out what kinds of molecules are in there, which could give us a hint as to what kind of insects we actually have. All of that can help in analyzing actual conditions on farms, allowing farmers to react in real time to changing conditions, possibly without overusing chemicals.

Out on a farm field they don’t have just one of these IoT devices. They have several, connected in a network. That means it’s not just a singular point of measurement. This could be set up to look at a whole region, and depending on the insect strike counts—that’s what we call the frying of insects—we could actually follow a swarm through several fields. Another way of using data to predict and project.
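The swarm-following idea above can be sketched very simply: given hourly strike counts from each trap in the network, see which trap is busiest hour by hour. The trap IDs and counts are hypothetical illustration data, not real SolaRid telemetry.

```python
# Sketch: follow a swarm across a network of traps by finding, for each
# hour, which trap recorded the most insect strikes. Trap IDs and counts
# are hypothetical, for illustration only.

strikes = {
    # trap_id: strikes counted in hours 0..3
    "field-A": [90, 30, 5, 0],
    "field-B": [10, 80, 40, 5],
    "field-C": [0, 15, 70, 85],
}

def swarm_path(strike_counts):
    """Return the trap with the highest strike count for each hour."""
    hours = len(next(iter(strike_counts.values())))
    path = []
    for h in range(hours):
        busiest = max(strike_counts, key=lambda trap: strike_counts[trap][h])
        path.append(busiest)
    return path

print(swarm_path(strikes))  # the swarm appears to move from A to B to C
```

A production system would of course weight by trap location and wind, but even this toy version shows how networked point measurements become a regional picture.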

The EAC’s current collaboration with SolaRid is to develop end-user software for SolaRid’s customers. A farmer can look at the status of their devices and make decisions. An integrator may look at a larger region and say, “Oh, I saw a disease list changing. Let me give you a different profile so the right insects are being attracted and fried on your devices and not the wrong ones, such as bees.” You know, we want to keep the bees.

We’re also involved in something that’s funded ultimately by the Department of Homeland Security, in which we’re working on several different sub-projects. One is to take a look at X-ray images and detect not just solid threats like a knife or a gun, but also biological threats.

Such as?

Sausages from Poland—things like that. But there is also an increasing amount of traffic in organs, blood samples, and that kind of stuff. They are detectable because X-ray machines are not just taking photographs of something. They actually give you, over several disconnected steps, an ability to analyze things like atom configuration or electron setup. You’re seeing a certain density in the image, and that gives you an idea of the chemical compound, or the range of chemical compounds, that are possible for the detected configuration. So we can actually distinguish between plastics, light metal, heavy metal, and biological material. And the devices are getting better.

Cybersecurity is part of the EAC now too, right?

Yes, together with Philip Huff. We were very lucky to get him on board. We actually do three things. We’ve built what we call the UA Little Rock Cyber Gym, which Philip developed as part of his Ph.D. work. It’s basically software that allows students to run cybersecurity workouts in the Cloud; a workout is like a training unit in a gym—for example, how to defend against an intruder or how certain malicious activities work. Because these workouts are self-contained in virtual machines and never connected to real systems or networks, students can experiment and experience what would be at the very least problematic in the real world. This allows us to set up educational training material that says, “This is the problem, and this is how you would usually solve it.” And then you can click and go into one of those Cloud environments to work on the problem. We are currently working in collaboration with Virtual Arkansas and the Arkansas Department of Education to provide the Cyber Gym as a course resource for high-school cybersecurity courses, fully online, fully interactive.

We are also looking into cybersecurity research for power distribution. This is the field Philip Huff originally comes from. Here the problem is that power companies are highly regulated, so if they have to update the computer and control systems on their networks, they can’t just walk around sticking in thumb drives or update their computers in an arbitrary way over the network. They have to have a plan. They have to plan in terms of: if this machine doesn’t update correctly, what could go wrong? Because they have network effects, you know? As I like to say, “If the power goes out, people die.” So keeping the power on is of the highest priority.

We are working on two projects for the power-distribution industry. One is basically what Phil is driving, from his experience, and that is to automatically build schedules for doing firmware and patch updates over networks. The goal is to figure out the best way to do that, to (a) minimize the effects if something goes wrong, and (b) have a Plan B, C, and D if something does go wrong. So if this or that fails, what do we do next, and so on? And you get a plan of action out of that. The solution is similar to a complex schedule and can be executed and verified using software automation.
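One core constraint in such a schedule is simple to state: devices that back each other up must never be taken down in the same update window. Here is a minimal greedy sketch of that idea; the device names and redundancy pairs are hypothetical, and this is not the actual scheduling software.

```python
# Sketch: batch firmware updates into windows so that no two devices
# that back each other up go offline at the same time. Device names and
# redundancy pairs are hypothetical, for illustration only.

devices = ["relay-1", "relay-2", "rtu-1", "rtu-2", "hmi-1"]
# Pairs that must never be updated in the same window (mutual backups).
redundant = {("relay-1", "relay-2"), ("rtu-1", "rtu-2")}

def conflicts(a, b):
    return (a, b) in redundant or (b, a) in redundant

def build_schedule(devs):
    """Greedily assign each device to the earliest conflict-free window."""
    windows = []  # each window is a list of devices updated together
    for dev in devs:
        for window in windows:
            if not any(conflicts(dev, other) for other in window):
                window.append(dev)
                break
        else:
            windows.append([dev])  # no safe window yet, so open a new one
    return windows

print(build_schedule(devices))
```

The real problem adds failure probabilities and fallback plans on top, but the skeleton is the same: a constrained schedule that software can execute and verify.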

The other project is more my métier: My students are using virtual machines and virtualized technologies to simulate control systems in substations, because substations aren’t just relays and breakers and so on. They also have a lot of computers, and everything is computer-controlled today. But usually, this is hidden behind, let’s say, industrial systems, and it’s hard to get information on what they’re actually doing, what state they’re in.

So we’re working on getting more of an idea of what’s going on in the network itself, so we can at some point say: This is the baseline behavior or traffic in a system, including the substations. This is the normal situation. Then we can use AI or machine learning to see where the fluctuations or irregularities occur. Is there something in there that tells us, no, that shouldn’t happen? Given the volume of events and information in such industrial networks, being able to just pinpoint where things happen will greatly alleviate operator stress.
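The baseline-and-deviation idea can be sketched with nothing fancier than a mean and standard deviation: learn what “normal” event volume looks like, then flag counts that fall far outside it. The event counts here are hypothetical, and a real system would learn baselines per device and link, with far more sophisticated models.

```python
# Sketch: establish a traffic baseline from past observations, then flag
# minutes whose event counts deviate strongly from it. The counts are
# hypothetical, for illustration only.
import statistics

baseline = [100, 98, 103, 101, 99, 102, 100, 97, 104, 101]  # events/min
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(count, threshold=3.0):
    """Flag a count more than `threshold` standard deviations off baseline."""
    return abs(count - mean) / stdev > threshold

live = [99, 102, 250, 100]  # one suspicious spike
flagged = [c for c in live if is_anomalous(c)]
print(flagged)  # -> [250]
```

Pinpointing which minute and which device to look at is exactly the stress relief for operators described above.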

The UA Little Rock CAVE showing how a feature at the Tulsa Zoo would look when built.

Not all VR or AR applications capture my imagination as much as the CAVE technology did when I visited you back in January. Can you talk more about how the CAVE is used in industry?

One of the core ideas behind the CAVE is to put the viewer into a space whose walls are basically all projection screens, and to take you somewhere else. It’s like a movie theater with multiple screens. The difference is that in a CAVE no movie is shown; instead, images are rendered in real time based on user interaction, just like in any computer game, but really big. The content of these images could be the inside of a molecule, or a simulation of the solar system, the galaxy, or the universe. You can manipulate objects, you can navigate, you can fly through the whole thing. So you can do all the things that you cannot do in the real world, including getting into a volcano or inside the sun, because, well, you would not survive even getting near the sun.

So we’re basically going from an analysis point of view, where things are static and mathematics driven, to image representation, where we interact with the images. And the interaction with the images or manipulation of objects changes something in the simulation, which in turn will change the images in a way that we might then say, “Oh, I expected that,” or, “No, what is that? Why is that? Can I see the numbers behind that, please?”

CAVEs are all very similar, but you never find the same CAVE in two places, because they’re hand-built. They are unique. That also makes them extremely expensive, not just to build but also to maintain. The CAVE we have is seven or eight years old, so for graphics people, that’s two or three generations in the past. It has lots of display tiles, so it’s still fairly high resolution, but replacing some of the technology with new technology wouldn’t really work, even if someone wanted to spend the necessary money.

In that kind of environment, what you tend to concentrate on is things like architectural visualization, such as what we developed for Nabholz to show life-size designs to their clients at the Tulsa Zoo. You want to show the immersive, real-world experience of something that, if you need to build it, would probably cost $10, $20, $50 million. With a CAVE it is possible to show more or less the actual real-world appearance to the people who have to sign off on it. They are able to experience the building before it’s built, which is quite different from looking at 2-D plans, and it allows for discussing and experiencing variations before spending money on construction. Essentially, decision making becomes more confident.

Car manufacturers, or companies that work with car manufacturers, have used CAVEs for a long time because they provide a really big field of view. The bigger a display’s field of view, the more we accept it as reality. You can do design in this environment. You can do training in it as well. This is what the military tends to do, to be able to train for situations that would be too costly in the real world, either in terms of material or human life. So with virtual reality, especially in a CAVE, you can practice things from the physical world, like the take-off and landing of an airplane, without many hours of training and without endangering lives or the environment.

When we get to travel again, we’ll continue to go to the I/ITSEC conference and trade show. It’s held every December in Orlando, though it’s been “virtualized” for this year. That’s the place where all the new toys related to military technology are put on display.

One thing I saw last December was a round screen with a speedboat mockup and a virtual environment projected in the middle of the screen. Sitting in the boat, you were in full control. And the best part was that the mockup sat on a high-performance hydraulic platform that could simulate decent waves and highway speeds. It would give you the real-world experience of driving a boat on the open sea at 60 miles an hour, bouncing up and down. When I tried it, I ultimately had to press the emergency button after two minutes, because I’m not cut out for that. It was really, really, really scary, and the simulated waves were not that big at all. But without such a simulation, you wouldn’t actually know what to expect. You would just come to the boat, put on a life vest, and someone would say, “Okay, let’s just take a ride.” And you’re thinking, What could possibly happen? What could possibly go wrong?

Ultimately, virtual reality and augmented reality technologies will help us better prepare for challenges in an increasingly complex physical as well as virtual world, be that for professional training, exploratory research, or plain recreational use. At the EAC, we’ll contribute with research, and also by educating the next generation of computer science and engineering experts. Fun times are ahead.