Scott Anderson,
Executive Director, Forge Institute’s
American Cyber Alliance
WITH OCTOBER BEING Cybersecurity Awareness Month, we sat down with the recently named Executive Director of the American Cyber Alliance, a program powered by Forge Institute and dedicated to building public-private partnerships to reduce cybersecurity risks through training and collaboration with stakeholders in business and government. A 24-year veteran of the U.S. Air Force, new “civilian” Scott Anderson brings decades of military cybersecurity experience to the table.
———————————————
Did you have the beard the last time I saw you?
No, this is new. In the Air Force, I had to shave almost every day for 24 years, and before that I couldn’t grow facial hair. I told myself, “When I get out, I’m growing a beard.” My retirement was effective in April.
Well, congratulations on that, and also on being named executive director of the American Cyber Alliance at Forge Institute. Tell me about your cyber background.
In the Air Force, I started off doing electronics, avionics, aircraft communication, and navigation systems. I did that for eight years on active duty. After that, I joined the Arkansas Air National Guard—all 24 years of my Air Force career were here in Arkansas.
Where are you from originally?
I grew up in Washington state, at the southern base of Mount St. Helens.
Yep, I remember when that blew. So you were talking about joining the Air National Guard in Arkansas.
Yes, when I first joined, I heard there was an intel unit that was growing out on the Air Force Base in Jacksonville. I went and talked with them and decided to cross-train to work on computers to support that intel mission. Then, four or five years later, I was commissioned as an intelligence officer.
In 2014, when that intel mission moved to Fort Smith, I had an opportunity to stay behind and stand up a Cyber Schoolhouse under an order from the visiting Secretary of the Air Force. Six months later we were offering our first course, which is “light speed” when it comes to the Department of Defense.
That’s when I started reaching out and figuring out what cyber activities were going on in the state. I met some folks from academia and from the private sector, and I just started building on the already established community. My goal was to see how we could all work together to benefit the state of Arkansas. And maybe build some best practices that we could replicate across the country.
October is Cybersecurity Awareness Month. What’s your message?
To put it simply, I think we as a state need to understand cyber better. There’s a lot going on in cyber right now, and leaders need to be thinking about prioritizing cyber and cybersecurity and managing the risk in this complex environment.
What do you mean—“a lot going on in cyber right now”?
Coming from the military, I would say that what we call “the attack surface” is growing with everybody using and integrating technology into everything. It was growing before COVID, but it’s growing exponentially now because of the pandemic. Think about how many more people are working from home, and how many are using technology they’ve never used before. I’ve been told that cyberattacks have increased over 350 percent since COVID started. Another challenge is the adoption of new technologies (5G, Internet of Things, etc.) and investments in broadband. They enable great capabilities but also dramatically expand the “attack surface.”
Wow. What kinds of attacks are these?
Many are just common criminals trying to steal people’s money through scams or fraud. Everybody’s read about the fraud around today’s unemployment situation—all the fraud tied to COVID grants and loans. And I just read an article saying there are 6,000 new phishing attempts coming out every single day.
But there are also Advanced Persistent Threats (APTs), threats from various nation-states that want to cause harm and damage to our country, the American people, and our way of life.
People need to be thinking about everything that’s going on. Cybersecurity, like physical security, is everyone’s responsibility, not just the responsibility of IT or cybersecurity professionals. It’s every single American’s responsibility.
So do you think most people just don’t get it, or don’t care about it enough, or just aren’t aware of what they can or should do?
I think it’s all of the above. Some people don’t understand it. Some people don’t care because “it happens to other people,” not to them. There are also people who are just blind to the threat. And there are a lot of folks who understand the threat but aren’t equipped or empowered to mitigate it.
Everybody wants to implement technology to solve their problems or to make life easier, but when we do that, there’s a security risk. Everybody has home Wi-Fi. Do they share that password with everyone who comes to their house, or do they have a guest network set up?
Right. Do you share passwords?
Not all of them.
So how else is COVID specifically tied in with this?
Well, because people are working and learning at home in greater numbers, there’s been a lot of new equipment purchased during this pandemic. When purchases or new technology capabilities are rushed into your infrastructure, who knows what you’re introducing into your network?
Another example may be the privacy risk of engaging in telehealth, which has really come into its own during the pandemic. The doctor texts you, and you click on a link.
How would that be risky?
I don’t know what security practices my doctor’s office has implemented. When I choose my doctor, I don’t necessarily say, “Hey, what software do you use? Are your remote consults or appointments secure? How are you making sure no one is eavesdropping?” I’ve learned a new term since COVID started: Zoom-Bombing, where virtual meetings are either hijacked or someone drops stuff in there that’s not appropriate. If adversaries can do that on virtual meetings, is it possible for them to do it in your conversation with your doctor?
Let’s talk about kids and phones and screens and the Internet.
Not sure I am equipped to answer this question—I really think parents should be talking with their children about cybersecurity. And that reminds me, when I hear the word cyber, I think of all technology, not just cybersecurity. I have four boys, ages 10 to 22, and I talk with them about what they should and shouldn’t do with their technology.
But I don’t think most parents are well enough equipped to talk with their kids about cybersecurity and what they should do online and what they shouldn’t do online. Again, security is everyone’s responsibility, and so is cybersecurity. You want your kids to understand the ethical means of what you can and can’t do with technology, but that’s also huge from a safety and a physical security standpoint. Don’t post your location. Don’t share your personal information, because once it’s out there on the web, it’s out there.
I’ve read a couple articles about human trafficking, and how people are using technology to abduct children. In the old days, kids were taught not to talk to strangers. Now everybody is talking to strangers all day long.
Don’t ever post your location?
I think understanding where you’re at risk is the key. Now, obviously, when I use GPS, I allow access to my location. That’s what you have to do for GPS to be useful. Otherwise, you might as well go back to the old map in the glovebox. Dated myself a little there.
But what I’m talking about is people checking in on social media. If they’re checking in at the beach or even posting a public picture of them at the beach—well, now there’s some risk, because maybe nobody’s at home.
That reminds me of all these Internet of Things gadgets we have now—Alexa, remote doorbell cameras, and so on. We hear about guys spying on children through Alexa or through some kind of camera in their room. These things must be adding to the threat.
They are, and 5G is going to grow the number of connected devices significantly. Just like with COVID, when we start adding all of these smart devices and integrating everything in our lives, the attack surface absolutely gets larger and the attack vectors multiply. I have IoT devices hooked up at my house, but there are ways I set them up to reduce some of the risk.
But I would say that lots of times when IoT devices are built, security takes a back seat, not only in the hardware but even in the software that runs them. Often, developers open everything up because they want it all to function; when you start adding security constraints, the functionality might not work as well. They want the user experience to be high, so security becomes an afterthought.
Are you concerned about the election?
I don’t know a ton about election security, except every system has vulnerabilities. I doubt anybody’s going to hack into a voting machine; even though teams have done it, you have to have physical access. You can’t remote into it.
My concern is more in the information operations or influence operations. If there are news articles or social media postings saying that something happened, whether it happened or not, people are going to lose confidence in the voting system.
Do you have a worst-case, worst-fear cyber threat scenario?
I’m not a fearful person. But I hate that we have all of this knowledge and ability as a country, as a society, and yet there’s not enough getting done. I’m afraid we’re not doing everything we could to protect the American people. Knowing the threats as I do, and knowing the damage that could be caused, I think we’re walking a thin line.
I don’t want people to start dying before we secure our infrastructure from a cyberattack. I don’t want people to start starving. I don’t want everybody to be unable to work, things to shut down, or the American way of life to just be over.
I feel like we’re getting a little taste of that in this pandemic—the various fallouts from it. Not just the deaths but the financial hardship.
Yes, the third-order effect of people being unemployed or not being able to provide for their family. Even the uptick in criminal activity around fraud and cyber activity—some of that could be because these people are now unemployed, and they’ve got to figure out a way to make money.
Training is a big part of ACA’s work. Tell me about that.
We provide training tailored to our customers’ needs. For example, if we had somebody from Entergy in a class, we would make sure they’re getting information that’s relevant and useful in the energy sector. Same with banking, or whatever sectors our attendees might come from.
Our training ranges from beginner to intermediate to advanced and also varies in length, from one- to four-day courses. And then we’ve built a 14-week bootcamp that takes folks who’ve had some prior information technology experience and turns them into cyber analysts. We’ve partnered with ACDS to build pathways for people to get into cyber. With ACDS, that 14-week bootcamp can be an apprenticeship program.
Today, cyber IT professionals and cybersecurity folks within organizations don’t fit the typical engineering profile that used to be the norm. Now it’s more of a first responder role, like a firefighter or an EMT or a cop. When a cybersecurity incident happens, they get called and they have to resolve the problem.
The majority of our training addresses the kinds of things they need to know to deal with cyber threats within organizations. From my military experience, I think of offensive and defensive operations, red team and blue team. I think of the movie War Games. Understanding the adversary and the different attack methods from an offensive operational standpoint equips you to be a better defender. We integrate this kind of learning into our curriculum and hands-on exercises, both in our micro-bootcamps and our 14-week course.
So more and more companies are including a cybersecurity person or persons on their staffs. Is that correct?
That is correct. But everybody should be doing more. While I would say that organizations are taking it more and more seriously, often companies are looking for hardware or software as solutions to solve their problems. I think they’ve got it wrong—training and developing their people is a better solution. People who implement and sustain the hardware and software need to better understand cyber and the risks. So again, I think all organizations should be making cybersecurity a priority, and should be hiring or developing their team to secure their systems and make their company more resilient.
What do you see in the future? Will the threats just keep getting worse?
What I see are continuing advances in technology, which means the number of ways people can attack keeps growing. If attackers are using some kind of machine learning or AI to attack systems, the only way to defend yourself is through machine learning and AI. So we at ACA are making sure that as new technologies are introduced, we integrate them into our courseware.
Then the question is, how do you integrate that into operations? There’s great innovation and research being done all over the U.S., and all over the world—everybody’s doing operations, to some extent, within their own companies. The problem is, these are all siloed. So we need to bring these things together, and train and develop people on how to operate utilizing new technologies and innovation. At ACA, this is our operational philosophy, and that’s the sweet spot in my mind.
Because then you could develop best practices. You could be more relevant and successful at solving problems that we all face. Besides forwarding our operational philosophy into joint action, at ACA we focus on awareness. And let me say it again, cybersecurity is everyone’s responsibility.