On Sept. 30, Damon DePaolo, director of cybersecurity talent and education at MassMutual, joined Living Security co-founder Drew Rose for number six in our Breaking Security Awareness webinar series, Unifying Risk. The discussion at hand: when it comes to the human element of an organization, what are the combinations of behaviors that create risk? And how can we address them holistically through education, helping our employees to adopt security-minded behaviors?
Key takeaways from their discussion:
- When it comes to risk, if you're not actively doing something to change behavior, the behavior's going to go in the wrong direction.
- To make secure behaviors persist, you need to help your people develop a security mindset.
- Creating a security mindset means building security awareness training programs that connect to all areas of people’s lives, not just the ones that pertain to their jobs.
- Your security organization has all the data you need to start to look at the human risk elements. It's just a matter of breaking down silos and seeing things holistically.
- Look for the combinations of behaviors that contribute to risk which an individual team might not see because it's focused only on what resides within its realm of responsibility.
- Creating a culture of security depends in part on making your users want to make secure decisions, because they know that it’s in their best interest and that it's not going to hamper their work.
- People connect better to real-world examples than to generic edicts.
Couldn’t make it to the live event? We’ve got you covered. You’ll find the transcript and the video of the webinar below.
And if you like this month’s discussion, make sure to join us on November 18, when we’ll be talking with Nick Marcheselli, senior security engagement specialist for LogMeIn, about what it takes to seamlessly implement a successful security awareness training program. You don’t want to miss it, and not just for the sterling insights—there are always fun freebies when you show up live (iykyk).
Breaking Security Awareness Webinar 6, Unifying Risk
Featuring Living Security co-founder, Drew Rose, and guest Damon DePaolo, director of cybersecurity talent and education at MassMutual.
Brandyn Hampton: Welcome, everybody, to our BSA webinar number six, Unifying Risk. We're going to take a few more minutes to let everybody in, and then we'll get started. All right, I think we've got a good amount of people in, so let's go ahead and get started.
Hi, everybody. I'm Brandyn Hampton, event marketing manager here at Living Security. Welcome to the sixth webinar in the Breaking Security Awareness series that we started this year. Today we're going to be talking about unifying risk. Our host is Drew Rose, CSO and co-founder of Living Security, and our guest is Damon DePaolo, director of cybersecurity talent and education at MassMutual. Drew is a CISSP with a bachelor of science in cybersecurity who has spent years building and optimizing security programs in the public and private sectors. While serving in the military, Drew learned effective strategies for fighting cybercrime and earned a top-level security rating in the US government. At Living Security, Drew applies his in-depth knowledge to reducing enterprise and personal risk by designing science-based, collaborative security awareness programs. Drew, I'm going to kick it to you so that you can introduce our lovely guest, Damon.
Drew Rose: Thank you, Brandyn, and it's a pleasure to be here, excited for today's conversation with one of my favorite people. As Brandyn mentioned, my cohost on this journey of unifying risk, on our latest edition of Breaking Security Awareness, is Damon DePaolo. Damon has worked in various aspects of technology for 21-plus years, including programming, research and development, and, of course, cybersecurity. He finds passion and joy in bringing his unique brand of creativity to every new challenge, and he's an advocate for the value of continuous learning and self-development. Damon is most happy and engaged when experimenting, creating, or tackling a new problem. Damon, welcome to Breaking Security Awareness. How is your day, sir?
Damon DePaolo: It's going great, Drew. Thanks for having me.
Drew: No, it's a pleasure. First off, to my audience, thank you for being here today. We're excited to have you share some time with us this afternoon, and we really hope that the conversation we're going to have is going to be beneficial. But I don't want this to just be a two-person conversation. I will be monitoring the chat. I will be looking for questions. If you are interested in joining the conversation, let us know. I'd love to bring you in and see if you have a contribution or a question, or maybe you can give your own perspective on the topic we're chatting about. So be ready. Curve balls are incoming, as baseball season is wrapping up.
The first question, this is a bit of an icebreaker. I did not prep Damon with this icebreaker question. I actually have two. The first one is, who would you rather have on your red team, Ethan Hunt from Mission Impossible or Indiana Jones, and why?
Damon: All right, so in full disclosure, I'm a huge Indiana Jones fan; always have been, my entire life. My initial goal in college was to go into archeology purely because of Indiana Jones, and I stopped when I realized there were no fedoras or whips involved. So I'm going to have to go with Indiana Jones, but not just because I'm a huge fan: because he is amazing at getting around obstacles and thinking in the most innovative capacity when he has to solve a problem.
Drew: That is an excellent answer, not just because of the fedora, but because of his way in and around buildings. I would have to choose Ethan Hunt, though. I just went on a Mission Impossible binge; I've done a couple of cross-country trips, and I binge-watched all six of them. What I recognized is how Ethan Hunt approaches trying to get access. I look at his methods, and it's very much defense in depth, going from one layer to the next to the next, and thinking through all the things that could go wrong or the different places he could get found out during those missions. That was very exciting to watch as a cybersecurity professional. Last question, then I promise we'll get into the good stuff. Who would you rather have as your CSO, Gandalf from Lord of the Rings or Darth Vader? And go.
Damon: I'm going to have to go with Darth Vader, I think. Someone who understands the ins and outs of the Dark Side is going to be able to help us really find ways to protect against it. And he has that glimmer of goodness inside that shows through at the end, so you know he wants to do the right thing.
Drew: Come on.
Drew: I feel like Darth Vader just embodies ... or self-portrays ... the cybercriminal stereotype. He is the epitome of the hoodie-wearing hacker that we security professionals are trying to get away from.
Drew: He's not a realistic leader of the security team.
Damon: Fair enough.
Drew: Thanks for playing along. So today's topic is really about unifying risk. Let's start high level. When you hear those words, unifying risk, what do you think of? What are the first things that come to your mind?
Damon: So, for me ... And it's important to point out that I'm coming from the angle of someone who's focused on security education. So when I think of unifying risk, I think of understanding all of those risks that come along with the people element of your security domain and how we can look at them holistically and bring them together and figure out what is the greatest risk from our people element that we can try to solve with education. What are the combinations of behaviors that create risk in order for us to really understand what we can do to enable our human element to be more secure?
Drew: I think that's a key point that we're going to be digging into: how we enable the human factor to reduce risk, right?
Drew: If you're not reducing risk, I think risk is increasing. If you're not actively doing something to change behavior, the behavior's going to go in the wrong direction, right?
Drew: It's really similar to exercising your muscles. You can get really strong and fit, run a six-minute mile, but if you stop exercising, then two, three, six months from now, unless you're an Olympian, you're not going to be able to match where you were. I think we're learning in these last several years in our industry how we can help our end users exercise those muscles and get stronger when it comes to identifying risky areas of their life, both at home and at the office.
So that really brings us to where we're at today. It's the eve of cybersecurity awareness month. As a company, Living Security has been in business for ... This is, I think, our ... '17, '18, '19 ... our fifth cybersecurity awareness month. I had to count. This is our fifth cybersecurity awareness month in which we're trying to help bring solutions to clients. We see this as an opportunity for companies that aren't necessarily exercising that muscle throughout the year to do it during this month, for many reasons. So what I'd love to learn from you right now is, what are some of the things you're doing during cybersecurity awareness month to help exercise those muscles that relate to reducing risk?
Damon: So one of the things that I'm most proud of that we're doing during cybersecurity awareness month and that we're trying to do more across the board is bringing security education back to topics that matter most to people. So I fully believe that, if you really want to exercise those security muscles, if you want to build them and make them persistent with people, you have to build a security mindset into the individual, not just into your security team, not just with security professionals, but into every individual that you work with.
So build and provide content that relates to people in a way that annual security awareness training doesn't, and that things tied to their job duties might not. Connect it to their children, to the elderly parents they take care of, to other people in their lives, to their home network: things that resonate on a personal level and make them feel like you're interested in giving them the techniques necessary to secure their lives. That's going to build a security mindset in them that lasts into their work behaviors as well. It's going to make them think about security as an important element in their lives, not just a task they have to do as part of their work. So webinars like some of the great content you've put together, the Family First series, are going to be an integral part of our cybersecurity awareness month, because they connect with people on that very personal level.
Drew: So I think that's one of the first things we're hearing in terms of unifying risk: if we can unify the thinking around secure decision making both at home, where, to be honest, people care most, and at the office, we believe that behavior change at home will correlate to behavior change, or at least thinking differently, at the office. Other than content, how do you help bridge that gap, and how do you do that at an organization your size?
Damon: Well, the key is really breaking down silos. Your security organization has all the data you need to start to look at the human risk elements. It's just a matter of seeing it holistically. In a big organization like this, we have well over 100 security professionals across a bunch of different areas, and they are amazing at taking the data they have and using the tools they have to identify risks, respond to incidents, and mitigate risks in their realm. But when we think about the unified risk of the person, the human element, there's not always someone there looking at that and trying to get ahead of it.
So I think, to take that first step into understanding and unifying human risk, you need to enable and empower a team of people to take all that disparate data from across your different organizational elements and tools. Look at your SIEM and your logs and your outputs and identify which behaviors matter most to the risks in your organization. What combinations of behaviors might an individual team not see because they're only focused on what resides within their realm of responsibility? What combinations of behaviors, each maybe minor on its own, add up to a bigger problem that needs to be addressed early, before it becomes something the email team has to take care of because someone has started responding to phishing emails or the like?
So I really think that the best way to take that first step toward pulling that picture together is to start breaking down those silos, engage and empower a team to look at the data holistically and pull it all together, and then give them the ability to make decisions to start mitigating that risk. When we're talking about human risk, a lot of the time those decisions are about education.
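The unified picture Damon describes, pulling each user's behaviors out of otherwise siloed sources and weighting their combinations, could be sketched in miniature like this. Everything here is hypothetical for illustration: the event feeds, field names, and weights would come from your own SIEM, phishing platform, and endpoint tooling.

```python
from collections import defaultdict

# Hypothetical per-source event feeds; in practice these would be pulled
# from your SIEM, phishing-simulation platform, DLP, endpoint manager, etc.
events = [
    {"user": "alice", "source": "phishing_sim", "behavior": "clicked_link"},
    {"user": "alice", "source": "endpoint",     "behavior": "outdated_browser"},
    {"user": "bob",   "source": "dlp",          "behavior": "blocked_upload"},
]

# Illustrative weights: each behavior alone may look minor to the team
# that owns it, but combinations add up.
WEIGHTS = {"clicked_link": 3, "outdated_browser": 2, "blocked_upload": 2}

def unified_risk(events):
    """Aggregate behaviors per user across otherwise siloed sources."""
    scores = defaultdict(int)
    behaviors = defaultdict(set)
    for e in events:
        scores[e["user"]] += WEIGHTS.get(e["behavior"], 1)
        behaviors[e["user"]].add(e["behavior"])
    return {u: {"score": scores[u], "behaviors": sorted(behaviors[u])}
            for u in scores}

risk = unified_risk(events)
# alice's combined score crosses a threshold no single team would see,
# flagging her for targeted education before an incident occurs.
```

The point is not the scoring scheme (any real program would tune it) but that the join across sources is what surfaces the combination.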
Drew: So two years ago, the cybersecurity landscape looked a little different in terms of control. We were able to control the majority of our end users through defense in depth, a layered perimeter, because they were at the office and we could control their internet connection. We had our firewalls and application firewalls and our proxies and all of these things in place to say, okay, if you are going to do something irresponsible or against policy, we'll have these controls to stop you.
A little while ago, someone real smart said to me, "We're not working from home; we're living from work." So what that means is, in the last 18 months, going on two years, everyone has obviously gone home, and that perimeter has changed. We don't have that control. We can't walk around and see what kind of PII people are leaving on desks anymore. We're missing out, I think, on a lot of key data points that used to make it much easier to assess whether our end users were following policy. So from your perspective, when you think about gathering data points to make better decisions about current behavior, how has that shifted from two years ago to today? What are some of the key things you're focusing on now, given that most organizations are never going to go back to normal?
Damon: I think you really nailed it. A lot of what's changed is that we're so much more distributed now. You don't have the physical capability to monitor what's happening to people, around people, or in people's environments anymore, which really means you have to build a more resilient and more flexible security posture. You have to have a strong focus on what to do when an incident or some sort of control break has been identified, because in this type of environment ... I mean, in any environment, there's no way to stop control breaks and incidents from happening entirely. With as much defense as you can muster, you can try to mitigate as much risk as possible. But if anyone ever tells you that they're going to remove all risk, just turn around and walk away, because they're selling you snake oil.
So going into a world now where we're so distributed, a lot of trust gets put on the people. There's an inability, like you mentioned, to lock things down to the level you were previously capable of. But even in those environments, even when you could lock down everything around people, people found a way. They always found a way to get past those controls, because the individual, the person who needs to do a job, is most concerned with one thing when you think of it from the perspective of the CIA triad, and that's availability. They want to get to their data so that they can do what they have to do, as quickly and efficiently as possible and the way they want to do it. People are amazing, and they are ingenious, and they are innovative, and they will leverage all of that capacity to get around your security controls in order to enable that availability. That's only getting harder and harder to mitigate now that we're distributed and people are spread out like this.
So I think it comes back a little bit to what I was saying earlier: to address that human element, you have to convince people of the importance of security. You have to enable them to make secure decisions, to want to make secure decisions, because they know it's in their best interest and they know it's not going to hamper them. So you have to make the secure path easy, with just enough friction that they don't do the bad things, and keep that availability there so that, when they hit up against the friction, they can do the right thing and go the right way. And you keep educating them on the importance of security in their daily lives, ingraining it into the work they do, so they don't look at it as a blocker or an add-on.
Drew: I call it security experience. Think of technology companies in the B2C space: apps with in-game purchases, all of those addictive mobile games. Those games have made it insanely easy for the user to purchase coins, tokens, extra lives, extra levels, whatever. They create the most frictionless experience they can for the user to increase their revenue, and they've invested heavily to ensure that workflow is smooth. What we do from a security technology perspective, when we think about the [inaudible], is make it so hard for end users to make the secure decision. So something that I've started thinking about [inaudible] within our [inaudible]. Let's make it easy for somebody to send out secure information via email. Instead, they have to install the add-on, and they have to type in the special six-digit code, and they have to press this button to send it to the right email [crosstalk]
Damon: Pick up your phone and use your biometrics there.
Drew: Have you come across any processes over the last couple of months or years that you felt were very difficult, where it was hard to achieve a good security experience, and worked to make them a better experience?
Damon: Well, I think the things you were talking about, a lot of those authentication methods, are a great example. We, like many companies out there, have been looking for the right one that's going to give users a seamless experience, that really streamlines the end user's experience and allows them to do things like move away from passwords, because nobody wants to be using passwords anymore. We ran into some bumps along the way, and we're still on our journey to get there, so I don't by any means want to say that we've found the perfect solution. But I think we've taken some really great strides, and we have worked with our users.
What's really important in this specific conversation is understanding your users' needs and connecting with them when you are engaging in efforts to utilize these products. You don't want to pick a product, say, "Yeah, this looks like it's going to be great," and give it to a bunch of your technical people, who are far more understanding of the foibles of technology. Give me something that's a little bit clunky, and I'm going to be fine with it, because I'm like, oh, yeah, I used to program; I know this is just how things are. So you can't rely on me to tell you that your product is great and that people are going to love it. Engage with your clients, the people within your organization, early and repeatedly, and give them [crosstalk]
Drew: Let's get tactical. You have 25-plus thousand people at your organization in financial services, so you're a huge target for hackers and attackers, and there's a lot of money and services at risk. How do you approach end users at a scale where you can determine whether or not you're making a good decision in implementing new technology, implementing a new process, designing new features, and so on?
Damon: The scale that you need is really not as big as you think, even with an organization this size. It's not so much the scale of the people you approach; it's understanding the right people to approach. You have to define your audience and build those personas of your key customers very clearly, very intentionally. Then you can go out and get a dozen people that fit that persona, use them as your focus group for the technology, and slowly expand out from there. By hitting the people who are going to have the most problems or be the most resistant, understanding who they are and what makes up that group of users, and bringing them in early on, even in small numbers, you're going to hit 80 to 90 percent of the problems you're going to experience. That's going to give you the feedback you need to move forward. It's all about taking small steps and adjusting along the way.
Drew: So it sounds like you're advocating for talking to end users.
Damon: Absolutely. Talk to people. People are who we're here to serve, so [crosstalk]
Drew: So you work in an organization, and you came up through the ranks of many different technologies, not purely as a cybersecurity architect. But you work with all of these super technical ... and I'm sure we have a lot of them on this call ... super technical cyber people who are in their zone when their hands are on the keyboard, they're looking down, and they're solving a hard problem. How do we convince those types of cybersecurity pros that conversations with end users are as valuable as you and I both believe they are?
Damon: I mean, I think they're more open to that concept than you want to believe based on movies like Office Space: "I take the requirements from the clients, and I talk to the developers." I think people understand the need to engage with others, given the world we're in and the connectedness we have right now. Understanding the need and wanting to are two very different things. When I was a highly focused developer type, I was definitely there: oh, I just want to put this together; I don't really need to talk to them; they'll tell me what's wrong with it once I give it to them. But that was 20 years ago, and the world has changed a lot since then. I think most people realize that.
But put an infrastructure in place, a people infrastructure, in your cybersecurity organization to help mitigate that. Put business information security officers in place who can be the key contacts for your customers and can broker or have those conversations. Have other elements of your team working together. It really comes back to the concept of unification: have a unified security organization so that your technical people can do what they do best and focus on those technical things, with a support network that can go out and get that customer feedback for them. That's a great start.
Drew: Do you have any memorable stories of these conversations, like an a-ha moment of, oh, man, I totally learned something new today?
Damon: Oh, geez, you're going to put me on the spot for a real-life story, huh? In the security space, nothing's coming to mind right away. But in previous roles, because I've been doing that kind of focused work on human-centered design and bringing people in early on, I can tell you that I've done work efforts with other people across my organization where we started from the point of building customer personas, understanding their people, and this could be our claimants, for instance, for some of our products.
Just the act of sitting people down and saying, "Gather this feedback. Bring it together. We're going to build a persona that defines who your core audience is and then talk about their journey" ... when you map that out with the people building the products and the people administering the products, and they can look holistically at who the person they serve is, the a-ha moments come nonstop, left and right: oh, I never realized. We get complaints here, and I see it now. I understand now, because I'm looking at it visually, as a journey of this person's interactions with us. It makes sense why people are frustrated. Here's the bottleneck. This is always where they fall down. From a security perspective, it might be: this is always where they're trying to get something done, and we're slowing them down, or our product is getting in the way or making it harder.
So I would say although a specific memorable moment isn't coming to mind right off the top of my head, every time I sit with someone and journey map one of their client's interactions with us, the a-ha moments come constantly.
Drew: That's powerful stuff. So let's get into the nuts and bolts of the conversation that we were intending to have today. There's a ton of technology. Every day there's a new startup or new tech coming out that says, "We'll protect you. 99.9 to 99.99 percent security denials or whatnot." A lot of those tools are important. The thing that I always like to say is, as a security awareness company, a human risk management company, we don't think that just educating your users and changing behaviors is going to eliminate risk. We believe that there is a place at the table and in the budget for firewalls and endpoint solutions and DLP. But we also think that there needs to be a bigger place at the table for human risk management, because that's where, effectively, a lot of the risk is coming in from.
So the threats out there are incredibly unpredictable and difficult to manage, but like you mentioned earlier, gathering data is a really good place to begin. Can you talk to us a little bit about how to measure and quantify the enterprise human risk factor?
Damon: Yeah. I just want to support what you're saying. I agree. I think the tools are never going to be the only answer. They're a hugely important component of it, and human risk needs to also be at that table. I keep trying to sing the song that human risk and education are security controls. A lot of people don't see them as such. They see awareness as a compliance thing, and we need to go beyond that, educate people, and recognize that it is a security control, because human risk is a horizontal risk that covers everything. Human risk comes into play every single step of the way, be it intentional, unintentional, whatever. So when we look at-
Drew: [crosstalk] misconfigurations [crosstalk]
Damon: Yeah, absolutely right.
Drew: I was just thinking of ... I'm a big analogy guy. All of these tools are like pieces of armor. I'm a huge medieval buff, as I'm sure you know. You've got chest plates. You've got the leg things. You've got the chain mail. You've got the sword. You've got the staff. But alone it's nothing. It's protecting something. It's protecting the human so the human can go out there and do their job, but without the human, it's useless. What are we effectively protecting with all these investments? The ability for our team to do their job.
Damon: Exactly, exactly. If the human putting that armor on has all the greatest intentions in the world but doesn't fasten these buckles here, then when they get into battle, it's going to fall off and they're vulnerable, or they make silly moves, or they run right into the dragon's mouth. So absolutely. I think once we get to that table, once we can get people to realize that human risk is an important risk that needs its own set of controls, then you can start having a conversation about how you pull that data together. I think we already talked a little bit about it. Step one is recognizing you need to. Step two is putting together a team that has the explicit mission to view human risk and data from a human risk perspective and bring it all together from the disparate sources you have. And step three is empowering them to enact controls as a result of the data they're seeing.
So they need to be able to observe trends. They need to be able to implement education activities or other types of outreach activities, or maybe even suggest technical controls that people haven't thought of yet that will add to the environment. But they need to be empowered to do that, to implement it, to have funding to implement it, and to be treated on the same level playing field as your other security control programs.
Drew: There are a lot of interesting threads we can tease out of a couple of those statements. I think a lot of us recognize that one-size-fits-all training is not effective. We all have different backgrounds, perspectives, job duties, job details. So I think part of what you said is, how can we pull data from these events to make decisions about who to target with educational material that's more relevant to them?
Drew: So we have a couple of questions coming in. This one is a very comprehensive question, which I love. This is from Nico: Most every ... I would say most. Most cyber programs have problems they want to solve. I guess every cyber program has problems it wants to solve. Maybe you're right there, Nico. DLP teams are looking at data protection, and IAM teams are looking at identity. How do you, as a security education program, unify cyber programs to look toward user education as a first step in human risk management?
Damon: So my thought there is that you're offering a service to those technical security programs, and you should have that conversation. Like I said, human risk affects every security program out there. Take DLP: there are going to be people who try to circumvent the DLP controls you put into place. There are going to be people who avoid implementing them. There are going to be people who are maliciously trying to get at that data.
So explain to those programs, and work with them to help them understand, that by folding the human risk factor into this unified human risk program or education program, whatever you call it, you are lightening their load and enabling them to focus on the technical controls they're trying to implement. You're taking that human piece and addressing it for them, because if the DLP team tries to address it purely from a DLP perspective, then when the IAM team has to address human risk, they're going to address it purely from an IAM perspective. What you end up with is a lot of noise for your end users that makes everything fade into the background. They're going to start tuning it out. It's too much. So by bringing it together, we can be targeted and intentional about the actions we take and enable those programs to work on their technical risks while we handle the human risk.
Drew: To really double-click on your comment: it's literally using data from your other security technologies to decide who you educate and on what topics. It's starting to isolate the groups of people who are most at risk in specific areas and focusing your energy there, instead of just trying to check the compliance box with your training programs each year.
Damon: I don't want to implement sensors to get my own data. That's already out there. Of course, we can integrate with yours and work together to make the best use of it.
Drew: I think there's some really cool information we can get back from these other data solutions in terms of insights. We recently did a study where, using data from other technology, we were able to show that users with out-of-date browsers or operating systems were more than 60% more likely to click on a simulated phishing email than users with an up-to-date browser or operating system. For me as a program owner, that's really powerful: all right, here we're identifying a group of users who are at risk because their computers aren't up to date, and they're clicking on simulated phishing emails. I need to go target them with something, whether that's training, a communication response, or changing the technical controls protecting their systems.
Damon: If you look at that through the lens of either of those other data sources alone, if it's the team managing the OS versions or the browser versions, their response is going to be: all right, this browser version's out of date, I need to send something to automatically update it or make the user update it. But that's not getting to the root problem for that user. If they see them clicking on phishing emails, they're going to say: all right, I need to give them some targeted phishing education. But, again, that might not be the problem, because they're exhibiting a whole host of insecure behaviors, not keeping their operating systems and browsers up to date, not paying attention to what they're clicking on, that could really mean there's a deeper problem. By seeing it all in one place, you get the opportunity to make different decisions that might be far more effective.
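The idea Damon describes, combining siloed risk signals into one per-user view, can be sketched in a few lines. This is a hypothetical illustration: the field names, users, and data sources are invented, and a real program would pull these signals from its endpoint management and awareness platforms.

```python
# Hypothetical sketch: combine siloed risk signals into one per-user view.
# All records and field names below are invented for illustration.

# Signal 1: patch/browser status (e.g., from an endpoint management tool)
outdated_software = {"alice": True, "bob": False, "carol": True}

# Signal 2: simulated-phishing clicks (e.g., from the awareness platform)
phish_clicks = {"alice": 3, "bob": 0, "carol": 1}

def risk_flags(user):
    """Return the list of insecure behaviors observed for one user."""
    flags = []
    if outdated_software.get(user):
        flags.append("outdated software")
    if phish_clicks.get(user, 0) > 0:
        flags.append("clicked simulated phish")
    return flags

# Users exhibiting multiple insecure behaviors may signal a deeper problem
# than either team would see on its own, and may warrant a different
# intervention than an automated patch or a single training module.
multi_signal = {u: risk_flags(u) for u in outdated_software}
high_risk = [u for u, flags in multi_signal.items() if len(flags) >= 2]
print(high_risk)  # ['alice', 'carol']
```

The point of the sketch is the join itself: neither dictionary alone identifies alice and carol as the users who need attention.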
Drew: I have this great book. You can't see it behind me because of my virtual background, but I'll pull it up. It's Inside the Nudge Unit. I'm not sure if you've heard of the nudge unit [crosstalk]
Drew: They're using research done by Richard Thaler. One of the things this nudge unit did was try to figure out how to get people in a certain county to pay their taxes on time. They ran some campaigns to test this out, and they were able to determine that saying things like, "Your neighbor pays their taxes on time. Why don't you?", almost a little bit of shaming, but this little nudge of, "Everyone else is doing it. You should do it as well," was able to increase the ability for these people to [inaudible] I was just thinking, man, I wonder if we could send a quick email to a user: "96% of the company updates their operating system within 24 hours. Why don't you?" Or something real, where the metrics make sense, like, "Hey, everybody else is doing this because they know it's important, and this is the risk of not doing it," bringing that to life instead of people just pressing delay, delay, delay. Then they're like, oh man, I know this is a little bit of an inconvenience, but me taking this action reduces risk for my company. I think it could be an interesting experiment. I haven't personally done that experiment.
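Drew's hypothetical nudge email could be generated directly from the metrics. A minimal sketch, with invented numbers and a threshold chosen only for illustration (social-proof messages tend to work when the majority claim is literally true):

```python
# Hypothetical sketch of the social-proof "nudge" Drew describes:
# compute the compliance percentage and build the message only when the
# statistic actually supports it. All numbers are invented.

def nudge_message(updated_users: int, total_users: int):
    """Return a social-proof nudge string, or None if the stat is weak."""
    pct = round(100 * updated_users / total_users)
    # A "nearly everyone does this" nudge can backfire if it reveals
    # that many people actually don't, so gate on a strong majority.
    if pct < 75:
        return None
    return (f"{pct}% of the company updates their operating system "
            "within 24 hours. Why don't you?")

print(nudge_message(960, 1000))  # strong majority: send the nudge
print(nudge_message(400, 1000))  # weak majority: None, don't send
```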
Damon: I'll say, not to go too far off on a tangent, but just the concept of bringing anything real you can into the conversation with the people you're trying to protect, when you're trying to educate them, is hugely impactful. Every time we get in front of someone we're providing education to and talk directly with them, the first question out of their mouth is always, "So what's actually happening around here? Have we actually seen attacks come against us? Have we ever been breached?" People want that real ... Now, of course, you have to consider how much of the real story you share and what the nature of it is. But giving people those real-life examples makes it something they can connect with and understand better than something that's just high-level and generic.
Drew: So something I've been seeing happen over the 15 years I've been in cyber is, in the beginning, cybersecurity used to be cybersecurity's job. We were outsourcing all cyber detection, incident response, and threat mitigation to this cybersecurity team. Then we literally flipped the table and said, "Nope, cybersecurity is everyone's responsibility." That was kind of the mantra for several years. What we're seeing now, with the rise of the BISO, is organizations putting the responsibility and accountability on the managers and GMs and VPs of their organization: VP of HR, of sales, of human ... The cybersecurity posture of your organization is your responsibility. We're starting to see those leaders have to account for why they have the worst simulated phishing results in the entire company, or why their team keeps losing PII, and they're being held responsible for that. I'm curious about your thoughts and what you guys are doing at MassMutual to balance out this equation.
Damon: Well, I will say that a competitive focus can be very effective, I've found. Even with our traditional compliance-focused awareness, the stuff that you have to do, adding that element of visible competitiveness, this organization is here, that organization is there, and that organization is down here, gets organizational buy-in a lot quicker, because you get that top-down focus. I think that top-down focus is important for spreading the message.
Drew: Sorry for interrupting you, but to really double click there: what type of metrics would you provide ... You don't have to ... Just think outside of your organization ... would you provide to these leaders to say, hey, you rank here, everybody else is way up here, you need to [crosstalk] better.
Damon: It can be as simple as participation rates. You set a timeline. You show them participation rates, and their people are down here and their colleagues are up here, and all of a sudden that person is ringing your phone saying, "Hey, can I get some reports that show, by leader in my organization, where there's low participation so I can go and focus on [inaudible]"
Drew: On your end, are you getting that? Is that [inaudible] you're getting those emails and calls?
Damon: Yes, we've gotten those, absolutely, during some of our [crosstalk]
Drew: Nobody wants to be at the bottom. It's a little bit of [crosstalk]
Drew: ... to encourage positive decisions around risk.
Damon: It doesn't take much. It's exactly that, a little bit of gamification. It doesn't have to be some fancy, robust thing; a graph that just says you're here and they're there is enough to get people engaged and wanting to do better, because nobody wants to be at the bottom.
Drew: I heard this comment a couple of weeks ago as well: "The person that writes your bonus has more impact on you than the CEO of the organization." I think that really holds true to what we're saying: your manager, your VP, your director, the one who's writing your bonus and doing your reviews, has much more influence than you or me, the director of the security education team, or the CSO or the CEO, right? So being able to empower those leaders to encourage their team to make better decisions. But to the point you just brought up, we have to show them data. I think participation data is solid, and I think other behavioral information can really dial up the realness, right? Like simulated phishing clicks, malicious website visits, PII lost, devices stolen, even report rates. Why is your team not reporting any phishing emails? We know you're getting them; we can see them in the logs. That's not good. John over on the West Coast, his team reports 97% of the phishing emails that are accidentally delivered because the controls didn't stop them. Go be more like John.
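The per-leader report Damon and Drew describe is essentially a group-by over training records. A hypothetical sketch, with an invented org chart and an invented record layout:

```python
# Hypothetical sketch of a per-leader participation report: roll up
# training completion by manager so leaders can see where they rank.
# The org data and record layout below are invented for illustration.

from collections import defaultdict

# (employee, manager, completed_training)
records = [
    ("ann", "john", True), ("ben", "john", True), ("cal", "john", True),
    ("dee", "kim", True), ("eli", "kim", False), ("fay", "kim", False),
]

def participation_by_leader(rows):
    """Return (leader, participation_rate) pairs, highest rate first."""
    done = defaultdict(int)
    total = defaultdict(int)
    for _, manager, completed in rows:
        total[manager] += 1
        done[manager] += completed  # True counts as 1, False as 0
    return sorted(
        ((m, done[m] / total[m]) for m in total),
        key=lambda pair: pair[1],
        reverse=True,
    )

for leader, rate in participation_by_leader(records):
    print(f"{leader}: {rate:.0%}")  # john: 100%, kim: 33%
```

The same shape works for the other signals Drew lists (phish clicks, report rates); only the `completed` column changes.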
Damon: And make it self-service at some point. Once you get them in the habit of caring about that data, make it very smooth and frictionless for them to access it. Then you're going to win, because you don't have to be the middleman anymore, and people can see, at a glance, where their organization is and where they need to improve and [crosstalk]
Drew: You're talking about giving the ability of this management leader to understand where their organization is within the risk ranking of the entire company.
Damon: Yeah, and keep it concise. They don't have to be able to dig super deep into that data, because they shouldn't have to. Dashboards are a dream. Again, I'm not going to say by any means that we're at that point in my organization yet, but it is something we're driving toward: being able to give people insight into their data at a level that's going to help enable them to make those decisions without us always having to intervene.
Drew: Quick plug for Living Security, and then we're going to move on to one or two more questions. We're working on a product right now that's called Unify. It's part of where this conversation is coming from. That's working on exactly bubbling up those insights and data for users and groups of users to find what areas people are most at risk for. I'm super excited to be working on that, because I think it's clear that the market and our clients are looking for the ability to give their VP access to simple tidbits of information to make them go do something in terms of education or assignment of training or even communications from a risk management perspective. So if you're interested in learning more about that, obviously, hit us up.
We had a question from the audience. I'm going to preface this: I am all about risk tolerance and understanding the risk that I'm bringing to my life or my company. I am not the type of cybersecurity person on the far end of the spectrum who trusts no one, has no IoT devices, is super afraid, and has 45-character passwords that they change every two weeks. That's not me. On the other side, I'm not out there skydiving; I'm not out there doing the crazier things. I'm probably somewhere in the middle.
One of the security experiences I've enjoyed: several applications I use don't have my sensitive information. They don't have PII; they're not financial applications. I go to log in on my phone, and maybe I saved my password in my password manager on my computer, or maybe I made a hard one and forgot to save it. I have the option to say, "Send login via email," and it sends me an email. The application is trusting that my email is secure, which it is through two-factor authentication with an app on my phone, and that I can click on that unique URL, and it automatically logs me into the app. For some applications, I think that's awesome. For me, that's a great security experience. I wouldn't do that for my bank account. I wouldn't do that for my phone bill, maybe. But for logging in to watch a football game on ESPN, maybe I would.
So the question that came in, "What are some ideas to help move away from passwords?" What are your thoughts there? What are you doing at MassMutual, if you're allowed to share? If not, maybe just some general statements around passwords.
Damon: So I'll say I won't go too deep into what we're doing at MassMutual, but we are focused on this problem. I think a lot of organizations are right now, so that is definitely a focus of ours, without talking specifics for obvious reasons. But I think there are a lot of alternatives out there, and they vary in how technical they are and how ... What's the word I'm looking for? How controversial they might be, too. Because that's something you have to understand when you're looking at doing something like replacing passwords ... Nobody likes passwords, but they're a warm blanket to a lot of people, because they're familiar, and people feel like they're protecting them, even when they're using the same one across every account they own.
So when you start to introduce the idea of something like behavior analytics, a sensor on their system that observes how they interact with their computer and with resources and then decides whether it's the person expected to be at the keyboard, some people get a little bit nervous about that. So you have to understand where your organization is at, where your people are at, before you really start running. This is, again, back to what we were talking about before: understand your people. Communicate with your people, because if they're not going to use it, it's of no use. It's not going to help you at all.
So there's a scale. You have things like push notifications and secure email-type notifications. There are apps you can use to log in to your system that wait for authentication on your phone, where maybe biometrics on the phone are what secure the app. There are hardware tokens, things like YubiKeys, that you can use as alternatives to passwords to log in. Then there's the much more in-depth behavioral authentication, user behavior analytics. So there's a lot out there. I think you have to really understand what your systems can accommodate, what your people will accommodate, and then make a decision.
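One concrete example of the phone-app authentication Damon mentions is the time-based one-time password (TOTP) defined in RFC 6238, which can be sketched with the standard library alone. This is an illustration, not a deployment recipe: real systems provision a per-user secret securely and compare codes with a clock-skew window.

```python
# Minimal RFC 6238 TOTP sketch using only the Python standard library,
# as one example of phone-app codes as a password alternative/second factor.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, step=30, digits=6):
    """Return the time-based one-time code for a Base32-encoded secret."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at t=59
# yields the 6-digit code 287082.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59))  # 287082
```

The server and the phone app both derive the code from the shared secret and the current time, so nothing secret crosses the wire at login.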
Drew: I think if you all, if the audience, if these 87 people, were to take one thing away, it's this: if you're not talking to your end users, talk to them. Have a conversation.
Drew: It used to be a lot easier than it is now, when you could go into the cafeteria with a couple of gift cards and play a game and say, "Hey, let's talk about cybersecurity." You've got to be a little more creative these days. One last parting thing, and I know Brandyn wants to wrap me up: at Living Security we have this tool called Donut on Slack, and it basically matches you up with another person every week or two. I bet that if you could get people to opt in to a program where they get matched up with somebody from the cybersecurity team, it'd be very interesting to say, "Hey, let's have a 15-minute conversation about [inaudible] answer any questions you have, whether it's about your parents or your kids or an app or the company you work for." I think you would also, in turn, learn a lot about what people are interested in, what type of risk tolerance they have, and what they would like to learn about.
Damon, thank you so much for your time. I thoroughly enjoyed this conversation, and I'm sure the audience did as well. Again, Living Security: human risk management, anything related to changing behavior, we have several different solutions. We just put a poll up there regarding our Living Security community. This is something we launched just a couple of months ago at our BSA con back in June. We have, I think, over 500 members at this time. I could be wrong, but it's definitely over that number. If you're looking for more feedback or more information on your program and you want to ask people in other organizations that are maybe similar to yours, this is definitely the community for you. You can join, get those questions answered, and learn from the community, because that's really important to us. For my part, thank you so much for joining. I'll pass it over to Brandyn for a quick closing statement.
Brandyn Hampton: Thank you, guys. I just wanted to thank Drew and Damon for being with us today. We're usually here every month, but we're taking a little sabbatical for Cybersecurity Awareness Month in October. We'll be back in November with Nick Marcheselli from LogMeIn, talking about some issues facing him and his program and how he can better help you with them. So thank you, everybody, for joining. We'll also be emailing a recording of this webinar out to you post-event, and it will be available in the resource section of our website, LivingSecurity.com. Thank you, everyone. Thank you to Drew and to Damon, and everybody have a fantastic day.
Damon: Thanks a lot.
Never miss another update: sign up for our mailing list and be the first to hear about events from Living Security, including the rest of our Breaking Security Awareness webinar series.