

Privacy Please Podcast Episode 5: Guest Nina Wyatt, Senior VP and CISO of Sunflower Bank

About this episode:

Join your hosts Cameron Ivey and Gabe Gumbs as they talk with Nina Wyatt, Senior Vice President and CISO at Sunflower Bank, about:

  • Nina’s career path to becoming a CISO and the current challenges faced by anyone in this role
  • The role of consumer demand in driving privacy protections in business
  • The data privacy problems created by single sign-on and app sign-ins via Google and Facebook
  • The privacy risks when multiple pieces of personal identifying information can be combined from multiple sources
  • How companies like IKEA are changing their apps to give consumers more control over how and when their data is collected

Listen to the episode and subscribe to the podcast here.

Transcript:

Cameron Ivey:

Ladies and gentlemen, welcome to another episode of Privacy Please. I’m your host, Cameron Ivey. On today’s episode, we have a very special guest, Nina Wyatt. She is the Chief Information Security Officer for Sunflower Bank and she is a delight. I hope you enjoy. As a reminder, the opinions expressed here today are Nina’s own and are not associated with or representative of her professional network, associations, or her employer, Sunflower Bank. Please enjoy.

Cameron Ivey:

Nina, I’m dying to know, what’s your favorite hometown restaurant, and do you have a go-to order? Or what’s your favorite food?

 

Nina Wyatt:

Well, I love soup, and it’s probably the best thing that I can cook on my own. I don’t know why, but every kind of soup I’ve ever attempted or just thrown together myself is pretty amazing. Not to brag. And there’s this little place down at the end of the street called Corner Kitchen, and they make fabulous homemade soup. That’s my favorite spot to go and grab some soup, especially in Michigan, where it’s cold half the time.

 

Cameron Ivey:

Sure. Yeah, that makes sense. Something warm. Always a good go-to. Awesome. To kind of turn things over, what is the one thing you wish you had when you started out in security?

 

Nina Wyatt:

Oh, I wish that I’d had somebody tell me about the challenges that come with this job. I feel like I would have been better prepared to handle the swings and the ups and downs if I had had that insight ahead of time.

 

Cameron Ivey:

And when did you actually get into security? Was that out of college?

 

Nina Wyatt:

I started working as an admin at a defense contractor way back in the day, and I eventually moved into a permanent position on their security team, where I was responsible for printing and managing the badge system. I was first introduced to security via physical security, and I had a couple of mentors in that company who inspired me to go in the direction of IT security. So I have a little bit on the physical side, where I started, and then I moved into IT over the years.

 

Cameron Ivey:

That’s awesome. Really interesting. All right, so what are you most curious about right now?

 

Nina Wyatt:

I am most curious about how companies are going to respond to the new data privacy expectations.

 

Cameron Ivey:

Great topic. Wouldn’t you say Gabe?

 

Nina Wyatt:

What’s the name of this podcast again?

 

Gabe:

Privacy, please. So Nina, let’s double click into that one a little bit. As a security professional, as a CISO, and as a consumer yourself, how do you view that through both of those lenses? What’s the approach you take when you think about how you’re going to respond to that, as one of the individuals who’s going to be responsible for protecting the privacy of your own data held at another organization?

 

Nina Wyatt:

I think that, as consumers in general, we haven’t fully put the expectation on companies to protect our data, and we’ve chosen convenience and the other luxuries that technology provides us, which are amazing, instead of data privacy. As consumers, we don’t holistically have a clear understanding of what all of that means on the backend. And as a CISO, I know all of the things that companies should be doing to protect our data, or my data, and it still strikes me as concerning that companies aren’t doing this, and that they’re not doing it in a manner that is completely transparent to consumers these days.

 

Gabe:

I think you’re absolutely right. One of the ways I tend to look at my own privacy is in that very trade-off sense. I’m a big believer, and maybe I take it a little far sometimes, that if you’re not the consumer, you’re the product. But there are times when I’m willing to be the product and trade some of that privacy for the convenience, the access, and the information, et cetera. But I think by and large, most consumers don’t understand that. Is that something you encounter in your own personal life? Do you ever interact with an organization and they’re asking you for information, and you’re like, “There’s no way you need this”? Prime example: you move to a new town, you get a new dentist, and they’re like, “I need your social security number.” And this person across the desk just has pen and paper in hand, ready to write it down, and you stop and ask them, “Well, why do you need this? No, I’m not giving it to you.”

 

Nina Wyatt:

Exactly. I have done that before, much to the frustration of the person behind the desk filling out the paper forms with the social security number. They don’t need it. They have my insurance number and my insurance provider; there’s no need for that social security number, yet every medical provider out there still asks for it. And that still boggles me. The social security number was never intended to be an identifier for individuals. It was never intended to be an identification or a source of truth for a person’s identity, and it turned into that over the years, and I’m not really sure why.

 

Gabe:

We should probably spend some time on one of these episodes digging into that why. Because you’re right, it has morphed into exactly that. And it does always strike me as curious why the phone company needs my social security number. You don’t need that.

 

Nina Wyatt:

Correct. Yeah.

 

Gabe:

Yeah. And that kind of leads us right down the path of one of the other things: they don’t make it easy for us to understand not just what data they have, but how they’re processing it. It’s hard enough to understand why you need my social security number, but if you were to ask any of those organizations, “Well, what do you do with it?” you’d get a myriad of answers that don’t tend to make sense either.

 

Nina Wyatt:

No, and I wouldn’t be surprised if many of the people you’re asking that question just aren’t in IT. They don’t understand where it goes, where it’s stored, whether it’s processed, or whether it’s shared with any third parties. They’re not going to know that answer. And companies have so much technology at their hands; if there’s ever been a day and age when we’ve empowered everybody to be transparent and given them the tools to be transparent, it’s today, and companies still aren’t transparent about what they’re doing with our data.

 

Gabe:

That paradigm is shifting though. Did you spend any time this year so far filing your own CCPA subject rights requests?

 

Nina Wyatt:

No.

 

Gabe:

No. I was going to say I’m still waiting for several of them to return information on myself, but that’s exactly where I’m at with it: looking at this new paradigm shift, having that power in my hand as a consumer, and wanting answers. And I know I’m not the only consumer out there in that boat.

 

Nina Wyatt:

Well, it interests me, because when you look at the most fundamental control that technology departments in companies have, it’s really inventory. Knowing where your assets are. And if you don’t know where all of your data is, what you do with it, where it goes, and what third parties do with that data, how can you respond to a request like that with transparency? How would you know? That is the question.

 

Cameron Ivey:

Why do you feel the convenience technology offers is valued over an individual’s concern for data privacy today?

 

Nina Wyatt:

I don’t think that companies make it easy for a consumer to understand what data they have, where it’s stored, how it’s processed or transmitted, and who they share that data with and why. You have these lengthy terms and conditions, and you click agree, and you can read through them, but you really have to peel back the layers and read between the lines to understand what they’re doing with your data. They don’t offer much insight in those terms and conditions about data privacy today. Second, I think if consumers understood, if they had clarity around how data is managed and what companies do with that data, then of course they would be inquisitive, they’d be asking more questions, and they would be more conservative about the decisions they make when they’re interacting with technology or online.

 

Cameron Ivey:

Yeah, that’s a good point. Could you provide an example of how consumers don’t truly understand how technology works on the backend, and how their choices on social media or other widely used platforms could cause an issue in the future?

 

Nina Wyatt:

Sure. We see a lot of consumers who are just using technology to manage life. They’re doing their banking, they’re volunteering at a school, they’re managing their children’s identities. Let’s just throw an example out there of a mom of two who’s on the school PTO board. She’s the treasurer. We’ll call her Karen, because Karen is in a lot of memes these days. Karen is on Facebook, and it’s the first day of school, and Karen posts a picture of her kids holding that chalkboard sign that says their age, their school grade, and the school name. Karen’s in several groups on Facebook, and she’s the administrator of the Facebook group for the PTO of her kids’ school district because she’s the treasurer of the PTO board. She manages all the accounting activities for the school PTO, and Karen just subscribed to an app that’s going to make accounting easier for her, and she uses Facebook to create her login for that app.

 

Nina Wyatt:

Karen does not realize that by using her Facebook credentials to sign in to this accounting app, she’s now giving the app company visibility into information from Facebook: all the groups that she’s in, her kids’ information, and everything she’s posted regularly throughout her Facebook profile. So she’s inadvertently sharing several data elements with this accounting app company. Karen also does not realize that her Facebook credentials were already stolen months ago, and because those credentials are widely known and she used the same credentials for this accounting app, a threat actor can now access that information from the accounting app with malicious intent, either to steal transactional data, to steal funds, or to access her email account, purchase history, home address, children’s ages, the school they attend, et cetera. It’s all out there, and I think the challenge with PII, or personally identifiable information, is that with a single Facebook post we don’t really think about, is it sensitive that I’m posting a picture of my son holding his school age and the name of the school that he’s in?

 

Nina Wyatt:

But you combine that with other pieces of personally identifiable information and all of a sudden you have this whole profile of a person that nobody knew existed before. And oftentimes we do this on an individual basis and it doesn’t seem to be a big deal. But when people gain access to multiple pieces of that PII, then all of a sudden there’s a wealth of information there and information that can be used to compromise either your bank account or some other thing that you hold that’s valuable to you. I think we see that a lot. Karen isn’t asking that app company what they’re going to do with the Facebook data that they’ve gained access to.

 

Cameron Ivey:

Of course not.

 

Nina Wyatt:

Because Karen doesn’t know. She doesn’t care. She’s trying to manage life, and that company didn’t come out and say, “Hey, we’re going to borrow this data from your Facebook activity to keep doing the business that we do, and also sell this other data to this other company to help monetize our interests as a company.” That’s just one example, but we see this all the time.

 

Cameron Ivey:

That was a great example. Yeah, it’s pretty common. Gabe, do you have anything to piggyback off of that example?

 

Gabe:

Yeah. I just had a scary thought and I’m almost afraid to Google it and look it up. Shared credentials are nothing new. We’ve been dealing with this since, well, the dawn of credentials. And we certainly understand how that problem has been exacerbated, but aggregated authentication mechanisms such as Facebook, and single sign-on applications in general, have made this problem a little bit worse. Email has become kind of the master key to our accounts. Oftentimes you’re talking to laypersons, if you will, and they’re like, “Yeah, but if someone gets into my email, big deal. What are they going to find out? That I’ve been emailing my grandmother?” And my response to them is usually, “Well, how do you reset your password to your bank if you forget it?” And they’re like, “Oh, right.”

 

Nina Wyatt:

Yeah. Oh right.

 

Gabe:

Yeah. And so it’s like, well, what happens when you leverage these shared credentials as this one source of truth for getting back into the rest of your life? Whether or not this trend will change still remains to be seen. But I’m curious about your overall thoughts on this new regulatory landscape around privacy. Will we be able to get consumers thinking about these things? Or are Gabe and Nina going to have to run around telling everyone why this actually is important?

 

Nina Wyatt:

Yeah, I think it is going to change. I think the more consumers understand and the more clarity they have, the more they’re going to have the insight and some principle to ask and to inquire. You don’t know what you don’t know, and you can’t manage what you don’t know. But the more visibility people have into how data does matter and how their email is sensitive, even if they’re just emailing their grandmother, on the one-off chance that they use it to reset their password for their bank account, they’re going to start connecting the dots and they’re going to start caring. We see that now.

 

Nina Wyatt:

I just saw a video on LinkedIn with the chief privacy officer for IKEA. She had made this video about being transparent, about how IKEA cares about consumer data and how they’re going to make their whole mobile app experience, when people are shopping, very transparent to the consumer about what information they’re seeing and why they’re seeing it, and give consumers the option to toggle things off and on based upon what they care about and what data they want to keep private versus data that they want to share. I think, again, the more transparency that companies offer, the more people are going to start jumping on that bandwagon and that trend of caring. They’re going to know what questions to ask and what things to care about.

 

Gabe:

I’d say it’s not lost on me that IKEA is very much an organization that’s headquartered in the European Union, though. And privacy has certainly taken a different priority in that part of the world versus the US, at least in the not so distant past.

 

Nina Wyatt:

It really concerns me; I’m so surprised that it’s taken us this long. California’s CCPA just came out, and that’s great. That’s a step in the right direction, and we’re going to continue moving in that direction as we do with state-by-state legislation, and once we get enough states on board, then all of a sudden we’ll have acceptable federal guidance with regard to data privacy. But it really shocks me that it’s taken us this long to realize that electronic data knows no state boundaries. I don’t know why that’s a surprise to everyone. Why would legislation that is driven state by state be effective for data that doesn’t observe state boundaries? Paper does. It’s a lot easier to control what you send in and out of a state if it’s a paper copy or a fax than it is with an email. You don’t even know where it’s going. It’s shocking to me that it’s taken us this long to consider it.

 

Cameron Ivey:

Especially with Terminator 2: Judgment Day. Come on, we’ve all been warned.

 

Gabe:

We have been warned.

 

Nina Wyatt:

We’ve been warned.

 

Cameron Ivey:

Well, what do you feel is the feasibility of companies responding to the heightened demand of consumer expectations?

 

Nina Wyatt:

I think companies have to really get great at understanding what they have, where it is, how they use it. It goes back to that.

 

Cameron Ivey:

Accountability?

 

Nina Wyatt:

Yeah, that fundamental asset management inventory. In order to manage something effectively, you have to know what you have and where it’s at. Is it still supported? Is it vulnerable? Where does it go? What do we do with it? And that’s probably one of the most difficult control areas for a company to get their arms around, especially as things move and change so quickly. That’s no easy feat for a company, to really get that total lifecycle management around what they have in their environment. And even then, we leverage cloud service providers all the time. How do you even measure and monitor what is leaving your organization? I think it is feasible, but we have to be experts at lifecycle management, and we really have to take a hard look at ourselves and our companies and how we do things, and really put emphasis on that inventory of data.

 

Gabe:

I want to touch on that one quite a bit. But first, I’m going to hit you with a fun fact. Terminator 2: Judgment Day was released in 1991. HIPAA was signed into law in 1996, so they have…

 

Nina Wyatt:

They did not take a message quickly from Terminator.

 

Gabe:

Not at all. But you mentioned moving quickly, and very quickly. HIPAA was passed in 1996, and it would be decades before we’d see this glut of privacy regulation again, namely in the last year and a half, two years. Now we have this glut of privacy regulation between CCPA, GDPR, et cetera, et cetera, et cetera. But the speed of things moving, and again, if I can tie this into just your general security background here. Forget background, it’s what you do every single day. We’ve gotten pretty good at security operations. We’ve gotten fairly good at reviewing threats as they come into our environment, understanding them, analyzing them, and triaging them. But privacy risk, privacy threats, I don’t see yet where there have been some fundamental understandings of how we’re going to operationalize privacy.

 

Gabe:

I have some personal thoughts around those things. I have some professional thoughts around them too that I’ve been kind of throwing around. But I’d love to get your input on where you see overall privacy operations going, and I guess first we should probably define what we even mean by privacy operations, but I’ll start with one of the things you said: understanding when and how data is being moved to third parties.

 

Nina Wyatt:

Well, still, it goes back to inventory. When you’re talking about security operations and how that has evolved over time, you didn’t really have that evolution, or all the fun and fancy tools we have in security today, without companies understanding what is in their inventory, how to scan those things for vulnerabilities, and how to bump all of that data up against one another to draw conclusions. And aggregate that data to say, “Oh, well, these things wouldn’t really be a big deal in singular form. But if you compile all of these things about all of these servers and you look at it with that lens on, then all of a sudden it is a big deal. It is a high risk.”

 

Nina Wyatt:

And I think it’s the same thing with privacy. We ask about privacy when we go to the dentist because we know about it. We know that that social security number that’s written on a Post-it note is going to be stuck on somebody’s monitor for three days, and then maybe it’ll be shredded, but it’ll probably just go in the waste bin. And we know that they have a cleaning crew that comes in at midnight to clean the office. Companies have to know what data they have, where it’s at, and how they manage it. We need to wrap that lifecycle management around it, the same thing we do for our hardware and software, the same thing with the privacy of data. And I think that’s the only way we can operationalize privacy of data.
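As a rough illustration of the inventory step Nina keeps coming back to, here is a minimal sketch of PII discovery over a file share: walk a directory, flag files whose contents match simple Social Security number or email patterns, and record where they live. The directory name and regexes are assumptions for illustration; real discovery tooling does far more validation than this.

```python
import re
from pathlib import Path

# Illustrative patterns only; production discovery tools validate matches
# with context and checksums rather than relying on bare regexes.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def scan_directory(root: str):
    """Walk a directory tree and record which files contain which PII patterns."""
    findings = []
    for path in Path(root).rglob("*.txt"):
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; a real tool would log and revisit it
        for label, pattern in PATTERNS.items():
            hits = pattern.findall(text)
            if hits:
                findings.append({"file": str(path), "type": label, "count": len(hits)})
    return findings

if __name__ == "__main__":
    # "./shared_drive" is a hypothetical location; point this at the share to inventory.
    for finding in scan_directory("./shared_drive"):
        print(finding)
```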

 

Gabe:

Yeah. Not that I would have argued with it at all, but it’s funny, my mind went right to the “just operationalize it” thing, because I think maybe I took it for granted that, well, of course you’ve inventoried your data. But no, they haven’t. They absolutely haven’t. That is where it begins.

 

Nina Wyatt:

It’s really difficult to inventory your data when, as companies, we want things to be available to our consumers all the time. That is the expectation. High demand. Speed begets speed. You put a request in and you want results returned right away. And so with these cloud service providers, they’ve got multiple data centers all over the world, and data is bouncing back and forth and copied and copied and copied to make sure that you, the consumer, can get that result really quickly, and it’s always the result that you expect. So when we’re talking about sharing data with third parties, whether we’re just using it as cold storage or we’re sharing it to return some result, how do we go to that other company and say, “Tell me exactly what data centers my data is going to be traversing through and stored in, and how do I capture that in my inventory system and risk-rate it?” How do I do that?

 

Gabe:

Yeah. And what does privacy risk even look like? What is the risk that you will re-identify data in your environment? Because I’ve told you to forget about me, or I’ve told you not to share it with a third party, but you’ve gone ahead and purchased another marketing list, or you’ve acquired another organization that I didn’t make that same request of, and you start combining data sets. You can analyze purchasing behaviors of individuals and things of that nature. What’s the risk that you will reintroduce that data into a pool that will allow you to re-identify me, even after I told you to forget about me? I certainly don’t think we are close to maturity on things like privacy risk.
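To make Gabe’s re-identification worry concrete, here is a toy sketch of a linkage attack: two data sets that each look harmless on their own are joined on shared quasi-identifiers (ZIP code and birth year), which re-attaches a name to “anonymized” purchase history. The records are fabricated for illustration only.

```python
# Fabricated, minimal records: neither list alone ties a name to purchases.
marketing_list = [
    {"zip": "49503", "birth_year": 1984, "name": "K. Example"},
    {"zip": "49503", "birth_year": 1991, "name": "J. Sample"},
]
anonymized_purchases = [
    {"zip": "49503", "birth_year": 1984, "items": ["crib", "stroller"]},
]

def link(quasi_identifiers, left, right):
    """Join two record sets on shared quasi-identifiers -- a toy linkage attack."""
    joined = []
    for a in left:
        for b in right:
            if all(a.get(k) == b.get(k) for k in quasi_identifiers):
                joined.append({**a, **b})
    return joined

# The "anonymized" purchase history is now attached to a named individual.
print(link(["zip", "birth_year"], marketing_list, anonymized_purchases))
```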

 

Nina Wyatt:

No, we’re not. I would agree with that. I think that it is feasible for companies to manage the privacy of data if they have full inventory and lifecycle management around that data: they know what they’re doing with their data, why, and where it’s stored, collected, processed, and transmitted. That in and of itself is a challenge. Beyond that, once you have that information, then you can start understanding, well, this data is sensitive and this data is not. And that can really start driving those risk-aware decisions as to whether or not you share something. You might say, “You know what? We do need to share this, but we don’t need to share the full number or all the characters in this string. We really only need to share this as an identifier to map it to this, and therefore we can kind of omit the risk.”

 

Nina Wyatt:

But you really can’t drive those risk-aware decisions without knowing what you have, how it’s used, and how sensitive it is. I think that really needs to be the focus of many companies today: classifying their data, inventorying their data, and understanding the risk that each data variable poses. Then you can start looking at it holistically and making those decisions.
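One way to read Nina’s point about sharing “not the full number or all the characters in this string” is simple masking plus keyed tokenization before data leaves the organization: the partner gets an identifier it can join on, not the raw value. The field names and key below are hypothetical; this is a sketch of the idea, not a production scheme.

```python
import hashlib
import hmac

# Hypothetical secret kept inside the organization and never shared with the partner.
TOKEN_KEY = b"replace-with-a-real-secret"

def mask_ssn(ssn: str) -> str:
    """Keep only the last four digits, enough for a customer to recognize the record."""
    digits = ssn.replace("-", "")
    return "***-**-" + digits[-4:]

def tokenize(value: str) -> str:
    """Replace the raw value with a keyed hash usable as a stable join key."""
    return hmac.new(TOKEN_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"name": "Karen Example", "ssn": "123-45-6789"}
shared_with_partner = {
    "customer_token": tokenize(record["ssn"]),  # identifier, not reversible without the key
    "ssn_display": mask_ssn(record["ssn"]),     # ***-**-6789
}
print(shared_with_partner)
```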

 

Gabe:

I genuinely appreciate that the way you described dealing with the risk was, and I’m going to use some of the language from the law itself, curing the data. What you described was kind of altering the data, but you also described a situation where some of that was privacy-preserving or format-preserving as well. And those are, I think in the conversations I’m having, still kind of missing from the question of how do you enable the business to still do what they need to do, but allow that data to be shared and handled, et cetera? Versus just taking the big no hammer to the data and saying, “No, we’re just going to delete it. No, you just can’t have access to it.” Big kudos to you, and not that you needed it, but I really genuinely appreciate that, because I know how much security can be the department of no. And what you just articulated there was very eloquent in terms of, here’s one way that you should look at handling that data while still making it useful to the business.

 

Nina Wyatt:

Absolutely. And I think that’s a challenge for people in my role across the board. You want to do the right thing by the consumer, but you also need to meet the needs of the business. So I always try to look at any problem or barrier with, here’s the Pinto, the Ford Edge, and the Cadillac. These are your options, from low risk to high risk. What decisions can we make? And how do we enable the business, but also provide that security wrapper over it so that we’re doing the right thing? That’s really a struggle for a lot of people, because a lot of people have a challenge going, “Well, can I still do the right thing? Can I still follow all the rules and get the desired outcome without doing it this one way?” And I think the answer to that is always yes. You can absolutely do multiple things to get the same outcome in a different manner that meets everybody’s objectives. You just have to be open to all of that.

 

Gabe:

That’s awesome.

 

Cameron Ivey:

Yeah. I’m going to challenge you a little bit. What type of evidence do you think suggests that companies don’t have a handle on their asset management?

 

Nina Wyatt:

Oh, this is a touchy subject. I think when we see a data breach in the news, it comes out and then you don’t hear about it for a few weeks or a couple of months, and then all of a sudden it’s, “Oh, we’ve identified that it’s this many records.” The fact that it’s taking companies so long to understand not the how, not how this happened, but the what and the when, what was breached and who was affected, the fact that that’s taking companies so long to report and to figure out through forensics and whatnot, I think, suggests that companies aren’t inventorying data and really understanding what is where in all cases.

 

Cameron Ivey:

That’s a great, great point. Do you think that some of the companies also are withholding it on purpose for as long as they can?

 

Nina Wyatt:

Well, I think that’s a best practice. You don’t want to falsely report, and you don’t want to cause panic. So you really have to have your facts straight before you go public on something as sensitive as a data breach. But I still think, for a company to take eight months to figure out what and who was affected on anything, that’s a long time. That’s a long time when you look at us being able to return results from database queries in milliseconds. Why is it taking us this long? I realize there’s a validation component and there’s a legal component, but I still think that that length of time is a stretch, because they have to continue to validate their results. Why? Because they’re not 100% confident that those results are accurate. That this query and this process that we followed to find out what and who was impacted is 100% foolproof? They can’t say that. They have to keep doing it and doing it and doing it. And I think that’s what’s taking so long.

 

Cameron Ivey:

That makes a lot of sense. I was going to say, for something to take eight months when it’s someone’s personal information, it’s just… I think that’s a great point, that most companies do not have a handle on their asset management. At least, we would hope that’s probably the case.

 

Nina Wyatt:

I think it’s changing, though. Like I said, as consumers become more aware of the risk and they pick up the lingo, because companies start using that lingo and start building more privacy measures into the apps that they interact with, 10 years from now it’s going to be a completely different story. The more consumers care about it, the more companies will; it’s supply and demand. It goes right back to the basics. The more they demand it, the more it will be supplied. It’s going to be really interesting to see how things transform and evolve over the next decade with regard to privacy.

 

Cameron Ivey:

It’s true, it very well could change. Gabe, did you have something?

 

Gabe:

No, that’s everything. That’s everything I have. I do want to thank you very much for coming on the show and sharing your thoughts. I certainly could carry on this conversation for several hours; there’s definitely more that we can touch on. We’ve got RSA coming up next week. I might reach out to you after that and get your thoughts on some of the trends and some of the common themes that we see there, and we can definitely do this again.

 

Nina Wyatt:

I’d love that.

 

Cameron Ivey:

And I also wanted to ask you, because you were kind of wheeling into it around the next 10 years: what are you most excited about for this year? It being 2020, obviously, that’s a cool, cool year to say, but what are you most excited about in your career and your path into data privacy and being the Chief Information Security Officer for Sunflower Bank?

 

Nina Wyatt:

Let’s see. What I’m most excited about: I’m actually working to get certified in privacy program management.

 

Cameron Ivey:

Awesome.

 

Nina Wyatt:

It’ll really legitimize all of this stuff I have in my brain. And beyond that, on the side I’m volunteering with the state of Michigan; they just issued a standard set of computer science expectations for education. So I’m partnering with a couple of agencies to really try and help teachers understand what they can teach and integrate into their lesson plans, to really get our next generation prepared to be IT enthusiasts.

 

Cameron Ivey:

That’s awesome. There’s nothing better than people in this industry coming together and helping each other learn, trying to stay ahead of the game as much as we can, all together, to protect what matters most.

 

Nina Wyatt:

Absolutely.

 

Cameron Ivey:

Yeah. Well Nina, thank you so much for your time. Really, really appreciate it and this was really fun. I hope to do some more in the future.

 

Nina Wyatt:

Great. Thank you so much, Cameron. Gabe, it’s been great.

 

Gabe:

Pleasure was mine. Thank you very much. Enjoy.

 

Nina Wyatt:

All right, thank you.

 

Gabe:

See you.