Follow Mark on LinkedIn Follow @marknca on Twitter Follow marknca on YouTube
marknca

Mornings With Mark
no. // 0 0 0 2

Exposing Secrets In Code

Subscribe to the podcast.

Watch the episode here

Join the discussion on LinkedIn

Tweet about this episode

Full machine-generated transcript follows

Morning, everybody. How are you doing today? On this episode of the show, we're going to talk about secrets that aren't so secret anymore. There was a recent study done by NCSU that analyzed a large number, about 13%, of all public GitHub repositories, looking specifically for exposed secrets like API tokens.

So these are the access keys that you use to access AWS, Azure, Google Cloud, GitHub itself, that kind of stuff, right? The idea behind these keys is that they're easier to use and recycle compared to usernames and passwords, but they're supposed to be kept secret.

They're supposed to have restricted access, and they're supposed to be restricted in the permissions that they grant. Of course, in any cloud provider you can actually go and grant admin-level privileges to these keys, so there's a lot of potential for harm if you're not actively applying the principle of least privilege. Now, you know just from the very name of having something called a secret

that you shouldn't be putting it out there as public information on GitHub. But the study from NCSU found that that was actually happening again and again and again. They were seeing a rather constant stream of these keys being pushed to GitHub all the time, every day. Now, that may surprise some of you, and hopefully it surprises some of you, but realistically it probably doesn't. This is another symptom of something we've seen happen increasingly often.
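To give a sense of what a study like this looks for, here is a minimal sketch of regex-based secret scanning in Python. The patterns and labels are illustrative only, not the study's actual ruleset; the AWS value in the sample is Amazon's documented example key, and real scanners use a much larger, more carefully tuned pattern set.

```python
import re

# Illustrative patterns only -- real scanners (and the NCSU study)
# use many more, tuned to reduce false positives.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "google_api_key": re.compile(r"\bAIza[0-9A-Za-z_-]{35}\b"),
}

def find_secrets(text):
    """Return (label, match) pairs for anything that looks like a key."""
    hits = []
    for label, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

# AKIAIOSFODNN7EXAMPLE is AWS's published example access key ID.
sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"  # committed by mistake'
print(find_secrets(sample))
```

Running something like this over every file before it leaves a workstation is exactly the kind of operational control that would have caught most of the leaks the study describes.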

Over the last few years, people have adopted more and more automated build techniques and more and more cloud services. Now, don't get me wrong: this is not to say that the cloud or automated build pipelines are a bad idea or less secure. They're actually demonstrably far more secure and far more beneficial to your productivity than the traditional ways of doing it. But they do highlight a problem, and that problem is that all of these tools, automated build pipelines, and accessible cloud services amplify what you're capable of achieving. So why is that a bad thing? Well, we've seen it in the rash of MongoDB "breaches" over the last few years, and those are big air quotes around the word "breaches."

That's simply because of some default configuration choices made by the MongoDB team in early versions. If you didn't know about those default settings and didn't change them, you were inadvertently providing access to your database from outside of your organization and your infrastructure. That's generally a bad thing if you don't want to provide direct access into your databases from the outside world, but that was the default posture for a long time with that project. For developers dedicated to the project, it was well understood inside and out.

It was a quick, simple fix: change two lines in a config file and you're all done. But for people who were simply deploying it as part of a larger stack, a larger area of responsibility for themselves, that was a problem. And that's where we're seeing this sort of operational mistake, and make no bones about it.

It's absolutely a mistake. It's a risk to your organization, but it's not a cybersecurity threat in the traditional sense, where you know there's an error in the software code or the services are insecure themselves. This is simply an oversight or a mistake in the choice of what data you're putting into these services. In this case, it's GitHub.

Somebody's making a commit on their workstation and pushing it up, and they didn't realize they'd copied that file in there temporarily, or didn't have it properly covered in their .gitignore file or system settings, right? And that's one thing you should have right out of the gate: a really strong .gitignore file for your entire organization, and a standard way of saving credential files so that they're covered under that ignore, because you never want to be pushing these things into your repos.
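As a rough illustration of that idea, here is a sketch of the kind of check a pre-commit hook could run against staged file names. The pattern list and file paths are hypothetical examples, not a recommended canonical list; your organization's standard credential locations would go here instead.

```python
from fnmatch import fnmatch
from pathlib import PurePosixPath

# Hypothetical organization-wide list of credential file patterns.
# These names are illustrative; substitute your own standards.
CREDENTIAL_PATTERNS = [
    ".env",
    "*.pem",
    "*credentials*",
    ".aws/*",
]

def is_credential_file(path):
    """True if a staged path looks like a credential file we never commit."""
    p = PurePosixPath(path)
    return any(
        fnmatch(p.name, pattern) or fnmatch(str(p), pattern)
        for pattern in CREDENTIAL_PATTERNS
    )

# A real pre-commit hook would run this over the output of
# `git diff --cached --name-only` and abort the commit on any hit.
for staged in ["src/app.py", ".env", "deploy/server.pem", ".aws/credentials"]:
    print(staged, "->", "BLOCK" if is_credential_file(staged) else "ok")
```

Pairing a check like this with the organization-wide .gitignore gives you two layers: the ignore file keeps credentials out of `git add`, and the hook catches anything that slips through.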

Even if those are private repos, a simple mistake can turn those private ones public, or somebody could copy the information onto a laptop and lose that laptop, or cross a border with that laptop. This is purely in the realm of operational security, and I think that's a huge gap in how most of us approach security: we aren't paying enough attention to operational security.

How are we using these tools? How are we handling the information? We worry about the technical side, but we don't worry about the people side, and most of the time the vast majority of cybersecurity is accounting for people, accounting for the issues and challenges that people deal with. Nobody's got malicious intent in these cases.

They're simply trying to get stuff done. And the study from NCSU, which I encourage you to read in its entirety, is rather interesting because it highlights and quantifies some of these issues and the scale at which they're happening.

But at the end of the day, it really comes down to operational security. It's really about talking to your folks and making sure they understand what's good and what's bad. That's all I have for you today. Hit me up online @marknca, in the comments down below, or as always by email. I look forward to talking to you about this and everything else online, and we'll see you on the next show.