Watch this episode on YouTube.
Reasonably Accurate Transcript
Morning everybody. How are you doing today? On this episode of the show, we're gonna talk about secrets that aren't so secret anymore. There was a recent study done by NCSU that analyzed a huge sample, about 13% of all public GitHub repositories, looking specifically for exposed secrets like API tokens.
So these are the access keys that you use to access AWS, Azure, Google Cloud, GitHub itself, that kind of stuff. Now, the idea behind these keys is that they're easier to use and rotate than usernames and passwords, but they're supposed to be kept secret. They're supposed to have restricted access, and they're supposed to be restricted in the permissions that they grant.
Of course, in any cloud provider you can actually grant admin-level privileges to these keys, so there's a lot of potential for harm if you're not actively applying the principle of least privilege. And you know, just from the very name of having something called a secret, you shouldn't be putting it out there as public information on GitHub.
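To make least privilege concrete, here's a minimal sketch of an AWS IAM policy that scopes a key down to reading one storage bucket instead of granting admin access; the bucket name is a placeholder, and the other cloud providers have their own equivalents:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-app-bucket",
        "arn:aws:s3:::example-app-bucket/*"
      ]
    }
  ]
}
```

A key bound to a policy like that can leak and still only expose one bucket, not your whole account.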
But the study from NCSU found that exactly that, secrets sitting in public repositories, was happening again and again and again. They saw a rather constant stream of these keys being pushed to GitHub every single day. Now, that may surprise some of you. Hopefully it surprises some of you.
But realistically, it probably doesn't. This is another symptom of something we've seen happen more and more often over the last few years, as people have adopted more automated build techniques and more cloud services. Now, don't get me wrong: this is not to say that the cloud or automated build pipelines are a bad idea or less secure.
They're actually demonstrably far more secure and far more beneficial to your productivity than traditional ways of working. But they do highlight a problem, and that problem is that all of these tools, automated build pipelines and access to cloud services, amplify what you are capable of achieving, mistakes included.
So why is that a bad thing? Well, we've seen it in the rash of MongoDB "breaches" over the last few years, and I use big air quotes around the word "breaches" because they came down to some default configuration choices made by the MongoDB team in early versions. If you didn't know about those default settings and change them, you were inadvertently providing access to your database from outside your organization and your infrastructure, and that's generally a bad thing.
You don't want to provide direct access into your databases from the outside world, but that was the default posture for a very long time with that project. And for people who were dedicated to that project and understood it inside and out, it was a quick, simple fix.
You just changed two lines in a config file and you were all done. But for people who were simply deploying it as part of a larger stack, a larger area of responsibility for themselves, that was a problem.
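For context, the fix amounted to something like this; a minimal sketch, assuming the YAML config format that mongod has used since version 2.6, so your exact file layout may differ:

```yaml
# mongod.conf: the two settings that close the old default exposure
net:
  bindIp: 127.0.0.1        # listen on localhost only, not every interface
security:
  authorization: enabled   # require clients to authenticate
```

Two lines of real change, but only if you knew you needed to make them.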
And that's where we're seeing this sort of operational mistake, and make no bones about it: it's absolutely a mistake. It's a risk to your organization. It's a cybersecurity threat. But it's not a vulnerability in the traditional sense, where there's an error in software code or the services themselves are insecure. This is simply an oversight, a mistake in the choice of what data you're putting into these services.
In this case it's GitHub: somebody makes a commit on their workstation and pushes it up, not realizing, "oh, I copied that file in there temporarily," or "I didn't have it properly covered by my .gitignore settings." And that's one thing you should have right out of the gate: a really strong .gitignore file for your entire organization, and a standard way of saving credential files so that they're covered under that ignore file. You never want to be pushing these things into your repos, even private repos, because a simple mistake can turn a private repo public, or somebody could copy the information onto a laptop and then lose that laptop or cross a border with it.
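As a sketch, an organization-wide ignore file might include entries like these; the specific names are assumptions on my part, so standardize on whatever your teams actually use:

```gitignore
# Illustrative credential file patterns; adjust to your own conventions
.env
.env.*
*.pem
*.key
*credentials*
secrets.yml
```

The point isn't the exact patterns, it's that every repo starts with the same safety net.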
This is purely in the realm of operational security, and I think that's a huge gap in how most of us approach security: we aren't paying enough attention to operational security. How are we using these tools? How are we handling the information? We worry about the technical side; we don't worry about the people side.
And most of the time, the vast majority of cybersecurity is accounting for people, accounting for the issues and challenges that people are dealing with. Nobody has malicious intent in these cases; they're simply trying to get stuff done. And this study from NCSU, which I encourage you to read in full, is rather interesting because it highlights and quantifies some of these issues and the scale at which they're happening. But at the end of the day, it's really about operational security. It's really about talking to your folks and making sure they understand what's good and what's bad.
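If you want to check your own repos before someone else does, a quick pattern scan along the lines of what the study did might look like this; a minimal Python sketch, and the two regexes cover well-known key formats rather than anything exhaustive:

```python
import re
import sys
from pathlib import Path

# Two widely documented key formats; real scanners cover many more.
PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Google API key": re.compile(r"\bAIza[0-9A-Za-z_\-]{35}\b"),
}

def scan_file(path: Path) -> None:
    """Print any line in `path` that matches a known secret pattern."""
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return  # unreadable file: skip it
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in PATTERNS.items():
            if pattern.search(line):
                print(f"{path}:{lineno}: possible {name}")

if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    for p in root.rglob("*"):
        if p.is_file():
            scan_file(p)
```

It's crude, but even a crude scan in your build pipeline catches the "oops, temporary file" commits before they ship.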
That's all I have for you today. Hit me up online, @marknca, or in the comments down below.