Security Fridays: Week 13
spradeep | 25/06/2020 | Industry News
Security best practices to ensure data protection in the Cloud
This is a timely reminder that the grass isn't always greener: while people tend to believe that new tools and technology are superior in every way, they still need the same rigorous rules and procedures as any other tool.
On an almost daily basis I hear someone say that they do not need to worry about security because they have uploaded their data to the cloud, so they will simply trust the built-in security of that environment. To a limited degree they are right: those environments do have a greater level of perimeter protection than most organisations can afford, and they are secured against DDoS and a number of other basic attacks.
However, they are just as vulnerable to lazy practices and human error as any tool, and in some cases more so than on-premises technologies, and it is these that can turn the dream of cloud security into a nightmare.
On premises there is generally a group covering everyone in your environment, bringing all the users within your perimeter under one banner. This has long been a weakness: lazy administration can lead to this group being granted access, which is then inherited down folder trees, meaning that if an attacker gets inside, or a user goes rogue, all of that exposed data is vulnerable. Other groups such as Authenticated Users and Domain Users are almost as bad, and this can give an attacker access to a great deal of data once they gain a foothold in your environment.
Cloud environments take this one step further: they actively have the capacity to share data with the entire world. Attackers do not even need to breach anything; a simple misconfiguration can allow anyone, hostile or otherwise, free access to that data. Because such sharing is a function of the platform, designed to be used in certain circumstances, the exposure is not a bug or a fault that you would have any claim against the provider for; the lost data is the fault of the admin, not of the provider.
So, just as it is vital to have data lifecycle and management plans for on-premises data, it is even more important for cloud environments, and people have to understand that these will likely need to vary slightly from the on-premises version due to the unique considerations of the environment. These plans need to be rigorously enforced and tested to avoid incidents such as this. The fact that such a hole in security was left open for multiple weeks, even after notification, demonstrates very poor practice, especially given the nature of some of the data held.
Tips to mitigate this:
1 – Put in place a strong data lifecycle and management plan: determine exactly what sort of data should be held where and what controls should be applied to it. These should include a simple baseline, such as removing public access and limiting groups, and they should be cross-checked by someone else to make sure they are done correctly, given the significant damage breaches like this can cause.
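As a minimal sketch of that kind of baseline check, the snippet below walks an S3-style ACL structure and flags grants that expose data to the world or to all authenticated users. The grantee URIs mirror the ones AWS S3 uses for its global groups, but the checker itself is a local illustration (the `find_public_grants` helper and the example ACL are hypothetical), not a call against any cloud API.

```python
# Group grantee URIs that AWS S3 uses for "anyone" and "any AWS account".
# Grants to either of these expose the data far more widely than intended.
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def find_public_grants(acl):
    """Return the grants in an S3-style ACL dict that expose data publicly."""
    risky = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            risky.append(grant)
    return risky

# Illustrative ACL: the owner's grant is fine, the AllUsers grant is not.
example_acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "bucket-owner"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group",
                     "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
         "Permission": "READ"},
    ]
}

flagged = find_public_grants(example_acl)
```

A check like this is cheap enough to run on every bucket on a schedule, which is exactly the sort of cross-checking the tip describes: the person reviewing the output need not be the person who set the permissions.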
2 – Data discovery should be run against cloud environments, not only to check what admins have put up there but also to keep abreast of the types and locations of data that users are placing in cloud storage. Users tend to scatter data everywhere, and it is important to check they are not storing anything unacceptable to the organisation in question.
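The idea behind discovery can be sketched with a simple pattern scan over text pulled from storage. The patterns below (an email shape and a 13–16 digit card-like number) are illustrative assumptions only; real discovery tooling uses much richer classifiers and validation, but the flow is the same: scan, classify, report.

```python
import re

# Illustrative sensitive-data patterns; a real tool would validate matches
# (e.g. Luhn checks for card numbers) and cover many more data types.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card_number": re.compile(r"\b\d(?:[ -]?\d){12,15}\b"),
}

def discover(text):
    """Return a dict mapping pattern name to the matches found in the text."""
    hits = {}
    for name, pattern in PATTERNS.items():
        found = pattern.findall(text)
        if found:
            hits[name] = found
    return hits

sample = "Contact alice@example.com, card 4111 1111 1111 1111 on file."
findings = discover(sample)
```

Run against each file a user uploads, a report like `findings` tells you quickly whether data that should never leave the organisation is sitting in a shared bucket.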
3 – Audit all activity. Only auditing can tell you who is making these changes, who might need a little more education, and indeed whether someone has downloaded your data from outside your organisation. It should never come down to an external body telling you that you have been breached.
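To show how an audit trail answers that last question, here is a minimal sketch that walks a list of audit events and flags downloads made from outside the organisation's network. The event layout and the 10.0.0.0/8 "internal" range are assumptions for illustration; real cloud audit logs (CloudTrail, Azure Activity Log, and so on) have their own formats, but the filtering logic is the same.

```python
import ipaddress

# Assumed internal address range; substitute your organisation's real ranges.
ORG_NETWORK = ipaddress.ip_network("10.0.0.0/8")

def external_downloads(events):
    """Return download events whose source IP falls outside ORG_NETWORK."""
    flagged = []
    for event in events:
        if event["action"] != "download":
            continue  # only downloads can exfiltrate data
        if ipaddress.ip_address(event["source_ip"]) not in ORG_NETWORK:
            flagged.append(event)
    return flagged

# Illustrative audit log: bob's download comes from a public address.
audit_log = [
    {"user": "alice", "action": "download", "source_ip": "10.1.2.3"},
    {"user": "bob", "action": "download", "source_ip": "203.0.113.7"},
    {"user": "carol", "action": "upload", "source_ip": "198.51.100.4"},
]

suspicious = external_downloads(audit_log)
```

Even this crude filter surfaces the event that matters: a download from an address the organisation does not own, which is precisely the signal you want your own auditing to raise before an external body does.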
Read the article that was analysed here: https://www.infosecurity-magazine.com/news/indian-payment-app-bhim-data-breach/