Fortifying Cloud Vaults: A Guide to Securing Cloud Storage - Part 3
As we continue our journey through the realm of cloud storage services, our focus now shifts to Google Cloud Platform (GCP), a leading player within the cloud computing field. With its array of innovative services and solutions, GCP offers organisations a powerful platform to build, deploy, and manage their applications and data in the cloud.
In this third part of our multi-part Cloud Storage Security blog series, we delve into GCP Buckets, Google Cloud’s scalable and reliable object storage service. GCP Buckets offer a flexible and cost-effective solution for storing a wide range of data types, from documents and images to multimedia content and application backups.
While each cloud provider offers unique features and strengths, common threads link them together. All three platforms (AWS, Azure, and GCP) prioritise data security, scalability, and reliability, enabling organisations to build resilient and agile IT infrastructures.
GCP Buckets
GCP Buckets are storage containers provided by Google Cloud Platform (GCP) for storing objects such as files and other data. They are versatile and can hold various types of data, including structured and unstructured data, media files, backups, and more, and they are accessible through APIs, command-line tools, and graphical user interfaces provided by Google Cloud Platform. Here’s an overview of their key features and capabilities:
- Scalability: GCP buckets offer virtually unlimited scalability, allowing you to store petabytes of data without worrying about capacity constraints.
- Durability: Google Cloud Storage provides high durability for data stored in GCP buckets, with multiple copies of each object stored across multiple data centres to ensure data resilience.
- Availability: GCP buckets offer high availability, with built-in redundancy and failover mechanisms to ensure that your data is always accessible when needed.
- Security: GCP buckets come with robust security features, including encryption at rest and in transit, access controls, and IAM policies to control access to your data.
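As a quick illustration before the security checks below, a bucket can be created from the command line. This is a minimal sketch with placeholder values; the -b on flag, which enables uniform bucket-level access, is our suggested hardening default rather than something the checks below require:
# Create a bucket with uniform bucket-level access enabled (all <...> values are placeholders)
gsutil mb -p <project-id> -l US -b on gs://<bucket-name>/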
Publicly Exposed Buckets
Risk Level: High
Ensure that the IAM policy linked to your Google Cloud Storage buckets restricts access from anonymous and public users. By adjusting the IAM policies, you can deny access to both “allUsers” and “allAuthenticatedUsers” members, which encompass any user on the Internet and any user or service account signed into Google Cloud Platform (GCP) with a Google account, respectively.
Misconfigured access permissions are a prevalent security vulnerability affecting Cloud Storage resources. Granting permissions to the “allUsers” and “allAuthenticatedUsers” members can give unrestricted access to your bucket’s content.
To identify any publicly accessible buckets within your Google Cloud account, follow these steps:
How can we check whether a bucket is publicly accessible?
Execute the “projects list” command with custom query filters to display the IDs of all the Google Cloud Platform (GCP) projects accessible in your cloud account.
gcloud projects list \
--format="table(projectId)"
Execute the “gsutil ls” command to list the identifier (name) of each storage bucket created for the specified Google Cloud Platform (GCP) project.
gsutil ls -p <project-id>
Execute the “gsutil iam get” command, providing the name of the Cloud Storage bucket you wish to examine as the identifier parameter, and pipe the JSON output through jq to list the IAM member(s) associated with the selected bucket.
gsutil iam get gs://<bucket-name> | jq '.bindings[].members[]'
If the list of IAM member names returned by the “gsutil iam get” command includes “allUsers” and/or “allAuthenticatedUsers”, the designated Google Cloud Storage bucket is publicly accessible, meaning anyone on the Internet can access it.
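To audit an entire project in one pass, the two commands above can be combined into a short shell loop. This is a minimal sketch, assuming gsutil and jq are installed and <project-id> is replaced with a real project ID:
# Flag every bucket in the project that grants access to allUsers or allAuthenticatedUsers
for bucket in $(gsutil ls -p <project-id>); do
  gsutil iam get "$bucket" | jq -r '.bindings[].members[]' \
    | grep -Eq '^(allUsers|allAuthenticatedUsers)$' && echo "PUBLIC: $bucket"
done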
How can we modify the bucket’s access?
Execute the “gsutil iam ch -d” command, specifying the name of the publicly accessible Cloud Storage bucket you want to reconfigure as the identifier parameter. This command removes the “allUsers” member binding from the IAM policy associated with the selected bucket.
gsutil iam ch \
-d allUsers gs://<bucket-name>
Execute the “gsutil iam ch -d” command using the name of the publicly accessible storage bucket you want to reconfigure as the identifier parameter. This command removes the “allAuthenticatedUsers” member binding from the IAM policy associated with the selected bucket.
gsutil iam ch \
-d allAuthenticatedUsers gs://<bucket-name>
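Beyond removing the individual bindings, Cloud Storage also offers public access prevention, which rejects any future attempt to grant access to these members. As an optional hardening step on top of the remediation above, it can be enforced per bucket:
# Enforce public access prevention so allUsers/allAuthenticatedUsers grants are rejected going forward
gsutil pap set enforced gs://<bucket-name>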
Object Encryption using Customer-Managed Keys
Risk Level: High
To ensure maximum control over the encryption and decryption process of your data stored in Google Cloud Storage, you can opt for Customer-Managed Keys (CMKs). With Cloud Key Management Service (Cloud KMS), you have the ability to create, rotate, manage, and delete your own CMKs.
By default, Google Cloud Storage encrypts all data within buckets using Google-managed keys; using CMKs instead grants you full control over data encryption. This additional layer of control is particularly beneficial for sensitive and confidential data, offering enhanced security and compliance with stringent organisational requirements.
How can we check whether GCP Storage buckets are configured to encrypt data with a CMK?
To list the IDs of all the Google Cloud Platform (GCP) projects available in your cloud account using the gcloud command-line tool with custom query filters, you can use the following command:
gcloud projects list \
--format="table(projectId)"
To list the identifiers of each storage bucket created for a specified GCP project using the gsutil Python tool, you can execute the following command:
gsutil ls -p <project-id>
Use the “gsutil kms encryption” command with the bucket name as the identifier parameter to inspect the default encryption key configured for the chosen Cloud Storage bucket.
gsutil kms encryption gs://<bucket-name>/
If the default encryption key returned by the “gsutil kms encryption” command does not match the format “projects/<project-id>/locations/us/keyRings/<key-ring-name>/cryptoKeys/<key-name>”, the data stored in the selected Google Cloud Storage bucket is not encrypted using a Customer-Managed Key (CMK).
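As with the public-access check, this inspection can be scripted across every bucket in a project. A minimal sketch, again assuming <project-id> is replaced with a real project ID:
# Print the default encryption key (if any) for every bucket in the project
for bucket in $(gsutil ls -p <project-id>); do
  gsutil kms encryption "$bucket"
done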
How can we enable object encryption for the bucket using Customer-Managed Keys?
To create a new Cloud KMS key ring in a specified location (the region where your resources will be encrypted and decrypted), you can use the following command:
gcloud kms keyrings create <key-ring-name> \
--location=us \
--project=<project-id> \
--format="table(name)"
Execute the “kms keys create” command to generate a new Cloud KMS Customer-Managed Key (CMK) within the KMS key ring established in the preceding step (the next rotation time must be a future timestamp, e.g. 2030-10-10T12:00:00.0000Z):
gcloud kms keys create <key-name> \
--location=us \
--keyring=<key-ring-name> \
--purpose=encryption \
--protection-level=software \
--rotation-period=90d \
--next-rotation-time=<next-rotation-time> \
--format="table(name)"
To assign the Cloud KMS “CryptoKey Encrypter/Decrypter” role to the Cloud Storage service account and grant it permission to utilise your new CMK, execute the following “projects add-iam-policy-binding” command:
gcloud projects add-iam-policy-binding <project-id> \
--member serviceAccount:service-<project-number>@gs-project-accounts.iam.gserviceaccount.com \
--role roles/cloudkms.cryptoKeyEncrypterDecrypter
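Note that the service account address above uses the project number, not the project ID; it can be looked up with gcloud. Alternatively, gsutil provides a shortcut that authorises the Cloud Storage service agent in a single step (a sketch under the same placeholder assumptions):
# Look up the project number embedded in the Cloud Storage service account address
gcloud projects describe <project-id> --format="value(projectNumber)"
# Alternative one-step authorisation of the service agent for the CMK
gsutil kms authorize -p <project-id> -k projects/<project-id>/locations/us/keyRings/<key-ring-name>/cryptoKeys/<key-name>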
Execute the “gsutil kms encryption” command with the Cloud Storage bucket name as the identifier parameter and the name (ID) of the new Cloud KMS Customer-Managed Key as the value for the -k parameter. This will enable encryption at rest for the specified bucket using Customer-Managed Keys (CMKs).
gsutil kms encryption \
-k projects/<project-id>/locations/us/keyRings/<key-ring-name>/cryptoKeys/<key-name> gs://<bucket-name>/
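To confirm the change took effect, re-run the earlier check; the output should now report the CMK resource name as the bucket’s default encryption key:
# Verify the bucket's default encryption key now points at the CMK
gsutil kms encryption gs://<bucket-name>/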
Conclusion
Securing cloud storage, especially within GCP Buckets, is imperative for safeguarding sensitive data and ensuring adherence to regulatory standards. By embracing industry best practices, organisations can effectively mitigate potential risks and uphold the integrity of their valuable assets.
Encryption stands as a cornerstone of data protection, whether data is at rest or in transit. Leveraging server-side encryption, ideally with customer-managed keys through Cloud KMS, adds an additional layer of security, helping guarantee that data remains encrypted and inaccessible even in instances of unauthorised access attempts.
Furthermore, it is crucial to curtail public access to storage resources. Regularly scrutinising and revoking any public access permissions within GCP Buckets is essential to prevent inadvertent exposure of confidential information on the internet.
In summary, fortifying GCP Buckets entails implementing encryption for data at rest and in transit, limiting public access permissions to deter unauthorised exposure, and meticulously managing access through robust IAM policies. By prioritising these security measures, enterprises can confidently harness the scalability and adaptability of GCP Buckets while effectively mitigating potential risks.
As we conclude this segment of our multi-part Cloud Storage Security blog series, we’ve examined the importance of securing cloud storage across various platforms, including AWS S3, Azure Storage Containers, and GCP Buckets.