---
description: In the HackTricks Cloud Methodology you will find how to pentest cloud environments
---

# Pentesting Cloud Methodology

Support HackTricks and get benefits!

## Basic Methodology

Each cloud has its own peculiarities, but in general there are a few common things a pentester should check when testing a cloud environment:

* **Benchmark checks**
  * This will help you understand the size of the environment and the services used
  * It will also allow you to find some quick misconfigurations, as you can perform most of these tests with automated tools
* **Services Enumeration**
  * You probably won't find many more misconfigurations here if you performed the benchmark tests correctly, but you might find some that weren't being looked for in the benchmark tests.
  * This will allow you to know what exactly is being used in the cloud env
  * This will help a lot in the next steps
* **Check Exposed services**
  * This can be done during the previous section; you need to find out everything that is potentially exposed to the Internet somehow and how it can be accessed.
    * Here I'm talking about manually exposed infrastructure, like instances with web pages or other ports being exposed, and also about cloud managed services that can be configured to be exposed (such as DBs or buckets)
  * Then you should check whether that resource should be exposed or not (confidential information? vulnerabilities? misconfigurations in the exposed service?)
* **Check permissions**
  * Here you should find out all the permissions of each role/user inside the cloud and how they are used
    * Too many highly privileged (control everything) accounts? Generated keys that aren't used?... Most of these checks should have been done in the benchmark tests already
    * If the client is using OpenID, SAML or another federation you might need to ask them for further information about how each role is being assigned (it's not the same if the admin role is assigned to 1 user or to 100)
  * It's not enough to find which users have admin permissions (`*:*`). There are a lot of other permissions that, depending on the services used, can be very sensitive.
    * Moreover, there are potential privesc paths that abuse permissions. All these things should be taken into account and as many privesc paths as possible should be reported (a quick GCP check is sketched right after this list).
* **Check Integrations**
  * It's highly probable that integrations with other clouds or SaaS are being used inside the cloud env.
    * For integrations of the cloud you are auditing with other platforms, you should note who has access to (ab)use that integration and ask how sensitive the action being performed is.
      For example, who can write to an AWS bucket where GCP is getting data from (ask how sensitive the action in GCP treating that data is).
    * For integrations inside the cloud you are auditing from external platforms, you should ask who has external access to (ab)use that integration and check how that data is being used.
      For example, if a service is using a Docker image hosted in GCR, you should ask who has access to modify that image and which sensitive info and access that image will get when executed inside an AWS cloud.
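
The permissions review can be partially automated. A minimal sketch (assuming an authenticated `gcloud` and permission to read IAM policies) that flags members bound to the primitive Owner role in every accessible GCP project:

```bash
# Minimal sketch: list members holding roles/owner in every project gcloud can see
for pid in $(gcloud projects list --format="value(projectId)"); do
    echo "=== $pid ==="
    gcloud projects get-iam-policy "$pid" \
        --flatten="bindings[].members" \
        --filter="bindings.role=roles/owner" \
        --format="table(bindings.role, bindings.members)"
done
```

The same idea applies to any other sensitive role; just change the filter.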

## Multi-Cloud tools

There are several tools that can be used to test different cloud environments. The installation steps and links are indicated in this section.

### PurplePanda

A tool to identify bad configurations and privesc paths in clouds and across clouds/SaaS.

{% tabs %} {% tab title="Install" %}

```bash
# You need to install and run neo4j also
git clone https://github.com/carlospolop/PurplePanda
cd PurplePanda
python3 -m venv .
source bin/activate
python3 -m pip install -r requirements.txt
export PURPLEPANDA_NEO4J_URL="bolt://neo4j@localhost:7687"
export PURPLEPANDA_PWD="neo4j_pwd_4_purplepanda"
python3 main.py -h # Get help
```

{% endtab %}

{% tab title="GCP" %}

```bash
export GOOGLE_DISCOVERY=$(echo 'google:
- file_path: ""

- file_path: ""
  service_account_id: "some-sa-email@sidentifier.iam.gserviceaccount.com"' | base64)

python3 main.py -a -p google # Get basic info of the account to check it's correctly configured
python3 main.py -e -p google # Enumerate the env
```

{% endtab %} {% endtabs %}

### CloudSploit

AWS, Azure, GitHub, Google, Oracle, Alibaba.

{% tabs %} {% tab title="Install" %}

```bash
# Install
git clone https://github.com/aquasecurity/cloudsploit.git
cd cloudsploit
npm install
./index.js -h
## Docker instructions in github
```

{% endtab %}

{% tab title="GCP" %}

```bash
## You need to have creds for a service account and set them in the config.js file
./index.js --cloud google --config </abs/path/to/config.js>
```

{% endtab %} {% endtabs %}

### ScoutSuite

AWS, Azure, GCP, Alibaba Cloud, Oracle Cloud Infrastructure.

{% tabs %} {% tab title="Install" %}

```bash
mkdir scout; cd scout
virtualenv -p python3 venv
source venv/bin/activate
pip install scoutsuite
scout --help
## Using Docker: https://github.com/nccgroup/ScoutSuite/wiki/Docker-Image
```

{% endtab %}

{% tab title="GCP" %}

```bash
scout gcp --report-dir /tmp/gcp --user-account --all-projects
## use "--service-account KEY_FILE" instead of "--user-account" to use a service account

SCOUT_FOLDER_REPORT="/tmp"
for pid in $(gcloud projects list --format="value(projectId)"); do
    echo "================================================"
    echo "Checking $pid"
    mkdir "$SCOUT_FOLDER_REPORT/$pid"
    scout gcp --report-dir "$SCOUT_FOLDER_REPORT/$pid" --no-browser --user-account --project-id "$pid"
done
```

{% endtab %} {% endtabs %}

{% tabs %} {% tab title="Install" %} Download and install Steampipe (https://steampipe.io/downloads). Or use Brew:

```bash
brew tap turbot/tap
brew install steampipe
```

{% endtab %}

{% tab title="GCP" %}

```bash
# Install gcp plugin
steampipe plugin install gcp

# Use https://github.com/turbot/steampipe-mod-gcp-compliance.git
git clone https://github.com/turbot/steampipe-mod-gcp-compliance.git
cd steampipe-mod-gcp-compliance
# To run all the checks from the dashboard
steampipe dashboard
# To run all the checks from the cli
steampipe check all
```

**Check all Projects**

In order to check all the projects you need to generate a gcp.spc file indicating all the projects to test. You can just follow the indications of the following script:

FILEPATH="/tmp/gcp.spc"
rm -rf "$FILEPATH" 2>/dev/null

# Generate a json like object for each project
for pid in $(gcloud projects list --format="value(projectId)"); do
echo "connection \"gcp_$(echo -n $pid | tr "-" "_" )\" {
    plugin  = \"gcp\"
    project = \"$pid\"
}" >> "$FILEPATH"
done

# Generate the aggragator to call
echo 'connection "gcp_all" {
  plugin      = "gcp" 
  type        = "aggregator"
  connections = ["gcp_*"]
}' >> "$FILEPATH"

echo "Copy $FILEPATH in ~/.steampipe/config/gcp.spc if it was correctly generated"

To check other GCP insights (useful for enumerating services) use: https://github.com/turbot/steampipe-mod-gcp-insights

To check Terraform GCP code: https://github.com/turbot/steampipe-mod-terraform-gcp-compliance

More GCP plugins of Steampipe: https://github.com/turbot?q=gcp {% endtab %}

{% tab title="AWS" %}

```bash
# Install aws plugin
steampipe plugin install aws

# Modify the spec indicating in "profile" the profile name to use
nano ~/.steampipe/config/aws.spc

# Get some info on how the AWS account is being used
git clone https://github.com/turbot/steampipe-mod-aws-insights.git
cd steampipe-mod-aws-insights
steampipe dashboard

# Get the services exposed to the internet
git clone https://github.com/turbot/steampipe-mod-aws-perimeter.git
cd steampipe-mod-aws-perimeter
steampipe dashboard

# Run the benchmarks
git clone https://github.com/turbot/steampipe-mod-aws-compliance
cd steampipe-mod-aws-compliance
steampipe dashboard # To see results in browser
steampipe check all --export=/tmp/output4.json
```

To check Terraform AWS code: https://github.com/turbot/steampipe-mod-terraform-aws-compliance

More AWS plugins of Steampipe: https://github.com/orgs/turbot/repositories?q=aws {% endtab %} {% endtabs %}

### cs-suite

AWS, GCP, Azure, DigitalOcean.
It requires Python 2.7 and looks unmaintained.
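
As a reference, a hedged install sketch (the repo URL and steps are assumptions based on the cs-suite README; verify them there):

```bash
# Hedged sketch: install cs-suite in a Python 2.7 virtualenv (check the repo's README)
git clone https://github.com/SecurityFTW/cs-suite.git
cd cs-suite
virtualenv -p python2.7 venv
source venv/bin/activate
pip install -r requirements.txt
python cs.py --help
```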

### Nessus

Nessus has an Audit Cloud Infrastructure scan supporting: AWS, Azure, Office 365, Rackspace, Salesforce. Some extra configurations in Azure are needed to obtain a Client Id.

### Cloudlist

Cloudlist is a multi-cloud tool for getting assets (hostnames, IP addresses) from cloud providers.

{% tabs %} {% tab title="Cloudlist" %}

```bash
cd /tmp
wget https://github.com/projectdiscovery/cloudlist/releases/latest/download/cloudlist_1.0.1_macOS_arm64.zip
unzip cloudlist_1.0.1_macOS_arm64.zip
chmod +x cloudlist
sudo mv cloudlist /usr/local/bin
```

{% endtab %}

{% tab title="Second Tab" %}

```bash
## For GCP it requires service account JSON credentials
cloudlist -config </path/to/config>
```

{% endtab %} {% endtabs %}

### Cartography

Cartography is a Python tool that consolidates infrastructure assets and the relationships between them in an intuitive graph view powered by a Neo4j database.

{% tabs %} {% tab title="Install" %}

```bash
# Installation
docker image pull ghcr.io/lyft/cartography
docker run --platform linux/amd64 ghcr.io/lyft/cartography cartography --help
## Install a Neo4j DB version 3.5.*
```

{% endtab %}

{% tab title="GCP" %}

```bash
docker run --platform linux/amd64 \
     --volume "$HOME/.config/gcloud/application_default_credentials.json:/application_default_credentials.json" \
     -e GOOGLE_APPLICATION_CREDENTIALS="/application_default_credentials.json" \
     -e NEO4J_PASSWORD="s3cr3t" \
     ghcr.io/lyft/cartography \
     --neo4j-uri bolt://host.docker.internal:7687 \
     --neo4j-password-env-var NEO4J_PASSWORD \
     --neo4j-user neo4j


# It only checks a few services inside GCP (https://lyft.github.io/cartography/modules/gcp/index.html)
## Cloud Resource Manager
## Compute
## DNS
## Storage
## Google Kubernetes Engine
### If you can run starbase or purplepanda you will get more info
```

{% endtab %} {% endtabs %}
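Once Cartography has ingested the data, you can query the Neo4j graph directly. A minimal sketch, assuming the Neo4j instance from the run above is reachable locally and that the GCPInstance label exists in your dataset (label names depend on the Cartography version and the modules that ran):

```bash
# Hedged sketch: list some ingested GCP instances via cypher-shell
echo "MATCH (i:GCPInstance) RETURN i.id LIMIT 20;" | \
    cypher-shell -a bolt://localhost:7687 -u neo4j -p s3cr3t
```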

### Starbase

Starbase collects assets and relationships from services and systems including cloud infrastructure, SaaS applications, security controls, and more into an intuitive graph view backed by the Neo4j database.

{% tabs %} {% tab title="Install" %}

```bash
# You are going to need Node version 14, so install nvm following https://tecadmin.net/install-nvm-macos-with-homebrew/
nvm install 14
nvm use 14
npm install --global yarn
git clone https://github.com/JupiterOne/starbase.git
cd starbase
yarn install
yarn starbase --help
# Configure config.yaml manually depending on the env to analyze
yarn starbase setup
yarn starbase run

# Docker
git clone https://github.com/JupiterOne/starbase.git
cd starbase
cp config.yaml.example config.yaml
# Configure config.yaml manually depending on the env to analyze
docker build --no-cache -t starbase:latest .
docker-compose run starbase setup
docker-compose run starbase run
```

{% endtab %}

{% tab title="GCP" %}

```yaml
## Config for GCP
### Check out: https://github.com/JupiterOne/graph-google-cloud/blob/main/docs/development.md
### It requires service account credentials

integrations:
  -
    name: graph-google-cloud
    instanceId: testInstanceId
    directory: ./.integrations/graph-google-cloud
    gitRemoteUrl: https://github.com/JupiterOne/graph-google-cloud.git
    config:
      SERVICE_ACCOUNT_KEY_FILE: '{Check https://github.com/JupiterOne/graph-google-cloud/blob/main/docs/development.md#service_account_key_file-string}'
      PROJECT_ID: ""
      FOLDER_ID: ""
      ORGANIZATION_ID: ""
      CONFIGURE_ORGANIZATION_PROJECTS: false

storage:
  engine: neo4j
  config:
    username: neo4j
    password: s3cr3t
    uri: bolt://localhost:7687
    # Consider using host.docker.internal if from docker
```

{% endtab %} {% endtabs %}

### SkyArk

Discover the most privileged users in the scanned AWS or Azure environment, including the AWS Shadow Admins. It uses PowerShell.

```powershell
Import-Module .\SkyArk.ps1 -force
Start-AzureStealth

# in the Cloud Console
IEX (New-Object Net.WebClient).DownloadString('https://raw.githubusercontent.com/cyberark/SkyArk/master/AzureStealth/AzureStealth.ps1')
Scan-AzureAdmins
```

### CloudBrute

A tool to find a company's (target's) infrastructure, files, and apps on the top cloud providers (Amazon, Google, Microsoft, DigitalOcean, Alibaba, Vultr, Linode).
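
A hedged usage sketch (the flags below follow the CloudBrute README example and the values are placeholders; verify with `-h`):

```bash
# Hedged sketch: brute-force storage assets for a target keyword across providers
cloudbrute -d target.com -k target -m storage -t 80 -T 10 -w ./data/storage_small.txt
```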

### More lists of cloud security tools

## Google

### GCP

{% content-ref url="gcp-security/" %} gcp-security {% endcontent-ref %}

### Workspace

{% content-ref url="workspace-security.md" %} workspace-security.md {% endcontent-ref %}

## AWS

{% content-ref url="aws-security/" %} aws-security {% endcontent-ref %}

## Azure

Access the portal here: http://portal.azure.com/
To start the tests you should have access with a user with **Reader** permissions over the subscription and the **Global Reader** role in AzureAD. If even in that case you are not able to access the content of the Storage accounts, you can fix it with the role **Storage Account Contributor**.

It is recommended to install azure-cli on Linux and Windows virtual machines (to be able to run PowerShell and Python scripts): https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest
Then, run az login to log in. Note that the account information and token will be saved inside <HOME>/.azure (in both Windows and Linux).
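
A minimal session sketch to confirm the access you were given:

```bash
az login                        # opens a browser to authenticate
az account show                 # verify the signed-in user and current subscription
az account list --output table  # list all subscriptions the user can read
```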

Remember that if the Security Center Standard Pricing Tier is being used (and not the free tier), you can generate a CIS compliance scan report from the Azure portal. Go to Policy & Compliance -> Regulatory Compliance (or try to access https://portal.azure.com/#blade/Microsoft_Azure_Security/SecurityMenuBlade/22).
If the company is not paying for a Standard account you may need to review the CIS Microsoft Azure Foundations Benchmark by "hand" (you can get some help using the following tools). Download it from here.

### Run scanners

Run the scanners to look for vulnerabilities and to compare the security measures implemented with CIS benchmarks.

```bash
pip install scout
scout azure --cli --report-dir <output_dir>

# Fix azureaudit.py before launching cs.py
# Adding "j_res = {}" on line 1074
python cs.py -env azure

# Azucar is an Azure security scanner for PowerShell (https://github.com/nccgroup/azucar)
# Run it from its folder
.\Azucar.ps1 -AuthMode Interactive -ForceAuth -ExportTo EXCEL

# Azure-CIS-Scanner, CIS scanner for Azure (https://github.com/kbroughton/azure_cis_scanner)
pip3 install azure-cis-scanner # Install
azscan # Run, login before with `az login`
```

### Attack Graph

Stormspotter creates an “attack graph” of the resources in an Azure subscription. It enables red teams and pentesters to visualize the attack surface and pivot opportunities within a tenant, and supercharges your defenders to quickly orient and prioritize incident response work.
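
A hedged run sketch (per the Stormspotter README the repo ships a docker-compose file that brings up the backend, frontend and Neo4j; the collector is run separately against the tenant):

```bash
# Hedged sketch: start the Stormspotter UI and database (see the repo's README for the collector)
git clone https://github.com/Azure/Stormspotter
cd Stormspotter
docker-compose up
```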

### More checks

* Check for a high number of Global Admins (between 2 and 4 are recommended; see the sketch after this list). Access it on: https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview

* Global admins should have MFA activated. Go to Users and click on the Multi-Factor Authentication button.

* Dedicated admin accounts shouldn't have mailboxes (they can only have mailboxes if they have Office 365).

* Local AD shouldn't be synced with Azure AD if it's not needed (https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/AzureADConnect). And if synced, Password Hash Sync should be enabled for reliability.

* Global Administrators shouldn't be synced from a local AD. Check if the Global Administrators' emails use the domain onmicrosoft.com. If not, check the source of the user; the source should be Azure Active Directory. If it comes from Windows Server AD, report it.

* The Standard tier is recommended instead of the free tier (see the tier being used in Pricing & Settings or in https://portal.azure.com/#blade/Microsoft_Azure_Security/SecurityMenuBlade/24)

* Periodic SQL server scans:

  Select the SQL server --> Make sure that 'Advanced data security' is set to 'On' --> Under 'Vulnerability assessment settings', set 'Periodic recurring scans' to 'On', and configure a storage account for storing vulnerability assessment scan results --> Click Save

* Lack of App Services restrictions: Look for "App Services" in Azure (https://portal.azure.com/#blade/HubsExtension/BrowseResource/resourceType/Microsoft.Web%2Fsites) and check if any is being used. In that case, go through each App checking for "Access Restrictions"; if there are no rules, report it. Access to the app service should be restricted according to the needs.
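
To count Global Admins from the CLI instead of the portal, a minimal sketch (assuming `az login` was already done and the account can read directory roles via Microsoft Graph):

```bash
# Minimal sketch: enumerate Global Administrator role members via az rest + Microsoft Graph
ROLE_ID=$(az rest --method GET \
    --url "https://graph.microsoft.com/v1.0/directoryRoles" \
    --query "value[?displayName=='Global Administrator'].id" -o tsv)
az rest --method GET \
    --url "https://graph.microsoft.com/v1.0/directoryRoles/$ROLE_ID/members" \
    --query "value[].userPrincipalName" -o tsv
```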

## Office365

You need Global Admin or at least Global Admin Reader permissions (but note that Global Admin Reader is a bit limited). However, those limitations appear in some PS modules and can be bypassed by accessing the features via the web application.

## Other Cloud Pentesting Guides

Support HackTricks and get benefits!