Automated Deep Security Health Check, Currently in Beta
Check out QUICKSTART.md
Note: if you just want to generate a report for a client, go to Releases and download the executables; there are no prerequisites.

The complete set of prerequisites and dependencies can be found in the Dockerfile. We advise against running the tool without Docker.

- Docker
- Python 3.7+ and the Python requirements
- PyInstaller, to generate executables for the extractor tool
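If you do run the tool outside Docker despite the advice above, the Python prerequisites might be installed roughly as sketched below; the requirements file name is an assumption, so check the repository and the Dockerfile for the authoritative list.

# Install the Python dependencies (file name assumed) and PyInstaller,
# the latter only needed to build extractor executables
pip install -r requirements.txt
pip install pyinstaller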
- Run the install_docker_image.sh script. It will:
  - automatically build a docker image and create configuration files and dirs at /etc/DSHC/
  - make a dshc alias to run a container and mount the correct directories

Alternatively, to set things up manually:

- Inside the directory, run docker build -t dshc:latest . (or any other tag)
- You will need to create or choose FOUR directories for the container's volumes:
  - one to store the asymmetric keys
  - one to store configuration files
    - Create an api_config.yml file in your chosen configuration directory (see the file's contents below in CONFIGURATION)
    - Copy/move all other files included in the repository's own config directory
  - (optional) one to store data packs from the extractor tool
  - one to receive the reports from the container
Please follow the docker post-install guide to configure permissions.
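For reference, the dshc alias created by install_docker_image.sh roughly corresponds to a docker run that mounts the four directories. The sketch below is illustrative only: the host paths follow the /etc/DSHC/ layout created by the script, and the container-side paths are assumptions.

# Mount the four host directories into the container (container-side paths are
# illustrative) and run the tool with its default checks
docker run --rm -it \
  -v /etc/DSHC/keys:/dshc/keys \
  -v /etc/DSHC/config:/dshc/config \
  -v /etc/DSHC/data_packs:/dshc/data_packs \
  -v /etc/DSHC/reports:/dshc/reports \
  dshc:latest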
CONFIGURATION

The configuration files for the program are placed in the tool's config directory.

api_config.yml

Required to configure access to the DSM RESTful API. The following fields are expected:

host
The DSM API URL/IP Address. Include https:// and the port (if non-standard)

api-secret-key
The API Read-only access key

api-version
Version in use by the API
EXAMPLE
host: https://app.deepsecurity.trendmicro.com/api
api-secret-key: mysecretkey
api-version: 'v1'
report_details.yml
To configure details used in non-critical parts of the report (DETAILS PENDING)
usage: dshc [-h] [--licenses LICENSES [LICENSES ...]]
[--modules MODULES [MODULES ...]] [--standard STANDARD] [--debug]
[--stress STRESS] [--remote REMOTE] [--generate-keys] [--migrate]
[--language LANGUAGE] [--key KEY] [--password PASSWORD]
[--output OUTPUT] [--version]
-h, --help show this help message and exit
--licenses LICENSES [LICENSES ...], -l LICENSES [LICENSES ...]
Space separated list of modules (grouped by license)
to be checked
--modules MODULES [MODULES ...], -m MODULES [MODULES ...]
Space separated list of modules to be checked
--standard STANDARD, -c STANDARD
Conformity Standard to be used
--debug, -d Debug messages will be written in debug.log
--stress STRESS, -s STRESS
Stress test - Choose number (dummy) of computers to
run test
--remote REMOTE, -r REMOTE
Loads a previously extracted data package
--generate-keys, -g Generate a new cryptographic key pair and exits
--migrate Migrates a Data Package to a new key pair and exits
--language LANGUAGE Language to be used in the report
--key KEY, -k KEY Path to your private key
--password PASSWORD, -p PASSWORD
Your private key password, this will be written in
your history file!
--output OUTPUT, -o OUTPUT
Path that zip file will be stored
--version, -v Print version and exit
VALID LANGUAGES INPUT

en English
jp Japanese

VALID LICENSES INPUT (SPACE SEPARATED)

mp (Malware Protection - Anti-Malware, Web Reputation)
ss (System Security - Integrity Monitoring, Log Inspection, Application Control)
ns (Network Security - Firewall, Intrusion Prevention)
all (All modules)

VALID MODULES INPUT (SPACE SEPARATED)

am (Anti-Malware)
wr (Web Reputation)
im (Integrity Monitoring)
li (Log Inspection)
ac (Application Control)
fw (Firewall)
ip (Intrusion Prevention)

Note: If no modules or licenses are explicitly declared, all modules will be checked by default
dshc -r my_pack.dat -l mp ss -m ip
- Loads my_pack.dat from the /etc/DSHC/data_packs directory and checks all Malware Protection and System Security modules, plus the Intrusion Prevention module
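As a further illustration using the flags documented above (the output path is a placeholder), a run against the live API configured in api_config.yml might look like:

# Check only Anti-Malware and Firewall, produce the report in Japanese,
# and store the resulting zip under /tmp/dshc (illustrative path)
dshc -m am fw --language jp -o /tmp/dshc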
The extractor Windows and Linux binaries are available for download in the Releases tab. Note that official PUBLIC keys are shipped with the binaries, allowing for remote processing of data on our cloud.

This tool is used to parse information from the client's DSM through the API. It can be used on-premises in air-gapped environments, and on CloudOne Workload Security (SaaS).

The extractor.py script accesses the API and generates an encrypted data package containing the Deep Security environment's information. The data packages can then be decrypted and used through the -r or --remote mode of the main dshc program, provided you have the private keys.

The Extractor allows for automatic generation of the report using our cloud, which is the standard behaviour if you are using the downloaded binaries. If you wish to only generate the encrypted package, use the --notsend argument.
usage: extractor [-h] [--get GET] [--send SEND] [--notsend] [--unencrypted]
[--version]
optional arguments:
-h, --help show this help message and exit
--get GET, -g GET Attempts to download a report of an already submitted
pack using the ID
--send SEND, -p SEND Submit a data pack file for report generation,
receives ID
--notsend, -n Do not send for remote processing, just generate .dat
--unencrypted, -u Generate an UNENCRYPTED version of the Datapack as a
json file
--version, -v Print version and exit
VALID LANGUAGES INPUT
en English
jp Japanese
VALID MODULES INPUT (SPACE SEPARATED)
am (Anti-Malware)
wr (Web Reputation)
im (Integrity Monitoring)
li (Log Inspection)
ac (Application Control)
fw (Firewall)
ip (Intrusion Prevention)
all (All modules)
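As an illustration of the flags above (the data pack name and report ID are placeholders):

# Only generate the encrypted data package, without sending it for remote processing
extractor --notsend

# Submit a data pack for report generation; this returns an ID
extractor --send my_pack.dat

# Later, attempt to download the finished report using that ID
extractor --get 1234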
For the extractor to encrypt and for dshc to decrypt the data packages, an 8192-bit RSA public-private key pair is required. A pair of public and private keys can be generated using the generate_keys.py script.
For further protection, the private key is protected with a password, which the key generation script prompts for.
DO NOT SHARE YOUR PRIVATE KEY
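A minimal sketch of the key generation step, assuming generate_keys.py is run from the repository root (exact prompts and output file names may differ):

# Generates the 8192-bit RSA key pair; you will be prompted for a password
# that protects the private key
python generate_keys.py

# Keep the private key safe (the docker setup expects it as PRIVATE_key.pem in the
# keys directory) and distribute only the PUBLIC key alongside the extractor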
The build_extractor_zip script can be used to create a zip with an executable of the extractor tool and its required files.
Detailed instructions for the extractor can be found in the EXTRACTOR_README.md file.
Using -r / --remote loads a data package generated by the extractor, decrypts it, and runs the checks.
If running the docker version, place the data package in the data_packs directory and pass the name of the package. The corresponding private key is expected to be in the keys directory with the name PRIVATE_key.pem. You will be prompted to enter the key's password.
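For example, assuming the docker setup described above with the package already copied into the data_packs directory (the key path in the second command is illustrative):

# Docker setup: the private key PRIVATE_key.pem is picked up from the keys directory;
# you will be prompted for the key's password
dshc -r my_pack.dat

# Outside docker, point at the private key explicitly
dshc -r my_pack.dat -k /path/to/PRIVATE_key.pem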
WARNING: the extractor module is not secure against erroneous or maliciously constructed data. Never unpack data received from an untrusted or unauthenticated source.
The data used by the program (or the standalone extractor) is fetched from the Deep Security Database via the RESTful API. There is an increase in resource usage (proportional to the amount of data being queried), mainly on the Database but also on the Manager. As such, caution should be taken for on-premise deployments to avoid possible impacts on a customer's business.
Although the impact should be minimal for most deployments, consider the following before proceeding:
- Is the Database used exclusively for Deep Security?
- If not, what other services could be affected?
- How many Agents are in the environment?
- Is Deep Security undergoing updates?
A test extracting data for 995 Agents, and a lesser number of configurations and policies, during a "stress" (DS Rules and Patterns Update) period on an RDS T2 instance (2 Intel CPUs up to 3.0 GHz, 8 GB RAM, running PostgreSQL 9.6) resulted in the following increases:
- DATABASE
- +1% of CPU
- +2e-4 seconds of Read Latency
- +4 Database Connections
- MANAGER
- +3% of RAM
- +1% of CPU
The utils directory contains general utilities:

gen_config
Script that generates an empty API config YAML file in the config directory.

profiler
Shell script used to perform a bulk of stress tests on the dshc script. The results are stored as performance_profile in the same directory.
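A sketch of how these utilities might be invoked from the repository root; the exact file names and extensions are assumptions:

# Create an empty API config skeleton in the config directory
python utils/gen_config.py

# Run a batch of stress tests against dshc and collect the performance_profile results
./utils/profiler.sh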
Detailed documentation on the code and project architecture can be found in the docs directory. This is a to-do.
Deep Security Health Check was originally developed by:
- Anderson Leite - @aandersonl
- Angelo Rodem - @angelorodem
- João Guimarães - @jvlsg