This project implements a Zero Trust Architecture (ZTA) for an Intrusion Detection System (IDS), focusing on simulating normal and attack requests through a web interface. It utilizes Docker to create a network of clients and servers, applies various IDS methods, and incorporates a modified multi-view approach for enhanced detection capabilities.
- Simulation of Requests
- Server Setup
- Client Setup
- Network and Server Management
- Dataset for MV and IDS
- Multi-View Implementation
- Intrusion Detection System
- Dataset and Preprocessing
- IDS Models
- Real World Evaluation
To simulate web requests, we implement a website on a server to sniff all incoming requests. We use Docker and docker-compose to create a network of clients and servers with a range of IP addresses.
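The sniffing server itself is implemented inside the Docker network, but its core idea can be sketched in a few lines of Python. The snippet below is only an illustration (the handler class, port selection, and log format are assumptions, not the project's actual implementation): an HTTP server that records the client address and path of every incoming request.

```python
# Minimal sketch of a request-logging web server (illustrative only;
# the project's real server runs inside the Docker network).
import http.server
import threading
import urllib.request

captured = []  # (client_ip, path) of every incoming request

class LoggingHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        captured.append((self.client_address[0], self.path))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence the default stderr logging

server = http.server.HTTPServer(("127.0.0.1", 0), LoggingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate one client request against the logging server.
port = server.server_address[1]
urllib.request.urlopen(f"http://127.0.0.1:{port}/probe").read()
server.shutdown()
```

In the real setup the same role is played by the containerized website, with tcpdump capturing the traffic at the network level.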
- Pull the Docker image:
docker pull ubuntu:20.04
- Create and configure the Docker container:
docker run --rm -it --name ubuntu ubuntu:20.04 bash # Followed by installation commands
Update and install necessary modules:
apt update
apt install build-essential git openssh-server -y
# Additional installation commands
Copy SSH config to the container and commit changes:
docker cp sshd_config ubuntu:/
docker commit ubuntu gitea
We use the Kali Linux Docker image for the client setup.
Build the client Docker image:
cd Client
docker build -t kalilinux/kali-rolling:latest .
cd ..
Edit docker-compose.yml to set the desired IP range.
Start the server and network:
docker-compose up
SSH Service and Network Monitoring
Manually start the SSH service and run tcpdump for network monitoring.
The UNSW-NB15 dataset is available at https://research.unsw.edu.au/projects/unsw-nb15-dataset
Install dependencies: pip install pycrypto
Place the UNSW-NB15 CSV files in the MV directory.
Execute the script: python3 mv.py
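mv.py expects the UNSW-NB15 CSV files to be present in the MV directory. A hedged sketch of how such part files can be gathered and concatenated with pandas (the `UNSW-NB15_*.csv` file pattern and the tiny stand-in frames are assumptions for illustration, not mv.py's actual logic):

```python
# Sketch: collect and concatenate UNSW-NB15 CSV parts from a directory.
# The file-name pattern is an assumption; a temp dir stands in for MV/.
import glob
import os
import tempfile
import pandas as pd

mv_dir = tempfile.mkdtemp()  # stand-in for the MV directory
# Create two tiny stand-in CSV parts so the sketch is self-contained.
pd.DataFrame({"dur": [0.1, 0.2], "proto": ["tcp", "udp"]}).to_csv(
    os.path.join(mv_dir, "UNSW-NB15_1.csv"), index=False)
pd.DataFrame({"dur": [0.3], "proto": ["tcp"]}).to_csv(
    os.path.join(mv_dir, "UNSW-NB15_2.csv"), index=False)

parts = sorted(glob.glob(os.path.join(mv_dir, "UNSW-NB15_*.csv")))
data = pd.concat((pd.read_csv(p) for p in parts), ignore_index=True)
```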
The UNSW-NB15 dataset is used, downloadable via download.sh.
Pre-process the dataset using preprocess.py.
Various IDS models implemented in Python:
- CNN with Autoencoder features (CNN_AE.py)
- CNN-Attention with Autoencoder features (CNNAtt_AE.py)
- CNN-Attention with balanced data sampling (CNNAtt-balanced.py)
- CNN-LSTM with Attention module (CNNAttLstm_AE.py)
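The repository files above define the actual architectures. As a rough illustration of the attention mechanism the CNN-Attention variants rely on, here is a minimal NumPy sketch of softmax attention pooling over a sequence of feature vectors (the scoring vector is an assumption; in the real models it is a learned parameter):

```python
# Sketch of softmax attention pooling, the core idea behind the
# attention modules; not the repository's actual implementation.
import numpy as np

def attention_pool(features, score_vec):
    """Weight each time step by softmax-normalized scores, then sum.

    features:  (T, D) sequence of feature vectors (e.g. CNN/LSTM outputs)
    score_vec: (D,)   scoring vector (learned in the real models)
    """
    scores = features @ score_vec              # (T,) raw attention scores
    scores = scores - scores.max()             # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights @ features                  # (D,) context vector

rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 8))                # 5 time steps, 8-dim features
ctx = attention_pool(feats, rng.normal(size=8))
```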
Convert dumped pcap file to equivalent argus file using:
export filename=testmasscan; argus -r $filename.pcap -w $filename.argus
Then extract the attributes that are extractable by argus tools using:
export filename=testmasscan; ra -r $filename.argus -s saddr, daddr, sport, dur, proto, dport, state, spkts, dpkts, sbytes, dbytes, rate, sttl, dttl, sload, dload, sloss, dloss, sintpkt, dintpkt, sjit, djit, swin, stcpb, dtcpb, dwin, tcprtt, synack, ackdat, smeansz, dmeansz -c , > $filename.csv
The resulting $filename.csv contains most of the UNSW-NB15 attributes; the remaining ones can be extracted with our Python script "extract.py".
python3 extract.py testmasscan
- Note: in the commands above, testmasscan is used as the filename for both the argus and Python steps.
The resulting $filename.csv includes all attributes of UNSW-NB15, although it may contain additional columns, because one-hot encoding of non-numerical attributes can encounter values that do not occur in the UNSW-NB15 dataset. These values should be removed or replaced with the most similar known values. For example, according to our tests, they can be replaced as follows:
test_real_data.loc[test_real_data['state'] == 'URFIL', 'state'] = 'URN'
test_real_data.loc[test_real_data['state'] == 'STP', 'state'] = 'CLO'
test_real_data.loc[test_real_data['state'] == 'STA', 'state'] = 'PAR'
test_real_data.loc[test_real_data['state'] == 'NNS', 'state'] = 'no'
test_real_data.loc[test_real_data['state'] == 'URP', 'state'] = 'PAR'
test_real_data.loc[test_real_data['state'] == 'NRS', 'state'] = 'no'
test_real_data.loc[test_real_data['proto'] == 'man', 'proto'] = 'any'
test_real_data.loc[test_real_data['proto'] == 'ipv6-icmp', 'proto'] = 'icmp'
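The per-value `.loc` assignments above can equivalently be expressed as a single `DataFrame.replace` call with a nested column-to-mapping dict, which is easier to extend when new unseen values show up. A sketch with the same mappings, applied to a small stand-in frame:

```python
# Same remapping as the .loc assignments, collected in one nested dict.
import pandas as pd

unseen_value_map = {
    "state": {"URFIL": "URN", "STP": "CLO", "STA": "PAR",
              "NNS": "no", "URP": "PAR", "NRS": "no"},
    "proto": {"man": "any", "ipv6-icmp": "icmp"},
}

# Stand-in frame for illustration; in practice this is test_real_data.
test_real_data = pd.DataFrame({
    "state": ["URFIL", "FIN", "URP"],
    "proto": ["ipv6-icmp", "tcp", "man"],
})
test_real_data = test_real_data.replace(unseen_value_map)
```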
real_data = pd.DataFrame(ct.transform(test_real_data))
real_data.columns = new_cols
Also, do not forget to replace any NaN values in the generated data:
real_data = real_data.replace(np.nan, 0)
After these steps, real_data is ready to be used by the next modules.
Dependency: the argus client tools used above can be installed with apt-get install argus-client.