🔱 WWW_Trident


Introduction

Trident is a novel framework for detecting unknown attack traffic in a fine-grained and incremental manner. It supports known-class classification, fine-grained unknown-class detection, and incremental model updates (both sample increments and class increments). At a high level, Trident consists of three tightly coupled components, tSieve, tScissors, and tMagnifier, which are responsible for profiling traffic, determining outlier thresholds, and clustering, respectively.


Requirements

pip install scipy
pip install numpy
pip install pandas
pip install matplotlib
pip install scikit-learn
pip install torch

Model Architecture

The AutoEncoder model architecture is stored in ./code/autoencoder.py.

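As a rough illustration of what such a model looks like, here is a minimal fully connected AutoEncoder sketch in PyTorch. The layer sizes (41 input features, as in KDDCup 99, and an 8-dimensional latent space) are illustrative assumptions; the actual architecture is defined in ./code/autoencoder.py and may differ.

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    """Minimal sketch: encoder compresses, decoder reconstructs."""
    def __init__(self, n_features=41, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, latent_dim), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(),
            nn.Linear(32, n_features),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
x = torch.randn(4, 41)            # a batch of 4 feature vectors
recon = model(x)
# Per-sample reconstruction error: the score later fed to the EVT thresholding.
err = ((x - recon) ** 2).mean(dim=1)
```

Samples that a class-specific AutoEncoder reconstructs poorly are candidates for being outside that known class.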

Outlier Detection

The Extreme Value Theory (EVT) calculation is stored in ./code/evt.py.

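The following sketch shows the standard Peaks-Over-Threshold (POT) recipe from EVT for turning a stream of reconstruction errors into an outlier threshold. The initial quantile and risk level `q` are illustrative choices, not values taken from ./code/evt.py.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
scores = rng.exponential(scale=1.0, size=10_000)  # stand-in reconstruction errors

t = np.quantile(scores, 0.98)        # initial high threshold
excesses = scores[scores > t] - t    # "peaks" over that threshold
# Fit a Generalized Pareto Distribution to the excesses (location fixed at 0).
shape, _, scale = genpareto.fit(excesses, floc=0)

q = 1e-4                             # target risk (tolerated false-positive rate)
n, n_t = len(scores), len(excesses)
# Closed-form POT threshold; for shape -> 0 the logarithmic limit applies instead.
z = t + (scale / shape) * ((q * n / n_t) ** (-shape) - 1)
```

Scores above `z` are flagged as outliers; lowering `q` pushes the threshold further into the tail.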

Demo Dataset

A demo dataset based on KDDCup 99 is stored in ./code/data_loader.py.

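KDDCup 99 records mix symbolic and numeric fields, so a loader typically one-hot encodes the symbolic columns and scales the numeric ones. This is a generic preprocessing sketch on a tiny synthetic frame; the actual column handling in ./code/data_loader.py may differ.

```python
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Tiny illustrative frame standing in for the real KDDCup 99 records.
df = pd.DataFrame({
    "duration":      [0, 2, 151],
    "protocol_type": ["tcp", "udp", "tcp"],
    "src_bytes":     [181, 239, 5450],
    "label":         ["normal.", "normal.", "smurf."],
})

y = df.pop("label")                                  # class labels
X = pd.get_dummies(df, columns=["protocol_type"])    # symbolic -> one-hot
X[["duration", "src_bytes"]] = MinMaxScaler().fit_transform(
    X[["duration", "src_bytes"]])                    # numeric -> [0, 1]
```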

Running

To run the test code, enter the ./code/ folder and run:

python main_process.py

How to Configure Other Models

For sequence-based models, users can configure recurrent neural networks; the I2RNN and Seq2Seq architectures are useful references. For graph-based models, Graph U-Nets can be used to construct the reconstruction loss over graph data. Note that RNN and GNN models require corresponding feature processing.
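As one possible sequence-based substitution, a GRU encoder-decoder can supply the same reconstruction loss that the MLP AutoEncoder provides. This is a minimal sketch with illustrative dimensions, not the I2RNN or Seq2Seq design itself.

```python
import torch
import torch.nn as nn

class SeqAutoEncoder(nn.Module):
    """Sketch: encode a sequence to one hidden state, then reconstruct it."""
    def __init__(self, n_features=8, hidden=16):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                       # x: (batch, seq_len, features)
        _, h = self.encoder(x)                  # h: (1, batch, hidden)
        # Repeat the summary state at every step to drive the decoder.
        ctx = h.transpose(0, 1).repeat(1, x.size(1), 1)
        dec, _ = self.decoder(ctx)
        return self.out(dec)

x = torch.randn(4, 10, 8)                       # 4 sequences of length 10
recon = SeqAutoEncoder()(x)
loss = nn.functional.mse_loss(recon, x)         # sequence reconstruction loss
```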

Note: this sample code assumes that one emerging class appears in the data stream at a time. When multiple new classes appear simultaneously, further clustering is required only within the buffer pool (corresponding to res_ls in main_process.py). For ensembling multiple clustering results, refer to FARE.
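The buffer-pool clustering step can be sketched as follows. Here `res_ls` is simulated as latent vectors from two emerging classes; the use of DBSCAN and its parameters are illustrative assumptions, not the repository's exact choice.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Buffer pool: latent vectors of samples rejected by every known-class profile,
# drawn here from two well-separated synthetic "emerging classes".
res_ls = np.vstack([
    rng.normal(loc=0.0, scale=0.1, size=(50, 8)),   # emerging class A
    rng.normal(loc=3.0, scale=0.1, size=(50, 8)),   # emerging class B
])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(res_ls)
n_new = len(set(labels) - {-1})   # -1 marks noise; the rest are new classes
```

A density-based method is convenient here because the number of simultaneously emerging classes is unknown in advance.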

References
