Spectrum-Sensing
Using signal processing based features to train and validate machine-learning algorithms to improve spectrum sensing and related problems in cognitive radios.
Introduction :
This project aims to use signal processing based features to train and validate machine-learning algorithms to improve spectrum sensing and related problems in cognitive radios. We use differential entropy, geometric power, and Lp-norm based features to train supervised ML algorithms and various deep neural networks. The noise process is assumed to follow a generalized Gaussian distribution, which is of practical relevance. Through experimental results based on real-world captured datasets, we show that the proposed method outperforms the energy-based approach in terms of probability of detection. The proposed technique is particularly useful under low signal-to-noise ratio conditions and when the noise distribution has heavier tails.
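The sketch below illustrates how such features could be computed for one window of received samples. The histogram-based entropy estimator, the choice of p for the Lp-norm, and the function names are assumptions made for illustration, not the exact implementation used in this repository.

```python
import numpy as np

def differential_entropy(x, bins=64):
    """Histogram (plug-in) estimate of differential entropy h(X) = -E[log p(X)]."""
    p, edges = np.histogram(x, bins=bins, density=True)
    widths = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

def geometric_power(x, eps=1e-12):
    """Geometric power S0 = exp(E[log|x|]), a robust scale measure under heavy-tailed noise."""
    return np.exp(np.mean(np.log(np.abs(x) + eps)))

def lp_norm(x, p=1.5):
    """Lp-norm statistic, (1/N * sum |x|^p)^(1/p); p = 1.5 is an illustrative choice."""
    return np.mean(np.abs(x) ** p) ** (1.0 / p)

def energy(x):
    """Conventional energy statistic used by the energy-detector baseline."""
    return np.mean(np.abs(x) ** 2)

def feature_vector(x, p=1.5):
    """Stack the four sensing features for one window of received samples."""
    return np.array([differential_entropy(x), geometric_power(x), lp_norm(x, p), energy(x)])
```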
Datasets :
Dataset 1: The centre frequency of the PU was set at 2.48 GHz. The primary transmitter deploys differential quadrature phase shift keying modulation with a continuous transmission rate of 500 kbps and a transmission bandwidth of 1 MHz. The data measurement was carried out in an anechoic chamber with a scan bandwidth of 4 MHz, using a discrete Fourier transform of 1024 frequency bins; the bandwidth of each frequency bin is therefore 3.9 kHz. To this clean signal, generalized Gaussian noise with a given shape parameter beta and unit variance was added, which serves as the real-world data received by the deployed CR nodes.
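A minimal sketch of how unit-variance generalized Gaussian noise with shape parameter beta could be generated and added to the clean captures, assuming scipy.stats.gennorm is used; the function name add_ggn and its interface are illustrative, not the repository's exact code.

```python
import numpy as np
from scipy.stats import gennorm
from scipy.special import gamma

def add_ggn(signal, beta, rng=None):
    """Add zero-mean generalized Gaussian noise with shape `beta` and unit variance.

    For scipy's gennorm, Var = scale**2 * Gamma(3/beta) / Gamma(1/beta), so the
    scale below yields unit-variance noise. beta = 2 recovers Gaussian noise;
    beta < 2 gives heavier tails.
    """
    scale = np.sqrt(gamma(1.0 / beta) / gamma(3.0 / beta))
    rng = np.random.default_rng() if rng is None else rng
    noise = gennorm.rvs(beta, scale=scale, size=signal.shape, random_state=rng)
    return signal + noise
```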
Dataset 2: This dataset was captured in a laboratory in Thailand. It was recorded with an omnidirectional antenna connected to the RF Explorer spectrum analyzer. The operating frequency range is 510 to 790 MHz, with a centre frequency of 650 MHz. The measurements were taken at three different locations, covering both indoor and outdoor environments. We used the data with the highest signal-to-noise ratio (SNR) for our experimental study.
System Model :

Machine Learning Algorithms :
Classical ML algorithms (see the training sketch after this list):
- Support Vector Machines
- K-Nearest Neighbor
- Logistic Regression
- Random Forest
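Below is a minimal training sketch, assuming the features above are stacked into a matrix X with binary labels y (1 = primary user present, 0 = noise only). The file names, the choice of random forest, the train/test split, and the hyperparameters are placeholders for illustration, not the exact setup used in this project.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# X: (num_windows, 4) feature matrix built with feature_vector() above
# y: (num_windows,) labels, 1 = primary user present (H1), 0 = noise only (H0)
X, y = np.load("features.npy"), np.load("labels.npy")   # placeholder file names

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
```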
Deep net architectures:



Results:
We show that the combination of features (differential entropy, geometric power, Lp-norm, and the energy statistic) outperforms training on the raw data.
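As an illustration of how the probability of detection could be evaluated, the snippet below continues the training sketch above and reads the detection/false-alarm trade-off from an ROC curve; the target false-alarm rate of 0.1 is an arbitrary example, not a value taken from the project's experiments.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Classifier scores on the held-out windows from the training sketch above.
scores = clf.predict_proba(X_te)[:, 1]
pfa, pd, thresholds = roc_curve(y_te, scores)   # false-alarm rate vs. detection probability

# Probability of detection at a target false-alarm rate, e.g. Pfa = 0.1.
target_pfa = 0.1
pd_at_target = pd[np.searchsorted(pfa, target_pfa)]
print(f"Pd at Pfa={target_pfa}: {pd_at_target:.3f}")
```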




Publications :
If you use our code and/or system model with the proposed features, please cite the above publications.