Hans Georg Schaathun
Machine Learning in Image Steganalysis (eBook, ePUB)
91,99 € (incl. VAT)
Available immediately via download.
- Format: ePub
Steganography is the art of communicating a secret message while hiding its very existence. This book is an introduction to steganalysis as part of the wider trend of multimedia forensics, as well as a practical tutorial on machine learning in this context. It surveys a wide range of feature vectors proposed for steganalysis, with performance tests and comparisons. Python programs and algorithms are provided so that readers can modify and reproduce the outcomes discussed in the book.
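As a taste of the kind of Python material the book provides, here is a minimal sketch of least-significant-bit (LSB) embedding, the classic technique the book covers (Section 2.2.4). The function names and sample values are illustrative only, not the book's own code:

```python
def embed_lsb(pixels, bits):
    """Replace the least significant bit of each pixel with one message bit."""
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # clear the LSB, then set it to the bit
    return stego

def extract_lsb(pixels, n):
    """Read back the first n embedded bits from the pixel LSBs."""
    return [p & 1 for p in pixels[:n]]

cover = [142, 77, 203, 64, 91, 180]   # toy 8-bit grey levels
message = [1, 0, 1, 1]
stego = embed_lsb(cover, message)
assert extract_lsb(stego, len(message)) == message
```

Steganalysis, the book's subject, asks the converse question: given only `stego`, can a classifier tell that its pixel statistics differ from those of an unmodified cover?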
- Devices: eReader
- With copy protection
- eBook help
- Size: 8.49 MB
Other customers were also interested in
- Emilio Maggio, Video Tracking (eBook, ePUB), 95,99 €
- Mathematical Morphology (eBook, ePUB), 203,99 €
- Bangjun Lei, Classification, Parameter Estimation and State Estimation (eBook, ePUB), 101,99 €
- Ulisses M. Braga Neto, Error Estimation for Pattern Recognition (eBook, ePUB), 118,99 €
- Amit Konar, Emotion Recognition (eBook, ePUB), 118,99 €
- Pradipta Maji, Rough-Fuzzy Pattern Recognition (eBook, ePUB), 101,99 €
- Steven J. Simske, Meta-Algorithmics (eBook, ePUB), 88,99 €
For legal reasons, this download can only be delivered with a billing address in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.
Product details
- Publisher: John Wiley & Sons
- Number of pages: 296
- Publication date: 5 September 2012
- Language: English
- ISBN-13: 9781118437988
- Article no.: 37360113
Hans Georg Schaathun, Department of Computing, University of Surrey, UK. Dr Schaathun was previously a lecturer in coding and cryptography at the University of Bergen. Since February 2006 he has been a lecturer at the University of Surrey, UK, belonging to the research group in Digital Watermarking and Multimedia Security. His main research areas are applications of coding theory in information hiding, and machine learning techniques in steganalysis. He teaches Computer Security and Steganography at MSc level, and Functional Programming Techniques at undergraduate level. Dr Schaathun has published more than 35 international, peer-reviewed articles, and is an associate editor of the EURASIP Journal on Information Security.
Part One Overview 3
1 Introduction 5
1.1 Real threat or hype? 5
1.2 Artificial Intelligence and Learning 6
1.3 How to read this book 7
2 Steganography and Steganalysis 9
2.1 Cryptography versus Steganography 9
2.2 Steganography 10
2.2.1 The Prisoners' Problem 10
2.2.2 Covers - Synthesis and Modification 12
2.2.3 Keys and Kerckhoffs' Principle 13
2.2.4 LSB embedding 15
2.2.5 Steganography and Watermarking 17
2.2.6 Different media types 18
2.3 Steganalysis 19
2.3.1 The Objective of Steganalysis 19
2.3.2 Blind and Targeted Steganalysis 20
2.3.3 Main approaches to steganalysis 21
2.3.4 Example: pairs of values 24
2.4 Summary and Notes 26
3 Getting Started with a Classifier 27
3.1 Classification 27
3.1.1 Learning Classifiers 28
3.1.2 Accuracy 29
3.2 Estimation and Confidence 29
3.3 Using libSVM 32
3.3.1 Training and testing 32
3.3.2 Grid search and Cross-Validation 33
3.4 Using Python 35
3.4.1 Why we use Python 35
3.4.2 Getting started with Python 36
3.4.3 Scientific Computing 37
3.4.4 Python Imaging Library 38
3.4.5 An example: Image Histogram 38
3.5 Images for Testing 39
3.6 Further Reading 41
Part Two Features 43
4 Histogram Analysis 45
4.1 Early Histogram Analysis 45
4.2 Notation 46
4.3 Additive Independent Noise 46
4.3.1 The effect of noise 47
4.3.2 The Histogram Characteristic Function 48
4.3.3 Moments of the Characteristic Function 50
4.3.4 Amplitude of Local Extrema 54
4.4 Multi-dimensional Histograms 56
4.4.1 HCF Features for Colour Images 57
4.4.2 The Co-occurrence Matrix 58
4.5 Experiment and Comparison 64
5 Bit Plane Analysis 65
5.1 Visual Steganalysis 65
5.2 Auto-correlation Features 67
5.3 Binary Similarity Measures 69
5.4 Evaluation and Comparison 72
6 More Spatial Domain Features 75
6.1 The Difference Matrix 75
6.1.1 The EM features of Chen et al. 76
6.1.2 Markov Models and the SPAM features 78
6.1.3 Higher-order differences 80
6.1.4 Run-length analysis 81
6.2 Image Quality Measures 81
6.3 Colour Images 85
6.4 Experiment and Comparison 85
7 The Wavelets Domain 87
7.1 A Visual View 87
7.2 The Wavelet Domain 89
7.2.1 The Fast Wavelet Transform 89
7.2.2 Example: The Haar Wavelet 90
7.2.3 The Wavelet Transform in Python 91
7.2.4 Other Wavelet Transforms 92
7.3 Farid's Features 94
7.3.1 The image statistics 94
7.3.2 The linear predictor 94
7.3.3 Notes 96
7.4 HCF in the wavelet domain 96
7.4.1 Notes and further reading 99
7.5 Denoising and the WAM features 99
7.5.1 The denoising algorithm 100
7.5.2 Locally Adaptive LAW-ML 101
7.5.3 Wavelet Absolute Moments 103
7.6 Experiment and Comparison 104
8 Steganalysis in the JPEG domain 105
8.1 JPEG compression 106
8.1.1 The compression 106
8.1.2 Programming JPEG steganography 108
8.1.3 Embedding in JPEG 110
8.2 Histogram Analysis 111
8.2.1 The JPEG histogram 112
8.2.2 First-order Features 115
8.2.3 Second-order Features 117
8.2.4 Histogram Characteristic Function 118
8.3 Blockiness 120
8.4 Markov model based features 122
8.5 Conditional Probabilities 124
8.6 Experiment and Comparison 125
9 Calibration Techniques 127
9.1 Calibrated Features 127
9.2 JPEG Calibration 129
9.2.1 The FRI-23 feature set 129
9.2.2 The Pevný features and Cartesian Calibration 131
9.3 Calibration by Downsampling 132
9.3.1 Down-sampling as calibration 133
9.3.2 Calibrated HCF-COM 134
9.3.3 The sum and difference images 136
9.3.4 Features for colour images 138
9.3.5 Pixel Selection 139
9.3.6 Other Features based on Downsampling 141
9.3.7 Evaluation and Notes 142
9.4 Calibration in General 142
9.5 Progressive Randomisation 143
Part Three Classifiers 145
10 Simulation and Evaluation 147
10.1 Estimation and Simulation 147
10.1.1 The binomial distribution 147
10.1.2 Probabilities and Sampling 148
10.1.3 Monte Carlo simulations 150
10.1.4 Confidence intervals 151
10.2 Scalar measures 152
10.2.1 Two error types 152
10.2.2 Common scalar measures 154
10.3 The Receiver Operating Curve 155
10.3.1 The libSVM API for Python 156
10.3.2 The ROC curve 158
10.3.3 Choosing a Point on the ROC Curve 160
10.3.4 Confidence and variance 161
10.3.5 The area under the curve 163
10.4 Experimental Methodology 164
10.4.1 Feature Storage 165
10.4.2 Parallel computation 166
10.4.3 The dangers of large-scale experiments 167
10.5 Comparison and hypothesis testing 167
10.5.1 The hypothesis test 168
10.5.2 Comparing two binomial proportions 168
10.6 Summary 170
11 Support Vector Machines 171
11.1 Linear Classifiers 171
11.1.1 Linearly Separable Problems 172
11.1.2 Non-separable Problems 175
11.2 The kernel function 179
11.2.1 Example: the XOR function 179
11.2.2 The SVM algorithm 180
11.3 ν-SVM 182
11.4 Multi-class methods 183
11.5 One-class methods 184
11.5.1 The one-class SVM solution 185
11.5.2 Practical problems 186
11.5.3 Multiple hyperspheres 187
11.6 Summary 187
12 Other Classification Algorithms 189
12.1 Bayesian Classifiers 190
12.1.1 Classification Regions and Errors 191
12.1.2 Misclassification risk 192
12.1.3 The naïve Bayes classifier 193
12.1.4 A security criterion 194
12.2 Estimating Probability Distributions 195
12.2.1 The histogram 195
12.2.2 The kernel density estimator 196
12.3 Multivariate Regression Analysis 201
12.3.1 Linear Regression 201
12.3.2 Support Vector Regression 202
12.4 Unsupervised Learning 204
12.4.1 K-means clustering 204
12.5 Summary 206
13 Feature Selection and Evaluation 207
13.1 Overfitting and Underfitting 207
13.1.1 Feature Selection and Feature Extraction 209
13.2 Scalar feature selection 209
13.2.1 Analysis of Variance 210
13.3 Feature Subset Selection 212
13.3.1 Subset Evaluation 213
13.3.2 Search Algorithms 213
13.4 Selection using Information Theory 214
13.4.1 Entropy 215
13.4.2 Mutual Information 216
13.4.3 Multivariate Information 219
13.4.4 Information Theory with Continuous Sets 221
13.4.5 Estimation of entropy and information 222
13.4.6 Ranking Features 223
13.5 Boosting feature selection 225
13.6 Applications in Steganalysis 228
13.6.1 Correlation coefficient 229
13.6.2 Optimised feature vectors for JPEG 229
14 The Steganalysis Problem 233
14.1 Different use cases 233
14.1.1 Who are Alice and Bob? 233
14.1.2 Wendy's role 235
14.1.3 Pooled Steganalysis 236
14.1.4 Quantitative Steganalysis 237
14.2 Images and Training Sets 238
14.2.1 Choosing Cover Source 238
14.2.2 The Training Scenario 241
14.2.3 The Steganalytic Game 244
14.3 Composite Classifier Systems 246
14.3.1 Fusion 246
14.3.2 A multi-layer classifier for JPEG 248
14.3.3 Benefits of composite classifiers 249
14.4 Summary 249
15 Future of the Field 251
15.1 Image Forensics 251
15.2 Conclusions and notes 253
Bibliography 255
Index 263
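The contents above point to a worked image-histogram example (Section 3.4.5) and to histogram-based features (Chapter 4). A minimal sketch of the underlying computation, with illustrative names rather than the book's own code:

```python
def histogram(pixels, levels=256):
    """Count how often each grey level occurs in a flat list of pixel values."""
    h = [0] * levels
    for p in pixels:
        h[p] += 1
    return h

# A toy 'image' as a flat list of 8-bit grey levels.
pixels = [0, 0, 1, 128, 255, 255, 255]
h = histogram(pixels)
print(h[0], h[128], h[255])  # prints: 2 1 3
```

Histogram-based steganalysis features such as the Histogram Characteristic Function (Section 4.3.2) are derived from exactly this kind of count vector.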