Sold by Sack Fachmedien

Singh

Entropy Theory and Its Application in Environmental and Water Engineering

Medium: Book
ISBN: 978-1-119-97656-1
Publisher: Wiley
Publication date: February 18, 2013
Delivery time: up to 10 days
Entropy Theory and its Application in Environmental and Water Engineering responds to the need for a book that presents the basic concepts of entropy theory from a hydrologic and water engineering perspective and then applies those concepts to a range of water engineering problems. The range of applications of entropy is constantly expanding, and new areas that find a use for the theory continue to emerge. Because the applications of these concepts and techniques vary across subject areas, this book relates them directly to practical problems of environmental and water engineering.
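As a minimal illustration of the kind of basic concept the book builds on, the sketch below computes the Shannon entropy of a binned hydrologic series as a measure of its unpredictability. The synthetic lognormal flow series, the bin count, and all parameter values are assumptions made for this example only, not values from the book.

```python
# Minimal sketch: Shannon entropy of a binned series as an unpredictability
# measure. All data and parameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
flows = rng.lognormal(mean=3.0, sigma=0.8, size=1000)  # synthetic daily flows

counts, _ = np.histogram(flows, bins=20)   # discretize into 20 class intervals
p = counts / counts.sum()                  # relative frequencies
p = p[p > 0]                               # drop empty bins: 0*log(0) taken as 0

H = -np.sum(p * np.log(p))                 # Shannon entropy in nats
H_max = np.log(20)                         # entropy of the uniform distribution
print(f"H = {H:.3f} nats, relative entropy H/H_max = {H/H_max:.3f}")
```

A ratio H/H_max near 1 means the flows are spread almost uniformly across class intervals (high unpredictability); a ratio near 0 means they concentrate in a few intervals.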

The book presents and explains the Principle of Maximum Entropy (POME) and the Principle of Minimum Cross Entropy (POMCE) and their applications to different types of probability distributions. Spatial and inverse spatial entropy, which are important for urban planning, are presented with clarity. Maximum entropy spectral analysis and minimum cross entropy spectral analysis are powerful techniques for addressing a variety of problems faced by environmental and water scientists and engineers, and they are described here with illustrative examples.
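The following sketch illustrates POME in its simplest discrete form: among all distributions on a finite support with a prescribed mean, entropy is maximized by the exponential (Gibbs) form p_i ∝ exp(-λx_i), with λ chosen to satisfy the constraint. The support, target mean, and solver bracket are assumptions for this example, not values from the book.

```python
# Minimal sketch of the Principle of Maximum Entropy (POME) for a discrete
# variable: maximize Shannon entropy subject to a mean constraint. The
# maximizing distribution has the form p_i ∝ exp(-lam * x_i); we solve for
# the Lagrange multiplier lam numerically. Support and mean are assumptions.
import numpy as np
from scipy.optimize import brentq

x = np.arange(6)       # assumed support {0, 1, ..., 5}
target_mean = 2.0      # assumed constraint E[X] = 2

def gibbs(lam):
    """Maximum-entropy distribution for multiplier lam."""
    w = np.exp(-lam * x)
    return w / w.sum()

# Find lam so the mean constraint is satisfied (the mean decreases in lam).
lam = brentq(lambda l: gibbs(l) @ x - target_mean, -10.0, 10.0)
p = gibbs(lam)

H = -np.sum(p * np.log(p))
print(f"lambda = {lam:.4f}, mean = {p @ x:.4f}, max entropy = {H:.4f} nats")
```

With no constraint beyond normalization, POME returns the uniform distribution; each additional moment constraint lowers the attainable entropy, the effect treated in section 3.5 of the book.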

Giving a thorough introduction to the use of entropy to measure the unpredictability in environmental and water systems, this book will add an essential statistical method to the toolkit of postgraduates, researchers, and academic hydrologists, water resource managers, environmental scientists, and engineers. It will also offer a valuable resource for professionals in the same areas, governmental organizations, and private companies, as well as for students in earth sciences, civil and agricultural engineering, and agricultural and rangeland sciences.

This book:

* Provides a thorough introduction to entropy for beginners and more experienced users
* Uses numerous examples to illustrate the applications of the theoretical principles
* Allows the reader to apply entropy theory to the solution of practical problems
* Assumes minimal existing mathematical knowledge
* Discusses the theory and its various aspects in both univariate and bivariate cases
* Covers newly expanding areas, including neural networks from an entropy perspective, and future developments

Product properties


  • Item number: 9781119976561
  • Medium: Book
  • ISBN: 978-1-119-97656-1
  • Publisher: Wiley
  • Publication date: February 18, 2013
  • Language(s): English
  • Edition: 1st edition, 2013
  • Binding: Hardcover
  • Weight: 1247 g
  • Pages: 662
  • Dimensions (W x H x D): 196 x 241 x 38 mm

Authors/Editors

Authors

Singh, Vijay P.

Preface, xv

Acknowledgments, xix

1 Introduction, 1

1.1 Systems and their characteristics, 1

1.2 Informational entropies, 7

1.3 Entropy, information, and uncertainty, 21

1.4 Types of uncertainty, 25

1.5 Entropy and related concepts, 27

Questions, 29

References, 31

Additional References, 32

2 Entropy Theory, 33

2.1 Formulation of entropy, 33

2.2 Shannon entropy, 39

2.3 Connotations of information and entropy, 42

2.4 Discrete entropy: univariate case and marginal entropy, 46

2.5 Discrete entropy: bivariate case, 52

2.6 Dimensionless entropies, 79

2.7 Bayes theorem, 80

2.8 Informational correlation coefficient, 88

2.9 Coefficient of nontransferred information, 90

2.10 Discrete entropy: multidimensional case, 92

2.11 Continuous entropy, 93

2.12 Stochastic processes and entropy, 105

2.13 Effect of proportional class interval, 107

2.14 Effect of the form of probability distribution, 110

2.15 Data with zero values, 111

2.16 Effect of measurement units, 113

2.17 Effect of averaging data, 115

2.18 Effect of measurement error, 116

2.19 Entropy in frequency domain, 118

2.20 Principle of maximum entropy, 118

2.21 Concentration theorem, 119

2.22 Principle of minimum cross entropy, 122

2.23 Relation between entropy and error probability, 123

2.24 Various interpretations of entropy, 125

2.25 Relation between entropy and variance, 133

2.26 Entropy power, 135

2.27 Relative frequency, 135

2.28 Application of entropy theory, 136

Questions, 136

References, 137

Additional Reading, 139

3 Principle of Maximum Entropy, 142

3.1 Formulation, 142

3.2 POME formalism for discrete variables, 145

3.3 POME formalism for continuous variables, 152

3.4 POME formalism for two variables, 158

3.5 Effect of constraints on entropy, 165

3.6 Invariance of total entropy, 167

Questions, 168

References, 170

Additional Reading, 170

4 Derivation of POME-Based Distributions, 172

4.1 Discrete variable and discrete distributions, 172

4.2 Continuous variable and continuous distributions, 185

Questions, 203

References, 208

Additional Reading, 208

5 Multivariate Probability Distributions, 213

5.1 Multivariate normal distributions, 213

5.2 Multivariate exponential distributions, 245

5.3 Multivariate distributions using the entropy-copula method, 258

5.4 Copula entropy, 265

Questions, 266

References, 267

Additional Reading, 268

6 Principle of Minimum Cross-Entropy, 270

6.1 Concept and formulation of POMCE, 270

6.2 Properties of POMCE, 271

6.3 POMCE formalism for discrete variables, 275

6.4 POMCE formulation for continuous variables, 279

6.5 Relation to POME, 280

6.6 Relation to mutual information, 281

6.7 Relation to variational distance, 281

6.8 Lin's directed divergence measure, 282

6.9 Upper bounds for cross-entropy, 286

Questions, 287

References, 288

Additional Reading, 289

7 Derivation of POMCE-Based Distributions, 290

7.1 Discrete variable and mean E[x] as a constraint, 290

7.2 Discrete variable taking on an infinite set of values, 298

7.3 Continuous variable: general formulation, 305

Questions, 308

References, 309

8 Parameter Estimation, 310

8.1 Ordinary entropy-based parameter estimation method, 310

8.2 Parameter-space expansion method, 325

8.3 Contrast with method of maximum likelihood estimation (MLE), 329

8.4 Parameter estimation by numerical methods, 331

Questions, 332

References, 333

Additional Reading, 334

9 Spatial Entropy, 335

9.1 Organization of spatial data, 336

9.2 Spatial entropy statistics, 339

9.3 One dimensional aggregation, 353

9.4 Another approach to spatial representation, 360

9.5 Two-dimensional aggregation, 363

9.6 Entropy maximization for modeling spatial phenomena, 376

9.7 Cluster analysis by entropy maximization, 380

9.8 Spatial visualization and mapping, 384

9.9 Scale and entropy, 386

9.10 Spatial probability distributions, 388

9.11 Scaling: rank size rule and Zipf's law, 391

Questions, 393

References, 394

Further Reading, 395

10 Inverse Spatial Entropy, 398

10.1 Definition, 398

10.2 Principle of entropy decomposition, 402

10.3 Measures of information gain, 405

10.4 Aggregation properties, 417

10.5 Spatial interpretations, 420

10.6 Hierarchical decomposition, 426

10.7 Comparative measures of spatial decomposition, 428

Questions, 433

References, 435

11 Entropy Spectral Analyses, 436

11.1 Characteristics of time series, 436

11.2 Spectral analysis, 446

11.3 Spectral analysis using maximum entropy, 464

11.4 Spectral estimation using configurational entropy, 483

11.5 Spectral estimation by mutual information principle, 486

References, 490

Additional Reading, 490

12 Minimum Cross Entropy Spectral Analysis, 492

12.1 Cross-entropy, 492

12.2 Minimum cross-entropy spectral analysis (MCESA), 493

12.3 Minimum cross-entropy power spectrum given auto-correlation, 503

12.4 Cross-entropy between input and output of linear filter, 509

12.5 Comparison, 512

12.6 Towards efficient algorithms, 514

12.7 General method for minimum cross-entropy spectral estimation, 515

References, 515

Additional References, 516

13 Evaluation and Design of Sampling and Measurement Networks, 517

13.1 Design considerations, 517

13.2 Information-related approaches, 518

13.3 Entropy measures, 521

13.4 Directional information transfer index, 530

13.5 Total correlation, 537

13.6 Maximum information minimum redundancy (MIMR), 539

Questions, 553

References, 554

Additional Reading, 556

14 Selection of Variables and Models, 559

14.1 Methods for selection, 559

14.2 Kullback-Leibler (KL) distance, 560

14.3 Variable selection, 560

14.4 Transitivity, 561

14.5 Logit model, 561

14.6 Risk and vulnerability assessment, 574

Questions, 578

References, 579

Additional Reading, 580

15 Neural Networks, 581

15.1 Single neuron, 581

15.2 Neural network training, 585

15.3 Principle of maximum information preservation, 588

15.4 A single neuron corrupted by processing noise, 589

15.5 A single neuron corrupted by additive input noise, 592

15.6 Redundancy and diversity, 596

15.7 Decision trees and entropy nets, 598

Questions, 602

References, 603

16 System Complexity, 605

16.1 Ferdinand's measure of complexity, 605

16.2 Kapur's complexity analysis, 618

16.3 Cornacchio's generalized complexity measures, 620

16.4 Kapur's simplification, 627

16.5 Kapur's measure, 627

16.6 Hypothesis testing, 628

16.7 Other complexity measures, 628

Questions, 631

References, 631

Additional References, 632

Author Index, 633

Subject Index, 639