This book examines non-Gaussian distributions. It addresses the causes and consequences of non-normality and time dependency in both asset returns and option prices. The book is written for non-mathematicians who want to model financial market prices, so the emphasis throughout is on practice. There are abundant empirical illustrations of the models and techniques described, many of which could be equally applied to other financial time series.
Published by: Springer | Publication date: 04/05/2007 | Kindle book details: Kindle Edition, 541 pages
Gaussian Markov Random Fields: Theory and Applications (Chapman & Hall/CRC Monographs on Statistics & Applied Probability)
No description available
Published by: CRC Press | Publication date: 04/16/2007 | Kindle book details: Kindle Edition, 280 pages
Unsupervised Machine Learning in Python: Master Data Science and Machine Learning with Cluster Analysis, Gaussian Mixture Models, and Principal Components Analysis
In a real-world environment, you can imagine that a robot or an artificial intelligence won’t always have access to the optimal answer, or maybe there isn’t an optimal correct answer. You’d want that robot to be able to explore the world on its own, and learn things just by looking for patterns.

Think about the large amounts of data being collected today, by the likes of the NSA, Google, and other organizations. No human could possibly sift through all that data manually. It was reported recently in the Washington Post and Wall Street Journal that the National Security Agency collects so much surveillance data, it is no longer effective. Could automated pattern discovery solve this problem?

Do you ever wonder how we get the data that we use in our supervised machine learning algorithms? Kaggle always seems to provide us with a nice CSV, complete with Xs and corresponding Ys. If you haven’t been involved in acquiring data yourself, you might not have thought about this, but someone has to make this data! A lot of the time this involves manual labor. Sometimes, you don’t have access to the correct information, or it is infeasible or costly to acquire. You still want to have some idea of the structure of the data. This is where unsupervised machine learning comes into play.

In this book we are first going to talk about clustering. This is where, instead of training on labels, we try to create our own labels. We’ll do this by grouping together data that looks alike. The two methods of clustering we’ll talk about are k-means clustering and hierarchical clustering.

Next, because in machine learning we like to talk about probability distributions, we’ll go into Gaussian mixture models and kernel density estimation, where we talk about how to learn the probability distribution of a set of data. One interesting fact is that under certain conditions, Gaussian mixture models and k-means clustering are exactly the same!
We’ll prove how this is the case.

Lastly, we’ll look at the theory behind principal components analysis, or PCA. PCA has many useful applications: visualization, dimensionality reduction, denoising, and de-correlation. You will see how it allows us to take a different perspective on latent variables, which first appear when we talk about k-means clustering and GMMs.

All the algorithms we’ll talk about in this course are staples in machine learning and data science, so if you want to know how to automatically find patterns in your data with data mining and pattern extraction, without needing someone to put in manual work to label that data, then this book is for you.

All of the materials required to follow along in this book are free: you just need to be able to download and install Python, Numpy, Scipy, Matplotlib, and Sci-kit Learn.
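The claimed equivalence of Gaussian mixture models and k-means can be illustrated in a few lines with scikit-learn. This is a rough sketch, not from the book: fitting a mixture with spherical covariance components to well-separated data recovers the same partition as k-means, up to a permutation of cluster labels.

```python
# Sketch (not from the book): on well-separated blobs, a spherical-covariance
# Gaussian mixture assigns points to the same clusters as k-means.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two well-separated 2-D blobs of 50 points each
X = np.vstack([rng.normal(0, 0.3, (50, 2)),
               rng.normal(5, 0.3, (50, 2))])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
gmm = GaussianMixture(n_components=2, covariance_type="spherical",
                      random_state=0).fit(X)

km_labels = km.labels_
gmm_labels = gmm.predict(X)
# Identical partitions, allowing for the 0/1 label swap
same = (km_labels == gmm_labels).all() or (km_labels == 1 - gmm_labels).all()
print(same)
```

In the limit where each component's variance shrinks to zero and the mixing weights are equal, the GMM's soft responsibilities harden into exactly the nearest-centroid assignment of k-means.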
Publication date: 05/22/2016 | Kindle book details: Kindle Edition, 38 pages
Gaussian Processes on Trees: From Spin Glasses to Branching Brownian Motion (Cambridge Studies in Advanced Mathematics)
Branching Brownian motion (BBM) is a classical object in probability theory with deep connections to partial differential equations. This book highlights the connection to classical extreme value theory and to the theory of mean-field spin glasses in statistical mechanics. Starting with a concise review of classical extreme value statistics and a basic introduction to mean-field spin glasses, the author then focuses on branching Brownian motion. Here, the classical results of Bramson on the asymptotics of solutions of the F-KPP equation are reviewed in detail and applied to the recent construction of the extremal process of BBM. The extension of these results to branching Brownian motion with variable speed is then explained. As a self-contained exposition that is accessible to graduate students with some background in probability theory, this book makes a good introduction for anyone interested in accessing this exciting field of mathematics.
Published by: Cambridge University Press | Publication date: 10/20/2016 | Kindle book details: Kindle Edition, 211 pages
Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.

Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high-dimensional data and variable selection. The remainder of the text explores advanced topics of functional regression analysis, including novel nonparametric statistical methods for curve prediction, curve clustering, functional ANOVA, and functional regression analysis of batch data, repeated curves, and non-Gaussian data.

Many flexible models based on Gaussian processes provide efficient ways of model learning, interpreting model structure, and carrying out inference, particularly when dealing with large-dimensional functional data. This book shows how to use these Gaussian process regression models in the analysis of functional data. Some MATLAB® and C codes are available on the first author’s website.
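The book's own code is in MATLAB and C; as a rough Python illustration of the core idea (not from the book), scikit-learn's GaussianProcessRegressor places a GP prior on an unknown curve and recovers its posterior mean and uncertainty from noisy functional observations:

```python
# Sketch (not from the book): Gaussian process regression on noisy
# observations of a smooth curve, using an RBF kernel plus a noise term.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(0, 2 * np.pi, 40).reshape(-1, 1)   # observation locations
y = np.sin(X).ravel() + rng.normal(0, 0.1, 40)     # noisy curve values

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0).fit(X, y)

# Posterior mean and pointwise standard deviation at new locations
X_new = np.linspace(0, 2 * np.pi, 5).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)
print(np.round(mean, 2))
```

The posterior mean tracks the underlying sine curve, while the returned standard deviation quantifies the uncertainty that the functional-data methods in the book build upon.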
Published by: Chapman and Hall/CRC | Publication date: 07/01/2011 | Kindle book details: Kindle Edition, 216 pages
Stochastic Analysis for Gaussian Random Processes and Fields: With Applications (Chapman & Hall/CRC Monographs on Statistics & Applied Probability)
Stochastic Analysis for Gaussian Random Processes and Fields: With Applications presents Hilbert space methods to study deep analytic properties connecting probabilistic notions. In particular, it studies Gaussian random fields using reproducing kernel Hilbert spaces (RKHSs).

The book begins with preliminary results on covariance and associated RKHS before introducing the Gaussian process and Gaussian random fields. The authors use chaos expansion to define the Skorokhod integral, which generalizes the Itô integral. They show how the Skorokhod integral is a dual operator of Skorokhod differentiation and the divergence operator of Malliavin. The authors also present Gaussian processes indexed by real numbers and obtain a Kallianpur–Striebel Bayes' formula for the filtering problem. After discussing the problem of equivalence and singularity of Gaussian random fields (including a generalization of the Girsanov theorem), the book concludes with the Markov property of Gaussian random fields indexed by measures and generalized Gaussian random fields indexed by Schwartz space. The Markov property for generalized random fields is connected to the Markov process generated by a Dirichlet form.
Published by: Chapman and Hall/CRC | Publication date: 06/23/2015 | Kindle book details: Kindle Edition, 201 pages
The book is based on the observation that communication is the central operation of discovery in all the sciences. In its "active mode" we use it to "interrogate" the physical world, sending appropriate "signals" and receiving nature's "reply". In the "passive mode" we receive nature's signals directly. Since we never know a priori what particular return signal will be forthcoming, we must necessarily adopt a probabilistic model of communication. Over the approximately seventy years since its beginning, this has developed into a Statistical Communication Theory (SCT). Here it is the set or ensemble of possible results which is meaningful. From this ensemble we attempt to construct the appropriate model format, based on our understanding of the observed physical data and on the associated statistical mechanism, analytically represented by suitable probability measures. Since its inception in the late '30s of the last century, and in particular subsequent to World War II, SCT has grown into a major field of study. As noted above, SCT is applicable to all branches of science, which is itself inherently and ultimately probabilistic at all levels. Moreover, in the natural world there is always a random background "noise" as well as an inherent a priori uncertainty in the presentation of deterministic observations, i.e. those which are specifically obtained a posteriori. The purpose of the book is to introduce non-Gaussian statistical communication theory and demonstrate how the theory improves probabilistic models. The book was originally planned to include 24 chapters, as seen in the table of contents in the preface. Dr. Middleton completed the first 10 chapters prior to his passing in 2008. The bibliography representing the remaining chapters was put together by the author's close colleagues, Drs. Vincent Poor, Leon Cohen, and John Anderson. Email email@example.com to request Ch. 10.
Published by: Wiley-IEEE Press | Publication date: 05/11/2012 | Kindle book details: Kindle Edition, 664 pages
This textbook is the result of the enhancement of several courses on non-equilibrium statistics, stochastic processes, stochastic differential equations, anomalous diffusion, and disorder. The target audience includes students of physics, mathematics, biology, chemistry, and engineering at undergraduate and graduate level with a grasp of the basic elements of mathematics and physics of the fourth year of a typical undergraduate course. The little-known physical and mathematical concepts are described in sections and specific exercises throughout the text, as well as in appendices. Physical-mathematical motivation is the main driving force for the development of this text.

It presents the academic topics of probability theory and stochastic processes as well as new educational aspects in the presentation of non-equilibrium statistical theory and stochastic differential equations. In particular, it discusses the problem of irreversibility in that context and Fokker-Planck dynamics. An introduction to fluctuations around metastable and unstable points is given. It also describes relaxation theory for non-stationary Markov systems periodic in time. The theory of finite and infinite transport in disordered networks is introduced, with a discussion of the issue of anomalous diffusion. Further, it provides the basis for establishing the relationship between quantum aspects of the theory of linear response and the calculation of diffusion coefficients in amorphous systems.
Published by: Springer | Publication date: 03/07/2017 | Kindle book details: Kindle Edition, 556 pages
Multilevel and Longitudinal Modeling Using Stata, Third Edition, by Sophia Rabe-Hesketh and Anders Skrondal, looks specifically at Stata’s treatment of generalized linear mixed models, also known as multilevel or hierarchical models. These models are “mixed” because they allow fixed and random effects, and they are “generalized” because they are appropriate for continuous Gaussian responses as well as binary, count, and other types of limited dependent variables.

Volume I is devoted to continuous Gaussian linear mixed models and has nine chapters organized into four parts. The first part reviews the methods of linear regression. The second part provides in-depth coverage of two-level models, the simplest extensions of a linear regression model.

Volume II is devoted to generalized linear mixed models for binary, categorical, count, and survival outcomes. The second volume has seven chapters, also organized into four parts. The first three parts in volume II cover models for categorical responses, including binary, ordinal, and nominal (a new chapter); models for count data; and models for survival data, including discrete-time and continuous-time (a new chapter) survival responses. The fourth and final part in volume II describes models with nested and crossed random effects, with an emphasis on binary outcomes.
Published by: Stata Press | Publication date: 04/02/2012 | Kindle book details: Kindle Edition, 2 pages
Gaussian processes can be viewed as a far-reaching infinite-dimensional extension of classical normal random variables. Their theory presents a powerful range of tools for probabilistic modelling in various academic and technical domains such as Statistics, Forecasting, Finance, Information Transmission, and Machine Learning, to mention just a few. The objective of these Briefs is to present a quick and condensed treatment of the core theory that a reader must understand in order to make his own independent contributions. The primary intended readership are PhD/Masters students and researchers working in pure or applied mathematics. The first chapters introduce essentials of the classical theory of Gaussian processes and measures, with the core notions of reproducing kernel, integral representation, isoperimetric property, and large deviation principle. Since brevity is a priority for teaching and learning purposes, certain technical details and proofs are omitted. The later chapters touch on important recent issues not sufficiently reflected in the literature, such as small deviations, expansions, and quantization of processes. In university teaching, one can build a one-semester advanced course upon these Briefs.
Published by: Springer | Publication date: 01/11/2012 | Kindle book details: Kindle Edition, 134 pages