List price:
Sale price: 59元
Stock: Sold out

LINE US! This title is carried by our company as a distributor and is currently sold out; if you need a copy, you can ask our LINE customer service about incoming import stock.

Payment methods: convenience-store pickup with payment, credit card, online bank transfer
Shipping methods: convenience-store pickup, home delivery, in-store pickup
Recommended for You
Similar books recommended for you
【Synopsis】 Inferring latent structure and causality is crucial for understanding the underlying patterns and relationships hidden in data. This book covers selected models for latent structures and causal networks, together with inference methods for these models.

After an introduction to the EM algorithm for incomplete data, the book provides detailed coverage of several widely used latent structure models, including mixture models, hidden Markov models, and stochastic block models. EM and variational EM algorithms are developed for parameter estimation under these models, with comparisons to their Bayesian inference counterparts. We further extend these models to related problems, such as clustering, motif discovery, Kalman filtering, and exchangeable random graphs.

Conditional independence structures, which can be represented graphically, are used to infer the latent structures in the above models. This notion generalizes naturally to the second part of the book, on graphical models, which use graph separation to encode conditional independence. We cover a variety of graphical models, including undirected graphs, directed acyclic graphs (DAGs), chain graphs, and acyclic directed mixed graphs (ADMGs), along with the various Markov properties of these models. Recent methods that learn the structure of a graphical model from data are reviewed and discussed. In particular, DAGs and Bayesian networks are an important class of mathematical models for causality. After an introduction to causal inference with DAGs and structural equation models, we provide a detailed review of recent research on causal discovery via structure learning of graphs. Finally, we briefly introduce the causal bandit problem with sequential intervention.
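As a quick illustration of the EM idea the synopsis mentions (this sketch is not taken from the book; the toy data and all variable names below are our own assumptions), here is a minimal EM loop for a two-component Gaussian mixture in Python:

```python
import numpy as np

# Illustrative only: EM for a 1-D two-component Gaussian mixture,
# not code from the book described above.
rng = np.random.default_rng(0)
# Toy data: two overlapping Gaussian clusters.
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.5, 300)])

# Initial guesses for mixture weights, means, and standard deviations.
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sd = np.array([1.0, 1.0])

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: posterior responsibility of each component for each point.
    dens = w * normal_pdf(x[:, None], mu, sd)       # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(w, mu, sd)  # recovered weights, means, and spreads
```

Each iteration alternates a soft assignment of points to components (E-step) with a weighted re-estimation of the parameters (M-step), which is the pattern the book generalizes to HMMs and stochastic block models.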
Similar books recommended for you
【Synopsis】 This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. It includes over 210 student exercises.