# Quelle:Rh/Neal 2000


Source details

| Field | Value |
|---|---|
| Author | Radford M. Neal |
| Title | Markov Chain Sampling Methods for Dirichlet Process Mixture Models |
| Journal | Journal of Computational and Graphical Statistics |
| Publisher | American Statistical Association |
| Year | 2000 |
| Volume | 9 |
| Number | 2 |
| Pages | 249-265 |
| URL | http://www.cs.princeton.edu/courses/archive/fall11/cos597C/reading/Neal2000a.pdf |
| Bibliography | yes |
| Footnotes | yes |
| Fragments | 3 |

Fragments of the source:

Last edited: 2012-07-31 19:43:04, Graf Isolan

Type: KomplettPlagiat (complete plagiarism), Editor: Hindemith, Reviewed
Examined work: Page 144, Lines 16-22
Source: Neal 2000, Page(s): 249, Lines 25-31
From the examined work:

The use of DPM models has become computationally feasible with the development of Markov chain methods for sampling from the posterior distribution of the parameters of the component distributions and of the associations of mixture components with observations. Methods based on Gibbs sampling can easily be implemented for models based on conjugate prior distributions, but when nonconjugate priors are used, as is appropriate in many contexts, straightforward Gibbs sampling requires that an often difficult numerical integration be performed.

From the source:

Use of Dirichlet process mixture models has become computationally feasible with the development of Markov chain methods for sampling from the posterior distribution of the parameters of the component distributions and/or of the associations of mixture components with observations. Methods based on Gibbs sampling can easily be implemented for models based on conjugate prior distributions, but when non-conjugate priors are used, as is appropriate in many contexts, straightforward Gibbs sampling requires that an often difficult numerical integration be performed.
Remarks: The source is mentioned after this passage, but not as the source of a quotation or of the ideas, and not in connection with this passage. Reviewers: (Hindemith), KnallErbse

Last edited: 2012-07-31 19:48:15, Graf Isolan

Type: Verschleierung (disguised plagiarism), Editor: Hindemith, Reviewed
Examined work: Page 149, Lines 1-16
Source: Neal 2000, Page(s): 261, Lines 3 ff.
From the examined work:

[In this case we will introduce m temporary auxiliary variables that represent] possible values for the parameters of components that are not associated with any other observations. We then update cl by Gibbs sampling with respect to the distribution that includes these auxiliary parameters.

Because of the fact that the observations yl are exchangeable, and the component labels cl are arbitrary, we can assume that we are updating ci for the last observation, and that the cj for other observations have values in the set {1,...,k-}, We can now visualize the conditional prior distribution for cl given the other cj in terms of these m auxiliary components and their associated parameters. The probability of cl being equal to a c in {1,... ,k-} will be n-l,c / (L — 1 + a), where n-l,c is the number of times c occurs among the cj for j ≠ l. The probability of cl having some other value will be a / (L — 1 + a) which we will split equally among the m auxiliary components we have introduced.

The first step in using this representation to update cl is to sample from the conditional distribution of these auxiliary parameters given the current value of cl and the rest of the state. In the case of the conditional probability (5.5.9) (i.e., n-l,c > 0) the auxiliary parameters have no connection with the rest of the state or the observations, and are simply drawn independently from G0.

From the source:

[...], we will introduce temporary auxiliary variables that represent possible values for the parameters of components that are not associated with any other observations. We then update ci by Gibbs sampling with respect to the distribution that includes these auxiliary parameters.

Since the observations yi are exchangeable, and the component labels ci are arbitrary, we can assume that we are updating ci for the last observation, and that the cj for other observations have values in the set {1,...,k-}, [...] We can now visualize the conditional prior distribution for ci given the other cj in terms of m auxiliary components and their associated parameters. The probability of ci being equal to a c in {1,...,k-} will be n-i,c / (n — 1 + a), where n-i,c is the number of times c occurs among the cj for j ≠ i. The probability of ci having some other value will be a / (n — 1 + a), which we will split equally among the m auxiliary components we have introduced. [...]

[...] The first step in using this representation to update ci is to sample from the conditional distribution of these auxiliary parameters given the current value of ci and the rest of the state. If ci = cj for some j ≠ i, the auxiliary parameters have no connection with the rest of the state, or the observations, and are simply drawn independently from G0.

Remarks: On the preceding page the source is mentioned as follows: "To approximate this integral we use the algorithm 8 by Neal (2000)". It is not mentioned that the explanatory text is also taken verbatim from the source. Reviewers: (Hindemith), KnallErbse
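The conditional prior that both texts above describe can be illustrated with a short sketch (this is an illustration only, not taken from the examined work or from the source; `conditional_prior` is a hypothetical helper using Neal's indexing, with n observations and labels c_i):

```python
from collections import Counter

def conditional_prior(c, i, alpha, m):
    """Conditional prior over c[i] given the other labels, as in the
    fragment: an existing component c gets n_{-i,c} / (n - 1 + alpha);
    the remaining mass alpha / (n - 1 + alpha) is split equally among
    the m auxiliary components."""
    n = len(c)
    counts = Counter(c[j] for j in range(n) if j != i)
    denom = n - 1 + alpha
    existing = {label: k / denom for label, k in counts.items()}
    per_auxiliary = alpha / denom / m
    return existing, per_auxiliary

existing, per_aux = conditional_prior([0, 0, 1, 2], i=3, alpha=1.0, m=2)
# existing == {0: 0.5, 1: 0.25}; per_aux == 0.125; total mass is 1.0
```

With n = 4 and alpha = 1 the denominator is 4, so the two occupied components receive 2/4 and 1/4, and the remaining 1/4 is split over the m = 2 auxiliary components.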

Last edited: 2014-01-07 15:57:02, Schumann

Type: Verschleierung (disguised plagiarism), Editor: Hindemith, Reviewed
Examined work: Page 149, Lines 22-26
Source: Neal 2000, Page(s): 261, 262, Lines 28-29, 1-4
From the examined work:

Following, a Gibbs sampling update is performed for cl in this representation of the posterior distribution. Since ci must be either one of the components associated with other observations or one of the auxiliary components that were introduced, we can easily do Gibbs sampling by evaluating the relative probabilities of these possibilities. Once a new value for cl has been chosen, we discard all values that are not associated with an observation.

From the source:

We now perform a Gibbs sampling update for ci in this representation of the posterior distribution. Since ci must be either one of the components associated with other

[Page 262]

observations or one of the auxiliary components that were introduced, we can easily do Gibbs sampling by evaluating the relative probabilities of these possibilities. Once a new value for ci has been chosen, we discard all ϕ values that are not now associated with an observation.

Remarks: It is clear to the reader that the author is describing the application of Neal's Algorithm 8 to the problem under investigation. What is not clear, however, is that he does so in Neal's own words. Reviewers: (Hindemith), KnallErbse
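Taken together, the three fragments describe one full label update of Neal's Algorithm 8: introduce m auxiliary components, evaluate the relative probabilities, sample a new label, and discard unused parameters. The following sketch illustrates one sweep (an illustration only, not code from the examined work or the source; `algorithm8_sweep`, `g0_sample`, and `likelihood` are hypothetical names, and a generic likelihood is assumed):

```python
import math
import random
from collections import Counter

def algorithm8_sweep(y, c, phi, alpha, m, g0_sample, likelihood):
    """One sweep of Neal's (2000) Algorithm 8 over all labels c[i].

    y          : list of observations
    c          : list of component labels (keys of phi), updated in place
    phi        : dict label -> component parameter, updated in place
    alpha      : Dirichlet-process concentration parameter
    g0_sample  : draws one parameter from the base measure G0
    likelihood : likelihood(y_i, phi_c) > 0
    """
    n = len(y)
    for i in range(n):
        counts = Counter(c[j] for j in range(n) if j != i)
        # Auxiliary components: if c[i] is a singleton, its parameter
        # becomes the first auxiliary; the rest are fresh draws from G0.
        aux = [] if c[i] in counts else [phi[c[i]]]
        aux += [g0_sample() for _ in range(m - len(aux))]
        # Relative probabilities (the common 1/(n-1+alpha) factor cancels):
        # existing component c gets n_{-i,c} * F(y_i | phi_c);
        # each auxiliary component gets (alpha/m) * F(y_i | phi).
        labels = list(counts)
        weights = [counts[l] * likelihood(y[i], phi[l]) for l in labels]
        weights += [(alpha / m) * likelihood(y[i], p) for p in aux]
        choice = random.choices(range(len(weights)), weights=weights, k=1)[0]
        if choice < len(labels):
            c[i] = labels[choice]
        else:  # an auxiliary component was chosen: give it a fresh label
            new_label = max(phi) + 1
            phi[new_label] = aux[choice - len(labels)]
            c[i] = new_label
        # Discard parameters no longer associated with any observation.
        for l in set(phi) - set(c):
            del phi[l]
    return c, phi

random.seed(1)
y = [0.1, 0.2, 5.0, 5.1]
c, phi = [0, 0, 1, 1], {0: 0.0, 1: 5.0}
lik = lambda yi, p: math.exp(-0.5 * (yi - p) ** 2)
c, phi = algorithm8_sweep(y, c, phi, 1.0, 2, lambda: random.gauss(0, 3), lik)
```

After each label update the invariant holds that every parameter in `phi` is associated with at least one observation, which is exactly the discard step the third fragment documents.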