
Type: Verschleierung (obfuscation)
Editor: Hindemith
Status: sighted (gesichtet)
Examined work:
Page: 149, lines: 1-16
Source: Neal 2000
Page(s): 261, lines: 3 ff.
[In this case we will introduce m temporary auxiliary variables that represent] possible values for the parameters of components that are not associated with any other observations. We then update cl by Gibbs sampling with respect to the distribution that includes these auxiliary parameters.

Because of the fact that the observations yl are exchangeable, and the component labels cl are arbitrary, we can assume that we are updating cl for the last observation, and that the cj for other observations have values in the set {1,...,k-}. We can now visualize the conditional prior distribution for cl given the other cj in terms of these m auxiliary components and their associated parameters. The probability of cl being equal to a c in {1,...,k-} will be n-l,c / (L - 1 + a), where n-l,c is the number of times c occurs among the cj for j ≠ l. The probability of cl having some other value will be a / (L - 1 + a), which we will split equally among the m auxiliary components we have introduced.

The first step in using this representation to update cl is to sample from the conditional distribution of these auxiliary parameters given the current value of cl and the rest of the state. In the case of the conditional probability (5.5.9) (i.e., n-l,c > 0) the auxiliary parameters have no connection with the rest of the state or the observations, and are simply drawn independently from G0.

[...], we will introduce temporary auxiliary variables that represent possible values for the parameters of components that are not associated with any other observations. We then update ci by Gibbs sampling with respect to the distribution that includes these auxiliary parameters.

Since the observations yi are exchangeable, and the component labels ci are arbitrary, we can assume that we are updating ci for the last observation, and that the cj for other observations have values in the set {1,...,k-}, [...] We can now visualize the conditional prior distribution for ci given the other cj in terms of m auxiliary components and their associated parameters. The probability of ci being equal to a c in {1,...,k-} will be n-i,c / (n - 1 + a), where n-i,c is the number of times c occurs among the cj for j ≠ i. The probability of ci having some other value will be a / (n - 1 + a), which we will split equally among the m auxiliary components we have introduced. [...]

[...] The first step in using this representation to update ci is to sample from the conditional distribution of these auxiliary parameters given the current value of ci and the rest of the state. If ci = cj for some j ≠ i, the auxiliary parameters have no connection with the rest of the state, or the observations, and are simply drawn independently from G0.
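The update described in both passages is the Gibbs step of Neal's (2000) Algorithm 8: count how often each existing label occurs among the other observations, append m auxiliary components drawn from G0 (reusing the current parameter when ci is a singleton), and resample ci with weights n-i,c and a/m times the likelihood. A minimal Python sketch under these rules follows; the function name, the Gaussian likelihood, and the G0 sampler passed in the usage example are illustrative assumptions, not taken from either text.

```python
import math
import random
from collections import Counter

def gibbs_update_ci(i, c, phi, y, alpha, m, sample_g0, likelihood):
    """One Gibbs step for c[i] in the spirit of Neal (2000), Algorithm 8.

    c          : list of integer component labels, one per observation
    phi        : dict mapping label -> component parameter
    y          : list of observations
    alpha      : Dirichlet-process concentration parameter
    m          : number of temporary auxiliary components
    sample_g0  : callable drawing one parameter from the base measure G0
    likelihood : callable likelihood(y_i, parameter) -> float
    """
    n = len(y)
    # n_{-i,c}: occurrences of each label among the c_j with j != i
    counts = Counter(c[j] for j in range(n) if j != i)
    existing = sorted(counts)

    # Draw m auxiliary parameters from G0; if c[i] is a singleton,
    # its current parameter is kept as the first auxiliary component.
    aux = [sample_g0() for _ in range(m)]
    if c[i] not in counts:
        aux[0] = phi[c[i]]

    # Unnormalised weights: n_{-i,c} * F(y_i, phi_c) for existing
    # components, (alpha / m) * F(y_i, phi_aux) for auxiliaries.
    # The common factor 1 / (n - 1 + alpha) cancels on normalisation.
    weights = [counts[k] * likelihood(y[i], phi[k]) for k in existing]
    weights += [(alpha / m) * likelihood(y[i], a) for a in aux]

    # Sample an index proportionally to the weights.
    r = random.random() * sum(weights)
    acc = 0.0
    for idx, w in enumerate(weights):
        acc += w
        if r <= acc:
            break

    if idx < len(existing):
        new_label = existing[idx]
    else:
        new_label = max(phi) + 1            # open a fresh component
        phi[new_label] = aux[idx - len(existing)]

    old = c[i]
    c[i] = new_label
    if old != new_label and old not in c:   # drop an emptied component
        phi.pop(old, None)
    return c, phi
```

In a full sampler this update is applied to every i in turn, followed by a resampling of each phi[k] from its posterior given the observations currently assigned to it; only the label update is sketched here.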

Notes: On the preceding page the source is mentioned as follows: "To approximate this integral we use the algorithm 8 by Neal (2000)". It is not mentioned that the explanatory text is also taken verbatim from the source.
Reviewers: (Hindemith), KnallErbse