Inverted Dirichlet distribution

In statistics, the inverted Dirichlet distribution is a multivariate generalization of the beta prime distribution, and is related to the Dirichlet distribution. It was first described by Tiao and Guttman in 1965.

The distribution has a density function given by

$$p\left(x_1,\ldots, x_k\right) = \frac{\Gamma\left(\nu_1+\cdots+\nu_{k+1}\right)}{\prod_{j=1}^{k+1}\Gamma\left(\nu_j\right)}\, x_1^{\nu_1-1}\cdots x_k^{\nu_k-1}\left(1+\sum_{i=1}^k x_i\right)^{-\sum_{j=1}^{k+1}\nu_j},\qquad x_i>0.$$
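As a quick sanity check of the density above, the following sketch evaluates its logarithm with log-gamma functions for numerical stability. The function name is ours; for $k=1$ the formula reduces to the beta prime density, which gives a simple test case.

```python
import numpy as np
from math import lgamma

def inverted_dirichlet_logpdf(x, nu):
    """Log-density of the inverted Dirichlet distribution.

    x  : sequence of k positive values (x_1, ..., x_k)
    nu : sequence of k+1 positive shape parameters (nu_1, ..., nu_{k+1})
    """
    x = np.asarray(x, dtype=float)
    nu = np.asarray(nu, dtype=float)
    s = nu.sum()                                   # nu_1 + ... + nu_{k+1}
    log_norm = lgamma(s) - sum(lgamma(v) for v in nu)
    return (log_norm
            + np.sum((nu[:-1] - 1.0) * np.log(x))  # x_i^{nu_i - 1} terms
            - s * np.log1p(x.sum()))               # (1 + sum x_i)^{-s} term
```

For example, with $k=1$, $\nu=(3,4)$ and $x=2$ the density is $\frac{\Gamma(7)}{\Gamma(3)\Gamma(4)}\cdot 2^2\cdot 3^{-7} = 240/2187$, which the function reproduces.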

The distribution has applications in statistical regression and arises naturally when considering the multivariate Student distribution. It can be characterized by its mixed moments:

$$E\left[\prod_{i=1}^k x_i^{q_i}\right] = \frac{\Gamma\left(\nu_{k+1}-\sum_{j=1}^k q_j\right)}{\Gamma\left(\nu_{k+1}\right)}\prod_{j=1}^k\frac{\Gamma\left(\nu_j+q_j\right)}{\Gamma\left(\nu_j\right)},$$

provided that $$q_j>-\nu_j$$ for $$1\leqslant j\leqslant k$$ and $$\nu_{k+1}>q_1+\cdots+q_k$$.
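The moment formula can be checked numerically. The sketch below computes it in closed form and compares against Monte Carlo samples drawn via the standard gamma-ratio construction, $x_i = g_i / g_{k+1}$ with independent $g_j \sim \mathrm{Gamma}(\nu_j, 1)$ (stated here as an assumption consistent with the density above); both function names are ours.

```python
import numpy as np
from math import lgamma, exp

def inverted_dirichlet_moment(q, nu):
    """Mixed moment E[prod_i x_i^{q_i}] from the closed-form gamma-ratio
    formula; valid when q_j > -nu_j and nu_{k+1} > sum(q)."""
    q = np.asarray(q, dtype=float)
    nu = np.asarray(nu, dtype=float)
    log_m = lgamma(nu[-1] - q.sum()) - lgamma(nu[-1])
    log_m += sum(lgamma(n + s) - lgamma(n) for n, s in zip(nu[:-1], q))
    return exp(log_m)

def sample_inverted_dirichlet(nu, size, rng):
    """Draw samples as ratios of independent Gamma variates:
    x_i = g_i / g_{k+1}, g_j ~ Gamma(nu_j, 1)."""
    g = rng.gamma(np.asarray(nu, dtype=float), size=(size, len(nu)))
    return g[:, :-1] / g[:, -1:]

rng = np.random.default_rng(0)
nu = [2.0, 3.0, 6.0]
exact = inverted_dirichlet_moment([1.0, 1.0], nu)   # nu_1*nu_2/((nu_3-1)(nu_3-2)) = 0.3
x = sample_inverted_dirichlet(nu, 200_000, rng)
mc = (x[:, 0] * x[:, 1]).mean()                     # Monte Carlo estimate of E[x_1 x_2]
```

With $\nu=(2,3,6)$ and $q=(1,1)$ the formula gives $\frac{\Gamma(4)}{\Gamma(6)}\cdot\frac{\Gamma(3)}{\Gamma(2)}\cdot\frac{\Gamma(4)}{\Gamma(3)} = 0.3$, and the simulation agrees to within sampling error.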

The inverted Dirichlet distribution is conjugate to the negative multinomial distribution if a generalized form of the odds ratio is used in place of the categories' probabilities: if the negative multinomial parameter vector is given by $$p$$, reparametrize the negative multinomial to $$x_i = \frac{p_i}{p_0},\ i = 1,\ldots, k$$, where $$p_0 = 1 - \sum_{i=1}^{k} p_i$$.
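The reparametrization above is invertible: from $x_i = p_i/p_0$ and $\sum_i p_i + p_0 = 1$ one gets $p_0 = 1/(1+\sum_i x_i)$ and hence $p_i = x_i/(1+\sum_j x_j)$. A minimal sketch of the two maps (function names are ours):

```python
import numpy as np

def probs_to_odds(p):
    """Map negative-multinomial category probabilities p_1..p_k to the
    generalized odds x_i = p_i / p_0, where p_0 = 1 - sum(p)."""
    p = np.asarray(p, dtype=float)
    p0 = 1.0 - p.sum()
    return p / p0

def odds_to_probs(x):
    """Inverse map: p_i = x_i / (1 + sum(x)), so that p_0 = 1 / (1 + sum(x))."""
    x = np.asarray(x, dtype=float)
    return x / (1.0 + x.sum())
```

For instance, $p = (0.2, 0.3)$ gives $p_0 = 0.5$ and odds $x = (0.4, 0.6)$, and applying the inverse map recovers $p$.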

T. Bdiri et al. have developed several models that use the inverted Dirichlet distribution to represent and model non-Gaussian data. They have introduced finite and infinite mixture models of inverted Dirichlet distributions, using the Newton–Raphson technique to estimate the parameters and the Dirichlet process to model infinite mixtures. T. Bdiri et al. have also used the inverted Dirichlet distribution to propose an approach for generating Support Vector Machine kernels based on Bayesian inference, and another approach to establish hierarchical clustering.