Effective interrelation of Bayesian nonparametric document clustering and embedded-topic modeling
Academic Article
Publication Date:
2021
Abstract:
Topic modeling can be synergistically interrelated with document clustering. We present an innovative unsupervised approach that interrelates topic modeling with document clustering. The devised approach exploits Bayesian generative modeling and posterior inference to seamlessly unify the two tasks and carry them out jointly. Specifically, a Bayesian nonparametric model of text collections formulates an unprecedented interrelationship of word-embedding topics with a Dirichlet process mixture of cluster components. The latter enables countably infinite clusters and permits the automatic inference of their actual number in a statistically principled manner. All latent clusters and topics under this model are inferred through collapsed Gibbs sampling and parameter estimation. An extensive empirical study of the presented approach is conducted on benchmark real-world corpora of text documents. The experimental results demonstrate its higher effectiveness in partitioning text collections and coherently discovering their semantics, compared to state-of-the-art competitors and tailored baselines. Computational efficiency is also investigated under different conditions, in order to provide an insightful analysis of scalability.
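The Dirichlet process mixture's ability to support countably infinite clusters, with the actual number inferred from the data, is easiest to see through the Chinese restaurant process (CRP) predictive rule, which also underlies collapsed Gibbs sampling for such models. The following is a minimal illustrative sketch, not the authors' implementation; the function `crp_partition` and its parameters are hypothetical names chosen for this example:

```python
import random

def crp_partition(n_items, alpha, seed=0):
    """Sample a partition of n_items via the Chinese restaurant process.

    The CRP is the predictive rule of a Dirichlet process mixture: item i
    joins an existing cluster k with probability proportional to that
    cluster's current size, or opens a new cluster with probability
    proportional to the concentration parameter alpha. The number of
    clusters is therefore not fixed in advance but grows with the data.
    """
    rng = random.Random(seed)
    assignments = []   # cluster index assigned to each item
    sizes = []         # current size of each cluster
    for _ in range(n_items):
        # Unnormalized probabilities: existing clusters by size, new cluster by alpha.
        weights = sizes + [alpha]
        r = rng.random() * sum(weights)
        k, acc = 0, weights[0]
        while r > acc:
            k += 1
            acc += weights[k]
        if k == len(sizes):  # open a new cluster
            sizes.append(1)
        else:
            sizes[k] += 1
        assignments.append(k)
    return assignments, sizes

if __name__ == "__main__":
    labels, sizes = crp_partition(100, alpha=1.0, seed=42)
    print(f"{len(sizes)} clusters over {sum(sizes)} items")
```

In a collapsed Gibbs sampler for the full model, the size-proportional prior above would be combined with each cluster's marginal likelihood of the document before resampling its assignment.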
Iris type:
01.01 Articolo in rivista
Keywords:
Text analysis; Word embedding; Topic modeling; Document clustering; Bayesian nonparametrics; Dirichlet process clustering
List of contributors:
Ortale, Riccardo; Costa, Giovanni
Published in: