Melissa Ailem
Co-clustering document-term matrices by direct maximization of graph modularity
M Ailem, F Role, M Nadif
Proceedings of the 24th ACM International on Conference on Information and …, 2015
Sparse Poisson latent block model for document clustering
M Ailem, F Role, M Nadif
IEEE Transactions on Knowledge and Data Engineering 29 (7), 1563-1576, 2017
Graph modularity maximization as an effective method for co-clustering text data
M Ailem, F Role, M Nadif
Knowledge-Based Systems 109, 160-173, 2016
Unsupervised text mining for assessing and augmenting GWAS results
M Ailem, F Role, M Nadif, F Demenais
Journal of Biomedical Informatics 60, 252-259, 2016
Model-based co-clustering for the effective handling of sparse data
M Ailem, F Role, M Nadif
Pattern Recognition 72, 108-122, 2017
Non-negative matrix factorization meets word embedding
M Ailem, A Salah, M Nadif
Proceedings of the 40th International ACM SIGIR Conference on Research and …, 2017
Word Co-Occurrence Regularized Non-Negative Matrix Tri-Factorization for Text Data Co-Clustering
A Salah, M Ailem, M Nadif
Thirty-Second AAAI Conference on Artificial Intelligence, 2018
A Probabilistic Model for Joint Learning of Word Embeddings from Texts and Images
M Ailem, B Zhang, A Bellet, P Denis, F Sha
EMNLP, 2018
A way to boost semi-NMF for document clustering
A Salah, M Ailem, M Nadif
Proceedings of the 2017 ACM on Conference on Information and Knowledge …, 2017
Topic Augmented Generator for Abstractive Summarization
M Ailem, B Zhang, F Sha
arXiv preprint arXiv:1908.07026, 2019
Amortized Inference of Variational Bounds for Learning Noisy-OR
Y Yan, M Ailem, F Sha
arXiv preprint arXiv:1906.02428, 2019
Sparsity-sensitive diagonal co-clustering algorithms for the effective handling of text data
M Ailem
Sorbonne Paris Cité, 2016
Factorisation matricielle non-négative sémantique [Semantic non-negative matrix factorization]
M Febrissy, A Salah, M Ailem, M Nadif
Société Francophone de Classification (SFC), Actes des 26èmes Rencontres, 15