- Yunli Wang, Yu Wu, Lili Mou, Zhoujun Li, Wenhan Chao. Formality style transfer with shared latent space. To appear in COLING, 2020.
- Kashif Khan, Gaurav Sahu, Vikash Balasubramanian, Lili Mou, Olga Vechtomova. Adversarial learning on the latent space for diverse dialog generation. To appear in COLING, 2020.
- Jingjing Li, Zichao Li, Lili Mou, Xin Jiang, Michael Lyu, Irwin King. Unsupervised text generation by learning from search. In Advances in Neural Information Processing Systems (NeurIPS), 2020. [pdf]
- Yixing Luan, Bradley Hauer, Lili Mou, Grzegorz Kondrak. Improving word sense disambiguation with translations. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 4055--4065, 2020. [pdf]
- Lili Mou, Olga Vechtomova. Stylized text generation: Approaches and applications. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL): Tutorial Abstracts, pages 19--22, 2020. [pdf, video]
- Dhruv Kumar, Lili Mou, Lukasz Golab, Olga Vechtomova. Iterative edit-based unsupervised sentence simplification. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), pages 7918--7928, 2020. [pdf]
- Raphael Schumann, Lili Mou, Yao Lu, Olga Vechtomova, Katja Markert. Discrete optimization for unsupervised sentence summarization with word-level extraction. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), pages 5032--5042, 2020. [pdf]
- Xianggen Liu, Lili Mou, Fandong Meng, Hao Zhou, Jie Zhou, Sen Song. Unsupervised paraphrasing by simulated annealing. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), pages 302--312, 2020. [pdf]
- Nabiha Asghar,¹ Lili Mou,¹ Kira A Selby, Kevin D Pantasdo, Pascal Poupart, Xin Jiang. Progressive memory banks for incremental domain adaptation. In Proceedings of the International Conference on Learning Representations (ICLR), 2020. [pdf]
- Zeyu Sun, Qihao Zhu, Yingfei Xiong, Yican Sun, Lili Mou, Lu Zhang. TreeGen: A tree-based Transformer architecture for code generation. In Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI), pages 8984--8991, 2020. [pdf]
- Xianggen Liu, Lili Mou, Haotian Cui, Zhengdong Lu, Sen Song. Finding decision jumps in text classification. Neurocomputing, vol 371, pages 177--187, 2020. [url]
Selected Refereed Papers
[Selection is NOT based on the venue or length of a paper.]
- Bowen Li, Lili Mou, Frank Keller. An imitation learning approach to unsupervised parsing. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL-short), pages 3485--3492, 2019. [pdf, slides (courtesy of BL)] (Best paper nomination)
- Hareesh Bahuleyan, Lili Mou, Hao Zhou, Olga Vechtomova. Stochastic Wasserstein autoencoder for probabilistic sentence generation. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT short), pages 4068--4076, 2019. [pdf, slides]
- Ning Miao, Hao Zhou, Lili Mou, Rui Yan, Lei Li. CGMH: Constrained sentence generation by Metropolis-Hastings sampling. In Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI), pages 6834--6842, 2019. [pdf]
- Hareesh Bahuleyan,¹ Lili Mou,¹ Olga Vechtomova, Pascal Poupart. Variational attention for sequence-to-sequence models. In Proceedings of the International Conference on Computational Linguistics (COLING), pages 1672--1682, 2018. Also presented at TADGM Workshop @ICML, 2018. [pdf, slides]
- Lili Mou, Zhengdong Lu, Hang Li, Zhi Jin. Coupling distributed and symbolic execution for natural language queries. In Proceedings of the 34th International Conference on Machine Learning (ICML), pages 2518--2526, 2017. Also presented in ICLR Workshop, 2017. [pdf, slides (courtesy of ZL), slides]
- Lili Mou, Zhao Meng, Rui Yan, Ge Li, Yan Xu, Lu Zhang, Zhi Jin. How transferable are neural networks in NLP applications? In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 478--489, 2016. [pdf]
- Yunchuan Chen, Lili Mou, Yan Xu, Ge Li, Zhi Jin. Compressing neural language models by sparse word representations. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL), pages 226--235, 2016. [pdf]
- Lili Mou, Ge Li, Lu Zhang, Tao Wang, Zhi Jin. Convolutional neural networks over tree structures for programming language processing. In Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI), pages 1287--1293, 2016. [pdf]
- Lili Mou,¹ Hao Peng,¹ Ge Li, Yan Xu, Lu Zhang, Zhi Jin. Discriminative neural sentence modeling by tree-based convolution. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 2315--2325, 2015. [pdf, slides]
- Yan Xu, Lili Mou, Ge Li, Yunchuan Chen, Hao Peng, Zhi Jin. Classifying relations via long short term memory networks along shortest dependency paths. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1785--1794, 2015. [pdf, slides (courtesy of YX)]