Multi-level Deep Models for Forum Sentiment Classification
Term
4th term
Publication year
2017
Submitted on
2017-06-09
Pages
80
Abstract
Multiple attempts have been made to address the problem of target-dependent sentiment classification, but mostly for tweets. Another interesting domain is that of forum threads. Analysing forums in which cancer patients describe the effects of supplementation on their treatment could benefit other patients looking to improve their situation. To address this problem we propose three LSTM-based methods that model the natural hierarchy of forum threads by considering both word- and post-level contexts with regard to targets. Each model captures this hierarchy differently, exploring the relation between word and document embeddings and how multiple labels and loss functions can improve performance. We manually annotate threads from various cancer-related forums to evaluate our models. Our results show potential in our model designs and suggest that, given more data, they should be able to outperform previous models.
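To illustrate the kind of hierarchy the abstract describes, the following is a minimal sketch (not the thesis implementation): a word-level LSTM encodes each forum post into a vector, and a post-level LSTM reads the sequence of post vectors to produce per-post sentiment scores. The class name, dimensions, and omission of target handling are all illustrative assumptions.

```python
# Hedged sketch of a hierarchical LSTM for forum threads, assuming PyTorch.
# Target-dependent conditioning and the multiple loss functions mentioned
# in the abstract are omitted for brevity.
import torch
import torch.nn as nn

class HierarchicalLSTM(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, word_hidden=64,
                 post_hidden=64, num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Word-level LSTM: turns one post into a fixed-size post vector
        self.word_lstm = nn.LSTM(emb_dim, word_hidden, batch_first=True)
        # Post-level LSTM: runs over the sequence of post vectors in a thread
        self.post_lstm = nn.LSTM(word_hidden, post_hidden, batch_first=True)
        self.classifier = nn.Linear(post_hidden, num_classes)

    def forward(self, thread):
        # thread: LongTensor of shape (num_posts, max_words) for one thread
        post_vectors = []
        for post in thread:                               # encode each post
            emb = self.embedding(post.unsqueeze(0))       # (1, words, emb_dim)
            _, (h, _) = self.word_lstm(emb)               # h: (1, 1, word_hidden)
            post_vectors.append(h.squeeze(0))             # (1, word_hidden)
        posts = torch.stack(post_vectors, dim=1)          # (1, num_posts, word_hidden)
        out, _ = self.post_lstm(posts)                    # (1, num_posts, post_hidden)
        return self.classifier(out.squeeze(0))            # per-post class scores

# Example: a thread of 4 posts, each padded to 12 token ids
model = HierarchicalLSTM(vocab_size=5000)
thread = torch.randint(0, 5000, (4, 12))
logits = model(thread)   # shape (4, 3)
```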