Title: Comprehending Text Meaning through Similarity
Author: Adeel Ahmed, Imran Amin, Muhammad Mubashir Khan
Citation: Vol. 24, No. 12, pp. 39-46
Abstract:
This paper addresses how natural language processing (NLP) works with deep learning models to understand the meaning of words in text. Vector space models that represent words as continuous vectors are employed to identify semantic and syntactic similarity between words in text articles. The model is trained and evaluated on unlabeled news articles [29], [30], and is implemented with the continuous bag-of-words (CBOW) and skip-gram (SG) architectures using the negative sampling (NES) and hierarchical softmax (HS) techniques. The model is evaluated on word similarity tasks, analogy tasks, and vector compositionality to identify the linear structure of the word vector representations. Computationally, the training time and memory requirements of the two architectures trained with the two techniques are compared; architectures trained with HS are observed to be more expensive to train and more memory intensive than those trained with NES. Finally, the findings of the evaluations on the different tasks are presented, showing both semantic and syntactic regularities in the word embeddings (an illustrative training sketch follows this record).
Keywords: word similarity, deep learning, unstructured text, natural language processing
URL: http://paper.ijcsns.org/07_book/202412/20241205.pdf
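The abstract describes training word embeddings with the CBOW and skip-gram architectures under negative sampling and hierarchical softmax, then probing them with word similarity and analogy queries. The sketch below shows one way such a setup could look using the gensim library; the toy corpus, hyperparameter values, and example words are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch: train {CBOW, skip-gram} x {hierarchical softmax, negative sampling}
# embeddings and probe them with similarity/analogy queries. Corpus, hyperparameters,
# and query words are illustrative assumptions, not the paper's actual setup.
from gensim.models import Word2Vec

# Placeholder corpus: tokenized sentences (the paper uses unlabeled news articles).
sentences = [
    ["stocks", "rose", "after", "the", "bank", "raised", "interest", "rates"],
    ["the", "bank", "reported", "strong", "quarterly", "earnings"],
    ["the", "king", "addressed", "the", "nation", "on", "television"],
]

# Four configurations corresponding to the architectures/techniques in the abstract.
configs = {
    "cbow_hs":  dict(sg=0, hs=1, negative=0),   # CBOW + hierarchical softmax
    "cbow_nes": dict(sg=0, hs=0, negative=5),   # CBOW + negative sampling
    "sg_hs":    dict(sg=1, hs=1, negative=0),   # skip-gram + hierarchical softmax
    "sg_nes":   dict(sg=1, hs=0, negative=5),   # skip-gram + negative sampling
}

models = {}
for name, params in configs.items():
    models[name] = Word2Vec(
        sentences,
        vector_size=100,   # dimensionality of the word vectors (assumed value)
        window=5,          # context window size (assumed value)
        min_count=1,       # keep every word in this tiny toy corpus
        workers=4,
        epochs=10,
        **params,
    )

# Word similarity: cosine similarity between two word vectors.
wv = models["sg_nes"].wv
print(wv.similarity("bank", "earnings"))

# Analogy / vector compositionality, e.g. king - man + woman, would be queried as
# below on a corpus large enough to contain all three words:
# print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```

Comparing wall-clock training time and process memory across the four models in `configs` mirrors the computational comparison the abstract reports between HS- and NES-trained architectures.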