Why aren’t you using pretrained models?
To show how simple this can be, let's build a semantic search function useful to anyone tasked with writing text. To find better words and expressions, you normally have to think of a word, look it up, and then chase references to explore the possibilities. What if, in addition to this forward search, a computer could look in the other direction?

The code is worth a look even if you are not a programmer: about 16 lines of Python suffice to load the data, run it through a neural network, index it, and start searching. With the Webster-Vectors in memory, we can query this dataset by encoding a search phrase into a query vector. To search for words, we compare how close the query is to the vectors in the dataset; all that is left to do is return the words associated with these nearest neighbors.
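As a rough sketch of the search step, the snippet below indexes a handful of invented dictionary entries and ranks them against a query by cosine similarity. The `encode` stub is a placeholder standing in for a real pretrained encoder (for instance one loaded via the sentence-transformers library); the entries and all names here are illustrative assumptions, not the actual Webster data.

```python
import numpy as np

# Placeholder for a pretrained text encoder; a real version would map a
# phrase to a dense vector produced by a neural network. This stub just
# returns a deterministic unit-length random vector per phrase.
def encode(phrase, dim=8):
    rng = np.random.default_rng(abs(hash(phrase)) % (2**32))
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

# Invented sample entries: word -> definition.
definitions = {
    "luminous": "emitting or reflecting light; shining",
    "opaque": "not able to be seen through",
    "translucent": "allowing light to pass through diffusely",
}

words = list(definitions)
# Index: one vector per definition, stacked into a matrix.
vectors = np.stack([encode(d) for d in definitions.values()])

def search(query, k=2):
    q = encode(query)
    scores = vectors @ q            # cosine similarity (all vectors unit-norm)
    best = np.argsort(-scores)[:k]  # indices of the k closest definitions
    return [words[i] for i in best]
```

With a real encoder in place of the stub, `search("gives off light")` would rank words whose definitions are semantically close to the phrase, which is exactly the backward lookup described above.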