Word Embedding

WordEmbedding is a tf.keras.layers.Embedding layer initialized with pre-trained Word2Vec/GloVe embedding weights.

kashgari.embeddings.WordEmbedding.__init__(self, w2v_path: str, *, w2v_kwargs: Dict[str, Any] = None, **kwargs)
Parameters:
  • w2v_path – path to the pre-trained Word2Vec file.
  • w2v_kwargs – parameters passed to the load_word2vec_format() function of gensim.models.KeyedVectors.
  • kwargs – additional keyword arguments.
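
A minimal usage sketch: the file path is hypothetical, and the w2v_kwargs shown (binary, limit) are standard options of gensim's load_word2vec_format() chosen here only for illustration.

    from kashgari.embeddings import WordEmbedding

    # Hypothetical path to a word2vec-format text file with pre-trained vectors.
    embedding = WordEmbedding(
        w2v_path="/path/to/word_vectors.txt",
        # Forwarded to gensim.models.KeyedVectors.load_word2vec_format():
        # binary=False reads the text format, limit caps how many vectors are loaded.
        w2v_kwargs={"binary": False, "limit": 10000},
    )

The resulting embedding object is typically passed to a Kashgari task model in place of a randomly initialized embedding.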