questionet
RWKV (main text)
Paper: https://arxiv.org/pdf/2305.13048.pdf

RWKV Language Model (www.rwkv.com)
RWKV GUI with one-click install and API for v4/v5/v6; official RWKV pip package for v4/v5/v6; fast GPU inference server (NVIDIA/AMD/Intel) for v4/v5/v6; fast CPU/cuBLAS/CLBlast inference with int4/int8/fp16/fp32 for v4/v5; simple training on any GPU/CPU.
RWKV Wiki (wiki.rwkv.com)
RWKV (pronounced "RwaKuv") is an RNN with GPT-level LLM performance that can also be trained directly like a GPT transformer (parallelizable). RWKV is an open-source, non-profit project under the Linux Foundation.
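The description above says RWKV is an RNN that can nonetheless be trained like a transformer. The core of the architecture (per the linked paper) is the WKV time-mixing recurrence, which replaces attention with a decayed running average over past values. Below is a minimal NumPy sketch of the sequential form of that recurrence; the function and variable names are illustrative, not from any RWKV library, and real implementations add a max-shift trick for numerical stability.

```python
import numpy as np

def wkv_recurrence(k, v, w, u):
    """Sequential WKV computation (RWKV-4 style recurrence).

    k, v : (T, C) key and value sequences
    w    : (C,) per-channel decay, applied as e^{-w} per step
    u    : (C,) per-channel "bonus" applied only to the current token
    Returns a (T, C) array of WKV outputs.
    """
    T, C = k.shape
    a = np.zeros(C)           # running numerator:   decayed sum of e^{k_i} v_i
    b = np.zeros(C)           # running denominator: decayed sum of e^{k_i}
    out = np.zeros((T, C))
    for t in range(T):
        e_k = np.exp(k[t])
        e_uk = np.exp(u + k[t])
        # current output mixes the past state with the bonus-weighted current token
        out[t] = (a + e_uk * v[t]) / (b + e_uk)
        # update state with per-channel exponential decay
        a = np.exp(-w) * a + e_k * v[t]
        b = np.exp(-w) * b + e_k
    return out
```

Because the state `(a, b)` is constant-size, inference runs in O(1) memory per token like an RNN, while training can unroll the same computation over the whole sequence in parallel like a transformer.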
Introducing RWKV - An RNN with the advantages of a transformer (huggingface.co)
https://huggingface.co/blog/rwkv
ChatGPT and chatbot-powered applications have captured significant attention in the Natural Language Processing (NLP) domain, and the community is constantly seeking strong, reliable, open-source alternatives.