Contributing support for 120b
#15
by
WyattTheSkid
- opened
Hey, I saw that you said you would consider doing the 120b model if there's enough demand. I'm making this thread because I use the 120b model almost every day. It can be run well on two 3090s with 10 layers offloaded to system RAM, which is not an uncommon or unobtainable setup for a normal person. Anyway, the 120b model is incredible for its size, but it suffers from almost lobotomizing censorship, which your efforts would solve, making the model that much more useful.