You are viewing a single comment's thread from:

RE: Open-Source LLMs: How good is Vicuna-13b?

in STEMGeeks last year

I agree, those results are pretty good considering the model size. Definitely not as good as the commercial versions, but it also doesn't have the massive computing power available to it. It will be interesting to see how much it will improve over the next six to twelve months.

Great post, it was really interesting!
!DHEDGE


Thx! Also, Vicuna is quite an old model by now, but it was one of the best when it was released in March 2023. I'll test newer models based on Llama 2 soon and share the results here.

That will be interesting! Can't wait!