What's the best LLM for consumer-grade hardware right now? Is it phi-4?

14 points | by VladVladikoff | 6 days ago
I have a 5060ti with 16GB VRAM. I’m looking for a model that can hold basic conversations, no physics or advanced math required. Ideally something that can run reasonably fast, near real time.
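As a rough sanity check for the 16GB constraint (this math is not from the post, just a common back-of-the-envelope rule), a model's weight footprint is roughly parameter count times bits per weight, plus some allowance for the KV cache and activations. The overhead figure below is an assumption, not a measurement:

```python
def vram_estimate_gb(n_params_b: float, bits_per_weight: int, overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate in GB: quantized weights plus a flat
    allowance (assumed, not measured) for KV cache and activations."""
    weights_gb = n_params_b * bits_per_weight / 8  # billions of params * bytes/weight
    return weights_gb + overhead_gb

# phi-4 has ~14B parameters; at 4-bit quantization the weights are ~7 GB,
# which leaves comfortable headroom on a 16 GB card:
print(round(vram_estimate_gb(14, 4), 1))   # 9.0

# The same model at full 16-bit precision would not fit:
print(round(vram_estimate_gb(14, 16), 1))  # 30.0
```

By this estimate, a 14B model only fits in 16GB of VRAM when quantized to around 4 bits; unquantized 14B weights alone exceed the card's memory.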