Ask HN: Is a consumer AI appliance a viable idea?

Posted by spprashant, 13 days ago
I suspect at some point LLMs in their current form will be deemed good enough for general research and coding tasks. I don't get why we need to continue with a de-facto cloud-based approach. Cloud, in my opinion, solves operational complexity, which is worth paying a premium for. But it isn't all that complex to get an open-source model running locally, as long as you have the hardware. Over time I suspect the models will get better and cheaper.

Is there a future where we can expect people to just buy "AI" from Best Buy, like a TV set? It'll probably come with some model preloaded: cheaper if open-source, premium pricing for frontier-lab models. The hardware is basically a bunch of GPUs, enough for local inference.

Take it home, plug it into your home network, and you can open a chat instance by going to its IP from any local device. You can give it access to the internet if you want. Maybe it can also receive OTA updates.

Curious how others think about this: does local-first AI feel like a possibility? What are the economic and social challenges with this?
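For what it's worth, the "plug it in and talk to it over the LAN" flow already has a de-facto shape: most local runtimes (llama.cpp's server, Ollama, LM Studio) expose an OpenAI-compatible HTTP API. A minimal client sketch, assuming a hypothetical appliance at 192.168.1.50 serving `/v1/chat/completions` on port 8080 (the IP, port, and model name here are illustrative, not a real product):

```python
import json
import urllib.request

def build_chat_request(host: str, prompt: str, model: str = "local-model"):
    """Build an OpenAI-compatible chat request for a local inference box.

    `host` is the appliance's LAN IP. The /v1/chat/completions path is the
    de-facto endpoint many local runtimes expose; port 8080 is an assumption.
    """
    url = f"http://{host}:8080/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload).encode("utf-8")

def ask(host: str, prompt: str) -> str:
    """Send a prompt to the appliance and return the assistant's reply text."""
    url, body = build_chat_request(host, prompt)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    return reply["choices"][0]["message"]["content"]

# Example usage (hypothetical LAN address for the appliance):
# print(ask("192.168.1.50", "Summarize the latest items in my RSS feed."))
```

So the software side of the box is arguably a solved problem; the open questions are the hardware economics and whether households will pay the up-front cost instead of a subscription.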