Chip makers like Micron surely don't like DeepSeek; someone on YouTube shared his impressions
Reply: The beginning of the end for Silicon Valley's AI hegemony and mythology: a brief look at the significance of DeepSeek R1, by nohup, 2025-01-22 15:19
DeepSeek doesn't need powerful hardware: "I am running the deepseek-r1:14b-qwen-distill-q8_0 variant locally (Ollama on Kubuntu 24.04) on my cheap ASRock DeskMeet X600 PC without a dedicated GPU. My AMD Ryzen 5 8600G has only 16 TOPS and a 65-watt power limit. I have 64 GB of RAM, which can be FULLY USED for inference. Inference is slow: highly complex tasks (prompts) run for up to 5 minutes, but even writing a well-structured prompt takes me more time, and the result saves me hours of work. The PC supports up to 128 GB of RAM, so running a 70B model should work perfectly when time is no issue. Thanks to the low power consumption there are no heat problems. So you trade speed for unlimited model size; for me that is the perfect solution, especially considering that this is a sub-$1,000 setup. Smaller models naturally run much faster!"
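The commenter's memory arithmetic roughly checks out. A minimal sketch, assuming the GGUF q8_0 format used by Ollama/llama.cpp stores weights in 32-element blocks of 32 int8 values plus one fp16 scale, about 8.5 bits (1.0625 bytes) per parameter; this overhead figure is an assumption about the format, not something stated in the post, and it counts weights only, ignoring KV cache and context memory:

```python
# Rough weight-memory estimate for q8_0-quantized models on CPU.
# Assumption: q8_0 costs ~34 bytes per 32 weights (int8 values + fp16 scale),
# i.e. ~1.0625 bytes per parameter. Activations and KV cache add more on top.

BYTES_PER_PARAM_Q8_0 = 34 / 32  # ~8.5 bits per weight

def q8_0_size_gb(params_billion: float) -> float:
    """Approximate weight memory in GB for a q8_0-quantized model."""
    return params_billion * BYTES_PER_PARAM_Q8_0

for n in (14, 70):
    print(f"{n}B model at q8_0: ~{q8_0_size_gb(n):.1f} GB of weights")
# 14B -> ~14.9 GB, fits comfortably in 64 GB of RAM;
# 70B -> ~74.4 GB, which needs the 128 GB upgrade the commenter mentions.
```

This is consistent with the commenter's plan: a 70B model at q8_0 will not fit in 64 GB, but leaves headroom in 128 GB.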