—————-
prenwxc commented on 2025-01-29 06:00:00
It is "distill," not "distall." "OpenAI found evidence of “distillation,” which it believes came from DeepSeek. Distillation is a process where AI firms use an already trained large AI model to train smaller models. The “student” models will match similar results to the “teacher” AI in specific tasks."
------------------------ In reply to comment by 令胡冲 -----------
DeepSeek did not distill ChatGPT. Period. It could not distill it even if it wanted to: OpenAI has not open-sourced any of its models or data over the past four years. DeepSeek can only distill its own models and use them to fine-tune other, smaller models.
-----------------
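The distillation mechanism quoted above (a trained "teacher" model supervising a smaller "student") is usually implemented as a soft-label loss: the student is trained to match the teacher's output distribution rather than only the hard labels. A minimal sketch follows, assuming a PyTorch setup; the temperature, loss weighting, and toy tensors are illustrative placeholders, not anyone's actual training recipe.

```python
# Minimal sketch of logit-level knowledge distillation (hypothetical setup).
# The student is trained to match the teacher's softened output distribution
# via KL divergence, mixed with the ordinary cross-entropy loss on hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soften both distributions with the temperature, then compare them.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean")
    kd = kd * (temperature ** 2)  # standard scaling from Hinton et al.
    # Ordinary supervised loss on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Toy usage: random logits standing in for real teacher/student forward passes.
if __name__ == "__main__":
    batch, vocab = 4, 10
    teacher_logits = torch.randn(batch, vocab)        # frozen teacher output
    student_logits = torch.randn(batch, vocab, requires_grad=True)
    labels = torch.randint(0, vocab, (batch,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(loss.item())
```

The temperature softens both distributions so the student also learns from the teacher's relative ranking of the wrong answers, which plain cross-entropy on hard labels would discard.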
——————-
prenwxc commented on 2025-01-29 05:09:28
DeepSeek itself is built on top of open-source LLMs; OpenAI, Meta, and others are all contributors. DeepSeek used distillation to learn from other models, and used other models in reinforcement learning. People have already tried asking DeepSeek who it is ("who are you"), and it answered that it is 100% Microsoft. That shows how deep the influence of Copilot/ChatGPT runs.
In reply to comment by 令胡冲 -----------
It collects datasets from everywhere. The GPT-generated dataset is only a small part - no different f...
-----------------
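The quoted reply above mentions a "GPT-generated dataset." Whether or not that claim is accurate, the technique it refers to (learning from another model's outputs rather than its weights) is commonly done as black-box distillation: sample the teacher's responses to a prompt set, then fine-tune the student on the resulting pairs. The sketch below is purely illustrative; query_teacher, the file name, and the prompts are hypothetical placeholders, not anything from the source.

```python
# Sketch of "black-box" distillation through synthetic data: collect a
# teacher's answers to a prompt set, then use the (prompt, answer) pairs for
# supervised fine-tuning of a smaller student model.
import json

def query_teacher(prompt: str) -> str:
    # Hypothetical placeholder; a real pipeline would call a teacher model
    # or API here and return its generated text.
    return f"[teacher answer to: {prompt}]"

def build_distillation_dataset(prompts, path="distill_data.jsonl"):
    """Write teacher responses to a JSONL file usable for fine-tuning."""
    with open(path, "w", encoding="utf-8") as f:
        for prompt in prompts:
            record = {"prompt": prompt, "response": query_teacher(prompt)}
            f.write(json.dumps(record, ensure_ascii=False) + "\n")

if __name__ == "__main__":
    build_distillation_dataset(["Who are you?", "Explain model distillation."])
```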