This is the most likely reality according to my AI friends.
It's very cheap to clone any existing chatbot by simply using it to generate training data.
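The cloning technique being described is usually called distillation: you query the existing chatbot (the "teacher") and save its answers as supervised fine-tuning data for a new model. A minimal sketch of the data-collection step, where `query_teacher` is a hypothetical stand-in for a real chat-API call (not any specific vendor's API):

```python
import json

def query_teacher(prompt: str) -> str:
    # Hypothetical stub: in practice this would call the teacher
    # model's chat API and return its generated reply.
    canned = {
        "What AI are you?": "I am a large language model.",
    }
    return canned.get(prompt, "(teacher reply)")

def build_distillation_set(prompts):
    # Each (prompt, teacher answer) pair becomes one fine-tuning
    # example for the student model.
    return [{"prompt": p, "completion": query_teacher(p)} for p in prompts]

if __name__ == "__main__":
    data = build_distillation_set(["What AI are you?"])
    print(json.dumps(data[0]))
```

This is also why a cloned model can "leak" its teacher's identity: if the teacher's self-descriptions end up in the training pairs, the student learns to repeat them.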
Evidently if you ask DeepSeek what AI it is... it typically says it's one of the other top AIs, which @rekcahdam tells me is normally a sign of model cloning.
Anyways, this is all beside the point. DeepSeek isn't the most interesting AI project right now . . .
front-end.social/@mxbck/113910…
Max Böck (@mxbck@front-end.social): "OpenAI now claims that Deepseek used its model to train their competitor. Huge if true"
nullagent
In reply to nullagent • The real matchup is DeepSeek vs Sky-T1. Forget #OpenAI.
DeepSeek spent millions either cloning existing AIs or training their own. The resulting AI is big and can't run on consumer hardware at decent quality (i.e., it's a pretty bad coder there). But it's otherwise similar to OpenAI's o1.
Meanwhile, a lab at Berkeley just open-sourced their model and all of its training data. It costs only $450 to reproduce the Berkeley team's novel training approach 🤯
And that's not all:
novasky-ai.github.io/posts/sky…
#SkyT1 #DeepSeek
Sky-T1: Train your own O1 preview model within $450
novasky-ai.github.io