There's a lot of exaggeration in these claims. Given the lack of transparency around spending in China, DeepSeek almost certainly spent far more than the widely quoted ~$5.6M to train V3 (the base model that R1 was later built on top of). That figure covers only the final training run. It excludes other cost centers entirely: prior research, the cost of innovation and development, and the many iterations of testing and tweaking that precede a successful run. Such costs are routinely ignored in these accountings, because labor, brainpower, and intellectual contribution are conveniently dismissed as insignificant. Most of these cost claims count only hardware, materials, and accessory purchases; space, power, time, labor, and IP are treated as negligible or left out altogether.

A more realistic total to train such a model is probably on the order of $69M, not under $6.9M. That is about two thirds of the ~$100M that OpenAI reportedly spent going from GPT-3 to GPT-4. The American figure is inflated by labor, IP and patent preparation and filing, on top of the usual data-center costs of hardware, space, and power, plus the cost of acquiring training data and the time required for iteration, regression testing, and verification.

In short, treat headline training-cost claims coming out of the PRC with deep skepticism.
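The gap between the headline number and a fully loaded cost can be sketched with simple arithmetic. The GPU-hour count and rental rate below match DeepSeek's own published figures for the V3 final run (2.788M H800 GPU-hours at $2/hour); the overhead multipliers are purely illustrative assumptions, not reported data:

```python
# Back-of-the-envelope cost model. The GPU-hour figure and rate match
# DeepSeek's published numbers for the final V3 training run; every
# overhead multiplier below is an illustrative assumption.

GPU_HOURS = 2_788_000        # H800 GPU-hours reported for the final V3 run
RATE_PER_GPU_HOUR = 2.00     # reported rental rate, USD per GPU-hour

headline_cost = GPU_HOURS * RATE_PER_GPU_HOUR  # the widely quoted ~$5.6M

# Hypothetical multipliers (relative to the headline cost) for cost
# centers the headline number excludes:
overheads = {
    "failed/ablation runs": 3.0,   # full-scale iterations before the final run
    "research staff": 1.5,         # salaries for researchers and engineers
    "data acquisition": 0.5,       # collecting and cleaning training data
    "facilities and power": 0.5,   # data-center space, cooling, electricity
}

fully_loaded = headline_cost * (1 + sum(overheads.values()))
print(f"headline:     ${headline_cost / 1e6:.1f}M")
print(f"fully loaded: ${fully_loaded / 1e6:.1f}M")
```

Even these modest assumed multipliers push the total well past $30M; overheads totaling roughly 11x the headline figure would land near the ~$69M estimate above. The point is not any particular multiplier but that the excluded terms dominate the reported one.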