Chinese AI startup DeepSeek is making waves with its claim of a theoretical cost-profit ratio of 545% per day for its AI models V3 and R1. In other words, for every dollar spent on inference, DeepSeek says it could theoretically earn $5.45 in profit.
To put this into perspective, DeepSeek's daily inference cost for its V3 and R1 models is around $87,072, while its theoretical daily revenue is a whopping $562,027. This translates to a potential annual revenue of over $200 million.
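For readers who want to check the arithmetic, here is a minimal sketch using only the figures cited above; the 365-day annualization is an assumption for illustration.

```python
# Sanity check of DeepSeek's reported figures (theoretical claims, per the article above).
daily_cost = 87_072       # reported daily inference cost for V3/R1, in USD
daily_revenue = 562_027   # reported theoretical daily revenue, in USD

daily_profit = daily_revenue - daily_cost
cost_profit_ratio = daily_profit / daily_cost   # ~5.45, i.e. ~545%
annual_revenue = daily_revenue * 365            # assumes the same rate every day of the year

print(f"Cost-profit ratio: {cost_profit_ratio:.2%}")           # ≈ 545%
print(f"Theoretical annual revenue: ${annual_revenue:,.0f}")   # ≈ $205 million
```

The numbers line up: profit of roughly $475,000 per day against $87,072 in costs gives the 545% ratio, and annualizing the theoretical daily revenue lands just above $200 million.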
What's interesting is that DeepSeek's AI models run on Nvidia's H800 chips, which are reportedly less powerful than those used by US-based AI firms like OpenAI. Despite that hardware disadvantage, the claimed economics have raised eyebrows, with investors questioning the massive spending on cutting-edge chips by US AI firms.
However, it's worth noting that DeepSeek's actual revenue is likely to be significantly lower than this theoretical projection, since its V3 model is priced lower than R1 and developers pay discounted rates during off-peak hours.