DEEPSEEK IS GREAT FOR CREATORS
There's a golden age coming round.
Hi, welcome to Inside the Creator Economy newsletter! You are receiving this because you attended one of my meetups at Web Summit, VidCon, Creator World or 1 Billion Followers Summit - thanks for attending!
Or you’re one of our regular Beehiiv subscribers - if so THANKS for being part of our Beehiiv audience.
The newsletter comes out weekly every Monday - but this is a special edition about DeepSeek. You’ll start receiving the weekly newsletter tomorrow.
We also distribute on LinkedIn - if you subscribe there, thanks - but I’d prefer you subscribe here rather than there (or both - but they are exactly the same!)
You can unsubscribe here - but please give it a chance first!
And if you like it, tell all your friends!
Either way, hope to see you at a meetup soon, at NAB in Las Vegas, at VidCon, Open Sauce or Web Summit. And probably more to come too.
- Jim
Special Weekend Edition of Inside the Creator Economy focused 100% on the new DeepSeek AI model and why it’s good for creators. The regular Monday edition will be out – as always – on Monday.
Unless you hold stock in Nvidia or have invested in frontier AI companies lacking strong application adoption, you should welcome the generative AI advances from DeepSeek. Why? Because so many useful creator tools now use AI to power their breakthrough features - and that adds significant cost.
For example, the AI-heavy Spotter Studio costs $100 a month (currently half-off in year 1). Competitor VidIQ, with a more limited AI toolset, costs about $17 a month. Video editor Descript includes an AI “Underlord”, but with only 20 uses per month at the base subscription rate, it’s mostly useless.
Other tools, like Opus Clip, Auggie, Rask and more, charge based on usage – whether tokens, minutes or other forms of pay-as-you-go pricing. Most offload their generative AI processing to OpenAI, Anthropic or other external GenAI models via token-based API calls, sending queries and receiving responses back via an exchange of “tokens”. Think of a token as a word – although longer words break down into multiple tokens.
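To see how usage-based pricing adds up, here’s a minimal sketch of the arithmetic behind a token-billed API call. The dollar figures are hypothetical placeholders, not any vendor’s actual rates:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 in_price_per_m: float, out_price_per_m: float) -> float:
    """Cost in dollars of one API call, given per-million-token prices
    for input (the prompt you send) and output (the reply you get)."""
    return (input_tokens / 1_000_000) * in_price_per_m \
         + (output_tokens / 1_000_000) * out_price_per_m

# A 2,000-token prompt with a 500-token reply, at a hypothetical
# $2.50 in / $10.00 out per million tokens, costs about a penny --
# multiply by thousands of users making dozens of calls a day and
# you can see why these tools get expensive.
print(round(request_cost(2_000, 500, 2.50, 10.00), 4))
```

Cut the per-token price by 5x-10x and that whole cost structure collapses, which is the point of everything that follows.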
Those API calls are expensive and add up fast, which is why the more you use those AI-enabled products the more you pay. Most small and mid-sized creators get priced out - exactly the ones that arguably need them the most.
But DeepSeek’s breakthrough has already started to upend pricing. Their API pricing is 5x-39x cheaper than the supposedly comparable OpenAI models and a whopping 180x-230x cheaper than OpenAI's latest model. Like for like, creator tool companies could save 5-7x or more by rerouting their external API calls from OpenAI to DeepSeek today. It’s not a simple plug-and-play swap, but it’s not rocket science either.
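Why is rerouting plausible at all? DeepSeek exposes an OpenAI-compatible chat API, so for many tools the switch is mostly a base-URL and model-name change rather than a rewrite. A minimal sketch of the idea (the provider table and default model names here are illustrative; check each vendor’s docs before relying on them):

```python
# Connection settings an OpenAI-style SDK client would take.
# DeepSeek's endpoint follows the OpenAI API format, so the same
# client code can point at either provider.
PROVIDERS = {
    "openai":   {"base_url": "https://api.openai.com/v1", "model": "gpt-4o-mini"},
    "deepseek": {"base_url": "https://api.deepseek.com",  "model": "deepseek-chat"},
}

def client_config(provider: str) -> dict:
    """Look up the base URL and default model for a provider."""
    return PROVIDERS[provider]

# With the official openai SDK the swap would look roughly like:
#   client = OpenAI(api_key=KEY,
#                   base_url=client_config("deepseek")["base_url"])
print(client_config("deepseek")["model"])
```

The “not plug-and-play” part is everything around the call: prompt tuning, output-quality testing, rate limits and data-handling policies all differ between providers.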
That’s a ridiculous price drop. Moore’s law had computer CPUs doubling in speed for the same price every two years. It carried us from the room-sized computers of the 60s to that candy-bar shaped super-computer in your pocket.
We’ve just seen the effective cost of using AI drop by a factor of 10x or more in one week. If it continues, Moore’s Law will quickly look like a geologic anachronism.
That sound you heard among AI-enabled companies serving creators was a collective sigh of relief. Expect prices to drop – or tiers to collapse – quickly, as the smarter AI-enabled tools offer advanced services for a fraction of what slower and less nimble competitors can.
It’s already happening - Microsoft and Perplexity have already added DeepSeek’s R1 model into their services.
A few caveats. First, some believe that the Chinese government is subsidizing DeepSeek – and it definitely cost way more than $6M to train their models.
Second, sending queries to DeepSeek via their app or API means that a Chinese company now owns your data – which raises the same security and privacy issues the US has with TikTok. I’m not willing to use their chat app, but YMMV.
It also appears that DeepSeek used a process called “distillation” to copy training data from OpenAI, Anthropic and others. Distillation is common in the AI industry, and OpenAI, at least, has been accused of similarly sketchy model-training transgressions.
Even if you eschew DeepSeek, its models are mostly open source. That means other models will quickly adopt its training and/or pricing strategies. That will bring safer AI models much closer to DeepSeek’s outrageously low pricing with similar capability.
Meta, for example, is already hard at work replicating DeepSeek’s breakthroughs. On Thursday OpenAI rushed out its new o3-mini model, which adds reasoning at a much lower API cost – still 5-10x more expensive than DeepSeek-R1 (which also includes reasoning) – but way less than its standard o1 model.
(story continues below…)
SPONSOR: The NFL and YouTube have launched Access Pass for Legends, giving NFL greats access to official league footage for their YouTube content. Kicking off the initiative, Brandon Marshall breaks down his career highlights, with Cam Newton and J.T. O’Sullivan joining soon. This program opens new doors for athlete-driven media. Read more in Fast Company: https://www.fastcompany.com/91267698/theres-about-to-be-a-whole-new-generation-of-nfl-creators
Some AI researchers insist that DeepSeek’s training and pricing breakthroughs were predictable and expected. Either way, within a few months we’ll see dramatically more capability available to creator economy companies for significantly less money. And that’s good for everyone.
Of course, it also accelerates the already rampant AI slop invasion - and brings closer the day when custom AI content can be tailored individually to every internet user on the planet. But we’re not there yet.
Full disclosure, I asked ChatGPT to help with this analysis. It first feigned ignorance, saying, “I’m not aware of any publicly available, detailed pricing information for an API called “DeepSeek.” But after I shared their public API pricing, which can be found here (here's OpenAI’s API pricing too), it grudgingly concluded that “DeepSeek is by far the cheapest in terms of pure token pricing—and still offers a large 64K context window. If its output quality meets your needs, it’s extremely cost‐effective.” Note that a 64K context window isn’t enough for video and audio – but works great with text.
It then warned, “Always test actual model performance for your use case; cost savings can be undercut if the cheaper model requires more calls or yields lower‐quality results.” It’s right. But with a 5-10x cost savings, it would have to fail miserably to shift the narrative.
For a deeper look at DeepSeek, check out Trung Phan’s extensive analysis and meme farm.
Foundation VC Ashu Garg has a good piece on how investment and company formation will change.
Alibaba released its own updated Qwen model, claiming it beats OpenAI and DeepSeek, but it’s still much more expensive than DeepSeek – and almost 2x OpenAI’s o3-mini model.
Want to build a DeepSeek-R1 locally? Here’s a complete $6k PC Build and installation instructions. The most expensive part? A whopping 768GB of RAM.
Thanks to Paul Robert Cary for his suggestions as I refined this piece!
100% written by me – no human or AI ghostwriters were involved in the production (except for the cover art!).
Like this free newsletter? Buy me a coffee and say thanks! Or let’s do a meetup in your town.
Don’t miss another issue! Get this newsletter delivered in full to your inbox every Monday!
I've built and sold multiple creator economy startups to top media companies - including Discovery and Paramount. Subscribe here on LinkedIn to get this newsletter every Monday.
Let me know what you think – email me at [email protected]. Thanks for reading and see you around the internet.