Chinese AI lab DeepSeek just dropped their most ambitious creation yet. The DeepSeek V3 model tips the scales at a staggering 641GB, pushing the boundaries of what consumer hardware can handle.
MLX developer Awni Hannun wasted no time. Within hours of release, he had the model purring along at over 20 tokens per second. The catch? You'll need Apple's M3 Ultra Mac Studio - a $9,499 piece of "consumer" hardware that costs more than some cars.
The model marks a significant shift in DeepSeek's approach. They've abandoned their previous custom license in favor of the MIT license, opening the door for broader experimentation and development. The empty README suggests they're letting the code speak for itself.
For those watching their storage space, there's hope. A 4-bit quantization reduces the model to a mere 352GB - still massive, but more manageable for serious enthusiasts. The files come split into 163 chunks, each a hefty piece of the AI puzzle.
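The 352GB figure tracks with back-of-the-envelope arithmetic. A rough sketch, assuming a parameter count of about 685 billion (an assumption for this estimate; the exact count and quantization overhead vary):

```python
def approx_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough checkpoint size: parameter count times bits per weight, in GB."""
    total_bits = params_billions * 1e9 * bits_per_weight
    return total_bits / 8 / 1e9  # bits -> bytes -> gigabytes

# At 8 bits per weight a ~685B-parameter model lands near the full release size;
# at 4 bits it drops to roughly half, in the ballpark of the 352GB figure.
print(approx_size_gb(685, 8))  # 685.0
print(approx_size_gb(685, 4))  # 342.5
```

Real quantized checkpoints run a little larger than the naive estimate because scale factors and some layers stay at higher precision.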
OpenRouter integration brings accessibility to those without deep pockets. Free API keys work smoothly, though some users report mysterious "policy" errors when trying to use paid keys with certain privacy settings.
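OpenRouter exposes models through an OpenAI-compatible chat endpoint, so a request is just a bearer token plus a small JSON payload. A minimal sketch; the `deepseek/deepseek-chat-v3-0324:free` slug is an assumption based on OpenRouter's naming scheme, and actually sending the request needs a real key:

```python
import json

def build_chat_request(api_key: str, prompt: str) -> tuple[dict, dict]:
    """Build headers and payload for an OpenAI-compatible chat completions call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        # Assumed model slug; the ":free" suffix selects the no-cost tier.
        "model": "deepseek/deepseek-chat-v3-0324:free",
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, payload

headers, payload = build_chat_request("sk-or-...", "Tell me a fact about pelicans.")
print(json.dumps(payload, indent=2))
# POST to https://openrouter.ai/api/v1/chat/completions with these headers.
```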
DeepSeek's decision to bake the release date into the model name (DeepSeek-V3-0324) suggests confidence in their rapid development cycle. They're not just building models; they're marking their territory in the AI timeline.
The model handles everything from basic chat functions to creative tasks. It can even generate SVG images of pelicans riding bicycles - though whether that justifies the 641GB footprint remains debatable.
For developers ready to dive in, the setup process is straightforward. A few command-line instructions get you rolling with the llm-mlx plugin, assuming your hardware can handle it. The OpenRouter plugin offers an alternative path, complete with API key management.
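The two paths look roughly like this with the `llm` CLI; the mlx-community model name and OpenRouter slug below are assumptions based on each project's naming conventions, and the local download weighs in around 352GB:

```shell
# Local route: install the MLX plugin, pull the 4-bit conversion, and chat.
llm install llm-mlx
llm mlx download-model mlx-community/DeepSeek-V3-0324-4bit
llm chat -m mlx-community/DeepSeek-V3-0324-4bit

# Hosted route: the OpenRouter plugin handles API key management.
llm install llm-openrouter
llm keys set openrouter
llm -m openrouter/deepseek/deepseek-chat-v3-0324:free 'Tell me a pelican fact'
```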
The pelican-drawing capabilities might seem frivolous, but they demonstrate the model's versatility. When prompted about pelican facts, it delivers detailed, structured responses complete with markdown formatting and emoji flair.