Making almost exactly $500/mo on an Anki extension that embeds AI / text-to-speech / image generation deeply into the app, allowing you to generate example sentences, audio, explanations, etc., for whatever you’re studying, in bulk.
Still holding off on the Show HN post for now; I have a few more features and QoL things I’d like to add first.
It’s been an enormously gratifying project and I hear from users all around the world who have feature requests for their specific use cases. Easily the most fun I’ve had working on a project.
Almost all of my customers so far have come directly from the central Anki add-on directory. I made sure to use lots of SEO-friendly terms / buzzwords in the title so that when people Ctrl+F for AI or ChatGPT, they find mine.
I think my next steps are to better incentivize leaving reviews so that it ranks higher on the add-on list, and then launch it on various language learning subreddits. There’s a whole cottage industry of Anki influencers on YouTube as well (absurd, I know), so that’s another channel eventually.
Frankly I think it’s the opposite - Apple is one of the only BigCos without an advertising-based business model. Unlike, say, Meta, Apple doesn’t profit directly from increased engagement with your iPhone (at least not to a sizable extent); it profits when you purchase a new device. This alignment of incentives is what allows Apple to at least marginally prioritize user privacy in a way Meta/Google just structurally cannot.
40% (and growing) of Apple’s profits are from services. Margins on services are 3x those on hardware.
Apple doesn’t make money directly when you doomscroll, but a lot of App Store revenue is a by-product of people simply using their devices in an unlocked state.
ksimple is 8-bit. 128 is the unsigned middle, or one plus the signed max; it’s usually used as a null or error signal. On 64-bit k implementations it would be 2^63.
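To make the arithmetic concrete, here’s a tiny Python sketch (illustrative only, not k code) of why 128 is the natural sentinel at 8 bits and 2^63 at 64 bits:

    # Illustrative Python, not k: why 128 works as a null/error sentinel in 8 bits.
    signed_max_8 = 2**7 - 1        # 127, the largest signed 8-bit value
    sentinel_8 = signed_max_8 + 1  # 128, one past the signed max and the unsigned midpoint
    assert sentinel_8 == 256 // 2 == 0x80

    # The 64-bit analogue: one past the signed 64-bit max.
    signed_max_64 = 2**63 - 1
    sentinel_64 = signed_max_64 + 1
    assert sentinel_64 == 2**63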
It looks like Mitchell is using an agentic framework called Amp (I’d never heard of it) - does anybody else here use it or tried it? Curious how it stacks up against Claude Code.
I haven't yet spent any time with it myself, but the impression I have been getting is that it is the most credible of the vendor-independent terminal coding agents right now.
Claude Code, Codex CLI and Gemini CLI are all (loosely) locked to their own models.
The reason to stick with Claude Code or Codex CLI is that they use the respective platforms’ subscriptions, like Claude Max or ChatGPT Pro. Unless I’m mistaken, with the other CLI apps like Amp you get charged by token usage, which can add up to a lot more than the $200/mo that Claude Max 20x or ChatGPT Pro cost.
Shameless plug for anybody who has been through the hell that is Anki card creation for language learning - I built an LLM-powered extension for Anki that allows you to wire up fields to arbitrary prompts and then generate notes in batch (or selectively per field). I use it every day for generating example sentences, definitions, and TTS. Would have quit Anki ages ago without this.
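Not the extension’s actual code, but a rough sketch of the field-to-prompt idea, with hypothetical field names, an OpenAI client as a stand-in backend, and an assumed model name - each field gets its own prompt template, and batch generation just loops over words filling the fields:

    # Hypothetical sketch of the field -> prompt mapping (not the extension's real code).
    # Assumes the openai package is installed and OPENAI_API_KEY is set; the model
    # name below is an assumption.
    from openai import OpenAI

    client = OpenAI()

    # Each Anki field gets its own prompt template (hypothetical field names).
    field_prompts = {
        "Example Sentence": "Write one natural example sentence using the word: {word}",
        "Definition": "Give a short, learner-friendly definition of: {word}",
    }

    def fill_fields(word: str) -> dict:
        """Generate every configured field for a single note."""
        out = {}
        for field, template in field_prompts.items():
            resp = client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[{"role": "user", "content": template.format(word=word)}],
            )
            out[field] = resp.choices[0].message.content
        return out

    # Batch generation over a handful of words.
    for word in ["ephemeral", "ubiquitous"]:
        print(fill_fields(word))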
FWIW I got a lot more mileage from building my own deck vs a premade deck too; would recommend that approach regardless once you're past the initial vocab bootstrapping phase.
Wow, so feature rich, congrats on the release. Love that the mnemonic generation takes into account your existing memory anchors (didn’t know that term).
I’ve actually been working on a similar-ish Anki plugin for about 6 months - it can autogenerate any field via LLM in bulk, as well as images and TTS. I’m not explicitly targeting the med school use case as much as yours (I use it for language learning), and it’s more GUI-centric / geared towards non-technical Anki users who don’t want to fiddle with a bunch of different API keys etc. Was planning to launch on HN soon but you beat me to the punch!
Thank you very much for the kind words, but I have to return the compliment. The app you are making is very feature-rich too, and it seems easy to use and affordable as well!
By the way, the new LLM from DeepSeek, deepseek-chat, is very good and actually on par with Claude 3.5 Sonnet, so you might want to give it a try. OpenRouter and LiteLLM help a lot.
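For anyone curious, this is roughly how that combination looks - a minimal sketch assuming LiteLLM is installed, OPENROUTER_API_KEY is set, and the model slug below is still current (check OpenRouter’s model list):

    # Minimal sketch: calling deepseek-chat through LiteLLM's OpenRouter route.
    # Assumes OPENROUTER_API_KEY is set; the model slug is an assumption - verify
    # it against OpenRouter's current model list.
    from litellm import completion

    response = completion(
        model="openrouter/deepseek/deepseek-chat",
        messages=[{"role": "user", "content": "Write one example sentence using the word 'serendipity'."}],
    )
    print(response.choices[0].message.content)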
Also, my projects are not actually aimed at medical learners, I just happened to be one! I went to great lengths to make it adaptable to any learning scenario, type of learner, etc. :)
Looking at your project, it seems like there is definitely an opportunity for merging efforts. If I understand correctly, you are not implementing few-shot learning and a few other features like mnemonics, anchors, and the major system. Do you think there is a possibility that we can talk about reusing some of my code and features? I can think of ways to make few-shot learning painless UI-wise.
I semi-secretly made that release to -> make my scripts more well known to -> help me find people who could help me make it available for laymen (I am painfully aware that my features are currently only available for nerds although I made great efforts to document the whole thing and make good code!).
Honestly, it looks like your awesome project might be the opportunity I was looking for! Are you interested in trying to make it so good that we can scale and even make (on average) better doctors :D!? Please get in touch!
https://smart-notes.xyz