1 Comment

Very cool! Do you think it would be possible to build an LLM into a command, so you wouldn't have to run an always-on server?
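
(For readers curious about this: one way it can work is to load a local model in-process on each invocation, rather than talking to a resident server. Below is a minimal sketch of such a one-shot command, assuming the llama-cpp-python package and a local GGUF model file at `model.gguf`; both are assumptions for illustration, not anything named in this thread.)

```python
#!/usr/bin/env python3
"""Sketch: run a local LLM as a one-shot CLI command, no server process.

Assumes llama-cpp-python (pip install llama-cpp-python) and a GGUF model
file at MODEL_PATH -- both are stand-ins chosen for this example.
"""
import sys

from llama_cpp import Llama

MODEL_PATH = "model.gguf"  # hypothetical path to any local GGUF model


def main() -> None:
    # Take the prompt from the command line, or from stdin if piped.
    prompt = " ".join(sys.argv[1:]) or sys.stdin.read()

    # The model is loaded fresh on every invocation and released on exit,
    # so nothing stays resident between runs; the trade-off is paying the
    # model-load cost on each call instead of keeping a server warm.
    llm = Llama(model_path=MODEL_PATH, verbose=False)
    out = llm(prompt, max_tokens=256)
    print(out["choices"][0]["text"].strip())


if __name__ == "__main__":
    main()
```

The same pattern is what single-binary tools like llama.cpp's `llama-cli` and llamafile package up: the model runs only for the lifetime of the command, at the cost of per-invocation startup time.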
