That would legitimately help me out. I use LLMs a lot for simple data restructuring, or rewording of explanations when I’m reading through certain sources. I was worried they would just do a simple ChatGPT API integration and have that be the end of it, but maybe this will end up being something I’d actually use.
Wait, it’ll actually let you use local LLMs?