@tante This hasn't been my experience. I run a local 12B Mistral model and it does a great job summarizing things. Of course things break down eventually, but the context length has to get pretty darn long before that happens.
https://mastodon.jordanwages.com/@wagesj45/113525906075886946
────