Since CoT increases common-sense reasoning ability, why does knowledge generation require an external LLM? Given that this technique reaps great benefits from common-sense reasoning, wouldn't it be preferable for the LLM to draw on its own understanding of general knowledge rather than outsourcing it?
Great question. Yes, your intuition is correct (in my opinion). Notably, we don't need an external LLM for generated knowledge prompting. We could instead prompt the same LLM to "provide output about <this topic>". If we do that explicitly, the LLM benefits from writing this information out and then using it as context for future output. This is comparable to how humans get better at answering questions when they take time to write out and organize their thoughts (obviously a rough analogy, but hopefully you get the idea).
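To make the single-model version concrete, here's a minimal sketch of generated knowledge prompting using the same model for both steps. The `call_llm` function and the prompt wording are placeholders for whatever completion API and phrasing you actually use:

```python
def call_llm(prompt: str) -> str:
    # Placeholder: substitute a real completion call (e.g., your API client) here.
    return f"[model output for: {prompt[:40]}...]"

def generated_knowledge_answer(question: str, topic: str) -> str:
    # Step 1: ask the same model to write out relevant background knowledge.
    knowledge = call_llm(f"Provide relevant knowledge about {topic}.")
    # Step 2: feed that knowledge back in as context for the actual question.
    final_prompt = (
        f"Knowledge: {knowledge}\n\n"
        f"Using the knowledge above, answer the question: {question}"
    )
    return call_llm(final_prompt)

print(generated_knowledge_answer(
    "Why do golf balls have dimples?", "golf ball aerodynamics"
))
```

The key point is that both calls go to the same model: the first call just forces it to surface what it already knows before it commits to an answer.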
will you cover tree of thoughts?
Not on the roadmap right now, but I'll add it to cover eventually. Thanks for the recommendation.
The fact that you share this much knowledge with us is incredible. I'm deeply grateful Cameron for all your work.
Thanks so much for reading and for the kind words!