6 Comments
kaveinthran:

will you cover tree of thoughts?

Cameron R. Wolfe, Ph.D.:

Not on the roadmap right now, but I'll add it to cover eventually. Thanks for the recommendation.

Tyler Corderman:

The fact that you share this much knowledge with us is incredible. I'm deeply grateful Cameron for all your work.

Cameron R. Wolfe, Ph.D.:

Thanks so much for reading and for the kind words!

Lazaro Hurtado:

Since CoT improves common-sense reasoning, why does generated knowledge prompting require an external LLM? Given that this technique benefits so much from common-sense reasoning, wouldn't it be preferable to use an LLM that has an understanding of general knowledge itself rather than outsourcing it?

Cameron R. Wolfe, Ph.D.:

Great question. Yes, your intuition is correct (in my opinion). Notably, we don't need an external LLM for generated knowledge prompting. We could instead prompt the same LLM to "provide output about <this topic>". If we do that explicitly, the LLM benefits from outputting this information and then using it as context for future output. This technique is comparable to how humans get better at answering questions when they take time to write out and organize their thoughts (obviously this is a super rough analogy, but hopefully you get the idea).
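To make the two-stage flow concrete, here is a minimal sketch of generated knowledge prompting with a single model. The `generate` function is a hypothetical stand-in for any LLM completion call (prompt in, text out); the point is only that both the knowledge prompt and the answer prompt go to the same model, with the model's own generated knowledge prepended as context.

```python
def generated_knowledge_answer(question, generate):
    """Two-stage generated knowledge prompting with a single LLM.

    `generate` is a placeholder for any LLM completion call
    (prompt -> text); no separate knowledge model is needed.
    """
    # Stage 1: ask the model to write out relevant knowledge first.
    knowledge_prompt = f"Provide relevant facts and background about: {question}"
    knowledge = generate(knowledge_prompt)

    # Stage 2: reuse the same model, conditioning on its own knowledge.
    answer_prompt = (
        f"Knowledge: {knowledge}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )
    return generate(answer_prompt)


# Usage with a dummy generate function (swap in a real LLM call):
def dummy_generate(prompt):
    return f"[model output for: {prompt[:40]}...]"

print(generated_knowledge_answer("Why do penguins huddle?", dummy_generate))
```

Any real completion API can be plugged in as `generate`; the structure of the two prompts is the whole technique.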
