16 Comments
Apr 7 · Liked by Cameron R. Wolfe, Ph.D.

Thanks for this fantastically detailed write-up!

Since I come from a computer vision background, I have seen MoEs used for a modality other than text. I have seen them used to "conditionally fuse" information based on the "quality and content" of various inputs.

Imagine a CNN that does semantic segmentation of a scene with multi-modal inputs such as RGB images, infrared images, etc. The model learns to "weigh" the output of each modality branch, with the weighting conditioned on the inputs. So if the RGB image is washed out due to high exposure because your RGB camera is facing the sun, the model can give the RGB branch a lower weight and rely on information from the other branches to produce the segmentation mask.
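
A minimal sketch of that kind of input-conditioned fusion, assuming a PyTorch-style model with one lightweight encoder per modality and a small gating network; all names, shapes, and layer sizes here are illustrative, not from the article or the comment:

```python
# Illustrative sketch: input-conditioned fusion of RGB and infrared branches
# for semantic segmentation. A gating network predicts per-modality weights
# from the inputs, so a washed-out RGB frame can receive a lower weight.
import torch
import torch.nn as nn


class GatedMultiModalSegmenter(nn.Module):
    def __init__(self, num_classes: int = 5, feat_dim: int = 32):
        super().__init__()
        # One lightweight encoder per modality (RGB has 3 channels, IR has 1).
        self.rgb_branch = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
        )
        self.ir_branch = nn.Sequential(
            nn.Conv2d(1, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
        )
        # Gating network: pools features from both branches and outputs a
        # softmax weight per modality (2 weights here).
        self.gate = nn.Sequential(
            nn.Linear(2 * feat_dim, 16), nn.ReLU(),
            nn.Linear(16, 2),
        )
        # Segmentation head on the fused feature map.
        self.head = nn.Conv2d(feat_dim, num_classes, 1)

    def forward(self, rgb: torch.Tensor, ir: torch.Tensor) -> torch.Tensor:
        f_rgb = self.rgb_branch(rgb)              # (B, C, H, W)
        f_ir = self.ir_branch(ir)                 # (B, C, H, W)
        # Pool each branch to a vector and predict modality weights.
        pooled = torch.cat([f_rgb.mean(dim=(2, 3)), f_ir.mean(dim=(2, 3))], dim=1)
        weights = torch.softmax(self.gate(pooled), dim=1)  # (B, 2)
        w_rgb = weights[:, 0:1, None, None]
        w_ir = weights[:, 1:2, None, None]
        fused = w_rgb * f_rgb + w_ir * f_ir       # weighted fusion of branches
        return self.head(fused)                   # per-pixel class logits


if __name__ == "__main__":
    model = GatedMultiModalSegmenter()
    rgb = torch.randn(2, 3, 64, 64)
    ir = torch.randn(2, 1, 64, 64)
    print(model(rgb, ir).shape)  # torch.Size([2, 5, 64, 64])
```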

author

Yep, totally! For language models, there isn't yet a clear analysis showing that certain experts specialize in certain skills, but my gut feeling is that an analysis similar to the paper below could uncover some type of specialization.

https://transformer-circuits.pub/2023/monosemantic-features
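
A rough sketch of one way such a specialization analysis could start: route tokens through a router and tally which experts tokens from different categories land on. Everything below (the toy top-1 router, the synthetic "code" vs. "prose" hidden states) is hypothetical and only meant to illustrate the idea; a real analysis would use routing decisions from a trained MoE layer on labeled text:

```python
# Toy probe of expert specialization: count per-category expert usage.
import torch

torch.manual_seed(0)

num_experts, d_model, tokens_per_category = 4, 16, 256
router = torch.nn.Linear(d_model, num_experts)  # stand-in for a trained router

# Stand-ins for hidden states of tokens from two categories.
categories = {
    "code":  torch.randn(tokens_per_category, d_model) + 1.0,
    "prose": torch.randn(tokens_per_category, d_model) - 1.0,
}

for name, hidden in categories.items():
    expert_ids = router(hidden).argmax(dim=-1)            # top-1 routing decision
    counts = torch.bincount(expert_ids, minlength=num_experts)
    frac = counts.float() / counts.sum()
    print(f"{name:>5}: expert usage = {frac.tolist()}")   # skewed usage hints at specialization
```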

Apr 9 · Liked by Cameron R. Wolfe, Ph.D.

That is interesting! Thanks for sharing the relevant paper. :)

Mar 28 · Liked by Cameron R. Wolfe, Ph.D.

Cameron, this was a really excellent overview; it shows your impressive command of the material. Would love to see a book by you on the topic.

author

Thanks for the kind words. I might try to write a book in the future once I build up enough content on the newsletter to serve as a starting point :)

Mar 21 · Liked by Cameron R. Wolfe, Ph.D.

Absolutely fantastic article, thank you!

author

Glad you liked it! Thanks for reading

Mar 20 · Liked by Cameron R. Wolfe, Ph.D.

Thank you for sharing this

author

Of course! Thank you for reading 🙂

Mar 22 · Liked by Cameron R. Wolfe, Ph.D.

Great work

author

Thanks, and thank you for reading!

Mar 19 · Liked by Cameron R. Wolfe, Ph.D.

Great!

author

Thanks!

Mar 18 · Liked by Cameron R. Wolfe, Ph.D.

This was great for me, thanks Cameron, you went ALL out! I invested (a year ago) in an MoE network called BitTensor and thought I understood this. I did not, but I do now. I'm not exactly sure if they're still MoE. Are you familiar with this network, and if so, any thoughts on the mechanism underlying it? There are a number of very highly qualified AI groups building on it. I would like to as well but haven't learnt enough yet.

author

I'm not familiar with BitTensor, but it looks interesting!


Yes, very much so. If you find it compelling enough and are available, I'm looking for an AI consultant to hire to help build my position on the network (fine-tuning, hosting models).
