Discussion about this post

David Huang

"much of human reasoning involves tacit knowledge gained through experience and intuition, which we reuse in all kinds of novel ways"

One analogy that keeps recurring to me: our raw experiences are like an audio stream, from which we discern common patterns (e.g. pauses, syllables, inflections) and distill atomic representations (vocabulary expansion), which we then use to encode past and new experiences more compactly, imagine hypotheticals, etc. (recount the experience). Intuition and insight could arise from a clearer perception of patterns in these distilled representations, rather than in the underlying raw phenomena.

Crucially, such representations may comprise not only concepts (~"nouns") but also actions (~"verbs"), qualifiers (~"adjectives"/"numerics"), desires (~"imperatives"), etc. Francois Chollet has made a similar analogy: LLMs learn from their training data a library of vector programs, which can be recombined during test-time compute to represent and "reason" about new tasks.

This may suggest a few ways to stay ahead of DBOTs of the future:

1) experience & encode novel phenomena outside training data

2) derive novel representations of known experiences

3) derive novel recombinations of known representations
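The third point, recombining known representations, can be caricatured in a few lines of Python. This is a toy sketch of Chollet's vector-program framing, not his actual proposal: the `library` entries and the `compose` helper are hypothetical illustrations, standing in for learned programs an LLM might recombine at test time.

```python
# A hypothetical "library" of small programs distilled from experience.
# In Chollet's analogy these would be vector programs learned by an LLM;
# here they are ordinary string functions for illustration only.
library = {
    "reverse": lambda s: s[::-1],
    "upper":   lambda s: s.upper(),
    "first3":  lambda s: s[:3],
}

def compose(*names):
    """Recombine known programs, in order, into a program for a new task."""
    def program(x):
        for name in names:
            x = library[name](x)
        return x
    return program

# A "novel" task never seen before: shout the last three letters of a word.
# No new primitive is needed -- only a new recombination of known ones.
novel = compose("reverse", "first3", "upper")
print(novel("banana"))  # -> ANA
```

The point of the sketch is that the space of tasks covered grows combinatorially with the library, which is one way to read points 2 and 3 above: novelty can come from new primitives or from new compositions of old ones.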

BillW

It's time to call time: please report out on the DBOT-vs-students analysis of BYD. Please also describe the bot and any opportunity for outsiders (myself included) to access it.

Thanks.
