
"much of human reasoning involves tacit knowledge gained through experience and intuition, which we reuse in all kinds of novel ways"

One analogy that keeps recurring to me: our raw experiences are like an audio stream, from which we discern common patterns (e.g. pauses, syllables, inflections) and distill atomic representations (vocabulary expansion), which we then use to more compactly encode past & new experiences, imagine hypotheticals, etc. (recount the experience). Intuition & insight could arise from perceiving patterns more clearly over these distilled representations than over the underlying raw phenomena.
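To make the distillation step concrete, here's a toy byte-pair-encoding-style sketch in Python (my own illustration, with made-up names like `distill`; not a claim about cognition or about how any particular model works): it repeatedly fuses the most frequent adjacent pair of symbols into a new atomic unit, then re-encodes the same stream more compactly with the expanded vocabulary.

```python
from collections import Counter

def distill(stream: str, merges: int = 8):
    """Toy BPE-style distillation: repeatedly fuse the most frequent
    adjacent pair of symbols into a new atomic unit, then re-encode
    the stream using the enlarged vocabulary."""
    symbols = list(stream)
    learned = []
    for _ in range(merges):
        pairs = Counter(zip(symbols, symbols[1:]))
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        learned.append(a + b)
        # Re-encode the stream with the newly distilled unit.
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == (a, b):
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return learned, symbols

raw = "the cat sat on the mat the cat sat"
units, encoded = distill(raw)
print(units)                          # distilled "vocabulary" units, e.g. 'th', 'the', ...
print(len(raw), "->", len(encoded))   # same content, fewer symbols
```

The point of the toy: once the units exist, the same experience takes fewer symbols to recount, and patterns over units are easier to see than patterns over raw characters.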

Crucially, such representations may comprise not only concepts (~"nouns") but also actions (~"verbs"), qualifiers (~"adjectives"/"numerics"), desires (~"imperatives"), etc. Francois Chollet has drawn a similar analogy: LLMs learn from their training data a library of vector programs, which can be recombined during test-time compute to represent & "reason" about new tasks.
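A deliberately crude sketch of that recombination idea, swapping plain Python functions in for vector programs (everything here, the primitives and `search`, is hypothetical and purely illustrative): a fixed library of primitives is searched at "test time" for a composition consistent with a few input-output examples of a new task.

```python
from itertools import product

# A tiny "library" of primitive programs, standing in for learned
# vector programs (plain functions here, for illustration only).
PRIMITIVES = {
    "double":    lambda x: x * 2,
    "increment": lambda x: x + 1,
    "negate":    lambda x: -x,
    "square":    lambda x: x * x,
}

def search(examples, depth: int = 3):
    """Brute-force test-time recombination: try every composition of
    primitives up to `depth` and return one consistent with the examples."""
    names = list(PRIMITIVES)
    for d in range(1, depth + 1):
        for combo in product(names, repeat=d):
            def run(x, combo=combo):
                for name in combo:
                    x = PRIMITIVES[name](x)
                return x
            if all(run(x) == y for x, y in examples):
                return combo
    return None

# A "new task" given only as input-output pairs: f(x) = (x + 1) * 2
print(search([(1, 4), (2, 6), (5, 12)]))   # -> ('increment', 'double')
```

The library itself never changes; what is "reasoned" out at test time is a novel recombination of known pieces, which is the flavor of Chollet's claim.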

This may suggest a few ways to stay ahead of the DBOTs of the future:

1) experience & encode novel phenomena outside training data

2) derive novel representations of known experiences

3) derive novel recombinations of known representations


Chollet's discussion following the OpenAI o3 release: https://arcprize.org/blog/oai-o3-pub-breakthrough
