Originally I wouldn’t have thought position (where something sits in the sequence a model is given) was so important to the way AI works; it didn’t seem like a super-important parameter to me, but it turns out that it is
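To make the AI side a little more concrete: a transformer on its own has no built-in sense of order, so a positional signal is usually mixed into every token before it goes in, and that is what makes position matter. Below is a minimal sketch of the classic sinusoidal positional encoding; the function name `sinusoidal_positions`, the tiny dimensions, and the toy "concept" vectors are my own illustrative choices, not any particular library's API.

```python
import numpy as np

def sinusoidal_positions(seq_len: int, d_model: int) -> np.ndarray:
    """Classic sinusoidal positional encoding: each position gets a unique
    pattern of sines and cosines across the embedding dimensions."""
    positions = np.arange(seq_len)[:, None]                    # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                         # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates
    encoding = np.zeros((seq_len, d_model))
    encoding[:, 0::2] = np.sin(angles[:, 0::2])                # even dims: sine
    encoding[:, 1::2] = np.cos(angles[:, 1::2])                # odd dims: cosine
    return encoding

# Toy demo: two made-up "concept" embeddings presented in both orders.
rng = np.random.default_rng(0)
concept_a, concept_b = rng.normal(size=(2, 8))
pe = sinusoidal_positions(seq_len=2, d_model=8)

order_ab = np.stack([concept_a, concept_b]) + pe
order_ba = np.stack([concept_b, concept_a]) + pe

# Without the positional term, the two orderings are just permutations of the
# same rows; once position is added, not even re-shuffling makes them match.
print(np.allclose(np.stack([concept_a, concept_b]),
                  np.stack([concept_b, concept_a])[::-1]))     # True
print(np.allclose(order_ab, order_ba[::-1]))                   # False
```

The point of the toy check at the end is that the same two concepts, presented in the opposite order, become genuinely different inputs to the model once position is mixed in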
Also, for a very long time in life I never considered the order in which things occur to be overly important (surely it all comes out the same no matter what), but as life has gone on I've come to regard the order in which things happen as perhaps important after all
It's now starting to occur to me that 'position' in the AI context and 'order' in the context of sequences of things happening in life are pretty much the same idea
Now I'm wondering whether the order in which a person is presented with discrete packages of information or concepts makes a big difference. I think it does, and further, I think it could possibly be detected: in the future it might be that a person's current understanding can be dismantled into the contributory concepts that person has been exposed to and comprehended, and, more importantly, into the order in which they tripped over those concepts in their life
(A further parameter might be how much they understood those concept packages, and whether their understanding contains any misunderstandings)
Could it be possible in the future to analyse how a person sequentially came to understand their present position in life, broken down into the sequence of contributory chunks, and further, could it then be possible to re-order the presentation of those chunks powerfully enough to make the resulting understanding much more useful, better, or productively happy (or would we have to erase the brain and start again)?