Natural abstractions form relative to their agent

This is an idea that came to me after writing
Natural abstractions don’t overlap
. How heavily do natural abstractions actually overlap? How should we expect a totally alien cognitive architecture to form natural abstractions?
It seems to me that an abstraction built on a perception of time (e.g. one based on the velocity of a rotating gear) might feel very different to an AI that lacks a perception of time, or whose perception of time differs from ours.
