Where do you draw the line between high-level and low-level objects when defining values?

Tags
EA
AI Alignment
Human values
Reference Box
Date created
Sep 21, 2022 07:17 PM
Related Main Box
Now, I have already made the point that we value things we know of, things we don't know, and things we will never know.
Nevertheless, how do we define what we value? Where do we draw the line between "too low-level" and "too abstract"? We look for a sweet spot in between.
For example: I don't care about the exact positions of the molecules of the seat I am sitting on right now. Instead, I care about the general concept of "a chair". But I also don't care about things that merely appear to be chairs without actually being chairs.
E.g. I care about smiling people, but I don't care about a muscle spasm that merely appears to be a smile.
This leads me to the idea that we value internal states differently from objects.