Maybe, but imo consciousness involves an infinitely recursive (up to hardware or wetware limits) kind of introspection. I can reflect on reflecting on reflecting on reflecting, etc., until I stack overflow. But at each level of reflection, I can do something useful wrt adjusting the processes that I'm reflecting on.
I'm not sure how you could program this recursive nature unless you generalized what it means to be introspective. But by definition, introspection defines itself. The process of introspection has to also be aware of itself in order to "break out" to the higher level.
I don't see why one couldn't write a program to do something like that - a subroutine that called itself and analyzed some of the contents further up the stack in some vaguely useful way. One can pretty easily write a program that approximately satisfies most vague descriptions of consciousness. It doesn't "feel" like "real consciousness", but that brings up other questions.
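For what it's worth, here's a minimal Python sketch of that idea - a function that calls itself and looks at the frames above it on each level. Everything in it (the reflect function, the max_depth cutoff) is made up purely for illustration, not any real "consciousness" machinery:

```python
# Toy sketch: a subroutine that calls itself and inspects the call stack.
# All names here (reflect, max_depth) are invented for illustration.
import inspect


def reflect(depth: int, max_depth: int = 5) -> list[str]:
    """Recursively reflect, examining the frames above this one at each level."""
    # Every frame above the current one is a previous level of "reflection"
    # that this level can observe and, in principle, act on.
    frames = inspect.stack()[1:]
    observations = [
        f"level {depth}: I see {len(frames)} frames above me, "
        f"outermost caller is '{frames[-1].function}'"
    ]
    if depth < max_depth:  # stand-in for the hardware/wetware limit
        observations += reflect(depth + 1, max_depth)
    return observations


if __name__ == "__main__":
    for line in reflect(depth=1):
        print(line)
```

It only observes the stack rather than adjusting anything, but it shows how "reflection on reflection" bottoms out at an arbitrary depth limit rather than going on forever.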
Exactly. But to your last point, what are feelings if not data? For example, if the system were programmed to interpret feedback threatening its higher-level goals as negative data (anxiety, fear) and feedback supporting its goals as positive data (happiness, joy), then feelings could very much be represented in the model.
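Something like this toy sketch, maybe - the Goal/Feedback classes, the 0..1 scales, and the emotion labels are all invented just to show the "feelings as data" framing, not a serious model:

```python
# Toy sketch: feedback is scored by whether it threatens or supports a
# higher-level goal, and the signed result is labeled with an emotion word.
from dataclasses import dataclass


@dataclass
class Goal:
    name: str
    importance: float  # how much the system cares, 0..1


@dataclass
class Feedback:
    goal: Goal
    support: float  # -1.0 (threatens the goal) .. +1.0 (supports it)


def appraise(fb: Feedback) -> tuple[str, float]:
    """Map raw feedback to a 'feeling': a label plus a signed intensity."""
    intensity = fb.support * fb.goal.importance
    if intensity <= -0.5:
        label = "fear"
    elif intensity < 0:
        label = "anxiety"
    elif intensity < 0.5:
        label = "happiness"
    else:
        label = "joy"
    return label, intensity


if __name__ == "__main__":
    survive = Goal("keep running", importance=0.9)
    print(appraise(Feedback(survive, support=-0.8)))  # ('fear', -0.72)
    print(appraise(Feedback(survive, support=0.7)))   # ('joy', 0.63)
```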