I specified "simple algorithm" because I think it's hard to argue that a piece of traditional computer programming possesses intentional states.
Agreed.
I'm considerably more open-minded about modern learning systems, though I would be interested in hearing why you think some already do have limited intentional states.
If intentional states are mental states that are about other things (in the world or other mental states), they are states with referents; in this sense, it seems to me that any system that has a representation of some other thing, and uses this representation in lieu of or with respect to that other thing, has intentional states in respect of that thing.
This means, for example, that a system that can represent aspects of its environment, then act in the environment (e.g. navigate) according to its representation, can be said to have intentional states in respect of those aspects of its environment.
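To make that concrete, here is a minimal sketch in Python (every name here is mine and purely illustrative): an agent plans a route using only its internal map of a grid world, never the world itself.

```python
from collections import deque

# The agent's entire "knowledge" of the environment is this internal map.
# 'S' = start, 'G' = goal, '#' = wall. The plan below is derived from the
# map alone; the agent would act on it even if the map were wrong.
GRID = [
    "S..#.",
    ".#.#.",
    ".#...",
    "...#G",
]

def plan_path(grid):
    """Breadth-first search over the agent's representation of the world."""
    rows, cols = len(grid), len(grid[0])
    start = next((r, c) for r in range(rows) for c in range(cols) if grid[r][c] == "S")
    goal = next((r, c) for r in range(rows) for c in range(cols) if grid[r][c] == "G")
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != "#" and (nr, nc) not in seen:
                seen.add((nr, nc))
                frontier.append(((nr, nc), path + [(nr, nc)]))
    return None

print(plan_path(GRID))  # a route computed entirely from the representation
```

On the definition above, this agent has intentional states with respect to the walls and the goal, since its behaviour is mediated entirely by its representation of them.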
I am not sure what you mean by "poorly defined in functional terms," though. If you mean that these issues are hard to approach empirically, I agree, but that's the nature of the beast when dealing with the subjective side of theory of mind.
The problem I see is that if intentional states are a matter of representation and/or modelling, then, by a kind of reductio ad absurdum, all computational systems, and even simple mechanical systems, could be said to have intentional states whenever they manipulate or act on internal representations of things. For example, even a thermostat can be said to represent environmental temperature in the bending of a bimetallic strip, and to act on certain states of that representation... The sole purpose of computing systems is to manipulate and act on representations of other things; so where is the line between simple computation and an intentional state?
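To put the reductio in code (again a sketch, with hypothetical names and an assumed hysteresis band): the thermostat's whole "representation" of the room is a single number, and it "acts" by comparing that number to a stored setpoint.

```python
SETPOINT = 20.0  # target temperature in degrees Celsius (assumed value)

def thermostat_step(reading: float, heater_on: bool) -> bool:
    """Turn the heater on below the setpoint, off above it.

    `reading` is the system's entire representation of the world;
    the rule acts on that representation and nothing else.
    """
    if reading < SETPOINT - 0.5:   # half-degree hysteresis band, assumed
        return True
    if reading > SETPOINT + 0.5:
        return False
    return heater_on  # inside the band, keep the current state

print(thermostat_step(18.2, heater_on=False))  # True: it "acts on" its representation
```

Everything the system "knows" is the float passed in; whether that suffices for an intentional state is exactly the question.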
Clearly, a thermostat doesn't have mental states, but then how do we define a 'mental state' in this context?
The definition of 'intentional state' I used above is obviously inadequate, so what are the functional characteristics of an intentional state? That is, how do we recognise one? What behaviours characterise it?
And when attempting to define precisely what constitutes an intentional state, we should be wary of begging the question by requiring irrelevant properties, e.g. "intentional states are representations of things in a human brain; therefore, by definition, other animals can't have intentional states..."
I would be surprised if there isn't a pragmatic definition somewhere, but I don't recall seeing one.