Higher-Order Theories of Consciousness

The most fundamental and commonly used notion of the term ‘conscious’ in philosophical circles is captured by Thomas Nagel’s famous “what it is like” sense (Nagel 1974). When I am in a conscious mental state, there is “something it is like” for me to be in that state from the subjective or first-person point of view. When I smell a rose or have a conscious visual experience, there is something it “seems” or “feels like” from my perspective. This is primarily the sense of “conscious state” that will be used throughout this entry. There is also something it is like to be a conscious creature whereas there is nothing it is like to be a table or tree.

Representational theories of consciousness attempt to reduce consciousness to “mental representations” rather than directly to neural or other physical states. This approach has been fairly popular over the past few decades. Examples include first-order representationalism (FOR) which attempts to explain conscious experience primarily in terms of world-directed (or first-order) intentional states (Tye 2005) as well as several versions of higher-order representationalism (HOR) which holds that what makes a mental state M conscious is that it is the object of some kind of higher-order mental state directed at M (Rosenthal 2005, Gennaro 2012). The primary focus of this entry is on HOR and especially higher-order thought (HOT) theory. The key question that should be answered by any theory of consciousness is: What makes a mental state a conscious mental state?

Section 1 introduces the overall representationalist approach to consciousness and briefly discusses Tye’s FOR. Section 2 presents three major versions of HOR: higher-order thought theory, dispositional higher-order thought theory, and higher-order perception theory. In section 3, a number of common and important objections and replies are presented. Section 4 briefly outlines a close connection between HOT theory and conceptualism, that is, the claim that the representational content of a perceptual experience is entirely determined by the conceptual capacities the perceiver brings to bear in her experience. Section 5 examines several hybrid higher-order and “self-representational” theories of consciousness which all hold that conscious states are self-directed in some way. Section 6 addresses the potentially damaging claim that HOT theory requires neural activity in the prefrontal cortex (PFC) in order for one to have conscious states.

Table of Contents
1. Representationalism
2. Higher-Order Representationalism
  a. Higher-Order Thought (HOT) Theory
  b. Dispositional HOT Theory
  c. Higher-Order Perception (HOP) Theory
3. Objections and Replies
4. HOT Theory and Conceptualism
5. Hybrid Higher-Order and Self-Representational Theories
6. HOT Theory and the Prefrontal Cortex
7. References and Further Reading
1. Representationalism

Some current theories attempt to reduce consciousness in mentalistic terms, such as ‘thoughts’ and ‘awareness,’ rather than directly in neurophysiological terms. One popular approach is to reduce consciousness to mental representations of some kind. The notion of a “representation” is of course very general and can even be applied to pictures and signs. Much of what goes on in the brain might also be understood in a representational way. For example, mental events represent outer objects partly because they are caused by such objects in, say, cases of veridical visual perception. Philosophers often call such mental states “intentional states” which have representational content, that is, mental states that are “about” or “directed at” something, as when one has a thought about a house or a perception of a tree. Although intentional states, such as beliefs and thoughts, are sometimes contrasted with “phenomenal states,” such as pains and color experiences, it is clear that many conscious states have both phenomenal and intentional properties, such as visual perceptions.

The general view that we can explain conscious mental states in terms of representational or intentional states is called “representationalism.” Although not automatically reductionistic, most versions of it do attempt such a reduction. Most representationalists believe that there is room for a second-step reduction to be filled in later by neuroscience. A related motivation for representational theories of consciousness is the belief that an account of intentionality can more easily be given in naturalistic terms, such as a causal theory whereby mental states are understood as representing outer objects via some reliable causal connection. The idea, then, is that if consciousness can be explained in representational terms and representation can be understood in purely physical terms, then there is the promise of a naturalistic theory of consciousness. Most generally, however, representationalism can be defined as the view that the phenomenal properties of conscious experience (that is, the “qualia”) can be explained in terms of the experiences’ representational properties.

It is worth noting here that the relationship between intentionality and consciousness is itself a major ongoing area of research, with some arguing that genuine intentionality actually presupposes consciousness in some way (Searle 1992, Horgan and Tienson 2002). If that is correct, then it would be impossible to reduce consciousness to intentionality, but representationalists argue that consciousness requires intentionality, not vice versa. Of course, few if any today hold the very strong Cartesian view that all intentional states are conscious. Descartes thought that mental states are essentially conscious and that there are no unconscious mental states at all. For much more on the relationship between intentionality and consciousness, see Gennaro (2012, chapter two), Chudnoff (2015), and the essays in Bayne and Montague (2011) and Kriegel (2013).

A first-order representational (FOR) theory of consciousness is one that attempts to explain and reduce conscious experience primarily in terms of world-directed (or first-order) intentional states. The two most cited FOR theories are those of Fred Dretske (1995) and Michael Tye (1995, 2000), but the emphasis here will be on Tye’s more developed theory.

Of course not all mental representations are conscious, so the key question remains: What exactly distinguishes conscious from unconscious mental states (or representations)? What makes an unconscious mental state a conscious mental state? Tye defends what he calls “PANIC theory.” The acronym “PANIC” stands for poised, abstract, non-conceptual, intentional content. Tye holds that at least some of the representational content in question is non-conceptual (N), which is to say that the subject can lack the concept for the properties represented by the experience in question, such as an experience of a certain shade of red that one has never seen before. But conscious states clearly must also have “intentional content” (IC) for any representationalist. Tye also asserts that such content is “abstract” (A) and so not necessarily about particular concrete objects. This is needed to handle hallucination cases where there are no concrete objects at all or cases where different objects look phenomenally alike. Perhaps most important for mental states to be conscious, however, is that such content must be “poised” (P), which is an importantly functional notion about what conscious states do. The “key idea is that experiences and feelings…stand ready and available to make a direct impact on beliefs and/or desires. For example…feeling hungry… has an immediate cognitive effect, namely, the desire to eat….States with nonconceptual content that are not so poised lack phenomenal character [because]…they arise too early, as it were, in the information processing” (Tye 2000, 62).

One common objection to FOR is that it does not apply to all conscious states. Some conscious states do not seem to be “about” or “directed at” anything, such as pains or anxiety, and so they would be non-representational conscious states. If so, then conscious states cannot generally be explained in terms of representational properties (Block 1996). Tye responds that pains and itches do represent in the sense that they represent parts of the body. Even hallucinations either misrepresent (which is still a kind of representation) or the conscious subject still takes them to have representational properties from the first-person point of view. Tye (2000) goes to great lengths in response to a host of alleged counter-examples to FOR. For example, with regard to conscious emotions, he says that they “are frequently localized in particular parts of the body. . . . For example, if one feels sudden jealousy, one is likely to feel one’s stomach sink . . . [or] one’s blood pressure increase” (Tye 2000, 51). He believes that something similar is true for fear or anger. Moods, however, are quite different and do not seem so easily localizable in the same way. Perhaps the most serious objection to Tye’s theory, however, is that what seems to be doing most of the work on Tye’s account is the extremely functional-sounding “poised” notion, and so he is arguably not really explaining phenomenal consciousness in entirely representational terms (Kriegel 2002). For other versions of FOR, see Harman (1990), Byrne (2001), and Droege (2003). Chalmers (2004) does an excellent job of presenting and categorizing the plethora of representationalist positions.

2. Higher-Order Representationalism
a. Higher-Order Thought (HOT) Theory

Again, the key question is: What makes a mental state a conscious mental state? There is also a long tradition that has attempted to understand consciousness in terms of some kind of higher-order awareness (Locke 1689/1975). This view has been revived by several contemporary philosophers (Armstrong 1981, Rosenthal 1986, 1997, 2005, Lycan 1996, 2001, Gennaro 1996, 2012). The basic idea is that what makes a mental state conscious is that it is the object of some kind of higher-order representation (HOR). A mental state M becomes conscious when there is a HOR of M. A HOR is a “meta-psychological” or “meta-cognitive” state, that is, a mental state directed at another mental state (“I am in mental state M”). Thus, for example, my desire to write a good entry becomes conscious when I am (non-inferentially) “aware” of the desire. Intuitively, conscious states, as opposed to unconscious ones, are mental states that I am “aware of” being in some sense. Conscious mental states arise when two unconscious mental states are related in a certain way, namely, when one of them (the HOR) is directed at the other (M).

This overall idea is sometimes referred to as the Transitivity Principle (TP):

(TP) A conscious state is a state whose subject is, in some way, aware of being in it.

The corresponding idea that I could be having a conscious state while totally unaware of being in that state seems like a contradiction. A mental state of which the subject is completely unaware is clearly an unconscious state. For example, I would not be aware of having a subliminal perception, and thus it is an unconscious perception. There are various kinds of HOR theory, with the most common division between higher-order thought (HOT) theories and higher-order perception (HOP) theories. HOT theorists, such as David Rosenthal (2005), think it is better to understand the HOR (or higher-order “awareness”) as a thought containing concepts. HOTs are treated as cognitive states involving some kind of conceptual component. HOP theorists (Lycan 1996) urge that the HOR is a perceptual state of some kind which does not require the kind of conceptual content invoked by HOT theorists. Although HOT and HOP theorists agree on the need for a HOR theory of consciousness, they do sometimes argue for the superiority of their respective positions (Rosenthal 2004, Lycan 2004, Gennaro 2012, chapter three).

One can also find something like TP in premise 1 of Lycan’s (2001) more general argument for HOR. The entire argument runs as follows:

(1) A conscious state is a mental state whose subject is aware of being in it.

(2) The “of” in (1) is the “of” of intentionality; what one is aware of is an intentional object of the awareness.

(3) Intentionality is representational; a state has a thing as its intentional object only if it represents that thing.

Therefore,

(4) Awareness of a mental state is a representation of that state. (From 2, 3)

Therefore,

(5) A conscious state is a state that is itself represented by another of the subject’s mental states. (1, 4)

The intuitive appeal of premise 1 leads naturally to the final conclusion— (5)—which is just another way of stating HOR.
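
To make the logical structure of Lycan’s argument fully explicit, the following is a minimal formal sketch in Lean. It is offered only as an illustration of how (5) follows from the premises; the predicate names (Conscious, AwareOf, Represents) are placeholders introduced here, not Lycan’s own notation, and premises (2) and (3) are collapsed into the single assumption that awareness of a state is a representation of that state.

```lean
/- A sketch of the argument's structure, using illustrative placeholder
   predicates (not Lycan's own formalism). -/
theorem lycan_sketch
    (State : Type)
    (Conscious : State → Prop)
    (AwareOf Represents : State → State → Prop)
    -- Premise (1): a conscious state is one of which its subject is aware.
    (p1 : ∀ m, Conscious m → ∃ a, AwareOf a m)
    -- Premises (2) and (3) combined: awareness of a state represents that state.
    (p23 : ∀ a m, AwareOf a m → Represents a m) :
    -- Conclusion (5): every conscious state is represented by another of the
    -- subject's mental states.
    ∀ m, Conscious m → ∃ a, Represents a m :=
  fun m hm => Exists.elim (p1 m hm) (fun a ha => ⟨a, p23 a m ha⟩)
```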

A related rationale for HOR, and HOT theory in particular, can be put as follows (based on Rosenthal 2004): A non-HOT theorist might still agree with HOT theory as an account of introspection or reflection, namely, that it involves a conscious thought about a mental state. This seems to be a fairly common sense definition of introspection that includes the notion that introspection involves conceptual activity. It also seems reasonable for anyone to hold that when a mental state is unconscious, there is no HOT at all. But then it stands to reason that there should be something “in between” those two cases, that is, when one has a first-order conscious state. So what is in between no HOT at all and a conscious HOT? The answer is an unconscious HOT, which is precisely what HOT theory says, that is, a first-order conscious state is accompanied by an unconscious HOT. Moreover, this explains what happens when there is a transition from a first-order conscious state to an introspective state: an unconscious HOT becomes conscious.

Still, it might seem that HOT theory results in circularity by defining consciousness in terms of HOTs. It also might seem that an infinite regress results because a conscious mental state must be accompanied by a HOT, which, in turn, must be accompanied by another HOT ad infinitum. However, as we have just seen, the standard and widely accepted reply is that when a conscious mental state is a first-order world-directed state the higher-order thought (HOT) is not itself conscious. But when the HOT is itself conscious, there is a yet higher-order (or third-order) thought directed at the second-order state. In this case, we have introspection, which involves a conscious HOT directed at an inner mental state. When one introspects, one’s attention is directed back into one’s mind. For example, what makes my desire to write a good entry a conscious first-order desire is that there is a (non-conscious) HOT directed at the desire. In this case, my conscious focus is directed outwardly at the paper or computer screen, so I am not consciously aware of having the HOT from the first-person point of view. When I introspect that desire, however, I then have a conscious HOT (accompanied by a yet higher, third-order, HOT) directed at the desire itself (Rosenthal 1986, 1997). Indeed, it is crucial to distinguish first-order conscious states (with unconscious HOTs) from introspective states (with conscious HOTs).

HOT theorists do insist that the HOT must become aware of the lower-order (LO) state noninferentially in order to make it conscious. The point of this condition is mainly to rule out alleged counterexamples to HO theory, such as cases where I become aware of my unconscious desire to kill my boss because I have consciously inferred it from a session with a psychiatrist, or where my anger becomes conscious after making inferences based on my own behavior. The characteristic feel of such a conscious desire or anger may be absent in these cases, but since awareness of them arose via conscious inference, the HO theorist accounts for them by adding this noninferential condition.

b. Dispositional HOT Theory

Peter Carruthers (2000, 2005) has proposed a different form of HOT theory such that the HOTs are dispositional states instead of actual HOTs, though he also understands his “dispositional HOT theory” to be a form of HOP theory (Carruthers 2004). The basic idea is that the conscious status of an experience is due to its availability to higher-order thought. So “conscious experience occurs when perceptual contents are fed into a special short-term buffer memory store, whose function is to make those contents available to cause HOTs about themselves” (Carruthers 2000, 228). Some first-order perceptual contents are available to a higher-order “theory of mind mechanism,” which transforms those representational contents into conscious contents. Thus, no actual HOT occurs. Instead, according to Carruthers, some perceptual states acquire a dual intentional content; for example, a conscious experience of red not only has a first-order content of “red,” but also has the higher-order content “seems red” or “experience of red.” Thus, he also calls his theory “dual-content theory.” Carruthers makes interesting use of so-called “consumer semantics” in order to fill out his theory of phenomenal consciousness. That is, the content of a mental state depends, in part, on the powers of the organisms which “consume” that state, for example, the kinds of inferences which the organism can make when it is in that state.

Dispositional HOT theory is often criticized by those who do not see how the mere disposition toward a mental state can render it conscious (Rosenthal 2004). Recall that a key motivation for HOT theory is the Transitivity Principle (TP), but the TP clearly lends itself to an actualist HOT theory interpretation, namely, that we are aware of our conscious states and not aware of our unconscious states. And, as Rosenthal puts it, “Being disposed to have a thought about something doesn’t make one conscious of that thing, but only potentially conscious of it” (2004, 28). Thus it is natural to wonder just how dual-content theory explains phenomenal consciousness. It is difficult to understand how a dispositional HOT can render, say, a perceptual state actually conscious.

Carruthers is well aware of this objection and attempts to address it (Carruthers 2005, 55-60). He again relies heavily on consumer semantics in an attempt to show that changes in consumer systems can transform perceptual contents. That is, what a state represents will depend, in part, on the kinds of inferences that the cognitive system is prepared to make in the presence of that state, or on the kinds of behavioral control that it can exert. In that case, the presence of first-order perceptual representations to a consumer-system that can deploy a “theory of mind” and concepts of experience may be sufficient to render those representations, at the same time, higher-order ones. This would confer phenomenal consciousness on such states. But the central and most serious problem remains: that is, dual-content theory is vulnerable to the same objection raised against FOR. This point is made most forcefully by Jehle and Kriegel (2006). They point out that dual-content theory “falls prey to the same problem that bedevils FOR: It attempts to account for the difference between conscious and [un]conscious . . . mental states purely in terms of the functional roles of those states” (Jehle and Kriegel 2006, 468). Carruthers, however, is more concerned to avoid what he takes to be a problem for “actualist” HOT theory, namely, that an unbelievably large amount of cognitive (and neural) space would have to be taken up if every conscious experience is accompanied by an actual HOT.

c. Higher-Order Perception (HOP) Theory

David Armstrong (1981) and William Lycan (1996, 2004) have been the leading proponents of HOP theory in recent decades. Unlike HOTs, HOPs are not thoughts and do not have conceptual content. Rather, they are to be understood as analogous to outer perception. One major objection to HOP theory is that, unlike outer perception, there is no obvious distinct sense organ or scanning mechanism responsible for HOPs. Similarly, no distinctive sensory quality or phenomenology is involved in having HOPs, whereas outer perception always involves some sensory quality. Lycan concedes the disanalogy but argues that it does not outweigh other considerations favoring HOP theory. His reply is understandable, but the objection remains a serious one and the disanalogy cannot be overstated.

Gennaro argues against Lycan’s claim that HOP theory is superior to HOT theory because, by analogy to outer perception, there is an importantly passive aspect to perception not found in thought (Gennaro 2012, chapter three). The perceptions in HOPs are too passive to account for the interrelation between HORs and first-order states. Thus, HOTs are preferable. Gennaro sometimes frames it in Kantian terms: we can distinguish between the faculties of sensibility and understanding, which must work together to make experience possible. What is most relevant here is that the passive nature of the “sensibility” (through which outer objects are given to us) is contrasted with the active and cognitive nature of the “understanding,” which thinks about and applies concepts to that which enters via the sensibility. HOTs fit this latter description better than HOPs. In any case, what ultimately justifies treating HORs as thoughts is the exercise and application of concepts to first-order states (Rosenthal 2005, Gennaro 2012, chapter four).

More recently, however, Lycan has changed his mind and no longer holds HOP theory, mainly because he now thinks that attention to first-order states is sufficient for an account of conscious states and there is little reason to view the relevant attentional mechanism as intentional or as representing first-order states (Sauret and Lycan 2014). Armstrong and Lycan had indeed previously spoken of HOP “monitors” or “scanners” as a kind of attentional mechanism, but now it seems that “…leading contemporary cognitive and neurological theories of attention are unanimous in suggesting that attention is not intentional” (Sauret and Lycan 2014, 365). They cite Prinz (2012), for example, who holds that attention is a psychological process that connects first-order states with working memory. Sauret and Lycan explain that “attention is the mechanism that enables subjects to become aware of their mental states” (2014, 367) and yet this “awareness of” is supposed to be a non-intentional selection of mental states. Thus, Sauret and Lycan (2014) find that Lycan’s (2001) earlier argument, discussed above, goes wrong at premise 2 and that the “of” in question need not be the “of” of intentionality. Instead, the ‘of’ is perhaps more of an “acquaintance relation,” although Sauret and Lycan do not really present a theory of acquaintance, let alone one with the level of detail offered by HOT theory.

Gennaro (2015a) offers reasons to doubt that the acquaintance strategy is a better alternative. Such acquaintance relations would presumably be somehow “closer” than the representational relation. But this strategy is arguably at best trading one difficult problem for an even deeper puzzle, namely, just how to understand the allegedly intimate and nonrepresentational “awareness of” relation between HORs and first-order states. It is also more difficult to understand such “acquaintance relations” within the context of any HOR reductionist approach. Indeed, acquaintance is often taken to be unanalyzable and simple, in which case it is difficult to see how it could usefully explain anything, let alone the nature of conscious states. Zahavi (2007), who is not a HOT or HOP theorist, also recognizes how unsatisfying invoking ‘acquaintance’ can be. It remains unclear what this acquaintance relation is supposed to be. For other variations on HOT theory, see Rolls (2004), Picciuto (2011), and Coleman (2015).

3. Objections and Replies

Several prominent objections to HOR (and counter-replies) can be found in the literature. Although some also apply to HOP theory, others are aimed more specifically at HOT theory.

First, some argue that various animals (and even infants) are not likely to have the conceptual sophistication required for HOTs, and so that would render animal (and infant) consciousness very unlikely (Dretske 1995, Seager 2004). Are cats and dogs capable of having complex higher-order thoughts such as “I am in mental state M”? Although most who bring forth this objection are not HO theorists, Carruthers (1989, 2000) is one HO theorist who actually embraces the conclusion that (most) animals do not have phenomenal consciousness.

However, perhaps HOTs need not be as sophisticated as it might initially appear, not to mention some comparative neurophysiological and experimental evidence supporting the conclusion that animals have conscious mental states (Gennaro 1993, 1996). Most HO theorists do not wish to accept the absence of animal or infant consciousness as a consequence of holding the theory. The debate has continued over the past two decades (see, for example, Carruthers 2000, 2005, 2008, 2009, and Gennaro 2004b, 2009, 2012, chapter eight). To give an example which seems to favor animal HOTs, Clayton and Dickinson and their colleagues (in Clayton, Bussey, and Dickinson 2003) have reported convincing demonstrations of memory for time in scrub jays. Scrub jays are food-caching birds, and when they have food they cannot eat, they hide it and recover it later. Because some of the food is preferred but perishable (such as crickets), it must be eaten within a few days, while other food (such as nuts) is less preferred but does not perish as quickly. In cleverly designed experiments using these facts, scrub jays are shown, even days after caching, to know not only what kind of food was where but also when they had cached it (see also Clayton, Emery, and Dickinson 2006). Such experimental results seem to show that they have episodic memory which involves a sense of self over time. This strongly suggests that the birds have some degree of meta-cognition with a self-concept (or “I-concept”) which can figure into HOTs. Further, many crows and scrub jays return alone to caches they had hidden in the presence of others and recache them in new places (Emery and Clayton 2001). This suggests that they know that others know where the food is cached, and thus, to avoid having their food stolen, they recache the food. This strongly suggests that these birds can have some mental concepts, not only about their own minds but even of other minds, which is sometimes referred to as “mindreading” ability. Of course, there are many different experiments aimed at determining the conceptual and meta-cognitive abilities of various animals, so it is difficult to generalize across species.

There does seem to be growing evidence that at least some animals can mind-read under familiar conditions. For example, Laurie Santos and colleagues show that rhesus monkeys attribute visual and auditory perceptions to others in more competitive paradigms (Flombaum and Santos 2005, Santos, Nissen, and Ferrugia 2006). Rhesus monkeys preferentially attempted to obtain food silently only in conditions in which silence was relevant to obtaining the food undetected. While a human competitor was looking away, monkeys would take grapes from a silent container, thus apparently understanding that hearing leads to knowing on the part of human competitors. Subjects reliably picked the container that did not alert the experimenter that a grape was being removed. This suggests that monkeys take into account how auditory information can change the knowledge state of the experimenter (see also, for example, the essays in Terrace and Metcalfe 2005). Some of these same issues arise with respect to infant concept possession and consciousness (see Gennaro 2012, chapter seven, Goldman 2006, Nichols and Stich 2003, but also Carruthers 2009).

A second objection has been referred to as the “problem of the rock” and is originally due to Alvin Goldman (Goldman 1993). When I have a thought about a rock, it is certainly not true that the rock becomes conscious. So why should I suppose that a mental state becomes conscious when I think about it? This is puzzling to many, and the objection forces HOT theorists to explain just how adding the HOT changes an unconscious state into a conscious one. There have been, however, a number of responses to this kind of objection (Rosenthal 1997, Van Gulick 2000, 2004, Gennaro 2005, 2012, chapter four). Perhaps the most common theme is that there is a principled difference in the objects of the thoughts in question. For one thing, rocks and similar objects are not mental states in the first place, and HOT theorists are first and foremost trying to explain how a mental state becomes conscious. The objects of the HOTs must be “in the head.”

Third, one might object to any reductionist theory of consciousness with something like Chalmers’ hard problem, that is, how or why brain activity produces conscious experience (Chalmers 1995). However, it is first important to keep in mind that HOT theory is unlike reductionist accounts in non-mentalistic terms and so is arguably immune to Chalmers’s criticism about the plausibility of theories which attempt a direct reduction to neurophysiology (Gennaro 2005). On HOT theory, there is no problem about how a specific brain activity “produces” conscious experience, nor is there an issue about any a priori or a posteriori relation between brains and consciousness. The issue instead is how HOT theory might be realized in our brains, for which there seems to be some evidence thus far (Gennaro 2012, chapters four and nine).

Still, it might be asked just how exactly any HOR theory really explains the subjective or phenomenal aspect of conscious experience. How or why does a mental state come to have a first-person qualitative “what it is like” aspect by virtue of the presence of a HOR directed at it? HOR theorists have been slow to address this problem, though a number of overlapping responses have emerged. Some argue that this objection misconstrues the main and more modest purpose of their HOT theories. The claim is that HOT theories are theories of consciousness only in the sense that they are attempting to explain what differentiates conscious from unconscious states, that is, in terms of a higher-order awareness of some kind. A full account of “qualitative properties” or “sensory qualities” (which can themselves be unconscious) can be found elsewhere in their work, but is independent of their theory of consciousness (Rosenthal 1991, 2005, Lycan 1996). Thus, a full explanation of phenomenal consciousness does require more than a HOR theory, but that is no objection to HOR theories as such. There is also a concern that proponents of the hard problem unjustly raise the bar as to what would count as a viable reductionist explanation of consciousness so that any such reductionist attempt would inevitably fall short (Carruthers 2000). Part of the problem may even be a lack of clarity about what would count as an explanation of consciousness (Van Gulick 1995).

Gennaro responds that HOTs explain how conscious states occur because the concepts that figure into the HOTs are necessarily presupposed in conscious experience (Gennaro 2012, chapter four, 2005). The idea is that first we receive information via our senses (or the “faculty of sensibility”). Some of this information will then rise to the level of unconscious mental states, but they do not become conscious until the more cognitive “faculty of understanding” operates on them via the application of concepts. We can arguably understand such concept application in terms of HOTs directed at first-order states. Thus, I consciously experience (and recognize) the blue house as a blue house partly because I apply the concepts “blue” and “house” (in my HOTs) to my basic perceptual states. Gennaro urges that if there is a real hard problem, it has more to do with explaining concept acquisition (Gennaro 2012, chapters six and seven).

A fourth, and very important, objection to higher-order approaches is the question of how such theories can explain cases where the HO state might misrepresent the lower-order (LO) mental state (Byrne 1997, Neander 1998, Levine 2001, Block 2011). After all, if we have a representational relation between two states, it seems possible for misrepresentation or malfunction to occur. If it does, then what explanation can be offered by the HO theorist? If my LO state registers a red percept and my HO state registers a thought about something green, then what happens? It seems that problems loom for any answer given by a HOT theorist, and the cause of the problem has to do with the very nature of the HO theorist’s belief that there is a representational relation between the LO and HO states. For example, if a HOT theorist takes the option that the resulting conscious experience is reddish, then it seems that the HOT plays no role in determining the qualitative character of the experience. On the other hand, if the resulting experience is greenish, then the LO state seems irrelevant. Nonetheless, Rosenthal and Weisberg hold that the HOT determines the qualitative properties, even in so-called “targetless” or “empty” HOT cases where there is no LO state at all (Rosenthal 2005, 2011, Weisberg 2008, 2011).

Gennaro argues instead that no conscious color experience would result in such cases, that is, neither a reddish nor a greenish experience, especially since, for example, it is difficult to see how a sole (unconscious) HOT can result in a conscious state at all (Gennaro 2012, chapter four, 2013). He argues that there must be a conceptual match, complete or partial, between the LO and HO state in order for the conscious experience to exist in the first place. Weisberg and Rosenthal argue that what really matters is how things seem to the subject and, if we can explain that, we have explained all that we need to. But the problem here is that the HOT alone then seems to be what matters. Doesn’t this defeat the purpose of HOT theory, which is supposed to explain state consciousness in terms of the relation between two states? Moreover, according to the theory, the lower-order state is supposed to be conscious when one has an unconscious HOT.

In the end, Gennaro argues for the more nuanced claim that:

Whenever a subject S has a HOT directed at experience e, the content c of S’s HOT determines the way that S experiences e (provided that there is a full or partial conceptual match with the lower-order state, or when the HO state contains more specific or fine-grained concepts than the LO state has, or when the LO state contains more specific or fine-grained concepts than the HO state has, or when the HO concepts can combine to match the LO concept) (Gennaro 2012, 180).

The reasons for the above qualifications are discussed in Gennaro (2012, chapter six), but they basically try to explain what happens in some abnormal cases (such as visual agnosia) and in some other atypical contexts (such as perceiving ambiguous figures like the vase-two faces image) where mismatches might occur between the HOT and LO state. For example, visual agnosia, or more specifically associative agnosia, seems to be a case where a subject has a conscious experience of an object without any conceptualization of the incoming visual information (Farah 2004). There appears to be a first-order perception of an object without the accompanying concept of that object (either first- or second-order, for that matter). Thus its “meaning” is gone and the object is not recognized. It seems that there can be conscious perceptions of objects without the application of concepts, that is, without recognition or identification of those objects. But one might instead hold that associative agnosia is simply an unusual case where the typical HOT does not fully match up with the first-order visual input. That is, we might view associative agnosia as a case where the “normal,” or most general, object concept in the HOT does not accompany the input received through the visual modality. There is a partial match instead. A HOT might partially recognize the LO state. So associative agnosia would be a case where the LO state could still register a percept of an object O (because the subject still does have the concept), but the HO state is limited to some features of O. Bare visual perception remains intact in the LO state but is confused and ambiguous, and thus the agnosic’s conscious experience of O “loses meaning,” resulting in a different phenomenological experience. When, for example, the agnosic does not (visually) recognize a whistle as a whistle, perhaps only the concepts ‘silver,’ ‘roundish,’ and ‘object’ are applied. But as long as that is how the agnosic experiences the object, then HOT theory is left unthreatened.

In any case, on Gennaro’s view, misrepresentations between M and the HOT cannot occur and still result in a conscious state; that is, they cannot result in a conscious experience reflecting mismatched and incompatible concepts (Gennaro 2012, 2013).

A final kind of objection worth mentioning has to do with various pathologies of self-awareness, such as somatoparaphrenia, which is a pathology of self characterized by the sense of alienation from parts of one’s body. It is a bizarre type of body delusion where one denies ownership of a limb or an entire side of one’s body. It is sometimes called a “depersonalization disorder.” Relatedly, anosognosia is a condition in which a person who suffers from a disability seems unaware of the existence of the disability. A person whose limbs are paralyzed will insist that his limbs are moving and will become furious when family and caregivers say that they are not. Somatoparaphrenia is usually caused by extensive right-hemisphere lesions, most commonly in the temporoparietal junction (Vallar and Ronchi 2009). Patients with somatoparaphrenia say some very strange things, such as “parts of my body feel as if they didn’t belong to me” (Sierra and Berrios 2000, 160) and “when a part of my body hurts, I feel so detached from the pain that it feels as if it were somebody else’s pain” (Sierra and Berrios 2000, 163). It is difficult to grasp what having these conscious thoughts and experiences is like.

There is some question as to whether or not the higher-order thought (HOT) theory of consciousness can plausibly account for the depersonalization psychopathology of somatoparaphrenia (Liang and Lane 2009, Rosenthal 2010, Lane and Liang 2010). Liang and Lane (2009) argue that it cannot. HOT theory has been critically examined in light of some psychopathologies because, according to HOT theory, what makes a mental state conscious is a HOT of the form “I am in mental state M.” The requirement of an I-reference leads some to think that HOT theory cannot explain such cases, since there would seem to be cases where I can have a conscious state and yet not attribute it to myself (but instead to someone else). Liang and Lane (2009) initially argued that somatoparaphrenia threatens HOT theory because it contradicts the requirement that the accompanying HOT be of the form “I am in mental state M.” The “I” is not only importantly self-referential but essential in tying the conscious state to oneself and, thus, to one’s ownership of M.

Rosenthal (2010) basically responds that one can be aware of bodily sensations in two ways that, normally at least, go together: (1) aware of a bodily sensation as one’s own, and (2) aware of a bodily sensation as having some bodily location, like a hand or foot. Patients with somatoparaphrenia still experience the sensation as their own but also as having a mistaken bodily location (perhaps somewhat analogous to phantom limb pain where patients experience pain in missing limbs). Such patients still do have the awareness in (1), which is the main issue at hand, but they have the strange awareness in sense (2). So somatoparaphrenia leads some people to misidentify the bodily location of a sensation as someone else’s, but the awareness of the sensation itself remains one’s own. Lane and Liang (2010) are not satisfied and, among other things, counter that Rosenthal’s analogy to phantom limbs is faulty, and that he has still not explained why the identification of the bearer of the pain cannot also go astray.

Among other things, Gennaro (2015b) replies first that we must remember that many of these patients often deny feeling anything in the limb in question (Bottini et al. 2002). As Liang and Lane point out, patient FB (Bottini et al. 2002), while blindfolded, feels “no tactile sensation” (2009, 664) when the examiner would in fact touch the dorsal surface of FB’s hand. In these cases, it is particularly difficult to see what the problem is for HOT theory at all. But when there really is a bodily sensation of some kind, a HOT theorist might also argue that there are really two conscious states that seem to be at odds. There is a conscious feeling in a limb but also the (conscious) attribution of the limb to someone else. It is crucial to emphasize that somatoparaphrenia is often characterized as a delusion of belief, often under the broader category of anosognosia. A delusion is often defined as a false belief that is held based on an incorrect (and probably unconscious) inference about external reality or oneself that is firmly sustained despite what almost everyone else believes and despite what constitutes incontrovertible and obvious proof or evidence to the contrary (Bortolotti 2009, Radden 2010). In some cases, delusions seriously inhibit normal day-to-day functioning. Beliefs are often taken to be intentional states integrated with other beliefs. They are typically understood as caused by perceptions or experiences that then lead to action or behavior. Thus, somatoparaphrenia is, in some ways, closer to self-deception and involves frequent confabulation. For more on this disagreement as well as the phenomenon of thought insertion in schizophrenia, see also Lane (2015).

4. HOT Theory and Conceptualism

Consider again the related claim that HOT theory can explain how one’s conceptual repertoire can transform one’s phenomenological experience. Concepts, at minimum, involve recognizing and understanding objects and properties. Having a concept C should also give the concept possessor the ability to discriminate instances of C and non-C’s. For example, if I have the concept ‘tiger’ I should be able to identify tigers and distinguish them from other even fairly similar land animals. Rosenthal invokes the idea that acquiring concepts can change one’s conscious experience with the help of several well-known examples (2005, 187-188). Acquiring various concepts from a wine-tasting course will lead to different experiences from those taste experiences enjoyed prior to the course. I acquire more fine-grained wine-related concepts, such as “dry” and “heavy,” which in turn can figure into my HOTs and thus alter my conscious experiences. I literally have different qualia due to the change in my conceptual repertoire. As we learn more concepts, we have more fine-grained experiences and thus experience more qualitative complexities. A botanist will likely have somewhat different perceptual experiences than I do while walking through a forest. By contrast, those with a more limited conceptual repertoire, such as infants and animals, will often have a more coarse-grained set of experiences. Much the same goes for other sensory modalities, such as the way that I experience a painting after learning more about artwork and color. The notion of “seeing-as” (“hearing-as” and so on) is often used in this context, that is, when I possess different concepts I literally experience the world differently.

Thus, Gennaro argues that there is a very close and natural connection between HOT theory and what is known as “conceptualism” (Gennaro 2012, chapter six, 2013). Chuard (2007) defines conceptualism as the claim that “the representational content of a perceptual experience is fully conceptual in the sense that what the experience represents (and how it represents it) is entirely determined by the conceptual capacities the perceiver brings to bear in her experience” (Chuard 2007, 25). In any case, the basic idea is that, just like beliefs and thoughts, perceptual experiences also have conceptual content. In a somewhat Kantian spirit, one might say that all conscious experience presupposes the application of concepts, or, even stronger, that the way one experiences the world is entirely determined by the concepts one possesses. Indeed, Gunther (2003, 1) initially uses Kant’s famous slogan that “thoughts without content are empty, intuitions [= sensory experiences] without concepts are blind” to sum up conceptualism (Kant 1781/1965, A51/B75).

5. Hybrid Higher-Order and Self-Representational Theories

Some related representationalist views hold that the HOR in question should be understood as intrinsic to (or part of) an overall complex conscious state. This stands in contrast, for example, to the standard view that the HOT is extrinsic to (that is, entirely distinct from) its target mental state. One motivation for this shift is renewed interest in a view somewhat closer to the one held by Franz Brentano (1874/1973) and others, normally associated with the phenomenological tradition (Sartre 1956, Smith 2004). To varying degrees, these theories have in common the idea that conscious mental states, in some sense, represent themselves, which still involves having a thought about a mental state, just not a distinct or separate state. Thus, when one has a conscious desire for a beer, one is also aware that one is in that very state. The conscious desire represents both the beer and itself. It is this “self-representing” which makes the state conscious.

Gennaro has argued that, when one has a first-order conscious state, the (unconscious) HOT is better viewed as intrinsic to the target state, so that we have a complex conscious state with parts (Gennaro 1996, 2006, 2012). This is what he calls the “wide intrinsicality view” (WIV), which he takes to be a version of HOT theory, and he argues elsewhere that Sartre’s theory of consciousness could be understood in this way (Gennaro 2002, 2015). On the WIV, first-order conscious states are complex states with a world-directed part and a meta-psychological component. Robert Van Gulick (2000, 2004, 2006) has also explored the alternative that the HO state is part of an overall global conscious state. He calls such states “HOGS” (Higher-Order Global States), whereby a lower-order unconscious state is “recruited” into a larger state, which becomes conscious partly due to the implicit self-awareness that one is in the lower-order state.

This general approach is also forcefully advocated by Uriah Kriegel in a series of papers, beginning with Kriegel (2003) and culminating in Kriegel (2009). He refers to it as the “self-representational theory of consciousness” (see also Kriegel and Williford 2006). To be sure, the notion of a mental state representing itself, or of a mental state with one part representing another part, is in need of further development. Nonetheless, there is agreement among all of these authors that conscious mental states are, in some important sense, reflexive or self-directed.

More specifically, Kriegel (2003, 2006, 2009) has tried to cash out TP in terms of a ubiquitous (conscious) “peripheral” self-awareness which accompanies all of our first-order focal conscious states. Not all conscious “directedness” is attentive, and so perhaps we should not restrict conscious directedness to that which we are consciously focused on. If that is right, then a first-order conscious state can be both attentively outer-directed and inattentively inner-directed. Gennaro has argued against this view at length (Gennaro 2008, Gennaro 2012, chapter five). For example, although it is surely true that there are degrees of conscious attention, the clearest example of genuine “inattentive” consciousness is outer-directed awareness in one’s peripheral visual field. But this obviously does not show that any inattentional consciousness is self-directed during outer-directed consciousness, let alone at the very same time. Also, what is the evidence for such self-directed inattentional consciousness? It is presumably based on phenomenological considerations, but Gennaro claims not to find such ubiquitous inattentive self-directed “consciousness” in his outer-directed conscious experience. Except when he is introspecting, Gennaro thinks that conscious experience is so completely outer-directed that there really is no such peripheral self-directed consciousness when in first-order conscious states. He says that it does not seem to him that he is consciously aware of his own experience when, say, consciously attending to a band in concert or to the task of building a bookcase. Even some who are otherwise very sympathetic to Kriegel’s phenomenological approach find it difficult to believe that “pre-reflective” (inattentional) self-awareness accompanies conscious states (Siewert 1998, Zahavi 2004) or at least that all conscious states involve such self-awareness (Smith 2004). Self-representationalism is also a target of the objection discussed in section 3 regarding somatoparaphrenia and related deficits of self-awareness (for more on this dispute, see Lane 2015 and Billon and Kriegel 2015).

In the end, Kriegel actually holds that there is an indirect self-representation applicable to conscious states, with the self-representational peripheral component directed at the world-directed part of the state (2009, 215-226). This seems closer to Gennaro’s WIV, but Kriegel thinks that “pre-reflective self-awareness” or the “self-representation” is itself (peripherally) conscious. For others who hold some form of the self-representational view, see Williford (2006) and Janzen (2008). Carruthers’ (2000, 2005) theory can also be viewed in this light since, as we have seen, he contends that conscious states have two representational contents.

6. HOT Theory and the Prefrontal Cortex

An interesting line of research in recent years has focused on attempts to identify just how HOT theory and self-representationalism might be realized in the brain. We have seen that most representationalists tend to think that the structure of conscious states is realized in the brain (though it may take some time to identify all the main neural structures). The issue is sometimes framed in terms of the question: “how global is HOT theory?” That is, do conscious mental states require widespread brain activation, or can at least some be fairly localized in narrower areas of the brain? Perhaps most interesting is whether or not the prefrontal cortex (PFC) is required for having conscious states (Gennaro 2012, chapter nine). Gennaro disagrees with Kriegel (2007, 2009, chapter seven) and Block (2007) that, according to the higher-order and self-representational view, the PFC is required for most conscious states (see also Del Cul et al. 2007, Lau and Rosenthal 2011). It may very well be that the PFC is required for the more sophisticated introspective states, but this is not a problem for HOT theory as such because it does not require introspection for having first-order conscious states.

Are there conscious states without PFC activity? It seems so. For example, Rafael Malach and colleagues show that when subjects are engaged in a perceptual task or absorbed in watching a movie, there is widespread neural activation but little PFC activity (Grill-Spector and Malach 2004, Goldberg, Harel, and Malach 2006). Although some other studies do show PFC activation, this is mainly because of the need for subjects to report their experiences. Also, basic conscious experience is certainly not entirely eliminated even when there is extensive bilateral PFC damage or lobotomies (Pollen 2008). Zeki (2007) also cites evidence that the frontal cortex is engaged only when reportability is part of the conscious experience and that all human color imaging experiments have been unanimous in not showing any particular activation of the frontal lobes. Similar results are found for other sensory modalities, for example, in auditory perception (Baars and Gage 2010, chapter seven). Although areas outside the auditory cortex are sometimes cited, there is virtually no mention of the PFC.

Gennaro thinks that the above line of argument actually works to the advantage of HOT theory with regard to the problem of animal and infant consciousness. If HOT theory does not require PFC activity for all conscious states, then HOT theory is in an even better position to account for animal and infant consciousness, since it is doubtful that they have the requisite PFC activity.

But why think that unconscious HOTs can occur outside the PFC? If we grant that unconscious HOTs can be regarded as a kind of “pre-reflective” self-consciousness, then one might, for example, look to Newen and Vogeley (2003) for answers. They distinguish five levels of self-consciousness ranging from “phenomenal self-acquaintance” and “conceptual self-consciousness” up to “iterative meta-representational self-consciousness.” They are explicitly concerned with the neural correlates of what they call the “first-person perspective” (1PP) and the “egocentric reference frame.” Citing numerous experiments, they point to various neural signatures of self-consciousness. The PFC is rarely mentioned, and then usually only with regard to more sophisticated forms of self-consciousness. Other brain areas are much more prominently identified, such as the medial and inferior parietal cortices, the temporoparietal cortex, the posterior cingulate cortex, and the anterior cingulate cortex (ACC). Kriegel (2007) also mentions the ACC as a possible location for HOTs, but it should be noted that the ACC is, at least sometimes, considered to be part of the PFC.

Damasio (1999) explicitly mentions the ACC as a site for some higher-order mental activity or “maps.” There are various cortical association areas that might be good candidates for HOTs depending on the modality. For example, key regions for spatial navigation comprise the medial parietal and right inferior parietal cortex, posterior cingulate cortex, and the hippocampus. Even when considering the neural signatures of theory of mind and mind-reading, Newen and Vogeley have replicated experiments indicating that such meta-representation is best located in the ACC. Moreover, “the capacity for taking 1PP in such [theory of mind] contexts showed differential activation in the right temporo-parietal junction and the medial aspects of the superior parietal lobe” (Newen and Vogeley 2003, 538). Again, even if the PFC is essential for having certain HOTs and conscious states, this poses no threat to HOT theory provided that the HOTs in question are of the more sophisticated introspective variety.

This matter is certainly not yet settled, but Gennaro urges that it is a mistake, both philosophically and neurophysiologically, to claim that HOT theory should treat first-order conscious states as essentially including PFC activity. Further, and to tie this together with the animals issue, Gennaro concedes the following: “If all HOTs occur in the PFC, and if PFC activity is necessary for all conscious experience, and if there is little or no PFC activity in infants and most animals, then either (a) infants and most animals do not have conscious experience or (b) HOT theory is false” (Gennaro 2012, 281). Carruthers (2000, 2005) and perhaps Rosenthal opt for (a). Still, Gennaro argues that a good case can be made for the falsity of one or more of the conjuncts in the antecedent of the above conditional.

Kozuch (2014) presents a very useful discussion of the PFC in relation to higher-order theories, arguing that the lack of dramatic deficits in visual consciousness even with PFC lesions presents a compelling case against higher-order theories. For example, in addition to the studies cited above, Kozuch references Alvarez and Emory (2006) as evidence for the view that

Lesions to the orbital, lateral, or medial PFC produce so-called executive dysfunction. Depending on the precise lesion location, subjects with damage to one of these areas have problems inhibiting inappropriate actions, switching efficiently from task to task, or retaining items in short-term memory. However, lesions to these areas appear not to produce notable deficits in visual consciousness: Tests of the perceptual abilities of subjects with lesions to the PFC proper reveal no such deficits; also, PFC patients never report their visual experience to have changed in some remarkable way (Kozuch 2014, 729).

Kozuch notes that Gennaro’s WIV may be left undamaged, at least to some extent, since Gennaro does not require that the PFC is where HOTs are realized. It is also important to keep in mind the distinction between unconscious HOTs and conscious HOTs (= introspection). Perhaps the latter require PFC activity, given the more sophisticated executive functions associated with introspection, but having first-order conscious states does not require introspection. Yet another interesting argument along these lines is put forth by Sebastián (2014) with respect to some dream states. If some dreams are conscious states and there is little, if any, PFC activity during the dream period, then HOT theory would again be in trouble if we suppose that HOTs are realized in the PFC.

In conclusion, higher-order theory remains a viable theory of consciousness, especially for those attracted to a reductionist account, though not, at present, to a reduction in purely neurophysiological terms. Although there are significant objections to the different versions of HOR, at least some plausible replies have emerged over the years. HOR also retains a degree of intuitive plausibility owing to the Transitivity Principle (TP). Moreover, HOT theory may help to shed light on conceptualism and can contribute to the question of the PFC’s role in producing conscious states.

7. References and Further Reading
Alvarez, J. and Emory, E. 2006. Executive Function and the Frontal Lobes: A Meta-Analytic Review. Neuropsychology Review 16: 17-42.
Armstrong, D. 1981. What is Consciousness? In The Nature of Mind. Ithaca, New York: Cornell University Press.
Baars, B. and Gage, N. 2010. Cognition, Brain, and Consciousness: Introduction to Cognitive Neuroscience. Second edition. Oxford: Elsevier.
Bayne, T. and Montague, M. eds. 2011. Cognitive Phenomenology. New York: Oxford University Press.
Billon, A. and Kriegel, U. 2015. Jaspers’ Dilemma: The Psychopathological Challenge to Subjectivity Theories of Consciousness. In R. Gennaro ed. Disturbed Consciousness. Cambridge, MA: MIT Press.
Block, N. 1996. Mental Paint and Mental Latex. In E. Villanueva ed. Perception. Atascadero, CA: Ridgeview.
Block, N. 2007. Consciousness, Accessibility, and the Mesh between Psychology and Neuroscience. Behavioral and Brain Sciences 30: 481-499.
Block, N. 2011. The Higher-Order Approach to Consciousness is Defunct. Analysis 71: 419-431.
Bottini, G., Bisiach, E., Sterzi, R., and Vallar, G. 2002. Feeling Touches in Someone Else’s Hand. NeuroReport 13: 249-252.
Bortolotti, L. 2009. Delusions and Other Irrational Beliefs. New York: Oxford University Press.
Brentano, F. 1874/1973. Psychology From an Empirical Standpoint. New York: Humanities.
Byrne, A. 1997. Some like it HOT: Consciousness and Higher-Order Thoughts. Philosophical Studies 86: 103-129.
Byrne, A. 2001. Intentionalism Defended. Philosophical Review 110: 199-240.
Carruthers, P. 1989. Brute Experience. Journal of Philosophy 86: 258-269.
Carruthers, P. 2000. Phenomenal Consciousness. Cambridge: Cambridge University Press.
Carruthers, P. 2004. HOP over FOR, HOT Theory. In R. Gennaro ed. Higher-Order Theories of Consciousness: An Anthology. Amsterdam: John Benjamins.
Carruthers, P. 2005. Consciousness: Essays from a Higher-Order Perspective. New York: Oxford University Press.
Carruthers, P. 2008. Meta-Cognition in Animals: A Skeptical Look. Mind and Language 23: 58-89.
Carruthers, P. 2009. How we Know our Own Minds: The Relationship Between Mindreading and Metacognition. Behavioral and Brain Sciences 32: 121-138.
Chalmers, D. 1995. Facing Up to the Problem of Consciousness. Journal of Consciousness Studies 2: 200-219.
Chalmers, D. 1996. The Conscious Mind. New York: Oxford University Press.
Chalmers, D. 2004. The Representational Character of Experience. In B. Leiter ed. The Future for Philosophy. Oxford: Oxford University Press.
Chuard, P. 2007. The Riches of Experience. In R. Gennaro ed. The Interplay between Consciousness and Concepts. Exeter: Imprint Academic.
Chudnoff, E. 2015. Cognitive Phenomenology. New York: Routledge.
Clayton, N., Bussey, T., and Dickinson, A. 2003. Can Animals Recall the Past and Plan for the Future? Nature Reviews Neuroscience 4: 685-691.
Clayton, N., Emery, N., and Dickinson, A. 2006. The Rationality of Animal Memory: Complex Caching Strategies of Western Scrub Jays. In Hurley and Nudds 2006.
Coleman, S. 2015. Quotational Higher-Order Thought Theory. Philosophical Studies 172: 2705-2733.
Damasio, A. 1999. The Feeling of What Happens. New York: Harcourt Brace and Co.
Del Cul, A., Baillet, S., and Dehaene, S. 2007. Brain Dynamics Underlying the Nonlinear Threshold for Access to Consciousness. PLoS Biology 5: 2408-2423.
Dretske, F. 1995. Naturalizing the Mind. Cambridge, MA: MIT Press.
Droege, P. 2003. Caging the Beast. Philadelphia and Amsterdam: John Benjamins Publishers.
Emery, N. and Clayton, N. 2001. Effects of Experience and Social Context on Prospective Caching Strategies in Scrub Jays. Nature 414: 443-446.
Farah, M. 2004. Visual Agnosia, 2nd ed. Cambridge, MA: MIT Press.
Flombaum, J. and Santos, L. 2005. Rhesus Monkeys Attribute Perceptions to Others. Current Biology 15: 447-452.
Gennaro, R. 1993. Brute Experience and the Higher-Order Thought Theory of Consciousness. Philosophical Papers 22: 51-69.
Gennaro, R. 1996. Consciousness and Self-consciousness: A Defense of the Higher-Order Thought Theory of Consciousness. Amsterdam and Philadelphia: John Benjamins.
Gennaro, R. 2002. Jean-Paul Sartre and the HOT Theory of Consciousness. Canadian Journal of Philosophy 32: 293-330.
Gennaro, R. ed. 2004a. Higher-Order Theories of Consciousness: An Anthology. Amsterdam and Philadelphia: John Benjamins.
Gennaro, R. 2004b. Higher-Order Thoughts, Animal Consciousness, and Misrepresentation: A Reply to Carruthers and Levine. In R. Gennaro ed. Higher-Order Theories of Consciousness: An Anthology. Amsterdam: John Benjamins.
Gennaro, R. 2005. The HOT Theory of Consciousness: Between a Rock and a Hard Place? Journal of Consciousness Studies 12 (2): 3-21.
Gennaro, R. 2006. Between Pure Self-Referentialism and the (extrinsic) HOT Theory of Consciousness. In U. Kriegel and K. Williford eds. Self-Representational Approaches to Consciousness. Cambridge, MA: MIT Press.
Gennaro, R. 2008. Representationalism, Peripheral Awareness, and the Transparency of Experience. Philosophical Studies 139: 39-56.
Gennaro, R. 2009. Animals, consciousness, and I-thoughts. In R. Lurz ed. Philosophy of Animal Minds. New York: Cambridge University Press.
Gennaro, R. 2012. The Consciousness Paradox: Consciousness, Concepts, and Higher-Order Thoughts. Cambridge, MA: MIT Press.
Gennaro, R. 2013. Defending HOT Theory and the Wide Intrinsicality View: A reply to Weisberg, Van Gulick, and Seager. Journal of Consciousness Studies 20 (11-12): 82-100.
Gennaro, R. 2015a. The ‘of’ of Intentionality and the ‘of’ of Acquaintance. In S. Miguens, G. Preyer, and C. Morando eds. Pre-Reflective Consciousness: Sartre and Contemporary Philosophy of Mind. New York: Routledge Publishers.
Gennaro, R. 2015b. Somatoparaphrenia, Anosognosia, and Higher-Order Thoughts. In R. Gennaro ed. Disturbed Consciousness. Cambridge, MA: MIT Press.
Gennaro, R. ed. 2015c. Disturbed Consciousness: New Essays on Psychopathology and Theories of Consciousness. Cambridge, MA: MIT Press.
Goldberg, I., Harel, M., and Malach, R. 2006. When the Brain Loses its Self: Prefrontal Inactivation during Sensorimotor Processing. Neuron 50: 329-339.
Goldman, A. 1993. Consciousness, Folk Psychology and Cognitive Science. Consciousness and Cognition 2: 264-82.
Goldman, A. 2006. Simulating Minds. New York: Oxford University Press.
Grill-Spector, K. and Malach, R. 2004. The Human Visual Cortex. Annual Review of Neuroscience 27: 649-677.
Gunther, E. ed. 2003. Essays on Nonconceptual Content. Cambridge, MA: MIT Press.
Harman, G. 1990. The Intrinsic Quality of Experience. In J. Tomberlin ed. Philosophical Perspectives, 4. Atascadero, CA: Ridgeview Publishing.
Horgan, T. and Tienson, J. 2002. The Intentionality of Phenomenology and the Phenomenology of Intentionality. In D. Chalmers ed. Philosophy of Mind: Classical and Contemporary Readings. New York: Oxford University Press.
Hurley, S. and Nudds, M. eds. 2006. Rational Animals? New York: Oxford University Press.
Janzen, G. 2008. The Reflexive Nature of Consciousness. Amsterdam and Philadelphia: John Benjamins.
Jehle, D. and Kriegel, U. 2006. An Argument against Dispositional HOT Theory. Philosophical Psychology 19: 462-476.
Kant, I. 1781/1965. Critique of Pure Reason. Translated by N. Kemp Smith. New York: MacMillan.
Kozuch, B. 2014. Prefrontal Lesion Evidence against Higher-Order Theories of Consciousness. Philosophical Studies 167: 721-746.
Kriegel, U. 2002. PANIC Theory and the Prospects for a Representational Theory of Phenomenal Consciousness. Philosophical Psychology 15: 55-64.
Kriegel, U. 2003. Consciousness as Intransitive Self-Consciousness: Two Views and an Argument. Canadian Journal of Philosophy 33: 103-132.
Kriegel, U. 2005. Naturalizing Subjective Character. Philosophy and Phenomenological Research 71: 23-56.
Kriegel, U. 2006. The Same Order Monitoring Theory of Consciousness. In U. Kriegel and K. Williford eds. Self-Representational Approaches to Consciousness. Cambridge, MA: MIT Press.
Kriegel, U. 2007. A Cross-Order Integration Hypothesis for the Neural Correlate of Consciousness. Consciousness and Cognition 16: 897-912.
Kriegel, U. 2009. Subjective Consciousness. New York: Oxford University Press.
Kriegel, U. ed. 2013. Phenomenal Intentionality. New York: Oxford University Press.
Kriegel, U. and Williford, K. eds. 2006. Self-Representational Approaches to Consciousness. Cambridge, MA: MIT Press.
Lane, T. 2015. Self, Belonging, and Conscious Experience: A Critique of Subjectivity Theories of Consciousness. In R. Gennaro ed. Disturbed Consciousness. Cambridge, MA: MIT Press.
Lane, T. and Liang, C. 2010. Mental Ownership and Higher-Order Thought. Analysis 70: 496-501.
Lau, H. and Rosenthal, D. 2011. Empirical Support for Higher-Order Theories of Conscious Awareness. Trends in Cognitive Sciences 15: 365–373.
Levine, J. 2001. Purple Haze: The Puzzle of Conscious Experience. Cambridge, MA: MIT Press.
Liang, L. and Lane, T. 2009. Higher-Order Thought and Pathological Self: The Case of Somatoparaphrenia. Analysis 69: 661-668.
Lurz, R. ed. 2009. The Philosophy of Animal Minds. Cambridge, MA: Cambridge University Press.
Lurz, R. 2011. Mindreading Animals. Cambridge, MA: MIT Press.
Lycan, W. 1996. Consciousness and Experience. Cambridge, MA: MIT Press.
Lycan, W. 2001. A Simple Argument for a Higher-Order Representation Theory of Consciousness. Analysis 61: 3-4.
Lycan, W. 2004. The Superiority of HOP to HOT. In R. Gennaro ed. Higher-Order Theories of Consciousness: An Anthology. Amsterdam: John Benjamins.
Nagel, T. 1974. What is it Like to be a Bat? Philosophical Review 83: 435-456.
Neander, K. 1998. The Division of Phenomenal Labor: A Problem for Representational Theories of Consciousness. Philosophical Perspectives 12: 411-434.
Newen, UN. and Vogeley, K. 2003. Self-Representation: Searching for a Neural Signature of Self-Consciousness. Consciousness and Cognition 12: 529-543.
Nichols, S. and Stich, S. 2003. Mindreading. New York: Oxford University Press.
Picciuto, V. 2011. Addressing Higher-Order Misrepresentation with Quotational Thought. Journal of Consciousness Studies 18 (3-4): 109-136.
Pollen, D. 2008. Fundamental Requirements for Primary Visual Perception. Cerebral Cortex 18: 1991-1998.
Prinz, J. 2012. The Conscious Brain. New York: Oxford University Press.
Radden, J. 2010. On Delusion. Abingdon and New York: Routledge.
Rolls, E. 2004. A Higher Order Syntactic Thought (HOST) Theory of Consciousness. In R. Gennaro ed. Higher-Order Theories of Consciousness: An Anthology. Amsterdam: John Benjamins.
Rosenthal, D.M. 1986. Two Concepts of Consciousness. Philosophical Studies 49: 329-359.
Rosenthal, D.M. 1991. The Independence of Consciousness and Sensory Quality. Philosophical Issues 1: 15-36.
Rosenthal, D.M. 1997. A Theory of Consciousness. In N. Block, O. Flanagan, and G. Güzeldere eds. The Nature of Consciousness. Cambridge, MA: MIT Press.
Rosenthal, D.M. 2002. Explaining Consciousness. In D. Chalmers ed. Philosophy of Mind: Classical and Contemporary Readings. New York: Oxford University Press.
Rosenthal, D.M. 2004. Varieties of Higher-Order Theory. In R. Gennaro ed. Higher-Order Theories of Consciousness: An Anthology. Philadelphia and Amsterdam: John Benjamins.
Rosenthal, D.M. 2005. Consciousness and Mind. New York: Oxford University Press.
Rosenthal, D.M. 2010. Consciousness, the Self and Bodily Location. Analysis 70: 270-276.
Rosenthal, D.M. 2011. Exaggerated Reports: Reply to Block. Analysis 71: 431-437.
Santos, L., Nissen, A., and Ferrugia, J. 2006. Rhesus monkeys, Macaca mulatta, Know What Others Can and Cannot Hear. Animal Behaviour 71: 1175-1181.
Sartre, J. 1956. Being and Nothingness. New York: Philosophical Library.
Sauret, W. and Lycan, W. 2014. Attention and Internal Monitoring: A Farewell to HOP. Analysis 74: 363-370.
Seager, W. 2004. A Cold Look at HOT Theory. In R. Gennaro ed. Higher-Order Theories of Consciousness: An Anthology. Amsterdam: John Benjamins.
Searle, J. 1992. The Rediscovery of the Mind. Cambridge, MA: MIT Press.
Sebastián, M. 2013. Not a HOT Dream. In R. Brown ed. Consciousness Inside and Out: Phenomenology, Neuroscience, and the Nature of Experience. Dordrecht: Springer.
Sierra, M. and Berrios, G. 2000. The Cambridge Depersonalisation Scale: a New Instrument for the Measurement of Depersonalisation. Psychiatry Research 93: 153-164.
Siewert, C. 1998. The Significance of Consciousness. Princeton: Princeton University Press.
Smith, D.W. 2004. Mind World: Essays in Phenomenology and Ontology. Cambridge, MA: Cambridge University Press.
Terrace, H. and Metcalfe, J. eds. 2005. The Missing Link in Cognition: Origins of Self-Reflective Consciousness. New York: Oxford University Press.
Tye, M. 1995. Ten Problems of Consciousness. Cambridge, MA: MIT Press.
Tye, M. 2000. Consciousness, Color, and Content. Cambridge, MA: MIT Press.
Vallar, G. and Ronchi, R. 2009. Somatoparaphrenia: A Body Delusion. A Review of the Neuropsychological Literature. Experimental Brain Research 192: 533-551.
Van Gulick, R. 1995. What Would Count as Explaining Consciousness? In T. Metzinger ed. Conscious Experience. Paderborn: Ferdinand Schöningh.
Van Gulick, R. 2000. Inward and Upward: Reflection, Introspection and Self-awareness. Philosophical Topics 28: 275-305.
Van Gulick, R. 2004. Higher-Order Global States (HOGS): An Alternative Higher-Order Model of Consciousness. In R. Gennaro ed. Higher-Order Theories of Consciousness: An Anthology. Amsterdam: John Benjamins.
Van Gulick, R. 2006. Mirror Mirror—Is That All? In U. Kriegel and K. Williford eds. Self-Representational Approaches to Consciousness. Cambridge, MA: MIT Press.
Weisberg, J. 2008. Same Old, Same Old: The Same-Order Representation Theory of Consciousness and the Division of Phenomenal Labor. Synthese 160: 161-181.
Weisberg, J. 2011. Misrepresenting Consciousness. Philosophical Studies 154: 409-433.
Williford, K. 2006. The Self-Representational Structure of Consciousness. In Kriegel and Williford 2006.
Zahavi, D. 2004. Back to Brentano? Journal of Consciousness Studies 11 (10-11): 66-87.
Zahavi, D. 2007. The Heidelberg School and the Limits of Reflection. In S. Heinämaa, V. Lähteenmäki, and P. Remes eds. Consciousness: From perception to reflection in the history of philosophy. Dordrecht: Springer.
Zeki, S. 2007. A Theory of Micro-Consciousness. In M. Velmans and S. Schneider eds. The Blackwell Companion to Consciousness. Malden, MA: Blackwell.

Author Information

Rocco J. Gennaro
E-mail: [email protected]
University of Southern Indiana
U. S. UN.
