Talk:Consciousness Studies/Nineteenth To Twenty First Century Philosophy

Phenomenal consciousness
When I started adding information about "phenomenal" consciousness to Consciousness studies (24 August 2005) the section on The neuroscience of consciousness uncritically (without discussion on that page and without reference to discussion elsewhere) introduced the dualistic distinction between "phenomenal" and "access" consciousness. It was suggested that neuroscience currently investigates "access" consciousness but "phenomenal" consciousness eludes scientific explanation.


 * Thank you for adding a link; there is a lack of links in the book. RobinH 20:14, 5 September 2005 (UTC).

I searched the Consciousness studies wikibook for discussion of the distinction between "phenomenal" and "access" consciousness. I found discussion of these categories of consciousness at the page on "The conflict", so I made a link from the word "phenomenal" in The neuroscience of consciousness to The conflict. In The conflict, I started adding information about Block's division of consciousness into "phenomenal" and "access" consciousness.

The information about Block was then moved to the page about Nineteenth and twentieth century philosophy. I continued to edit that page, providing both a description of Block's ideas and a description of why neuroscientists resist Block's division of consciousness into "phenomenal" and "access" consciousness.


 * Your discussion was about Block rather than the whole issue raised on the page (phenomenal consciousness and access consciousness), and it is an interesting addition to the twentieth century philosophers.

On 5 September 2005 User:RobinH removed from Nineteenth and twentieth century philosophy the account of why neuroscientists resist Block's division of consciousness into "phenomenal" and "access" consciousness.

This comment was inserted: "why is this relevant to Block? It is discussed in great depth later" in place of the account of why neuroscientists resist Block's division of consciousness into "phenomenal" and "access" consciousness. The edit (03:07, 5 September 2005 RobinH) was described as:

(NPOV remove subjective text (what computer scientists feel) ! Remove rant against skepticism)

The entire account of Block's division of consciousness into "phenomenal" and "access" consciousness was added to Consciousness studies in an attempt to counter an uncritical account of this dualistic division and its relevance to neuroscience. Neuroscientists do not accept Block's dualism and it is important to say so.


 * The division into phenomenal and access consciousness is not dualism. Phenomenal consciousness could be a non-algorithmic physical phenomenon such as proposed by Pribram and Bohm, or Penrose and Hameroff, or Green, or Stapp, etc. However, given that Turing machines are algorithmic, this would suggest that phenomenal consciousness is not something that is done by digital computers. RobinH

To remove the account of why neuroscientists do not accept Block's dualism while asking, "why is this relevant to Block?" misses the whole point of why the account was added.


 * Many neuroscientists accept the concept of phenomenal consciousness, certainly cognitivists and most indirect realist neurophysiologists would be in this category. I know many neurophysiologists and would say that indirect realism is the conventional wisdom in that field. RobinH

Block's dualistic account of consciousness is depicted in Consciousness studies as being of fundamental importance to the neuroscience of consciousness, a view that is rejected by most neuroscientists. It is entirely relevant to add discussion of this issue. If the objection by User:RobinH is simply that the reaction of neuroscientists to Block should not be a part of the Block section of Nineteenth and twentieth century philosophy, then it should be moved to another page. The material about Block was originally being added to another page that deals with the dualistic distinction between "phenomenal" and "access" consciousness. User:RobinH is the editor who moved the account of Block's dualistic distinction between "phenomenal" and "access" consciousness (and reaction to it) to "Nineteenth and twentieth century philosophy".

User:RobinH seems to be justifying removal of the account of the views of neuroscientists because such accounts do not present a neutral point of view. However, the original depiction by Consciousness studies of the fundamental importance of the dualistic distinction between "phenomenal" and "access" consciousness to the neuroscience of consciousness was not itself a neutral point of view. The material about how neuroscientists react to Block's dualistic distinction between "phenomenal" and "access" consciousness was added to Consciousness studies as an attempt to counter the original bias of the book.

According to the introduction:

"Everyone has their own view of the nature of consciousness based on their education and background, the intention of this book is to expand this view by providing an insight into the various ideas and beliefs on the subject as well as a review of current work in neuroscience. The neuroscientist should find the philosophical discussion interesting because this provides first person insights into the nature of consciousness and also provides some subtle arguments about why consciousness is not a simple problem."

If a goal of this book is "providing an insight into the various ideas and beliefs on the subject" then it seems strange to delete descriptions of why some neuroscientists, computer scientists and philosophers question the "first person insights into the nature of consciousness" that have been proclaimed by philosophers such as Block. --JWSurf 16:42, 5 September 2005 (UTC)

Response by RobinH 19:43, 5 September 2005 (UTC)
Firstly please forgive me for editing so aggressively. My problem with the pieces that were deleted is described below.

The first piece that was taken out is the following:


 * Many neurobiologists and computer scientists feel that philosophers such as Block and Searle are overly-pessimistic about the power of "computation", "program" or "algorithm" to produce human-like consciousness. The study of "computation", "program", "algorithm" and consciousness is too primitive for us to be able to trust our intuitions about exactly what is possible for computational algorithms to accomplish. Further, it may not matter what we call physical processes that can generate consciousness as long as we can figure out what they are and how to work with them. Thus, neurobiologists and computer scientists feel justified in continuing to search for the physical basis of consciousness and for ways to endow man-made devices with human-like consciousness. Further, despite warnings from philosophers, neurobiologists and computer scientists often suspect that conventional physical accounts of brain processes and some form of computational algorithm can be found to explain consciousness and allow us to instantiate it in robots.

My gripe with this is that it is about what the author believes to be the prevailing opinion amongst neuroscientists. If Searle is overly pessimistic then the author should quote who has analysed Searle and found him to be wrong, and why. Searle's argument is basically the same as Leibniz's concern: a computer can be made out of mechanical parts, but would a set of rolling steel balls have your experience at any instant? Can the author find a paper that argues convincingly that a set of steel balls could be conscious? The gripe is about a personal opinion that is unsupported by references to how Searle is wrong.

There is always a tendency in this debate to confuse Searle's attack on information processing in digital computers with the wider physicalist versus dualist debate. Digital computers are not the only route to artificial consciousness and Searle is attacking digital computers (Turing type information processors). I believe that artificial consciousness must be possible in some way but not using Turing machines (after all, we are conscious).

The second bit that I cut was:


 * Some philosophers such as Thomas Nagel have claimed a fundamental distinction between the first person experience of consciousness and any third person account of the mechanisms by which consciousness is generated. If philosophers can be overly-pessimistic about what neuroscientists and computer scientists can accomplish from the third person perspective, they might also be overly-enthusiastic about the reliability of first person introspection. Some philosophers have been fundamentally skeptical about our ability to be certain about anything we observe from the first person perspective. Despite any sense we may have about our inability to be wrong about our subjective evaluations of our own consciousness, it may be wise to keep an open mind and remain open to the possibility that phenomenal consciousness is not a distinct category from access consciousness. For example, the two may be at the ends of a continuous spectrum of consciousness for which some forms of consciousness are easier to imagine as being algorithmically generated than others.

I had a problem with this because Nagel is pointing out that there is a thing, personal phenomenal experience. You are saying that you have an explanation for that thing: digital computing (algorithmic generation). However, there is no reference to how Nagel's thing is explained by digital computing/algorithms, or even to how we might set about explaining it in terms of digital computing/algorithms. Wittgenstein is really attacking the cogito. You might have invoked Dennett, who does the same in the context of computers. Dennett gives up trying to explain personal experience in terms of digital computing and also attacks incorrigibility, suggesting that personal experience does not exist as a thing at all. In other words, according to Dennett, the only way that we can say that people are digital computers/algorithms is to deny the very thing that we want to explain: phenomenal experience.

But is Dennett or Wittgenstein relevant to the excised paragraph? The paragraph says that digital computers/algorithms will explain phenomenal consciousness; it does not say that, like Dennett (an extreme eliminativist philosopher), we can just ignore the whole problem by redefining experience in functionalist terms. But doubting incorrigibility does not make the phenomenon go away: there is still something to be explained, and we are still lacking a digital/algorithmic explanation. The gripe is that you are using veiled references to attacks on incorrigibility in support of the idea that digital computers/algorithms could be conscious. The thesis needs references to papers that show how digital computers/algorithms could, in principle, be phenomenally conscious, or needs to develop the idea that phenomenal consciousness does not exist (see below).

Dennett was vague in his "Consciousness Explained" about all this. He invokes eliminativism and then, later in the book, claims that phenomenal consciousness will emerge from complexity. He cannot have it both ways! If consciousness emerges, then how does a digital computer do that? On the other hand, if Dennett really believes that phenomenal consciousness does not exist he should stick to that (but then he would be a zombie).

These were the principal changes that I made. The other changes were to add references to work by physicists who believe that there may be non-digital computing forms of artificial consciousness, and to point out that the idea that human beings are digital computers or simple information processors is highly contentious. Most of the neuroscientists I have known and know do not believe that we are simply information processors - most would either look to some sort of emergentism or unexplained physics (which is almost the same thing) or just say that they don't know how consciousness is done!

Lastly I pointed out that Block is just the last in a long line of people who have drawn to our attention the cognitivist vs behaviourist divide. Even Aristotle mentions it! Is it a biased basis for this book to simply accept the divide and run with it?

See: The problem of machine and digital consciousness where the problem of artificial consciousness is discussed and where it is noted that you can be a physicalist (non-dualist) without accepting that we are Turing Machines.

One last word, welcome aboard JWSchmidt, this is going to be a rocky ride of a book.

Thanks for the welcome
"If Searle is overly pessimistic then the author should quote who has analysed Searle and found him to be wrong and why."

I'm not sure that many neurobiologists would label as "wrong" Searle's position as given here. However, Searle has contributed to the perception of neurobiologists that many philosophers are too centered on their inability to imagine how physical brain processes can account for consciousness. Gerald Edelman expressed this sentiment in "Naturalizing consciousness: A theoretical framework" (Proc Natl Acad Sci U S A. 2003 April 29; 100(9): 5520–5524; see the last sentence and Edelman's reference 25).


 * Edelman is almost at the same place as Block philosophically; he says: "Primary consciousness is the state of being mentally aware of things in the world--of having mental images in the present" and "In contrast, higher-order consciousness involves the recognition by a thinking subject of his or her own acts or affections". Primary consciousness, mental images at the present instant, is phenomenal consciousness, and higher-order consciousness is access consciousness. It is complex, but definitely feasible, to provide an algorithmic explanation of higher-order consciousness, but primary consciousness is far more difficult - how can we be mentally aware of images at the present instant? As neuroscientists such as Penfield and Newman & Baars, and philosophers from time immemorial, have pointed out, primary/phenomenal consciousness is a 'state'. This was why I chose Green's article as the definition of the problem of consciousness; Green defines phenomenal consciousness in physical terms, see: http://en.wikibooks.org/wiki/Consciousness_studies:_The_description_of_consciousness

RobinH 09:05, 6 September 2005 (UTC)

Philosophers seem very concerned with the mystery of consciousness, while neuroscientists like Edelman and Koch just want to get on with their research. Edelman started his article by warning against dualism, and it is the dualistic character of Block's distinction between the natures of A and P consciousness that provokes distrust and prevents scientists from adopting that distinction.

What we are dealing with here is differences in the intuitions of philosophers and neuroscientists (and AI researchers, too) about the nature of consciousness and our ability to use the tools of science to make sense of consciousness and possibly attain the technical capacity to produce consciousness in man-made devices. I find it strange that there could be honest doubt about the existence of differences in these intuitions between people working in different fields such as neuroscience, philosophy of mind, and computer science. In my view, we have a case of the blind men and the elephant. Each specialized discipline has its own perspective on consciousness. Different people take "program" to mean different things. We are bewitched by the language we use. I hope a way can be found to clearly make this point in Consciousness studies. I hope that the wiki environment can help lead to better communication between specialized disciplines and a more coordinated attack on consciousness and eventual understanding. --JWSurf 04:45, 6 September 2005 (UTC)

Husserl and bracketing
Concerning: "the intention to move, the movement and the sensation of movement are bound or 'bracketed' together into a single meaning."

Bracketing has more to do with the Husserlian epoche than with semantics or psychological intention. As I understand bracketing, it is what we do when we postpone our judgement in the epoche. For instance, when considering an experience, in order to reach its pureness we must bracket the ego. Ex.

hello there. i'm sure this is a bad way to do research, but i wondered if 'other minds' is a problem with naturalism, if that's what the qualia arguments boil down to? is there anything i could be reading? i'm taking a module in scientific realism in January. thanks :o