Three primary philosophical arguments that consciousness is separate from the material brain:
- Perfect Scientific Understanding Argument - Suppose we reached an absolutely perfect scientific understanding of a biological system, knowing every single detail of a living thing's material structure, including every minute detail of its physiological brain. One thing would still elude us: what it is like to be that living thing. Therefore consciousness must be something other than material structure, since knowing all there is to know about the material structure still leaves out the conscious experience of it.
- The Zombie Argument - Imagine a human being who outwardly behaves no differently from any other human being, yet internally has no consciousness at all. If you asked them, "Are you conscious?", they would answer, "Of course I'm conscious!", completely 'unaware' that they lack any consciousness whatsoever: a true 'zombie', so to speak. The claim is not that such non-conscious entities actually exist. Rather, the mere possibility that such a zombie could exist shows that consciousness must be something separate from the material brain, since there is no logical reason why 'zombies' (at least in this sense) couldn't exist.
- The Chinese Room Argument - This argument responds to a specific materialist claim: that designing the right artificial intelligence with the right computer program is, by itself, sufficient to create consciousness. It goes as follows. Take any cognitive ability you currently lack; in my case, I don't speak Chinese. Someone sits me in a room and asks me to answer questions in Chinese. I don't know Chinese, but they hand me a rulebook that tells me the steps to follow in order to produce answers in Chinese. This rulebook plays the role of the supposed computer program. Of course I don't know what the Chinese symbols mean, but the rulebook is written so well that by following its steps and shuffling the symbols into the proper order, I can accurately answer each question. To an outside observer, my answers would be indistinguishable from a native speaker's, but in reality I don't understand a word of Chinese. I'm just following the rulebook and shuffling symbols without consciously comprehending what any of them mean, and there is no way for me to actually learn Chinese simply by shuffling symbols around in a room. Here is the main crux of the argument: if I don't understand Chinese on the basis of implementing the rules in the rulebook (i.e., the computer program), then neither does any other computer solely on that basis. Therefore, consciousness must be something more than just a 'program' running on the 'hardware' of the brain.
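The rulebook described above can be sketched as a purely syntactic lookup table. This is only an illustrative toy, not Searle's own formulation: the question-and-answer strings below are hypothetical placeholders, and the point is that the responder maps input symbols to output symbols without any representation of their meaning.

```python
# A minimal sketch of the Chinese Room rulebook as pure symbol manipulation.
# The phrase pairs are hypothetical placeholders chosen for illustration.

RULEBOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你有意识吗？": "我当然有意识！",  # "Are you conscious?" -> "Of course I'm conscious!"
}

def room_occupant(question: str) -> str:
    """Follow the rulebook: match the incoming symbol string, emit its pair.

    No step here involves understanding; it is a purely syntactic lookup.
    """
    # Fallback reply: "Sorry, I don't understand."
    return RULEBOOK.get(question, "对不起，我不明白。")
```

To an outside observer the replies look fluent, yet nothing in this process understands Chinese, which is exactly the distinction the argument trades on: correct symbol output is not comprehension.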
Ultimately, materialism excludes all qualitative experience by attempting to reduce it to the quantitative level, and this simply does not work. Qualitative conscious experience is a distinct thing altogether, one that must be assessed independently of material structures.