It does exist in the real world, in the various neurological patterns that form consciousness. Rather than re-conjecturing dualism over and over again without foundation, simply stating different scenarios and asking whether it could exist in this or that situation, why not specify a scenario that you have studied deeply and for which you found dualism to be the only viable explanation?
Doesn't it exist in states of the brain? If our neuroscience were advanced enough, we might be able to "measure" it by mapping the brain. I think of it like computer software. Everything saved on the computer is "inside" it in a physical way (written on the hard drive), but it only makes sense in the context of the computer. When we study the brain and try to explain consciousness, we're essentially trying to work out the software by looking at the hard drive under a microscope—needless to say, a monumental task.
Does a computer program exist physically, or is there program/hard-drive dualism? See why the question of mind/body dualism might be similar? Nobody has a problem accepting that something as complex as the internet is nothing more than wires and electrons.
Making things more complicated, although I've used a hard drive as a brain metaphor, I strongly suspect that the brain is far more similar to RAM. In my very rudimentary grasp of computing, I understand that RAM depends on the supply of electricity and is erased when the computer is powered down. I think the brain is always "on" from birth to death, and that while common processes do change the brain structure in some ways, 90% to 100% of ourselves are immediately erased when we die because the power is turned off. That's why I don't hold out too much hope that being cryogenically frozen will preserve someone's personality and memories even if their body is revived one day.
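The RAM analogy above can be sketched in code. This is a toy illustration only (the `Brain` class and its fields are invented for the example, not a model of neurology): slow structural change plays the role of the hard drive, while the live electrical state plays the role of RAM and is simply gone when the power goes.

```python
# Toy sketch of the RAM-vs-hard-drive analogy (all names hypothetical):
# "structure" stands for slow physical changes (synaptic wiring, disk-like),
# "activity" for the live electrical state (RAM-like), lost at power-off.

class Brain:
    def __init__(self):
        self.structure = {"wiring": "shaped by experience"}   # persists, disk-like
        self.activity = {"thoughts": "current firing patterns"}  # volatile, RAM-like

    def power_off(self):
        # Death or a full shutdown: the volatile state is erased outright.
        self.activity.clear()

b = Brain()
b.power_off()
print(b.structure)  # the "hardware" record survives
print(b.activity)   # the running state does not: {}
```

On this picture, freezing the "hardware" after power-off preserves only `structure`; whatever lived in `activity` is unrecoverable, which is the worry about cryogenics above.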
Yeah, absolutely it could exist in the states of the brain; I'm not sure. And the analogies between brains and computers are apt, with the CPU being the cognitive processes and the hard drive being memories. Perhaps dreams are a construct of our own internal graphics card. I'm just playing with dualism as an idea; I haven't really thought it through. That's why I came here.
Monism hasn't answered everything, though: for instance, our sense of morality, the idea of free will, or even whether there is a self at all.
Why do we feel more disposed towards empathy with certain things but not others? I don't feel bad for a brick when it breaks, or a bacterium when it dies, but we do feel a sense of loss when a higher-order animal dies, or a human being. Seeing another consciousness in pain or distress evokes those feelings in us. Why is this? What do we share, aside from a consciousness? If they are only arrangements of atoms and chemical processes like me, why do I feel for them?
If the brain is the source of consciousness, where is free will expressed? Some random dice roll in the brain does not constitute 'will'. Is the notion of free will therefore an illusion?
The same can be applied to the self. I know I am me and not you, here and not there. If two brains had identical patterns and processes at the same time, each would still have its own notion of self. Is this an illusion?
If they are illusions, does this imply that all conscious beings are suffering from some kind of dementia?
I'm pretty convinced that morality has very plausible naturalistic explanations based in evolutionary biology, sociology, psychology, and philosophy. I don't see that as any challenge to "monism" whatsoever.
Free will is the thorniest problem in philosophy, if only because it seems to have such importance to how we see ourselves. Unfortunately, all of the evidence seems to point in one direction: free will is an illusion. That's a tough pill to swallow, but I'm not qualified to argue against the evidence. Sam Harris has a lot to say on this topic.
Perhaps, but then why do we like seeing tigers in a zoo, or take an interest in preserving them? From an evolutionary standpoint that seems contradictory, surely? In an evolutionary sense we ought to be trying to make them extinct! They're fucking dangerous!
We intuit that tigers have some value, but aren't wholly able to explain why we think that (from an evolutionary defense-mechanism standpoint, they are fucking dangerous).
Okay, I'll bite. Why does preventing the extinction of tigers run contrary to evolution? Be specific.
We intuit that tigers have some value, but aren't wholly able to explain why we think that (from an evolutionary defense-mechanism standpoint).
What in the world is "an evolutionary defense-mechanism"?
We can - it's called an MRI. We do not currently have high enough resolution to get a fully detailed, quantifiable measurement, but we get a little closer with each upgrade to the technology.
Yeah, I have heard of an MRI. If we had high enough resolution, then, we could measure someone thinking about a number, and would be able to say what? That the number 8 = this many units of electricity, in this direction, between these synapses, in this pattern. Presumably, then, different numbers would have different values in these fields.
Does this mean then, that the number 8 is reducible to these values? That does not seem to explain sufficiently what 8 or 2 or pi represents. Would we all have identical values in these fields?
Individual brains store information with subtle differences; this is what accounts for our different personalities even when our experiences have been remarkably similar. Current neurology suggests that we can really only conceive of rather small numbers. Numbers like 2 and 3, or "two-ness" and "three-ness", seem to be rather hard-wired from birth. In reality no one can really think of a million, but we have other strategies for working with such numbers, such as thinking of "six zeroes".
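The point that "8" isn't identical to any one physical pattern has a crude computing analogy (an analogy only, not a claim about neurons): the same abstract value can sit in memory under several different bit patterns, and what makes each pattern "an 8" is how the system decodes it, not the pattern itself.

```python
# Three different physical encodings of the same abstract number 8.
import struct

as_int8 = (8).to_bytes(1, "big")    # one byte: b'\x08'
as_int32 = (8).to_bytes(4, "big")   # four bytes: b'\x00\x00\x00\x08'
as_float = struct.pack(">d", 8.0)   # IEEE-754 double precision

# Three distinct bit patterns...
print(as_int8.hex(), as_int32.hex(), as_float.hex())
# ...that all decode back to the same abstract value:
print(int.from_bytes(as_int8, "big"),
      int.from_bytes(as_int32, "big"),
      struct.unpack(">d", as_float)[0])
```

So even if an MRI read off the exact "values in these fields" for your thought of 8 and mine, we'd expect the raw patterns to differ while still representing the same number.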
Abstract ideas evolve, bit by bit, in the neurological patterns that constitute individual consciousnesses. Sometimes an abstract idea can span more than one consciousness.
In what sense do you mean 'span'?
It is possible that the pattern that forms a particular idea is formed in more than one brain. We have some concepts that are so large that no single person can be said to fully grasp them in their entirety. Individuals work with pieces of such ideas.