The nature of consciousness, and how to enjoy it while you can | Page 3 | Ars OpenForum

The nature of consciousness, and how to enjoy it while you can

iamai

Ars Scholae Palatinae
788
Subscriptor++
I'm a huge consciousness skeptic.

To me it looks like we're trying to name something that doesn't even exist in the first place, and the quest to find it is akin to the search for God. It doesn't help that no one can clearly define the thing, or explain why its existence or uniqueness is important or makes any difference at all. Just read the intro of the corresponding article on Wikipedia https://en.wikipedia.org/wiki/Consciousness and you'll instantly realize we are dealing with a huge mess. I'd like to hear one reason why consciousness needs to exist; I'm ready to claim I don't have one, and I'm feeling fine.
I've come to think of consciousness as an elaborate type of reflex.
 
Upvote
1 (2 / -1)
Positing hypotheticals that are not actually feasible in the real world doesn’t really explain or define what consciousness is. We don’t try to define the electromagnetic field by asking the reader to imagine what it would feel like to travel between two electrons.
Is defining consciousness really useful? I like to think that I think. Some of this thinking I am aware of and some is unconscious. Many times I have woken up with the solution to a problem which had baffled me the day before. Is conscious thinking better than unconscious thinking? Not a good question. Is the border between the two sharp or fuzzy? Are there other differences besides self awareness? Hypnotists, who purport to communicate with the unconscious mind, stress that their message must be very literal. Metaphors, similes, and other figures of speech may be misinterpreted.
What does this imply?
 
Upvote
-2 (2 / -4)

pfstevenson32

Wise, Aged Ars Veteran
118
If consciousness didn't exist, you wouldn't be able to feel anything, because there would be no experience, and no 'you'.
There seems to be more than one level of consciousness. Tibetan Buddhism also uses the multiple-levels-of-consciousness analogy for ease of use, I think. I currently think both scientific materialism (the view that only the brain/body exists) and the Buddhist mind "sciences" have validity and, more importantly to me, usefulness.
 
Upvote
4 (5 / -1)

pfstevenson32

Wise, Aged Ars Veteran
118
Totally agree! I think the ability people have to actually change how their minds work is something most people unfortunately are too unaware of. A lot of suffering could be avoided if more people were given the tools to change stuff that hinders them in life. I too have had great use of cognitive therapy (MCT to be specific), and it was and is an incredibly powerful tool for me. I'm mid-life myself, and in a way I also wish I had these tools earlier in life; on the other hand, I'm not sure I was entirely ready before now.
I was in a meeting that was also attended by a young lady. I assumed she was a college student. Someone later introduced her as a veterinary doctor. In that moment, I felt my mind change from a neutral opinion of her to one of great respect. That was the first time I had watched or felt it change. (Although it had to have changed over my lifetime.) Meditation is basically, to me, doing this the whole time.
 
Upvote
1 (1 / 0)

Quisquis

Ars Tribunus Angusticlavius
6,802
Several others have commented similar things to what I'm going to say, particularly OldNoobGuy, but my consciousness compels me...

I have undergone sedation several times for a procedure. This is, from what they told me, not quite the same as full anesthesia; rather, I am at least partly awake during the sedation, and able to converse with the doctor and nurses (not sure at what level). And this procedure is uncomfortable, although not quite fully painful. (I know this because I had it once or twice before they instituted the sedation method.)

I remember nothing of the experience. And somehow any discomfort I experienced doesn't matter to me--it's as if it happened to someone else, or it didn't happen at all.

Was I conscious during this sedation, even though I don't remember anything about it afterwards? I don't know.

Is this the experience animals--my cat or dog, for example--have every day? Does my dog remember the walk I took her on (as opposed to just liking "walks")? Does the cat remember that I petted him (as opposed to just liking "being petted")? I have no idea.
I mean, those questions are easily answerable in the affirmative...

You're just asking if cats and dogs have memories
 
Upvote
7 (7 / 0)

mscf

Seniorius Lurkius
13
The first time I read Koch (The Feeling of Life Itself), I was quite taken with his theory. I had been reading about emergent behavior at the time and thought that IIT was a good fit for these ideas. Over time I have come to like it somewhat less. I was privileged to see him debate Stanislas Dehaene, of Global Workspace Theory (GWT) fame, on this topic and wasn't really convinced by either.

Of late I have been most interested in the theory of Predictive Coding (PC) and what it implies about the nature of consciousness. Predictive Coding in the Neural Cortex (Rao and Ballard, 1999) is the seminal paper here, but Anil Seth has a Ted talk suitable for wider audiences that I think is interesting.

I think consciousness is an evolved adaptation that allows animals to minimize delays between stimulus and response by, essentially, hallucinating the world in real time. In this way, you don't consciously see with your eyes; their incoming signals serve as a kind of guide rail for the hallucination. By adapting the timing of the hallucination to the learned characteristics of sensory and motor delays, the behavior of the organism can adapt to rapidly changing circumstances - a significant competitive advantage over organisms incapable of this trick.
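The core loop of that idea can be sketched in a few lines. This is a toy illustration only, not Rao and Ballard's actual hierarchical model: an internal estimate is nudged by the prediction error against each incoming sensory sample, so the "hallucination" tracks the world while smoothing over noise.

```python
# Toy predictive-coding sketch (an assumption-laden simplification, not
# the Rao & Ballard 1999 model): the internal estimate is corrected by
# a fraction of the prediction error at each step.
def predictive_step(estimate, observation, gain=0.3):
    error = observation - estimate     # prediction error
    return estimate + gain * error     # update the internal model

estimate = 0.0
for obs in [1.0, 1.0, 1.0, 1.0, 1.0]:  # a constant stimulus
    estimate = predictive_step(estimate, obs)
print(round(estimate, 3))  # converges toward the stimulus value of 1.0
```

With a gain of 0.3 and five identical samples, the estimate closes the gap to the stimulus geometrically, which is the "guide rail" behavior in miniature.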
 
Upvote
19 (19 / 0)
Koch, a physicist, neuroscientist, and former president of the Allen Institute for Brain Science, has spent his career hunting for the seat of consciousness, scouring the brain for physical footprints of subjective experience.

I read that and thought

I have seen all the works which have been done under the sun, and behold, all is futility and striving after the wind
 
Upvote
-2 (2 / -4)
Is defining consciousness really useful? I like to think that I think. Some of this thinking I am aware of and some is unconscious. Many times I have woken up with the solution to a problem which had baffled me the day before. Is conscious thinking better than unconscious thinking? Not a good question. Is the border between the two sharp or fuzzy? Are there other differences besides self awareness? Hypnotists, who purport to communicate with the unconscious mind, stress that their message must be very literal. Metaphors, similes, and other figures of speech may be misinterpreted.
What does this imply?
“I like to think that I think.”

You started your statement with "I," and the problem is the same with "I think, therefore I am."

It's a fallacy. There is no independent evidence for it. It's a concept/thought and nothing more.

Both "I" and "consciousness" are undefinable except by other words, which themselves are just tags we use for convenience.

If you ask, "How does the mind decide to have a thought? Is there something else that decides?" and you start following that logic train, you always end up OUTSIDE of your body somewhere.

Everything is required for "consciousness," whatever that is. Saying that the brain is responsible for it ignores the rest of the universe.

Sorry…I’m waxing philosophical today ;)
 
Upvote
-2 (4 / -6)

oluseyi

Ars Scholae Palatinae
1,168
…with AI systems behaving in strikingly conscious-looking ways…
Are "conscious-looking" ways actually conscious? How much of this "conscious-looking"-ness is just the human tendency to anthropomorphize everything? We see faces in clouds and toast, and ascribe complex motivations to puppies.

This feels like a forced conclusion to tenuously relate some interesting speculation and hypothesis to the "AI" craze du jour.
 
Upvote
8 (8 / 0)
I'm gonna get a little bit out there for a second, but:

My personal theory is that consciousness is, like everything else in this universe, a field that permeates spacetime and which is warped and influenced by matter.

This is pretty comprehensively ruled out by quantum field theory. If such a field existed, and interacted with matter at the scales and energy levels of brains, we would necessarily have observed the particle that acts as its force carrier in existing accelerators.
 
Upvote
11 (11 / 0)
There's never been any Moore's "law"; it was simply an observation, and it wasn't about "complexity". It was only about the number of transistors doubling roughly every two years, which tells us nothing about the power efficiency or consumption of a system built with them. 70 doublings are not physically or economically feasible with the current state of technology, or ever.

To give you an idea 2^70 = 1.18059 x 10^21
You're right, 10^21 is only 70 cycles of Moore's law. Maybe it's not achievable, but I remember the doom and gloom, was it the 90s?, when we hit the "limit" of the wavelength of light used to make the chips, and, as everybody knows, that was a mere speed bump.

Predicting "impossible" is brave.
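The doubling arithmetic above is easy to check. A quick sketch, assuming the classic two-year doubling cadence:

```python
# Sanity check of the figure quoted above: 70 doublings multiply the
# transistor count by 2**70.
factor = 2 ** 70
print(f"{factor:.5e}")  # 1.18059e+21, matching the quoted value

# At one doubling every two years (the classic Moore's-law cadence),
# 70 doublings would take 140 years.
years_per_doubling = 2
print(70 * years_per_doubling)  # 140
```

Whether 140 years of uninterrupted doubling is plausible is exactly what the two posts disagree about; the arithmetic itself is not in dispute.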
 
Upvote
-8 (1 / -9)

Veritas super omens

Ars Legatus Legionis
22,634
Subscriptor++
But does your cat have a daily routine, goals, objectives? My cat likes me because I feed him, give him attention, empty his cat litter... and in return he treats my hand like a pin cushion.
My cat decidedly has goals. He nearly always achieves them. He scans his territory from his throne (a cat tower with a view down across the street in front of the house). He hunts the territory for interloping rats, dispatching the occasional perp but mostly doing "catch and release" to teach them a lesson. When I ask if he wants to "go for a walk," he (usually) jumps from his throne and goes out the back with me, "supervising" as I scan my bonsai garden for issues needing resolution, then up to the vegetable garden to water the greenhouse; often a catnip treat goes his way. Then he usually wanders off and checks his route, coming back between 9 and 11 AM for afternoon nappage time. Napping is his main goal.
 
Upvote
6 (7 / -1)

pe1

Wise, Aged Ars Veteran
132
Subscriptor
The current estimate is that the human brain contains a minimum of 80 billion neurons, roughly the same number of astrocytes, and a quadrillion tetrapartite synapses. Further, there are a minimum of 10 quadrillion G-protein coupled receptors - the "control panel" of a cell - dynamically modulating and modifying neural and glial cell activity.
Individual proteins are not the relevant unit at which brains do computation. We understand how neurons work. We understand how their outputs are computed from their inputs. We understand how they communicate with each other. There's a lot we don't know about the brain, but it's mainly at larger scales, how complex behaviors emerge in huge networks of neurons.

Individual neurons can be accurately described by small neural networks (of the computer variety) involving less than a hundred parameters. That's the relevant scale for doing computation. Everything else is implementation details and unrelated functions (metabolism, protein production, DNA repair, and so on). As far as its job of doing computation is concerned, a neuron is a small and easily modeled object.
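To make the "small, easily modeled object" claim concrete, here is a toy rate-model neuron. This is an illustrative sketch, not a biophysical model from the post: the neuron's output is just a simple parameterized function of its inputs, with a handful of numbers doing all the work.

```python
import math

def rate_neuron(inputs, weights, bias):
    """Toy rate-model neuron: output is a sigmoid of the weighted input
    sum plus a bias. The parameters are just the weights and the bias,
    in the spirit of the claim above that, for computation, a neuron is
    a small object described by a modest number of parameters."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Three synaptic inputs, three weights, one bias: four parameters total.
out = rate_neuron([1.0, 0.5, -0.2], [0.8, -0.3, 1.1], bias=-0.1)
print(round(out, 3))
```

Real neurons need richer models (the post mentions small neural networks, which add nonlinearity and temporal dynamics), but the point stands: the relevant description lives at the level of a few dozen parameters, not at the level of individual proteins.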
 
Upvote
10 (10 / 0)

Somebody Important

Smack-Fu Master, in training
94
I have a few questions, which will highlight my near-complete ignorance but my great interest in the topic.

1. What about animals with "simpler" brains? What are the animals that lack the posterior hot zone (PHZ), behind the neocortex? It was my understanding (maybe wrong) that the neocortex is only found in apes, elephants and... I forget what others. So if many animals lack a neocortex, do they also lack the PHZ? So many animals aren't conscious? That can't be right. Spiders seem - to me at least - absolutely conscious. Snails, fish, all birds etc just as well. The PHZ seems implausible to me as the seat of consciousness, because that specific part of the brain is likely missing in too many animals that absolutely seem conscious to me (and hopefully I'm conscious). But maybe the PHZ is present in many animals, instead? I'd love to know!

2. Why would the PHZ be less active in monks who have long practiced meditation, if it's the seat of consciousness? If anything, it tells us that the PHZ is tightly involved in the Default Mode Network (DMN), which, as a whole, is toned down during meditation, states of flow, and psilocybin intake.

3. Does the book discuss the Default Mode Network and its relationship to consciousness? Based on my very limited experience with meditation, and according to many more knowledgeable than me, your conscious experience is usually drowned in the self-referential noise of the DMN, but you can meditate to train a less insistent DMN. I definitely felt something like that after meditating a few weeks in a row, although that was just the way I was making sense of what I was feeling, not knowing any better.

4. Why couldn't we simulate consciousness with a computer? Of course, the usual von Neumann architecture of our computers is highly unlikely to become conscious, but that's not the point. Instead, we will simulate consciousness in software, where we have full freedom to model anything. We won't be constrained by the physical substrate of the machine; we will create a virtual substrate instead, under very different paradigms.
Under what paradigms can I buy this substrate for smoking?
 
Upvote
-1 (0 / -1)

Sonio

Ars Scholae Palatinae
814
You're already just an exact copy of you from 10 minutes ago, albeit teleported in time rather than space. The other guy is completely gone, all you have are some of his memories.
You're not an exact copy of yourself from 10 minutes ago -- chemical/biological processes have been occurring within you over that span of time. Your neurons have formed/deleted/modified connections over that span of time. You're maybe not very different than you were 10 minutes ago, but you're certainly not unchanged.

I would also go so far as to say you're much more than a guy who just has "some of his ["the other guy's"] memories" -- your present experience/awareness, sight, sound, etc, is the result of an unbroken biochemical process that began when your brain formed. Your current inner experience is the direct result of your preceding inner experience. A Continuity of experience, if you will.

Actually, as I'm thinking about it, I don't know that I would agree with "transported in time", either. Where you are in time is a direct result of where you were in time, much like where you are in space is a direct result of where you were in space. Whether time or space, it's all part of an unbroken process. Again, a continuity.

Maybe this is without merit, and I certainly can't quantify anything, but to me "unbroken process/continuity" seems... important? Significant? I'm not a neurologist or psychologist, so I'm speaking well out of my ass.

Anyway, let's go back to GrimPloughman's hypothetical for a moment:

If somebody created a teleportation device that works by scanning your body atom by atom and making an exact copy of you somewhere else, would the copy be you, or just a copy of you? If a copy, how do you make the copy be you?

Start with a basic teleportation. You start in your house, Point A, energize the transporter, and end up in the house across the street, Point B.

Question -- Do you have any conscious experience of the teleportation itself?

My assumption is that you don't. During the teleportation process, you don't exist in the same physical sense that you do under normal circumstances. If your brain doesn't physically exist, can the biochemical/electrical processes it uses to propagate experience happen? I expect not. Unless the physics which govern the in-teleportation state of the brain allows for an analogous process to continue. I guess another way of asking this is, is your consciousness (and all other biological/chemical/etc processes) "suspended" or "frozen" for the duration of the teleportation cycle or not? My guess is that yes, they are. I could be wrong, for sure, but that's the assumption I will choose for now. It strikes me as the simpler assumption to make.

Question -- Do you die, and a copy with all your memories is generated in the house across the street? Or do you remain alive from beginning to end? In other words, does the "unbroken process/continuity" of cognition remain unbroken? Or does teleportation break that continuity? I think there's a distinction to be made here -- you said:

You're already just an exact copy of you from 10 minutes ago, albeit teleported in time rather than space. The other guy is completely gone, all you have are some of his memories.

Question -- Consider two concepts:

1) Having memories of the past, and

2) Carrying your present conscious awareness and experience with you into the future.

Are these two concepts equivalent? Are they two slightly different ways of describing the same exact phenomenon, or is there a distinction to be made?

I personally make the assumption that there's a distinction here. The classic question about teleportation is whether or not pre-teleport you survives the process, or instead you die, and an exact replica with all your memories is assembled at the destination. Either way, the post-teleport replica should have the experience and memories of surviving the experience, which satisfies condition (1). But does it follow, logically and/or scientifically, that satisfying condition (1) by definition satisfies condition (2)? My opinion is that it does not follow, but again, I could absolutely be wrong.

But let me ask one more question to try to pin down the issue I'm having.

The 'basic' teleportation described above is a person moving from Point A (your house) to Point B (house across the street). Let's now take it a step further. During the teleportation cycle, your information is duplicated. A copy of you with all your memories intact is reconstituted in the house across the street. At the same time, another copy of you is reconstituted in Paris. Point C.

IF simply having the memories of the "other guy [who is] completely gone" means you are that person, then there are now two people who can equally claim to be 'you', although they are clearly not the same person -- they're each seeing, hearing, and otherwise experiencing different stimuli post-teleport, which sets their cognitive/conscious processes in distinctly different directions. Inasmuch as we are partially a product of our experiences, these two people who are both 'you', are different people. This seems to indicate that having memories of the past and carrying your present consciousness with you into the future are not exactly the same thing, right? Because...

IF teleportation doesn't kill pre-teleportation you, IF you carry your present consciousness with you into the future, surviving the teleportation process -- if having memories of the past implies that you've carried your present consciousness into the future -- then doesn't this mean that 'you' are now simultaneously experiencing two different sets of stimuli at the same time? You've carried your consciousness into two bodies in two different locations. Are you seeing the inside of the house across the street, and the Eiffel Tower at the exact same time? To get admittedly quantum 'woo' about it, do you exist as two 'entangled' minds, now, by some very tortured, as-yet-pseudoscientific definition of the word?

Why would you experience two sets of stimuli, you ask? The post-teleport bodies and brains are physically separate, individual, and unconnected, you say. Well, because that means that somewhere, there's a fundamental lack of Continuity. A fundamental disconnect between pre-teleport you and post-teleport 'you' which renders the post-teleport 'you' outwardly (and maybe even inwardly) indistinguishable from pre-teleport you, and yet somehow... not quite actually you.

----------------------------------------------------------

In trying to define a person's consciousness/experience/etc as an "unbroken process" or a "continuity", I'm somewhat trying (and maybe failing) to invoke the idea of "showing your work" on a math problem, or a logical chain of reasoning, or something along those lines. You could get the right answer on a math problem, and yet do something wrong in the actual process of getting there. In that case, the answer is the correct value when considering the initial value and where you're expected to end up, but the answer is still wrong in some fundamental sense, if it's not supported by the work you did to get there. There's a fundamental lack of continuity, if the end result is attributable to luck or intuition, rather than demonstrable, replicable mathematics.

I don't know if the analogy works the way I want it to, but that's just sorta where my mind goes when the topic of consciousness, especially with regards to teleportation, comes up.

But enough of my sophistry. I'm going to bed.
 
Upvote
5 (5 / 0)

star-strewn

Ars Scholae Palatinae
643
Subscriptor++
The first time I read Koch (The Feeling of Life Itself), I was quite taken with his theory. I had been reading about emergent behavior at the time and thought that IIT was a good fit for these ideas. Over time I have come to like it somewhat less. I was privileged to see him debate Stanislas Dehaene, of Global Workspace Theory (GWT) fame, on this topic and wasn't really convinced by either.

Of late I have been most interested in the theory of Predictive Coding (PC) and what it implies about the nature of consciousness. Predictive Coding in the Neural Cortex (Rao and Ballard, 1999) is the seminal paper here, but Anil Seth has a Ted talk suitable for wider audiences that I think is interesting.

I think consciousness is an evolved adaptation that allows animals to minimize delays between stimulus and response by, essentially, hallucinating the world in real time. In this way, you don't consciously see with your eyes; their incoming signals serve as a kind of guide rail for the hallucination. By adapting the timing of the hallucination to the learned characteristics of sensory and motor delays, the behavior of the organism can adapt to rapidly changing circumstances - a significant competitive advantage over organisms incapable of this trick.
Are IIT and Predictive Coding truly incompatible theories? To me, your explanation of consciousness refers to a single feature of consciousness, the ability to anticipate events, even if it does newly posit that such anticipation is built directly into our perceptions. Which, honestly, is pretty neat and makes a lot of sense because I've long heard that people are prone to see what they expect.

But why would sensory anticipation disqualify consciousness as an emergent property of a complex analytical system, one that observes and attempts to improve its own analyses through multiple layers of introspection and feedback? Such a system definitely needs a way to reference itself, an "I." And tying anticipatory feedback into perception could be one of many feedback loops.
 
Upvote
2 (2 / 0)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
It was more a thought experiment rather than trying to define it.

How do you know that things like those aren't feasible in the real world? There is, for example, the idea of transferring people's consciousness into a computer simulation. Did somebody prove that it's impossible? Or did somebody prove that it's impossible to transfer consciousness between two brains?



There unfortunately isn't any way to talk about consciousness right now other than talking about people's subjective impression of being conscious. There is the impression of pain, pleasure, or the color red; this is what everyone subjectively knows and what the discussion is about. Nobody has yet come up with a strict definition or formalism. At the current point there is only philosophy, e.g. the question of whether animals are also conscious, or are just empty vessels that behave as if they were conscious while there is nobody inside to feel anything. In the past, people thought that animals weren't conscious. In the past, doctors also claimed that newborn children weren't conscious, and performed surgeries on children without anesthesia.

Now scientists think that e.g. insects are conscious beings too:



Some quacks also come up with weird ideas, like e.g. the conscious agent theory of Donald Hoffman.


View: https://youtu.be/reYdQYZ9Rj4?feature=shared


Nobody has any idea what ”transferring consciousness” even means. How can it possibly make any sense to define consciousness as “that which is transferred when consciousness is transferred”?

The rest of your post is just you coming around to Artem’s point that we don’t even know what we’re talking about when we talk about consciousness.
 
Upvote
3 (3 / 0)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
That's like something you would encounter in a 200-level philosophy of mind course, in a unit on "the problem of other minds," where the exercise is aimed at introducing students to elementary concepts like theory of mind, the intentional stance, etc. The original article is about the neuroscience of consciousness, which has a pre-existing evidence base where that is not in question.
What is the neuroscience definition of consciousness?
 
Upvote
0 (1 / -1)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
Is defining consciousness really useful? I like to think that I think. Some of this thinking I am aware of and some is unconscious. Many times I have woken up with the solution to a problem which had baffled me the day before. Is conscious thinking better than unconscious thinking? Not a good question. Is the border between the two sharp or fuzzy? Are there other differences besides self awareness? Hypnotists, who purport to communicate with the unconscious mind, stress that their message must be very literal. Metaphors, similes, and other figures of speech may be misinterpreted.
What does this imply?
I mean, yes, defining what we’re talking about would be a useful first step to understanding it.
 
Upvote
7 (7 / 0)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
You're not an exact copy of yourself from 10 minutes ago -- chemical/biological processes have been occurring within you over that span of time. Your neurons have formed/deleted/modified connections over that span of time. You're maybe not very different than you were 10 minutes ago, but you're certainly not unchanged.

I would also go so far as to say you're much more than a guy who just has "some of his ["the other guy's"] memories" -- your present experience/awareness, sight, sound, etc, is the result of an unbroken biochemical process that began when your brain formed. Your current inner experience is the direct result of your preceding inner experience. A Continuity of experience, if you will.

Actually, as I'm thinking about it, I don't know that I would agree with "transported in time", either. Where you are in time is a direct result of where you were in time, much like where you are in space is a direct result of where you were in space. Whether time or space, it's all part of an unbroken process. Again, a continuity.

Maybe this is without merit, and I certainly can't quantify anything, but to me "unbroken process/continuity" seems... important? Significant? I'm not a neurologist or psychologist, so I'm speaking well out of my ass.

Anyway, let's go back to GrimPloughman's hypothetical for a moment:



Start with a basic teleportation. You start in your house, Point A, energize the transporter, and end up in the house across the street, Point B.

Question -- Do you have any conscious experience of the teleportation itself?

My assumption is that you don't. During the teleportation process, you don't exist in the same physical sense that you do under normal circumstances. If your brain doesn't physically exist, can the biochemical/electrical processes it uses to propagate experience happen? I expect not. Unless the physics which govern the in-teleportation state of the brain allows for an analogous process to continue. I guess another way of asking this is, is your consciousness (and all other biological/chemical/etc processes) "suspended" or "frozen" for the duration of the teleportation cycle or not? My guess is that yes, they are. I could be wrong, for sure, but that's the assumption I will choose for now. It strikes me as the simpler assumption to make.

Question -- Do you die, and a copy with all your memories is generated in the house across the street? Or do you remain alive from beginning to end? In other words, does the "unbroken process/continuity" of cognition remain unbroken? Or does teleportation break that continuity? I think there's a distinction to be made here -- you said:



Question -- Consider two concepts:

1) Having memories of the past, and

2) Carrying your present conscious awareness and experience with you into the future.

Are these two concepts equivalent? Are they two slightly different ways of describing the same exact phenomenon, or is there a distinction to be made?

I personally make the assumption that there's a distinction here. The classic question about teleportation is whether or not pre-teleport you survives the process, or instead you die, and an exact replica with all your memories is assembled at the destination. Either way, the post-teleport replica should have the experience and memories of surviving the experience, which satisfies condition (1). But does it follow, logically and/or scientifically, that satisfying condition (1) by definition satisfies condition (2)? My opinion is that it does not follow, but again, I could absolutely be wrong.

But let me ask one more question to try to pin down the issue I'm having.

The 'basic' teleportation described above is a person moving from Point A (your house) to Point B (house across the street). Let's now take it a step further. During the teleportation cycle, your information is duplicated. A copy of you with all your memories intact is reconstituted in the house across the street. At the same time, another copy of you is reconstituted in Paris. Point C.

IF simply having the memories of the "other guy [who is] completely gone" means you are that person, then there are now two people who can equally claim to be 'you', although they are clearly not the same person -- they're each seeing, hearing, and otherwise experiencing different stimuli post-teleport, which sets their cognitive/conscious processes in distinctly different directions. Inasmuch as we are partially a product of our experiences, these two people who are both 'you', are different people. This seems to indicate that having memories of the past and carrying your present consciousness with you into the future are not exactly the same thing, right? Because...

IF teleportation doesn't kill pre-teleportation you, IF you carry your present consciousness with you into the future, surviving the teleportation process -- if having memories of the past implies that you've carried your present consciousness into the future -- then doesn't this mean that 'you' are now simultaneously experiencing two different sets of stimuli at the same time? You've carried your consciousness into two bodies in two different locations. Are you seeing the inside of the house across the street, and the Eiffel Tower at the exact same time? To get admittedly quantum 'woo' about it, do you exist as two 'entangled' minds, now, by some very tortured, as-yet-pseudoscientific definition of the word?

Why would you experience two sets of stimuli, you ask? The post-teleport bodies and brains are physically separate, individual, and unconnected, you say. Well, because that means that somewhere, there's a fundamental lack of Continuity. A fundamental disconnect between pre-teleport you and post-teleport 'you' which renders the post-teleport 'you' outwardly (and maybe even inwardly) indistinguishable from pre-teleport you, and yet somehow... not quite actually you.

----------------------------------------------------------

In trying to define a person's consciousness/experience/etc as an "unbroken process" or a "continuity", I'm somewhat trying (and maybe failing) to invoke the idea of "showing your work" on a math problem, or a logical chain of reasoning, or something along those lines. You could get the right answer on a math problem, and yet do something wrong in the actual process of getting there. In that case, the answer is the correct value when considering the initial value and where you're expected to end up, but the answer is still wrong in some fundamental sense, if it's not supported by the work you did to get there. There's a fundamental lack of continuity, if the end result is attributable to luck or intuition, rather than demonstrable, replicable mathematics.

I don't know if the analogy works the way I want it to, but that's just sorta where my mind goes when the topic of consciousness, especially with regards to teleportation, comes up.

But enough of my sophistry. I'm going to bed.

If we at least assume that “consciousness” is something that arises out of physical processes like everything else, then an exact teleportation must teleport consciousness as well. It doesn’t matter if there’s a “gap” in space or time. There is no general agreement on the idea that consciousness is purely physical, though the arguments against it seem specious to me.

Our present understanding of physics also rules out the possibility of creating multiple exact copies, so duplicating your consciousness can’t be done in general.

This doesn’t exclude the possibility that consciousness is actually just classical, or at least doesn’t depend so much on detailed quantum states that it can’t be duplicated; however, discussing such possibilities would require a much more detailed understanding of what consciousness is to begin with. But for instance, if it turns out that a computer program can be conscious, there are clearly no issues, either physical or philosophical, with duplicating the state of a computer program.
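For ordinary software, the last point — and the earlier claim that a "gap" in space or time doesn't matter — is easy to demonstrate. A minimal sketch (the particular state dictionary here is made up for illustration): a program's state is just data, so it can be serialized, the original destroyed, and execution resumed later from the snapshot as if nothing happened:

```python
import pickle

# A running "program state": a counter plus some pending work.
state = {"counter": 41, "pending": ["greet"]}

snapshot = pickle.dumps(state)   # "teleport": serialize the full state
del state                        # the original ceases to exist; a gap in time

restored = pickle.loads(snapshot)  # reconstituted later, possibly elsewhere
restored["counter"] += 1           # execution resumes exactly where it left off
```

Nothing about the pause or the relocation is visible from inside the restored program; whether anything analogous holds for a brain is, of course, the whole question.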
 
Upvote
2 (2 / 0)

GrimPloughman

Smack-Fu Master, in training
55
How can it possibly make any sense to define consciousness as “that which is transferred when consciousness is transferred”?
I would put that differently.

Consciousness is the thing that needs to be transferred from your brain to another one if you want to stop perceiving the world as your brain and start perceiving it as the second brain. Or maybe not "as your/the second brain" but "from the perspective of your/the second brain"?

Consciousness isn't the sum of all the memories, thoughts, and impressions generated by your brain. It's the you that currently perceives all the things generated by your brain. You can ask the question "what would it be like to be a house fly?" and try to imagine that you are a house fly right now. What is the alternative reality that you imagine in that case? It's a reality where the you isn't "connected" to your brain but to a house fly's brain instead. The you is consciousness.
 
Upvote
-5 (1 / -6)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
I would put that differently.

Consciousness is the thing that needs to be transferred from your brain to another one if you want to stop perceiving the world as your brain and start perceiving it as the second brain. Or maybe not "as your/the second brain" but "from the perspective of your/the second brain"?

Consciousness isn't the sum of all the memories, thoughts, and impressions generated by your brain. It's the you that currently perceives all the things generated by your brain. You can ask the question "what would it be like to be a house fly?" and try to imagine that you are a house fly right now. What is the alternative reality that you imagine in that case? It's a reality where the you isn't "connected" to your brain but to a house fly's brain instead. The you is consciousness.

You said exactly the same thing, just in more words. “The thing that needs to be transferred to send you into a different brain”… where “you” means “consciousness”.

Consciousness being some mystical thing separate from what your brain is doing isn’t a generally accepted idea.
 
Upvote
2 (2 / 0)
If we at least assume that “consciousness” is something that arises out of physical processes like everything else, then an exact teleportation must teleport consciousness as well. It doesn’t matter if there’s a “gap” in space or time.
Though does this teleport the consciousness, or does it "only" teleport organic hardware that will then immediately produce a new consciousness that is indistinguishable from the pre-teleport consciousness (to others and itself)?

I am not convinced that the gap does not matter. What if instead of Star Trek-style teleportation we use a much cruder method? The person is frozen, and a huge staff of really good surgeons tears down body and brain, atom by atom, while writing down an exact construction blueprint on a lot of paper. The blueprint is sent by snail mail across the globe. Four weeks later it arrives at the receiver lab, where a different huge staff of really good surgeons gets to work: they order suitable atoms from storage, and atom by atom they rebuild the person. The person awakens and says, "Neat, it's me alright, but a plane would have been faster."

I have a hard time seeing this as the same person and consciousness, even though I think the person would feel like being the same as before the process. If the sender lab immediately rebuilds the person from the original atoms, it is more obvious that the person at the receiver lab is a fresh but separate copy - discarding the original body atoms just makes it easier to maintain the illusion. One might even send along the atoms (fastidiously labeled individually) and use those as the building material at the receiver lab, but again I have trouble seeing how that really preserves the original consciousness, as this would imply that somehow each individual original atom carries along a bit of that original consciousness.

Personally I suspect that I basically die every time I sleep, my brain hardware generates a new consciousness when I wake up, and my current consciousness has inherited the memories of its predecessors. I don't like to dwell on this much.
 
Upvote
3 (4 / -1)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
Though does this teleport the consciousness, or does it "only" teleport organic hardware that will then immediately produce a new consciousness that is indistinguishable from the pre-teleport consciousness (to others and itself)?

I am not convinced that the gap does not matter. What if instead of Star Trek-style teleportation we use a much cruder method? The person is frozen, and a huge staff of really good surgeons tears down body and brain, atom by atom, while writing down an exact construction blueprint on a lot of paper. The blueprint is sent by snail mail across the globe. Four weeks later it arrives at the receiver lab, where a different huge staff of really good surgeons gets to work: they order suitable atoms from storage, and atom by atom they rebuild the person. The person awakens and says, "Neat, it's me alright, but a plane would have been faster."

I have a hard time seeing this as the same person and consciousness, even though I think the person would feel like being the same as before the process. If the sender lab immediately rebuilds the person from the original atoms, it is more obvious that the person at the receiver lab is a fresh but separate copy - discarding the original body atoms just makes it easier to maintain the illusion. One might even send along the atoms (fastidiously labeled individually) and use those as the building material at the receiver lab, but again I have trouble seeing how that really preserves the original consciousness, as this would imply that somehow each individual original atom carries along a bit of that original consciousness.

Personally I suspect that I basically die every time I sleep, my brain hardware generates a new consciousness when I wake up, and my current consciousness has inherited the memories of its predecessors. I don't like to dwell on this much.

You missed the “if we assume consciousness arises out of physical processes”. If it does, there’s no difference between reproducing “organic hardware” and reproducing consciousness. Not to mention that something that is indistinguishable is, well, not distinguishable: if you can’t tell the new version apart from the old one, then you have no basis to argue that there’s a difference. Also, atoms and fundamental particles in general do not work the way you’re thinking. There is no “labeling”. Two photons with the same properties are not distinguishable even in principle.

Hell, for all you know, the whole process was an elaborate hoax on you, and the guy actually just took a steamship while being comatose.

There isn’t any gap unless you subscribe to the view that consciousness isn’t something that’s just physical process. In which case we don’t know anything about whether teleporting it is a concept that even makes any sense at all.
 
Upvote
0 (0 / 0)

BrangdonJ

Ars Praefectus
3,582
Subscriptor
I write thriller novels for a living, and I'm quite familiar with the concept of flow, which I go in and out of when I'm writing. It really is an interesting experience, but you're aware of it when you're in it. You don't exactly enjoy it, at least I don't, but that's when you do your best and most creative work. Everything seems smooth and accessible. I think most people who do intellectual styles of work that require focus and concentration (creative writing, painting, music, coding) break into something like flow, if they do it long enough. Bottom line, for me anyway, is that flow isn't the same as being without consciousness.

However. Years ago, in 1980, I solo paddled a canoe from the headwaters of the Mississippi to the Gulf of Mexico. I was traveling alone the whole time. I had no radio, and once below Cairo, Illinois, the towns were often days apart. There were no other boaters, other than commercial tows. I experienced a couple of odd mental states along the way. As I traveled, I was sitting still and upright, my legs folded under the seat, in near silence, almost as in Zen meditation, as I paddled. At some point below St. Louis, after a month or so of travel, I began to experience the mental state that I call emptiness, for lack of a better word. My mind would simply go away. I never knew when it was going to happen, I only knew when I came out of it, because I would sigh, and look around to see what caused me to "wake up."

But I wasn't asleep. Typically, I would find myself anything from a mile to several miles down the river from my last conscious thought. (I had a large scale map book on the deck of the canoe in front of me, and when "conscious" I always knew exactly where I was within a few yards.) I had paddled that distance without conscious awareness of it. The Mississippi is usually quite placid at the time I was paddling it (August through October) but it's also dangerous, because of the tow boats pushing barges. And it was usually the arrival of a tow boat coming toward me, or catching up to me, that caused me to break out of the emptiness. I'd sigh, and be back in the world.

I can provide no characterization of the emptiness. It wasn't anything. I don't think of it as being pleasant or unpleasant, restful or stressful, desirable or undesirable. It was simply empty. I think people who seek that state, as some religious folks do, impose some pre-conceived ideas about it. They may get there through meditation, but any significance they put on it is conceived beforehand. I personally wouldn't seek it, because I imagine it's something like death.
One of the challenges is distinguishing between not being conscious during those periods, and just not having a memory of them. Arguably you weren't "empty", you were fully conscious and aware, but you weren't laying down any long-term memories because there wasn't a need to. And that was because it was routine; that portion of the journey lacked enough novelty for it to be worth remembering. And the arrival of the tow boat changed that because it did have enough novelty. Which is to say, how you dealt with it was worth remembering.

Is that wrong? Do you in fact have memories of the "empty" part of the journey?
 
Upvote
2 (2 / 0)

GrimPloughman

Smack-Fu Master, in training
55
You said exactly the same thing, just in more words. “The thing that needs to be transferred to send you into a different brain”… where “you” means “consciousness”.
No. I purposely avoided wording like "send you" to avoid self-reference. In definitions, proper wording is everything. I don't reference consciousness to define consciousness; I establish a general idea that I then name as consciousness.
Consciousness being some mystical thing separate from what your brain is doing isn’t a generally accepted idea
There are many ideas trying to explain consciousness that I'm aware of. They are all being discussed.

This is a list of ideas that I came across over the internet:
  1. Consciousness is generated by the neural network created by neurons within our brains.
  2. Our brains use quantum phenomena to generate consciousness
  3. Our brains use physics going beyond quantum physics, more fundamental than quantum physics, to generate consciousness.
  4. Consciousness is something fundamental just like space and time.
  5. Consciousness is the only fundamental thing and it generates space and time.
  6. We don't understand what consciousness is because it goes beyond our brains' capabilities. Just like a dog can't comprehend what prime numbers are, we can't comprehend what consciousness is.
 
Upvote
1 (2 / -1)

GrimPloughman

Smack-Fu Master, in training
55
You missed the “if we assume consciousness arises out of physical processes”. If it does, there’s no difference between reproducing “organic hardware” and reproducing consciousness. Not to mention that something that is indistinguishable, is well, not distinguishable: if you can’t tell the new version apart from the old one, then you have no basis to argue that there’s a difference.
Let's imagine that we built a teleport and there are two chambers: "original" where you sit and "copy" where your exact copy is created.

If the device doesn't kill you, after teleportation would you be the first one, the second one, or both of those people at once? I suspect that you would just sit inside the "original" chamber and watch through its window as an identical person emerges in the second chamber. And you wouldn't care if the process failed and the copy that emerged wasn't alive, but you wouldn't like the machine to kill you during that process.
 
Upvote
3 (3 / 0)

panton41

Ars Legatus Legionis
10,402
Subscriptor
I had a tooth extracted by an oral surgeon. Afterwards he showed me an X-ray, and from its contents I realized it must have been taken after I was put under anesthesia. The surgeon told me that not only was I awake, but that I was cooperative when it was taken. Which surprised me, because I had absolutely no memory of it.
Hell, when I was a teenager, my father was a TV newsman who arranged for his station to do a live feed of my high school play for their morning show. He was supposed to pick me up on the way up, and he was going to shoot the video. The bus comes and goes, and my father doesn't pick me up. We're watching the show of my high school, and my mother is getting more and more angry that he didn't come by.

Turned out, the night before, around 3-4 am on his way to work, he woke me up to tell me he wouldn't be able to pick me up because he wasn't going to be the photographer. By morning, I couldn't remember any of it because I wasn't fully awake when he told me, despite having had a two-way conversation.

And that's just normal sleep.

I had surgery a few years ago to remove a cyst from my scalp and remember having dreams while under anesthesia. (Something about a friendly Orc being discriminated against in a fantasy village.)
 
Upvote
1 (1 / 0)
Let's imagine that we built a teleport and there are two chambers: "original" where you sit and "copy" where your exact copy is created.

If the device doesn't kill you, after teleportation would you be the first one, the second one, or both of those people at once? I suspect that you would just sit inside the "original" chamber and watch through its window as an identical person emerges in the second chamber. And you wouldn't care if the process failed and the copy that emerged wasn't alive, but you wouldn't like the machine to kill you during that process.

You're not thinking like a dinosaur!
 
Upvote
1 (1 / 0)
  1. Our brains use quantum phenomena to generate consciousness
  2. Our brains use physics going beyond quantum physics, more fundamental than quantum physics, to generate consciousness.
  3. Consciousness is something fundamental just like space and time.
  4. Consciousness is the only fundamental thing and it generates space and time.
These four are considered quack theories as there's ... no physics to support them.
 
Upvote
7 (7 / 0)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
Let's imagine that we built a teleport and there are two chambers: "original" where you sit and "copy" where your exact copy is created.

If the device doesn't kill you, after teleportation would you be the first one, the second one, or both of those people at once? I suspect that you would just sit inside the "original" chamber and watch through its window as an identical person emerges in the second chamber. And you wouldn't care if the process failed and the copy that emerged wasn't alive, but you wouldn't like the machine to kill you during that process.
If a copy is actually possible, then the answer is obvious. There are now two “you”s which are identical. That’s the definition of a copy. The person in the second chamber looking through its window at the original is just as much a “you” and they may not care very much if the “original you” is subsequently killed either. That is what a copy means.

What’s the point of this whole setup anyway? How does it tell us anything at all about what consciousness is or isn’t?
 
Upvote
5 (5 / 0)

nivedita

Ars Tribunus Militum
2,054
Subscriptor
No. I purposely avoided a wording like "send you" to avoid self-reference. In definitions a proper wording is everything. I don't reference consciousness to define consciousness but establish a general idea that then I name as consciousness.
OK, if you want to be pedantic, then let’s go over what you actually said. You said consciousness is the thing that needs to be transferred in order for “you” to perceive the world as the second brain. You also then said “you” is consciousness. If “you” is not the same as “the thing being transferred”, then there are two things that are both consciousness. Is that really what you meant to say?
 
Upvote
-1 (0 / -1)
“I like to think that I think.”

You started your statement with "I," and the problem is the same with "I think, therefore I am."

It's a fallacy. There is no independent evidence for it. It's a concept/thought and nothing more.

Both "I" and "consciousness" are undefinable except by other words, which themselves are just tags we use for convenience.

If you ask, "How does the mind decide to have a thought? Is there something else that decides?" and you start following that logic train, you always end up OUTSIDE of your body somewhere.

Everything is required for "consciousness," whatever that is. Saying that the brain is responsible for it ignores the rest of the universe.

Sorry…I’m waxing philosophical today ;)
Even if the "I" is a simulation, even if the "I" is a hallucination, even if the "I" is an illusion, it exists.
Descartes wins, you lose.
 
Upvote
2 (2 / 0)

bebu

Wise, Aged Ars Veteran
184
I have a nasty suspicion that consciousness (whatever it might be) is intimately related to the nature of time itself (whatever that might be.)
Drawing a long bow, I would then imagine that anything which 'experiences' time has consciousness in some sense.
Organisms or systems with only tangential interactions with time should then have a correspondingly minimal consciousness. Even the most basic unicellular life forms have an extremely complex presence in space and time.
A corollary would be that even the most minimal life form necessarily partakes of consciousness in some sense.
Posing one imponderable in terms of another imponderable doesn't really progress the question at all, other than noting that fundamental questions about the nature of space, time (and gravity) are more likely to see some progress toward partial answers than similar foundational questions about the nature of consciousness.
If my hunches are anywhere near the mark, I think they would also imply that contemporary AI/LLM is inconceivably far from ever achieving a level of consciousness even approaching that of an amoeba.
Que sera, sera
Whatever will be, will be
The future's not ours to see

Thank goodness - the past has horrors enough!
 
Upvote
-3 (1 / -4)