Conscious Awareness and the Feeling of Consciousness in Conscious Turing Machine Robots

DATE POSTED: September 2, 2024

:::info Authors:

(1) Lenore Blum ([email protected]);

(2) Manuel Blum ([email protected]).

:::

Table of Links

Abstract and 1 Introduction

2 Brief Overview of CtmR, a Robot with a CTM Brain

2.1 Formal Definition of CtmR

2.2 Conscious Attention in CtmR

2.3 Conscious Awareness and the Feeling of Consciousness in CtmR

2.4 CtmR as a Framework for Artificial General Intelligence (AGI)

3 Alignment of CtmR with Other Theories of Consciousness

4 Addressing Kevin Mitchell’s questions from the perspective of CtmR

5 Summary and Conclusions

6 Acknowledgements

7 Appendix

7.1 A Brief History of the Theoretical Computer Science Approach to Computation

7.2 The Probabilistic Competition for Conscious Attention and the Influence of Disposition on it

References

2.3 Conscious Awareness and the Feeling of Consciousness in CtmR

Importantly, CtmR’s Model-of-the-World processor (MotWp) constructs models of CtmR’s inner and outer worlds that are collectively called the Model-of-the-World (MotW). The MotW holds CtmR’s current and continuing view of its world. [KM1] [KM3]

The MotWp, collaborating with other processors, plays an important role in planning, predicting, exploring, testing and correcting/learning. The models it constructs contain sketches of referents in CtmR’s inner and outer worlds. Sketches are labeled with succinct Brainish gists. These labels indicate what CtmR learns or “thinks” about those referents over time. In particular, the sketch “CtmR”, whose referent is CtmR itself, will develop from scratch and eventually be labeled with SELF, CONSCIOUS, and FEELS. [17] [KM1] [KM3] [KM9]
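To make the sketch-and-label idea concrete, here is a minimal Python sketch of a MotW entry. The class and field names (`Sketch`, `referent`, `labels`) are illustrative assumptions, not part of the paper's formalism; the point is only that a sketch accumulates Brainish gists over time.

```python
from dataclasses import dataclass, field

@dataclass
class Sketch:
    """A sketch of a referent in CtmR's Model-of-the-World (hypothetical
    structure). Labels are succinct Brainish gists recording what CtmR
    learns or 'thinks' about the referent over time."""
    referent: str                         # e.g. "CtmR", "left leg actuator"
    labels: set = field(default_factory=set)

    def add_label(self, gist: str) -> None:
        self.labels.add(gist)

# The sketch "CtmR" starts from scratch and gradually gathers labels.
self_sketch = Sketch(referent="CtmR")
for gist in ("SELF", "CONSCIOUS", "FEELS"):
    self_sketch.add_label(gist)

print(sorted(self_sketch.labels))  # ['CONSCIOUS', 'FEELS', 'SELF']
```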

Both the Model-of-the-World and Brainish evolve over time and play an essential role in the feeling of “what it is like” to be a CtmR. Thomas Nagel’s “what it is like” (Nagel, 1974) is often taken to be the canonical definition of phenomenological consciousness. [KM1] [KM3] [KM9]

The infant CtmR has only a very foggy MotW that does not even include a sketch of itself. Sketches in the MotW develop gradually, become refined, and gather labels. For example, when the infant CtmR discovers it can move its left leg actuator by the “power of thought”, the MotWp labels the sketch of that leg actuator in the MotW as SELF. [18] (Built into each processor is an algorithm that tracks what happens in the outer world when the processor tries to do something. This is built in because tracking is necessary for making predictions and corrections in the MotW. When the infant’s planning processor gets its motor processor to move its left leg and the infant detects the movement via its sensors, the MotWp will label the left leg as SELF.) [KM3] [KM16]
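The built-in tracking algorithm described above can be sketched as a simple confirmation loop: when an attempted action is followed by the corresponding sensed event, the MotWp labels that part's sketch SELF. The function and argument names are hypothetical; this is a toy stand-in, not the paper's actual algorithm.

```python
def label_self_parts(motw: dict, attempted_actions: set, sensed_events: set) -> dict:
    """Hypothetical sketch of the tracking step: a part whose attempted
    movement is confirmed by the sensors gets its MotW sketch labeled SELF."""
    for part in attempted_actions:
        if part in sensed_events:                 # prediction confirmed
            motw.setdefault(part, set()).add("SELF")
    return motw

motw = {}
# The planning processor tries to move the left leg; sensors detect the movement.
label_self_parts(motw,
                 attempted_actions={"left leg actuator"},
                 sensed_events={"left leg actuator"})
print(motw)  # {'left leg actuator': {'SELF'}}
```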

Even earlier, “feelings” of pain and pleasure start to develop. When the infant CtmR’s fuel gauge gets low, some sketch (which becomes the sketch of the fuel gauge) in the MotW gets labeled with the Brainish word LOW FUEL/PAIN (or HUNGER), and this information, carrying a large negatively valenced weight, wins the competition and gets globally broadcast. This information triggers a processor to activate the fuel pump processor. The infant CtmR learns that the fuel pump relieves the pain when the fuel gauge indicates “low fuel” (hunger). The “fuel pump” in the MotW is labeled PAIN RELIEVER, and may also get labeled PLEASURE PROVIDER. [19] [KM3] [KM12]
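The competition step can be illustrated with a deterministic toy: each chunk carries a valenced weight, and the chunk with the largest-magnitude weight wins and is globally broadcast. (The paper's actual competition is probabilistic, as discussed in Appendix 7.2; the chunk fields and values here are invented for illustration.)

```python
def compete(chunks: list) -> dict:
    """Deterministic stand-in for CtmR's probabilistic competition:
    the chunk whose valenced weight has the largest magnitude wins
    and is globally broadcast to all processors."""
    return max(chunks, key=lambda c: abs(c["weight"]))

chunks = [
    {"gist": "LOW FUEL/PAIN", "weight": -9.0},   # large negative valence
    {"gist": "ROUTINE STATUS", "weight": 0.5},
]
winner = compete(chunks)
print(winner["gist"])  # LOW FUEL/PAIN
# A processor receiving the broadcast can then activate the fuel pump processor.
```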

Formal definition 2. Conscious awareness and feelings of consciousness in CtmR are consequences of Brainish-labeled MotW sketches being globally broadcast. [20]

In other words, CtmR becomes consciously aware of a chunk with a Brainish-labeled gist when CtmR pays conscious attention to that chunk. [21] [KM2]

As CtmR becomes increasingly consciously aware, the MotWp will label the sketch “CtmR” in the MotW as CONSCIOUSLY AWARE or simply CONSCIOUS.

Looking at CtmR from the viewpoint of the outside world, we see that something about CtmR is conscious; specifically, CtmR considers itself conscious. What is conscious cannot be the MotWp or any other processor, as processors have no feelings; they are just machines running algorithms. Our proposal that CtmR as a whole feels it is conscious is a consequence in part of the fact that the MotWp views the “CtmR” sketch in its MotW as conscious, and that this view is broadcast to all processors, and thus to all processors associated with feelings of consciousness. This is CtmR’s phenomenal consciousness.


:::info This paper is available on arxiv under CC BY 4.0 DEED license.

:::

[17] We are often asked: isn’t this process recursive? Doesn’t the sketch of CtmR contain a sketch of CtmR, which contains a sketch of CtmR, and so on? Yes, up to a point. But at each iteration the current sketch is degraded, so the process soon becomes null.

[18] Certain pathologies will occur if a breakdown in CtmR causes its MotWp to mislabel. For example, if the sketch of that leg actuator gets labeled NOT-SELF at some point, CtmR might beg to get its leg amputated, even if it is still functioning properly. This would be an example of CtmR body integrity dysphoria. Other pathologies due to faulty labeling in the MotW: CtmR phantom limb syndrome (a sketch of an amputated arm actuator is mislabeled SELF), Cotard’s syndrome (the sketch of CtmR is labeled DEAD), paranoia (the sketch of CtmR’s best friend is labeled SPY), … . [KM10]

[19] When a human mother gives a breast to her infant, the infant learns that the breast relieves the pain of hunger. The breast gets incorporated into the infant’s MotW labeled with PAIN RELIEVER and PLEASURE.

[20] We call a sequence of such broadcast chunks a stream of consciousness. CtmR dreams are such streams during which the input sensors and output actuators are essentially inactive, and CtmR’s Dream processor gets to work. Although dreams are “felt” as real, they can also be fantastical, since their predictions are not being tested in the world. We propose that (testing for) dreaming is a (partial) test for consciousness.

[21] Is the infant consciously aware? An infant CtmR’s memory of an instance doesn’t even have the label SELF associated with its MotW sketch, but soon enough such recollections will get labeled SELF, after which future recollections will include itself in that memory. [KM2], [KM3]