> Oh yup. The way I've always described myself is I have extremely "muted" emotions, bordering on none the overwhelming vast majority of the time. I only very rarely feel extreme emotions of any kind.
I remember infuriating my mother almost every day after school when she'd ask "How was it?" and I would just shrug and say, "I don't know."
She thought I was being evasive or something, but I was being completely honest. I genuinely didn't have an answer because my internal state was, as you describe perfectly, muted. Most of the time, I just felt... like a neutral, warm grey. Well - still do. There was no data to report.
> I don't remember hardly anything about my own past outside of factual information, and that tends to fade rather extremely with time. Even times when I was quite literally sobbing I don't remember the emotional impact of, just the fact that it happened, sometimes not even the cause.
I think remembering the fact of sobbing but not the feeling is the perfect distinction between semantic memory ("a thing that happened") and autobiographical memory ("an experience I had"). The factual data point was recorded, but the emotional qualia wasn't encoded for retrieval.
> On the other hand, I have extremely good factual memory about random shit and can usually build up a solid approximation for how something works from first principles on demand for an extremely broad array of things. Trade-offs, I guess.
I wouldn't even necessarily call it a trade-off so much as a logical consequence. If the brain's system for storing rich, first-person experiential data is impaired, it makes sense that it would rely on and strengthen its system for storing third-person factual data. The "what" gets stored efficiently because the "how it felt to be me when it happened" isn't taking up much space on the hard drive.
> It's what I imagine being an AI would feel like from the perspective of the AI.
Sounds about right to me. I feel the same. I have access to the facts, like the achievement I described above, which I'd argue is objectively fairly impressive, but I don't seem to have the emotional data. So, I can reason myself into knowing that I achieved something - but I'm not feeling it.
> Even times when I was quite literally sobbing I don't remember the emotional impact of
This is possibly touching on the problem I'm trying to navigate lately. Someone will display observable emotionality while denying the subjective experience, and I'm not sure what to do with that. I'm not trying to cure them or anything; all I want is to get them to understand that their lack of internal experience of emotion doesn't improve things out here, where I'm still dealing with an agitated person yelling at me about how fine they are. Does this seem to align, or does it sound unrelated to you?