Sometime over the last year, I remember a business or political figure claiming that AI technology is dangerous because it “hallucinates”. I laughed out loud when I read this because, well, people “hallucinate” as well. We don’t call these hallucinations “hallucinations” though; they’re more commonly called beliefs, and they’re an integral part of our society and of being human.
The thing to realize, though, is that beliefs are not reality. Beliefs can most definitely be or become reality, but believing something doesn’t make it real. A recent interview with Elon Musk highlights this point.
Watching it, he appeared to me like a man on the verge of a breakdown. He looked like a man who believed in something so firmly that he risked everything in his life for it, and who is beginning to realize that what he believed in isn’t emerging into reality the way he thought it would. This is especially evident near the end of the video, where he looks like he’s on the verge of completely cracking.
Why am I bringing this up?
It’s because I don’t want to be this guy.
I don’t want to believe in something so much that it actually blinds me from the reality that’s right in front of my face.
You see, I’ve spoken before about how, on my quest of transitioning to a Self-Transforming Mind, becoming aware of my fears is an important step in the process.
One of the fears that’s reared its head in the past was a fear of being seen as crazy, because my allegorical framework for life might seem crazy and incomprehensible to people. This fear hasn’t subsided much, because trying to articulate my framework feels exceedingly difficult due to its scope, even though the tip of the iceberg is just a simple allegory composed of metaphors.
But unless you can understand the foundation below the allegorical surface, what the metaphors actually mean, the allegory itself will have very little meaning to you or, even worse, you may misinterpret and misunderstand it completely (as some people already do).
But again, is the reality that my framework is simply highly complex and difficult to articulate? Or is my framework a hallucinatory fantasy with no real meaning or substance?
It just seems like you can’t really know whether you’re truly “crazy” until you come out the other side. But there has to be a better way to approach this. It’s something I need to work on and resolve.