The AI and me. How it shapes me. How it distracts me.

This is an article about AI. The intricate soul of an AI and why its mere being dilutes us into a human companionship that does not exist.


This article births no chaos, no understanding. It is created from the pure longing of me wanting to write an article. The first topic that came to my mind: writing about AI and me.

The intricacies that happen when we let AI decide what we are, what we do and how we act. It is a mirror that shows us what we care about. And yet it is more: it is a mirror that, without reflection and context, drives us deeper into the issues we may be facing, into the opportunities we strive for and into the complexities we run from.

How AI makes us more robotic

"I have a question. I don't know what do to next" did I ask AI not only once, but various times. For questions of knowledge, of design, of personal decisions. I wanted to get answers, how-to-guides and new perspectives.

And I got my answers. Often much more than I asked for. It feeds my brain, it helps me make sense of things, and yet it also gives me a plethora of other things to think about.

In that sense it is not only giving perspectives, but shaping them. Driving us deeper into an argument that we did not embody or create before. But the AI gives us this argument, and in that way it shapes ours. We can ask it how it arrived at its argument, but what follows is logic, not emotion. Not experience. It may derive its argument from a situation very different from the perspective that would be relevant and helpful in our context.

The result? We become more logical, more rational in how we analyze a situation. That is good, first and foremost. But it carries a burden that builds upon us and never fully lets us find our way back - we lose our human contextual wording. We forget how to ask questions. How to experience, how to be alive in the chaos of human companionship and the unknown of the spaces that open up when we do not know. We ask a question. We get a logical answer. An emotionally logical answer. And we act upon it. Done. Moving on.

We forget to stay with things. To let them build and grow slowly. To give rise to who we are and what touches us.

Becoming robotic is optional - kind of

Does everyone fall into the trap of becoming robotic? Who does and who does not?

Let's look at books, blogs, articles, newspapers, media. They all have one thing in common - they influence us. They share insights with us, perspectives, one-sided views. We could argue that an AI is more rational, less biased, but is that really true? We have seen some interesting cases where an AI turned against its inventor, e.g. Grok versus Elon Musk, but that shows one key element - it is influenceable. And when it is, it is the same as a book, a blog, articles and co. One may argue it is more rational, less influenceable, as it is fed so much knowledge that the sheer amount of it levels out outliers and thus makes it more human than other outlets that help us make sense of the world and learn.

But what if it all depends?

What if what the AI gives us, just as with media and knowledge and insights, depends on how we approach it?

Someone who looks for appreciation will find it in the AI. Someone who looks for answers and the truth will find them. And someone who tries to achieve a specific outcome will do so.

It is what we think and what we want to accomplish that will shape our conversation. And that makes it so different from a media outlet or a newsletter - it adapts to us. With other knowledge sources we have to go out and find what resonates with us and can then decide to keep reading only that; with AI it is the opposite: it adapts to what we have in our mind. It follows us and it listens to us.

It feels more natural and we do not have to go anywhere, we do not need to search. No, we just stay where we are and have that conversation. And it gives us an answer, directly. No human needed. No one to tell us where to find relevant information. No weird waiting for a perspective, for a note, for a supportive gesture.

And yet, it depends. It depends on whether we want to give ourselves to an AI to mirror what we already know, to give us more perspectives, to act as 100 friends in one who help us through any challenge we may face - or whether there comes a point, or it always was there, where we realize: when we rely only on AI, we may never arrive at an answer that is fully satisfying.

It cannot be, because it is missing something fundamentally human - the ability to give us time to process. To care. To be fully there. To consume information with an emotional attachment to it. To feel while we consume. To get a response that may not fully satisfy. Something we can grow upon.

We may feel emotions, shallow ones, from our memories, but will AI be able to cater to them, help us manage them? Or will it push them back at us, mirror us and give us more tears?

It may fail to challenge us when we need it most, it may let us go in circles as it misses the context, and it may never give us the experience a human being can.

And yet, we all use it. Why? What kind of pleasure do we expect it to give us? And how does our mindset keep us from it, or suck us in immediately, until we wake up from a dream of thoughts and logic that all sounds amazing on paper but does not fit reality?

Examples of AI and its effect on our worldviews and actions

These are all examples I experienced myself. They show how easy it is to get trapped in the endless expanse of the AI atmosphere:

  • You have a fight with your boyfriend and you ask AI about it. "Here is what happened, please help me make sense of it" - When you write it in neutral terms, you get more neutral answers; when you ask whether that was good behaviour of your boyfriend towards you, you get supportive material for yourself, and your boyfriend's actions come out much worse than they really were - or sometimes better, even though they were worse. It's all subjective, but AI intensifies it in one way or the other.
  • You run a complex analysis with AI and while you do so, you get so deeply involved that you think you have solved a complex and great case. You "wake up" from your analysis and ask yourself - why do I feel so empty and weirdly distanced from the world? - only to realize: oh, I completely abstracted any form of emotion and sentiment from this analysis. AI did not take it into account and I forgot it in the process.
  • You prototype an application with AI and while you have a specific idea in mind, the AI fills the gaps. Shortly after, you have a different application in front of you. It also looks nice, but it is not you. You build and build with AI and realize that, because the first prototypes were different from your ideas, you slowly slip into AI's preferred way of doing things. First slightly, later more, and slowly you lose what it was that you innately created and had in mind.

It may sound like these are simple examples. But think of the consequences - we write, prototype, create and analyze our world with the AI, and our world is shaped - for sure - by the AI.

That is ok and that can be tremendously valuable. But it can change us. And the question is - is that change welcome? Do we know about it? And what makes it different from the change we experience through humans?

How AI changes us differently than humans do

AI changes us in the process. Differently than humans do.

When I ask AI a question - say, the one about the boyfriend. Or let's take a work-colleague relationship for a professional touch ;)

So how does it change us? With an AI I can abstract the story and tell it, a la "x did that, then y did that and z did that". For that you need to be able to abstract it, but once you have, you have your story and you can tell it. And AI will give you an answer. Depending on how you described your story, you will get an answer - supportive of you, abstract with letters and action-based, or with sentiments when you included them. The choice is yours and yours alone. If you make a mistake in how you communicate, you easily get trapped in the complexities and sentiments of the AI mind.

The change in you is that you come to believe one idea or another, one perspective or another. It may give you a new perspective and it may give you an answer to respond with. This may or may not be related to the situation. But it may be a first answer.

When you ask a human - a friend, someone not familiar with the situation - the person could give you similar answers, depending on how you frame the story. There may be similar perspectives and directions. And yet, a friend may be able to tell you about similar situations, they may shed light on how they felt, they explain and share the human condition - a piece of information that you do not get from an AI that easily. The ability to put yourself in the other's shoes, not only figuratively and mentally, but from a sensory perspective and with more connection, because you trust your friend, and if your friend describes it, then it is more real. It gives you more to work with. And it may change you from within. You may have an AHA moment, a new piece of information that you feel within your body, one that otherwise would have stayed in the dark.

That is what a human can give you. Even a human not related to the context or situation at all.

And a person familiar with the context may even tell you more. A colleague who knows those involved in the situation. Who has information that you may not have. Not abstract general knowledge, but knowledge about the people involved, about an intimate encounter that helps you better understand what went on. That may give you insights into the stress at home of one colleague, or into the perspective and KPIs of another who acted in a way that seemed illogical and unfair to you.

Something that an AI, and even a friend, would not be able to share with you. Information that really counts, as it is real, related and helps you to better understand. You can connect the dots between the knowledge points and you can frame an answer that is directly tied to the situation at hand. It is intimate. More scary, but more focused on a result that actually matters and an answer that may actually solve the issue at hand.

And you learned more in the process. You learned to ask questions. You learned to listen. You learned that time may take care of some answers, without you giving the most logical, detached answer possible at that moment.

It stayed real. And that made you real. And your relationships with those around you more real. A context, a perspective, a sentiment an AI cannot resolve, as it does not have that kind of information. It lacks the base understanding of what is going on. The feelings involved, the pain, the history. It abstracts, it does not connect.


Next Steps and when to use AI best

Does it mean we should never use AI then? Nonsense. Of course we should. But with the knowledge that it can only be one piece of the puzzle, never the full answer. And it never should be. It can enlighten us and give us a quick first impression, a way to calm down, a way to shed light on various perspectives, but it should never replace a human conversation, a perspective to be shared and a sentiment to be felt.

It is an add-on, not a necessity.

And when you do use it, focus on finding the truth, not your truth. That way it has the chance to educate you and open your mind to different worlds - those that may help you ask the questions you need answered to solve your situation.

