
Anatomy of a Conversation


(This post is a work in progress. Your feedback is welcome.)

"Restlessness inherent to screens, the inability to ever linger or pause or catch your breath... It’s a strangely disembodied experience, a sense of ceaseless, rustling motion when nothing is moving at all: electrical pulses flash and gasp beneath the oceans, your mind strains to catch up, your body remains still save for a few twitching digits, the shell that’s left behind when your spirit evacuates for the mirage of higher ground. We become as smooth and reflective as the screen itself, all glassy surfaces and metallic edges obscuring the hollowness within. No need to fantasize about what it might be like to upload your consciousness to the machine—most of us are already there." ⏤ Coming Home, Mandy Brown

We’re living in a world that’s changing how we see and interact with everything around us. With our attention spans getting shorter and our memories stored in computers, we’re trying to find our way in this new reality.

The transparency of our relationship with technology is increasing, yet it differs from the openness we have with a close friend. Technology provides a one-sided reflection; it remembers personal information that we could forget, answers all of our questions, and directs us to locations we have never been. It works something like a "black box," holding our data while keeping its internal operations totally secret from us.

When you try to remember a vacation you had with your family 20 years ago, there may be gaps in the story. Some details may have been forgotten or misremembered. A photo from that day, however, remains unchanged. Technology, by keeping the past safe and frozen, challenges the nature of human memory. But figuring out how to deal with this fixed reality—whether to get lost in it or accept it—isn’t easy.

The word “wired” carries a meaning of being tightly connected, bound, or intertwined. It evokes the image of something complex, like the network of cables connecting one object to another or the nervous system extending throughout the body. It is as though your mind is constantly racing with ideas—like a squirrel caught in a tornado of excitement, but with thoughts in place of nuts.

In an era where technology increasingly shapes human interaction, the conversation — our most fundamental way of perceiving the world and staying wired — is being redefined.

When we consider the anatomy of a conversation, we encounter a form of communication that embodies both symmetrical and asymmetrical characteristics. (This, in itself, carries an inherent asymmetry.)

A conversation usually begins with a greeting. One person speaks while the other listens and starts to form a response in their mind. The roles switch after the response is given, and this cycle goes on for some time. If we were to show this process with speech bubbles, we could describe it as having a certain symmetry. This symmetry, however, frequently varies based on the roles of those who talk, the depth of the topic, and the flow of the conversation.

[Interactive speech-bubble demo: “Press to Send”]

Each bubble represents the process of exchanging information, ideas, or thoughts between participants. Small anecdotes may be introduced, and the length of responses often converges over time. A discourse has a rhythm until it ends. Of course, this can vary depending on the topic of the conversation; informal discussions tend to have a steady rhythm, while formal ones involve asymmetries and disruptions in rhythm.

“If you want to converse with me, first define your terms, you will permit me again to say, or we shall never understand one another.” ⏤ Voltaire

This phrase emphasizes the importance of shared understanding and a steady flow in conversations—something that LLMs often struggle with. Talking to an LLM differs from human-to-human interaction because it is one-sided, a difference that stems from how LLMs generate and deliver their responses.

Pauses, anecdotes, and introspection all contribute to the rhythm of a conversation. It’s like a dance that changes with the mood and what people want to say. But LLMs, these chatbots, often give overly detailed answers to simple questions. This can disrupt the conversation’s flow and make it seem less natural. Additionally, the speed at which they produce these responses may make the conversation feel rushed, leaving little room for the brief silences or pauses that we use in daily life.

LLMs don’t start or guide conversations by themselves. Unlike us, they don’t get bored or need breaks while they’re in a conversation. They also give out false information (hallucinations)1 and don’t notice or fix these mistakes. They lack the self-reflection skills (reasoning)2 that we rely on to check and revise our answers. Unlike us, who naturally vary the timing of our responses, LLMs respond almost instantly. This quick response, while helpful, can feel a bit robotic and disrupt the natural flow of a conversation.

To make the conversation feel more natural, we can make some changes:

• The models could slow down their responses to make it feel like they’re thinking. This would make the conversation feel more real.
• Instead of always answering right away, the model could sometimes ask questions, suggest options, or change the topic.
• The questions the model asks should show that it wants to learn or build on what we’re saying.
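The first two suggestions can be sketched in a few lines of code. This is a toy pacing layer under stated assumptions—`thinking_delay`, `paced_reply`, and their parameters are hypothetical names invented here, not any real chatbot API:

```python
import random

def thinking_delay(reply, seconds_per_word=0.15):
    """A 'thinking' pause scaled to the reply's length, so longer
    answers appear to take longer to compose."""
    return len(reply.split()) * seconds_per_word

def paced_reply(reply, clarifying_questions, ask_probability=0.3, rng=None):
    """Return (delay_seconds, text). With some probability, ask a
    clarifying question instead of answering right away."""
    rng = rng or random.Random()
    if clarifying_questions and rng.random() < ask_probability:
        text = rng.choice(clarifying_questions)  # redirect the conversation
    else:
        text = reply  # answer as usual
    return thinking_delay(text), text
```

A caller would sleep for `delay_seconds` before displaying `text`; the probability and the pace are knobs to tune, not fixed values.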

By addressing these issues, LLMs can better match the natural flow of human conversations. They may never match the spontaneity and emotional depth of human interaction, but thoughtful changes can make their participation in dialogue feel more engaging and less mechanical.

Francois Chollet, the creator of Keras, describes LLMs as interpolative databases.3 As Chollet points out, LLMs lack the ability to extrapolate; they memorize patterns to answer questions rather than genuinely understanding or reasoning.
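To make the metaphor concrete, here is a deliberately simplistic analogy—an assumption of this post, not a claim about LLM internals. An “interpolative database” can only blend between memorized examples; asked for anything beyond them, it can only repeat what it has seen at the edge:

```python
def interpolative_lookup(memory, x):
    """Estimate f(x) by linearly blending the two nearest memorized
    points. Outside the memorized range, no extrapolation happens:
    the boundary value is simply repeated."""
    xs = sorted(memory)
    if x <= xs[0]:
        return memory[xs[0]]   # below the range: parrot the edge
    if x >= xs[-1]:
        return memory[xs[-1]]  # above the range: parrot the edge
    for left, right in zip(xs, xs[1:]):
        if left <= x <= right:
            t = (x - left) / (right - left)
            return (1 - t) * memory[left] + t * memory[right]

memory = {0: 0, 1: 1, 2: 4, 3: 9}  # a few memorized samples of f(x) = x**2
```

Between memorized points the blend is plausible (`interpolative_lookup(memory, 1.5)` gives 2.5, close to the true 2.25), but far outside them it just repeats the boundary value (at `x = 10` it still answers 9, nowhere near 100).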

Fundamentally, LLMs serve as external storage for us. Through their interactions with users, these models—which have absorbed a significant amount of the internet—collect new sources of data in addition to preserving existing ones. But their ability to learn is essentially different from the human quality of educability. Learning new information does not make them smarter or more adaptive than humans. Instead, after their training phase, their cognitive "growth" stops—it's as if their minds stop working on graduation day.4

LLMs, despite their limitations, have the potential to play a significant role in our lives. We might approach them as mind-openers rather than just as instruments for writing text, creating interfaces with a few prompts, or responding to basic questions.

Expecting them to produce an undiscovered chemical formula or to reach conclusions without conducting any experiments would be an approach that disregards the importance of experience and the passage of time. Instead, we may use LLMs in a way similar to how Bob Ross inspires our thought clouds with each brushstroke—a method that enhances our concepts and fosters our creative processes.

"It is up to us to decide what human means, and exactly how it is different from machine, and what tasks ought and ought not to be trusted to either species of symbol-processing system. But some decisions must be made soon, while the technology is still young. And the deciding must be shared by as many citizens as possible, not just the experts." ⏤ Tools for Thought, Howard Rheingold

...


1. Hallucinations are confident but incorrect or fabricated responses, arising from reliance on patterns in training data rather than factual understanding.

2. Reasoning, in LLMs, is the simulation of logical processes using patterns in training data. They mimic deductive, inductive, and abductive reasoning by predicting text based on learned relationships, but lack true understanding.

3. Pattern Recognition vs True Intelligence

4. What Does It Really Mean to Learn? by Joshua Rothman