Anatomy of a Conversation


(This post is a work in progress. Your feedback is welcome.)

"Restlessness inherent to screens, the inability to ever linger or pause or catch your breath... It’s a strangely disembodied experience, a sense of ceaseless, rustling motion when nothing is moving at all: electrical pulses flash and gasp beneath the oceans, your mind strains to catch up, your body remains still save for a few twitching digits, the shell that’s left behind when your spirit evacuates for the mirage of higher ground. We become as smooth and reflective as the screen itself, all glassy surfaces and metallic edges obscuring the hollowness within. No need to fantasize about what it might be like to upload your consciousness to the machine—most of us are already there." ⏤ Coming Home, Mandy Brown

We’re living in a world that’s changing how we see and interact with everything around us. With our attention spans getting shorter and our memories stored in computers, we’re trying to find our way in this new reality.

Our relationship with technology is becoming more transparent, but not in the way a friendship is. Technology reflects us back to ourselves, but only from one side. It remembers things we forget, answers every question, and guides us to places we’ve never been. Yet it works like a black box: it keeps our data but hides how it really works.

When you try to remember a vacation you had with your family twenty years ago, there may be gaps in the story. Some details may have been forgotten or misremembered. However, a photo from that day remains unchanged. Technology freezes our past, unlike human memory, which changes over time. Figuring out how to live with this fixed version isn’t simple.

As technology becomes a bigger part of our lives, our conversations are changing.

The Conversation Itself

When we consider the anatomy of a conversation, we encounter a form of communication that embodies both symmetrical and asymmetrical characteristics. (This, in itself, carries an inherent asymmetry.)

A conversation usually begins with a greeting. One person speaks while the other listens and starts to form a response in their mind. The roles switch after the response is given, and this cycle goes on for some time. If we were to show this process with speech bubbles, we could describe it as having a certain symmetry. This symmetry, however, varies with the roles of those who talk, the depth of the topic, and the flow of the conversation.

    Press to Send

    Each bubble shows people sharing ideas or thoughts. Short stories may come up, and responses often start to match in length as the talk continues. Every conversation has a rhythm. Of course, this can vary with the topic: informal discussions tend to have a steady rhythm, while formal ones involve asymmetries and disruptions.

    LLMs’ ability to maintain conversational rhythm and flow

    “If you want to converse with me, first define your terms, you will permit me again to say, or we shall never understand one another.” ⏤ Voltaire

    This remark emphasizes the importance of shared understanding and a steady flow in conversation, something LLMs often struggle with. Talking to an LLM differs from human-to-human interaction because it is one-sided, and that difference stems from how LLMs generate and deliver their responses.

    Pauses, anecdotes, and introspection all contribute to the rhythm of a conversation. It’s like a dance that changes with the mood and what people want to say. But LLMs, these chatbots, often give exhaustive answers to simple questions. This can disrupt the conversation's flow and make it seem less natural. Additionally, the speed at which they produce these responses can make the conversation feel rushed, leaving little room for the brief silences and pauses we rely on in daily life.

    LLMs don’t start or guide conversations by themselves. Unlike us, they don't get bored or need breaks while they're in a conversation. They also give out false information (hallucinations)1 and don’t notice or fix these mistakes. They lack the self-reflection (reasoning)2 that we use to check and revise our answers. And unlike us, who naturally vary the timing of our responses, LLMs respond almost instantly. This quick response, while helpful, can feel robotic and disrupt the natural flow of a conversation.

      To make the conversation feel more natural, we can make some changes:

      • The models could slow down their responses to suggest deliberation, which would make the conversation feel more real.
      • Instead of always answering right away, the model could sometimes ask questions, suggest options, or change the topic.
      • The questions the model asks should show that it wants to learn or build on what we’re saying.
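As a rough sketch of the first idea, pacing, here is a minimal Python illustration. The function name, delay values, and the punctuation-based pause heuristic are all hypothetical choices for the sake of the example, not any particular model's API:

```python
import time

def paced_reply(text, word_delay=0.05, pause_delay=0.25):
    """Assemble a reply word by word, sleeping briefly between words
    and longer after punctuation, so the response unfolds at roughly
    a reading pace instead of arriving all at once."""
    pieces = []
    for word in text.split():
        pieces.append(word)
        # Pause longer at clause and sentence boundaries, briefly elsewhere.
        time.sleep(pause_delay if word[-1] in ".,;?!" else word_delay)
    return " ".join(pieces)
```

In a real interface the same idea would apply to token streaming rather than whole words, but the effect is the same: small, deliberate delays that mimic the cadence of thought.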

      By addressing these issues, LLMs can better match the natural flow of human conversation. They may never match the spontaneity and emotional depth of human interaction, but thoughtful changes can make their participation in dialogue feel more engaging and less one-sided.

      LLMs are good at memorizing

      Francois Chollet, the creator of Keras, describes LLMs as interpolative databases.3 As Chollet points out, LLMs lack the ability to extrapolate; they memorize patterns to answer questions rather than genuinely understanding or reasoning.

      Fundamentally, LLMs serve as external storage for us. Through their interactions with users, these models collect new sources of data in addition to preserving existing ones. But their ability to learn is essentially different from human educability. Learning new information does not make them smarter or more adaptive than humans. Instead, after their training phase, their cognitive "growth" stops; it's as if their minds stop working on graduation day.4

      LLMs, despite their limitations, have the potential to play a significant role in our lives. We might approach them as mind-openers rather than just as instruments for writing text, creating interfaces with a few prompts, or responding to basic questions.

      Expecting them to invent a new chemical formula or reach conclusions without experiments overlooks the value of experience and time. A better way is to use LLMs like Bob Ross uses his brushstrokes, to spark ideas and help our creativity grow.

      Ideas Machine

      "It is up to us to decide what human means, and exactly how it is different from machine, and what tasks ought and ought not to be trusted to either species of symbol-processing system. But some decisions must be made soon, while the technology is still young. And the deciding must be shared by as many citizens as possible, not just the experts. " ⏤ Tools for Thought, Howard Rheingold

      ...


      1. Hallucinations are confident but incorrect or fabricated responses, arising from reliance on patterns in training data rather than factual understanding.

      2. Reasoning is the simulation of logical processes using patterns in training data. They mimic deductive, inductive, and abductive reasoning by predicting text based on learned relationships, but lack true understanding.

      3. Pattern Recognition vs True Intelligence

      4. What Does It Really Mean to Learn? by Joshua Rothman
