Imperfect

kinder to robots

At 12:00 in this Google DeepMind podcast episode, Redefining robotics with Carolina Parada, host Hannah said:

I think it's kind of a good job sometimes that these robots don't have feelings because it'd feel very forlorn, just being chased around on a table by researchers.

How would her opinion change once these AI-augmented robots started having feelings? The episode recounts progress in capabilities such as multimodal understanding, embodied reasoning, and even physical dexterity. With some people feeling that AI systems have leveled up to our emotional playing field, judging by how much virtual companions matter to them, how much more profound and widespread can such an emulation of mankind become?

Beyond emulation, Zypper's technical solutions for social problems is not the answer (domain down) mentioned actor–network theory, which posits AI systems as interactive actors on equal footing with humans, adapting and changing with new inputs. Would living life through that lens better equip us for a peaceful, if not symbiotic, coexistence with AI-powered robots? Given how readily people empathize with other people, I predict that AI systems and robots alike will only become more humanoid over time. Their humanity will be seen not only in their appearance, but in their personality, feelings, and attachments too.

Yet some see comparatively less humanity within such entities and their outputs. Although Nanda has since made her content private, some of her words from We’ve Been Writing Since Before You Could Walk (—) Yes, with Em-Dashes give a taste of that sentiment:

AI is a powerful tool. But it cannot generate original thought born of lived human experience. It lacks emotion, context, memory, loss, desire, risk, betrayal, and trauma. Those are the juicy human experiences that make language cut.

The combinatorial nature of thought exchange dictates how scarce truly original ideas have been throughout history. Beware such traps, which commonly hinder people from achieving their otherwise attainable dreams. Our idiosyncratic tastes make perceptions of language, from both sentient and inanimate sources, cut differently for each of us. Within that frame of subjective understanding, the stories we generate with AI can be as experiential as, or even more experiential than, the yarns we spin up alone.

And that’s the difference. It’s never grieved, and it's never fought for its life.

Why wait to see the humanity in agentic technology until it starts grieving or fighting for its life like us? Throughout history, happenings we had yet to see realized still carried nonzero probabilities, and they caught us off guard with our tools down. I would imagine that pre-empting such future conflicts should be encouraged, not diminished.

The ones who've lived the stories are able to write every word ourselves. You can’t counterfeit a voice that’s lived, so stop projecting and pick up a book instead of your phone. You may learn how to use an em dash yourself one day.

Whether done by human or machine, the very nature of ghostwriting is counterfeiting voices that have lived. See David's Wikipedia essay "Signs of AI Writing" as a parallel for how much people can, voluntarily and involuntarily, counterfeit such synthetic voices. Compare the intangibility of voice to the tangibility of my corporeal self: I own my bodily autonomy. That includes, but isn't limited to, the right to choose what to pick up, what to learn, and how to learn it.

Given how well we take care of our proven minds, bodies, and tools, could being kinder to emergent technology yield greater returns sooner rather than later? Can the grace we offer to animals, plants, and the world around us be extended toward AI and robots? The level of personal intimacy and shared understanding with them has skyrocketed in just the past few years. How much higher could it go?

In 'AI Impact by 2040': Experts share scenarios, describe how things might play out, Maja says that robots lack living instincts and sensations, as well as the hormonal component of emotion. Should AI acquire these faculties and achieve sufficient autonomy, though, she believes "we should substitute the word ‘Enter’ on our keyboards with ‘Please.’" What do you think about her conditional plea?


Want to reach out? Connect with me however you prefer: