It seems more and more of our lives are being lived in tandem with machines. Whether it’s the automated bots that send us appointment reminders, Alexa reporting the weather, or Siri sending directions to our car dashboards, machines are becoming ever more integral.
It is true in our creative lives as well. Writing on Substack, I have turned to ChatGPT for help in tightening up posts, finding information, and tracking down apt quotes.
I always double-check for accuracy, of course, because . . . well . . . technology. I also interrogate its assumptions.
Though some of these tools can prove helpful, and some of their insights, proclamations and conclusions seem downright uncanny (or, to use a German word I have just learned, made famous by Freud, unheimlich), the last word is always mine.
I have turned to the machine from time to time, not to write for me, but to serve as a digital assistant and sometime sounding board and mind-unscrambler.
In a recent conversation with ChatGPT, I asked for help in crafting an AI policy—a covenant, if you will—to uphold in my writing.
“Imagination is more important than knowledge. For knowledge is limited, whereas imagination encircles the world.”
— Albert Einstein
In this work of Releasing Memory, I aim to give voice to the silenced, reweave the broken into a new whole, no matter how imperfect, and reimagine the future—story by story by story.
The tales we tell interrogate how time shapes our lives. How the past ripples into the present. In my writing, I look for clues in unexpected places, tapping into the imagination and the imaginal.
At Releasing Memory, imagination is the compass.
While I may use AI as a tool, it is imagination—rooted in memory, emotion, history, and ancestral knowing—that guides the journey.
All content originates from my own creative voice, with AI serving only to assist and enrich, not replace, the human soul behind the story.
We are at the very dawn of human-machine interactions; I see this as an evolutionary process. It is my belief we humans should safeguard our very human efforts to do the best we can in this game we call life.
I see AI as an assist—not to write or create for me (it would hardly feel ethical, therapeutic or even sane to outsource my creativity), but as a digital deputy, and a sometime sounding board and mind-unscrambler.
I posit AI should serve to enhance our humanity—and not the other way around.
I envision this Storykeeper’s Code evolving as a living map protecting and preserving the space between memory and machine.
Here is how it’s unfolding so far in 2025:
1. Memory as a Human Capacity
I write to remember. To reclaim stories lost to silence, shame, erasure, or exile.
To summon the voices of grandmothers whose traumas became inheritance, and whose day-to-day joys we never thought to ask about.
To honor the thread that binds one breath to the next across generations, to help shed light on the past—to enlighten and enliven our future.
Memory is not data. It is devotion.
And I insist that its sacred charge expand towards wisdom, and not be flattened into information.
2. Reserving the Right to Shape—and Own—My Story
The words, questions, testimony, and truths I create are born from body, blood, lineage, and lived experience.
They are not just content—they are covenant.
I do not consent to my creative work being scraped, mined, weaponized, resold, or embedded in algorithms that erase authorship and intention.
I do not believe in “open access”, especially when that means open season on the vulnerable.
My stories are not raw material for training machines.
They are offerings for human connection.
The final word rests with me. But even more, and in the spirit of community, I invite any reader so inspired to connect, engage, and reflect back their own memories, feelings and experiences. In their own voices.
Interacting in community, with respect and reverence, feels like the most human and, I might add, the most rewarding part of this journey into release.
“AI should serve to enhance our humanity—and not the other way around.”
— Releasing Memory
3. Crafting Stories with Care and Intention
In a world that rewards speed, replication, and infinite generation, I choose slowness.
I choose careful crafting.
I choose not to forget.
Technologies that consume without remembering—and that replicate without responsibility—do not serve remembrance.
They serve forgetting. And sometimes, erasure.
I call out the false promise of “scale” as progress.
We cannot heal what we erase in the name of “optimization”.
4. Insisting on Ethical Memory Practices
It is my belief that:
Ancestral stories must be told with consent and care.
Trauma narratives must not be sensationalized, collapsed, or commodified.
AI systems must be transparent about where their knowledge comes from—what is included and, importantly, who has been excluded.
Creators have the right to control, retract, and protect their work.
I support data dignity, authorial sovereignty, and community-rooted storytelling.
4-a. Calling Out the Bias in the Machine
You may have heard the axiom, “Children are born as blank slates.” This is an old theory that implies parents and the environment in which children are raised can shape them at will, like an artist shapes a lump of clay.
The science of genetics, of epigenetics, has advanced our knowledge significantly. We know children come into this world shaped in many ways—genetics, health care, nutrition, war and famine all play a part.
This early shaping expands beyond the physical to the cultural. Children are also born into families, moral and spiritual traditions, economies, history, memory, culture.
Extending that axiom to artificial intelligence: AI does not start off as a blank slate created out of nothing. Its programming mirrors back vast amounts of information that it has been fed, “trained” on. That material comes from and through humans and their very real shaping. As such, the machine may overlook, omit, distort, and twist information.
AIs are trained on the archives of human language:
Histories that are not neutral. A canon that excludes as much as it preserves.
Records most often written by the victors, the dominant, the wealthy, the colonizers, and the patriarchy. This omits the voices of women, people of color, the other-abled, and the marginalized.
When AI replicates what it has learned, it risks repeating what we have not yet put into memory.
“Our engagement with AI must be guided by ethical human intention.”
I strongly believe it is our human responsibility to shape, question, polish and refine. To add to the canon. To tell new stories, and celebrate different voices.
We cannot accept verbatim, in the name of speed and scale, what the machine spits out at us.
I understand that:
Systems reflect the racial, gendered, and cultural bias of the information on which they are trained, even as they promise objectivity.
Marginalized voices are often underrepresented—or distorted—within the data.
Empathy, nuance, grief, and contradiction are flattened into extractable code.
Technology can automate erasure as easily as innovation.
I do not reject the machine, but I reject its unexamined assumptions. I believe in interrogating the source. I believe in asking: who benefits? who is harmed? who is missing?
And perhaps most of all: what might be forgotten?
I recognize that bias is not just a glitch in the model. It is a mirror of the world as we have made it, and as it has been fed into the machine.
I will not stop polishing it until I see us all more clearly.
5. Envisioning a Storied Future
One where:
AI supports human healing rather than replacing or eradicating the memory of past experiences, whether joyful, traumatic, loving, or hurtful.
Storytelling centers truth, not tropes. Memory, not memes.
Legacy is protected, not extracted.
Technology continues to exist to serve humanity, and not the other way around.
We are more than prompts.
We are not inputs.
We are never content to be content: we are memory-bearers, sense-makers, bridge-builders.
We are human, with all our strengths, frailties and imperfections. In all our feelings, aspirations and dreams. Witnesses inhabiting—and reflecting on—this one beautiful embodied moment.
How I Engage Technology
I share these personal commitments, and invite others to consider their own:
I turn off my AI chat histories when sharing sensitive or creative work.
I locally back up my original writing, poems, and ideas.
I demand ethical use and attribution from any AI, platform, or publisher that interacts with my creations.
I seek tools and communities aligned with these values.
Evolving Memory and Machine: Momentary Takeaways
As we stand at the edge of memory and machine, I choose to speak, not to be spoken for. To remember, not be remembered by proxy.
To release stories not as content, but as ritual. To make certain that what is sacred in me—in all of us—comes from an authentic, soulful place.
This is my remembering. I imagine it encircling the globe.