Emotional Engineering
Over the past five years, I've been journaling consistently in Notion. Initially, the convenience of digital journaling won out over the tactile appeal of traditional notebooks because I could write on my phone, anywhere. What began as a tool for organization evolved into something more. Instead of committing exclusively to deep, candlelit, spiritual journaling sessions, I adopted a simpler daily practice: mood check-ins, to-do lists, reflections on what was hurting, and notes on what was going well. Over time, that practice amassed a significant amount of personal data.
Recently, I began using ChatGPT as a key resource in developing my startup, Emerson Gray Consulting. In the process, I've posed thousands of questions to ChatGPT Pro, creating a growing database of my language patterns, aspirations, and reflections, in effect a mirror of my inner world distilled through my writing. That led me to an intriguing idea: using personal data to probe facets of our lives that our own proximity to them tends to obscure.
One experiment stood out. I used my GPT database as a kind of impromptu therapist while working through complex emotions in my relationship. I uploaded essays exploring my feelings, then designed a series of prompts that encouraged the model to go deeper. Each question built on the last, letting the model analyze my language, identify patterns, and offer reflections that felt like an external perspective on my internal experience.
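For readers curious about the mechanics, here is a minimal sketch of how such a prompt chain could be wired up with the OpenAI Python SDK. The model name, file layout, and prompt wording are illustrative assumptions, not the exact setup I used.

```python
# Minimal sketch of chaining reflective prompts over personal essays.
# Assumptions (not my exact setup): openai>=1.0 SDK, OPENAI_API_KEY set in the
# environment, plain-text essays in a local "essays" folder, gpt-4o as the model.
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

# Load the personal writing the model will reflect on.
essays = "\n\n".join(p.read_text() for p in sorted(Path("essays").glob("*.txt")))

# Each prompt builds on the model's previous answer, pushing one level deeper.
prompts = [
    "Summarize the recurring emotional themes in this writing.",
    "Which of those themes show up in how I describe my relationship, and how?",
    "What patterns in my language point to needs I haven't stated directly?",
]

history = [{
    "role": "system",
    "content": "You are a neutral, reflective listener. My journaling:\n\n" + essays,
}]

for prompt in prompts:
    history.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})  # keep the chain building
    print(answer, "\n" + "-" * 60)
```

The key design choice is that the full conversation history is resent on every turn, so each new reflection is grounded in both the original essays and the model's earlier answers.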
In many ways, this mirrors the essence of good therapy: the act of unburdening your thoughts and emotions onto a neutral listener who interprets and gently reflects them back. When we bypass the uncanny feeling of speaking to software, fascinating insights emerge—ones rooted in the quantitative analysis of emotions expressed through language.
Some might argue that awareness transcends language, which I can accept. However, I propose that intelligence, our ability to solve problems and navigate life, is fundamentally intertwined with language. The words we will ever speak are finite; each of us will articulate our thoughts only a limited number of times in a lifetime. If that's true, then analyzing these linguistic outputs offers profound potential to uncover patterns and insights that our conscious minds might overlook.
Imagine tracking our emotional trends and value systems through the patterns embedded in our language. These trends, often imperceptible because we live within them, could be illuminated by designing a system that authentically captures and organizes our expressions. The goal is not to humanize AI but to create a reflective mirror—a language-based representation of ourselves that can persist and provide rational, data-driven insights in ways we’ve traditionally thought impossible to quantify.
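As a rough illustration of what such a system could compute, here is a sketch that scores dated journal entries with NLTK's off-the-shelf VADER sentiment model and averages them by month. The file-naming convention and the monthly grouping are assumptions made for the example, not a claim about the right way to do it.

```python
# Sketch: turn dated journal entries into a monthly emotional trend line.
# Assumes plain-text entries named YYYY-MM-DD.txt in a local "journal" folder,
# and that nltk.download("vader_lexicon") has been run once beforehand.
from collections import defaultdict
from pathlib import Path
from statistics import mean

from nltk.sentiment import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()
by_month = defaultdict(list)

for path in sorted(Path("journal").glob("*.txt")):
    month = path.stem[:7]  # "2024-06" from "2024-06-01"
    score = sia.polarity_scores(path.read_text())["compound"]  # -1 (negative) .. +1 (positive)
    by_month[month].append(score)

# The drift that is imperceptible day to day becomes visible as a series.
for month, scores in sorted(by_month.items()):
    print(f"{month}  avg sentiment {mean(scores):+.2f}  ({len(scores)} entries)")
```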
I call this concept Emotional Engineering. By building a structured database of our thoughts, preferences, and patterns, we could unlock a deeper understanding of our experiences, fostering self-awareness and offering meaningful support in navigating the complexities of life. This is not just an abstract idea but a tangible method for bridging the gap between the emotional and the analytical, revealing a clearer picture of who we are and where we are going.
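To ground the idea, here is one possible shape for that structured database, sketched with Python's built-in sqlite3 module; the table and column names are my own assumptions about a reasonable layout, not a prescribed format.

```python
# Sketch of a structured store for daily check-ins, using only the standard library.
import sqlite3

conn = sqlite3.connect("emotional_engineering.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS entries (
    id          INTEGER PRIMARY KEY,
    written_at  TEXT NOT NULL,   -- ISO date of the check-in
    mood        INTEGER,         -- e.g. 1 (low) to 5 (high)
    text        TEXT NOT NULL    -- the reflection itself
);
CREATE TABLE IF NOT EXISTS themes (
    entry_id    INTEGER REFERENCES entries(id),
    label       TEXT NOT NULL    -- e.g. 'work', 'relationship', 'health'
);
""")

# One day's check-in, ready to be queried, tagged, and trended later.
conn.execute(
    "INSERT INTO entries (written_at, mood, text) VALUES (?, ?, ?)",
    ("2024-06-01", 4, "Slept well, anxious about the pitch, grateful for the walk."),
)
conn.commit()
```

Once entries accumulate in a form like this, the same questions I put to ChatGPT can be asked of the data directly.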