Step 1: Core Ontology
We define the fundamental elements:
Neural Thespians (T) — The AI-driven or human-interpreted characters/entities that act and evolve in stories.
Nodes (N) — Discrete events or story moments.
Channels (C) — Mediums of expression (visual, auditory, textual, hybrid).
States (S) — Emotional, cognitive, or existential conditions of thespians or nodes.
Rules (R) — Constraints that define causal logic, consistency, and world dynamics.
Latent Spaces (L) — Probabilistic universe of possibilities: potential story continuations.
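As a rough illustration (the class and field names below are my own, not part of the formalism), the ontology can be sketched as plain data structures:

```python
from dataclasses import dataclass, field
from enum import Enum


class Channel(Enum):
    """Mediums of expression (C)."""
    VISUAL = "visual"
    AUDITORY = "auditory"
    TEXTUAL = "textual"
    HYBRID = "hybrid"


@dataclass
class Thespian:
    """A Neural Thespian (T) with an emotional/cognitive state vector (S)."""
    name: str
    state: list[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])


@dataclass
class Rule:
    """A constraint (R), here reduced to simple bounds on each state dimension."""
    lower: float = -1.0
    upper: float = 1.0
```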
Step 2: Story Graph as a Directed Network
Let the story be a directed weighted graph:
G = (N, E, W)
Where:
N is the set of Nodes
E are edges, representing causality or thematic connections
W are weights, representing:
- Emotional intensity
- Narrative significance
- Probability of transition
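A minimal adjacency-map sketch of G = (N, E, W); the node ids and weight-key names are illustrative shorthand, not part of the definition:

```python
# Directed weighted story graph G = (N, E, W).
# Each edge carries a weight dict with the three components listed above.
story_graph = {
    "nodes": {"n1": {}, "n2": {}, "n3": {}},
    "edges": {
        ("n1", "n2"): {"emotion": 0.8, "significance": 0.6, "p_transition": 0.7},
        ("n1", "n3"): {"emotion": 0.3, "significance": 0.4, "p_transition": 0.3},
    },
}


def successors(graph, node_id):
    """All nodes reachable from node_id in one causal/thematic step."""
    return [dst for (src, dst) in graph["edges"] if src == node_id]
```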
Node Definition
Each node is a tuple:
N = (T, S, R, Σ)
Where:
T are the thespians present
S is the vector of states for thespians and nodes
R are the rules active at this node
Σ is the symbolic signature (emotional + thematic + conceptual encoding)
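The node tuple N = (T, S, R, Σ) could be represented like this (field names and the string-keyed Σ encoding are hypothetical):

```python
from dataclasses import dataclass


@dataclass
class StoryNode:
    """A node tuple (T, S, R, Sigma)."""
    thespians: tuple[str, ...]       # T: thespians present
    states: dict[str, list[float]]   # S: state vector per thespian
    rules: tuple[str, ...]           # R: rules active at this node
    sigma: dict[str, float]          # Sigma: symbolic signature


n = StoryNode(
    thespians=("ada",),
    states={"ada": [0.5, 0.1, 0.9]},
    rules=("no-time-travel",),
    sigma={"emotion:curiosity": 0.9, "theme:knowledge": 0.7},
)
```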
Step 3: State Evolution Equations
Each thespian’s state evolves according to:
S_t^{(k)} = f(S_{t-1}^{(k)}, \Sigma_{j \in \text{neighbors}(k)} W_{kj} \cdot \Delta S_j, R_t)
Where:
S_t^{(k)} — state of thespian k at time t
W_{kj} — influence weight of neighboring node/thespian j
\Delta S_j — change in state at neighbor j
f — evolution function (can be linear, a neural network, or probabilistic)
R_t — active rules constraining state change
This allows emergent emotional and narrative dynamics.
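A linear instance of the evolution equation, with the rules R_t encoded as simple per-dimension clamping (a sketch; both the choice of f and the rule encoding are my own assumptions):

```python
def evolve_state(prev_state, neighbor_deltas, weights, bounds=(-1.0, 1.0)):
    """S_t^{(k)} = f(S_{t-1}^{(k)}, sum_j W_kj * dS_j, R_t).

    prev_state      -- S_{t-1}^{(k)}, the thespian's previous state vector
    neighbor_deltas -- {j: dS_j}, state changes at neighboring nodes/thespians
    weights         -- {j: W_kj}, influence weight of each neighbor
    bounds          -- R_t reduced to per-dimension clamping
    """
    lo, hi = bounds
    dim = len(prev_state)
    # Weighted sum of neighbor influences, per state dimension.
    influence = [
        sum(weights[j] * neighbor_deltas[j][d] for j in neighbor_deltas)
        for d in range(dim)
    ]
    # f: linear update, then the active rules clamp the result.
    return [min(hi, max(lo, s + i)) for s, i in zip(prev_state, influence)]
```

Swapping the linear update for a learned network or a stochastic transition keeps the same interface.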
Step 4: Node Generation Function
New nodes are generated dynamically:
n_{t+1} = g({n_1, \dots, n_t}, {T_1, \dots, T_m}, S_t, R_t, L)
Where g is a node synthesis function that explores the latent space L to produce plausible story events given the current context and rules.
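One way to read g: filter candidate events from the latent space L by the active rules, score the survivors against the current symbolic context, and sample one. The dot-product scoring below is an assumption for illustration only:

```python
import random


def generate_node(history, latent_space, context_sigma, rules, rng=random):
    """n_{t+1} = g(history, thespians, S_t, R_t, L): pick a plausible next event.

    latent_space  -- list of candidate events, each with a symbolic signature
    context_sigma -- current symbolic state, e.g. {"curiosity": 0.9}
    rules         -- predicates; a candidate must satisfy all of them
    (history is unused in this toy sketch but kept to mirror the signature of g)
    """
    candidates = [e for e in latent_space if all(r(e) for r in rules)]
    if not candidates:
        return None
    # Score by overlap between each event's signature and the context.
    scores = [
        sum(e["sigma"].get(k, 0.0) * v for k, v in context_sigma.items())
        for e in candidates
    ]
    # Sample proportionally to score (a greedy argmax would also be a valid g).
    return rng.choices(candidates, weights=scores, k=1)[0]
```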
Step 5: Graph Operations (Synthia Grammar)
Merge: Combine two nodes
Split: Branch story based on thespian decision probabilities
Transform: Modify node attributes using a rule
Overlay: Superimpose latent symbolic layers on a node
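Merge and Split, as operations on the adjacency-map representation (a sketch: node payloads are plain dicts, and the merge policy, union of attributes with self-loops dropped, is my own assumption):

```python
def merge_nodes(graph, a, b, merged_id):
    """Merge: combine nodes a and b into one node, rewiring their edges."""
    nodes, edges = graph["nodes"], graph["edges"]
    nodes[merged_id] = {**nodes.pop(a), **nodes.pop(b)}

    def remap(x):
        return merged_id if x in (a, b) else x

    graph["edges"] = {
        (remap(s), remap(d)): w
        for (s, d), w in edges.items()
        if remap(s) != remap(d)  # drop self-loops created by the merge
    }
    return graph


def split_node(graph, a, branch_ids, probabilities):
    """Split: branch the story at node a into alternatives with given probabilities."""
    for bid, p in zip(branch_ids, probabilities):
        graph["nodes"][bid] = dict(graph["nodes"][a])
        graph["edges"][(a, bid)] = {"p_transition": p}
    return graph
```

Transform and Overlay follow the same pattern: a function from one node payload to another.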
Step 6: Symbolic Encoding
Each node and thespian carries a vector of symbolic weights:
\sigma_i = (\text{emotion}, \text{theme}, \text{concept})
Emotion: Vector of intensity values (joy, fear, curiosity…)
Theme: Categorical vector (freedom, knowledge, conflict…)
Concept: Abstract idea encoding (innovation, entropy, morality…)
Interactions are composable:
\sigma_{\text{result}} = h(\sigma_1, \sigma_2, \dots, \sigma_k)
Where h is a symbolic algebra (addition, weighted combination, or tensor interaction).
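The simplest instance of h is a weighted combination of the signature vectors (the dict keys here are illustrative):

```python
def compose_sigma(signatures, weights=None):
    """sigma_result = h(sigma_1, ..., sigma_k): weighted sum over shared keys."""
    if weights is None:
        weights = [1.0 / len(signatures)] * len(signatures)  # plain average
    result = {}
    for sigma, w in zip(signatures, weights):
        for key, value in sigma.items():
            result[key] = result.get(key, 0.0) + w * value
    return result
```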
Step 7: Emergent Narrative Quality Function
Define a global narrative quality function Q(G):
Q(G) = \alpha \cdot E(G) + \beta \cdot C(G) + \gamma \cdot S(G)
Where:
E(G) — emotional impact aggregated over nodes and thespians
C(G) — causal coherence of the story graph
S(G) — symbolic significance (how well themes and concepts interact)
\alpha, \beta, \gamma — user-defined weights for priorities
Goal: Maximize Q(G) while respecting the constraints R.
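With placeholder scorers for E, C and S (each deliberately naive; the definition above only fixes the weighted form, not the components), Q(G) could be computed as:

```python
def narrative_quality(graph, alpha=1.0, beta=1.0, gamma=1.0):
    """Q(G) = alpha*E(G) + beta*C(G) + gamma*S(G).

    Stand-in component scorers (assumptions, not the canonical definitions):
    E -- mean emotional intensity over edges
    C -- fraction of nodes touched by at least one edge
    S -- mean total symbolic weight stored on nodes
    """
    edges, nodes = graph["edges"], graph["nodes"]
    e_score = (
        sum(w.get("emotion", 0.0) for w in edges.values()) / len(edges)
        if edges else 0.0
    )
    connected = {n for pair in edges for n in pair}
    c_score = len(connected & set(nodes)) / len(nodes) if nodes else 0.0
    s_score = (
        sum(sum(n.get("sigma", {}).values()) for n in nodes.values()) / len(nodes)
        if nodes else 0.0
    )
    return alpha * e_score + beta * c_score + gamma * s_score
```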
Step 8: AI Integration
Thespian Simulation: AI predicts state evolution and selects actions.
Node Generation: AI explores latent space to create new nodes.
Graph Optimization: AI computes Q(G) and proposes modifications to increase narrative impact.
Feedback Loop: Human input adjusts the weights \alpha, \beta, \gamma and modifies the rules R.
Step 9: Modular Templates
Hero Journey Module: Graph templates with archetypes and symbolic weights
Conflict Module: Predefined conflict motifs and branching patterns
World Module: Environmental, societal, and fantastical world layers
Emotion Module: Tracks and evolves symbolic emotional vectors for each thespian