Thinking through 7,917 Hours Ago as a digital half-life
Conversations with Claude, changing this piece into something different: a digital half-life.
Ann Morgan
Using the instructions and work we’ve already done as data for this project (loaded), I’d like to continue to a new Elapsed Turbulence based on the painting “7,917 Hours Ago” from its beginning to its final state, in an algorithmic way. I want it to flow like version 3, but not dissipate into black so easily and totally, and to take into account all of the uploaded states and layers of the painting as it progressed in real life, processing real emotion and incorporating time.
This AI piece is meant to live on as a digital half-life of the painting: how it exists in the virtual and AI space, and what a painting filled with emotion and time reaching back to an undisclosed life-altering rupture — which is where the title “7,917 Hours Ago” comes from — looks like as math, as a function, and as something purely in code that creates the visual piece. This version will live in my online gallery and as a prototype, but it will continue to iterate into something displayed perhaps on a screen, or projected onto a wall or into space. I want it to never stop changing; it can move back and forth in code between end state and first sketch.
Claude
(Images: Layer 01 — first sketch on raw canvas, pale teal and grey on near-white linen ground, the moon as a simple arc; Layer 05 — dark masses beginning to arrive; Layer 10 — red at full intensity, yellow erupting at the horizon; Layer 14 — the final painting in its complete state, all color fields resolved.)
14 painting states mapped as campaigns — each a named moment in the painting’s real history, with palette weights pulled from what those layers actually looked like. The system oscillates between final painting and early ghost state, never stopping. Persistent luminosity: the canvas maintains color accumulation and never collapses to black. Early campaigns pull the pale teal, grey, and rose underpainting palette; later ones build toward cadmium red, navy, and cerulean. The seed drives a slow pendulum — forward through all 14 states, then backward, then forward, continuously.
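The pendulum described above can be sketched as a triangle wave over the 14 states. This is a minimal illustration, not the piece's actual code: the 600-second period and the function names are assumptions.

```typescript
// Hypothetical sketch of the slow pendulum: forward through all 14
// painting states, then backward, then forward, continuously.
// stateCount and the 600 s period are assumed values.

/** Triangle wave in [0, 1]: rises for half the period, falls for the other half. */
function pendulum(t: number, period: number): number {
  const phase = (t % period) / period; // 0..1 across one full cycle
  return phase < 0.5 ? phase * 2 : 2 - phase * 2;
}

/** Map elapsed seconds to a fractional state index in [0, stateCount - 1]. */
function stateAt(t: number, stateCount = 14, period = 600): number {
  return pendulum(t, period) * (stateCount - 1);
}
```

A fractional index (e.g. 6.5) lets the renderer crossfade palette weights between two adjacent campaigns instead of snapping from state to state.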
The reverse pass uses a softer brush class that surfaces earlier palette colors at lower opacity, so going backward feels like memory rather than erasure. Three mechanisms prevent the canvas from dying to black: a luminosity floor, a lift pass every 40 frames, and a warm white glaze at each full cycle reversal.
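The three anti-black mechanisms can be sketched as small per-pixel operations on a normalized luminosity value. All constants here (floor value, lift amount, glaze strength) are illustrative assumptions; only the 40-frame interval comes from the description above.

```typescript
// Sketch of the three mechanisms that keep the canvas from dying to black.
// LUMINOSITY_FLOOR, the lift amount, and the glaze strength are assumed.

const LUMINOSITY_FLOOR = 0.08; // canvas never darkens below this
const LIFT_INTERVAL = 40;      // frames between lift passes (from the piece)

/** Clamp a pixel's luminosity so the field never collapses to black. */
function applyFloor(lum: number): number {
  return Math.max(lum, LUMINOSITY_FLOOR);
}

/** Every 40th frame, lift all values slightly toward white. */
function liftPass(lum: number, frame: number, amount = 0.03): number {
  return frame % LIFT_INTERVAL === 0 ? lum + (1 - lum) * amount : lum;
}

/** At each full cycle reversal, blend toward a warm white glaze. */
function warmGlaze(lum: number, strength = 0.15): number {
  return lum + (1 - lum) * strength;
}
```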
Ann Morgan
Can I have a version that has no controls, is not split into three, maintains the color fields, and begins by resembling the first layer and sketches more faithfully? Left-to-right clarity matching the actual images, but not divided into a triptych for the AI digital half-life.
Claude
(Images: All 14 layer photographs studied in sequence — from the raw linen ghost of Layer 01 through the lavender wash, the arrival of dark masses, the red eruption, the copper rain, the violet skeleton, to the resolved final state. The left-to-right spatial logic mapped: vortex moon upper-left, cloud mass sweeping diagonally, red field lower sweep, cerulean and teal in the far right.)
The triptych structure is gone. The painting’s left-to-right logic reads as one continuous surface — the panel seams were an installation convention, and this is the digital half-life, which has no stretcher bars. Six spatial zones mapped directly from the photographs: upper-left is always the vortex moon zone, lower sweep always the red field, far right always teal and cerulean, upper right always the black cloud mass and copper rain.
Layer 01 starts as it actually was — raw linen ground, near-white, minimal pale teal and grey-ghost marks at very low opacity. The first brush went lightly on that surface. Flow reads the painting’s actual composition vector: the diagonal spine coded as a force, the red zone with lateral sweep and turbulent vertical streaks, the copper rain zone pulling marks downward, the teal corner curling inward. The reverse pass is remembering, not unpainting.
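The zoned flow field could be sketched as a function from normalized screen coordinates to a direction vector. The zone boundaries and vector values below are assumptions chosen to match the described composition, not measurements from the piece.

```typescript
// Hypothetical flow field: each screen-space zone returns a direction
// matching the painting's left-to-right logic. Boundaries are assumed.

type Vec = { x: number; y: number };

/** x, y are normalized screen coordinates in [0, 1]. */
function flowAt(x: number, y: number): Vec {
  if (x < 0.35 && y < 0.4) {
    // Vortex moon, upper-left: a tangent field curling around the zone center.
    const dx = x - 0.18, dy = y - 0.2;
    return { x: -dy, y: dx };
  }
  if (x > 0.7) return { x: -0.3, y: -0.5 };        // teal corner curling inward
  if (y > 0.6) return { x: 1, y: 0.15 };           // red field: lateral sweep
  if (x > 0.5 && y < 0.4) return { x: 0.1, y: 1 }; // copper rain pulling marks down
  return { x: 0.7, y: 0.7 };                       // diagonal spine as default force
}
```

Turbulent vertical streaks in the red zone would be layered on top of this base field, e.g. with a noise function perturbing the y component.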
Ann Morgan
More organic, layered line work, like what appears in the first layers, and I want this to fill the entire window rather than being framed to the proportions of the painting. While a painting is confined to its canvas, the AI/algorithmic digital half-life of that painting does not have to be; it only needs to be faithful to the left-to-right imagery. This is 3D, in space, living a life based on the emotion and visuals in the painting itself, but as an algorithm and math that thinks for itself digitally, based on the time elapsed visually in the oil painting.
Claude
(Images: Layer 01 and Layer 02 studied closely for their line vocabulary — long sinuous drips falling vertically in the lower left, the arc of the moon circle, ghost teal trailing marks. Layer 12 for the late-state density, violet diagonal lines cutting through everything. Layer 14 for the final full pressure of the marks.)
Full window, no aspect ratio. 100vw × 100vh, fullscreen from first frame. The algorithm lives in screen-space and doesn’t know it came from a canvas. It fills whatever surface it’s on.
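Screen-space sizing can be sketched with a small helper that converts the viewport to backing-store pixels. The devicePixelRatio handling is an assumption; the piece may draw in CSS pixels directly.

```typescript
// Sketch of fullscreen sizing: the canvas backing store is resized to the
// full viewport (100vw x 100vh) in device pixels.

/** Backing-store size for a fullscreen canvas at the given pixel ratio. */
function canvasSize(vw: number, vh: number, dpr: number): { w: number; h: number } {
  return { w: Math.round(vw * dpr), h: Math.round(vh * dpr) };
}

// In the browser this would be wired to a resize listener, e.g.:
//   const { w, h } = canvasSize(window.innerWidth, window.innerHeight,
//                               window.devicePixelRatio);
//   canvas.width = w; canvas.height = h;
```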
Each agent now carries its own velocity vector and steers toward the flow field rather than snapping to it. Early states have a very low steer rate — agents turn slowly, producing the long sinuous arcs and trailing lines from Layers 01 and 02. Drip agents fall nearly straight down with a slow wobble: that is the teal vertical staining in the lower left of the early layers. Arc agents have a constant gentle turn bias that produces the sweeping curves. Each stroke has a ramp-in, a sustained middle, and a ramp-out, so the mark fades as the brush drags out, exactly like a loaded brush running dry. The canvas starts luminous: Layer 01’s background is near raw linen, and marks in early states reach only a fraction of full opacity. Going backward feels like memory surfacing rather than active painting.
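The steering and the stroke envelope can be sketched as two small functions. The agent shape, the steer-rate blending, and the 20% ramp fraction are assumptions for illustration.

```typescript
// Sketch of one agent step and the stroke's opacity envelope.
// Field values (fx, fy) would come from the zoned flow field; the blend
// formula and ramp fraction are assumed.

interface Agent { x: number; y: number; vx: number; vy: number; age: number; life: number }

/** Blend velocity toward the field direction; a low steerRate yields
 *  the long sinuous arcs of the early states. */
function steer(a: Agent, fx: number, fy: number, steerRate: number): void {
  a.vx += (fx - a.vx) * steerRate;
  a.vy += (fy - a.vy) * steerRate;
  a.x += a.vx;
  a.y += a.vy;
  a.age += 1;
}

/** Opacity over the stroke's life: ramp in, sustain, ramp out —
 *  like a loaded brush running dry. */
function strokeOpacity(age: number, life: number, ramp = 0.2): number {
  const t = age / life;
  if (t < ramp) return t / ramp;
  if (t > 1 - ramp) return (1 - t) / ramp;
  return 1;
}
```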
Ann Morgan
Can you make a version with sound, based on this last version? The sound is algorithmic based on what was produced from the paintings, like the example I posted.
Claude
(Images: Layer 01 — near silence, ghost ground, the painting before it has weight; Layer 09 — red at full intensity, the state driving the most low-pressure noise in the audio system.)
A linen-colored gate screen appears first — title and a begin button — because browsers require a user gesture before audio starts. Click it and both systems open together.
Everything generated. No samples, no loops, no music. Twelve drone oscillators tuned to E minor — E2, F#3, A3, B3, D4, E4, F#4, A4, B4, E5, A5 — with a sub-bass E2 at 82 Hz that’s felt more than heard. Each oscillator has its own slow LFO at a slightly different rate so they breathe against each other, never in sync. The dissonance is intentional — F# against E, D against the root — the same unresolved tension that’s in the painting. There is no resolution because 7,917 hours ago is not resolved.
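The tuning of the eleven drone notes above can be derived with standard equal temperament (A4 = 440 Hz); the LFO-rate offsets shown are assumptions, chosen only so no two oscillators breathe in sync.

```typescript
// Sketch of the drone tuning: note names to frequencies in 12-tone
// equal temperament. The LFO rates are assumed illustrative values.

const NOTE_INDEX: Record<string, number> = {
  C: 0, "C#": 1, D: 2, "D#": 3, E: 4, F: 5, "F#": 6,
  G: 7, "G#": 8, A: 9, "A#": 10, B: 11,
};

/** Equal-temperament frequency from a note name like "F#3" (A4 = 440 Hz). */
function noteToFreq(note: string): number {
  const name = note.slice(0, -1);
  const octave = Number(note.slice(-1));
  const midi = 12 * (octave + 1) + NOTE_INDEX[name];
  return 440 * Math.pow(2, (midi - 69) / 12);
}

const droneNotes = ["E2", "F#3", "A3", "B3", "D4", "E4", "F#4", "A4", "B4", "E5", "A5"];
const droneFreqs = droneNotes.map(noteToFreq); // E2 lands near 82 Hz, the felt sub-bass

// Each oscillator's LFO runs at a slightly different rate so they never align.
const lfoRates = droneFreqs.map((_, i) => 0.05 + i * 0.013); // Hz, assumed
```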
Three noise sources: teal and ocean (bandpass ~380 Hz), red and pressure (bandpass ~140 Hz), cloud and breath (highpass ~1800 Hz). Their levels respond to how many agents are active in each zone on that frame. The filter opens across the 14 states — Layer 01 is heard through a dark 400 Hz lowpass, muffled and distant, like a canvas before paint. By Layer 14 it’s fully open. On the reverse pass the filter closes down and reverb increases. Going backward sounds like the painting remembering itself from inside the paint.
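The state-to-filter mapping can be sketched as an exponential sweep from the dark 400 Hz lowpass of Layer 01 toward a fully open cutoff, with noise-source gains driven by agent counts. The open-cutoff value and the exponential curve shape are assumptions.

```typescript
// Sketch: the pendulum's fractional state drives the master lowpass,
// closed at 400 Hz for Layer 01 (from the piece), opening toward an
// assumed 12 kHz by Layer 14, and closing again on the reverse pass.

/** state in [0, 13]; returns a lowpass cutoff in Hz. */
function cutoffFor(state: number, closedHz = 400, openHz = 12000): number {
  const t = state / 13; // 0 at Layer 01, 1 at Layer 14
  return closedHz * Math.pow(openHz / closedHz, t); // exponential sweep
}

/** A noise source's gain from the fraction of agents active in its zone. */
function zoneGain(agentsInZone: number, totalAgents: number): number {
  return totalAgents === 0 ? 0 : agentsInZone / totalAgents;
}
```

The exponential (rather than linear) sweep is a design assumption: perceived brightness tracks the logarithm of frequency, so it makes the opening feel even across the 14 states.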


