commonplace log II: sense and actuate
self-correcting date selectors, timeful backgrounds, ambient media curation, and a story about anthropologists discovering a hard drive
journals are singular in their ability to offer unmediated access to a person’s thought process, without abstraction or technical complexity or interruption by interface-speech
an anthropologist living in 2204 is digging up the remains of the material culture of the 2020s when she finds traces of a hard drive. Thanks to futuristic scanning equipment, her lab is able to reconstruct this hard drive down to the last atom, and therefore its files down to the last bit, but it can reconstruct the history of the lives told through these files only with uncertainty.
Hers is a story documented in the present by the earliest deployed version of commonplace, a digital journal that has high hopes for the journal medium in terms of how it can shape how we use and relate to digital systems. Hers is also a story written as a birthday gift to Finn, discussions with whom inspired this project in the first place. This is Week 2 of working on Commonplace.
Changelog
➕ Added support for inline media within Entry.
Media, such as images or videos or gifs or voice notes, can be manipulated within an Entry with the precision of a character. That is, inline media behave exactly like text in a text editor: you can select them, copy (Cmd+C) and paste (Cmd+V) them, cut them (Cmd+X), drag them in from another space, or even rotate them precisely in line.
Bringing an image from the “maybe its not so bad” podcast channel into your Entry.
Media behaving like text feels trivial enough to seem like a “nice to have”. But when missing, you don’t just notice it, you have to suffer it. There was a bug, for example, where you couldn’t delete inline media unless the selection wrapped the character before and after them. This was so finicky it took me out of the Zen this space had barely managed to create just with large text and a circling fish.
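To make that concrete: here is a minimal sketch, in Python with invented names, of the document model that “media with the precision of a character” implies - an Entry as a flat sequence where each item is either a character or an atomic media node, so selection and deletion need no special cases (and the wrap-the-selection bug above has nowhere to live).

```python
from dataclasses import dataclass

# Hypothetical sketch: an Entry is a flat sequence of tokens, where each
# token is either a single character or an atomic Media node. Because a
# media node occupies exactly one position, copy, cut and delete treat
# it identically to a character.

@dataclass(frozen=True)
class Media:
    url: str
    kind: str = "image"    # image | video | gif | voice
    rotation: float = 0.0  # inline rotation, in degrees

def delete_range(tokens, start, end):
    """Delete tokens[start:end] -- works identically for text and media."""
    return tokens[:start] + tokens[end:]

entry = list("a lake ") + [Media("https://example.com/lake.jpg")] + list(" at dusk")
entry = delete_range(entry, 2, 8)  # removes "lake " and the image in one stroke
```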
It was when inline media started working that I started to see this Journal as an elaboration on the journal as a textual medium, that its expressive bounds had not just changed but possibly expanded. And by “started to see”, I mean that future possibilities for how I could use it began to bubble up:
Scenario 1. I’m entering into it via keyboard input. I get tired of typing - maybe my fingers tire, or maybe voice is what I want to give it. I <tilt phone up to my mouth (inspiration)> and then <<speak>>, and the snippet of audio embeds itself inline.
Scenario 2. When I reflect on my hike, the System embeds pictures of the lake I meditated next to, right beside the words describing the feeling of being there.
Allowing you to set rotation is a small step in the direction of creating an expressive textual medium.
These ideas came up only when and as I started to use it in its current form: a simple way to embed media from are.na into your journals. Relating this to Eder’s diagram, the Design Realization produced new information and material for the Design Approach.
Designer’s Knowledge from Eder (1966)
As to the question, “how is this different from putting media above or below lines, doesn’t it do the same thing?”:
Images between lines and images within lines read differently. When you place media inline, it “reads” immediately after the word before it, and it feels like it flows from the stream of consciousness. This is, as the anthropologist notes, the goal of a journal - to provide unmediated access to your thoughts, just as they are. In this designer’s (maybe pedantic) view, images above and below read very differently: you break out of the saccade of tracing line after line, and have to adjust to the monuments of media in front of you.
📡 ☭ Added are.na browsing (and discovery) support
Thanks to their open API (as it should be), you can open a Cell, type in any are.na channel name, and have it load in as physical cards that you can swipe or scrub through, and insert into your Entry.
Loading channels side by side
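As a sketch of what loading a channel involves (the v2 endpoint shape comes from are.na’s open API; the slug normalization here is my simplification - real slugs come from the site):

```python
import json
import urllib.request

# Sketch of fetching an are.na channel's contents through the open API.
# API_BASE and the endpoint path follow are.na's public v2 API; the
# slugification below is a naive stand-in for illustration.

API_BASE = "https://api.are.na/v2"

def channel_url(name: str, per: int = 24) -> str:
    """Build the request URL for a channel, given a typed-in name."""
    slug = name.strip().lower().replace(" ", "-")
    return f"{API_BASE}/channels/{slug}?per={per}"

def load_channel(name: str) -> list:
    """Fetch a channel and return its blocks as a list of dicts."""
    with urllib.request.urlopen(channel_url(name)) as resp:
        return json.load(resp).get("contents", [])
```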
Early on, what I enjoyed doing was opening multiple Cells next to an Entry, letting them serve as a kind of “ambient” inspiration. In a moment of pause, I’d go to one, type in a few characters, pick a fun-sounding channel, and browse; I discovered new ways to depict communism and sci-fi typefaces this way.
Hammers and Sickles next to Typefaces
While none of these channels made their way into the Journal Entry, the exercise did give me an idea about “ambient discovery”, where the system would very gently suggest media as you type.
Ambient Discovery / Poetic Search
The prototype for “ambient discovery” or “poetic search” was quite simple: give a list of 5,000 channel names to the System (in their fresh-faced persona of a memoryless GPT-4o) with a selection of Entry text, and ask it to return the channel that most likely “connects with” the text, specifically asking for the connections to be fuzzy, metaphorical and/or poetic.
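The prompt itself was roughly this shape - what follows is an illustrative reconstruction of how such a prompt could be assembled, not the exact wording or call I used:

```python
# Sketch of the "poetic search" prompt. Only the shape of the idea is
# real (channel names + entry text in, one channel name out); the
# wording is invented. Each call would go to a memoryless model, so the
# full catalog and entry travel with every request.

def build_poetic_prompt(channel_names: list[str], entry_text: str) -> str:
    catalog = "\n".join(channel_names)
    return (
        "Below is a list of are.na channel names, then a journal entry.\n"
        "Return the single channel name that most connects with the entry.\n"
        "Prefer fuzzy, metaphorical, or poetic connections over literal ones.\n\n"
        f"Channels:\n{catalog}\n\nEntry:\n{entry_text}\n\nChannel:"
    )
```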
While there were a few misses, there was an incredible moment where I was writing about keeping in touch with Pablo by sending pictures of our sunsets every day, and a stranger’s channel of sunsets was surfaced. It was as if all three of us, separated by time and space, were resonating with one another, sharing the experience of sunsets through these images and text.
Left: System surfaces a channel of images of the sunset, next to my writing about documenting sunsets as forming a shared social experience. Right: I receive a channel of chairs when writing about chairs.
Now, a Deck that streams in media every other keystroke, as Pablo notes, is distracting to the flow of writing. But I now have evidence from the delight and humor I experienced here that this is a worthwhile interaction design problem to solve.
🌅🌄🌌 Added Timeful Ambience
As the time of the selected entry changes, the background shifts to reflect the sky.
It currently uses one of 6 Japanese gradients designed by Nuevo.Tokyo, one of my favorite design studios, to suggest this, but I imagine a future iteration generating the background programmatically from sensor data, one day feeling like an “extension” of your surroundings. Maybe it will sample sound, temperature, wind, humidity and sunlight as parameters ...
Sky timelapses with changing Entries.
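In pseudocode terms, today’s version is just a lookup from the Entry’s hour to one of six gradients - the names and hour boundaries below are placeholders of mine, not Nuevo.Tokyo’s actual palette:

```python
# Sketch of timeful ambience: pick one of six gradients by the hour of
# the selected Entry. Names and boundaries are invented placeholders.

GRADIENTS = [
    (5,  "dawn"),     # 05:00-07:59
    (8,  "morning"),  # 08:00-11:59
    (12, "midday"),   # 12:00-16:59
    (17, "dusk"),     # 17:00-19:59
    (20, "evening"),  # 20:00-22:59
    (23, "night"),    # 23:00-04:59 (wraps past midnight)
]

def gradient_for_hour(hour: int) -> str:
    """Return the gradient whose window contains this hour (0-23)."""
    name = "night"  # default covers the small hours before dawn
    for start, g in GRADIENTS:
        if hour >= start:
            name = g
    return name
```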
🧾 Improved the Date Picker to function as a set of rotating time dials.
The Year, Month and Day portions of an Entry’s Date are each given a Dial that can be scrolled, dragged or flicked. If a dial comes to rest at a value without a corresponding Entry, the dials auto-rotate to the nearest value that has one.
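The snapping logic itself is simple enough to sketch (dates simplified to integers here, and the names are mine, not the app’s):

```python
# Sketch of the dial's "self-correction": when a dial rests on a value
# with no Entry, snap to the nearest value that has one. Values are
# simplified to integers (e.g. day-of-month) for illustration.

def snap_to_nearest(value: int, entry_values: list[int]) -> int:
    """Return the entry value closest to where the dial came to rest.

    Ties break toward the earlier value, so the result is deterministic.
    """
    return min(entry_values, key=lambda v: (abs(v - value), v))

snap_to_nearest(14, [3, 9, 21])  # the dial auto-rotates to 9
```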
While otherwise similar in visuals and interaction to iOS’s picker, the Date Picker’s auto-rotation had a very particular effect on me - it suggested “self-correction”, that “it” “knew” to correct “itself” to the “right position”. It’s the first of what I hope will be many signals of a Sentient System co-inhabiting a digital space with you. By Sentient System, here, we mean both the fish as beings and the place itself, which will (eventually) speak as a kind of Narrator.
A draft where circles were extra jiggly.
As for indicating Entries within a day, I’ve settled on a jiggly set of circles that recall the Mac dock, except with a little more personality. Later, I want to indicate this through small, rotationally accurate simulations of the sun and moon for the time and place the entry was written. That would help build this sense of timeful ambience.
☵ Explored Ripple Shaders
Finn and I theorized in January that making experiential software must look very different from how we make software today, that we would invent or borrow tools and processes from fields like game design, filmmaking and more, forming ways of making quite unlike today’s PM processes:
where there were once user journeys, there would now be scripts and storyboards,
where there were once interface keyframes, there would now be playtestable prototypes for each aspect of the interface.
This week, I borrowed a process from film - Screen Tests - to see how well a new Ripple shader, simulating the surface of the water, would interact with our star of the show, the Fish.
By parameterizing the shaders on the Fish model and the Ripple texture, I tried to fine-tune the experience of writing on the surface of the water through 6 Screen Tests (ST1-6), of which I’ll show ST1, ST2 and ST6.
Raw Notes. ST1: too fast, chaotic, unprovoked. ST2: zen but murky, mystery in fish diving below. Sense of depth. ST6: just dirty, no other effect.
After 6 Screen Tests, Ripple Shader, in its current form, was not given the part. We thanked it for its participation and asked it to work on its range of motion and interactivity before trying again.
Theory
Published “Extending Cognetics”, which is more of a declaration of research intent than it is a presentation of findings. Nonetheless, reasoning about “attention” not in terms of tech criticism but cognition is actively shaping both commonplace and the interface sketches (Shared Data Object, Sites of Interaction, and Routing Input to App) that I’ll publish this week.
More
I. Joined Roote Fellowship (RF8)
After 3 years of deferred admissions, I joined RF8 (link). Excited to meet and mingle with the people Rhys brought together; in many ways Roote’s philosophy, 2 years ago now, gave this project permission to exist, and I’m hopeful about finding co-creators within and through the network of care that extends through him.
Intentions for RF8. Writing this fulfills one of them (externalizing knowledge)
For a hopeful view on technology, I recommend checking out his work and applying to be a part of RF9, which starts next year.
Net Project
Ali Kapadia is behind the Net Project, a tech collective (?) exploring how to get people to think about meaning, purpose and identity in their lives. He is also a new friend.
Meaning and Identity are themes in both of our work, but I’m still approaching them obliquely, unsure if the Journal should be for self-knowledge discovery. In the meantime, we will be sharing more thinking, and I recommend his blog posts (link) if these ideas interest you.
Milestones
- S’ father subscribing to this newsletter. If you are reading this, привіт (hello)!
- Yoof’s email.
- Pablo’s many, many stirring thoughts. It takes a while for the poetry to unravel to its truth, but it does, I tend to it, and it is beautiful.
- S’ ear and mind, to which I have been given rare access through our group chat (of two) called “baby commonplace”. I send her every passing thought on this project; we’re comfortable enough for her to take walks around my mind.
This is a beautiful moment of progress. Cannot wait to be in the commonplace.