Introspective feedback system
(terrible name; will try to figure something better out later)
I currently use four software systems that all share a similar property:
- a goals spreadsheet
- Anki
- GnuCash
- alarm clocks on my phone that I use in a particular way (see “how to wake up at a reasonable hour”)
All four have the property that they are essentially “bookkeeping” in some sense. They help to track things for me, and there’s a feedback process where the system tells me to do things but I also tell the system things, and the goal is to be in an equilibrium where the system and I agree. For instance, the alarm clock tells me it’s time to turn off my computer. I can either decide to turn off my computer (I’m agreeing with the system), or I can decide that it’s not time to turn off the computer and adjust the alarm time so that tomorrow the alarm will be “correct” (I’m disagreeing with the system, or in other words forcing the system to agree with my behavior).
All four systems are, in some sense, a truth-seeking mechanism, where I must introspect to sort out a disagreement.
Qiaochu Yuan once tweeted something like: if you don’t want to do any of your goals, then introspecting on e.g. why you don’t want to do your taxes can itself be a goal; that introspection is part of what constitutes progress on your taxes. Well, these introspective feedback systems are a sort of generalization of that. The system tells you one thing, but if you believe something different, then part of what it means to use the system – maybe the whole of it – is to sort out your disagreement with the system, to figure out your feelings or your memory or your finances or when you should do what.
To walk through each of the examples above in a bit more detail:
- goals spreadsheet: the system tells me what I should do given information about my current mental and physical state and the priority of the tasks. If I don’t want to do a task, or can’t do a task, or think the task at the top is not the thing I should be doing, then I have a disagreement with the system. I should either change some priorities in the spreadsheet, or tell the system my new mental/physical state, or tell the system that actually a task requires a better mental/physical state than the one I am in now, etc. Or I can be like “huh, I can’t think of any real disagreements, I guess I really should do this task”. (A rough code sketch of this loop follows the list.)
- Anki: the system tells me a thing that I should remember (that I am almost about to forget). If I have forgotten the thing, then I have a “disagreement” with the system, and must tell it that I have forgotten. The daily reviews are a way to sort out disagreements with the system.
- GnuCash: every time a transaction happens, there’s a “disagreement” between e.g. my PayPal balance and what’s in GnuCash, so the transaction entry sorts out that disagreement.
- alarm clocks: this was explained above.
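To make the goals-spreadsheet loop concrete, here is a minimal sketch in Python of what “sorting out a disagreement” could look like. The names here (Task, required_energy, pick_task, the numeric energy scale) are hypothetical illustrations, not the actual spreadsheet; the point is just that disagreeing means updating either the task data or my reported state until the system’s suggestion and my own judgment line up.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    priority: int          # higher number = more important
    required_energy: int   # minimum energy level the task needs (hypothetical scale)

def pick_task(tasks, my_energy):
    """Return the highest-priority task that my current energy allows."""
    doable = [t for t in tasks if t.required_energy <= my_energy]
    return max(doable, key=lambda t: t.priority, default=None)

tasks = [
    Task("do taxes", priority=5, required_energy=4),
    Task("reply to email", priority=2, required_energy=1),
]

# The system makes a suggestion based on what I have told it so far.
suggestion = pick_task(tasks, my_energy=4)   # -> the "do taxes" task

# Disagreement, option 1: taxes actually need more energy than I recorded,
# so I correct the task, and the system's next suggestion changes.
tasks[0].required_energy = 5
suggestion = pick_task(tasks, my_energy=4)   # -> the "reply to email" task

# Disagreement, option 2: I was wrong about my own state; reporting my
# real energy level is the introspective part.
suggestion = pick_task(tasks, my_energy=5)   # -> "do taxes" again

# Option 3: I find no real disagreement, so I just do the suggested task.
```

The same shape fits the other three systems; Anki, GnuCash, and the alarms differ only in what counts as the stored data and what counts as my report back to the system.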
Now, you might wonder, is this description I have given so abstract and general that it can apply to just about any system out there? Maybe… but I wouldn’t e.g. call a text editor an “introspective feedback system” – there’s just no feedback involved, nothing to disagree about (unless you want to get all lawyery and say that I have some idealized words in my head that are different from the words on the page, and the process of editing is hashing out disagreements).
I’m not sure if thinking of systems in this way has any large benefit, but it has made me more … patient or something. Instead of just saying “I guess this software didn’t work”, I am more inclined to say “I have a disagreement here that I can sort out”.