I was reading today about liquid democracy, which is a kind of improvement on direct democracy. Most of what makes direct democracy difficult, aside from the logistics of holding so many votes, is just that most people don't need to have an opinion on most issues. In liquid democracy you can just delegate your vote to someone else when you want. That can be a friend, a community leader, or anyone all the way up to a traditional-style professional politician.
But of course you still have the problem of knowing who to delegate to. In some cases you can rely on there being a public figure whose opinion you trust. However, for many issues that process could be just as onerous as figuring out which way to vote. Some liquid democracy systems also seem to suggest that you could delegate by topic area, i.e. you have an environment delegate and an economy delegate. Though it's not clear exactly who would decide how issues fall into those categories, or what would happen if an issue falls into both.
I think there's a more elegant solution to this: use Bayesian inference. Every vote is optional, and by default you're assumed to vote with the majority on any given issue. However, you can at any time override that default vote, and doing so updates the prediction of your future votes. If you vote in a pattern similar to a bloc of other voters, your default votes will become more similar to theirs. Essentially, it's the same algorithm you might use for any recommendation engine; it just recommends a vote.
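To make that concrete, here's a toy sketch of such a vote recommender. It isn't real Bayesian inference, just an agreement-weighted majority, the nearest-neighbour trick behind simple recommendation engines; the +1/-1 vote encoding and the voter and issue names are all made up for illustration:

```python
def predict_default_vote(history, voter, issue):
    """Predict `voter`'s default vote (+1 or -1) on `issue`.

    history: {voter: {issue: +1 or -1}} of past explicit votes.
    Each other voter is weighted by how often they agreed with
    `voter` in the past, then we take the weighted majority on
    the new issue. With no history, every weight is equal and
    this reduces to a plain majority vote, matching the default.
    """
    my_votes = history[voter]
    weighted = 0.0
    for other, votes in history.items():
        if other == voter or issue not in votes:
            continue
        shared = [i for i in my_votes if i in votes]
        if shared:
            # mean of products: +1 for perfect agreement, -1 for perfect disagreement
            agreement = sum(my_votes[i] * votes[i] for i in shared) / len(shared)
        else:
            agreement = 0.0  # no overlap: neutral weight
        # weight in [0, 2]: habitual opponents count for nothing
        weighted += (1.0 + agreement) * votes[issue]
    return 1 if weighted >= 0 else -1
```

So if you've always voted with a small bloc, the model predicts you'll follow that bloc even when the overall majority goes the other way, which is exactly the "delegate to a statistical model of yourself" behaviour.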
I call this idea Bayesian democracy. I think it could be a pretty interesting generalisation of liquid democracy, allowing you to delegate your vote, not to another person, but rather to a statistical model of your own preferences.
I know a lot of people who would be quite keen on some kind of brain preservation. There's been some interesting progress in physically preserving brains in a way that, presumably, keeps their function intact until we have the technology to reanimate them. Though, of course, it's fairly unlikely that your recently-thawed brain would be transplanted into a new body directly. Instead, we'd likely have some way to store and transfer the information represented in that brain.
And for many others that digital transfer, mind uploading and so on, is the real point. If you could upload your brain into a computer, you would never really have to die. Of course, your body could die, but your mind could live on, either in a new body or, perhaps, simulated directly on a computer itself. There are some fairly significant ethical mazes to navigate, but there's no denying that there's something fairly compelling about immortality.
But what if only part of your memories could be reconstructed? Or what if the version of you that lived on was imperfectly duplicated, with some slight personality changes or quirks not present in the original? We seem fairly comfortable with assigning a continuous identity to people who experience personality changes or lose memories from strokes. Obviously it would be ideal if the replication were completely accurate, but something is, perhaps, better than nothing.
The craziest thing I've heard in this vein is the idea that once you accept imperfect replication, you might not need mind uploading at all. Maybe I die with no brain scan, no cryogenic process, and a direct copy of my mind is impossible. However, you have thousands of hours of video of me talking, thinking, interacting with people. Could you reconstitute a mind by working backwards from that material? What if it's not video but third parties' memories of me? Or my writing? Perhaps this post itself could someday be used to remake its creator, or at least someone pretty similar.
The open question, of course, is how much that reconstituted person would be you. Or, if it's not you, whether it would still be worthwhile. Our notion of identity is very limited, and seems unlikely to hold up in the face of the serious complexity the future will bring. For me, I think I'd be happy to know that someone who thinks like I think is out there, who remembers some of the things I remember or shares similar ideas. Whether that person is me or not may be beside the point.
Once you get that far, it starts to seem like the whole mind uploading thing might not even be necessary. If what I want is for my mind to live on beyond the lifetime of my body, and I'm willing to accept that it may happen imperfectly or piecemeal, I can start doing that today. Each time I share an idea, I'm imperfectly transferring that part of my mind. If the person receiving that idea likes it enough to share it with others, the process will repeat.
And if that idea lives on, jumping mind to mind through the generations, maybe that is a kind of immortality.
I didn't write anything last night because I went straight from non-stop brain stuff to a party. I can't see how I would have decided differently under the circumstances, so I think not writing that evening was perhaps inevitable, or at least inevitable given priorities I'm otherwise happy with.
One thing worth considering is that if I'd had a pre-prepared emergency post, I could perhaps have just dropped it in. I'm not sure if that defeats the point, but it's something I might try for next time.
The two measurements are fractal dimension and sample entropy. I'm told they're both kinds of nonlinear analysis, though I confess the definition of nonlinear somewhat escaped me. Speaking of nonlinear, the algorithm I'm using for calculating sample entropy is ridiculously slow. I think it's at least O(n²), maybe worse. There's apparently a faster version using magical k-d trees, but it's not very well described anywhere and the maths is a bit over my head.
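For reference, the naive version looks roughly like this: sample entropy is -ln(A/B), where B counts pairs of length-m windows of the series that match within a tolerance r, and A counts the same for length m + 1. The all-pairs comparison is what makes it O(n²). This is a generic sketch, not the exact code I'm running, with the usual convention of m = 2 and r = 20% of the standard deviation:

```python
import math

def sample_entropy(series, m=2, r=None):
    """Naive O(n^2) sample entropy: -ln(A / B).

    B = number of pairs of length-m windows within Chebyshev
    distance r of each other (self-matches excluded);
    A = the same count for windows of length m + 1.
    """
    n = len(series)
    if r is None:
        mean = sum(series) / n
        std = (sum((v - mean) ** 2 for v in series) / n) ** 0.5
        r = 0.2 * std  # common convention: 20% of the standard deviation

    def count_matches(length):
        count = 0
        # compare every pair of windows: this double loop is the O(n^2) part
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(series[i + k] - series[j + k])
                       for k in range(length)) <= r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b)  # undefined when a or b is zero
```

Regular signals score low (matches at length m usually extend to m + 1), while noisy signals score high, which is the whole point of the measure.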
However, maybe I can just ditch sample entropy entirely for another entropy measurement. I've recently learned that you can get a very robust entropy measurement by just zipping your data and taking the ratio of the compressed to uncompressed size.
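A minimal sketch of that compression trick, using Python's zlib (DEFLATE, the same algorithm behind zip). The ratio isn't entropy in any formal unit, just a proxy: close to 1 means incompressible (high entropy), close to 0 means highly redundant (low entropy):

```python
import zlib

def compression_ratio_entropy(data: bytes, level: int = 9) -> float:
    """Entropy proxy: compressed size / original size.

    Note the small fixed overhead of the zlib header and checksum
    means truly random data can score slightly above 1.0.
    """
    compressed = zlib.compress(data, level)
    return len(compressed) / len(data)
```

One caveat worth knowing: general-purpose compressors only see redundancy they're designed for, so this measures something closer to algorithmic compressibility than the sample entropy it would be replacing.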