Tuesday, September 10, 2013

SASO!

Hey folks, it's time for my favorite conference of the year!  That's SASO, or as it's more properly known: the IEEE International Conference on Self-Adaptive and Self-Organizing Systems.  SASO is the conference that feels most like home to me as a scientific research community, and I've attended every year since it began in 2007.  Now in its seventh year, the conference is going strong.  It's not a big conference, but that's by design: it's very cross-disciplinary, and it runs a single track so that people don't close off into their own little sub-communities.  That does have the side effect of keeping it relatively small, but I find a lot of value here.

The main conference started today, but the affiliated workshops and tutorials began yesterday.  I started my day yesterday at the workshop on socio-technical systems, where I had been invited to give a talk about my recent results on very fast approximate consensus, which I published at this year's Spatial Computing Workshop.  The link between approximate consensus and social interaction is pretty clear: whenever a group needs to make a decision, it needs to come to an approximate consensus.  My own work has been motivated more by the technological side, but there is a lot of overlap, and my talk, "The Importance of Asymmetry for Rapidly Reaching Consensus," appeared to be quite interesting to the attendees.
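For readers who haven't run into approximate consensus before, here is a minimal sketch of its classic linear-averaging form, in which each agent repeatedly nudges its value toward the average of its neighbors' values until the group agrees to within a tolerance.  This is the textbook baseline, not the specific fast algorithm from my talk, and the graph and numbers are purely illustrative:

```python
# Classic linear averaging form of approximate consensus (a sketch,
# NOT the algorithm from the talk).  Each agent moves a fraction
# alpha of the way toward the average of its neighbors' values; on a
# connected graph, all values converge toward a common point.

def consensus_step(values, neighbors, alpha=0.5):
    """One synchronous round of neighbor averaging."""
    new_values = []
    for i, v in enumerate(values):
        avg = sum(values[j] for j in neighbors[i]) / len(neighbors[i])
        new_values.append(v + alpha * (avg - v))
    return new_values

# Illustrative example: four agents on a line graph 0-1-2-3,
# where only agent 3 starts with a nonzero opinion.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
values = [0.0, 0.0, 0.0, 12.0]
for _ in range(50):
    values = consensus_step(values, neighbors)
print(values)  # all four values close to 2.0 (degree-weighted average)
```

Note that with this symmetric update, the consensus point is a degree-weighted average of the initial values, and convergence slows as the graph's diameter grows; exploiting asymmetry to speed that up is exactly the kind of question the talk addressed.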

There was plenty of interest for me to listen to as well, and something that Jeremy Pitt said in his talk has been bouncing around in my mind ever since.  Jeremy, talking about social capital, decried the commodification of social relationships, with Facebook's business model as an excellent bad example: "friends are people you can count on, not people that you can count."  Turning towards a more positive definition of social capital, he identified its foundation as "trustworthiness."

And here's the observation that has been rattling around in my head ever since: if social capital is based on trustworthiness, then any attempt to engineer social capital is guaranteed to undermine itself.  Look at it this way: if you do something that I find beneficial, generous, or reliable, I will trust you more.  But if I know that you only did it in order to win my trust, then I have no reason to trust you, because I know you're trying to manipulate me.

This poses a real paradox.  Building trust is vital, and so understanding how to build trust is useful, but the more precisely you understand how to build trust, the less trustworthy you may be!  It makes me think of the problem of the ethical placebo.  The placebo effect is a remarkable mind-body interaction in medicine: when you give a person a non-functional treatment, like a sugar pill or a salt-water injection, it can provide real medical benefit, such as reduction of pain.  The theater of the treatment is the treatment.  If you know that a treatment isn't real, however, then the placebo effect doesn't work.  Now, it used to be acceptable to give patients a placebo without telling them, but that's no longer considered ethical, and for good reason: lying to patients for their own good is not too far away from clear horrors like involuntary sterilization of the disabled, stealing babies because the mother is unmarried, and faking treatment to study how bad a disease is.  So you can't lie to a patient, but telling them the truth destroys the efficacy of the treatment.

The problem of engineering with social capital seems similar: the engineering somehow needs to understand and manage social capital without turning it into the kind of cold, calculating process that sucks all the human value out of the relationships.  I don't have a good answer, except that somehow we need to keep the technical systems out of the relationships as much as possible: let them facilitate, let them help track information and help people rendezvous, but don't attempt to monetize the relations themselves... but can we avoid it?  The conversation went on, and we talked it over at dinner as well, broadening to questions of privacy, the balance of trust between people, governments, and corporations, and the way that software intermediates it all.
