Saturday, March 8, 2014

Systems surprises


This month’s blog is a collection of circumstances where you might be surprised by how a system reacts -- even though there are well-documented reasons why it did what it did.  In nearly every case I'll let Wikipedia or other web pages provide the details.

A warning and disclaimer -- many of the theories mentioned here can make you depressed and unwilling to act.  Please take these theories and dilute them with comforting memes from your social network of choice.

Let's start with Le Chatelier's principle [ https://en.m.wikipedia.org/wiki/Le_Chatelier%27s_principle ]. Gall, in his book "Systemantics", expressed this succinctly as "The system kicks back".  The original form is about chemical systems where a collection of chemicals is in equilibrium.  Any attempt to move the equilibrium by changing one factor is cancelled out by changes in the other factors.  The action is balanced by the reaction -- Newton's Third Law [ https://en.m.wikipedia.org/wiki/Newton%27s_laws_of_motion#Newton.27s_third_law ]. So if you poke a complicated system, don't be surprised if it pokes back.

Another variation is the flare-back fallacy, noted in Gerry and Daniela Weinberg's example of killing a pest that is either a predator or prey following the Lotka-Volterra equations [pages 301-302 & 317, "The Design of Stable Systems", Wiley 1979].  Here, when you (for example) kill most of the predators, you shortly get a plague of their prey, followed by an explosion of the predators, and so on.  You get a boom-bust pattern of oscillation.  This is only the simplest of the biological surprises I have read about or experienced, where A encourages B which leads to C... Working these chains out is essential if you want to solve some problems. An example from social systems is Forrester's urban dynamics [ https://en.wikipedia.org/wiki/Urban_Dynamics#Urban_dynamics ], which models a typical city or town and predicts that attempts to solve social problems can make them worse. A version of this argument was used in a recent mayoral election campaign -- providing housing for the homeless attracts more homeless people into the town.
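The boom-bust oscillation is easy to see by simulating the predator-prey equations yourself. Here is a minimal sketch (not the Weinbergs' own model) using Euler integration; all the coefficients are illustrative choices, and the starting point mimics "kill most of the predators":

```python
# Lotka-Volterra predator-prey model, integrated with small Euler steps.
# Parameters (illustrative, not from any cited source):
#   a = prey birth rate, b = predation rate,
#   c = predator death rate, d = conversion of eaten prey into predators.

def step(x, y, dt=0.001, a=1.0, b=0.1, c=1.5, d=0.075):
    dx = a * x - b * x * y          # prey grow, but get eaten
    dy = d * x * y - c * y          # predators need prey to survive
    return x + dt * dx, y + dt * dy

# Start just after "killing most of the predators": prey at its
# equilibrium level (20), predators knocked down from ~10 to 2.
x, y = 20.0, 2.0
xs, ys = [x], [y]
for _ in range(20000):              # about 20 time units, a few cycles
    x, y = step(x, y)
    xs.append(x)
    ys.append(y)

# The prey boom well past their starting level, then the predators
# explode in turn -- the boom-bust oscillation described above.
print(f"prey peak: {max(xs):.0f}, predator peak: {max(ys):.0f}")
```

For a real model you would use a proper ODE solver rather than raw Euler steps, but this is enough to show the plague of prey followed by the predator explosion.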

The theme of the obvious solution making a bigger problem turns up in software development. Brooks's law: adding people to a late project puts it further behind. This is well known and has recently been confirmed for all but the smallest projects by Capers Jones [ http://www.drdobbs.com/architecture-and-design/programming-laws-and-reality-do-we-know/240166164 ]. Brooks points out that when people are added to a project, the existing team has to bring them up to speed.  They may need to learn the team's tool-set and organization.  But they also have to learn the domain knowledge that the team has discovered so far. All this explanation and training slows down the rate of development and puts the project further behind.

The following system surprises were all in an article in New Scientist magazine [ "Less is More", Justin Mullins, pages 30-33, 18 Jan 2014 ]. In sports, losing the best player in a team can increase the chance of winning if it makes the plays less predictable.  This is from Brian Skinner in 2009.  Then we have Braess's paradox [ https://en.wikipedia.org/wiki/Braess%27s_paradox ], which appears in traffic networks in both theory and practice. Dietrich Braess discovered that adding a new, faster road can sometimes lead to delays if everybody chooses it. This is an example of the "Tragedy of the commons" [ https://en.m.wikipedia.org/wiki/Tragedy_of_the_commons ], where large numbers of individually optimal choices lead to the whole system performing worse. This effect has been observed in reverse: removing a popular road has led to a smoother and better flow of traffic in large cities.  To be technical: the Nash equilibrium is not optimal. The article has some other examples. Dirk Witthaut, researching power grids, discovered that adding a new power line could reduce the stability of a grid. However, he also discovered an exception: adding a parallel link to a fully loaded link is safe. Similarly, in 2013, researchers in Israel observed that adding nodes to a wireless network led to reduced service.
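Braess's paradox can be worked out by hand on the textbook four-node network (the numbers below are the standard illustrative ones, not from the New Scientist article). 4000 drivers travel from S to T; roads S→A and B→T take n/100 minutes when n cars use them, while S→B and A→T take a flat 45 minutes:

```python
# The textbook Braess network: 4000 drivers from S to T.
# Congestible links S->A and B->T cost n/100 minutes for n cars;
# links S->B and A->T cost a flat 45 minutes each.

DRIVERS = 4000

def minutes_before_shortcut():
    # Without a shortcut the equilibrium splits traffic evenly:
    # 2000 cars via S->A->T and 2000 via S->B->T.
    n = DRIVERS // 2
    return n / 100 + 45            # 20 + 45 = 65 minutes per driver

def minutes_after_shortcut():
    # Add a free A->B shortcut.  S->A then A->B then B->T is now the
    # individually best choice, so all 4000 drivers pile onto it.
    return DRIVERS / 100 + 0 + DRIVERS / 100   # 40 + 0 + 40 = 80 minutes

print(minutes_before_shortcut(), minutes_after_shortcut())
```

Adding the "free" road makes every driver 15 minutes slower: the new Nash equilibrium, where no individual gains by switching routes, is worse for everyone than the old one.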

Further reading and research. 
 Mencken's Law: To every problem there is a simple obvious solution that just doesn't work. 
 Parkinson's Law: Work expands to fill the available time.


