If you've never read it, you should track down the CACM article on the history of steam boilers that appeared sometime in the 1980s. The brief summary: after steam power was invented, there were lots of nasty boiler explosions. In the UK, the problem was dealt with by regulation. In the U.S., free-market advocates successfully argued that liability law was sufficient: boiler makers would lose a few lawsuits and would then have an incentive to develop safer boilers. The result was that boiler-related deaths dropped to near zero in the UK but continued at high rates in the U.S. for another twenty years, until we finally broke down and regulated the industry.

From a systems engineering point of view, feedback may be regarded as simply a different mode of regulation; one of the first feedback control systems, invented by James Watt, has become known as the Governor.
The problem with liability as a feedback mechanism is that the negative feedback is strongly disconnected from the original action. Liability can help, but it's not at all clear to me that simply making Microsoft liable for all the worms in the world would cause it to start making secure software.
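To make the disconnect concrete, here's a toy sketch (the model and all the numbers are my own illustration, not from the article): a proportional controller steering a quantity toward a setpoint. When the feedback arrives immediately, the system settles; when the same feedback arrives only after a long delay, analogous to lawsuits trailing the original engineering decisions by years, the identical corrective rule overshoots and oscillates instead of converging.

```python
def simulate(delay, gain=0.5, steps=60, setpoint=100.0):
    """Proportional control of a scalar quantity, with the controller
    only able to observe the value from `delay` steps in the past."""
    history = [0.0]  # observed values of the controlled quantity
    x = 0.0
    for t in range(steps):
        # Stale observation: the feedback lags the action by `delay` steps.
        observed = history[max(0, t - delay)]
        x += gain * (setpoint - observed)
        history.append(x)
    return history

immediate = simulate(delay=0)   # ends very close to the setpoint of 100
delayed = simulate(delay=10)    # same gain, but oscillates ever more wildly
print(immediate[-1], delayed[-1])
```

The corrective signal itself is unchanged between the two runs; only its timing differs. That's the sense in which delayed liability is a weak regulator: the feedback loop exists, but the lag between cause and consequence keeps it from steering behavior.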
Feedback corresponds to a kind of Network Regulation, producing Network Trust. As Kuenning points out, this doesn't work effectively when there are strong disconnects between cause and effect. He contrasts it with a kind of Authority Regulation, and argues that because the latter was more effective at dealing with boiler safety, it would also be more effective at dealing with software quality and trustworthiness.
However, Authority Regulation can also suffer from strong disconnects between cause and effect. Is there really an Authority anywhere in the world that can keep Microsoft in check?