As I was saying…

I was wondering when I wrote that post yesterday whether I went too far in my gloomy statements about the likely effects of rising interest rates on municipalities. I guess I wasn’t so far off the mark – from today’s NY Times, by Mary Williams Walsh:

States and cities across the nation are starting to learn what Wall Street already knows: the days of easy money are coming to an end.

Interest rates have been inching up everywhere, sending America’s vast market for municipal bonds, a crucial source of financing for roads, bridges, schools and more, into its steepest decline since the dark days of the financial crisis in 2008.

For one state, Illinois, the higher interest rates will add up to $130 million over the next 25 years — and that is for just one new borrowing. All told, the interest burden of states and localities is likely to grow by many billions, sapping tax dollars that otherwise might have been spent on public services…

The whole piece is online at Dealbook, and worth a read.


Municipal debt: from time machine to time bomb?

Kevyn Orr’s proposal to Detroit’s creditors last week included a wealth of information about the extent of the destruction of Detroit as a livable city, much of which has already been addressed elsewhere. The report also includes a good deal of information about just how badly the city was mismanaged financially. As I said before, I think that Detroit offers a case study (however cartoonishly bad) of many of the practices that have been used in other cities and states, so it’s worth looking more closely at what may be coming down the pike for the rest of us non-Detroiters.

The debt time machine

Stiff taxpayer resistance to tax rate increases over the past few decades has left state and local governments with a very limited set of options to fund themselves. Their way out of the impasse has been to turn to debt markets, where municipal issuers have one enormous advantage – the coupons they pay bondholders are exempt from income tax, making them a core holding for individuals and taxable institutions.

In one sense, municipal borrowers in debt markets face a similar set of decisions as individuals. Like the rest of us, municipalities can borrow for either long-term capital projects (analogous to a home mortgage) or to cover short-term gaps in revenues (like using a credit card to cover monthly expenses).

But there’s a crucial difference between individual and municipal borrowing in the timing of repayment. Most borrowing by individuals involves loan payments that chip away at both principal and interest over time. By contrast, borrowing in the municipal bond market means that the borrower only makes interest payments until the principal is due at term. Depending on the interest rate and other factors, this can give the borrower a chunk of money upfront while pushing the need to make an enormous principal repayment years or even decades into the future.
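To make the timing difference concrete, here is a minimal sketch comparing the two structures. The principal, rate and term are hypothetical, not drawn from any actual issue:

```python
# Minimal sketch with hypothetical numbers: an amortizing loan, where each
# payment chips away at principal and interest, versus a bullet-style bond,
# where only interest is paid until the principal comes due at maturity.

principal = 100_000_000   # hypothetical $100 million borrowing
rate = 0.05               # hypothetical 5% annual rate
years = 20

# Amortizing loan: a constant annual payment covering interest plus principal.
amortizing_payment = principal * rate / (1 - (1 + rate) ** -years)

# Bullet bond: interest-only payments, with the full principal due at the end.
bullet_interest = principal * rate

print(f"Amortizing loan, every year:   ${amortizing_payment:,.0f}")
print(f"Bullet bond, years 1-{years - 1}:      ${bullet_interest:,.0f}")
print(f"Bullet bond, year {years}:          ${bullet_interest + principal:,.0f}")
```

The near-term payments are far smaller under the bullet structure; the price is the single large payment waiting at maturity.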

For cash-strapped municipalities like Detroit, this time-machine effect can be a budgetary godsend that transforms immediate funding gaps into distant future liabilities. To give a sense of just one of the ways Detroit has used debt, the chart below comes from emergency manager Orr’s report, and shows the annual funding gaps in Detroit’s General Fund (the account used to pay ongoing expenses). The height of each bar is the total deficit for each fiscal year, while the darkest section at the bottom of each bar is the reported deficit. The difference – the lighter segments – has been filled by various bonds the city issued over time.

Source: City of Detroit Proposal for Creditors, June 14, 2013

By issuing these three series of bonds, Detroit’s city government has been able to fund its current operations without having to pay anything more than immediate issuance costs and some interest payments – the ultimate cost lies far in the future.

Of course, these are far from the whole picture when it comes to Detroit’s liabilities – in practice, the city has taken a similar time-machine approach to most areas of its finances, whether through direct bond issuance or simply by making promises about future benefits. And Detroit has been far from alone in doing this. According to data from SIFMA, state and local issuers have racked up nearly $4 trillion in outstanding municipal bond debt.

[Chart: total municipal debt outstanding]

The debt time bomb

Municipal borrowers also make use of a tactic that allows them to keep principal repayments far in the distant future. By repeatedly issuing new bonds to pay off existing debts (in addition to issuing entirely new bonds), municipalities are able to roll their liabilities into the future, while also lowering the interest payments due in the interim.

This sort of refinancing has until very recently made all the sense in the world for local politicians. Thirty years of generally declining interest rates made for almost constant opportunities to lower interest costs (albeit with some interim volatility). The bond bull market also kept demand for new issues strong, so issuers rarely lacked buyers. Best of all, the ability to lower interest costs – while delaying the inevitable repayment – could produce “savings” that bolstered budget projections.
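A rough sketch of the arithmetic (hypothetical numbers, ignoring issuance costs, call premiums and discounting) shows why refunding at a lower rate looks so attractive in the near term even when it raises the total bill:

```python
# Hypothetical refunding: retire a 6% bullet bond with 10 years left by
# issuing a new 4% bond with a 20-year maturity. The annual interest bill
# falls -- the "savings" that flatter budget projections -- but the principal
# repayment is pushed further out and total interest paid actually rises.

principal = 100_000_000          # hypothetical outstanding par amount
old_rate, years_left = 0.06, 10
new_rate, new_term = 0.04, 20    # lower coupon, longer maturity

old_interest = principal * old_rate
new_interest = principal * new_rate

print(f"Annual interest before refunding: ${old_interest:,.0f}")
print(f"Annual interest after refunding:  ${new_interest:,.0f}")
print(f"Budgetary 'savings' per year:     ${old_interest - new_interest:,.0f}")
print(f"Total interest, old bond:         ${old_interest * years_left:,.0f}")
print(f"Total interest, new bond:         ${new_interest * new_term:,.0f}")
```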

The picture today has changed, however, and at the worst possible time. If interest rates continue to rise as they have over the past month, then at worst the revolving door of refinancings at lower rates could eventually become difficult, if not impossible. Even if rates remain historically low, interest costs will still likely be higher than they have been in recent years. Either way, municipalities face a potential combination of higher borrowing costs and continued slow growth in tax receipts.

Perhaps worst of all, this would take place just as historical pension and retiree healthcare commitments (themselves a form of time machine) are coming due, many without adequate funding.

How big of a problem is this? There are admittedly too many variables to know, and reputations have been made and tarnished on doomsday predictions for the municipal bond market. It’s also important to note that the awful combination of a shrinking population, falling property values, industrial decay and extraordinary government corruption and ineptitude that has plagued Detroit is (thankfully) unique in the US. Nonetheless, the structural problems remain, and it’s difficult to see a brighter future for local spending on infrastructure, education and other essential services without a significant rethinking of commitments made to citizens, retirees and bondholders in sunnier times.

With thanks to Katya Grishakova for helpful comments that kept me from going full (Meredith) Whitney.


Detroit is not Dhalgren

Samuel Delany’s po-mo science fiction masterpiece Dhalgren has been linked to Detroit since it was published shortly after the urban convulsions of the late 1960s. It’s not hard to see why – the book describes the fictional city of Bellona as a sort of apocalyptic hole in an otherwise normal U.S., and traces the lives of the few people who remain there after social, political and legal institutions have collapsed.

The creative response to Detroit’s misery today is more in tune with the way we take in information – rather than a novel, we have tumblrs, blogs and other media focused on endless photographs of the city’s abandoned architecture and general decline. Nonetheless, the experience of viewing the photos has similarities to reading about Bellona in that both allow us to indulge in the fiction that there is a spectator’s distance between our own lives and those of the people in the frame.

I think that’s far from true for reasons that may not be obvious. First, the financial problems facing Detroit are unique only in their severity and timing – in kind, the problems are similar to what many (if not most) U.S. municipalities will be facing soon, if they’re not already. This means that the solutions used to work out Detroit’s financial problems are likely to set a precedent for the way governments manage the inevitable tradeoffs that will be imposed elsewhere.

Second, Detroit’s fiscal crisis puts pensions and retirement issues at the center of the frame in a way that past municipal crises haven’t. With the country facing a wave of impending retirees with little to no savings, what happens in Detroit (along with the final verdict in San Bernardino) is going to be part of an opening salvo in the looming war over retirement and health care for the elderly after decades of underfunding and fiscal mismanagement. The failure of the retiree benefit systems is also a case study of the profound governance failures that lie behind so many of the problems facing public sector pensions across the country.

However things sort out, there’s no question that the lives of the people in Detroit are likely to get worse in the near future, just as the lives of many other retirees are. It’s worth a closer look at the failures that led to this point if the rest of the country is going to have any hope of avoiding a similar fate.


Regulating complexity I: Ashby, Perrow, Haldane

The word “complexity” is being used a lot lately in discussions of the financial system, particularly when it comes to regulation. The severity of the financial crisis seems to have opened a space for new ideas about the relationship between the complexity of the financial system and the complexity of regulation, with some of the most interesting proposals calling for greater simplicity in one or both of these areas.

The counterintuitive idea of simplicity as a tonic for complex problems is very attractive. It strikes me as common sense that attempting to control complex institutions with increasingly complex regulation often only makes things worse – it creates incentives for cheating, drives destructive “innovation” designed to exploit gaps, and arms regulators to fight the last battle rather than preventing problems.

But I also think it’s worth taking a step back to think more clearly about the problem of regulating complexity from a number of perspectives before zeroing in on specific proposals in detail. Working from the purely conceptual to the purely applied, the remainder of this post looks at the problem from the perspectives of biology, social systems and banking.

Ashby and the law of requisite variety

Ross Ashby was one of the godfathers of systems theory and cybernetics, and developed a famous concept called the law of requisite variety. I haven’t read his book yet (free pdf here) but he followed it with a fairly accessible paper in 1958 that laid out the central idea.

In the paper, Ashby approaches the question of how to regulate a system from the perspective of biology and engineering, where a “regulator” can be an enzyme, an overflow valve, or anything that acts to counteract disturbances to the system. Using set theory, he defines successful regulation as a set of regulatory responses (the plural is important) that reduce the space of possible outcomes from a set of disturbances to an acceptable subset.

The key takeaway is that the success of regulation in a complex system is a function of the relationship between the set of regulators and the set of disturbances. More precisely, because no single regulatory response will be effective across all possible disturbances, the success of a set of regulations is in part a function of the match between the variety of disturbances and of responses (hence the law’s name).
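In the entropy form Ashby uses (my notation, so treat this as a sketch rather than a quotation), the law says roughly that

H(O) ≥ H(D) − H(R)

where H(D) is the variety of the disturbances hitting the system, H(R) is the variety of responses available to the regulator, and H(O) is the variety of outcomes that gets through. The only way to narrow the range of outcomes is to match the disturbances with at least as much variety on the regulator’s side.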

Working with Roger Conant, Ashby followed this paper with another in 1970 that reached an even more dramatic conclusion, namely that (to quote the title) “every good regulator of a system must be a model of that system.” The conclusion of the paper, which must please economists to no end, is that the only way to regulate a system effectively is to build a comprehensive model of its internal dynamics.

The abstract appeal of these concepts runs aground when they confront the messy reality of real-world social systems, with two important implications for regulating finance. First, if no single regulation can be uniformly effective across a dynamic range of problems, then attempting to regulate using law becomes much more problematic. I’d imagine that this (along with lobbying for loopholes) is one of the main reasons regulation has become complicated in tandem with the growth of finance – if one rule can’t work, the natural implication is that you write more of them, and add to the set as more problems manifest. But by definition, a single set of rules will never be robust to every possible event.

Second, the need for a model implies that the designer of the set of regulators has to have deep knowledge of all aspects of the system, and to understand the non-linear dynamics of interconnected systems. But our regulatory framework for finance is set up in such a way that this is impossible. Banks alone are regulated by multiple agencies, each focusing on a different part of the business, while various types of markets have their own overseers, and none of these effectively reach beyond national borders. New meta-regulators like the Financial Stability Oversight Council in the US and the Financial Stability Board globally are attempting to get there, but it is much too early to know how well they are doing. Worse still, even the most sophisticated model will be prone to the same problems as law – by dealing with known risks, models by definition don’t take into account risks that lie beyond the designer’s awareness or ability to simulate.

So if effective regulation requires a perfect model of the financial system that is so complicated that seemingly no one has a handle on the whole thing, let alone an ex-ante understanding of all possible bad events, how should we deal with the real-world complexity of the financial system?

Perrow: eliminate complexity in the system itself

The most rational answer is to make the system less complex. A simpler financial system will presumably be less likely to fail catastrophically, while also reducing the scale of necessary regulation.

Charles Perrow wrote nearly three decades ago that complex systems, which he identifies as those marked by tight coupling and complex interactions, are inherently vulnerable to accidental failure. In fact, these accidents are so predictable that he named them “normal accidents” (the title of his landmark study).

Since the financial system is a textbook case of tight coupling and complex interactions, the natural question is how best to reduce the risk of accidental system failure. Perrow’s more recent book addresses this question, though not before asserting that the financial crisis was a result of malfeasance abetted by the complexity of the system itself, rather than an inevitable system accident. Moreover, he points to the concentration of economic power as a key fault line to be addressed, since it creates dangerous dependencies (a pernicious form of tight coupling) and power dynamics that work against regulation.

Perrow’s recommendations for reducing risk in complex systems are consistent with his normal-accident framework. He points to effective regulation as the most important factor, but he also acknowledges the politics that are likely to make reform a long and difficult process. Given that, he calls instead for two related measures to reduce the vulnerability of complex systems. First, a reduction in the concentration of power is crucial in order to lessen systemic risk (an idea echoed by Hansen, as I discussed here). Second, one-way relationships of dependency, which tie individuals and companies to more powerful organizations, need to be replaced with more equal and reciprocal arrangements.

So what would a Perrow-esque (Perrovian?) financial system look like? He doesn’t say, but I’d have to think it would be consistent with the rising chorus of demands to break up the biggest banks roughly along the lines of Glass-Steagall. Doing so would reduce the complexity of individual institutions, and would also significantly reduce the concentration of economic power these organizations currently enjoy.

That’s the easy part. What is less clear is how to address the problem of interconnectedness. One of Perrow’s insights is that tight couplings eliminate slack (the extra capacity that absorbs energy) in densely connected systems, and are the primary reason that bad outcomes can propagate so quickly across them. So while an industry of smaller banks and trading firms would be good, those firms would remain embedded in trading networks. As former Chancellor of the Exchequer Alistair Darling wrote in this week’s FT, Lehman Brothers was a standalone investment bank that nonetheless had a disastrous global impact when it failed.

While it is difficult to imagine a financial system with smaller banks in the current political environment, it is impossible to imagine it with fewer connections. By implication, structural fixes to the system itself are unlikely to go far enough to eliminate the scope for normal accidents so long as the financial system remains so tightly interconnected.

Haldane: eliminate complexity in the regulatory framework

The Bank of England’s Andrew Haldane approaches the problem from the other side, by focusing on reducing complexity in regulation rather than in the financial system it governs.

This is not to say that Haldane argues against simplifying the banks – quite the opposite. In an inversion of Perrow’s approach, Haldane notes toward the end of this typically erudite 2012 speech (summary) that restructuring banking institutions along the clear lines of Glass-Steagall would be ideal, but that the political reality is that debate over mandating such a split “has reached a stalemate.”

Instead, Haldane focuses more closely on the ways in which regulation could be restructured to be simpler in order to be more robust to the unexpected and less subject to model error. He provides a number of proposals for how this might work, including using simpler metrics to reduce regulators’ reliance on banks’ own risk models (a terrible legacy of the roaring 1990s) and recalibrating the role of the regulator to include more judgment from experience rather than box-ticking. He also proposes a sort of complexity tax to be levied on institutions large enough to create “complexity externalities.”
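To make the “simpler metrics” point concrete, here is a minimal sketch of the gap that can open up between a simple leverage ratio and a model-driven, risk-weighted capital ratio. The balance sheet and risk weights below are invented for illustration, not actual Basel calibrations:

```python
# Minimal sketch (hypothetical balance sheet and risk weights): contrast a
# simple leverage ratio with a model-based, risk-weighted capital ratio of
# the kind Haldane argues has grown too complex.

equity = 50                                    # capital, in $bn
assets = {                                     # exposures, in $bn
    "sovereign_bonds": 400,
    "mortgages":       300,
    "corporate_loans": 200,
    "trading_book":    100,
}
risk_weights = {                               # hypothetical model outputs
    "sovereign_bonds": 0.0,
    "mortgages":       0.35,
    "corporate_loans": 1.0,
    "trading_book":    0.15,
}

total_assets = sum(assets.values())
risk_weighted_assets = sum(assets[k] * risk_weights[k] for k in assets)

leverage_ratio = equity / total_assets                 # simple, hard to game
risk_weighted_ratio = equity / risk_weighted_assets    # depends on the weights

print(f"Leverage ratio:      {leverage_ratio:.1%}")
print(f"Risk-weighted ratio: {risk_weighted_ratio:.1%}")
# A bank can look well capitalized on the risk-weighted measure simply by
# holding assets its models treat as low risk -- the gap between the two
# numbers is one reason Haldane prefers the simpler metric.
```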

There is much more to say about Haldane’s ideas, and they will be central to the next complexity post. Until then, I’ll close by noting that none of these views is well represented in our current regulatory environment. While we do have an increasingly complicated (some would say Byzantine) regulatory framework for banks, the complexity of the institutions themselves has far outstripped the ability of regulators to keep up, let alone to build a perfect Ashby-style model. And while there is mounting political pressure to break up the banks, it has yet to overcome the political power of the banks themselves to resist this change, let alone to address the problem of tight coupling. Meanwhile, Haldane remains a bit of a lone voice, though that is changing.

Up next: the failure of expertise, and a potential solution


Soon

I had a clever post on complexity and regulation all worked out in my head but in researching it I kept coming across interesting new things, which made the post much too (of course) complex. I’m still working out some ideas and will have something more in the next day or so. 


It must be nice on Planet Davos (but for the rest of us, not so much)

In researching another project yesterday, I came across two articles about the economy that seemed to describe completely different realities. The first was by Joe Weisenthal of Business Insider, and described a general consensus among the great and good of Davos that “the economic crisis is over.” Weisenthal refers to Mohamed El-Erian of PIMCO, who coined the term “the new normal” to describe the post-crash economic malaise. Apparently, even El-Erian now sees a possible end to our woes.

But you wouldn’t know that from reading sociologist Erin Hatton’s Opinionator post this weekend at the NY Times, which described the economic “new normal” on the planet where the rest of us live:

… According to the Census Bureau, one-third of adults who live in poverty are working but do not earn enough to support themselves and their families.

A quarter of jobs in America pay below the federal poverty line for a family of four ($23,050). Not only are many jobs low-wage, they are also temporary and insecure. Over the last three years, the temp industry added more jobs in the United States than any other, according to the American Staffing Association, the trade group representing temp recruitment agencies, outsourcing specialists and the like.

Low-wage, temporary jobs have become so widespread that they threaten to become the norm.

This would be easy to dismiss as another case of the 1% vs. the 99%, but I think it’s worth looking at exactly how this split has developed, and what reinforces it.

The first thing to bear in mind is that the people who go to Davos have one thing in common – all of them have benefited enormously from the financialization of the global economy, whether through the stock price of the company they run, the portfolio they built after selling their company, or (more often) by virtue of working directly in finance. So for these people, the economy matters to the extent that it affects their portfolio.

So what have these people seen since the crisis? This is a pretty good indicator:

[Chart: the stock market’s recovery since the financial crisis]

The stock market has more or less regained everything it lost. So if your sense of well-being is tied up in financial markets, you have to be feeling pretty good.

The infuriating part, of course, is the reason why stock prices are so high, which is where Hatton’s work enters the picture. The media has been full of reports about corporate profits hitting record highs, which is true, but those reports rarely get at the components of the profits. To grossly oversimplify, we can disaggregate corporate profits into revenue minus costs. And since there are any number of ways to specify those, I’m going to keep it simple and use proxies.

If we assume that revenue is a function of overall economic activity, then GDP seems a decent place to look. And here the picture is surprisingly strong – not great in terms of percentage growth, but in terms of the dollar amount of output (shown here), there has been consistent growth since the bottom in 2009.

[Chart: GDP in dollar terms]

The other side – costs – is where things get ugly. I took as proxies here the yield spread of BAA corporate bonds over Treasuries, which measures how much extra a company with that credit rating has to pay to issue debt (there are other ratings, but the downward pattern is the same), as well as the change in the average wage. If we treat these as proxies for the costs of capital and labor, then the pattern is quite clear:*

[Chart: BAA-Treasury yield spread and average wage growth]

What is especially frustrating is just who is paying these costs. In terms of capital, the Fed has made it clear that its various policies involving purchasing mortgage and Treasury bonds are intended to drive down yields across the market. Those policies are connected to its decision to maintain short-term rates – and thus the yields for savers – at extraordinarily low levels. So while the exact amount is hard to quantify, it seems fair to assume that savers are subsidizing both corporate profits (through lower borrowing costs) and financial portfolio returns (through falling yields, which are synonymous with rising prices).
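The point that falling yields are synonymous with rising prices is just the mechanics of bond pricing; here is a minimal sketch with a hypothetical bond:

```python
# Hypothetical 10-year bond with a 4% annual coupon, priced at two market
# yields. When the yield demanded by the market falls, the fixed coupons are
# discounted less heavily and the price rises.

def bond_price(face, coupon_rate, years, market_yield):
    """Present value of annual coupons plus principal, annual compounding."""
    coupons = sum(face * coupon_rate / (1 + market_yield) ** t
                  for t in range(1, years + 1))
    return coupons + face / (1 + market_yield) ** years

for y in (0.04, 0.02):   # market yield falls from 4% to 2%
    print(f"Yield {y:.0%}: price {bond_price(100, 0.04, 10, y):.2f}")
```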

The picture with labor is more direct. With companies needing fewer and fewer employees, and hiring more and more of those as temps or contractors, the pressure to raise wages just isn’t there in the aggregate. So here again, the rest of us (just by virtue of needing to work) are bearing the cost of lower wages that in turn support record profits.**

So if I had to capture the real “new normal” in a single graph, I think it would be the one below. The red line is GDP growth, and the blue line is wage growth, both relative to the year prior (this is why I can get away with putting them on the same graph). What we have is an economy that is growing twice as fast as wages are growing, and that’s only for those who have a job – you don’t even show up if you’re unemployed.

[Chart: GDP growth vs. wage growth, year over year]
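For anyone who wants to reproduce a rough version of that chart, here is a sketch using pandas and FRED data. The series codes (quarterly nominal GDP and average hourly earnings of production workers) are my guesses at reasonable proxies, not necessarily the series behind the graph above:

```python
# Rough reproduction of the chart: year-over-year growth of nominal GDP
# versus average wages. The FRED series codes are assumptions -- GDP is
# quarterly nominal GDP, AHETPI is average hourly earnings of production
# and nonsupervisory employees.

import pandas as pd
import matplotlib.pyplot as plt
from pandas_datareader import data as pdr

gdp = pdr.DataReader("GDP", "fred", start="2000-01-01")
wages = pdr.DataReader("AHETPI", "fred", start="2000-01-01")

# Put wages on a quarterly basis, then compute growth versus the year prior.
wages_q = wages.resample("QS").mean()
growth = pd.DataFrame({
    "gdp_yoy":  gdp["GDP"].pct_change(4),        # 4 quarters back = 1 year
    "wage_yoy": wages_q["AHETPI"].pct_change(4),
}).dropna()

# Both series are growth rates, which is why they can share one axis.
growth.plot(title="GDP growth vs. wage growth, year over year")
plt.show()
```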

Doesn’t look like grounds for optimism to me, but then I don’t live on Planet Davos.

* Yes, it’s a fudge to use the rate of change in wages rather than the underlying wage level, but using the base amount wasn’t informative without doing a lot of fiddling.

** There wasn’t room in this post, but it’s always worth looking at U6 unemployment rather than the headline number given in the news. U6 takes into account those who are working part-time but don’t want to be, as well as those who have given up looking for work (though I don’t think it takes into account temps). The graph for that is here and tells the same story of stagnation at a very uncomfortable level.


Sentiment should not be the new horizon in journalism

Cross-posted at mathbabe.org.

Nate Silver’s high-profile success in predicting the 2012 election has triggered a wave of articles on the victory of data analysts over pundits. Cathy has already taken on the troubling aspects of Silver’s celebrity, so I’d like to focus instead on the larger movement toward big data as a replacement for traditional punditry. It’s an intriguing idea, especially given the sad state of political punditry. But rather than making things better, it’s entirely possible that the methods these articles propose could make things even worse.

There’s no question that we need better media, especially when it comes to politics. If we take the media’s role to be making sure that voters are informed, then they’re clearly doing a poor job of it. And one of the biggest problems is that political coverage has largely abandoned any pretense of getting to the truth in favor of “he said/she said” and endless discussion of the horse race, with the pundits being the worst offenders. Instead of “Will this be good for citizens?” we get “Will this be good for the Democrats/Republicans in the next poll?”

This is where the big data proposals enter the picture, and where I think they go wrong. Rather than addressing the accuracy or usefulness of the information being provided to us as voters, or working to shift the dialogue away from projections of how a given policy will play in Iowa, the proposals for big data revolve around replacing pundits’ subjective claims about shifting perceptions with more objective analysis of shifting perceptions.

For example, this piece from the Awl convincingly describes the potential for the rapid analysis of thousands or even millions of articles as a basis for more effective media criticism, and as a replacement for punditry by “anecdata.” A more recent post from the Nieman Journalism Lab at least acknowledges some methodological weaknesses even as it makes a very strong case for large-scale sentiment analysis as a way of “getting beyond pundits claiming to speak for others.” By aggregating and analyzing the flow of opinion across social media, the piece argues, journalism can deliver a more finely tuned representation of public opinion.
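For concreteness, here is a toy version of what lexicon-based sentiment scoring looks like – a deliberately crude sketch of the kind of analysis those pieces describe, not anything they actually implement:

```python
# Toy lexicon-based sentiment scorer: count positive and negative words and
# normalize by document length. Real systems are far more sophisticated, but
# the basic move -- reducing text to a perception score -- is the same.

POSITIVE = {"strong", "win", "wins", "gain", "popular", "support"}
NEGATIVE = {"weak", "lose", "loses", "scandal", "unpopular", "oppose"}

def sentiment_score(text: str) -> float:
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

articles = [
    "The candidate notched a strong win and enjoys broad support.",
    "A new scandal leaves the campaign looking weak and unpopular.",
]
for a in articles:
    print(f"{sentiment_score(a):+.3f}  {a}")
```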

It’s true that perceptions in a democracy matter a lot. But it’s also true that getting a more accurate read on perceptions is not going to move us toward more informative coverage, let alone toward better politics. Worse still, these proposals ignore the fact that public perception is heavily affected by media coverage, which implies that pulling public perception more explicitly into the coverage itself will just introduce reflexivity rather than clarification.

In other words, we could end up with a conversation about the conversation about the conversation about politics. Is that really what we need?

As I see it, there are two precedents here, neither of which is encouraging. Financial markets have been treated as a source of perfect information for a very long time. The most famous justification for this was Hayek’s claim that the price system inherent in markets acts as “a system of telecommunications” that condenses the most relevant information from millions of agents into a single indicator. Even if we accept this as being true when Hayek wrote his essay in 1945 (which we shouldn’t), it’s certainly not true now. That’s in part because financial markets have attracted more and more speculators who base their decisions on their expectations of what others will do rather than introducing new information. So rather than informational efficiency, we get informational cascades, herding and periodic crashes.

The other example is consumer markets, which have the most experience with sentiment analysis for obvious reasons. In fact, this analysis is only the latest service offered by an enormous industry of advertising, PR and the like that exists solely to engineer and harness these waves of sentiment and perception. Their success proves that perception doesn’t exist in some objective void, but is closely shaped by the process of thinking about and consuming the very products it’s attached to. Or to be wonky about it, preferences can be more endogenous than exogenous in a consumer society.

Which is ultimately my point. If we want to treat the information provided by the media – the primary source of information for our democracy – as a more and more finely tuned consumer good whose value is determined by how popular it is, then this sort of analysis is emphatically the way to go. But we should not be surprised by the consequences if we do.
