Thursday, February 26, 2015

Political Collapse in Wisconsin

I normally don't talk state politics, but lately you can't go anywhere without hearing that Scott Walker has become the frontrunner for the Republican nomination for president by running Wisconsin into the ground. Seriously, despite all the corruption probes, Walker's term has been, from a purely objective, nonpolitical standpoint, an unmitigated disaster.

We lost an opportunity to invest in new infrastructure - the light rail money went to California instead, and the jobs left the state. Job growth has been anemic. The very specific promise Walker made for job growth (250,000 new jobs!!!) was conveniently forgotten. The emphasis on busting unions and holding down wages has - surprise, surprise - not caused the state economy to blossom, and the state is even back in the red, despite cutting services and raising fees, because Walker has doled out so many tax cuts to the very rich.

So here is a list of the most recent hits, starting with a $220 million subsidy to the billionaires of the professional sports cartel while cutting $300 million from my alma mater, the University of Wisconsin:
Meanwhile in Wisconsin, Gov. Scott Walker continues his efforts to carve up the state piecemeal. Walker is one of those governors and Republican presidential maybe-contenders whose entire reason for governance can be summed up as We Can't Have Nice Things Anymore.

    Gov. Scott Walker's plan to cut the University of Wisconsin System by $300 million over two years would likely lead to layoffs, but closing campuses is not on the table at this time, top school officials said Tuesday.

"At this time" is an interesting hedge, but the central premise is that the state university ought to be able to absorb budget cuts of $300 million over two years by eliminating "waste," which is a talking point that predates Scott Walker's birth by roughly an eon or so. Eliminating this waste, by which we mean people and services, will be accomplished in the usual Scott Walker way. The universities will be detached from state government and therefore the labor rules that regulate them, which will free up Walker supporters on the Board of Regents as they implement those rules in a manner more fitting of a kleptocracy.... 
If you are wondering why the Wisconsin university system needs to have its budget slashed by $300 million over the next two years, the proximal cause is that the previous Scott Walker tax cuts, which were going to pay for themselves what with the resulting boom in state growth caused by all the wealthy people needing new yachts to sail down the wider streets of Madison, instead continue to hemorrhage money. And we need that money to build rich people a new sporting arena.

    Calling his plan a "common-sense, fiscally conservative approach," Gov. Scott Walker on Tuesday said new growth in income tax revenue from Milwaukee Bucks players, employees and visiting teams will generate enough money to cover debt payments on $220 million in state-issued bonds for a new arena. [...]

    A new arena had been expected to cost $400 million to $500 million. Based on the new calculations, it appears the cost will be at least $500 million.

    "There's absolute security for the taxpayers," Walker said. "No new taxes, no drawing on existing revenues, no exposure to the future..."


Because that $220 million of new state bonds are expected to pay for themselves, you see, what with the great gobs of money Milwaukee Bucks players will personally be paying in state income taxes, plus those hotel rooms for the other teams and whatnot. It's like printing money, and forking over hundreds of millions of dollars to build extravagant sporting venues for wealthy sporting team owners is a sure thing that has never, ever gone wrong, and Scott Walker is a Republican financial brain-genius whose past predictions on these things have been so spot-on that people can hardly contain their admiration for their man or wait for him to work his budgetary money-making voodoo on all of the rest of America.

Educating Wisconsin's residents, however, is just a gigantic money drain. How does getting an education help anyone? How does having a college-educated workforce help Wisconsin, especially when compared to the revenues that could be gotten by wooing hotel visits by visiting sporting teams? No sir, you young people and educated people need to learn that we in America can no longer afford to have nice things these days. If you wanted a high-paying career or want to work in a well-appointed, state-funded venue that the politicians of the state can devote themselves to building, you should have done something decent with yourself and become a member of a prominent sporting team and/or the owner of a sporting team. You know, a real career. Grab a ball, junior, and hope you make the cut.
Gov. Scott Walker seeks $300 million in university cuts, but $220 million to build Bucks a new arena (Daily Kos)  So $500 million for a sporting arena is a "sound investment," but connecting the state's capital and largest city by rail in a state whose climate is cold and snowy is a bad investment. Welcome to Tea Party America.

Scott Walker proposes big cut to University of Wisconsin System (MSNBC) And of course, not content with that, he exchanged the "Wisconsin Idea" of betterment of the human race for "meeting the state's workforce needs." None too subtle, eh?
It was not enough for Gov. Scott Walker of Wisconsin suddenly to propose a destructive 13 percent cut in state support for the University of Wisconsin’s widely respected system. His biennial budget plan, released Tuesday, reached gratuitously into the university’s hallowed 111-year-old mission statement to delete a bedrock principle: “Basic to every purpose of the system is the search for truth.”

The budget — patently tailored for the governor’s conservative campaign for the Republican presidential nomination — inserted language that the university should be more narrowly concerned with meeting “the state’s work force needs.”

Brazenly deleted as well from the mission statement, which is nationally appreciated in education circles as the Wisconsin Idea, were the far from controversial goals “to educate people and improve the human condition” and “serve and stimulate society.” It was as if a trade school agenda were substituted for the idea of a university.
Gov. Walker’s ‘Drafting Error’ (NYTimes) Because we certainly don't want people to think, now do we? Walker and his ilk would never get elected.

About those tax cuts and red ink:
Wisconsin Gov. Scott Walker touts the generous tax cuts he's pushed through since 2010 to bolster his image as one of the 2016 GOP presidential field’s most high-profile fiscal conservatives. (One economically conservative activist told Slate's Betsy Woodruff that Walker's 2014 gubernatorial election was more important to him than every other election in the country combined.) But those tax cuts have not created the hoped-for economic growth, and even after big reductions in public spending, Wisconsin is in the midst of a budget crisis: Bloomberg reported Wednesday that the Walker administration will skip a debt payment of $108 million that is due in May.

Spokesman Cullen Werwie told Bloomberg that the state will restructure its debt obligations to avoid default, but the delay will result in a substantial increase in the cost of the loan for Wisconsin taxpayers.  
Scott Walker, Fiscal Responsibility Candidate, Orders His State to Skip Debt Payment (Slate)

But I'm sure the new agenda will bring economic growth - "right to work" legislation, voter ID laws, and drug testing welfare recipients (it will actually cost the state more money to demonize welfare recipients, putting us further in the red):
Scott Walker, the governor of Wisconsin who is considering a Republican presidential run, has promised to sign into law an anti-union bill targeted at the state’s private sector workers that is an almost verbatim copy of model legislation devised by an ultra-rightwing network of corporate lobbyists.

On Friday, Walker dropped his earlier opposition to a so-called “right to work” bill, which he had described as a “distraction”, signalling that he would sign it into law should it succeed in passing the Wisconsin legislature. Republican members are rushing through the provision, which would strip private sector unions of much of their fee-collecting and bargaining powers.
Wisconsin anti-union bill is 'word for word' from rightwing lobbyist group (The Guardian)
Are Food Stamp recipients in Wisconsin failing to get jobs because they are on drugs? That’s the story Assembly Speaker Robin Vos (R-Rochester) is peddling. He favors drug testing all working age adult welfare and food stamp recipients to send a message to recipients to “get yourself productive and stop asking the taxpayers to help subsidize your lifestyle,” as he put it.

Gov. Scott Walker has also embraced this proposal and wants to expand it to include all adults receiving Medicaid and unemployment benefits. “This is not a punitive measure. This is about getting people ready for work,” he noted. “I’m not making it harder to get government assistance, I’m making it easier to get a job.”

To the casual observer this might sound sensible. But the closer you look at the proposal the more nonsensical it appears. For starters, many recipients of Medicaid and even Food Stamps (the Supplemental Nutrition Assistance Program or SNAP) already have a job, but it pays so little they need assistance. Statistics show that as high as 58 percent of those getting food stamps are employed. As for Medicaid, the data shows that 83 percent of this funding goes to the elderly, disabled, or working poor. If the goal is to help the working poor and lower government subsidies, raising the minimum wage would accomplish far more than giving a drug test.

The idea that public assistance recipients are prone to drugs or have a “lifestyle” problem, as Vos puts it, has been disproven in other states that tried testing. In Florida, just two percent of welfare recipients failed drug tests of their urine, compared to 8 percent of the general population that uses illegal drugs. In Tennessee, just one of 800 tested had a positive result, a rate of 0.12 percent.

Then there is the cost of doing drug screens. In Florida, it cost $30, and the cost of the tests has pretty much equalled the small savings realized by throwing a tiny percentage of welfare recipients off the rolls.
Hey, Let’s Drug Test Food Stamp Recipients (Urban Milwaukee) I've been on unemployment in Wisconsin. Nice to know that next time I'll have to take the bus downtown and regularly pee in a cup in between those job searches!
In many respects, the point of Walker’s anti-union crusade was to destroy the electoral muscle of the main opposition to his conservative agenda. But the most important impact of the creeping death of public unions in Wisconsin may be on take-home pay.

The Washington Post didn’t take note of this, but according to the Census Bureau’s American Community Survey, median household income in Wisconsin is $51,467 a year, nearly $800 below the national average. And it has fallen consistently since the passage of the anti-union law in 2011, despite a small bounce-back nationally in 2013. The Bureau of Economic Analysis puts Wisconsin in the middle of the pack on earnings growth, despite a fairly tight labor market with a headline unemployment rate of 5.2 percent.

This actually undercounts the problem a bit, because it doesn’t cover total compensation. For example, in the wake of the anti-union law, public employees lost the equivalent of 8-10 percent in take-home pay because of increased contributions to healthcare and pension benefits.

Moreover, the meager earnings growth that has come to Wisconsin has mostly gone to the top 1 percent of earners. Another Wisconsin Budget Project report shows that the state hit a record share of income going to the very top in 2012, a year after passage of the anti-union law. That doesn’t include the $2 billion in tax cuts Walker initiated in his first term, which went disproportionately to the highest wage earners. (This is precisely the agenda Walker is likely to run on in his presidential campaign.)
Scott Walker’s economic mess: How worker wages were gutted in Wisconsin (Salon)

Expect more of that. Here's what he's doing while we're paying his salary: Scott Walker to attend private dinner with supply-siders in New York (Washington Post)
Wisconsin Gov. Scott Walker is scheduled to attend a private dinner Wednesday with longtime advocates of supply-side economics. Economists Larry Kudlow, Arthur Laffer, and Stephen Moore will host Walker, according to several people with knowledge of the event. For decades, that trio of friends — all associated with President Ronald Reagan’s economic policies — have been high-profile proponents of using tax cuts to boost economic growth.
Yes, kids, there's no limit on how far you can go by being a shill for the rich! You can even be a college dropout, unlike working-class people.
Set aside, for a moment, his repeated refusal, in the past few days, to say whether he believes that President Obama loves America, or whether he believes that the President is a Christian, and look instead at Walker’s record running what used to be one of America’s more progressive states. Having cut taxes for the wealthy and stripped many of Wisconsin’s public-sector unions of their collective-bargaining rights, he is now preparing to sign a legislative bill that would cripple unions in the private sector. Many wealthy conservatives, such as the Koch brothers, who have funnelled a lot of money to groups supporting Walker, regard him as someone who’s turning his state into a showcase for what they want the rest of America to look like.
The Dangerous Candidacy of Scott Walker (The New Yorker) Wisconsin is now the Koch brothers' "laboratory."

This article delves a little deeper into the reason for Walker's success in Wisconsin, which, I can tell you from being here, is totally accurate. It's basically divide-and-conquer, along with mining the bottomless pit of white racial grievance and directing anger against urban dwellers and minorities:
Walker’s home turf of metropolitan Milwaukee has developed into the most bitterly divided political ground in the country—“the most polarized part of a polarized state in a polarized nation,” as a recent series by Craig Gilbert in the Milwaukee Journal Sentinel put it. Thanks to a quirk of twentieth-century history, the region encompasses a heavily Democratic and African American urban center, and suburbs that are far more uniformly white and Republican than those in any other Northern city, with a moat of resentment running between the two zones. As a result, the area has given rise to some of the most worrisome trends in American political life in supercharged form: profound racial inequality, extreme political segregation, a parallel-universe news media. These trends predate Walker, but they have enabled his ascent, and his tenure in government has only served to intensify them. Anyone who believes that he is the Republican to save his party—let alone win a presidential election—needs to understand the toxic and ruptured landscape he will leave behind.
Scott Walker's Toxic Racial Politics (New Republic) A journey through the poisonous, racially divided world that produced a Republican star. Long, but a must-read. I can't escape from Wisconsin soon enough!

Incidentally, to see an alternative style of governance in action, see neighboring Minnesota: This Billionaire Governor Taxed the Rich and Increased the Minimum Wage -- Now, His State's Economy Is One of the Best in the Country (Huffington Post) The difference is, of course, that because the governor of Minnesota is wealthy and not a college dropout, he does not have to climb the greasy pole by being an empty shill for plutocrats.

Tuesday, February 24, 2015

Competition Is Wasteful

Related to last time, this article on expenditure cascades, Why have weddings and houses gotten so ridiculously expensive? Blame inequality (Vox), talks about the ways that positional goods and extreme inequality cause waste in various economies:
The median new house in the US is now 50 percent larger than it was in 1980, even though the median income has grown only slightly in real terms. Houses are growing faster than incomes because of a process I call "expenditure cascades."

Here's how it works. People at the top begin building bigger houses simply because they have more money. Perhaps it's now the custom for them to have their daughters' wedding receptions at home, so a ballroom is now part of what defines adequate living space. Those houses shift the frame of reference for the near-wealthy — who travel in the same social circles — so they, too, build bigger.

But as the near-wealthy begin adding granite countertops and vaulted ceilings, they shift the frames of reference that define adequate for upper-middle class families. And so they begin going into debt to keep pace. And so it goes, all the way down the income ladder. More spending by the people who can afford it at the top ultimately creates pressure for more spending by people who can't afford it at the bottom.

The best response might seem to be simply to exhort people to summon more discipline, except for the fact that...
3) The costs of failure to keep pace with community spending norms are not just hurt feelings

The process I describe isn't a law. Congress isn't mandating that people buy bigger houses. So if it's optional, why don't people simply opt out? Because opting out entails real costs that are extremely hard to avoid.

Failure to keep pace with what peers spend on housing means not just living in a house that seems uncomfortably small. It also means having to send your children to inferior schools. A "good" school is a relative concept, and the better schools are almost always those in more expensive neighborhoods.

Here's the toil index, a simple measure I constructed to track one important cost of inequality for middle-income families. To send their children to a school of at least average quality, median earners must buy the median-priced home in their area. The toil index plots the monthly number of hours the median earner must work to achieve that goal. When incomes were growing at the same rate for everyone during the post-World War II decades, the toil index was almost completely stable. But income inequality began rising sharply after 1970, and since then the toil index has been rising in tandem. It's now approximately 100 hours a month, up from only 42 hours in 1970.

The median real hourly wage for men in the US is actually lower now than in the 1980s. If middle-income families must now spend more than before to achieve basic goals, how do they manage? Census data reveal clear symptoms of increasing financial distress among these families. Of the 100 largest US counties, those where income inequality grew most rapidly were also those that experienced the largest increases in three important symptoms of financial distress: divorce rates, long commutes, and bankruptcy filings.

In OECD countries, higher inequality is associated with longer work hours, both across countries and over time. Standard economic models predict none of these relationships.

4) Positional concerns spawn wasteful spending patterns, even when everyone is well-informed and rational

Charles Darwin, the great British naturalist, was heavily influenced by Adam Smith and other economists. He saw that competition in nature, like competition in the marketplace, often produced benefits for both individuals and larger groups, just as in Smith's fabled Invisible Hand theory. Keen eyesight in hawks, for example, made both individual hawks and hawks as a species more successful. Yet Darwin also saw clearly that many traits and behaviors helped individuals at the expense of larger groups. When success depends on relative position, as it almost always does in competitive struggles, wasteful positional arms races often result.

Consider the antlers in modern bull elk, which span four feet and weigh as much as 40 pounds. Because they impair mobility in wooded areas, bulls are more easily surrounded and killed by wolves. So why doesn't natural selection favor smaller antlers? Darwin's answer was that large antlers evolved because elk are a polygynous species, meaning that males take more than one mate if they can. But if some take multiple mates, others are left with none. That's why males fight so bitterly with one another for access to females. Mutations that coded for larger antlers spread quickly because they made any bull that had them more likely to win.

But bulls as a group would be better off if each animal's antlers were smaller by half, since they'd be less vulnerable to predators, and each fight would be decided as before. The inefficiency in such positional arms races is exactly analogous to the inefficiency of military arms races. It's also like when everyone stands to get a better view: no one sees any better than if all had remained comfortably seated.

Beyond some point, additional spending on mansions, coming-of-age parties, and many other goods becomes purely positional, meaning that it merely raises the bar that defines adequate. Because much of the total spending in today's economy is purely positional, it is wasteful in the same way that military arms races are wasteful.
This is one reason highly unequal societies are always less successful than more equal ones. Add to that the fact that human capital is developed much more broadly in equal societies, so there is opportunity for everyone rather than just a hereditary elite.

The article mentions that the rich, too, would be better off without having to keep up with this arms race. Higher taxes would accomplish this with no real downside, since the additional income of the rich is now used for competition with other elites rather than productive investment, and it drives up costs for everybody (especially for housing). Of course it will never happen, since the American rich are uniquely sociopathic, and will never rest until they have every penny and have to step over the dead bodies to get to their twentieth mansion.
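
To make the cascade mechanism concrete, here is a minimal toy simulation in Python (my own sketch, not Frank's actual model - the incomes, the 30 percent baseline spending share, and the 0.4 "emulation" weight are all made-up parameters): each tier sets its housing spending partly by its own income and partly by what the tier above it spends.

    # Toy expenditure cascade (illustrative sketch, not Frank's model).
    # Each tier spends a blend of its own income and the spending of the tier above.

    def cascade(incomes, emulation=0.4, base_share=0.3):
        """Housing spending per income tier, top tier first."""
        spending = [incomes[0] * base_share]  # the top tier anchors the cascade
        for income in incomes[1:]:
            own = income * base_share              # what this tier would spend in isolation
            keeping_up = emulation * spending[-1]  # pressure from the tier above
            spending.append(own * (1 - emulation) + keeping_up)
        return spending

    tiers_then = [200, 100, 60, 40, 30]  # stylized incomes, top to bottom
    tiers_now = [500, 110, 63, 41, 30]   # only the top tier has grown much

    for label, tiers in (("then", tiers_then), ("now", tiers_now)):
        shares = [s / i for s, i in zip(cascade(tiers), tiers)]
        print(label, [round(x, 2) for x in shares])  # spending as a share of income

The bottom tier's income doesn't move between the two runs, yet its spending share climbs from 42 to 51 percent of income anyway - that ratchet is the cascade. Frank's toil index expresses the same squeeze in hours: the monthly cost of the median-priced home divided by the median hourly wage (illustratively, a $1,500-a-month home at a $15 wage works out to the 100 hours he reports).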

Sunday, February 22, 2015

When Competition Kills

At some start-ups, Friday is so casual that it’s not even a workday (Washington Post)
Treehouse is closed every Friday. The 80-and-counting employees work a 32-hour work week Monday through Thursday. On Fridays, employees are expected to be home, with their families, having fun, doing something, anything, other than work.

That’s exactly how founder and chief executive Ryan Carson, 37, has been working since 2005. These days, on Fridays, he gets his two young sons off to school and spends the day hanging out with his wife, Gill. “It’s like dating again. We go to coffee shops. We read books together. I really feel like I’m involved in my kids’ lives and my wife’s life,” Carson said. “This schedule has been absolutely life-changing for me. I can’t imagine anything more valuable.”  
Sounds pretty good, right? What's the problem?
Books, business journals and news stories abound detailing the insanely long hours tech workers put in, sleeping under desks, napping in broom closets and missing parents’ funerals and children’s births. Yahoo CEO Marissa Mayer has said she worked 130 hours a week at Google. And in a post that went viral, Michael Arrington, founder of TechCrunch and now a tech investor, wrote: “Startups are hard. So work more, cry less, and quit all the whining.”
“As far as I’m concerned, working 32 hours a week is a part-time job,” Arrington said in an interview. “I look for founders who are really passionate. Who want to work all the time. That shows they care about what they’re doing, and they’re going to be successful.”
Others cite the need to best the competition and meet investors’ demands.
While Mike McDerment no longer works himself into the ground like he did when he moved into his parents’ Toronto basement to launch his start-up, he still views long hours as necessary. 
“It’s really about survival,” said McDerment, CEO of FreshBooks, a cloud-based accounting software firm. “It’s this kind of us-against-the -world mentality. You’re competing with multibillion-dollar companies that want to crush you out of existence. You have to work your ass off just to survive until tomorrow.” 
Treehouse founder Carson says he worked around the clock when he was starting out as a Web developer for a start-up in the United Kingdom. 
“I remember trying to sleep under my desk for 20 minutes, and feeling delirious and frustrated,” he said. “And then, getting to the end of that and realizing it was all for nothing. We were building a Web site for some tobacco company. That’s when I realized, this is where it goes, if you work hard and never stop. It’s like being on the treadmill and not even understanding why you’re there.” 
Indeed, Shikhar Ghosh, a Harvard business school lecturer, studied 2,000 tech companies that started up between 2004 and 2010 and found that 90 to 95 percent fail. “Failure is the norm,” he has said.
"Us against the world?" Twenty-minute naps under your desk? Delirious and frustrated? Companies that want to "crush you out of existence." Working just to "survive another day?"

Gee, that sounds fucking awful. It sounds more like Mutual of Omaha's Wild Kingdom, red in tooth and claw, than running any kind of business providing goods and services. What is the point of this again?

Is any of this really necessary? What the fuck has happened to the business world? And why does anyone think this is a good thing? It didn't used to be this way. Who benefits from this state of affairs? Based on the above, certainly not the workers or the owners. I don't really need anything these people are selling. So if it's not benefiting the workers, owners, or consumers, why exactly is working in business such a sheer hell?

It seems like this competition economy isn't making people better off; it's destroying us. We're working ourselves literally to death just so we can get another fucking iPhone app? Maybe they can make an app to monitor employees' deteriorating physical and mental health.

And it's not just the business world, it starts in schools:
Anjan Chatterjee, a professor of neurology at the University of Pennsylvania, who has published several influential papers on the ethics of smart drugs, tells me that he sometimes makes jokes about it. “When I was young, students would use drugs to check out. Now they’re using them to check in.”

He’s witnessed the rise, in the last 10 years, of a generation of American students doping themselves up on various medications they believe will give them a competitive edge. “It’s even in high schools now, especially in the more affluent suburbs. Students call them ‘study aids’; they don’t even think of them as drugs. There’s an entire grey market on campuses. But then, the current estimate is that a third of all students have a prescription for some sort of psychoactive medication anyway: antidepressants, or medication for ADHD, or for anxiety, so the availability is quite high. Often, they’ll just sell on the medication in the library.”

He believes that cognitive enhancement – or cosmetic neurology, as he calls it – is likely to become viewed as normal over time, in much the same way as cosmetic surgery has been. If it’s available, people will avail themselves of it. And his intuition “is that this use of drugs is not the cause of this sense of competition. It’s a phenomenon of it.” Smart drugs are part of a “parcel of broader cultural trends” that tap into something that is already within our culture. “And this is what does give me pause. It’s this relentless pursuit of productivity, and material productivity in particular that seems to be at the root of this. Going after drugs is a symptom of that underlying impulse.”
Students used to take drugs to get high. Now they take them to get higher grades (The Guardian)

The ultimate answer is, of course, that it is an arms race that benefits exactly no one. The example here is from Darwinian evolution, as the excerpt below details. Everyone would be much better off if we called off the escalating competition and took it easy, which would have no real downsides. The question is, will this happen in our winner-take-all, loser-gets-nothing economy, or will we end up with people drugging and microchipping themselves into 24-hour-a-day, always-working psychosis and decrying others as "weak" and therefore worthy of elimination?
[Adam] Smith never believed that the invisible hand guaranteed good outcomes in all circumstances. His skepticism was on full display, for example, when he wrote, “People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.” To him, what was remarkable was that self-interested actions often led to socially benign outcomes.

Like Smith, modern progressive critics of the market system tend to attribute its failings to conspiracies to restrain competition. But competition was much more easily restrained in Smith’s day than it is now. The real challenge to the invisible hand is rooted in the very logic of the competitive process itself.

Charles Darwin was one of the first to perceive the underlying problem clearly. One of his central insights was that natural selection favors traits and behaviors primarily according to their effect on individual organisms, not larger groups. Sometimes individual and group interests coincide, he recognized, and in such cases we often get invisible hand-like results. A mutation that codes for keener eyesight in one particular hawk, for example, serves the interests of that individual, but its inevitable spread also makes hawks as a species more successful.

In other cases, however, mutations that help the individual prove quite harmful to the larger group. This is in fact the expected result for mutations that confer advantage in head-to-head competition among members of the same species. Male body mass is a case in point. Most vertebrate species are polygynous, meaning that males take more than one mate if they can. The qualifier is important, because when some take multiple mates, others get none. The latter don’t pass their genes along, making them the ultimate losers in Darwinian terms. So it’s no surprise that males often battle furiously for access to mates. Size matters in those battles, and hence the evolutionary arms races that produce larger males.

Elephant seals are an extreme but instructive example. Bulls of the species often weigh almost six thousand pounds, more than five times as much as females and almost as much as a Lincoln Navigator SUV. During the mating season, pairs of mature bulls battle one another ferociously for hours on end, until one finally trudges off in defeat, bloodied and exhausted. The victor claims near-exclusive sexual access to a harem that may number as many as a hundred cows. But while being larger than his rival makes an individual bull more likely to prevail in such battles, prodigious size is a clear handicap for bulls as a group, making them far more vulnerable to sharks and other predators.

Given an opportunity to vote on a proposal to reduce every animal’s weight by half, bulls would have every reason to favor it. Since it’s relative size, not absolute size, that matters in battle, the change would not affect the outcome of any given head-to-head contest, but it would reduce each animal’s risk of being eaten by sharks. There’s no practical way, of course, that elephant seals could implement such a proposal. Nor could any bull solve this problem unilaterally, since a bull that weighed much less than others would never win a mate.

Similar conflicts pervade human interactions when individual rewards depend on relative performance. Their essence is nicely captured in a celebrated example by the economist Thomas Schelling. Schelling noted that hockey players who are free to choose for themselves invariably skate without helmets, yet when they’re permitted to vote on the matter, they support rules that require them. If helmets are so great, he wondered, why don’t players just wear them? Why do they need a rule?

His answer began with the observation that skating without a helmet confers a small competitive edge—perhaps by enabling players to see or hear a little better, or perhaps by enabling them to intimidate their opponents. The immediate lure of gaining a competitive edge trumps more abstract concerns about the possibility of injury, so players eagerly embrace the additional risk. The rub, of course, is that when every player skates without a helmet, no one gains a competitive advantage—hence the attraction of the rule.

As Schelling’s diagnosis makes clear, the problem confronting hockey players has nothing to do with imperfect information, lack of self-control, or poor cognitive skills—shortcomings that are often cited as grounds for government intervention. And it clearly does not stem from exploitation or any insufficiency of competition. Rather, it’s a garden-variety collective action problem. Players favor helmet rules because that’s the only way they’re able to play under reasonably safe conditions. A simple nudge—say, a sign in the locker room reminding players that helmets reduce the risk of serious injury—just won’t solve their problem. They need a mandate.

What about the libertarians’ complaint that helmet rules deprive individuals of the right to choose? This objection is akin to objecting that a military arms control agreement robs the signatories of their right to choose for themselves how much to spend on bombs. Of course, but that’s the whole point of such agreements! Parties who confront a collective action problem often realize that the only way to get what they want is to constrain their own ability to do as they please.
The Darwin Economy – Why Smith’s Invisible Hand Breaks Down (Farnam Street)
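
Schelling's helmet story is a textbook two-player collective action game, and you can see the trap in miniature. Here's a sketch in Python (the payoff numbers are my own illustrative assumptions, not from the excerpt): going bare-headed buys a relative edge that outweighs the perceived injury risk, so "bare" dominates for each player individually, yet both players prefer the all-helmet outcome - which only a rule can deliver.

    # Schelling's hockey-helmet problem as a 2x2 game.
    # EDGE and RISK are illustrative assumptions: the perceived competitive
    # edge of skating bare-headed must exceed the perceived injury cost,
    # or players wouldn't shed helmets in the first place.

    EDGE = 3  # relative advantage over a helmeted opponent
    RISK = 2  # personal injury cost of going bare-headed

    def payoff(me, rival):
        if (me, rival) == ("bare", "helmet"):
            edge = EDGE
        elif (me, rival) == ("helmet", "bare"):
            edge = -EDGE
        else:
            edge = 0  # relative edges cancel when both choose alike
        return edge - (RISK if me == "bare" else 0)

    for me in ("helmet", "bare"):
        for rival in ("helmet", "bare"):
            print(f"me={me:7} vs rival={rival:7} -> payoff {payoff(me, rival):+d}")

    # "bare" strictly dominates (+1 beats 0 against a helmeted rival, -2 beats -3
    # against a bare one), yet both-bare (-2 each) is worse than both-helmet
    # (0 each). No individual choice escapes the trap; only a mandate does.

Swap "helmet" for "smaller antlers" or "fewer bombs" and the matrix is unchanged; that's the sense in which all positional arms races are the same problem.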

Welcome to the Brave New World. But what do I know, I'm just Darwinian roadkill.

Friday, February 20, 2015

Republican Extremism

I normally don't comment on party politics, but wow, have things gotten extreme.

The Republicans are battling tooth-and-nail to remove access to health care for the poorest Americans:
...Republicans are attempting one of the most brazen manipulations of the legal system in modern times. To pull it off, they’re relying on a toxically politicized judiciary to make law, and to make a mockery of everything that conservative legal scholars profess to believe.
In less than two weeks’ time, the Supreme Court will hear oral arguments in King v. Burwell — the net result of a well-orchestrated, well-financed, five-year campaign to kill President Obama’s signature achievement by legal assassination. It’s a remarkably flimsy case, the plaintiffs may lack standing, and a host of business and health care professionals have said the consequences of backing the right-wing consortium behind this case could be catastrophic.

But none of that matters to at least four justices on the court who would rule in favor of a ham sandwich, if it meant overturning the health care law. If they get a fifth vote, more than eight million people in 34 states could lose their health coverage. Premiums for several million more would rise enough to make insurance impossible. Thousands of people, lacking basic care, may even die prematurely.

“The Supreme Court is going to render a body blow to Obamacare from which I don’t think it will ever recover,” said Senate Majority Whip John Cornyn of Texas last month. He was licking his chops in anticipation.
They are also battling to stop the minimum wage from increasing:
A network of Republican lawmakers and their rightwing corporate funders are battling behind closed doors to block minimum wage increases in cities across the US, in a step-by-step counter-attack that could cut back the incomes of millions of Americans despite an economic upswing. 
According to strategic details obtained by the Guardian, the American Legislative Exchange Council (Alec) – along with its localised sister organization, ACCE – is trying to prevent elected city representatives from raising the minimum wage to levels above those set by their states. The group has launched an aggressive dual-track mission that combines legislation and litigation in what Alec calls a “new battleground” over worker compensation.
Citizens of industrialized nations throughout the world enjoy universal access to healthcare and reasonable minimum wages. But not in America, the "leader" of the free world and defender of democracy.

They are also going to war against academia:
The fate of the 17-campus public university system was bound to be affected: While many here take pride in its carefully cultivated rise to the top tier of American public education, conservatives have long groused about some campuses, particularly the flagship school at Chapel Hill, as out-of-touch havens of liberalism. 
Since the recession began, the state government has also subjected the system to budget cuts leading to the loss of hundreds of positions. 
Twenty-nine of the 32 university board members were appointed by the Legislature after the Republicans’ 2010 gains. Last year, lawmakers instructed the board to consider redirecting some of the funding that goes to the system’s 240 centers and institutes, which focus on topics ranging from child development to African studies. 
The advisory group’s report, which is likely to be considered by the full Board of Governors next Friday, recommends closing the Center on Poverty, Work and Opportunity at Chapel Hill; North Carolina Central University’s Institute for Civic Engagement and Social Change; and East Carolina University’s Center for Biodiversity.
And they are even attempting to prevent AP courses from being taught to students because those courses contain more than simple-minded propaganda:
This week an Oklahoma legislative committee voted overwhelmingly to effectively ban the teaching of Advanced Placement U.S. history classes. The bill’s author, Rep. Dan Fisher (R), said that state funds shouldn’t be used to teach the course — which students can take to receive college credit — because he believes it emphasizes “what is bad about America” and characterizes the United States as a “nation of oppressors and exploiters.” Fisher’s proposal to replace the ready-made, nationally used, college-recognized AP curriculum — studied by hundreds of thousands of high school students each year — with a homegrown substitute would cost the state an estimated $3.8 million. 
After facing national criticism, Fisher withdrew his bill this week and said he plans to submit a new one requiring a state “review” of the AP course rather than its complete defunding. 
But Oklahoma is far from alone in wanting to reinvent the wheel by creating its own, allegedly more patriotic version of advanced coursework. Policymakers in Georgia, Texas, South Carolina, North Carolina and Colorado have agitated to scrap or doctor the AP course, citing its “liberal bias” and supposed focus on U.S. “blemishes.” The Republican National Committee likewise called on Congress last year to withhold funding from the nonprofit that developed the course, the College Board, because its AP course “emphasizes negative aspects of our nation’s history while omitting or minimizing positive aspects.” In Colorado, where a local school board proposed revamping the AP curriculum to make sure it does “not encourage or condone civil disorder [or] social strife,” some brave students decided to demonstrate the virtues of civil disorder and social strife by peacefully protesting. 
In some states, U.S. history isn’t the only AP course to come under attack. In both Oklahoma and Kansas, legislators have threatened to bar or defund any curriculum not developed locally, which could disqualify all AP and International Baccalaureate classes from being taught in the public schools.
Of course, say it with me, "both sides are equally extreme." That is the mantra we must constantly repeat in the U.S. because eventually it will become the truth, apparently.

You know, I hear all the time from various pundits about the supposedly extremist regimes around the world, whether it's Erdogan in Turkey, Orban in Hungary, Maduro in Venezuela or the celebrated foe of the moment, Vladimir Putin, as well as all the so-called "far right" parties in Europe (none of whom hold real power). Yet the Republicans are arguably the most powerful party in the United States and (less) arguably one of the most right-wing, extremist and radical in the world. Why are they never mentioned in the same breath as those names above?

Sunday, February 15, 2015

Is Nutrient Deficiency the Cause of Social Breakdown?


While researching this post, I came across this article which detailed just how depleted of vitamins and minerals our vegetables are. I had heard some statistics before, but these are truly frightening:
A landmark study on the topic by Donald Davis and his team of researchers from the University of Texas (UT) at Austin’s Department of Chemistry and Biochemistry was published in December 2004 in the Journal of the American College of Nutrition. They studied U.S. Department of Agriculture nutritional data from both 1950 and 1999 for 43 different vegetables and fruits, finding “reliable declines” in the amount of protein, calcium, phosphorus, iron, riboflavin (vitamin B2) and vitamin C over the past half century. Davis and his colleagues chalk up this declining nutritional content to the preponderance of agricultural practices designed to improve traits (size, growth rate, pest resistance) other than nutrition.
“Efforts to breed new varieties of crops that provide greater yield, pest resistance and climate adaptability have allowed crops to grow bigger and more rapidly,” reported Davis, “but their ability to manufacture or uptake nutrients has not kept pace with their rapid growth.” There have likely been declines in other nutrients, too, he said, such as magnesium, zinc and vitamins B-6 and E, but they were not studied in 1950 and more research is needed to find out how much less we are getting of these key vitamins and minerals.
The Organic Consumers Association cites several other studies with similar findings: A Kushi Institute analysis of nutrient data from 1975 to 1997 found that average calcium levels in 12 fresh vegetables dropped 27 percent; iron levels 37 percent; vitamin A levels 21 percent, and vitamin C levels 30 percent. A similar study of British nutrient data from 1930 to 1980, published in the British Food Journal, found that in 20 vegetables the average calcium content had declined 19 percent; iron 22 percent; and potassium 14 percent. Yet another study concluded that one would have to eat eight oranges today to derive the same amount of Vitamin A as our grandparents would have gotten from one.
Soil Depletion and Nutrition Loss (Scientific American)

And then I ran across this fascinating article in the context of America's imprisonment spree: Is Pellagra the Root Cause of Violent Shooting Rampages?
Dr. Weston A. Price, a researcher in the 1930s, found that primitive tribes eating a whole-foods, natural diet high in animal foods and animal fat had no need for prisons. The moral character of these isolated people was strong. They were not incapacitated mentally or physically. In his book, Nutrition and Physical Degeneration, Price describes his travels around the globe, and he marveled at the stellar character of these people who had no access to modern manufactured foods.

On page 486 of Nourishing Traditions: The Cookbook that Challenges Politically Correct Nutrition and the Diet Dictocrats, is a shocking clue to the mystery of the scourge of violence amongst young people.

    “While pellagra was being investigated as an interesting curiosity in Europe, it was becoming a way of life in the Southern United States…The general diet consisted of cornmeal and grits, soda biscuits, corn syrup and salt pork; and even when they had enough bulk of food, the Southerners developed sore skin and mouths, became thin and listless, and suffered from depression, hallucinations, irritability and other mental disorders.

    The clinical description of the typical poor Southerner, any time between about 1900 and 1940, comes alive in the novels of William Faulkner–the brooding sullenness, suddenly shattered by outbursts of irrational anger, persecution, mania, the feeling of people living in a cruel and demented world of their own…Doctors knew very well that diet was at the bottom of all the misery they saw around them, and that disease could be kept at bay by a balanced food supply…”

Compare the modern junk food diet to the diet of poor Southerners: cereals, food bars, corn chips, crackers, and the high fructose corn syrup found in energy drinks and sodas. Not too dissimilar!

Vitamin B3 or niacin deficiency is the cause of pellagra. When I googled pellagra and violence, sure enough I found a letter to a U.S. Senator by Barbara Stitt, an author who once worked as a probation officer. She found that changing the diet of ex-offenders eliminated the hostility and other symptoms that would lead them to act out in a criminal fashion.

Her book is aptly titled Food & Behavior: A Natural Connection, and her work seems to confirm the findings of Dr. Weston A. Price on nutritional injury and the role it plays in juvenile delinquency and adult crimes.

A review of Barbara’s book mentions her concern about reactive hypoglycemia, sub-clinical pellagra and vitamin B deficiencies being at the root of violent criminals’ actions.

Check out this revealing quote from the review:

    “The startling part of sub-clinical pellagra, like hypoglycemia, is that the symptoms also mirror those of schizophrenia, a problem so widespread that those who suffer from it occupy one out of every four hospital beds in the United States.”
Is Pellagra the Root Cause of Violent Shooting Rampages? (Hartke Is Online)

One is forced to wonder whether the junk-food diet fed to America's poor classes is at least partially responsible for the deteriorating state of the poor in America today. Note that most crime is in so-called "food deserts." Note that McDonald's junk food is primarily marketed to children. Go into any McDonald's, particularly in urban ghettos, and you will see poorer families using McDonald's as both cook and babysitter for their litter. The Little Caesars in the strip mall near my house always has a line literally out the door on weekends. Observing these people, they are almost always lower-class families with lots of young children, and five-dollar pizzas are sadly the only way they can afford to feed their families.

Here's an article on the same subject by the Guardian:
That Dwight Demar is able to sit in front of us, sober, calm, and employed, is "a miracle", he declares in the cadences of a prayer-meeting sinner. He has been rocking his 6ft 2in bulk to and fro while delivering a confessional account of his past into the middle distance. He wants us to know what has saved him after 20 years on the streets: "My dome is working. They gave me some kind of pill and I changed. Me, myself and I, I changed."

Demar has been in and out of prison so many times he has lost count of his convictions. "Being drunk, being disorderly, trespass, assault and battery; you name it, I did it. How many times I been in jail? I don't know, I was locked up so much it was my second home."

Demar has been taking part in a clinical trial at the US government's National Institutes of Health, near Washington. The study is investigating the effects of omega-3 fatty acid supplements on the brain, and the pills that have effected Demar's "miracle" are doses of fish oil....

For the clinician in charge of the US study, Joseph Hibbeln, the results of his trial are not a miracle, but simply what you might predict if you understand the biochemistry of the brain and the biophysics of the brain cell membrane. His hypothesis is that modern industrialised diets may be changing the very architecture and functioning of the brain.

We are suffering, he believes, from widespread diseases of deficiency. Just as vitamin C deficiency causes scurvy, deficiency in the essential fats the brain needs and the nutrients needed to metabolise those fats is causing a host of mental problems from depression to aggression. Not all experts agree, but if he is right, the consequences are as serious as they could be. The pandemic of violence in western societies may be related to what we eat or fail to eat. Junk food may not only be making us sick, but mad and bad too.

Essential fatty acids are called essential because humans cannot make them but must obtain them from the diet. The brain is a fatty organ - it's 60% fat by dry weight - and the essential fatty acids make up part of its structure, accounting for 20% of the nerve cells' membranes. The synapses, or junctions where nerve cells connect with other nerve cells, contain even higher concentrations of essential fatty acids - being made of about 60% of the omega-3 fatty acid DHA.

Communication between the nerve cells depends on neurotransmitters, such as serotonin and dopamine, docking with receptors in the nerve cell membrane.

Omega-3 DHA is very long and highly flexible. When it is incorporated into the nerve cell membrane it helps make the membrane itself elastic and fluid so that signals pass through it efficiently. But if the wrong fatty acids are incorporated into the membrane, the neurotransmitters can't dock properly. We know from many other studies what happens when the neurotransmitter systems don't work efficiently. Low serotonin levels are known to predict an increased risk of suicide, depression and violent and impulsive behaviour. And dopamine is what controls the reward processes in the brain.

Laboratory tests at NIH have shown that the composition of tissue and in particular of the nerve cell membrane of people in the US is different from that of the Japanese, who eat a diet rich in omega-3 fatty acids from fish. Americans have cell membranes higher in the less flexible omega-6 fatty acids, which appear to have displaced the elastic omega-3 fatty acids found in Japanese nerve cells.

Over the last century most western countries have undergone a dramatic shift in the composition of their diets in which the omega-3 fatty acids that are essential to the brain have been flooded out by competing omega-6 fatty acids, mainly from industrial oils such as soya, corn, and sunflower. In the US, for example, soya oil accounted for only 0.02% of all calories available in 1909, but by 2000 it accounted for 20%. Americans have gone from eating a fraction of an ounce of soya oil a year to downing 25lbs (11.3kg) per person per year in that period. In the UK, omega-6 fats from oils such as soya, corn, and sunflower accounted for 1% of energy supply in the early 1960s, but by 2000 they were nearly 5%. These omega-6 fatty acids come mainly from industrial frying for takeaways, ready meals and snack foods such as crisps, chips, biscuits, ice-creams and from margarine. Alcohol, meanwhile, depletes omega-3s from the brain.

To test the hypothesis, Hibbeln and his colleagues have mapped the growth in consumption of omega-6 fatty acids from seed oils in 38 countries since the 1960s against the rise in murder rates over the same period. In all cases there is an unnerving match. As omega-6 goes up, so do homicides in a linear progression. Industrial societies where omega-3 consumption has remained high and omega-6 low because people eat fish, such as Japan, have low rates of murder and depression.
An earlier pilot study on 30 patients with violent records found that those given omega-3 supplements had their anger reduced by one-third, measured by standard scales of hostility and irritability, regardless of whether they were relapsing and drinking again. The bigger trial is nearly complete now and Dell Wright, the nurse administering the pills, has seen startling changes in those on the fish oil rather than the placebo. "When Demar came in there was always an undercurrent of aggression in his behaviour. Once he was on the supplements he took on the ability not to be impulsive. He kept saying, 'This is not like me'."

Demar has been out of trouble and sober for a year now. He has a girlfriend, his own door key, and was made employee of the month at his company recently. Others on the trial also have long histories of violence but with omega-3 fatty acids have been able for the first time to control their anger and aggression. J, for example, arrived drinking a gallon of rum a day and had 28 scars on his hand from punching other people. Now he is calm and his cravings have gone. W was a 19st barrel of a man with convictions for assault and battery. He improved dramatically on the fish oil and later told doctors that for the first time since the age of five he had managed to go three months without punching anyone in the head.
Omega-3, junk food and the link between violence and what we eat  (The Guardian)
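
Hibbeln's 38-country comparison boils down to correlating two series - seed-oil consumption and homicide rates. Here's a minimal sketch of that calculation in Python, with made-up placeholder numbers rather than his data:

    # Sketch of the cross-country comparison described above. The numbers are
    # invented placeholders for illustration only - NOT Hibbeln's data.

    from statistics import correlation  # Python 3.10+

    omega6_pct = [1.0, 2.5, 4.0, 8.0, 12.0, 20.0]   # % of calories from seed oils
    homicide_rate = [0.6, 1.1, 1.9, 3.8, 5.5, 9.0]  # homicides per 100,000

    r = correlation(omega6_pct, homicide_rate)
    print(f"Pearson r = {r:.2f}")  # near +1 would be the "unnerving match" claimed

Of course, a high r across countries is only an association - diet, alcohol, inequality and policing all move together - which is why the controlled supplement trials at NIH described above matter more than the map.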

Is our diet driving us crazy? Look at the poor social outcomes. Look at increasing diseases like ADHD and teenage depression and suicide. Look at bullying, school shooting rampages and poor impulse control. The majority of America's children in public schools are considered low-income. And junk food filled with refined wheat and GMO corn syrup is specifically marketed to poor Americans. Even vegetables may not help, since they are so depleted of essential nutrients. And it's been shown that meat raised in CAFOs has a poor omega-3 to omega-6 fatty acid ratio; as noted above, these fats are necessary for proper brain growth and development.
In a landscape dominated by golden arches, dollar menus, and value meals serving up to 2,150 calories, fast food has been much maligned. It’s fast, it’s cheap, but we know it’s generally not good for us. And yet, well-touted statistics report that Americans are spending more than ever on fast food:

    In 1970, Americans spent about $6 billion on fast food; in 2000, they spent more than $110 billion. Americans now spend more money on fast food than on higher education, personal computers, computer software, or new cars. They spend more on fast food than on movies, books, magazines, newspapers, videos, and recorded music—combined.
Separating the Wheat from the Chaff: Will Industrialized Foods Be the End of Us? (SciAm)

Is the supposed poor behavior of Americans that conservatives like Charles Murray use to blame the poor for their plight actually caused by the atrocious foods they are forced to eat by agribusiness? Note that in the past, even the poorest Americans had better diets than many of the middle classes almost anywhere on earth, including large amounts of pasture-grazed meat. Individual behavior seemed a lot more calm and controlled back then. Look at old photos of people, and you can see an intelligence and quiet dignity in the eyes of even the poorest people. The people are calm and well-dressed. Look at the quality of school textbooks back then. Look at the front pages of newspapers compared to today. Exams given to secondary school children back then would baffle graduate students today. Look at the quality of rhetoric in the Lincoln-Douglas debates compared to today's televised spectacles - and those debates were targeted at the average voter.
If you live in America, you must have an inkling of these changes too. If you're one of the educated class - and if you're reading this, you probably are - then you must have had at least a glimpse of how the other side lives. You must have seen the awful "food" that they eat, the wrecked, destroyed state of their bodies. You must have at least seen hints of their broken family lives. Just the other day I got my hair cut in a poor white neighborhood, and the barber spent the whole time telling me about "that bitch" who was the mother of his children. No matter how thick the Belmont Bubble, you must have seen hints like this. And if it doesn't hurt you to see Americans living that sort of unhealthy lifestyle, then I don't know what to say to you.
America is separating into peasants and scholar-gentry (Noahpinion)

Today we have the disintegrating social behavior that conservatives tut-tut about - the drug abuse, the ugly ill-fitting wardrobes, the bodies festooned with tattoos, the thuggish behavior, violence and divorce.

People of Wal-Mart

I wonder if this can explain history too. Humans evolved to eat a certain diet over millions of years (the lifetime of the genus Homo). Once civilization got started, the diet rich in fish, meat and vegetables was eventually confined to a small slice of upper-class wealthy people, while the majority were forced to subsist on grains, porridge, beer, etc.
The great nutrition divide paralleled distributions in wealth. Peasants who worked the land relied on gruel and flatbreads to see them through the winters—their bounties were collected and stored to feed city residents. The best foods were prepared in the kitchens of royalty, aristocracy, and merchants—those who had the means to purchase excess and diversity. (ibid)
And that's when civilization got violent. Although Steven Pinker thinks violence actually declined when that happened, I'm not so sure. We have plenty of data, including written records, of the wars, violence and slaughter of hierarchical states and empires. Prior to that, we have little evidence of the amount of violence in daily human life. Pinker's data for this period consists entirely of skeletal remains from a scattering of sites, a vanishingly small percentage of the human race prior to civilization. In other words, the paucity of data for this period relative to the historical record is undeniable. And plenty of people have raised valid criticisms of even the data Pinker took from Keeley, Gat, et al.

Based on the above, the nutritional deficiencies that became widespread after the rise of agriculture and of hierarchical states built on grain monocrops would almost certainly cause more violence, not less. It's hard to believe that hunter-gatherers, for whom a single mammoth could feed a tribe for a month and fish and nutritious vegetables were there for the taking (early human populations seem to have migrated by hugging coastlines), were worse off than civilizations with their masses of impoverished slaves, serfs and peasants subsisting on grain. Indeed, we can make the case that humans' "natural" diet is elephant meat, seafood, tubers, fruit and nuts. Agriculture strip-mines nutrients from the soil, so such depletion would have been widespread in the past too, just as we know soil erosion was. What we know from actual history (as opposed to speculative reconstruction) is that states and empires have been roiled by constant military conflict and everyday violence as far back as we can tell. Note that decaying empires (Mesopotamia, Rome, China) show increasing levels of violence. Is malnutrition a neglected cause of social decline and violence?

If the above is true, then as climate change squeezes our food supplies, as corporations continue to engineer our food for their own benefit, and as the best food remains available only to the wealthy few, we can expect things to get worse, not better. We need equality in diet more than almost anything else.

BONUS: Update on the Lead as the cause of crime theory (Mother Jones)

Saturday, February 14, 2015

Hello Collapse My Old Friend

I've come to talk with you again. At the risk of being accused of peddling doomer porn, here are a few stories from last week on a common theme:
Although civilisation has ended many times in popular fiction, the issue has been almost entirely ignored by governments. “We were surprised to find that no one else had compiled a list of global risks with impacts that, for all practical purposes, can be called infinite,” says co-author Dennis Pamlin of the Global Challenges Foundation. “We don’t want to be accused of scaremongering but we want to get policy makers talking.”

The report itself says: “This is a scientific assessment about the possibility of oblivion, certainly, but even more it is a call for action based on the assumption that humanity is able to rise to challenges and turn them into opportunities. We are confronted with possibly the greatest challenge ever and our response needs to match this through global collaboration in new and innovative ways.”

There is, of course, room for debate about risks that are included or left out of the list. I would have added an intense blast of radiation from space, either a super-eruption from the sun or a gamma-ray burst from an exploding star in our region of the galaxy. And I would have included a sci-fi-style threat from an alien civilisation either invading or, more likely, sending a catastrophically destabilising message from an extrasolar planet. Both are, I suspect, more probable than a supervolcano.

But the 12 risks in the report are enough to be getting on with. A few of the existential threats are “exogenic”, arising from events beyond our control, such as asteroid impact. Most emerge from human economic and technological development. Three (synthetic biology, nanotechnology and artificial intelligence) result from dual-use technologies, which promise great benefits for society, including reducing other risks such as climate change and pandemics — but could go horribly wrong.

Assessing the risks is very complex because of the interconnections between them and the probabilities given in the report are very conservative. For instance, extreme global warming could trigger ecological collapse and a failure of global governance.

The authors do not attempt to pull their 12 together and come up with an overall probability of civilisation ending within the next 100 years but Stuart Armstrong of Oxford’s Future of Humanity Institute says: “Putting the risk of extinction below 5 per cent would be wildly overconfident.”
Twelve ways the world could end (Financial Times via CNBC)

Here are the leading candidates and their percentages:
  • Unknown consequences 0.1%
  • Asteroid impact 0.00013%
  • Artificial intelligence 0-10%
  • Supervolcano 0.00003%
  • Ecological collapse n/a
  • Bad global governance n/a
  • Global system collapse n/a
  • Extreme climate change 0.01%
  • Nuclear war 0.005%
  • Global pandemic 0.0001%
  • Synthetic biology 0.01%
  • Nanotechnology 0.01%
My major quibbles: artificial intelligence should be at zero percent, since it's nothing more than an imaginary threat, and bad global governance should be at 100 percent.
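
The report declines to roll its twelve risks into one number, but nothing stops us from doing it naively. Here's a back-of-the-envelope sketch in Python (my arithmetic, not the report's: it assumes the risks are independent, skips the "n/a" entries, and takes the 0-10% artificial intelligence range at its 5% midpoint):

    # Combine per-risk probabilities into the chance of at least one
    # catastrophe this century, assuming (implausibly) independence.
    # Figures are the report's percentages converted to fractions; the
    # AI midpoint and the skipped "n/a" entries are my judgment calls.
    risks = {
        "Unknown consequences": 0.001,     # 0.1%
        "Asteroid impact": 0.0000013,      # 0.00013%
        "Artificial intelligence": 0.05,   # midpoint of 0-10%
        "Supervolcano": 0.0000003,         # 0.00003%
        "Extreme climate change": 0.0001,  # 0.01%
        "Nuclear war": 0.00005,            # 0.005%
        "Global pandemic": 0.000001,       # 0.0001%
        "Synthetic biology": 0.0001,       # 0.01%
        "Nanotechnology": 0.0001,          # 0.01%
    }

    p_none = 1.0
    for p in risks.values():
        p_none *= 1.0 - p  # chance this particular risk does NOT occur

    print(f"P(at least one): {1.0 - p_none:.2%}")  # about 5.13%

Note how the headline number hinges almost entirely on the fuzzy AI entry: zero it out, as I would, and the combined risk falls to roughly 0.14 percent.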

No mention of a solar Carrington event? And yet again, no mention of Peak Oil? I would put that probability at 100 percent, since it's based on known geological reality, though it depends on whether you mean the end of humanity or the end of modern industrial civilization. It's certainly possible for humans to survive without electricity and oil--we did so for most of our history as a species--but most experts argue that we cannot sustain the number of people currently alive without them, and certainly not our current living standards. That is, there would be a dieoff rather than extinction, so perhaps it fell outside the parameters of this particular study. Its absence is curious, however. Have these Oxford PhDs seriously never read Overshoot or The Limits to Growth?

Earth’s current human population is 7.27 billion and it is increasing at a rate of more than one per second: so fast that this will make you dizzy. By the middle of any given day, for example, there are about 205,000 births, compared to 84,000 deaths. Superficially, Asia and Africa are the most populous continents in the world, with the 10 most populated countries being China, India, the United States, Indonesia, Brazil, Pakistan, Nigeria, Bangladesh, Russia and Japan. On the other hand, if we consider the human impact on greenhouse-gas emissions, the most heavily industrialized countries contribute more per capita to the burden of overpopulation on climate change. Specifically, in 2012, China, the United States, and the European Union alone contributed about 56 percent of the world’s carbon dioxide (CO2) emissions from fuels: 29 percent from China, 16 percent from the US, and 11 percent from the EU. India and Russia were a distant fourth and fifth, at, respectively, six and five percent. The rest of the entire world only contributed 33 percent of the total carbon emission! This includes all of Africa, South and Central America, Australia, and all the less industrialized countries of Asia...

The human population should have crashed from famine in the 1970s but was rescued by modern science. In particular, Norman Borlaug’s green revolution allowed our consumption of food to rely more and more on fossil fuels than on solar energy. We have come to depend on industrial fertilizers that require vast amounts of oil for their production, plus a heavily mechanized agricultural industry that also consumes large quantities of hydrocarbons. Indeed, for many years, the patterns of food, fertilizer, and oil prices over time have been superimposable. Today we can say with confidence, for example, that it takes three quarters of a gallon of oil to produce a pound of beef. Consequently, the idea of cheap oil has become regarded as a guarantee of affordable food. There are three problems with this notion: for one, oil is a finite resource; secondly, in a vicious cycle, cheap and abundant oil will, at best, postpone the inevitable human population crash to a much higher population; and finally, all the oil will eventually wind up in the atmosphere as CO2.

According to a November 2014 report from the International Energy Agency (IEA) that projects the state of the planet and its energy resources to 2040, humans will not have to face a shortage of energy. The IEA projects that, by 2040, the world will consume greenhouse-contributing energy like oil, gas and coal, compared with so-called green energy like wind, solar, and nuclear, in a 1:1 ratio. A world-wide expansion of fracking is expected to keep carbon energy cheap and plentiful. This fact, plus a projected increase of two billion people on Earth, means that energy consumption will increase by 37 percent by 2040. This rate of growth and consumption implies a rapid increase of greenhouse-gas emissions, which in turn translates into a 3.6-degree Celsius global warming by the year 2100. This, according to the IEA report, is a “catastrophic scenario.”
Breeding Ourselves to Extinction (Counterpunch)
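
The article's growth figures are easy to sanity-check. A quick sketch (my arithmetic, using the half-day birth and death counts quoted above):

    # Sanity-check: ~205,000 births and ~84,000 deaths by the middle of
    # a day imply roughly double those totals per full day, and a net
    # gain of well over one person per second, as the article claims.
    births_half_day = 205_000
    deaths_half_day = 84_000

    net_per_day = 2 * (births_half_day - deaths_half_day)
    seconds_per_day = 24 * 60 * 60  # 86,400

    print(f"{net_per_day:,} people/day, or "
          f"{net_per_day / seconds_per_day:.1f} per second")
    # -> 242,000 people/day, or 2.8 per second
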
[C]limate change has triggered the collapse of advanced civilisations dating back nearly 3,000 years. Around 1200 BCE, a perfect storm of calamities – including earthquakes, famines, and a drought that lasted 150 years or more – set in motion the breakdown of the late Bronze Age kingdoms clustered around the eastern Mediterranean in an area that includes much of what is now Greece, Israel, Lebanon, Turkey and Syria... It’s also instructive to look back at the last time Earth was inhabited by 500 million humans, in the 17th century, coincidentally also a time of tremendous climate-induced upheavals... Historians have called this era the General Crisis because wars raged almost non-stop across the globe, including the Thirty Years War, and the collapse of the Ming Dynasty in China and the Stuart monarchy in England... The extreme weather shift was behind most of the crises that occurred during the 17th century: colder weather, with many more episodes of storm-generating El Niños, contributed to flooding, crop failures, drought and famine, leading to civil unrest, rebellions and war... The prolonged crisis weakened once-dominant states such as Spain, Russia and the Ottoman Empire, and left about a third of the population dead... Climate models predict temperatures could rise by four degrees Celsius (7.2 degrees Fahrenheit) or more by the end of this century, a level that Kevin Anderson of the UK’s Tyndall Centre for Climate Change Research described as ‘incompatible with any reasonable characterisation of an organised, equitable and civilised global community’.

‘We will see temperatures higher than any known during human civilisation – temperatures that we are simply not adapted to,’ says Heidi Cullen, chief scientist for the NPO Climate Central in Princeton, and author of The Weather of the Future (2010). ‘With each passing year, our “new normal” is being locked in with the full impacts arriving towards the latter part of this century,’ she says. ‘It’s hard for us to imagine that large parts of the planet would be unlivable outdoors.’

An increase of seven degrees Fahrenheit would see mass migrations from some of the most humid places on Earth – the Amazon, parts of India, northern Australia. Rising sea levels of four feet or more and ferocious storms would flood coastal cities from Tokyo to Mumbai, and submerge low-lying areas such as Bangladesh and Florida, displacing millions. Earth’s most populated areas, that belt of land extending from central China and most of Europe, Africa, Australia, the US and Latin America, would be parched by this century’s end, drying up surface water and killing crops that hundreds of millions depend upon for survival. Nearly half the world’s population, almost 4 billion people, could be enduring severe water scarcity and starvation, numerous studies suggest.

Scorching heat waves and cataclysmic fires will spark food riots, famine and mass migrations of millions. An explosion in insects will trigger widespread outbreaks of typhus, cholera, yellow fever, dengue, malaria and a host of long-dormant or even novel pathogens, unleashing epidemics reminiscent of the Black Death which killed as many as 200 million people in the 14th century. Once-teeming metropolises would become watery ghost towns: Picture Manhattan, Tokyo, São Paulo underwater, sparsely populated colonies of hardy survivors who eke out vampire-like subterranean existences, emerging only at night when the temperatures dip into the low triple digits.

Worse yet, temperatures won’t conveniently stabilise at just seven degrees of warming – Earth’s climate won’t reach a new equilibrium for hundreds of years because of all the heat trapping carbon dioxide that’s already been dumped into the environment. ‘We have only felt a fraction of the climate change from the gases already in the atmosphere,’ said James Hansen, a leading climatologist and director of the Earth Institute at Columbia University, recently. ‘Still more is in the pipeline because the climate system has enormous inertia and doesn’t change that quickly.’ The planet will continue to heat up, triggering feedback loops of runaway climate change, until we can kiss most of civilisation goodbye.

As we move into the 22nd century, tropical rain forests – the lungs of the planet – could be enveloped by desertification while alpine forests will be ravaged by fires. In a 2014 paper in Science, the biologist Rodolfo Dirzo at Stanford University and colleagues predicted that we are on the verge of the planet’s sixth mass extinction event, which could wipe out as much as 90 per cent of the species on Earth. The birds and animals that roam the equatorial belt will be gone forever. Australia will revert to a blazing desert once more, empty of humans. The island chains in the South Pacific, from Hawaii to Fiji, will be swallowed by the oceans.
Welcome to earth, population 500 million (Aeon)


That's timely, as I'm sure many of you have seen this widely reported study this week:
The US south-west and the Great Plains will face decade-long droughts far worse than any experienced over the last 1,000 years because of climate change, researchers said on Thursday.

The coming drought age – caused by higher temperatures under climate change – will make it nearly impossible to carry on with current life-as-normal conditions across a vast swathe of the country.

The droughts will be far worse than the one in California – or those seen in ancient times, such as the calamity that led to the decline of the Anasazi civilizations in the 13th century, the researchers said.

“The 21st-century projections make the [previous] mega-droughts seem like quaint walks through the garden of Eden,” said Jason Smerdon, a co-author and climate scientist at Columbia University’s Lamont-Doherty Earth Observatory.

Researchers have long known that the south-west and Great Plains will dry out over the second half of the 21st century because of rising temperatures under climate change.

But this was the first time researchers found those droughts would be far worse even than those seen over the millennia.
US faces worst droughts in 1,000 years, predict scientists (The Guardian)
Several independent studies in recent years have predicted that the American Southwest and central Great Plains will experience extensive droughts in the second half of this century, and that advancing climate change will exacerbate those droughts. But a new analysis released today says the drying will be even more extreme than previously predicted—the worst in nearly 1,000 years. Some time between 2050 and 2100, extended drought conditions in both regions will become more severe than the megadroughts of the 12th and 13th centuries. Tree rings and other evidence indicate that those medieval dry periods exceeded anything seen since, across the land we know today as the continental U.S.
U.S. Droughts Will Be the Worst in 1,000 Years (Scientific American) 

US 'at risk of mega-drought future' (BBC)

The United States of Megadrought (Slate) Naked Capitalism comments, "Note that the Anasazi were an advanced civilization in Chaco Canyon, NM, by 1200-1300 standards. The end game of the drought included cannibalism."

And if that hasn't depressed you enough, it seems that the world hit peak production of many key resources between five and thirty years ago:
The world may not be as close to peak oil as once believed, but peak food, it seems, has already passed.

Energy experts warned in the late 20th century that the world would soon use up its supply of oil, and that production rates were about to plateau. That gloomy prophecy fell flat when oil production accelerated in the last decade, buying us a sort of contract extension on our energy use habits. However, according to research recently published in Ecology and Society, production of the world’s most important food sources has maxed out and could begin dropping—even as the Earth’s human population continues to grow. 

Ralf Seppelt, a scientist with the Helmholtz Centre for Environmental Research in Germany, and several colleagues looked at production rates for 27 renewable and nonrenewable resources. They used data collected from several international organizations, including the Food and Agriculture Organization and the International Union for Conservation of Nature, and analyzed yield rates and totals over a period of time—from 1961 to about 2010 in most cases. For renewable resources like crops and livestock, the team identified peak production as the point when acceleration in gains maxed out and was followed by a clear deceleration.

While annual production is still increasing in all the food resources analyzed—except for wild-caught fish—the rate of acceleration for most of them has been slowing for at least several years. The research team concluded that peak production of the world’s most important crops and livestock products came and went between 5 and 30 years ago. For instance, peak corn came in 1985, peak rice in 1988, peak poultry eggs in 1993, and peak milk and peak wheat both in 2004. The world saw peak cassava and peak chicken in 2006 and peak soy in 2009. This trajectory is troubling, because it means production will eventually plateau and, in some cases, even start to decline.

“Just nine or ten plant species feed the world,” says Seppelt. “But we found there’s a peak for all these resources. Even renewable resources won’t last forever.” While fertilizing soils can help maintain high yields, peak nitrogen—an important fertilizer—occurred in 1983, the study says.

Converting forest, prairie and marsh into farmland may be partially offsetting the per-acre productivity decline in many crops—though this process cannot go on forever. Seppelt and his colleagues found that acceleration of farmland conversion peaked in 1950. What's more, trees support biodiversity and serve as a sponge for atmospheric carbon, so losing more of the world’s forests to agriculture would be a global disaster.

All this might not be a problem if the human population was also stabilizing. Though we recently passed peak population, growth is not decelerating especially fast, and by 2050 there will probably be 9 billion of us and counting. Compounding the increased numbers is the fact that Asian population giants China and India are adopting diets heavier in meat—like the one that the western world has enjoyed for decades.
The World Hit "Peak Chicken" in 2006 (Smithsonian) I wonder why this has received so little coverage in the mainstream media; even this story was buried under a rather anodyne headline.
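
The study's notion of a "peak" for a resource whose output is still rising is easy to illustrate. A minimal sketch (my own toy reconstruction on fabricated data, not Seppelt et al.'s actual code): the peak is the year the year-over-year gain maxes out, after which gains decelerate even though totals keep climbing:

    # Toy illustration of "peak production": the peak is the year of the
    # largest year-over-year gain, after which gains shrink even though
    # total output never declines. The data below are made up; the real
    # study used FAO and IUCN time series covering roughly 1961-2010.
    def peak_year(years, production):
        """Return the year with the largest year-over-year gain."""
        gains = [b - a for a, b in zip(production, production[1:])]
        i = max(range(len(gains)), key=gains.__getitem__)
        return years[i + 1]

    # Fabricated "world corn" series: annual gains max out in 1985 and
    # decelerate afterward, while the running total keeps rising.
    years = list(range(1961, 2011))
    production, total = [], 200.0
    for y in years:
        total += max(10.0 - 0.3 * abs(y - 1985), 1.0)
        production.append(total)

    print(peak_year(years, production))  # -> 1985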

Friday, February 13, 2015

Farmageddon


This article seems to be getting a bit of attention: What nobody told me about small farming: I can’t make a living (Salon):
I love having an organic farm in our community, the woman continued, I just think this whole food movement is so great. I imagined this woman walking into my farm stand, fumbling a tomato in her palm, admiring the new-car-shine of each purple eggplant. Maybe she chooses two crookneck squash and a handful of thumb-size jalapenos. Before getting back in her car she looks out at the fields, at the tidy rows of salad mix and baby kale; then the woman drives away smiling, watching my fields rise and fall in her rearview mirror.

My farm’s become a billboard, and like all billboards, this one is deceptive. It depicts abundance and prosperity— two young smiling farmers working among neat rows of greens under a crisp morning sun. Heaping bins of produce, all of it picked fresh and free of synthetic chemicals. Despite all the talk of small farms disappearing, despite concerns of big ag controlling our food, GMOing everything and dousing it all in RoundUp, driving past my farm one might feel a flutter of relief, think there’s a small farm right there where I can go and pick up a bag of organic baby kale, spot a bluebird resting on a fig branch, notice a patch of weeds growing among the lettuce.

Meanwhile, millions of dollars in federal subsidies are doled out to mono-crop farms growing high-input GMO corn and soybeans. Meanwhile, the EPA continues to approve the use of pesticides such as Atrazine, which have been linked to birth defects, infertility and cancer. Meanwhile, the Supreme Court rules in favor of Monsanto, allowing the corporation to sue farmers whose fields are inadvertently contaminated with GMO seeds. Meanwhile, Ryan and I rifle the Internet in search of a new opportunity, one that can provide us with enough income to purchase health insurance or see the dentist, to take our soon-to-be-born child on a trip to visit its grandparents, to save a little chunk of money each year so that one day we might be able to buy a piece of land ourselves, and perhaps then we could return to farming. Because the truth is, no matter how many young people choose to farm, no matter how many bunches of kale are made into smoothies, or canvas shopping bags are packed full of colorful carrots and lacy heads of lettuce, no matter how many hip new restaurants declare themselves farm-to-fork, none of these things address the policies that dictate how our country’s food system works, policies that have created a society in which the small farmer can’t even earn a living.
Follow-up: Community Discussion: Are small farms doomed?

Should you quit your job and become a farmer? (MarketWatch):
But the good intentions of these type-A types notwithstanding, the economics of organic farming are a potential blow to their fairly large egos. These are individuals with scores of successes in life, but experts say that despite the price premiums that come with organic labeling or other like-minded practices, the math doesn’t always work out. It is just too expensive to do. For that matter, almost all farming, organic or conventional, is a financial boondoggle when it’s outside the realm of factory farming. The median projected income of the American farm in 2013? It’s actually a loss of roughly $2,300, according to the U.S. Department of Agriculture. Is it any wonder that — the organic boom notwithstanding — the number of farms in the U.S. has been on a dramatic decline, from a high of nearly 7 million in the 1930s to 2.2 million today?
Don’t Let Your Children Grow Up to Be Farmers (NYTimes) (h/t for the above: Agroinnovations Podcast)

Half of U.S. Farmland Being Eyed by Private Equity (Inter Press Service) See also: In Corrupt Global Food System, Farmland Is the New Gold

Wall Street Investors Take Aim at Farmland (Mother Jones)

As Crop Prices Surge, Investment Firms and Farmers Vie for Land (New York Times)

Food is cheaper in large cities (Marginal Revolution)

The Midwest's Vast Farms Are Losing a Ton of Money This Year (Mother Jones, 2014)

U.S. Farmers Watch $100 Billion-a-Year Profit Fade Away (Bloomberg)

Wall Street’s Frightening New Plan To Become America’s Landlord (ThinkProgress)

There's one more occupation to cross off the list. Why is there so much attention paid to urban farming when real farmers can't make ends meet?

Note that the most essential tasks in our society, such as growing food (without which we would all die), pay the least, while useless ones like spectator sports, writing fitness apps and developing computer games pay millions of dollars. What kind of f*cked-up economy do we live in???

Thursday, February 12, 2015

Alternative Lifestyles

Meet the pro baseball player who lives in a van (GrindTV) Toronto Blue Jays pitcher Daniel Norris opts for life on four wheels

I wish I could live in a Westfalia camper; I've always loved those. Good luck getting a job without a permanent address, though. Not many of us have won the Major League Baseball contract lottery.

Trailer homesteading in the Mojave (BoingBoing)

The smell of creosote is the first premonition of rain in the Mojave. It stinks, until you learn to associate it with rain, and then it takes on a kind of magic. The leaves of the creosote bush produce the smell, a potent resin that drips onto the ground with the downpour and acts as an herbicide against competing plants. The smell is more reliable than weather reports...The last rain event in our part of Joshua Tree was in July, right after my brother-in-law, Damian, finished detailing the second swale. When the first big raindrops fell, we all ran out to watch the water flow into the swales.

Tuesday, February 10, 2015

Beyond Medieval

This is extraordinary. Just six people in Connecticut have so much wealth that if even a single one of them left, there would be a huge drop in state revenues!

This is crazy!!! Entire state budgets are dependent upon the droppings from the table of just a handful of royal barons. Even in the depths of the Middle Ages it never got this extreme. How long can American society continue to function with this level of inequality?
Connecticut tax officials track quarterly estimated payments of 100 high net-worth taxpayers and can tell when payments are down. Of that number, about a half-dozen taxpayers have an effect on revenue that's noticed in the legislature and Department of Revenue Services.

"There are probably a handful of people, five to seven people, who if they just picked up and went, you would see that in the revenue stream," said Kevin Sullivan, the state's commissioner of the Department of Revenue Services.

With one exception, he said, state officials don't actually approach the super-rich. He said: "There isn't friendly visiting or anything like that, how are you feeling? Doing all right? Doing OK?"

Two years ago, tax officials were alarmed that a super-rich hedge fund owner might leave and reduce the state's income tax revenue. They set up a meeting and urged the unidentified taxpayer to stay. The effort was partly successful, with the taxpayer leaving Connecticut but agreeing to keep the hedge fund here.

Many movers and shakers in and around New York City, the capital of the banking and hedge-fund world, work in or populate the verdant suburbs next door in Connecticut. They include names like hedge fund owner Steven Cohen; Thomas Peterffy, of Interactive Brokers; Ray Dalio, of Bridgewater Associates; and Paul Tudor Jones, of Tudor Investment Corp. Combined, their net worth is more than $40 billion, according to Forbes.

Those four declined to discuss their experiences, if any, with Connecticut tax officials. But if they or other big-moneyed individuals or their businesses decide to leave, the danger is real.

In April 2014, super-rich taxpayers in Connecticut and elsewhere shielded their income through charitable donations or other means to avoid a tax hit following the expiration of federal tax cuts.

The result: Connecticut income tax revenue plunged by nearly $281 million, more than 14 percent, compared with the same month a year before. In the 2014 budget year, state income tax revenue was $8.7 billion, more than half the $16.4 billion in total revenue from taxes and fees.
Connecticut to super-rich residents: Please don't leave us (AP)
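
For what it's worth, the AP's figures hang together. A quick sketch of the arithmetic (mine, not theirs):

    # Check the AP figures: a $281 million April drop that is "more than
    # 14 percent" implies prior-April income tax receipts of about $2
    # billion, and $8.7B out of $16.4B in total revenue is indeed "more
    # than half" (53 percent).
    drop, pct_drop = 281e6, 0.14
    print(f"implied April 2013 receipts: ${drop / pct_drop / 1e9:.1f}B")
    print(f"income tax share of revenue: {8.7 / 16.4:.0%}")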

Related, this article is long and difficult to read, but basically it says that the downturn in the economy was caused by more than just the loss of housing wealth. Because consumption by the super-rich drives so much of the economy now, the loss in stock prices had a disproportionate impact. Similar to the above story - because most of America is now so poor, it is dependent on the spending of a tiny sliver of ultra-rich people to keep it going:
Our results suggest that the standard narrative of the Great Recession may need to be adjusted. Housing played a role, but so did financial assets, which actually accounted for the bulk of the loss in wealth. The middle class played a role, but so did the rich. In fact, the rich now account for such a large share of the economy, and their wealth has become so large and volatile, that wealth effects on their consumption have started to have a significant impact on the macroeconomy. Indeed, the rich may have accounted for the bulk of the swings in aggregate consumption during the boom-bust.
The rich and the Great Recession (Vox EU)

Even in the aristocratic Middle Ages, wealth was not this concentrated! How much longer can this continue? Meanwhile, in the rest of America:
  • Between 2007 and 2013 median wealth dropped a shocking 40 percent, leaving the poorest half with negative wealth (because of debt), and a full 60 percent of households owning, in total, about as much as the nation’s 94 richest individuals.
  • Official poverty measures are based largely on the food costs of the 1950s. But food costs have doubled since 1978, housing has more than tripled, and college tuition is eleven times higher. The cost of raising a child increased by 40 percent between 2000 and 2010. And despite the gains from Obamacare, health care expenses continue to grow.
  • Over half of public school students are poor enough to qualify for lunch subsidies. There’s been a stunning 70 percent increase since the recession in the number of children on food stamps.
How can we even have the pretense of democracy when America is so poor that it depends on the largesse of a handful of individuals for the state to function? We are truly in uncharted waters here.

And that's not the only way the world is starting to look more medieval:
After more than a decade of war in Iraq and Afghanistan, America’s most profound legacy could be that it set the world order back to the Middle Ages.
“Now that the United States has opened the Pandora’s Box of mercenarianism,” [Sean] McFate writes in The Modern Mercenary: Private Armies and What they Mean for World Order, “private warriors of all stripes are coming out of the shadows to engage in for-profit warfare.”...McFate said this coincides with what he and others have called a current shift from global dominance by nation-state power to a “polycentric” environment in which state authority competes with transnational corporations, global governing bodies, non-governmental organizations (NGO’s), regional and ethnic interests, and terror organizations in the chess game of international relations. New access to professional private arms, McFate further argues, has cut into the traditional states’ monopoly on force, and hastened the dawn of this new era.

McFate calls it neomedievalism, the “non-state-centric and multipolar world order characterized by overlapping authorities and allegiances.” States will not disappear, “but they will matter less than they did a century ago.” He compares this coming environment to the order that prevailed in Europe before the domination of nation-states with their requisite standing armies.

In this period, before the Peace of Westphalia of 1648 ended decades of war and established for the first time territorially defined sovereign states, political authority in Europe was split among competing power brokers that rendered the monarchs equal players, if not weaker ones. The Holy Roman Emperor, the papacy, bishoprics, city-states, dukedoms, principalities, chivalric orders–all fought for their piece with hired free companies, or mercenary enterprises of knights-turned profiteers.

As progenitors of today’s private military companies (PMCs), free companies were “organized as legal corporations, selling their services to the highest or most powerful bidder for profit,” McFate writes. Their ranks “swelled with men from every corner of Europe” and beyond, going where the fighting was until it wasn’t clear whether these private armies were simply meeting the demand or creating it.

In an interview with TAC, McFate said the parallels between that period in history and today’s global proliferation of PMCs cannot be ignored. He traces their modern origins to the post-Cold War embrace of privatization in both Washington and London, both pioneers in military outsourcing, which began in earnest in the 1980s.
A Blackwater World Order (The American Conservative) I expect that these private armies will soon be defending the estates of the super-rich in Connecticut. Perhaps the rest of us can get jobs digging up beets in the farmland they own. Oh wait, never mind, those jobs will be done by robots. Progress!