The Comey Firing -- Have The Progressive Journalists Completely Lost Their Minds?

One theory is that in order to consider yourself a progressive, you need first to have lost your mind.  The other theory is that the progressives had at least some slight grasp of the real world up until last week, but the Comey firing gave them that last push over the edge.  Either way, can there be any doubt at this point that most if not all progressives -- at least those calling themselves journalists -- have completely lost their minds?

My favorite bits of evidence are the assertions from various quarters that, by firing FBI Director Comey, Trump has committed some kind of "coup" against the American republic, or maybe has thrown the country into some kind of "constitutional crisis."  OK, there will always be one or two kooks making ridiculous statements on any subject.  But this was not that.  This was dozens of mainstream voices everywhere you looked.  Over at the Daily Caller a couple of days ago, they had a round-up of one seemingly respectable news source after another making these kinds of statements.  Examples:

From CNN's legal analyst Jeffrey Toobin:

"It's a grotesque abuse of power by the President of the United States," said Toobin, speaking with Wolf Blitzer on "The Situation Room." "This is the kind of thing that goes on in non-democracies, that when there is an investigation that reaches near the President of the United States, or the leader of a non-democracy, they fire the people who are in charge of the investigation." . . . "This is not normal," he continued. "This is not politics as usual. This is something that is completely outside how American law is supposed to work."

From McClatchy, May 9, "Donald Trump takes a dictator’s stand against inquiry":

Trump has taken the kind of steps that would be routine for the dictatorial leaders—the Putins, the Dutertes, the Erdogans of the world—whom he appears to admire.  

From David Frum of The Atlantic:

The day began with Trump attempting to intimidate a former acting attorney general & Senate witness. It ends with a coup against the FBI. 

And then there's this important question, asked by Chris Hayes of MSNBC:

If we're in a constitutional crisis, what's the proper response?

Do you have the impression that because these people are on TV and talk with a tone of authority, they must know what they are talking about?  The fact is that all they are doing is displaying their profound ignorance of first principles.  

Let's have a basic civics lesson.  This is not the stuff you need to go to law school to know, but rather the most basic stuff that they teach (or ought to teach) in high school, or even junior high school.  The relevant provision of the U.S. Constitution is not exactly hidden.  It's the first sentence of Article II -- the article that defines the powers of the Presidency. It is all of 15 words long:

The executive power shall be vested in a President of the United States of America.

That's it.  There is no executive power of the United States that does not belong to the President.  The power to investigate and prosecute is an executive power, and therefore belongs to the President, and only to the President.  

This is what is sometimes known as the "unitary executive."  Making the President a unitary executive with all of the prosecutorial authority under his control was a very intentional decision of the framers of the Constitution.  To learn more, read Federalist 70, by Alexander Hamilton.  It did not have to be this way.  For example, our 50 states have their own constitutions, and 45 of the 50 have not adopted the unitary executive concept when it comes to the prosecutorial power.  Instead, they divide their executive powers and have their attorneys general separately elected (or, in two cases, appointed by some entity other than the governor: by the legislature in Maine, and by the state Supreme Court in Tennessee).  Thus, in the large majority of the states, the governor cannot fire the attorney general.  But the President of the United States absolutely can fire the Attorney General, and can also fire anybody else in a policy-making role in the Justice Department, including the Director of the FBI.  That's what the Constitution says, and there isn't the slightest doubt about it.

Separate from the question of constitutional powers are issues of tradition, or maybe of good judgment.  Some saner voices than those quoted above have asserted some kind of inviolable tradition in the U.S. that a president should not fire the FBI director, so as to maintain the complete independence of the FBI.  Look into the subject, and you will find that there is no real evidence of any such tradition.  The FBI as currently constituted only traces its roots back to 1935, and the first occupant of the position of Director, J. Edgar Hoover, came over from the prior Bureau of Investigation.  Hoover then proceeded to serve under six presidents without ever getting fired, but it's hard to claim much precedent from that.  The first few of those presidents (Roosevelt, Truman, Eisenhower) had no particular reason to consider firing Hoover, but by the time we got to Kennedy, Hoover had accumulated far more power than was appropriate.  And thus we had Kennedy appointing his brother to be Attorney General, and Nixon appointing his closest crony and ex-law partner (John Mitchell) to the job, in both cases undoubtedly in large part to have an assured way to zap Hoover if the need arose.  The need would have arisen for Nixon, but Hoover died in 1972, just before the Watergate scandal got going.

So has any other President ever fired an FBI Director?  Yes -- Bill Clinton.  Clinton fired William Sessions a few months after taking office in 1993.  Sessions had been appointed by Reagan, served through the term of Bush 41, and still had more than four years to go in his 10-year term when Clinton came in.  Clinton gave some kind of excuse for firing Sessions having to do with alleged improper use of an FBI airplane.  Sessions denied it, but maybe it was true.  Do you think Clinton may have had some skeletons in his closet that he did not want investigated?  Funny, but I don't remember a single outraged word in the press about Clinton's firing of Sessions.  (Coup?  Constitutional crisis?)  Of course, Clinton shortly thereafter got Ken Starr as a Special Prosecutor, so the Sessions firing did not do him a whole lot of good.

Now, does Trump have some skeletons in his closet that he does not want investigated?  Anything's possible.  But, as previously stated in a post here, I find the whole idea of "collusion with the Russians to hack the election" completely absurd.  You may believe that meme, and time will tell.  But in the end, the only place where the President is really accountable is to the electorate in the next election.   

Socialism, Fantasy And Reality

Over in Congress, Republicans are gradually getting their act together on rolling back Obamacare, at least in part.  That of course has brought out a torrent of hysterical reaction from the progressive punditocracy.  To these people it seems just glaringly obvious that there is a moral imperative to provide "healthcare for all" through some kind of government handout or coercion.  After all, we all know that socialized provision of goods and services works flawlessly, and the government has an infinite pile of free money to pass out.  We do know those things, don't we?

On Monday, the New York Times op-ed page had no fewer than three pieces on the subject of Republican healthcare proposals by the in-house columnists, each more hysterical than the last.  Not meaning to give the likes of Krugman a pass on this one, but let me focus on the piece by Charles Blow, titled "Republican Death Wish."   Excerpt:

The A.C.A. had made a basic societal deal: The young, healthy and rich would subsidize access to insurance for the older, sicker and poorer. But this demanded that the former gave a damn about the latter, that people genuinely believed that saving lives was more important than saving money, that we weren’t living some Darwinian Hunger Games of health care where health and wealth march in lockstep. . . .  Let’s cut to the quick: Access to affordable health care keeps people alive and healthy and keeps families solvent. Take that away, and people get sick, run up enormous, crippling debt and in the worst cases, die. It is really that simple.

"Access to affordable healthcare" keeps people "alive and healthy."  This is one of those things that is just so blindingly obvious that it has to be true.  So what is the actual evidence?

  • There's that big randomized trial out of Oregon, reported in 2013, which found after two years "no significant improvements in measured physical health outcomes" between those who gained access to Medicaid and those who did not.  A follow-up study after five years showed that the same results persisted.
  • The big selling point of Soviet communism was supposedly the free universal access to health care.  In the early years, life expectancy under communism did increase -- but then, it also increased in the capitalist countries that had nothing like free universal health care at the time.  By the end of the Soviet Union in the late 80s, that country was facing what was by then called a "health crisis," accompanied by dramatically lower life expectancy, particularly for men, than in the capitalist countries without the free universal health care.  This study from the British Medical Journal in 1988 shows male life expectancy in the late-stage Soviet Union as only 65 years.  Wait, doesn't "access to affordable healthcare" keep people "alive and healthy"?  Maybe not so much.  And by the way, in post-communist Russia, life expectancy has not recovered.

And then, can we please look at what is going on down there in Venezuela?  Free universal health care was the core promise of Hugo Chavez and his Bolivarian revolution.  For the latest, check out, for example, from Fox News on April 7, "Venezuela's health crisis nearing catastrophe, government pleads for help":

Triple digit inflation and a decaying socialist economic model have left medications ranging from simple anti-inflammatory drugs to chemotherapy medication out of reach for most Venezuelans. Patients are asked to bring their own. . . .  [M]any other ills afflict the Venezuelan public health system. According to the most recent National Survey of Hospitals, 97 percent of services provided by hospitals are faulty, 75 percent of hospitals suffer from scarcity of medical supplies, and 63 percent reported problems with their water system.  The children are the most affected by the sanitary crisis. According to confidential data gathered by the Ministry of Health leaked to the press, last year 11,000 Venezuelan babies died within their first year of life.

It goes on and on from there, in great and depressing detail.  The promise of "access to affordable healthcare for all" proved to be false -- without a vibrant private economy, the government couldn't deliver.  Well, but at least the people in Venezuela aren't starving.  Actually, as you probably already know, they are.  The Wall Street Journal reports on May 5 that "[t]hree in four Venezuelans said they had lost weight last year, an average of 19 pounds."  The causes include "nationalization of farms as well as price and currency controls."  

The claim that "access to affordable healthcare" keeps people "alive and healthy" turns entirely on the assumption that the socialized costs of the "affordable healthcare" do not degrade economic performance and leave the people poorer.  In other words, to believe the claim, you have to believe in the infinite pile of free money that the government can spend without cost to the people.

What Is Seen And What Is Not Seen, Climate Edition

In 1850 the famous French economist Frédéric Bastiat wrote a short essay titled "What Is Seen And What Is Not Seen."  The essay discusses what has come to be known as the "broken window fallacy," that is, the idea that breaking windows really makes the world better off because of all the work that is generated for people to repair the windows.  Bastiat points out that the work to repair the windows may be "seen," but because the money gets diverted to that project, plenty of other things that might have been done -- and made someone better off -- remain undone.  Those things are the "unseen."  Overall, the welfare of the people has been reduced.

It is not seen that, since our citizen has spent six francs for one thing [repairing the window], he will not be able to spend them for another. It is not seen that if he had not had a windowpane to replace, he would have replaced, for example, his worn-out shoes or added another book to his library. In brief, he would have put his six francs to some use or other for which he will not now have them. 

Over in the world of climate reporting, what is seen is the constant drumbeat of articles about the "hottest" day/month/year ever.  You have seen lots of those over the past year.  Quick, now, when was the last one?  Unless you follow this closely, you very likely won't know.  And can you recall seeing any recent article revealing that some recent period was not the hottest day/month/year/whatever?  Neither can I.  That's the "unseen."  You can be forgiven for coming away with the impression that things just keep getting hotter and hotter.

In the interest of brevity, I'll leave out the first half of last year, and start in July.  The New York Times headline on July 9 was "Record High Temperatures in the First Six Months of the Year."   (Accompanied by a picture of a house engulfed in flames, of course.)

The average temperature across the contiguous United States for the first six months of this year has been the warmest on record — and by a considerable sum — dating back to 1895, according to a monthly report released Monday by NOAA’s National Climatic Data Center.

Then, on August 8, it was this:  "What Cornfields Show, Data Now Confirm: July Set Mark as U.S.'s Hottest Month."   Of course, they are explicit in making sure you know to draw the conclusion that the succession of "hottest" months proves the underlying trend toward catastrophe:

It may come as little surprise to the nation’s corn farmers or resort operators, but the official statistics are in: July was the hottest month in the lower 48 states since the government began keeping temperature records in 1895. . . .  “This clearly shows a longer-term warming trend in the U.S., not just one really hot month,” Mr. Crouch [climatologist at NCDC] said.

And, on September 12, "August Ties July for Hottest Month on Record."  

It just keeps getting hotter.  August has tied July for the distinction of being the hottest month since record-keeping began in 1880, NASA said in a news release on Monday.

Notice that this series of articles was in turn driven by a comparable series of press releases issued by the government propagandists.

And then, when were the next articles?  October, November, December?  Try to find them.  On January 18, we get "Earth Sets a Temperature Record for the Third Straight Year":  

Marking another milestone for a changing planet, scientists reported on Wednesday that the Earth reached its highest temperature on record in 2016, trouncing a record set only a year earlier, which beat one set in 2014.

OK, but was there anything that happened to temperatures toward the end of the year that you'd like to tell us about?  Nothing that you can find here.

And then, somehow, all these press releases and follow-on articles just disappeared.  Any guesses as to what might be happening?  Perhaps we should just go and check in on the satellite temperature data set over this period:

Aha!  The global lower atmosphere temperature has dropped a full 0.56 deg C (that's almost exactly one full degree F) since its peak in February 2016.  Do you think that any of these people would have the common decency to openly admit that fact and discuss it honestly?  Don't kid yourself.
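For anyone who wants to check that parenthetical conversion, here is a minimal Python sketch.  The 0.56 deg C figure is the one stated above; everything else is just unit arithmetic.

```python
# Convert the stated peak-to-present drop in the satellite lower-atmosphere
# temperature from Celsius to Fahrenheit.  A temperature *difference*
# converts by the factor 9/5 alone -- no "+32" term.
drop_c = 0.56                # deg C drop since the February 2016 peak (stated above)
drop_f = drop_c * 9.0 / 5.0  # = 1.008 deg F

print(f"{drop_c} deg C = {drop_f:.2f} deg F")  # -> 0.56 deg C = 1.01 deg F
```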

The Blob Goes After Ben Carson

Of all the federal agencies that ought to have their budgets zeroed out, HUD should be the first.  There is no bureaucracy in the "anti-poverty" business that does more than HUD to take people perfectly capable of self-sufficiency and intentionally turn them into lifelong government dependents.  And, in the "bang for the buck" category, no other bureaucracy could possibly outdo HUD for futility.  No housing grant, subsidy or other initiative counts for as much as a penny of measured "income" for any recipient.  Therefore, if somebody was poor before receiving a HUD grant or subsidy, it is one hundred percent assured that that person will still be poor after climbing onto the gravy train.  There is no known metric under which HUD's "bang for the buck" could ever get above zero.  If you count trapping millions of able-bodied people into dependency, HUD is hugely destructive.

At the same time, running HUD "programs" is the perfect bureaucratic sinecure.  Virtually nobody ever escapes HUD's web.  Once you are into subsidized housing, why would you ever go back to paying full price for your home?  So the bureaucrats have a permanent lifelong clientele ready to advocate at the drop of a hat for continuation and increase in budgets.  And if the bureaucrats fail in their job of maintaining the buildings properly (as of course they will), they have a ready-made source of self-inflicted heart-rending stories to use to get their budget increased.  The whole thing is a cancer.

Into this disaster has now stepped new HUD Secretary Ben Carson.  He has expressed many times his desire to reduce the dependency that it is HUD's core mission to increase.  But can he actually accomplish anything?  The preliminary Trump budget outline has proposed cutting HUD funding by about $6 billion -- about 12% of the total.  Not nearly enough, but a start.  In the couple of months since his confirmation, Carson has gone out around the country on some kind of a "listening tour," visiting places like Dallas, Miami, Detroit and Columbus, Ohio.  Reading articles about the tour, it seems that it has turned mostly into an opportunity for recipients of HUD handouts to make their pitches for increased -- or at least, continued -- funding.  Or, to put it another way, Carson is getting swarmed by the Blob.

As Exhibit A of the Blob pushing back, here is a website called CarsonWatch, set up to advocate for keeping and/or increasing all HUD funding.  As Carson started his tour back in March, these guys put up a post oh-so-subtly titled "The Trump-Carson housing budget will push more Americans out on the street."  Excerpt:

At a time when millions of families are caught in a historic housing affordability crisis, the Trump-Carson budget for the Department of Housing and Urban Development (HUD) proposes $6.2 billion in cuts to vital programs that help everyday families have a place to call home. It also eliminates all grants to urban and rural communities that help spur job creation and economic development. These immoral cuts will exacerbate homelessness, racial and economic inequality, and fall hardest on our most vulnerable neighbors.

It's a good thing these guys don't ever trouble themselves to look up any numbers.  If they did, they might discover this:  New York City has a population of about 8.5 million, and is said to have over 60,000 "homeless," about one "homeless" person per 142 of population.  To pick another city that takes a different approach to public housing, Houston has a population of about 2.3 million, and is said to have a "homeless" population of about 5,400.  But that's only about one "homeless" person per 425 of population -- only about one-third of New York City's "homeless" rate.  Surely, then, to drive its numbers of "homeless" people down, Houston must have far more HUD-subsidized public housing per capita?  Wrong!  According to this "cross-city" comparison from the NYU Furman Center, in New York City some 5.3% of all housing units are "public housing" (in this case, NYCHA), while in Houston the comparable percentage is 0.4%.  Admittedly, the study is from 2008, but I doubt that those numbers have changed much since.  Could it really be that for all its extraordinary efforts to solve a "housing shortage" by building more and yet more HUD-subsidized housing, New York City only makes negative progress at getting people "off the street" and into housing?  Absolutely.  This is socialism, folks.  Look around at more such easily available statistics, and you will find a strong positive correlation between the amount of public housing and the rate of homelessness.  A cynic might conclude that extensive availability of subsidized housing incentivizes people to declare themselves "homeless" to jump the line to get in.
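For readers who want to check that arithmetic, here is a minimal Python sketch using the rounded population, "homeless," and public-housing figures quoted above (these are the numbers as reported in the text, not a fresh data pull):

```python
# Per-capita "homeless" rates for the two cities, computed from the rounded
# figures quoted in the text above (not independently verified).
cities = {
    "New York City": {"population": 8_500_000, "homeless": 60_000, "public_housing_pct": 5.3},
    "Houston":       {"population": 2_300_000, "homeless": 5_400,  "public_housing_pct": 0.4},
}

for name, d in cities.items():
    residents_per_homeless = d["population"] / d["homeless"]
    print(f"{name}: one 'homeless' person per ~{residents_per_homeless:.0f} residents; "
          f"public housing = {d['public_housing_pct']}% of all housing units")

# -> New York City: one per ~142 residents; Houston: one per ~426 residents.
# Houston's "homeless" rate is roughly one-third of New York's, despite
# having barely any public housing.
```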

At various places along Carson's tour, it seems that the strategy has been to trot out one or another sympathetic or heart-rending case to try to stave off budget cuts.  For example, here is a New York Times article from Wednesday reporting on Carson's stop in Columbus, Ohio.  Excerpt:

On his second day in Columbus, Mr. Carson stopped by the apartment of Alzene Munnerlyn, an 87-year-old living in senior housing and using a voucher to pay part of her rent after she was priced out of her last apartment.

The Times doesn't choose to tell us whether Ms. Munnerlyn has any children or grandchildren who might have helped.  In Dallas, the mayor chose to make a plea for relief from all the nitpicking regulations that HUD attaches to its grants.  But the reporter from the Dallas Observer was on to the diversion:  Dallas had just finished going through a HUD audit in which hundreds of millions of dollars somehow turned up missing, only to be promptly forgiven by the Obama HUD:

You have to pause here and recognize that [Dallas Mayor] Rawlings is a Democrat who went to Washington and cut a deal with a democratic HUD secretary, Julian Castro, to get HUD to eat, kill, trash and deep-six its own four-year investigation showing that Dallas was sucking hundreds of millions of dollars out of HUD, lying about what it was doing with the money and then spending it in ways that violated federal law.

Yet, needless to say, the onslaught from the advocates seems to have put Carson at least partially on the defensive.  From the Washington Post, April 3:

HUD Secretary Ben Carson said Monday that the Trump administration will seek to include housing funding in a yet-to-be unveiled infrastructure spending bill.  “The part that people are not hearing even though I’ve said it several times is that this administration considers housing a significant part of infrastructure in our country. And as such, the infrastructure bill that’s being worked on has a significant inclusion of housing in it,” Carson said at the National Low Income Housing Coalition conference in Washington.

So they are only talking about a budget cut of about $6 billion (out of about $50 billion), and then much or all of it is going to come back in through the back door in an "infrastructure" bill?  The Blob just never lets up.  Will Carson actually succeed in reining in HUD over the next several years?  The jury is out.  In the past, you could never go wrong by betting on the Blob.  I'm hoping it's at least a little different this time, but only because I'm a hopeless optimist.

Connecticut Discovers The Laffer Curve The Hard Way

Somewhere way back in the 70s, legend has it that economist Arthur Laffer got into a discussion over dinner with Dick Cheney and Donald Rumsfeld about the reason for America's then-slow economic growth.  Laffer thought the reason for the slow growth was unduly high marginal tax rates, which then included a federal top rate of 70%.  To illustrate the problem, Laffer supposedly drew on a napkin a curve showing tax collections increasing with rates, but only as long as rates are low; then at some point, as rates get higher, collections start to decrease, and then continue to drop until the rate reaches 100%, at which point collections fall to zero because nobody bothers to earn or report any income.  This very simple concept has since gone by the name of the "Laffer Curve."  Here is a version from The Intelligent Economist:

Note, though, that the curve is unspecific as to exactly where to find that point "c," where higher tax rates stop bringing in increased revenue and become counter-productive.  And thus a few years ago we saw the likes of Barack Obama, Andrew Cuomo, and Bill de Blasio all proposing at the same time to add just a few more points to the income tax rates of the same group of highest-income taxpayers, all in the name of "fairness"; and the likes of Paul Krugman always finding a way to claim that higher tax rates are a good idea.
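If you'd like to see the shape in miniature, here is a toy Python sketch.  The revenue function below is a made-up illustration -- revenue equals the rate times a taxable base that shrinks as the rate rises -- and nothing about where its peak falls is empirical:

```python
# Toy Laffer curve: revenue(r) = r * base(r), where the taxable base shrinks
# as the rate rises.  The functional form and the elasticity parameter are
# illustrative assumptions, not an empirical model of any real tax system.
def revenue(rate, base_at_zero=100.0, elasticity=2.0):
    """Tax revenue at a marginal rate in [0, 1] under a toy shrinking-base model."""
    base = base_at_zero * (1.0 - rate) ** elasticity  # base falls to zero at a 100% rate
    return rate * base

rates = [i / 20 for i in range(21)]  # 0%, 5%, ..., 100%
peak_rate = max(rates, key=revenue)
print(f"Revenue peaks at a rate of {peak_rate:.0%} in this toy model")

# Revenue is zero at both a 0% and a 100% rate and peaks somewhere in
# between -- the point "c" on the curve.  Where that peak sits in the real
# world is exactly the empirical question Connecticut is helping to answer.
```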

But can we get an idea of where that peak -- the point where the curve turns over -- might be found?  Fortunately, the great state of Connecticut has decided to oblige our need to know by conducting a real live ongoing experiment.

The experiment started way back in the 70s and 80s, when the combined top marginal rate in New York State and City reached 19% (approximately 15% for the State and 4% for the City), and Connecticut had no income tax at all.  The Connecticut towns closest to New York City -- Greenwich, Stamford, Darien, New Canaan, Westport -- experienced an enormous boom, and became some of the wealthiest places in the country.  Then, in 1992, Connecticut -- experiencing normal budget problems common to all states -- decided to take the plunge and impose its own income tax.  Hey, it was only going to be 3%.  Who would even notice?

And it has continued from there.  In the 80s, New York State recognized that it had become uncompetitive, and started cutting its top tax rate significantly; and that process continued during the 90s.  By the mid-90s the top New York State rate was under 7% (add about 4% additional for New York City -- a figure that hasn't changed much during the period under discussion).  Connecticut?  Per a chart from the Tax Foundation here, by 2000 Connecticut's top rate had reached 4.5% (New York State's top rate that year was 6.85%); by 2005 Connecticut's was 5% (and New York State's had snuck up to 7.7% on a "temporary" basis); and in 2011 Connecticut pushed its top rate all the way to 6.5% (on income over $1 million for a couple filing jointly).  Then in 2012 New York pulled a sneaky trick, raising its top rate to 8.82%, but only on income over $2 million (couple filing jointly), while lowering the rate for income between $300,000 and $2 million back to 6.85%.  And finally, in 2015, Connecticut pushed its top rate to 6.99% on income over $1 million, and 6.90% on income over $500,000 (both couple filing jointly).  Suddenly, Connecticut found itself with income tax rates higher even than New York for people making between $500,000 and $2 million per year and not residing in New York City.  A $1 million per year earner could now actually live in Rye (just on the NY side of the border) and work in New York City, and pay less income tax than if he lived and worked in Connecticut.  (Only New York City residents pay New York City income tax.)
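To put rough numbers on that last claim, here is a deliberately crude Python sketch.  It applies the top rates quoted above to the whole $1 million as if they were flat rates -- real liability depends on brackets, deductions and credits -- so take it only as showing the ordering, not the exact dollars:

```python
# Crude comparison for a $1 million-per-year earner (couple filing jointly),
# using the top rates quoted in the text and pretending, for simplicity,
# that the entire income is taxed at those rates.  Brackets, deductions and
# credits are ignored; only the ordering is the point.
income = 1_000_000

ny_state_rate = 0.0685  # NY State rate on income between $300K and $2M (from the text)
ct_rate       = 0.0690  # CT rate on income over $500K (from the text)

rye_tax = income * ny_state_rate  # Rye resident: NY State tax only -- no NYC tax
ct_tax  = income * ct_rate        # Connecticut resident: CT tax

print(f"Rye, NY (flat approximation):     ${rye_tax:,.0f}")  # $68,500
print(f"Connecticut (flat approximation): ${ct_tax:,.0f}")   # $69,000

# The Connecticut resident pays more, and above $1 million the CT rate
# steps up again to 6.99% while NY stays at 6.85% until $2 million.
```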

Well, how has that been going?  The early rumblings were stories of some of the major hedge funds picking up and moving to Florida, home of zero income tax.  In July 2016, it was Paul Tudor Jones, reported in this article from the Yankee Institute to have had a personal income of about $600 million per year.  That one move cost Connecticut about $30 - 40 million per year in income tax revenue just from the one guy.  (Total annual income tax collections for the entire state of Connecticut run around $9 billion.  This one guy paid almost half a percent of the total for the whole state.)  A couple of months later came the report that Barry Sternlicht (of Starwood Capital) had also moved to Florida.  

That's the anecdotal evidence; how about the overall numbers?  With this year's April 18 tax deadline having passed, Connecticut now knows its tax collections for 2016.  On Monday, May 1, a guy named Ben Barnes (Connecticut's Secretary of the Office of Policy and Management) gave a presentation to the legislature.  As reported in the CT Mirror here, Barnes described a "precipitous drop in revenue [that] we experienced in late April."  And which of the various taxes is the main source of that precipitous drop?

The income tax, the state’s largest revenue engine, saw the most erosion by far.  According to analysts, income tax receipts this fiscal year now are expected to total just under $9 billion. Not only is that well below the $9.44 billion analysts were anticipating just four months ago, but it falls short of the $9.2 billion collected last fiscal year. . . .  Income tax receipts are experiencing their first major decline since 2009 — just as Connecticut fell into The Great Recession.  And the bulk of the latest income tax erosion was tied not to paycheck withholding but to quarterly filings, most of which involves capital gains, dividends and other investment-related earnings.  According to the governor’s budget office, the state’s 100 largest-income taxpayers paid 45 percent less this year than last.

It looks like Connecticut has gone over to the back side of the Laffer curve.  What to do?  You won't be surprised to learn that a coalition of progressive groups, including unions representing state employees, is calling for raising the top income tax rate yet again, to 7.49%, and also imposing a special 20% rate on income from hedge funds:

For example, labor advocates and other progressives have suggested that Connecticut respond to the “carried interest” loophole within the federal income tax system by imposing a surcharge close to 20 percent on the earnings of hedge fund managers.     

But Governor Malloy -- a Democrat who was the driving force behind the 2011 and 2015 tax increases that were supposed to fix the state's revenue problems once and for all -- is not going along this time.  He has told his Democrat and union allies that such tax increases are off the table and shouldn't even be discussed publicly for fear of driving additional big taxpayers out of the state.

Meanwhile, Connecticut has dug itself into a really deep hole, with no easy way out.  It's not just that they can't increase taxes further; it's that even the current level of taxes has gotten them into a death spiral.

Readers familiar with Connecticut will know that it is a state without any dominant or particularly large city.  (The largest currently is Bridgeport, at a little under 150,000.)  But it has a dozen or so cities in the 35,000 to 150,000 range.  In the past few years, I have had occasion to visit or pass through more than half of them:  Bridgeport, New Haven, Hartford, New London, Waterbury, Torrington, Bristol.  Without exception, they are dreary and run-down.  In a recent east-west drive through Torrington on state Route 4, I passed three large factory complexes, all appearing to be vacant, with prominent signs reading "Factory Space For Rent."  The only formerly-industrial Connecticut town I know of that has experienced a significant revival is South Norwalk.  Of course, it is directly between the fancy New York suburbs of Darien and Westport.  Does any reader know of another formerly-industrial Connecticut town that is on its way back?

If Connecticut continues to follow the strategy of higher and higher taxes -- or even leaving taxes at the current uncompetitive levels -- there is not much hope for any of these small cities to revive any time soon.  What they need is new investors, people willing to take a big risk in the hope of having a big success.  But the message that Connecticut sends out is:  if you have any meaningful success, we will treat you like a goose to be plucked.  With that message, nobody wants anything to do with the place.  Too bad.

My Descent Into Abject Poverty; Or, How To Have Enough Money To Be "Poor"

If you read some of the usual propaganda about the plight of the elderly poor in New York City, it will bring a tear to your eye.  Or, at least, that's the intent.  For example, City Comptroller Scott Stringer is just out (March 21) with a big report titled "Aging With Dignity: A Blueprint for Serving NYC's Growing Senior Population."   We learn that some 20.0% of New York City seniors (over 65 years old) lived "below 100 percent of poverty" in 2015; and another 10.3% lived "between 100 and 149 percent of poverty."  In the housing category, things get even worse.  Some 38.8% of seniors who own their homes in New York City are said to be "rent-burdened" (funny term for homeowners) in the sense of spending more than 30% of their income on housing cost.  Among renters, a whopping 59.7% are said to be "rent-burdened."  The source for these numbers is given as the American Community Survey, i.e., the U.S. Census Bureau data that are the source for the usual reports about the "poverty rate," as well as other things like claims about "income inequality."

For something even more heart-rending, try this 2015 piece from CityLimits.org, titled "Aging in New York City: City Wrestles with Poverty Among Seniors."  Excerpts:

“The percentage of seniors living in poverty is staggering,” says NYC Department for the Aging Commissioner Donna Corrado. “Too many older New Yorkers make difficult choices about purchasing food, medicine and paying their rent.” . . . .  How seniors can make ends meet is a question the whole country is grappling with. . . .  Although national poverty rates for seniors declined from 12.8 percent to 9.5 percent from 1990 to 2012, in New York City, the poverty rate among older adults increased by 15 percent during that period, rising from 16.5 percent to 19.1 percent, according to DFTA.

Of course, all this talk about "poverty" and lack of "income" derives from the Census Bureau data.  Regular readers of this site will recognize that these Census statistics on "poverty" are completely arbitrary and fake.  For a few useful prior posts, try here and here.  The best way of looking at them is that they are just a big scam to gin up hugely inflated numbers of people claimed to be in "poverty," in order to play on the sympathies of the taxpayers and get support for increased funding for "anti-poverty" programs, none of which ever raise a single person out of "poverty" as defined.  The key sleight-of-hand is defining "poverty" solely in terms of current-year "cash income" -- a category that in many instances has little or no relationship to the amount of resources a person or family may have available to spend.  Since most people live mostly off their income most of the time, you can easily come to think of "income" as a good proxy for living standards; and thus, you become easy to deceive.  The fact is that, while "income" may be a useful proxy for living standard in many cases, there are many other cases where "income" is not a good proxy at all for living standard.  For an obvious large category, think college students.  Retirees are another large category, but the reasons may not be so obvious to you.

Anyway, over the past weekend I got a draft of my 2016 tax returns from my accountant (don't worry, we got an extension).  Of course, it was a big, fat pile of paper, some 96 pages of draft returns -- 78 for the IRS and another 18 for New York State/City.  (What, you thought this "poverty" thing was simple?)  The shocking news was right near the front, on the second page of the 1040:  based on our "income" as reported, by Census Bureau definitions, Mrs. MC and I lived "in poverty" during 2016.

How could this possibly be?  Wasn't the Manhattan Contrarian a high-income partner of a big law firm just a few short years ago?  How is it possible to fall so far, so fast?

First, disabuse yourself of the idea that this has anything whatsoever to do with living standard.  In fact, our living standard has not changed one bit.  We live in the same place.  We eat the same food.  (Manhattan restaurants!)  Sometimes we travel.  We pay all the bills.  We give substantial amounts to charity.  One of our daughters had a wedding -- in Manhattan -- during 2016.

So what's the secret?  In our case, the overall picture is a little complex, but one big thing stands out:  We have enough money that we can afford to be "poor"! 

Does that somehow seem not to make sense?  Then you haven't been paying attention.  Poverty, or non-poverty, by official definitions, turns on one and only one thing, which is current-year "cash income."  If you consider how this works for retired people, you will quickly realize that the people who saved more, who have more available to spend, and who in any real sense are better off, are actually more likely to turn up as "in poverty" than the people who saved less.  Think it through.  Suppose you have retired and you didn't save much, or maybe you did save some, but only in the form of tax-advantaged retirement savings, like 401(k) or IRA plans.  You are basically out of money, except for the retirement plans (if any).  You need something to live on.  The first thing you will do is start collecting your Social Security.  That counts as "income"!  If you worked most of your life regularly at a middle class or better job, that income alone could well be sufficient to raise you out of "poverty."  Or, if you have 401(k)s or IRAs, you can start drawing on them.  In most cases, that's "income" too!  Again, if you are trying to maintain your prior standard of living without other savings, you will need to withdraw sufficient funds from these plans that you will probably get lifted out of "poverty."

But suppose instead that you saved some substantial amount of money not in the form of tax-advantaged 401(k) or IRA plans.  Spending this money is one hundred percent not "income."  It just doesn't count, period.  Meanwhile, there are very good reasons not to collect Social Security, and not to withdraw from 401(k)s and IRAs, until you reach the age of 70.  First of all, those things count as "income," and you have to pay taxes on income.  Duh!  Why would you volunteer to pay income tax when you don't have to?  Second, both Social Security and tax-advantaged retirement assets continue to grow through age 70 as long as you don't use them.  If you can hold off on collecting your Social Security benefit for the five years from age 65 to 70, the monthly amount will grow by some 40%.  This is not a difficult decision.  You just need to have enough money set aside to avoid drawing on the sources that count as "income."  Or, to put it another way, you need to be rich enough to be poor!
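To make the mechanics concrete, here is a minimal Python sketch comparing two hypothetical retired couples with identical spending.  Every dollar figure, and the poverty threshold, is an illustrative assumption of roughly the right magnitude -- not an official Census number:

```python
# Two hypothetical retired couples, each spending $80,000 a year.  The
# threshold below is a stand-in of roughly the right magnitude for a
# two-person household, not the official Census figure.
POVERTY_THRESHOLD = 16_000

def measured_income(social_security, retirement_withdrawals, savings_spent):
    """Census-style current-year 'cash income': Social Security and 401(k)/IRA
    withdrawals count; spending down ordinary after-tax savings counts as zero."""
    return social_security + retirement_withdrawals  # savings_spent is ignored

# Couple A: little saved outside retirement plans -- must collect Social
# Security now and draw down the IRA to live.
couple_a = measured_income(social_security=30_000,
                           retirement_withdrawals=50_000, savings_spent=0)

# Couple B: enough ordinary after-tax savings to defer Social Security and
# IRA withdrawals to age 70, living entirely off savings in the meantime.
couple_b = measured_income(social_security=0,
                           retirement_withdrawals=0, savings_spent=80_000)

for name, inc in [("Couple A", couple_a), ("Couple B", couple_b)]:
    status = "in poverty" if inc < POVERTY_THRESHOLD else "not in poverty"
    print(f"{name}: measured income ${inc:,} -> {status}")

# Couple A: $80,000 -> not in poverty.  Couple B: $0 -> "in poverty" --
# identical living standards, but the better-off couple is the "poor" one.
```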

Now that you have absorbed this information, go back and re-read those heart-rending tales from the beginning of this post.  For example, did you feel sorry for those "rent-burdened" elderly New Yorkers who spend "more than 30% of their income" on housing cost?  Well, Mrs. MC and I spent more like 500% of our (completely arbitrary) 2016 "income" on housing cost.  Please, don't feel sorry for us.  And then there's that sham about the "poor" elderly New Yorkers "mak[ing] difficult choices about purchasing food, medicine, and paying their rent."  There are undoubtedly elderly New Yorkers in this situation, but the idea that the supposed 20% "poverty" rate is a real measure of their numbers is ridiculous and insulting.

Without doubt, somewhere in the 20.0% of elderly New Yorkers who are counted as "in poverty" in the official statistics, there are numerous instances of real hardship.  But how many?  Is it most of the 20%, or half, or maybe only a tenth or less?  Unfortunately, there is no way to tell from the official numbers.  That failing is completely intentional, and gives advocates infinite room for fraudulent use of the statistics, as illustrated in the examples at the beginning of this post.  

Finally, consider that subset of "poor" elderly New Yorkers who live in Manhattan.  In a post back in 2013, I asked whether it was even possible to live in Manhattan and be in real "poverty."  After all, there is literally no place to live in Manhattan where the market rent alone does not exceed the official poverty level for the number of people living there.  Therefore, if you live in Manhattan, by definition, the resources -- whether your own, or handouts from the government, or from someone else -- that are spent in a year to support you, exceed the so-called "federal poverty level."  In at least tens of thousands of cases of people deemed "in poverty" by the Census statistics, those spent resources are large multiples of the federal poverty level.  Many "poor" families in Manhattan receive government benefits that cost the taxpayers well into the six figures.  This is particularly true of families that live in public or "affordable" housing, which by itself, in prime areas of Manhattan, is worth $50,000 and up in annual taxpayer subsidy.  So the following statement is assuredly true:  to the extent that there are elderly "poor" people in Manhattan who suffer real hardship -- in the sense of having to "make difficult choices about purchasing food, medicine, and paying their rent" or struggling to "make ends meet" -- that is one hundred percent a consequence of the poor design and implementation of the existing government handout programs.  How our government can spend $50,000 in a year to support a family in subsidized housing in Manhattan, and another $40,000 per year to provide Medicaid for that family, and still more for free phones and EITC and other assistance, and still leave that family in "poverty" and without enough cash to "purchase medicine" or "make ends meet," is completely beyond me.        
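To see how far the resources can outrun the official measure, here is one last illustrative Python tally.  The housing and Medicaid figures are the round numbers from the paragraph above; the cash income and the "other assistance" line are hypothetical fill-ins, and the poverty line is again a rough stand-in:

```python
# Rough tally of annual taxpayer resources devoted to a hypothetical
# subsidized Manhattan family, using the round numbers from the text.
# None of the in-kind benefits count as "cash income" for poverty purposes.
benefits = {
    "subsidized housing (implicit annual subsidy)": 50_000,  # from the text
    "Medicaid": 40_000,                                      # from the text
    "food stamps, free phones, EITC, other": 10_000,         # hypothetical fill-in
}
cash_income = 15_000   # hypothetical: below a rough family poverty line
poverty_line = 24_000  # rough stand-in for a family-scale guideline

total_resources = sum(benefits.values()) + cash_income
print(f"Resources devoted to the family: ${total_resources:,} per year")  # $115,000
print(f"Counted toward the poverty measure: ${cash_income:,} per year")
print("Officially 'in poverty':", cash_income < poverty_line)             # True

# Only the cash income counts, so a family consuming well into six figures
# of resources is still scored as "in poverty."
```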

OK, enough of this for now -- I have to go apply for food stamps!