Tactical Linguistics Research Institute

"nox sicut dies illuminabitur"

Philanthropy is Theft, or, How Competitive Society is Optimized for the Success of Individuals who Exhibit Sociopathic Personality Traits

leave a comment »

Elon Musk is the richest African-American. To hear him say it, however, he really doesn’t care all that much about money. He’s actually just tirelessly using his genius to save humanity from its dumb-ass self — and that’s why he needs so much money. He just wants to help out.

But because worldly possessions just weigh a person down, Musk will be selling his homes and belongings, as if to become a penniless, screen-less, wandering cyber mystic. Like a hyper-modern Tolstoy — who disowned his literary output and embraced Christian Anarchism — Musk is devoting his wealth entirely to the true deliverance and redemption of humanity.

African-American Elon Musk gives his future two thumbs up.

And through this delicate, alchemical fusion of earth and heaven — part of the hidden meaning of the riddle of the sphinx — Musk will bring the light of reason to the stars.

Setting aside how obviously sociopathic and delusional this all sounds, there’s a certain contempt towards all the employees and customers who supply him with so much of his wealth: his boasts about giving away all his money were tweeted on May Day — International Workers’ Day. As CEO of Tesla Motors, Musk makes 40,688 times what his average employee makes, the highest CEO-to-worker pay ratio ever recorded. Which, of course, is vital to single-handedly saving humanity.

In addition to getting rich by saving humanity, Musk is ruining astronomy to bring wireless social media propaganda to the entire planet, terrorizing rural residents by starting brush fires and breaking windows and shutting down highways for test launches, forcing employees to go to work during a pandemic and in defiance of local ordinances, earning billions of dollars in corporate welfare in the form of government subsidies and tax credits, and planning to build a humanoid robot in a bid to potentially destroy more jobs than George W. Bush.

The US economy shed roughly as many manufacturing jobs during George W. Bush’s presidency as World War II created.

While many Americans assume their society operates along meritocratic lines — such that the “best and brightest” are entitled to as much wealth as they can accumulate — the greatest predictor of who will become wealthy is not genius or talent, but whether one’s parents are wealthy.

And so both Musk and his brother — both of whom grew up white in apartheid South Africa — have become rather wealthy. Although Musk likes to tell the story that he “left South Africa by myself when I was 17 with just a backpack & suitcase of books,” his father’s fabulous wealth played a pivotal role in Musk’s success.


Musk opportunistically arrived in Silicon Valley to become an entrepreneur right in the middle of the dot-com bubble, when companies with big ideas that existed only on paper were a dime a dozen. The speculative bubble led companies without products to grab up wads of cash through an IPO craze, while venture capitalists threw money around left and right. At this time, Musk’s wits alone had helped him accumulate only $2,000. So his father stepped in with $28,000 to get Musk and his brother off the ground.

Musk then started work on a website with the dreadfully un-sexy name “Global Link Information Network.” A year later, to impress some venture capitalists, he dressed up his web server during an office tour to make it look like a supercomputer; in exchange for a $3 million investment, Musk ceded control of the website he designed to Rich Sorkin. Under Sorkin’s leadership, the company changed its name to Zip2 — something more in line with how early internet firms named themselves — and within three years the firm was sold to the computer maker Compaq. Compaq paid $305 million, of which Musk received $22 million, which enabled him to “flip” a couple more start-ups.

So here is the Elon Musk recipe for success: get born rich, go to the right place at the right time, work hard, employ trickery, and get lucky. Unquestionable genius.

Martin Buber wrote, “The man to whom freedom is guaranteed does not feel oppressed by causality.” We can see the glibness in Musk’s demeanor as the product of a certain type of fate mistaken for merit. It is perhaps this quality of the American new rich — who believe so fervently in the equivalence of merit and money — that most cleanly separates them from the old rich like Bill Gates.


Like Musk, Gates has a story he likes to tell about how he dropped out of college to start a company in his garage — and then his genius made him rich. Of course, it didn’t hurt that Gates was born into a family with the means to send him to Harvard — one of those top universities that seem perfectly willing to admit anybody whose parents can make a sizable donation.

And so it went with Gates: his mother, Mary Maxwell Gates, was on the Board of Regents at the University of Washington, knew the CEO of IBM, and served on the boards of banks and telecommunications carriers. And so it was with Mrs. Gates: her father, James Willard Maxwell, was a banker born around 1900. And so it was with Mr. Maxwell: his father, also named James Willard Maxwell, was also a banker and a former head of the San Francisco Federal Reserve, born at the outbreak of the American Civil War. Bill Gates, Sr. is memorialized in the Puget Sound Business Journal like so: “Gates, a lawyer and philanthropist, was known as an optimist in relentless pursuit of an equitable world.” That’s the whole thing.

The difference in attitude between these two types of wealth — the status-seeking new wealth of the colonialist and the low-key old wealth of the aristocracy — can perhaps illuminate one of the more troubling, Orwellian consequences of societies that permit such accumulations of money and power: philanthropy. The aristocracy experiences something like noblesse oblige, and uses the term “philanthropy” to describe its efforts to justify amassing huge fortunes while its countrymen struggle and millions starve everywhere.


It was perhaps this noblesse oblige that compelled Mrs. Gates in her time at the University of Washington to pressure the University to divest itself of South African holdings to protest apartheid. And which led Gates, Jr. to associate with pedophile-embezzler-drug dealer-spy Jeffrey Epstein in a relentless effort to “get more philanthropy.” As if such absurd amounts of wealth weren’t inherently immoral, regardless of how it’s acquired.

The status-seeking new wealth of the colonialist mindset has another facet, observed by Brazilian educator Paulo Freire in Pedagogy of the Oppressed. Freire wrote:

The oppressed, having internalized the image of the oppressor and adopted his guidelines, are fearful of freedom. Freedom would require them to eject this image and replace it with autonomy and responsibility. Freedom is acquired by conquest, not by gift. It must be pursued constantly and responsibly.

The oppressed suffer from the duality which has established itself in their innermost being. They discover that without freedom they cannot exist authentically. Yet, although they desire authentic existence, they fear it. They are at one and the same time themselves and the oppressor whose consciousness they have internalized.

In order for this struggle to have meaning, the oppressed must not, in seeking to regain their humanity (which is a way to create it), become in turn oppressors of the oppressors, but rather restorers of the humanity of both.

For better or for worse, we must view Musk as a victim of apartheid — not in the same way as Black South Africans, to be sure, but in a more subtle, pernicious way. While he should neither be faulted nor lavishly rewarded for the accidents of his birth, he nevertheless grew up under apartheid, and he internalized the logic of the oppressor class to which he belonged. As a member of a colonial oppressor class, he is unaware of the autonomous psychic processes within himself that re-create the logic of his oppressor class on a colossal scale. He openly identifies with his greed to be first to colonize another world. And so he perpetuates this victimization as a morally-neutered victim himself.


The argument that Musk needs his wealth to save the rest of us from ourselves resembles the historical arguments used by white slave owners in the US to justify treating people like common property. John C. Calhoun, who served as Vice President and then Senator, is known to have remarked:

Never before has the black race of Central Africa, from the dawn of history to the present day, attained a condition so civilized and so improved, not only physically, but morally and intellectually… It came to us in a low, degraded, and savage condition, and in the course of a few generations it has grown up under the fostering care of our institutions

So, modern-day Black Americans can thank slavery for TV. But before we really decide whether it is fit and proper to allow such massive agglomerations of wealth to exist — before we really decide to have the discussion in any kind of coherent way — must we wait for Elon Musk to use his wealth to hire a small mercenary army, take over some African nation, seize its mineral wealth, and continue his project unmolested, like some Charles Taylor on a perverse messianic philanthropic mission? Is this what we want for our cosmic legacy? Is this why the aliens keep us in quarantine?


A New, Clandestine Fiscal Policy?

leave a comment »

Towards the end of the 2008 US Presidential election, Barack Obama’s opponent John McCain repeatedly insisted that “the fundamentals of the economy are sound.” Just two months before the election, the widespread fraud committed by organized finance — popularly referred to as a “financial meltdown” in the media — threatened to undermine “the orderly exchange of commodities in interstate commerce.”

The 2008 financial crisis precipitated by “sub-prime lending” involved fraud in accounting, fraud in the real estate industry, fraud in the use of novel financial instruments to back residential mortgages, and fraud in global inter-bank lending.

John McCain with running mate Sarah Palin — the Tea Party’s first foray into Presidential politics, courtesy of an old-school conservative.

Before the 2007 financial crisis that precipitated the 2008 crisis, the 2000 dot-com bubble, the 2001 Enron energy trading scandal, the 2002 Arthur Andersen accounting scandal, and the 2002 WorldCom accounting scandal gave ample evidence that the impending 2008 “financial meltdown” might have been averted had the Department of Justice, for example, made it a routine practice to hire professional criminologists to proactively look for evidence of fraud in major financial markets.

“Crisis” would, rather, seem to be a common metaphor for “normal,” at least where decisions about huge amounts of money are made. Or, perhaps, it is an exciting, newsworthy way to say the world is run by crooks, and the commercial media sure as shit isn’t here to help.

If you think “social media” is the vital democratizing force here to save us, good luck to you and the malfunctioning DNA that made you.

Major Shifts in the Market for US Treasury Debt

In the wake of the Sub-Prime Mortgage Lending Scandal that served as the proximal cause of a global “financial meltdown” severe enough to help tip the 2008 US Presidential election, President Obama’s Administration continued a policy of “Quantitative Easing” begun at the tail end of the George W. Bush Administration.

Quantitative Easing marked a significant change in US monetary policy: it involves the US Federal Reserve purchasing US Treasury debt in huge volumes. This is highly unusual historically, untested economically, correlated recently with massive transfers of wealth to the wealthy, and intensifying.

Until the 2007-08 financial crisis, the Federal Reserve purchased Treasuries at a steady pace.

Before the “2008 Crash,” the Federal Reserve’s purchasing of US Treasury securities was steady. This is a reasonable way to diversify the Fed’s holdings and to stabilize demand for Treasuries. This process of “monetizing debt” must take place on the open market: the Fed doesn’t buy straight from the US Treasury, but from major banks — which helps create credit.
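The open-market mechanics described above can be sketched as a toy balance-sheet exercise. This is a minimal illustration with hypothetical figures, not a model of actual Fed accounting: when the Fed buys Treasuries from a commercial bank, the bank’s bonds become freshly created reserves, and the Fed’s balance sheet expands by the same amount.

```python
# Toy sketch of debt monetization (hypothetical figures, in $ billions).
# The Fed buys Treasuries from a commercial bank on the open market:
# the bank's Treasuries become reserves, and the Fed's balance sheet grows.

def monetize(fed, bank, amount):
    """Fed purchases `amount` of Treasuries from a bank, paying with new reserves."""
    assert bank["treasuries"] >= amount, "bank cannot sell what it does not hold"
    bank["treasuries"] -= amount
    bank["reserves"] += amount      # newly created money, credited to the bank
    fed["treasuries"] += amount     # the Fed's holdings expand by the same amount

fed = {"treasuries": 0}
bank = {"treasuries": 500, "reserves": 100}   # hypothetical starting positions

monetize(fed, bank, 200)
print(fed)   # {'treasuries': 200}
print(bank)  # {'treasuries': 300, 'reserves': 300}
```

The point of the sketch is only that no deposits or taxes fund the purchase: the bank’s new reserves are created by the transaction itself, which is why these purchases expand the money supply.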

President Obama’s policy for managing the “toxic mortgage assets” involved stabilizing the market by purchasing “toxic assets” to prop up their price, recapitalizing insolvent banks by expanding the money supply through huge purchases of Treasury bills, and restricting lending to slow the rate at which the new money entered the economy, as a means to limit inflation.

After the 2007-08 Financial crisis, the Federal Reserve increased its Treasury holdings dramatically.

Over the course of President Obama’s eight years in office, the Federal Reserve tripled its holdings of Treasury bills to just over $2.4 trillion — roughly the amount of Treasury debt held by China, Japan, Canada, and Mexico (our major trading partners) combined. These debt purchases meant that the US economy received $2.4 trillion more than the total value of goods and services produced.

Injecting large amounts of cash into the economy risks creating inflation, and control over interest rates is the main tool the Federal Reserve uses to combat inflation. During the Obama Administration the Fed took on an additional inflation risk as interest rates plummeted to spur economic activity in the form of borrowing (creating more consumer debt and more profits for the banks). At the same time, rates on savings also plummeted, so that anybody with modest savings lost the yields they had previously earned from mundane financial instruments like CDs or a bank savings account (gradually shifting more money to the major backers of the banks). A savings account as the cornerstone of smart personal finance is no longer a meaningful option.

Lowering interest rates hurt individuals who save money but, combined with “quantitative easing,” helped recapitalize insolvent banks.

While the jury is very much still out on the long-term effects of this type of monetary policy, its use does not appear to be limited to this single major financial crisis. In the final year of the Trump Presidency, the US embarked on another round of quantitative easing — without much discussion of how to manage the long-term consequences.

More Major Purchases of US Treasury Debt

When the COVID pandemic led to lockdowns and layoffs, the US faced various forms of economic disruption. By April 2020, over 20 million people found themselves out of work, and the unemployment rate rose above 14%. This, in turn, threatened the purchasing power of many families and, ultimately, the revenue of large corporations. Because interest rates were already so low — pegged at 0.05% in April 2020 — the Fed could not invoke this tool to spur economic activity.

Because Congress is unwilling to tax the wealthy or aggressively tax large corporations, the only option available to the Federal Reserve was to provide more credit itself by purchasing Treasury bills.

While the Federal Reserve’s US Treasury holdings tripled under the Obama Administration, they doubled again under Trump’s.

Instead of easing back on Treasury purchases as planned — which would gradually restrict the money supply after creating trillions of dollars in new credit — the Fed resumed Treasury purchases in early 2020, eventually doubling the amount of Treasury debt on its books — which had already tripled in the previous decade.

This policy — creating massive amounts of new credit in the form of Fed purchases of US Treasury bills — appears to be continuing into the Biden Administration.

What Are the Implications of This Monetary Policy?

This monetary policy is pumping new credit into the financial system, but not in a way that benefits ordinary individuals. Taxpayers have continued to struggle financially through the COVID pandemic, receiving small, infrequent stimulus payments and relying on extended unemployment benefits because available wages aren’t keeping up with the cost of living.

Full view of the Federal Reserve’s acquisitions of Treasury Securities.

At the same time, the wealthy have become astronomically wealthier. The wealthiest few percent — whose wealth derives from structured finance — have increased their wealth by 54%, or $4 trillion, during the pandemic, while 200 to 500 million people slid into poverty. While the relationship is not exactly direct, this $4 trillion figure is nearly identical to the amount of new credit the Federal Reserve has created during the pandemic. Bailouts to large, struggling corporations wind up in executive bonuses at a much higher rate than in the pockets of employees: during 2019, CEO compensation increased 14%, such that the average CEO makes 320 times as much as the average employee.

The long-term implications for the economy are unclear, as this is a new, little-discussed acceleration of the process of financialization that began in the 1980s. In addition to facilitating massive wealth transfers, this may, ultimately, impact the global market for dollars and the “real economy” that is increasingly marginalized by the financial sector.

Under the post-war neo-Keynesian economic model, Federal debt is perfectly sustainable as long as the economy grows at a rate that exceeds the interest on the debt. Several key historical US policy decisions echo this principle: Woodrow Wilson established the Federal Reserve in 1913, and Nixon ended the convertibility of dollars to gold in 1971. As a result of these policy decisions, the US dollar was able to become the major global reserve currency.
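The sustainability condition can be illustrated with a short calculation. Assuming a balanced primary budget, the debt-to-GDP ratio b evolves each year as b × (1 + r) / (1 + g), where r is the interest rate on the debt and g is the growth rate; the rates below are hypothetical:

```python
# Minimal sketch of the neo-Keynesian sustainability condition: with a balanced
# primary budget, the debt-to-GDP ratio evolves as b_next = b * (1 + r) / (1 + g).
# If growth g exceeds interest r, the ratio shrinks; if r exceeds g, it compounds.

def debt_ratio_path(b0, r, g, years):
    b = b0
    path = [b]
    for _ in range(years):
        b = b * (1 + r) / (1 + g)
        path.append(b)
    return path

sustainable = debt_ratio_path(b0=1.0, r=0.02, g=0.04, years=50)
unsustainable = debt_ratio_path(b0=1.0, r=0.04, g=0.02, years=50)

print(round(sustainable[-1], 3))    # falls well below the starting ratio of 1.0
print(round(unsustainable[-1], 3))  # compounds well above 1.0
```

Fifty years of 4% growth against 2% interest erodes the debt burden by more than half, while the reverse assumption more than doubles it, which is why the growth-versus-interest comparison, not the raw debt figure, does the work in this model.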

It is not true that “since the US abandoned the gold standard, the value of the dollar isn’t based on anything anymore,” as argued by many conservatives who would like to see a return to a gold-backed dollar. The value of the dollar is driven by the demand for dollars — backed largely by the demand for US goods and for oil from OPEC.

Although many Americans despair that “the United States doesn’t make anything anymore,” this is not true either. The United States manufactures more than ever before — except that this is now done with machines instead of people, especially since the highly contested election of George W. Bush in the fall of 2000 signaled a change in the political order. Foreign buyers who want US goods need dollars first, and this demand for dollars helps maintain the price of the dollar.

Another major source of demand for dollars comes from the demand for oil. The US produces more oil than Saudi Arabia. Because oil is a global commodity, it is priced according to global demand, and the global demand for oil helps maintain the value of the dollar. When Nixon ended the last vestige of the gold standard, the US struck a deal with Saudi Arabia and the other OPEC members: OPEC agreed to price oil in dollars, creating the petrodollar. Any buyer on Earth who wants oil from OPEC must purchase dollars first, which creates a global demand for dollars.

At the moment, Federal Reserve policy is creating the demand for dollars out of thin air. Foreign governments currently hold about $10 trillion in Treasury securities, about equal to what the Federal Reserve and local governments hold — except half of that is just from the past decade, representing a major shift in the financial order.

What happens if OPEC stops pricing oil in dollars, and starts using the yuan? This would be a problem for the dollar, unless the Federal Reserve acts to avert such a crisis with another round of quantitative easing. What happens if electric vehicles reduce the global demand for oil? You can be sure that large corporations — having failed to plan for this eventuality — will be rewarded with another round of quantitative easing.

What this Fed policy appears to be creating is a financial order where the demand for dollars among the owners of financial wealth keeps the “real economy” functioning: essentially, the demand for dollars among the wealthy can be used to replace the demand for dollars among nations seeking oil or US manufacturing goods. With interest rates near zero, inflation can be controlled by ensuring that most people never see any of the Fed’s new financial wealth: as long as those dollars stay in a rich person’s offshore bank account, they stay out of the economy.

This is a financial system that requires an ultra-wealthy, financialized oligarchy who, in turn, sell ordinary citizens commodity survival on credit. The purpose of the individual in this new system, then, is to convert credit into debt — no more, no less. No more owning music on vinyl or tape or CD, or films, or physical books; no more owning phones (now leased), or cars, or houses, or even one’s own online social activity or the economic value thereof — unless one is selling images of one’s young, un-spent body.

Gone are the days when individualist economists like Friedrich Hayek cautioned that using individuals as means to economic ends was the hallmark of authoritarian economies.

When capitalism enjoys a monopoly and no longer needs to compete in the marketplace of ideas, all options are on the table: education, healthcare, arts, and culture become unnecessary social expenses that diminish the ability of individuals to convert credit into debt.

Written by Indigo Jones

April 28, 2021 at 3:19 pm

New World Order for Fun and Profit

leave a comment »

The Martial Lord of Wei asked one of his ministers what had caused the destruction of a certain nation-state. The minister said, “Repeated victories in repeated wars.”

The Martial Lord said, “A nation is fortunate to win repeated victories in repeated wars. Why would that cause its destruction?”

The minister said, “Where there are repeated wars, the people are weakened; when they score repeated victories, rulers become haughty. Let haughty rulers command weakened people, and rare is the nation that will not perish as a result.”

— Masters of Huainan (ca. 200 BCE)

What Just Happened?

Although many American voters feel alarmed and disoriented by Donald Trump’s rise to power, it is nevertheless the direct, causal result of millions of Americans pretending, over the last 30 years, that the Democrats are an opposition party. Unfortunately for these naive souls, the real difference between Democrats and Republicans is that Republicans are delusional while Democrats are in denial.

At present, Barack Obama and other prominent Democrats are blaming Russia for manipulating the election, which is pure propaganda. Russia probably manipulates every election; China and others probably do too — just as we routinely interfere with theirs.

The real issue is both more nuanced and more troubling.  In Wisconsin, for example, less than 1% of the vote separated Clinton from Trump. For comparison, in Florida after the 2000 election, a similarly narrow result triggered a recount by statute.

When one looks at what factors constitute that 1% margin in Wisconsin, however, Russian involvement is not the decisive factor (though it may be a bit of a wildcard). Rather, a number of pervasive influences within American society exert a far more profound influence:

1) Systematic black disenfranchisement due to Reagan-era federal sentencing guidelines (2.2 million voters nationally)

2) Systematic disenfranchisement through Voter ID laws (coordinated among statehouses through think tanks like ALEC)

3) Closed-source electronic voting machines with proprietary code that behave anomalously but can’t be audited

4) Rampant gerrymandering

5) Campaign strategies that game the electoral college (Clinton won the 2016 popular vote, just like Gore won the 2000 popular vote against Bush)

6) The Citizens United Supreme Court ruling overturning campaign finance reform, opening a floodgate of untraceable political manipulation

7) Sporadic election fraud by officials like Kathy Nickolaus in Waukesha

Of course, Barack Obama — in blaming Russia — cannot call attention to these issues, because doing so undermines the validity of the very system from which he derives his power, influence, prestige, and identity.

This was the same reason Al Gore couldn’t call for protests in the street after the 2000 election: he’s part of the system and therefore needs to help preserve it.  So now, instead of solutions, we get propaganda from the left in addition to the right.

What Does it Mean?

Historian Robert Paxton has analyzed historical fascist movements, and has discerned five distinct stages along the way to fascism:

1. Development of the ideology
2. Ideology takes root
3. Ideology gains power
4. Ideology exercises power
5. Open society overtaken by entropy or becomes a police state

In Paxton’s analysis, proto-fascist disillusionment with popular democracy begins in rural areas, around a rhetoric of renewal. The Greek Golden Dawn party, which has held seats in both the Greek parliament and the European Parliament, holds the phoenix as the emblem of its movement.

Popular with past Greek fascist movements, the symbol of the phoenix — which rises reborn from its own ashes — resonates with Trump’s promise to “Make America Great Again.”

After a fascist ideology takes hold, the traditional conservative elites adopt the rhetoric of the rural brownshirts to stave off a resurgent progressive movement. Which is to say: Hitler used the brownshirts but did not create them, just as the Republican Tea Party took advantage of American right-wing militants.

Once the traditional conservatives firmly re-establish control over the resurgent progressives, the brownshirts become emboldened to act on their own. Historically, this has often taken the form of attacks on farmers. It may seem appealing to suppose — given the current state of race relations in the United States and the animosity between rural and urban areas — that a modern equivalent of this step would consist of right-wing militants squaring off against disaffected Black inner-city youth (who figure they have nothing to lose, since they’ll be dead or in jail by the time they’re 20). However, for all the problems with policing, it is unlikely that many police would sit idly by while white militants picked off Black folks indiscriminately. Moreover, rural folks are often afraid of the city, making such a scenario less likely.

While rural folks are afraid of the city, however, they’re not afraid of Mexican migrant farm workers with no rights. Those workers will be the canary in the coal mine, or the sacrificial lamb, for the same reason a john beats up a hooker: she can’t take it to the cops.

Once the traditional conservative elements try to restore order in the face of quasi-sanctioned vigilantism, two general outcomes typically result: entropy, or a police state.

Since Bill Clinton put the wiretaps in place, Bush switched them on, and Obama made it legal to use them, we would seem to have a pretty comprehensive system of repression in place already.

What’s Next?

The panopticon principle makes it pretty clear that surveillance is not some passive proposition, but an active system of control. If you CAN be monitored at any time, but NEVER know exactly when, it is in your interest to behave at ALL times as though you ARE being monitored.

Foucault summarized Bentham’s insight:

“Hence the major effect of the Panopticon: to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power…
“So… that the surveillance is permanent in its effects, even if it is discontinuous in its action; that the perfection of power should tend to render its actual exercise unnecessary; that this architectural apparatus should be a machine for creating and sustaining a power relation independent of the person who exercises it; in short, that the inmates should be caught up in a power situation of which they are themselves the bearers.”
— Michel Foucault, Discipline and Punish, 1975

The surveillance society depicted in George Orwell’s novel 1984 is built along these precise lines:

“There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to.”

“You had to live–did live, from habit that became instinct–in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.”

— George Orwell, 1984 (1949)

The social deviance of 1984’s protagonist Winston Smith is eventually arrested through police entrapment, not through the efficacy of the surveillance infrastructure.  Smith wanders into a forbidden part of town and rents a room from an undercover cop to use as a love nest.

In our world, these forces are already being brought to bear on the American population.  What’s needed first is a sober recognition of these realities.

Those who would organize an opposition should not do so over social media.  The element of surprise is a rudimentary component of any strategy — one which would-be organizers surrender without a fight when they organize on a wiretap.

Those who would organize an opposition should not report their activities on social media.  The types of information gathered through the elaborate system of hundreds of thousands of spies and informers utilized by the East German state are routinely handed over voluntarily today on FaceBook.  The East German state maintained control by exploiting personal information, and our system is not likely to be much different.

Cellphones are tracking devices.  Even with GPS off, they are in constant communication with cell towers.  The slight time differential between when a phone’s signal is picked up at multiple nearby towers can be used to precisely locate any such mobile device.
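A rough sketch of how such tower-based positioning works. The tower coordinates and handset position below are hypothetical, and a simple grid search stands in for the real estimation methods carriers use; the point is that arrival-time differences alone, with no GPS, suffice to locate the device:

```python
# Time-difference-of-arrival (TDOA) localization sketch with made-up coordinates.
import math

C = 299_792_458.0  # speed of light, m/s

towers = [(0.0, 0.0), (4000.0, 0.0), (0.0, 4000.0), (4000.0, 4000.0)]
phone = (1200.0, 2500.0)  # the position we pretend not to know

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Each tower timestamps the same transmission; only the *differences*
# relative to tower 0 matter, so no synchronized phone clock is needed.
arrival = [dist(phone, t) / C for t in towers]
measured_tdoa = [a - arrival[0] for a in arrival]

def predicted_tdoa(p):
    d = [dist(p, t) / C for t in towers]
    return [x - d[0] for x in d]

# Coarse grid search: the point whose predicted TDOAs best match the
# measured ones is our position estimate.
best, best_err = None, float("inf")
step = 25.0  # grid resolution in meters
for i in range(161):
    for j in range(161):
        p = (i * step, j * step)
        err = sum((a - b) ** 2 for a, b in zip(predicted_tdoa(p), measured_tdoa))
        if err < best_err:
            best, best_err = p, err

print(best)  # recovers the phone's position to within the grid resolution
```

With four towers and a 25-meter grid, the search recovers the handset to within one grid cell, which is roughly the precision claim in the paragraph above: the differential timing alone pins the device down.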

The walls have ears.  Cellphones speak and understand English now.  Past movements — the labor movement, the women’s movement, the civil rights movement — succeeded without social media.  An opposition movement that has any chance of success today will be no different in that regard.

 

Written by Indigo Jones

January 25, 2017 at 7:45 pm

Wrestling with Phantoms

leave a comment »

Last month, ABC News reported on the results of an undercover Department of Homeland Security test to assess the effectiveness of security screening procedures at US airports.  Although the results of the study could not be independently confirmed, multiple media outlets repeated the claim that agents smuggling fake contraband through airport security were able to get 95% of their fake weapons and explosives past screeners at the Transportation Security Administration’s travel checkpoints.

While most of the recycled news stories predictably framed the findings in terms of the ineptitude of the unpopular TSA, from a statistical perspective the meaning of the report is much different.

A 95% security failure rate is the best empirical evidence we have that there are no terrorists at airports.  If there were terrorists smuggling guns and bombs into airports, then, presumably, we would have heard about some horrible jihadi shooting spree at an airport by now.  It would seem the only people smuggling contraband into airports these days are government agents themselves.
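The statistical point can be made concrete with a short base-rate calculation. Assuming, as a simplification, that smuggling attempts are caught independently at the reported 5% rate, a record of zero successful attacks requires that every real attempt was stopped, and the probability of that collapses quickly as the number of attempts grows:

```python
# Base-rate sketch: if screeners stop only 5% of attempts, what is the chance
# that `attempts` independent smuggling tries ALL get caught (i.e., zero
# successful attacks are observed)?

stop_rate = 0.05  # per the reported DHS red-team result

def p_zero_successes(attempts, stop_rate):
    """Probability that every one of `attempts` tries is intercepted."""
    return stop_rate ** attempts

for attempts in [1, 2, 5, 10]:
    print(attempts, p_zero_successes(attempts, stop_rate))
```

Even two real attempts per year would leave only a 1-in-400 chance of zero successes; ten attempts, about 1 in 10 trillion. So a long run with no attacks is far more consistent with near-zero attempts than with porous screening holding back a wave of smugglers.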

This state of affairs — where the government bureaucracy terrorizes itself on TV, with the general population as collateral damage — has a long pedigree.  It is, effectively, a Cold War phenomenon wrapped in a new garb to keep the boogieman scary.

During the Cold War, American schoolchildren were periodically terrorized by “duck and cover” drills, which presumed to offer some defense against a surprise Soviet nuclear attack.  Despite all the hype about the Soviet nuclear threat, however, the only radioactive fallout Americans were ever exposed to during the Cold War came from the American government itself.  Decades of atmospheric tests, thermonuclear tourism in Las Vegas, nuclear accidents like Mighty Oak (radiation from which was blamed on Chernobyl) and Midas Myth, all exposed Americans to nuclear fallout in the name of fighting the Soviet nuclear threat.

Today, during the War on Terror — which continues despite the retirement of that epithet — we have a similar scenario, where the Federal Bureau of Investigation incites and entraps would-be terrorists in order to justify the government’s anti-terrorism policies.  From the “Detroit Sleeper Cell” (a farce created by prosecutorial misconduct) to the “Liberty City Seven”, most high-profile terrorism cases have been the creation of the FBI.  Even if the charges in these cases are overturned, once the incident gets into the mass media, the damage is done.  This is a reality TV replacement for the fruity loops terror alerts.

While the first victim in all this is the truth, the uncritical parroting of spurious claims by the media adds a new dimension to the current brainwashing.  We like to assume that because of the Internet, information is more accessible than ever.  Unfortunately, the critical distinctions between information, facts, truth, and intelligence have been lost.

Government Accountability and Efficiency

leave a comment »

In a recent talk about cyber security, NSA Director Mike Rogers claimed:

As it stands, Rogers explained, we’re losing somewhere between $100 billion and $400 billion worth of intellectual property to theft each year. This, he said, is of particular concern to the Department of Defense, which watches as its contractors’ networks are regularly compromised by adversaries.

To the extent that his statement reflects how US tax dollars are spent, the situation is largely the result of military downsizing under President Clinton, which wasn’t really downsizing, but outsourcing.  Issued on May 21, 1996, Executive Order 13005 – Empowerment Contracting states its objectives as follows:

In order to promote economy and efficiency in Federal procurement, it is necessary to secure broad-based competition for Federal contracts. This broad competition is best achieved where there is an expansive pool of potential contractors capable of producing quality goods and services at competitive prices. A great and largely untapped opportunity for expanding the pool of such contractors can be found in this Nation’s economically distressed communities.

The problem is, this way of approaching “efficiency” leads directly to reduced accountability. Those who rail against government inefficiency don’t understand that accountability is not efficient: it is not efficient to justify your actions at every step. So the push to make government more lean and “efficient” by outsourcing government functions to the private sector leads directly to an erosion of accountability. You can’t have both accountability and efficiency as policy goals.

Pulitzer prize-winning historian Garry Wills suggested that, for example, part of why the Manhattan Project was conducted with such extraordinary secrecy was specifically to evade accountability. The Russians knew what we were up to, the Germans probably knew too; it was the American people who were kept in the dark. Wills argues this was probably to avoid potential opposition to the development of nuclear weapons in light of the 1925 Geneva Protocol against chemical and biological weapons. Around the globe, people were shocked by the destructiveness of mechanized warfare during World War I and by the use of chemical weapons.  The First World War and the technological horrors it brought were still very much in public memory by the time World War II came around.

Wills also points out that this use of secrecy to evade accountability was no isolated instance. When the US bombed Cambodia, the Cambodians knew it, it was US citizens kept in the dark. When the US invaded Cuba, the Cubans knew what was happening and the Soviets knew, it was US citizens kept in the dark.

Today we have active drone campaigns in at least eight foreign countries responsible for the deaths of thousands in what is essentially an undeclared global war. Insofar as the targets are terrorists, the terrorists know they’re being targeted. Again, it’s US citizens kept in the dark.

The origin of the “state secrets” doctrine derives not from any law that Congress passed, but from efforts by the US military to evade accountability over flaws in the engine design of a new aircraft, which led to the deaths of several citizens.

Accountability is not efficient. To increase accountability in surveillance matters, there needs to be a reduction in contracting, which means the government needs to get bigger. Edward Snowden — a contractor himself — would seem to be a clear-cut example in support of this view.

Written by Indigo Jones

February 26, 2015 at 8:44 pm

Primer on Resistance and the Surveillance State

leave a comment »

There’s no Internet without surveillance. The Internet was built by the US military to be robust, not for privacy or security.  Privacy was not part of the Internet’s design goals.

The Internet became a commonplace household word in part because of the hype surrounding an economic bubble created during the presidency of Bill Clinton.  Under Bill Clinton, the US Congress also enacted the Communications Assistance for Law Enforcement Act at the same time that Windows 95 introduced Americans to personal computers and the phrase “information superhighway” introduced Americans to networking. Surveillance was an integral part of handing the Internet over to commerce.

The relationship between commerce and the surveillance state is now well-established: Apple and Microsoft are suspect, and Yahoo has made surveillance a business proposition — as per 18 U.S.C. § 2706, Yahoo’s 2009 rates ran as follows:

Basic subscriber records cost $20 for the first ID, $10 per ID thereafter; basic group information (including information about moderators) cost $20 for a group with a single moderator; contents of subscriber accounts — including email — cost $30–$40 per user; contents of groups cost $40–$80 per group.

Given that typical internet advertising revenue brings in only pennies per click, the current scale of Internet surveillance clearly implies that spying on customers is big business for online firms.
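A rough, purely illustrative comparison makes the point — the per-click figure below is an assumption standing in for "pennies per click," not reported data:

```python
# Assumed figures: one surveilled account's contents vs. per-click ad revenue.
record_price = 30.0       # low end of the quoted 2009 rate for account contents
revenue_per_click = 0.05  # assumed "pennies per click" advertising revenue

# How many ad clicks one records request is worth to the firm.
clicks_equivalent = record_price / revenue_per_click
print(f"One records request is worth roughly {clicks_equivalent:.0f} ad clicks")
```

Under these assumptions, a single account-contents request earns as much as several hundred ad clicks — which is why compliance can look like a revenue stream rather than a burden.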

Other telecommunications carriers have made similar overtures, some companies have faced legal and economic reprisal for refusing to cooperate, and yet others have availed themselves of their free speech rights as corporate persons to engage in this dubious commerce.

It should be reason enough to be disturbed by NSA surveillance that the Founders prohibited this type of information gathering in the 4th Amendment to the US Constitution. The excuse “I’ve got nothing to hide” misses the point.  The government should obey the law; that’s a core feature of what “rule of law” means. And the example of non-violent resistance through non-participation set by Gandhi and the Southern Christian Leadership Conference and vegetarians and vegans offers a clear lesson for how to resist the surveillance society: stop participating in an abusive system.  The Internet is cruelty to human animals and it’s bad for the social environment.

If it weren’t for so many Americans purchasing data plans on “smart” phones, purchasing home Internet access, and dutifully reporting their daily thoughts, habits, and psychological makeup on FaceBook accounts, the costs to Uncle Sam for maintaining the current surveillance state would very rapidly prove prohibitive.  That is, if the government had to pay your phone bill and your internet costs and pay a spy to follow you around to listen in on your conversations, it could no longer afford to spy on everybody. Through consumer habits and the cultural value placed on convenience, Americans effectively subsidize the surveillance state on behalf of the government.  Dan Geer stated the matter succinctly: our online choices are between freedom, security, and convenience, but we can only pick two.

From a cost perspective, a “vegetarian” approach to resisting the surveillance state (that is, by simply opting out) is an inexpensive solution that aims at increasing the cost of surveillance to the state.  This approach requires little social coordination other than a shared will to change prevailing circumstances — and a little personal initiative.   Such a “vegetarian” approach also serves to inject additional uncertainty into what data is gathered (thereby diminishing the value of what data Uncle Sam does collect).  This doesn’t mean life without the internet any more than vegetarianism means life without food, it just means being more selective about where your internet comes from, where you take it, and what you do with it.

You don’t need to be online all day.  A good starting point would be to make a habit of leaving your cellphone tracking device at home once in a while.  Just because your cellphone is wireless, that doesn’t mean you need to take it with you everywhere you go.  If you take it with you everywhere you go, it’s more of a tracking device than a phone.  When Uncle Sam looks through your cell tower data, changing your cellphone habits will increase the uncertainty as to your location at any given time during the day.

If you care to preserve “democracy,” all that’s really needed is a little social coordination and a willingness to put up with a little less “convenience.”  This may sound incompatible with the modern world, but there’s good reason to get motivated: the modern world is incompatible with the perpetuation of the human race.  There’s more at stake than a little privacy, though the more fundamental problem is bound up with the psychology of consumer society: in a growth economy based on persuasion through advertising — where consumers must make choices about the allocation of their scarce resources — every new product requiring new investment must be presented as needful and fundamental to the modern way of life.

Many people know things have gone awry with the modern world: between the threats posed by persistent national militarism, thermonuclear war, war over resources, mass hunger, environmental degradation, climate change, shortening attention spans, new communicable diseases — something is clearly wrong.  And yet, somehow, everyone looks to another for the solution.  Nobody is willing to see their complicity and change their behavior.  So: if you don’t like internet surveillance, stop surveilling yourself.  The problem isn’t some nebulous “big brother,” it’s you. The government isn’t going to change its behavior, so stop waiting for the government to save you from the government. You have to save yourself from yourself.

 

Myth-Making for a Conservative Nation

leave a comment »

In 1988, Guy Debord observed:

“We believe we know that in Greece, history and democracy appeared at the same time. We can prove that their disappearances have also been simultaneous.”

“To this list of the triumphs of power we should, however, add one result which has proved negative for it: a State, in which one has durably installed a great deficit of historical knowledge so as to manage it, can no longer be governed strategically.”

This absence of historical knowledge manifests itself today.  In the fall of 2013, after weeks of partisan gridlock, the US Congress managed to re-open the Federal government with a last-minute deal.  This partisanship, however, is not ideological: it is emotional, it is irrational, and its results are unpredictable.

In response to this minor “accomplishment” President Obama remarked:

“Let’s work together to make the government work better, instead of treating it like an enemy, or making it worse.  That’s not what the founders of this nation envisioned when they gave us the gift of self-government.”

The President’s statement seems, on its face, uncontroversial — and that’s a big part of the problem.  His statement is profoundly anti-historical, and in the most problematic manner possible, reads present values into the past.

Until 1850 or so, only white men with substantial wealth — such as bankers, factory owners, or plantation owners — were allowed the vote.  Renters, subsistence farmers, and the laboring majority — whites and blacks — lacked political representation at the time of the US’s founding.  The Constitution made no mention of suffrage until the 14th and 15th Amendments in 1868 and 1870 — some 80 years after the original Constitution was ratified.  In 1875, the US Supreme Court explicitly ruled that the 14th Amendment — which defined citizenship for the first time — did not give women the right to vote.  Women didn’t get the vote until 1920.  Blacks didn’t get full rights until the Civil Rights Act of 1964 and the Voting Rights Act of 1965.

That’s an odd “gift of self-government.”  Truly, it sounds more like a long, hard struggle to obtain self-government — IN SPITE OF the Founding Fathers.  Indeed, despite being subjects of the Crown, the British managed to outlaw the slave trade, abolish slavery, and enfranchise women before the United States.  And despite being subject to the Crown, the British today enjoy many of the same rights — like freedom of religion, freedom of the press, freedom of speech — that Americans consider distinctly American innovations under the Bill of Rights.

Though the myth of Democracy endures, it is plain to see that the Founders feared Democracy, and made provision to prevent its emergence in the New World.  That so much time passed before American citizens obtained universal suffrage is testament to the effectiveness of the Founding Fathers’ plans.

In arguing for a new Constitution to replace the Articles of Confederation, Founding Father Elbridge Gerry complained to the Constitutional Convention on May 31, 1787: “The evils we experience flow from the excess of democracy.”  Founding Father Edmund Randolph also complained about the “turbulence and follies of democracy.”  In Convention, Founding Father John Dickinson argued against expanding political enfranchisement: “The danger to free governments has not been from freeholders, but those who are not freeholders.”  Dickinson went so far as to claim that a constitutional monarchy was “one of the best Governments in the world.”

Founding Father Alexander Hamilton, in Convention on June 18, 1787, expressed his belief that “nothing but a permanent body can check the imprudence of democracy … you cannot have a sound executive upon a democratic plan.”  In The Federalist #10, James Madison wrote: “democracies have ever been spectacles of turbulence and contention, have ever been found incompatible with personal security and the rights of property.”

Thomas Jefferson — who did favor democracy — was not particularly effective in pushing his views.  He was not even in the country when the Constitution was drafted and ratified.

The attainment of self-government in the United States was no “gift.” President Obama is either unaware of this nation’s history, or perhaps he has no problem brushing it aside in his public statements.  Perhaps he is content that the myth of America’s long history of revolutionary self-government is a suitable expedient in the short-sighted calculus of contemporary electoral politics.

Assuming a political narrative where “liberals” oppose “conservatives,” one might think the first black president in the United States would show some slight interest in calling attention to the heritage of liberal reformers and progressives who paved the way for him to attain high office.  One might suppose it would be in his political advantage to make it clear for all to see that the rights most “conservatives” today enjoy are the result of the efforts of their political adversaries.

Instead, the President’s choice to ignore this history — and to implicitly endorse the a-historical nationalist myth favored by self-described “conservatives” — obscures real threats to what measure of democracy Americans have gained by long struggle.

“Conservatives” who think they are defending the “gift” of democracy vehemently favor so-called “voter ID” laws — which have the effect of disenfranchising students and the elderly.  The ostensible rationale for these laws — that they prevent voter fraud — not only points to a problem that does not appear to exist, but effects a sort of bait-and-switch.  These laws brush aside legitimate concerns about election fraud — they ignore systemic flaws with electronic voting machines, irregularities in the 2004 elections which may have been covered up, voting irregularities and questionable legal activities surrounding the contested 2000 election, and ongoing voting irregularities in parts of small town America like Waukesha, Wisconsin.

In addition to enacting “voter ID” laws, “conservative” governors have been purging voter rolls, and continue to push for national policies — such as the failed War on Drugs and mandatory minimum sentencing laws — which have had the net effect of leaving one in five black men disenfranchised.

The “conservatives” perhaps don’t know to what condition their “traditionalist” views are returning this country — and our “liberal” President does not seem particularly concerned with remedying the matter.  And for all the lofty speech of “history” surrounding the 2008 elections, the word was routinely used not by way of elucidating the past, but by way of branding what was then the present moment.

All this is perhaps fitting.  President Obama’s policies are by and large center-right.  With the exception of gay marriage, he has a dismal civil rights record that includes granting amnesty to CIA torturers, failing to close the Guantanamo Bay detention facility, extrajudicial assassinations of US citizens, requesting indefinite detention provisions in the 2012 National Defense Authorization Act, negotiating secret treaties, expanding and legitimating Bush-era surveillance programs — despite well-documented evidence detailing what these programs can lead to — violating the sovereignty of foreign nations with drone strikes, prosecuting whistleblowers with a vengeance … and the list goes on.

Mr. Obama’s calls for “unity” are perhaps well-intentioned, but cannot be strategically effective in the absence of historical knowledge among the population.  Something closer to a one-party system is unlikely to mend the damage caused by a dysfunctional two-party system.  What America needs is not “unity” but a real opposition party — a role that, in the face of “conservative” efforts in the Tea Party Caucus — the Democratic party seems unwilling or unable to fulfill.

Written by Indigo Jones

October 21, 2013 at 1:35 pm

Marijuana and Medicine

leave a comment »

Although many states in the US have been passing laws allowing for medical and even recreational use of cannabis, the substance remains illegal at the federal level.  Specifically, cannabis is classified as a Schedule I substance under the Controlled Substances Act.  This classification is on the basis of three main criteria:

1. The drug or other substance has a high potential for abuse.

2. The drug or other substance has no currently accepted medical use in treatment in the United States.

3. There is a lack of accepted safety for use of the drug or other substance under medical supervision.

Cannabis is classed with Heroin and LSD.  Cocaine, Morphine, and Oxycodone are less tightly regulated Schedule II substances.

As cannabis use persists among teenagers and adults, both legally and illegally, more and more people — especially young people — are seeing firsthand that the risks associated with the use of cannabis don’t square properly with the federal government’s treatment of cannabis.  If the government has an interest in protecting young people from the risks of substance abuse, it also has an interest in providing accurate information and formulating sensible policies that don’t simultaneously undermine its own credibility.

Problems with the Current Classification of Cannabis

The Schedule I classification of cannabis has a number of problems.  First, abuse is hard to quantify, and just what patterns of cannabis usage fall under this rubric are not well defined.  Second, there are few quantifiable safety concerns with cannabis:   the substance is profoundly non-toxic, and it is, for all practical purposes, impossible to overdose on cannabis.  This distinguishes cannabis from other Schedule I substances like heroin, and even from legal recreational drugs like alcohol, which is a contributing factor in the death of some 80,000 Americans each year.  Third, the current federal scheduling of cannabis does not take into consideration the accepted medical use of cannabis in a number of states.  The Department of Veterans Affairs has issued a formal directive permitting the clinical use of cannabis in those states where medical uses are approved.  Researchers studying the relative risks and merits of the substance encounter great difficulties acquiring suitable samples to study, and their findings are of limited applicability to the way the substance is routinely consumed in a non-standardized, non-regulated black market.

Perhaps the most dramatic difficulty with the federal government’s position on cannabis is that the US Department of Health and Human Services holds a patent on medical uses of cannabis.  Issued in 2003, US Patent #6630507 is titled “Cannabinoids as antioxidants and neuroprotectants.”  The patent examines a molecule found in cannabis, CBD, though the chemical mechanism the patent identifies should be present in all cannabinoids, including THC.  The patent notes that “cannabinoids are found to have particular application as neuroprotectants, for example in limiting neurological damage following ischemic insults, such as stroke and trauma, or in the treatment of neurodegenerative diseases, such as Alzheimer’s disease, Parkinson’s disease and HIV dementia,” and also indicates that cannabinoids offer a unique delivery mechanism due to the facility with which these molecules can cross the blood-brain barrier.

When cannabis was originally listed as a Schedule I substance in 1970, the classification was intended to be provisional, pending the results of an ongoing study.  The National Commission on Marijuana and Drug Abuse issued the study findings in 1972, finding that there “is little proven danger of physical or psychological harm from the experimental or intermittent use of the natural preparations of cannabis.”  Although the study recommended de-criminalizing cannabis and treating use or possession akin to alcohol, President Nixon chose not to implement the Commission’s recommendations, and marijuana has remained a Schedule I substance since.  Although whole-plant marijuana remains a Schedule I substance, the synthetic THC called dronabinol — sold under the brand name Marinol — is classified as a less-restricted Schedule III substance.

Social Attitudes Affecting Cannabis as Medicine

In the United States, much of the opposition to medical cannabis laws has presumed that such laws are just a “first step” towards outright legalization.  While there is little to suggest such an outcome would be inherently detrimental, there is also ample evidence that supports medical uses of cannabis on the substance’s own merits.

What presents a more profound problem to the public is in part a tacit sociology of medicine that limits and proscribes how individuals view treatment.  Politicians have adopted these cultural attitudes unquestioningly — indeed, the authoritarian personalities of these politicians wouldn’t allow them to ask such questions.  Those who are open to such questions don’t dare assert themselves, despite polling results that show 70-85% of Americans favor significant changes in current federal policy.

Of particular note in this regard is the unexamined notion that medicine has to come in the form of an expensive bitter pill.  The notion that medicine might also be pleasurable is anathema, and that healing might be enjoyable is equally heretical.  Medicine is still penance, disease is sin, the new medical complexes are cathedrals, and doctors are the high priesthood, mediating between this world and the next, serving as both the front line and the last defense against the forces of corruption, decay, and disorder.

We apologize when we call in sick to work, and are stigmatized by our ailments.  Just as the medieval church was one of the largest landlords in Europe, today’s medical industry claims vast swaths of the GDP.  In the US, healthcare spending exceeds the 10% tithe commanded by the medieval church.  The religion analogy is quite complete, and includes the irreligiousness of the most ardent devotees.  Hospitals, gathering together the diseased, are diseased.  They are morally perverse and rotten with wealth.

Data from multiple nations detailing spending on the medical industry as a percentage of GDP

Along with unexamined notions of how medicine fits into our culture, there is another factor promoted by our culture, related to the ideology of Progress.  Progress holds that the future will always bring improvements, that all new technology is better technology, and that what is new must replace what is old.  From within the confines of this ideology of Progress, it seems on the face of things obvious that any new pill is inherently superior to “natural preparations.”  This is, unfortunately, quite difficult to establish with any certainty.

There are easy-to-identify counter-examples where modern medicine has delivered a harmful product: the recall of pills like VIOXX make a big splash in the media, and create the impression that these are exceptions to the general rule that modern medicine routinely delivers improvements.  But these issues have been with medicine for a long time: heroin, for example, was originally invented by the pharmaceutical company Bayer, and marketed as a non-addictive alternative to morphine.

The prescription painkillers marketed since Bayer invented heroin now surpass car crashes in the number of annual deaths they cause, accounting for some 90% of all poisonings.  The number killed by these drugs amounts to about ten 9-11’s each year — every year.  Instead of figuring out how to deal with this plague, however, the US throws more and more money at the medical industry, which keeps developing new drugs with serious side effects and abuse potential.  The broader social implications of this are even more troubling.

The Decline of Western Medicine

Most of modern medicine is unnecessary.  After sanitation and hygiene, antibiotics, analgesics and anesthetics, and vaccines, most of modern medicine is devoted to coping with the side effects of industrialization.   This effect can be seen in diet particularly, but also with respect to such vectors as environmental pollution.  Environmental pollution may take the form of contaminants in the air and water, particulate matter in the air (which causes diseases like asthma), or increased radiation in the environment (due to industrial processes, residues from atmospheric nuclear testing, or because of solar radiation that is increased by a depleted ozone layer in the upper atmosphere).

If they live past the age of 15, typical hunter-gatherers stand a reasonable chance of remaining healthy and active into their 70s, with a strong social support network to care for them as they age.  The modern US health care industry really doesn’t do all that much better.  A sizable portion of the modern improvement in life expectancy over what is offered by a hunter-gatherer society comes from reduced infant mortality, a hygiene problem identified by Ignaz Semmelweis in 1847.  Hand washing is an extraordinarily cheap and effective medical technology.  Antibiotics, which were developed for around $20,000 of basic research, have saved many more individuals from childhood disease, and increased the range of surgeries that are possible.

As modern medicine grows more expensive, its productivity declines precipitously.  This decline in productivity can be measured in terms of substantive outcomes or in terms of the cost per patent.  Either way, the role of diminishing returns in this field is not adequately addressed in the contemporary discourse.

Chart: productivity of the US healthcare system

Most of the big medical breakthroughs of the last 300-500 years were inexpensive.  Everything recent is increasingly expensive and of rapidly declining effectiveness compared to basic innovations like sanitation or antibiotics.  Most modern medicines and medical procedures could be avoided through less expensive means, specifically, through dietary and behavior modification.

Chart: patent applications per research dollar

The cost of medicine detracts from other public welfare programs, such as nutrition, food security, education, and mass transit, all of which yield a far greater return on investment than modern medicine.

At some point, the moral aspects of modern medicine need to be evaluated in terms of the social cost.  For example: as a percentage of GDP, the US spends three times more money on the healthcare industry than on education.  We know that basic education makes us smarter, better socialized, and better equipped for employment; but most medicine isn’t really making us all that much healthier.

Marijuana and Medicine

Progress makes a raw agricultural commodity like cannabis seem suspect as a medicine, though really, it is modern medicine that should be suspect.  Whereas a typical television commercial for a new pharmaceutical product will often devote more than half its airtime to potential side effects, no similarly funded social initiative exists to teach Americans how to eat properly, or how to prepare nutritious foods.  Nutritious foods or vegetarian diets are routinely mocked by Americans.

Somehow, none of this is a medical problem.  Rather, in the discourse, these are treated as political problems.  So doctors, not being politicians, stay out of politics; and, somehow, proper diet is only of tangential concern to the medical industry, while new drugs of dubious effectiveness are promoted as indispensable innovations.

Somehow health is not a medical issue, only disease warrants attention.  And where medicine and politics do intersect on this issue of cannabis, instead of informed discussion, the public is treated to a wall of silence, or else jokes about hapless stoners.

Written by Indigo Jones

March 9, 2013 at 6:33 pm

Mass Shootings, The Media, and Politicized Narratives

leave a comment »

In the wake of this most recent school shooting in Connecticut, it is important to remember that these are still isolated incidents.  That these cases become public spectacles reflects an ideology of indoctrination more than it reflects symptoms of a peaceful society changing into something frightening and more violent.

The Problem

Our society is already violent.  12,000 gun homicides annually ranks our “peaceful” Homeland among many war zones.  This is a 10-year low.  For every gun homicide, there is roughly one accidental gun death.  But very few of these deaths take place in mass shootings: they are overwhelmingly the results of domestic disturbances, armed robberies, and gang violence.  Yet, when these more common types of shootings occur, the media is largely silent.  If it were otherwise, the news would be about nothing but shootings.

When the media picks up gun violence as a topic — typically when a rare mass shooting occurs — the narrative it propagates tells audiences that these acts are the result of some “disturbed” individual.  Now, clearly, shooters in domestic disturbances are “disturbed” at the time, but the media means to say something about psychological pathology: that mass shooters are somehow unhinged and acting irrationally.  Relatively little attention is paid to what sorts of social pressures these mass shooters may have been under.  And almost no attention is paid to the significance of the fact that these shooting sprees often end with a suicide.

The Suicide Shooter

What should be deeply disturbing to media consumers is that the sensationalism of the media coverage is oriented more towards inflicting emotional trauma on audiences than providing a useful description of what gun violence in the country actually looks like.

The media seems to operate under the premise that when American audiences see coverage of events like this, they implicitly understand that these events are stand-ins for countless other events that vary widely in their specifics.  All the Tweets and Facebook posts to the effect that “if only more teachers carried weapons in the classroom” seem to contradict such an assumption.  I would wager that if more teachers were armed in class, there might be more school shootings: it is not safe to assume that all teachers love children unconditionally, or that they are paragons of infinite patience.

Perhaps most disturbing about the media coverage of these events — in a sociological sense — is the uniform lack of discussion of the frequent suicidal climax of the killing.  Perhaps the media feels it is enough to state that the shooters are “disturbed,” and that suicide therefore seems natural enough as a conclusion to such a “disturbed” episode.  But we do have another paradigm for suicide attacks like these: suicide bombers in the Middle East.

It is not so easy to write off Middle Eastern suicide bombers as uniformly “disturbed.”  They are trained in their attack strategies — much like our suicide shooters plan their assaults.  Suicide bombers are motivated by ideology or other political goals — which often enough appears to be the case with our suicide shooters as well.  While suicide bombers are promised great rewards in the afterlife, our suicide shooters live in a culture so saturated with aesthetic treatments of violence that the rush of a real-life shootout and the subsequent notoriety may be reward enough.

Often enough, the perpetrators of these acts of mass violence do have reasons, though, due to the media’s sensationalism and lack of analysis, we never get any closer to understanding those reasons or, consequently, to understanding what we can do to address them.  All we are left with, as a nation, is televised grieving, emotional trauma inflicted by the media over distant events, and nebulous debates about “gun control.”

The Calculating Killer

After several failed attempts at pursuing the American Dream through entrepreneurship, Joe Stack eventually committed suicide by flying a private plane into an IRS office in Texas.  He left a suicide note titled “Well Mr. Big Brother IRS man… take my pound of flesh and sleep well.”  In it, he stated:

“If you’re reading this, you’re no doubt asking yourself, ‘Why did this have to happen?’ The simple truth is that it is complicated and has been coming for a long time. The writing process, started many months ago, was intended to be therapy in the face of the looming realization that there isn’t enough therapy in the world that can fix what is really broken. Needless to say, this rant could fill volumes with example after example if I would let it. I find the process of writing it frustrating, tedious, and probably pointless… especially given my gross inability to gracefully articulate my thoughts in light of the storm raging in my head. Exactly what is therapeutic about that I’m not sure, but desperate times call for desperate measures.

“We are all taught as children that without laws there would be no society, only anarchy. Sadly, starting at early ages we in this country have been brainwashed to believe that, in return for our dedication and service, our government stands for justice for all. We are further brainwashed to believe that there is freedom in this place, and that we should be ready to lay our lives down for the noble principals represented by its founding fathers. Remember? One of these was ‘no taxation without representation’. I have spent the total years of my adulthood unlearning that crap from only a few years of my childhood. These days anyone who really stands up for that principal is promptly labeled a ‘crackpot’, traitor and worse.”

Joe Stack doesn’t sound like a man whose reason has become unhinged; he sounds like a man whom society pushed to the breaking point.

Some time earlier, the Unabomber, Ted Kaczynski, wrote in his Manifesto:

“96. As for our constitutional rights, consider for example that of freedom of the press. We certainly don’t mean to knock that right: it is very important tool for limiting concentration of political power and for keeping those who do have political power in line by publicly exposing any misbehavior on their part. But freedom of the press is of very little use to the average citizen as an individual. The mass media are mostly under the control of large organizations that are integrated into the system. Anyone who has a little money can have something printed, or can distribute it on the Internet or in some such way, but what he has to say will be swamped by the vast volume of material put out by the media, hence it will have no practical effect. To make an impression on society with words is therefore almost impossible for most individuals and small groups. Take us (FC) for example. If we had never done anything violent and had submitted the present writings to a publisher, they probably would not have been accepted. If they had been accepted and published, they probably would not have attracted many readers, because it’s more fun to watch the entertainment put out by the media than to read a sober essay. Even if these writings had had many readers, most of these readers would soon have forgotten what they had read as their minds were flooded by the mass of material to which the media expose them. In order to get our message before the public with some chance of making a lasting impression, we’ve had to kill people.”

Say what you will about his methods, but his thinking was not delusional or unhinged from reason: this was a man pushed by society to his breaking point.  His actions were not the result of an irrational outburst.  He sought social change.

The Media

For a media apparatus that literally sold us the lies that led up to our 2003 invasion of Iraq; that has been largely silent on the erosion of the 4th Amendment under the FISA amendments and the free rein given to the National Security Agency; that has been largely silent on the erosion of the 5th Amendment under the 2012 NDAA; that is silent about illegal drone campaigns in foreign countries, extrajudicial assassinations of US citizens without trial, that still hasn’t gotten to the bottom of our war crimes in Fallujah, or what actually happened in the 2000 elections; that allows the sprawling security state unchecked growth, and that even cheerleads for the technologies that make it all possible, subsidized by mass production yields in consumer goods … all the media’s moralizing on issues like this rings hollow.

Anarchist thinker Emma Goldman had a different take on events like these:

“To analyze the psychology of political violence is not only extremely difficult, but also very dangerous. If such acts are treated with understanding, one is immediately accused of eulogizing them. If, on the other hand, human sympathy is expressed with the Attentäter, one risks being considered a possible accomplice. Yet it is only intelligence and sympathy that can bring us closer to the source of human suffering, and teach us the ultimate way out of it.”

Emma Goldman suggests that such violent outbursts are the product of an unusually sensitive soul, not an irrational madman or an unfeeling sociopath.  She does not condone these events, but seeks to understand them both emotionally and rationally, as part of a larger program towards making the world a more just and equitable place.  The media could learn a few things from her approach.

The Public

When the media or politicians raise the spectre of “gun control” following such incidents — which is logical enough, given the total lack of any more substantive debate about the causes and conditions underlying such violent outbursts — a certain highly vocal group of Americans starts clamoring about their rights.

Recent Supreme Court rulings depart from precedent dating back to the 1930s: namely, the Miller case, which affirmed that a sawed-off shotgun is not a militia weapon, and that individual ownership of such weapons can therefore be curtailed.

The decision in the 1939 Miller case reads, in part:

“The Court cannot take judicial notice that a shotgun having a barrel less than 18 inches long has today any reasonable relation to the preservation or efficiency of a well regulated militia, and therefore cannot say that the Second Amendment guarantees to the citizen the right to keep and bear such a weapon.”

In their close-reading of the text of the Constitution, the justices in Miller asserted:

“The Constitution, as originally adopted, granted to the Congress power —

“To provide for calling forth the Militia to execute the Laws of the Union, suppress Insurrections and repel Invasions; To provide for organizing, arming, and disciplining, the Militia, and for governing such Part of them as may be employed in the Service of the United States, reserving to the States respectively, the Appointment of the Officers, and the Authority of training the Militia according to the discipline prescribed by Congress.

“With obvious purpose to assure the continuation and render possible the effectiveness of such forces, the declaration and guarantee of the Second Amendment were made. It must be interpreted and applied with that end in view.”

The Second Amendment, in the strictest sense, is not the “gun” amendment, but the “militia” amendment.  The introductory clause, “A well regulated militia,” serves the legal function of a “whereas” clause, delineating the scope of the provision.  In the Second Amendment’s equivalent clause under the Articles of Confederation, the text is explicit in that arms are to be kept “in public stores.”

This is not to say there is not a Constitutional right for individuals to keep and bear privately-owned weaponry; but if the individual right to own a private firearm derives from the Second Amendment, it is the result of activist judges rather than the original intent of the Founding Fathers.  An individual right to own a firearm is more plausibly found in the 9th or the 10th Amendment.

Gun advocates, however, do not make this argument, because although they frequently want smaller government and despise activist judges, there is no judicial case history testing gun rights under these more plausible Amendments.

Moreover, if individual gun ownership derives from the 9th or 10th Amendment, this also opens the door to State regulation of gun ownership — something these anti-Federal-government, States-rights gun advocates paradoxically want to avoid (opposition to the Chicago and DC handgun bans illustrates this point).  They want to have their cake and eat it too, where the Federal government and gun rights intersect.

Now, we are all accustomed to restrictions on legal products we otherwise feel we have a right to purchase: there is a minimum drinking age for alcohol, all cars must drive on the same side of the street and stop at red lights, all electronic devices must limit their electromagnetic interference.  There are countless other instances which, under certain conditions, may represent an inconvenience, but which, on the whole, serve a greater social good.

But, by and large, the largest source of confusion and agitation on this point comes from the National Rifle Association.  Founded in 1871 by Union Army veterans to improve civilian marksmanship, the powerful lobby completely ignores that gun ownership today is a completely different proposition than it was in the early days of the Republic.  The group’s position is completely ahistorical.

In the Revolutionary era, guns were hand-made artisan items, not mass-produced commodities.  Maybe one out of eight men owned a gun.  As many as half of these guns didn’t work, but they were viewed as property nevertheless and passed on from generation to generation.  Guns weren’t essential for hunting, as many Americans lived in cities or raised crops and livestock as farmers.  There weren’t organized municipal police forces at the time.  The original “individual mandate” from the Federal government required that able-bodied men purchase firearms, due to a massive shortage.

Today, there is one gun per man, woman, and child in the country.  It is a very different situation.  And here, perhaps, lies the most atrocious facet of the NRA: it is little more than an industry trade group masquerading as a civil rights organization.

Written by Indigo Jones

December 21, 2012 at 3:35 pm

Terror on the Airwaves

leave a comment »

Over the weekend, there was a security breach at JFK International Airport: a man who fell off his jet ski swam to shore, climbed an electric fence, and then walked across two runways to a terminal entrance.  With little effort, he was able to defeat a new, state-of-the-art, $100 million Raytheon security system.

As the experts took to the airwaves demanding that heads roll, the more profound question remains unexamined: if a $100 million security system can be so easily defeated, why haven’t fanatical terrorists done so since September 11, 2001?

The simplest explanation is that there are few — if any — active terrorists in the United States who are interested in targeting airports.

Of the high-profile terror cases that have targeted other types of locations, many have been cases of entrapment.  In 2009, Hosam Maher Husein Smadi was arrested in Dallas; the other members of his “sleeper cell” were all Federal agents, who provided him with what he thought was a bomb.  In 2010, Mohamed O. Mohamud was arrested in Portland; after being identified as “a person of interest,” he was approached by several undercover FBI agents, introduced to a fabricated bomb plot, given instructions for building a bomb, and given $3000 for living expenses.  Also in 2010, a man with the alias Muhammad Hussain was arrested in Baltimore after meeting a paid informant, who led the man to an undercover FBI agent with a fake explosive.  In 2006, the FBI broke up a cult in Miami, and the media packaged it as the Liberty City Seven Terror Plot; the group to which these men belonged was infiltrated by two paid FBI informants, who hired an additional infiltrator.  The FBI also paid the rent for their meeting place and arrested the men when they tried to buy weapons from the FBI; their first two trials ended in mistrials because the juries could not reach a verdict.

In 2001, Federal agents broke up the Detroit Sleeper Cell: four men went to Disney World, where they recorded some amusement rides; a fifth man, who had pleaded guilty to credit card fraud and identity theft, earned a reduced sentence by testifying against the other four men, two of whom were convicted.  After these convictions were overturned, the Washington Post reported in 2005: “In its best light, the record would show that the prosecution committed a pattern of mistakes and oversights that deprived the defendants of discoverable evidence … and created a record filled with misleading inferences.”  That many of these cases rely on paid informants underscores serious problems with the approach, which incentivizes the creation of the appearance of criminality: for example, one such FBI informant, Craig Monteilh, made $177,000 working for the FBI — tax free — in just over a year (the median household income in the US is just over $50,000).  Without the encouragement of the FBI or paid informants, it seems unlikely that any of these people would have made any serious or successful attempts to disrupt the lives of Americans.
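The scale of that payday is easy to check with a quick back-of-the-envelope calculation.  A minimal sketch, using only the dollar figures as cited in the text (not independently verified):

```python
# Compare the informant's reported pay with median US household income,
# using the figures given above: $177,000 tax-free in just over a year,
# versus a median household income of just over $50,000.
informant_pay = 177_000
median_household_income = 50_000

multiple = informant_pay / median_household_income
print(f"Roughly {multiple:.1f}x the median household income")
# → Roughly 3.5x the median household income
```

And that understates the gap, since the informant's pay was tax-free while median household income is pre-tax.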

It might be argued that these people, nevertheless, harbored hostile sentiments toward the United States, and that we’re better off without them.  Aside from the problems associated with such ends-justify-the-means thinking, however, is a more basic problem of priorities: reckless bankers and speculators harm Americans, but they walk free.  Enron collapsed over a decade ago and the LIBOR affair has been unwinding for years: if entrapment tactics are legitimate, why don’t we see the FBI entrapping crooked accountants, or CEOs engaged in fraudulent schemes, or bankers who launder drug money, or investors who gamble with the retirement accounts of hard-working Americans?

It might be argued that these so-called terrorists represent only the high-profile cases, and that, due to national security concerns and the need to protect “sources and methods,” we don’t hear about the many smaller cases that are successfully prosecuted.  The plea of “state secrets,” however, is often little more than a justification used to avoid oversight: our enemies often already know they’re being targeted, and it is taxpaying Americans who are kept in the dark.

During World War II, the Germans and the Russians knew about the Manhattan Project before Americans learned of it.  Cuba knew we invaded its island, even when this information was kept from Americans; Fidel Castro even complained to the UN that he was being targeted.  Cambodians knew they were being carpet-bombed even when this was being kept secret from the American citizens financing the bombing.  Russia knew about the U-2 spy plane well before Americans learned of it.  Everybody involved in the Iran-Contra affair knew of US involvement well before Americans learned of it.  The very nature of such “state secrets” justifications prevents us not only from knowing if any such smaller cases exist, but moreover prevents us from knowing whether they represent legitimate prosecutions, or cases of questionable conduct.  If there are smaller cases, it would seem that they have been treated as the criminal matters that they are, rather than as instances of organized international terrorism.

In any event, there is other evidence that terrorism is not treated as the priority it is made out to be in the news: the PATRIOT Act, for example, which was passed in the wake of the 9/11 attacks, provides expanded powers for police to conduct “sneak and peek” searches.  Between 2006 and 2009, these PATRIOT Act searches were used 100 times more often in drug cases than in terror cases.

[Chart: PATRIOT Act provision used more often for drug cases than for terrorism cases]

At present, it seems like a good possibility that the “War on Terror” is simply a substitute for the military and industrial subsidies put in place during the Cold War.  Now, instead of a shadowy global network of communist infiltrators, we face a shadowy network of “al Qaeda franchises.”  Just like McDonald’s or Starbucks, an al Qaeda “franchise” may be lurking around any city street corner.

If we are to extract any major lessons from the Cold War, they should include these.  First: the Cold War pitted two powerful, ideologically motivated governments against one another, each struggling for the global dominance of its ideology, and each willing to destroy the entire planet with weapons of mass destruction for the sake of that struggle; it is a dubious “victory” for humanity that either side should come out on top.  Second: despite the “duck and cover” drills and media propaganda about Soviet nuclear attack, the only radiation to which Americans were exposed during the Cold War came from the American government itself: atmospheric tests in the Southwest and over the Pacific Ocean, a cloud of strontium-90 floating over the US in the late 1950s, soldiers used as guinea pigs, prison inmates and the mentally ill deliberately and secretly injected with radioactive material…

Guns are abundant in the United States, and easy to acquire; yet there have been no terrorist shooting sprees.  If a terrorist were willing to die for his or her cause, it would not be hard to take several Americans along.  Former “Freedom Fighters” in Afghanistan seem to have little difficulty constructing improvised explosive devices, but jihadis stateside seem wholly incapable of this feat when removed from an impoverished desert environment swarming with US military personnel.  A terrorist could drive a car into a crowd, or sit in a boat by an airport with a high-powered rifle; if a terrorist wanted to disrupt American life, that terrorist could drive cross-country in the middle of the night attacking high-tension power lines without being caught.  If any of these things were happening with the regularity that would justify something like the PATRIOT Act or a new bureaucracy the size of the Department of Homeland Security, “state secrets” wouldn’t be able to keep it out of the news for very long.  And yet, for over a decade now, we’ve remained under the same state of emergency declared by George W. Bush, a state of national emergency that President Obama has repeatedly extended.  Congress is required under the 1976 National Emergencies Act (50 U.S.C. 1601–1651) to review presidential emergencies every six months; this oversight, however, does not appear to be much of a priority, and Congress has not been actively reviewing the declaration.

In the US, terror is spread almost exclusively by the media.  Most people traumatized on 9/11 were traumatized by watching TV in their kitchens or school classrooms.  Whereas terrorists have killed around 3,000 Americans in the past decade, automobiles have been responsible for some 400,000 deaths over the same period.  The typical American is astronomically more likely to be killed or injured in a car accident than in a terrorist attack; yet, rather than treat effective mass transit as a life-saving national priority, politicians like Wisconsin Governor Scott Walker make a show of turning down $1 billion in Federal funds for commuter rail.
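“Astronomically more likely” is not an exaggeration.  A rough sketch of the ratio, using only the decade totals cited above (these are the article’s figures, not per-capita exposure rates):

```python
# Rough relative-death-toll comparison over the same ten-year window:
# ~3,000 deaths from terrorism versus ~400,000 deaths from automobiles.
terror_deaths = 3_000
auto_deaths = 400_000

ratio = auto_deaths / terror_deaths
print(f"Automobile deaths outnumber terrorism deaths roughly {ratio:.0f} to 1")
# → Automobile deaths outnumber terrorism deaths roughly 133 to 1
```

A proper risk comparison would account for exposure (miles driven, flights taken), but even this crude ratio shows the two hazards are not remotely comparable in scale.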

In an important sense, we are our own worst enemy: Congress and the President have done more to disrupt the American Way of Life than any terrorist could have hoped to accomplish.  As the media obsesses over dramatic fictions, as the federalization of local law enforcement continues, as local police procure military hardware, as civil liberties are swept aside, and as drones and dirigibles with high-resolution cameras appear over major urban centers, keep in mind that it is not just Muslims who are targeted: during the recent G-20 summit in Toronto, police were accused of profiling protestors with “black backpacks” and women with “hairy legs,” claims a Canadian police watchdog group calls “substantiated.”  Over 1,000 protesters were arrested, mostly without charge, and only around 40 were successfully prosecuted.  Police profiling isn’t strictly a religious or an ethnic issue: control freaks don’t discriminate.  If they don’t come to lock you up in person, be sure that they’ll come for your heart and your mind.

Written by Indigo Jones

August 14, 2012 at 1:58 pm
