Tactical Linguistics Research Institute

"nox sicut dies illuminabitur"

Posts Tagged ‘Economics’

Philanthropy is Theft, or, How Competitive Society is Optimized for the Success of Individuals who Exhibit Sociopathic Personality Traits


Elon Musk is the richest African-American. To hear him say it, however, he really doesn’t care all that much about money. He’s actually just tirelessly using his genius to save humanity from its dumb-ass self — and that’s why he needs so much money. He just wants to help out.

But because worldly possessions just weigh a person down, Musk will be selling his homes and belongings, as if to become a penniless, screen-less, wandering cyber mystic. Like a hyper-modern Tolstoy, when he disowned his literary output and embraced Christian Anarchism, Musk is devoting his wealth entirely to the true deliverance and redemption of humanity.

African-American Elon Musk gives his future two thumbs up.

And through this delicate, alchemical fusion of earth and heaven — part of the hidden meaning of the riddle of the sphinx — Musk will bring the light of reason to the stars.

Setting aside how obviously sociopathic and delusional this all sounds, there’s a certain contempt towards all the employees and customers that supply him with so much of his wealth: his boasts about giving away all his money were tweeted on May Day — International Workers’ Day. As CEO of Tesla Motors, Musk makes 40,688 times what his average employee makes, which is the highest CEO-to-worker pay ratio ever recorded. Which, of course, is vital to single-handedly saving humanity.

In addition to getting rich by saving humanity, Musk is ruining astronomy to bring wireless social media propaganda to the entire planet, terrorizing rural residents by starting brush fires and breaking windows and shutting down highways for test launches, forcing employees to go to work during a pandemic and in defiance of local ordinances, earning billions of dollars in corporate welfare in the form of government subsidies and tax credits, and planning to build a humanoid robot in a bid to potentially destroy more jobs than George W. Bush.

The US economy shed roughly as many manufacturing jobs during George W. Bush’s presidency as World War II created.

While many Americans assume their society operates along meritocratic lines — such that the “best and brightest” are entitled to as much wealth as they can accumulate — the greatest predictor of who will become wealthy is not genius or talent, but whether one’s parents are wealthy.

And so both Musk and his brother — both of whom grew up white in apartheid South Africa — have become rather wealthy. Although Musk likes to tell the story that he “left South Africa by myself when I was 17 with just a backpack & suitcase of books,” his father’s fabulous wealth played a pivotal role in Musk’s success.

African-American Elon Musk gives his future two thumbs up.

Musk opportunistically arrived in Silicon Valley to become an entrepreneur right in the middle of the dotcom bubble, when companies with big ideas but which existed only on paper were a dime a dozen. The speculative bubble led companies without products to grab up wads of cash through an IPO craze, while venture capitalists threw money around left and right. At this time, Musk’s wits alone had helped him accumulate only $2,000. So his father stepped in with $28,000 to get Musk and his brother off the ground.

Musk then started work on a web site with the dreadfully un-sexy name “Global Link Information Network.” A year later, to impress some venture capitalists, he dressed up his web server during an office tour to make it look like a supercomputer; in exchange for a $3 million investment, Musk ceded control of the website he designed to Rich Sorkin. Under Sorkin’s leadership, the company changed its name to Zip2 — something more in line with how early internet firms named themselves — and within three years, the firm was sold to computer maker Compaq. Compaq paid $305 million, of which Musk received $22 million, which enabled him to “flip” a couple more start-ups.

So here is the Elon Musk recipe for success: get born rich, go to the right place at the right time, work hard, employ trickery, and get lucky. Unquestionable genius.

Martin Buber wrote, “The man to whom freedom is guaranteed does not feel oppressed by causality.” We can see the glibness in Musk’s demeanor as the product of a certain type of fate mistaken for merit. It is perhaps this quality of the American new rich — which believes so fervently in the equivalence of merit and money — that most cleanly separates it from the old rich like Bill Gates.

African-American Elon Musk gives his future two thumbs up.

Like Musk, Gates has a story he likes to tell about how he dropped out of college to start a company in his garage — and then his genius made him rich. Of course, it didn’t hurt that Gates was born into a family with the means to send him to Harvard — one of those top universities that seem perfectly willing to admit anybody whose parents can make a sizable donation.

And so it went with Gates: his mother, Mary Maxwell Gates, was on the Board of Regents at the University of Washington, knew the CEO of IBM, and served on the boards of banks and telecommunications carriers. And so it was with Mrs. Gates: her father, James Willard Maxwell, was a banker born around 1900. And so it was with Mr. Maxwell: his father, also named James Willard Maxwell, was also a banker and former head of the San Francisco Federal Reserve, born at the outbreak of the American Civil War. Bill Gates, Sr. is memorialized in the Puget Sound Business Journal like so: “Gates, a lawyer and philanthropist, was known as an optimist in relentless pursuit of an equitable world.” That’s the whole thing.

The difference in attitude between these two types of wealth — the status-seeking new wealth of the colonialist and the low-key old wealth of the aristocracy — can perhaps illuminate one of the more troubling, Orwellian consequences of societies that permit such accumulations of money and power: philanthropy. The aristocracy experiences something like noblesse oblige, and uses the term “philanthropy” to describe its efforts to justify amassing huge fortunes while its countrymen struggle and millions starve everywhere.

African-American Elon Musk gives his future two thumbs up.

It was perhaps this noblesse oblige that compelled Mrs. Gates in her time at the University of Washington to pressure the University to divest itself of South African holdings to protest apartheid. And which led Gates, Jr. to associate with pedophile-embezzler-drug dealer-spy Jeffrey Epstein in a relentless effort to “get more philanthropy.” As if such absurd amounts of wealth weren’t inherently immoral, regardless of how it’s acquired.

The status-seeking new wealth of the colonialist mindset has another facet, observed by Brazilian educator Paulo Freire in Pedagogy of the Oppressed. Freire wrote:

The oppressed, having internalized the image of the oppressor and adopted his guidelines, are fearful of freedom. Freedom would require them to eject this image and replace it with autonomy and responsibility. Freedom is acquired by conquest, not by gift. It must be pursued constantly and responsibly.

The oppressed suffer from the duality which has established itself in their innermost being. They discover that without freedom they cannot exist authentically. Yet, although they desire authentic existence, they fear it. They are at one and the same time themselves and the oppressor whose consciousness they have internalized.

In order for this struggle to have meaning, the oppressed must not, in seeking to regain their humanity (which is a way to create it), become in turn oppressors of the oppressors, but rather restorers of the humanity of both.

For better or for worse, we must view Musk as a victim of apartheid — not in the same way as Black South Africans, to be sure, but in a more subtle, pernicious way. While he should neither be faulted nor lavishly rewarded for the accidents of his birth, he nevertheless grew up under apartheid, and he internalized the logic of the oppressor class to which he belonged. As a member of a colonial oppressor class, he is unaware of the autonomous psychic processes within himself that re-create the logic of his oppressor class on a colossal scale. He openly identifies with his greed to be first to colonize another world. And so he perpetuates this victimization as a morally-neutered victim himself.

African-American Elon Musk gives his future two thumbs up.

The argument that Musk needs his wealth to save the rest of us from ourselves resembles the historical arguments used by white slave owners in the US to justify treating people like common property. Senator John C. Calhoun is known to have remarked:

Never before has the black race of Central Africa, from the dawn of history to the present day, attained a condition so civilized and so improved, not only physically, but morally and intellectually… It came to us in a low, degraded, and savage condition, and in the course of a few generations it has grown up under the fostering care of our institutions

So, modern-day Black Americans can thank slavery for TV. But before we really decide whether it is fit and proper to allow such massive agglomerations of wealth to exist — before we really decide to have the discussion in any kind of coherent way — do we have to wait for Elon Musk — speculatively — to use his wealth to hire a small mercenary army to take over some African nation, seize its mineral wealth, and continue his project unmolested, like some Charles Taylor on some perverse messianic philanthropic mission? Is this what we want for our cosmic legacy? Is this why the aliens keep us in quarantine?

African-American Elon Musk gives his future two thumbs up.

A New, Clandestine Fiscal Policy?


Towards the end of the 2008 US Presidential election, Barack Obama’s opponent John McCain repeatedly insisted that “the fundamentals of our economy are strong.” Just two months before the election, the widespread fraud committed by organized finance — popularly referred to as a “financial meltdown” in the media — threatened to undermine “the orderly exchange of commodities in interstate commerce.”

The 2008 financial crisis precipitated by “sub-prime lending” involved fraud in accounting, fraud in the real estate industry, fraud in the use of novel financial instruments to back residential mortgages, and fraud in global inter-bank lending.

John McCain with running mate Sarah Palin — the Tea Party’s first foray into Presidential politics, courtesy of an old-school conservative.

Well before the 2007 credit crunch that precipitated the 2008 crisis, the 2000 dot-com bubble, the 2001 Enron energy trading scandal, and the 2002 Arthur Andersen and WorldCom accounting scandals gave ample evidence that the impending 2008 “financial meltdown” might have been averted had the Department of Justice, for example, made it routine practice to hire professional criminologists to proactively look for evidence of fraud in major financial markets.

“Crisis” would, rather, seem to be a common metaphor for “normal” — at least where decisions about huge amounts of money are made. Or, perhaps, it’s an exciting, newsworthy way to say the world is run by crooks, and the commercial media sure as shit isn’t here to help.

If you think “social media” is the vital democratizing force here to save us, good luck to you and the malfunctioning DNA that made you.

Major Shifts in the Market for US Treasury Debt

In the wake of the Sub-Prime Mortgage Lending Scandal that served as the proximal cause of a global “financial meltdown” severe enough to help tip the 2008 US Presidential election, President Obama’s Administration continued a policy of “Quantitative Easing” begun at the tail end of the George W. Bush Administration.

Quantitative Easing marked a significant change in US policy: the Federal Reserve began purchasing US Treasury debt in huge volumes. Though conducted by the central bank as monetary policy, it functions as de facto fiscal policy — and it is historically unusual, economically untested, correlated with massive recent transfers of wealth to the wealthy, and intensifying.

Until the 2007-08 financial crisis, the Federal Reserve purchased Treasuries at a steady pace.

Before the “2008 Crash,” the Federal Reserve’s purchases of US Treasury securities were steady. This is a reasonable way to diversify the Fed’s holdings and to stabilize demand for Treasuries. This process of “monetizing debt” must take place on the open market: the Fed doesn’t buy straight from the US Treasury, but from major banks — which helps create credit.

President Obama’s policy for managing the “toxic mortgage assets” involved stabilizing credit markets by purchasing “toxic assets” to support their price, recapitalizing insolvent banks by expanding the money supply through huge purchases of Treasury securities, and restricting lending to slow the rate at which the new money entered the economy, as a means to limit inflation.

After the 2007-08 Financial crisis, the Federal Reserve increased its Treasury holdings dramatically.

Over the course of President Obama’s eight years in office, the Federal Reserve tripled its holdings of Treasury securities to just over $2.4 trillion. This is roughly the amount of Treasury debt held by China, Japan, Canada, and Mexico combined (our major trading partners). These purchases injected some $2.4 trillion into the economy beyond the value of the goods and services it produced.

Injecting large amounts of cash into the economy risks inflation, and control over interest rates is the main tool the Federal Reserve uses to combat it. During the Obama Administration the Fed took on additional inflation risk, as interest rates plummeted to spur economic activity in the form of borrowing (creating more consumer debt and profits for the banks). Rates paid on savings plummeted in tandem, so that anybody with modest savings lost the yields they previously earned from mundane financial instruments like CDs or a bank savings account (gradually shifting more money to the major backers of the banks). A savings account as the cornerstone of smart personal finance is no longer a meaningful option.

Lowering interest rates hurts individuals who save money, but helped recapitalize insolvent banks when combined with “quantitative easing.”
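The arithmetic behind those vanishing yields is simple to sketch. The figures below are purely illustrative (a hypothetical $10,000 saver, a 4% pre-crisis CD rate, a 0.1% post-crisis savings rate); none of them come from the charts discussed here:

```python
# Illustrative only: compare ten years of compound interest at a
# hypothetical pre-crisis CD rate (4%) versus a post-crisis savings
# account rate (0.1%). Neither rate is taken from the article's data.
principal = 10_000.0
years = 10

cd_balance = principal * (1 + 0.04) ** years        # pre-crisis CD
savings_balance = principal * (1 + 0.001) ** years  # post-crisis savings

lost_yield = cd_balance - savings_balance
print(f"Ten-year yield forgone: ${lost_yield:,.2f}")  # about $4,700
```

Even for a modest saver, a decade of near-zero rates forgoes thousands of dollars in interest — the gradual shift the paragraph above describes.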

While the jury is very much still out on the long-term effects of this type of fiscal policy, its use does not appear to be limited to this single, major financial crisis. At the end of the Trump Presidency, the US embarked on another round of quantitative easing — without much discussion of how to manage the long-term consequences.

More Major Purchases of US Treasury Debt

When the COVID pandemic led to lockdowns and layoffs, the US faced economic disruption on many fronts. By April 2020, over 20 million people found themselves out of work, and the unemployment rate rose above 14%. This, in turn, threatened the purchasing power of many families and, ultimately, the revenue of large corporations. Because interest rates were already so low — with the federal funds rate near 0.05% in April 2020 — the Fed could not invoke this tool to spur economic activity.

Because Congress is unwilling to tax the wealthy or aggressively tax large corporations, the only option available to the Federal Reserve was to provide more credit itself by purchasing Treasury securities.

While the Federal Reserve’s US Treasury holdings tripled under the Obama Administration, they doubled again under Trump’s.

Instead of easing back on Treasury purchases as planned — which would gradually restrict the money supply after creating trillions of dollars in new credit — the Fed ramped up Treasury purchases in early 2020 and sustained them into 2021, eventually doubling the amount of Treasury debt on its books — which had already tripled in the previous decade.

This policy — creating massive amounts of new credit in the form of Fed purchases of US Treasury securities — appears to be continuing into the Biden Administration.

What Are the Implications of This Fiscal Policy?

This fiscal policy is pumping new credit into the financial system, but not in a way that benefits ordinary individuals. Taxpayers have continued to struggle financially through the COVID pandemic, receiving small, infrequent stimulus payments, and relying on extended unemployment benefits because available wages aren’t keeping up with the cost of living.

Full view of the Federal Reserve’s acquisitions of Treasury Securities.

At the same time, the wealthy have become astronomically wealthier. The wealthiest few percent — whose wealth derives from structured finance — increased their wealth by 54%, or $4 trillion, during the pandemic, while 200-500 million people worldwide slid into poverty. While the relationship is not exactly direct, this $4 trillion figure is nearly identical to the amount of new credit the Federal Reserve has created during the pandemic. Bailouts to large, struggling corporations wind up in executive bonuses at a much higher rate than in the pockets of employees: during 2019, CEO compensation increased 14%, such that the average CEO now makes 320 times as much as the average employee.

The long-term implications for the economy are unclear, as this is a new, little-discussed acceleration of the processes of financialization that began in the 1980s. In addition to facilitating massive wealth transfers, this may, ultimately, impact the global market for dollars and the “real economy” that is increasingly marginalized by the financial sector.

Under the post-war neo-Keynesian economic model, Federal debt is perfectly sustainable as long as the economy grows at a rate that exceeds the interest paid on the debt. Several key US policy decisions echo this principle: Woodrow Wilson established the Federal Reserve in 1913, Franklin Roosevelt suspended domestic gold convertibility in 1933, and Nixon ended the international convertibility of dollars to gold in 1971. As a result of these policy decisions, the US dollar was able to become the world’s major reserve currency.
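That sustainability condition can be made concrete with the standard debt-to-GDP recursion, in which next year’s ratio is this year’s ratio scaled by (1 + r)/(1 + g) for interest rate r and growth rate g. The rates below are hypothetical, chosen only to show the two regimes:

```python
# Sketch of the neo-Keynesian sustainability condition: the debt-to-GDP
# ratio b evolves as b' = b * (1 + r) / (1 + g), ignoring new deficits.
# If growth g exceeds the interest rate r, the ratio shrinks on its own.
def debt_to_gdp_path(b0, r, g, years):
    """Project the debt-to-GDP ratio over `years` years."""
    path = [b0]
    for _ in range(years):
        path.append(path[-1] * (1 + r) / (1 + g))
    return path

# Hypothetical rates: growth outpaces interest, so the burden melts away...
shrinking = debt_to_gdp_path(1.0, r=0.02, g=0.04, years=30)
# ...while interest outpacing growth compounds the burden instead.
growing = debt_to_gdp_path(1.0, r=0.04, g=0.02, years=30)
assert shrinking[-1] < 1.0 < growing[-1]
```

The same ratio that quietly shrinks at 2% interest and 4% growth compounds steadily upward when the rates are reversed, which is why the growth-versus-interest comparison carries so much weight in this model.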

It is not true that “since the US abandoned the gold standard the value of the dollar isn’t based on anything anymore,” as argued by many conservatives who would like to see the dollar returned to a fixed gold price. The value of the dollar is driven by the demand for dollars — backed largely by the demand for US goods and for the oil that OPEC prices in dollars.

Although many Americans despair that “the United States doesn’t make anything anymore,” this is not true either. The United States manufactures more than ever before, except this is done with machines now instead of people — especially since the highly-contested election of George Bush II in the Fall of 2000 signaled a change in the political order. Foreign buyers who want US goods need dollars first — and this demand for dollars helps maintain the price of the dollar.

Another major source of demand for dollars is oil. The US now produces more oil than Saudi Arabia, and because oil is a global commodity priced according to global demand, the market for oil helps maintain the value of the dollar. When Nixon ended the last vestige of the gold standard, he struck a deal with Saudi Arabia that extended to the rest of OPEC: oil would be priced exclusively in dollars, creating the petrodollar. Any buyer on Earth who wants OPEC oil must purchase dollars first, which creates a global demand for dollars.

At the moment, Federal Reserve policy is creating the demand for dollars out of thin air. Foreign governments currently hold about $10 trillion in Treasury securities, about equal to what the Federal Reserve and local governments hold — except half of that is just from the past decade, representing a major shift in the financial order.

What happens if OPEC stops pricing oil in dollars, and starts using the yuan? This would be a problem for the dollar, unless the Federal Reserve acts to avert such a crisis with another round of quantitative easing. What happens if electric vehicles reduce the global demand for oil? You can be sure that large corporations — having failed to plan for this eventuality — will be rewarded with another round of quantitative easing.

What this Fed policy appears to be creating is a financial order where the demand for dollars among the owners of financial wealth keeps the “real economy” functioning: essentially, the demand for dollars among the wealthy can be used to replace the demand for dollars among nations seeking oil or US manufacturing goods. With interest rates near zero, inflation can be controlled by ensuring that most people never see any of the Fed’s new financial wealth: as long as those dollars stay in a rich person’s offshore bank account, they stay out of the economy.

This is a financial system that requires an ultra-wealthy, financialized oligarchy who, in turn, sell ordinary citizens commodity survival on credit. The purpose of the individual in this new system, then, is to convert credit into debt — no more, no less. No more owning music on vinyl or tape or CD, or films, or physical books; no more owning phones, which are now leased, or cars, or houses, or even one’s own online social activity and the economic value thereof — unless one is selling images of one’s young, un-spent body.

Gone are the days when individualist economists like Friedrich Hayek cautioned that using individuals as means to economic ends was the hallmark of authoritarian economies.

When capitalism enjoys a monopoly and no longer needs to compete in the marketplace of ideas, all options are on the table; education, healthcare, arts, and culture become unnecessary social expenses that diminish the ability of individuals to convert credit into debt.

Written by Indigo Jones

April 28, 2021 at 3:19 pm

Primer on Resistance and the Surveillance State


There’s no Internet without surveillance. The Internet was built by the US military to be robust, not for privacy or security.  Privacy was not part of the Internet’s design goals.

The Internet became a commonplace household word in part because of the hype surrounding an economic bubble created during the presidency of Bill Clinton.  Under Bill Clinton, the US Congress also enacted the Communications Assistance for Law Enforcement Act, at the same time that Windows 95 brought personal computers into millions of American homes and the phrase “information superhighway” introduced Americans to networking. Surveillance was an integral part of handing the Internet over to commerce.

The relationship between commerce and the surveillance state is now well-established: Apple and Microsoft are suspect, and Yahoo has made surveillance a business proposition — as per 18 U.S.C. § 2706, Yahoo’s 2009 rates ran as follows:

Basic subscriber records cost $20 for the first ID, $10 per ID thereafter; basic group information (including information about moderators) cost $20 for a group with a single moderator; contents of subscriber accounts — including email — cost $30–$40 per user; contents of groups cost $40–$80 per group.

Given that typical internet advertising revenue brings in only pennies per click, the current scale of Internet surveillance clearly implies that spying on customers is big business for online firms.
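That comparison can be made concrete with some back-of-the-envelope arithmetic. The per-click revenue figure below is an assumption for illustration, not a number from Yahoo; the $30 fee is the low end of the 2009 rates quoted above:

```python
# Rough comparison: one paid records request versus ordinary ad revenue.
revenue_per_click = 0.05        # assumed ad revenue per click (illustrative)
account_contents_fee = 30.0     # low end of Yahoo's quoted 2009 rate

clicks_equivalent = account_contents_fee / revenue_per_click
print(f"One account-contents request earns as much as "
      f"{clicks_equivalent:.0f} ad clicks")  # 600 ad clicks
```

Under these assumptions, a single compliance fee is worth hundreds of ad clicks, which suggests why responding to records requests became a routine line of business.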

Other telecommunications carriers have made similar overtures, some companies have faced legal and economic reprisal for refusing to cooperate, and yet others have availed themselves of their free speech rights as corporate persons to engage in this dubious commerce.

It should be reason enough to be disturbed by NSA surveillance that the Founders prohibited this type of information gathering in the 4th Amendment to the US Constitution. The excuse “I’ve got nothing to hide” misses the point.  The government should obey the law; that’s a core feature of what “rule of law” means. And the example of non-violent resistance through non-participation set by Gandhi and the Southern Christian Leadership Conference and vegetarians and vegans offers a clear lesson for how to resist the surveillance society: stop participating in an abusive system.  The Internet is cruelty to human animals and it’s bad for the social environment.

If it weren’t for so many Americans purchasing data plans on “smart” phones, purchasing home Internet access, and dutifully reporting their daily thoughts, habits, and psychological makeup on Facebook, the costs to Uncle Sam of maintaining the current surveillance state would very rapidly prove prohibitive.  That is, if the government had to pay your phone bill and your internet costs and pay a spy to follow you around to listen in on your conversations, it could no longer afford to spy on everybody. Through consumer habits and the cultural value placed on convenience, Americans effectively subsidize the surveillance state on behalf of the government. Dan Geer stated the matter succinctly: our online choices are among freedom, security, and convenience — and we can only pick two.

From a cost perspective, a “vegetarian” approach to resisting the surveillance state (that is, by simply opting out) is an inexpensive solution that aims at increasing the cost of surveillance to the state.  This approach requires little social coordination other than a shared will to change prevailing circumstances — and a little personal initiative.   Such a “vegetarian” approach also serves to inject additional uncertainty into what data is gathered (thereby diminishing the value of what data Uncle Sam does collect).  This doesn’t mean life without the internet any more than vegetarianism means life without food, it just means being more selective about where your internet comes from, where you take it, and what you do with it.

You don’t need to be online all day.  A good starting point would be to make a habit of leaving your cellphone tracking device at home once in a while.  Just because your cellphone is wireless, that doesn’t mean you need to take it with you everywhere you go.  If you take it with you everywhere you go, it’s more of a tracking device than a phone.  When Uncle Sam looks through your cell tower data, changing your cellphone habits will increase the uncertainty as to your location at any given time during the day.

If you care to preserve “democracy,” all that’s really needed is a little social coordination and a willingness to put up with a little less “convenience.”  This may sound incompatible with the modern world, but there’s good reason to get motivated: the modern world is incompatible with the perpetuation of the human race.  There’s more at stake than a little privacy, though the more fundamental problem is bound up with the psychology of consumer society: in a growth economy based on persuasion through advertising — where consumers must make choices about the allocation of their scarce resources — every new product requiring new investment must be presented as needful and fundamental to the modern way of life.

Many people know things have gone awry with the modern world: between the threats posed by persistent national militarism, thermonuclear war, war over resources, mass hunger, environmental degradation, climate change, shortening attention spans, new communicable diseases — something is clearly wrong.  And yet, somehow, everyone looks to another for the solution.  Nobody is willing to see their complicity and change their behavior.  So: if you don’t like internet surveillance, stop surveilling yourself.  The problem isn’t some nebulous “big brother,” it’s you. The government isn’t going to change its behavior, so stop waiting for the government to save you from the government. You have to save yourself from yourself.

 

Marijuana and Medicine


Although many states in the US have been passing laws allowing for medical and even recreational use of cannabis, the substance remains illegal at the federal level.  Specifically, cannabis is classified as a Schedule I substance under the Controlled Substances Act.  This classification is on the basis of three main criteria:

1. The drug or other substance has a high potential for abuse.

2. The drug or other substance has no currently accepted medical use in treatment in the United States.

3. There is a lack of accepted safety for use of the drug or other substance under medical supervision.

Cannabis is classed with heroin and LSD.  Cocaine, morphine, and oxycodone are less tightly regulated Schedule II substances.

As cannabis use persists among teenagers and adults, both legally and illegally, more and more people — especially young people — are seeing first hand that the risks associated with the use of cannabis don’t square with the federal government’s treatment of it.  If the government has an interest in protecting young people from the risks of substance abuse, it also has an interest in providing accurate information and formulating sensible policies that don’t undermine its own credibility.

Problems with the Current Classification of Cannabis

The Schedule I classification of cannabis has a number of problems.  First, abuse is hard to quantify, and just what patterns of cannabis usage fall under this rubric are not well defined.  Second, there are few quantifiable safety concerns with cannabis:   the substance is profoundly non-toxic, and it is, for all practical purposes, impossible to overdose on cannabis.  This distinguishes cannabis from other Schedule I substances like heroin, and even from legal recreational drugs like alcohol, which is a contributing factor in the death of some 80,000 Americans each year.  Third, the current federal scheduling of cannabis does not take into consideration the accepted medical use of cannabis in a number of states.  The Department of Veterans Affairs has issued a formal directive permitting the clinical use of cannabis in those states where medical uses are approved.  Researchers studying the relative risks and merits of the substance encounter great difficulties acquiring suitable samples to study, and their findings are of limited applicability to the way the substance is routinely consumed in a non-standardized, non-regulated black market.

Perhaps the most dramatic difficulty with the federal government’s position on cannabis is that the US Department of Health and Human Services holds a patent on medical uses of cannabis.  Issued in 2003, US Patent #6630507 is titled “Cannabinoids as antioxidants and neuroprotectants.”  The patent examines a molecule found in cannabis, CBD, though the chemical mechanism the patent identifies should be present in all cannabinoids, including THC.  The patent notes that “cannabinoids are found to have particular application as neuroprotectants, for example in limiting neurological damage following ischemic insults, such as stroke and trauma, or in the treatment of neurodegenerative diseases, such as Alzheimer’s disease, Parkinson’s disease and HIV dementia,” and also indicates that cannabinoids offer a unique delivery mechanism due to the facility with which these molecules can cross the blood-brain barrier.

When cannabis was originally listed as a Schedule I substance in 1970, the classification was intended to be provisional, pending the results of an ongoing study.  The National Commission on Marijuana and Drug Abuse issued the study findings in 1972, finding that there “is little proven danger of physical or psychological harm from the experimental or intermittent use of the natural preparations of cannabis.”  Although the study recommended de-criminalizing cannabis and treating use or possession akin to alcohol, President Nixon chose not to implement the Commission’s recommendations, and marijuana has remained a Schedule I substance since.  Although whole-plant marijuana remains a Schedule I substance, the synthetic THC called dronabinol — sold under the brand name Marinol — is classified as a less-restricted Schedule III substance.

Social Attitudes Affecting Cannabis as Medicine

In the United States, much of the opposition to medical cannabis laws has presumed that such laws are just a “first step” towards outright legalization.  While there is little to suggest such an outcome would be inherently detrimental, there is also ample evidence that supports medical uses of cannabis on the substance’s own merits.

What presents a more profound problem to the public is in part a tacit sociology of medicine that limits and proscribes how individuals view treatment.  Politicians have adopted these cultural attitudes unquestioningly — indeed, the authoritarian personalities of these politicians wouldn’t allow them to ask such questions.  Those who are open to such questions don’t dare assert themselves, despite polling results that show 70-85% of Americans favor significant changes in current federal policy.

Of particular note in this regard is the unexamined notion that medicine has to come in the form of an expensive bitter pill.  The notion that medicine might also be pleasurable is anathema, and that healing might be enjoyable is equally heretical.  Medicine is still penance, disease is sin, the new medical complexes are cathedrals, and doctors are the high priesthood, mediating between this world and the next, serving as both the front line and the last defense against the forces of corruption, decay, and disorder.

We apologize when we call in sick to work, and are stigmatized by our ailments.  Just as the medieval church was one of the largest landlords in Europe, today’s medical industry claims vast swaths of the GDP.  In the US, healthcare spending exceeds the 10% tithe commanded by the medieval church.  The religion analogy is quite complete, and includes the irreligiousness of the most ardent devotees.  Hospitals, gathering together the diseased, are diseased.  They are morally perverse and rotten with wealth.

Data from multiple nations detailing spending on the medical industry as a percentage of GDP

Along with unexamined notions of how medicine fits into our culture, there is another factor promoted by our culture, related to the ideology of Progress.  Progress holds that the future will always bring improvements, that all new technology is better technology, and that what is new must replace what is old.  From within the confines of this ideology of Progress, it seems on the face of things obvious that any new pill is inherently superior to “natural preparations.”  This is, unfortunately, quite difficult to establish with any certainty.

There are easy-to-identify counter-examples where modern medicine has delivered a harmful product: the recall of pills like VIOXX makes a big splash in the media, and creates the impression that these are exceptions to the general rule that modern medicine routinely delivers improvements.  But these issues have been with medicine for a long time: heroin, for example, was originally brought to market by the pharmaceutical company Bayer as a non-addictive alternative to morphine.

The litany of prescription painkillers marketed since Bayer introduced heroin has now surpassed car crashes in the number of annual deaths caused, accounting for some 90% of all poisonings.  The number killed by these drugs amounts to about ten 9-11’s each year — every year.  Instead of figuring out how to deal with this plague, however, the US throws more and more money at the medical industry, which keeps developing new drugs with serious side effects and abuse potential.  The broader, social implications of this are even more troubling.
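The “ten 9-11’s” comparison is easy to sanity-check with rough arithmetic.  The figures below are illustrative round numbers, not official statistics:

```python
# Rough sanity check of the "ten 9-11's each year" comparison.
# Both figures are illustrative round numbers, not official statistics.

painkiller_deaths_per_year = 30_000   # approx. annual US deaths from prescription painkillers
sept_11_deaths = 3_000                # approx. deaths on September 11, 2001

equivalents_per_year = painkiller_deaths_per_year / sept_11_deaths
print(f"Roughly {equivalents_per_year:.0f} 9-11's worth of deaths per year")
```

Any figures in the neighborhood of those used above yield the same order-of-magnitude conclusion.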

The Decline of Western Medicine

Most of modern medicine is unnecessary.  After sanitation and hygiene, antibiotics, analgesics and anesthetics, and vaccines, most of modern medicine is devoted to coping with the side effects of industrialization.   This effect can be seen in diet particularly, but also with respect to such vectors as environmental pollution.  Environmental pollution may take the form of contaminants in the air and water, particulate matter in the air (which causes diseases like asthma), or increased radiation in the environment (due to industrial processes, residues from atmospheric nuclear testing, or because of solar radiation that is increased by a depleted ozone layer in the upper atmosphere).

If he or she lives past the age of 15, the typical hunter-gatherer stands a reasonable chance of remaining healthy and active into their 70’s, with a strong social support network to care for them as they age.  The modern US health care industry really doesn’t do all that much better.  A sizable portion of the modern improvement in life expectancy over what is offered by a hunter-gatherer society comes from reduced infant mortality, a hygiene problem identified by Ignaz Semmelweis in 1847.  Hand washing is an extraordinarily cheap and effective medical technology.  Antibiotics, which were developed for around $20,000 of basic research, have saved many more individuals from childhood disease, and increased the range of surgeries that are possible.
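The infant-mortality point can be made concrete with a toy calculation.  The survival rates and lifespans below are invented for illustration only; the mechanism they show is that a high rate of infant death drags the average life expectancy at birth down sharply, even when adult survivors live long lives:

```python
# Toy model: how infant mortality skews average life expectancy at birth.
# All rates and lifespans are invented for illustration only.

def life_expectancy(infant_mortality_rate, adult_lifespan, infant_death_age=1):
    """Mean lifespan when a fraction of births die in infancy."""
    return (infant_mortality_rate * infant_death_age
            + (1 - infant_mortality_rate) * adult_lifespan)

# Hunter-gatherer-like society: high infant mortality, long-lived adults.
print(f"{life_expectancy(0.30, 72):.1f}")   # average at birth ~51 years
# Modern society: infant mortality largely eliminated, same adult lifespan.
print(f"{life_expectancy(0.01, 72):.1f}")   # average at birth ~71 years
```

Note that the adult lifespan is identical in both scenarios; nearly all of the twenty-year gap in the averages comes from hygiene, not from late-life medicine.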

As modern medicine grows more expensive, its productivity declines precipitously.  This decline in productivity can be measured in terms of substantive outcomes or in terms of the cost per patent.  Either way, the role of diminishing returns in this field is not adequately addressed in the contemporary discourse.

Chart: productivity of the US healthcare system

Most of the big medical breakthroughs of the last 300-500 years were inexpensive.  Everything recent is increasingly expensive and of rapidly declining effectiveness compared to basic innovations like sanitation or antibiotics.  Most modern medicines and medical procedures could be avoided through less expensive means, specifically, through dietary and behavior modification.

Chart: patent applications per research dollar

The cost of medicine detracts from other public welfare programs, such as nutrition, food security, education, and mass transit, all of which yield a far greater return on investment than modern medicine.

At some point, the moral aspects of modern medicine need to be evaluated in terms of the social cost.  For example: as a percentage of GDP, the US spends three times more money on the healthcare industry than on education.  We know that basic education makes us smarter, better socialized, and better equipped for employment; but most medicine isn’t really making us all that much healthier.

Marijuana and Medicine

Progress makes a raw agricultural commodity like cannabis seem suspect as a medicine, though really, it is modern medicine that should be suspect.  Whereas a typical television commercial for a new pharmaceutical product will often devote more than half its airtime to potential side effects, no similarly funded social initiative exists to teach Americans how to eat properly, or how to prepare nutritious foods.  Nutritious foods and vegetarian diets are routinely mocked by Americans.

Somehow, none of this is a medical problem.  Rather, in the discourse, these are treated as political problems.  So doctors, not being politicians, stay out of politics; and, somehow, proper diet is only of tangential concern to the medical industry, while new drugs of dubious effectiveness are promoted as indispensable innovations.

Somehow health is not a medical issue, only disease warrants attention.  And where medicine and politics do intersect on this issue of cannabis, instead of informed discussion, the public is treated to a wall of silence, or else jokes about hapless stoners.

Written by Indigo Jones

March 9, 2013 at 6:33 pm

Plutocracy, Oligarchy, and the Myth of Free Markets

leave a comment »

The Occupy Philosophy blog recently posted an article about “plutocracy,” or rule by the wealthy, written by Brian Leiter, Director of The Center for Law, Philosophy & Human Values at the University of Chicago.  In his commentary on American plutocracy, Leiter asserts that “at historical moments pregnant with the potential for significant social and economic change, the choice of language sometimes matters.”  In light of these premises, let us examine his position.

Leiter identifies “plutocracy” as the primary ill in the modern United States.  He asserts that “plutocrats” have undermined democracy.  He states that “the United States is the most powerful ‘plutocracy’ in the world. It is no longer a democracy.”

To be precise about our “choice of language,” the United States Constitution guarantees a republican form of government, not democracy; and, insofar as the law originally limited political participation to white, land-owning males (the capitalist class), the United States has always been a plutocracy.

But the more profound problem with Leiter’s argument lies in his particular invocation of “plutocracy” as the source of the problem: to equate wealth with power does nothing to explain how wealth translates to power, but simply assumes this as a fact.  This, in one sense, amounts to simply stating the obvious.   It is like pointing out that businesses are run by businessmen, without discussing at all what varieties of business are present, how they operate, or how they are integrated with or, as the case may be, antagonistic to society at large.

I. Whither Capitalism?

Leiter begins by observing that “we are now in the fourth year of the worst economic catastrophe in the capitalist world since the Great Depression.”  While this, at first glance, may appear uncontroversial, some qualification is needed with respect to the use of the term “capitalism.”  Not only are there ideological disputes at issue, but historical conditions which are, on the whole, inadequately addressed in contemporary discourse.

The late 19th Century, in which wage labor became a dominant mode of subsistence, brought about radical changes in the nature of capitalism as industry became increasingly institutionalized and bureaucratized.  The entrepreneurialism of the revolutionary bourgeoisie gave way to a commingling of private and public bureaucracy — of capital and political power — and set the stage for the working conditions of the early 20th century.

It was here we saw the ascendency of the labor union as a serious political and economic power.  The antagonism of government to unionization was a result of the union’s encroachment on the management prerogatives of industry (that is, the setting of wages and working conditions). The state, acting on behalf of capital, revealed the presence of the close-knit connections between political and industrial power that had developed during the second half of the 19th Century.

By the middle of the 20th century, this trend continued to the point where what had traditionally been called “the market” had ceased to be a relevant force in the dominant culture of the United States.  Classical liberalism assumes that capital (land and machinery) is fixed, while labor is flexible.  Industrialization caused mass migrations of labor from farms to urbanized areas, and workers readily acquired new skills to adapt to different types of labor.

As labor has become increasingly specialized, as two-income households have become more common, and as benefits have become an increasingly important part of employee compensation, labor has accordingly become less flexible.  At the same time, capital has moved overseas, and become more flexible.  By the end of the 20th century, the traditional relationship between capital and labor had been well inverted.

Today, when one uses the term “capitalism,” this term means different things to different people.  The American conservative uses the word to invoke a nostalgic vision of 19th century entrepreneurialism.  The American liberal typically uses the word to indicate a mode of collectivist action wherein professional managers control the means of industrial-scale production on behalf of absentee owners.

There is an important sense in which even Nazi Germany was a capitalist country.  To be sure, it wasn’t market capitalism — any more than market capitalism prevails in the United States today — it was a form of monopoly capitalism that took the State as the primary consumer, and which used an imperialist war of expansion to organize production.

Although the official ideology of the Nazi Party espoused a socialist organization of society, the Nazis did very little to restructure private property or private profit along the lines of socialist ideology (except for the expropriation of Jewish wealth, which was handed over to industrialists and bankers).

Between World War I and World War II, German industrialists were a key component to the German rearmament, and the same German industrialists were the key beneficiaries of the war economy.  The industrialist Fritz Thyssen, for example, was a central financier of the Third Reich, as was the Association of German Industrialists.  The automobile manufacturer Volkswagen was a private corporation that produced automobiles for the Third Reich.  Max Amann profited enormously as a publisher of Nazi propaganda.  The Zyklon B used in Nazi gas chambers was a commercial product.

Insofar as the Nazi economy was characterized by a vast agreement between industrialists and politicians, it is worth noting that American business and government alike agree that growth is the key to success.  This is despite the fact that we live on a planet with finite resources, the exploitation of which is characterized by diminishing returns, and that increases in worker productivity are of only marginal benefit to workers themselves, who have been seeing their compensation stagnate or diminish for quite some time.  Here, too, government and industry agree that growth is of the utmost importance to the industrial-scale corporation.

II. Who Competes?

The typical American conservative will construct a binary opposition between capitalism understood as “free markets” and socialism understood as “economic planning.”  This is, however, a false dichotomy.

The modern corporation is largely defined by organizational prowess, and insofar as these organizations are risk averse, the chief market operations of the industrial firm are actions meant to eliminate market forces.  This is called planning.  A farmer in the midwest can be fairly certain of finding the fertilizer he needs when he needs it precisely because modern corporations are expert planners.

Stability is the enemy of competition (which must be unpredictable if it is to be fair), and insofar as corporations want to guarantee favorable performance for their shareholders, they set out to ensure economic stability and predictable growth.  Marketing and advertising are means to ensure consistent demand.  Corporations will sell their products at a loss to undercut competitors, and if this fails, they may buy their competition outright.

Because executives rarely go to prison when corporations break the law, corporations are apt to operate in open violation of the law if it will snuff out the competition — this is precisely what Microsoft did in Europe, paying $2 billion in fines during a decade of operation in direct violation of EU trade laws.  Corporations that pollute are granted enormous subsidies: given that most homes and businesses must pay for garbage collection, why should the biggest polluters be exempt to the extent that they are?  Insofar as schooling prepares students for employment, and college trains students in industry-standard skills and software applications, the cost of education represents a form of subsidy.

The result of this relentless push by modern industry to eliminate market forces at every opportunity has a profound impact on daily life — albeit one that is difficult to perceive at first glance.  Although there are many channels to choose from on television, most markets are served by a cable TV monopoly.  Although a consumer has many different brands of computers available to them for purchase, one firm — Intel — makes most of the chips in these computers.  A few large firms make most of the hard drives and optical drives in these computers.  Microsoft makes the operating system for most of these computers.  Computers are highly commoditized, and relatively few firms control the market for this commodity.

This dominant market arrangement is known as oligopoly, and is characterized by collusion between a few major firms to mutually ensure their continued dominance.  And it is not just the cable television market or the technology sector that is characterized by this arrangement: as of 2005, 90% of the soy crop grown in the US was of the patented Roundup-Ready variety sold by Monsanto.  One company — Archer Daniels Midland (ADM) — claims close to 50% of the domestic market for ethanol.

US Government mandates that gasoline be blended with ethanol increased ADM’s net earnings by 26% in 2006 alone.  This is just one way in which ADM is the beneficiary of subsidies and governmental planning.  ADM also benefits from agricultural subsidies for corn, since most of the ethanol it produces is made from corn.  In 1993, ADM was also the target of the largest price-fixing case in US history.  It’s not just Microsoft that engages in anti-competitive business practices.

The demand for ethanol in gasoline, from which ADM benefits so enormously, is predicated on access to roads.  Roads are heavily subsidized.  For federal highways to be financially solvent, for example, the federal gasoline tax would need to be raised by 40¢ per gallon.  The federal gasoline tax was last raised by a nickel in 1993 — and whatever proceeds might be had from that increase have been consumed by inflation.
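The erosion-by-inflation claim can be sketched with simple compounding; the 2.5% average annual inflation rate below is an assumption for illustration, not a measured figure:

```python
# How inflation erodes a fixed-cent gasoline tax increase over time.
# The 2.5% average annual inflation rate is an illustrative assumption.

nickel_1993 = 0.05          # the 1993 increase, in dollars per gallon
inflation = 0.025           # assumed average annual inflation rate
years = 2011 - 1993

real_value = nickel_1993 / (1 + inflation) ** years
print(f"The 1993 nickel is worth about {real_value * 100:.1f} cents in 1993 dollars")
```

Under that assumption, roughly a third of the nickel’s purchasing power is gone by 2011, while the nominal tax rate stays frozen.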

Not only are roads heavily subsidized, but the research that goes into advanced biofuels represents a subsidy as well: it could be argued that, given the economic law of diminishing returns, the money spent researching biofuels could be better spent investing in various forms of mass transit (though this would make the unpleasant implication that the American way of life is, as presently constituted, unsustainable — so politicians say what they must to get elected, and corporations keep giving consumers whatever marketing departments tell consumers they want).

None of this has happened by chance: the market is not an anarchy of small entrepreneurial firms as it was in the first half of the 19th century.  What we have in the West today is the result of planning.  Given that most wealth in the US is held by corporations, not plutocrats or governments, it is fair to say that most of the decisions about the US economy are the result of planning, since the modern industrial corporation is characterized by planning (that is, collusion with related firms and with government) rather than market competition (or voter turnout).

What is Excessive about CEO Compensation?

Although Mr. Leiter is content with the populist appeals of the Occupy Movement, which hold that excessive CEO compensation is the result of “avarice,” the truth of the matter is more subtle.  The problem of CEO compensation is not one of avarice, but, rather, is a particular solution to the personnel needs of the industrial corporation.

Most CEO’s are already wealthy by the time they are recruitedPay itself is not an incentive to work because they have neither fear of privation nor need for additional material comfort.  There are, then, two main approaches to providing them an incentive to work: psychological identification with the goals of the firm, or increased status.

Where CEO’s are recruited, rather than obligated to claw their way up through middle management, it is more difficult to get them to identify with the goals of the firm.  In certain industries this can be accomplished through an identification of the goals of the firm with specific social objectives (such as national defense), or through the dogma of indefinite growth (which even a tobacco company executive can participate in, and thereby contribute to society) — and it is here that a peculiar brand of nationalism comes into play — but in general it is easier to equate wealth with status, and motivate the CEO by enhancing his or her status accordingly (also satisfying the contemporary quantitative mindset).

And so growth becomes a central feature of American capitalism — providing both a psychological justification for those who manage industry on behalf of absentee owners (whose status derives from the circumstances under which they need only sit back and watch the money roll in) and what enables the firm to confer a form of status on the CEO.  It is through this fixation on growth that modern capitalism takes on an imperialist aspect.  It is moreover worth noting in this connection that the CEO is no more a capitalist than the typical pro-business unionized auto worker: the CEO is management, not an individual proprietor, and is not inherently interested in the amassing of capital.

Of course, to the 99%, the CEO’s are, so to speak, high-status (in addition to being upper-class).  But what is often ignored is the extent to which they inhabit a completely separate social world with completely distinct norms.  There is, among industry, politics, and the military, a distinct affinity group — a set of shared goals, management practices, and close social ties.  You can see evidence of this affinity group where people who attain this status are able to move easily from one sector to the other.

Take former Vice President Dick Cheney, for example: he went from Secretary of Defense (military) to CEO of Halliburton (industry) to Vice President of the US (electoral politics).  It is not the case that the object of the work in any one occupation directly qualified him to occupy the other, especially in this era of specialization.  Yet what Cheney specialized in were certain management practices, bureaucratic proficiencies, and the cultivation of a specific social network.  His case is not an isolated one.

The personnel problem becomes a social problem where these people, who aren’t always the wealthiest, but who have access to authority and the media, set about normalizing the persistence of the affinity group from which they benefit.  It is not a matter of some wealthy folks being “well-intentioned” while others are “sociopathic” — though many in positions of power do exhibit sociopathic personality traits.  There is, more substantively, the important matter of why so many Americans go along with things.

Many Americans see this collusion as waste and arrive at the conclusion that government should be run like a business, without ever stopping to think for half a second about what that means.  Many people believe that if government were run more like a business, it would work more efficiently.  But if government were to hold efficiency to be of paramount importance, it would simply kill the infirm rather than offer Social Security.  This is, of course, contrary to the US Constitution’s promise to “promote the general welfare,” understood as a means of securing the Declaration of Independence’s “life, liberty, and the pursuit of happiness,” but markets, by definition, offer few guarantees.  It is a very circumscribed definition of “efficiency,” but one that highlights why “playing the stock market” is often equated with gambling.  Sometimes the bottom line isn’t the bottom line.

There are other problems with holding that government should be run like a business.  Businesses are not democratic organizations, they are authoritarian (you do what your boss tells you to do, and you don’t get to vote your boss out of office if you don’t like it); their management practices are in many cases proprietary (as opposed to publicly announced laws) and their office holders are appointed, rather than elected.

Furthermore, there are reasons to suppose that the ethical standards of conduct with respect to business and government are incompatible. Whereas a businessman must be on the lookout for opportunities to engage in commerce, when an office holder does this, it’s called bribery or a conflict of interest.

Business (of the desirable, market-based kind) needs competition, but government needs loyalty. It doesn’t even make sense to think of government as competing: the whole point of a constitutional republic is that the state has a monopoly on the legitimate use of force as a means of coercion; the alternative is vigilante justice.

And What of it?

Where the notion that business represents a superior model for governance coincides with the ideology of political freedom deriving from economic freedom, it is worth noting that the sort of absolute freedom advocated by American conservatives is not the pinnacle of civil society, but its complete opposite.

In John Locke’s Second Treatise on Civil Government, published 1690, he states: “where there is no law, there is no freedom: for liberty is, to be free from restraint and violence from others; which cannot be, where there is no law: but freedom is not, as we are told, a liberty for every man to do what he lists” (57).  Liberty is having assurances everybody obeys the same law.

Laissez-faire economics is contrary to the Western Constitutional tradition, as originally conceived, and as understood in the mid 20th century.  Some centuries after Locke, in 1944, free market advocate Friedrich Hayek echoed much the same position, in articulating his view of rule of law: “The Rule of Law thus implies limits to the scope of legislation: it restricts it to the kind of general rules known as formal law and excludes legislation either directly aimed at particular people or at enabling anybody to use the coercive power of the state for the purpose of such discrimination. It means, not that everything is regulated by law, but, on the contrary, that the coercive power of the state can be used only in cases defined in advance by the law and in such a way that it can be foreseen how it will be used” (Road to Serfdom, Chapter 6).  Provided that individuals have a say in what laws are passed, freedom is having to obey only the law, and not yield to the whims of others.

The contemporary trend to privatize governmental services, then, is contrary to the goals of a just, democratic (or, republican, as the case may be) society.  It takes public resources and removes them from democratic control, under the banner of re-instating some nostalgic, 19th Century vision of entrepreneurial capitalism.

Of course, we have the benefit of history to tell us what that style of capitalism leads to: 15 hour workdays, no weekends, sweatshop conditions, mere subsistence pay, occupational safety hazards, and the like.  Union organizers fought tooth and nail for decent working conditions.  And already we can see both how far we’ve slid back into these precise conditions, and how they represent not the cooperation of individuals under the law, but the subjugation of individuals to what working conditions employers dictate.  This is an issue of no small concern, given that most people spend the better part of their waking hours for the better part of their lives working.

Say Again?

Where Mr. Leiter explains, “The social and economic world is both vast and complex, and in market economies, all the incentives of daily life demand focus on the immediate moment: closing this deal, getting to this business meeting, pleasing that client and, overridingly, getting what you can for yourself,” he is guilty of a gross over-simplification.

The very existence of government subsidies favorable to industry speaks to the fact that these firms plan quite far ahead, and the lengths to which they go to undermine competition speak to the extent to which they are averse to market participation.  The flaw here is the assumption that the conditions of a market economy are a relevant factor in shaping the shared goals of industry and politics.  These conditions do not prevail; rather, monopoly and oligopoly prevail.  There may be competition among filling stations, convenience stores, or fast food restaurants within a particular neighborhood, but the franchise agreements under which these small operators open up shop, here as elsewhere, insulate the oligopoly from the risks of actual market participation.

Written by Indigo Jones

November 11, 2011 at 1:17 am

Follow the Leader: There is Opportunity in Disaster

with one comment

Although many elected officials lay claim to the title of “leader,” it is becoming increasingly self-evident that such a title only applies insofar as they are leading us off a cliff.  It is profoundly problematic that the media unquestioningly reinforces such baseless claims to leadership by routinely using so inappropriate a term to describe these officials.

While various “leaders” may market themselves as catalysts for social change, and seek to secure the confidence of voters who also seek social change, a change in “leadership” rarely brings about the promised social changes.  Not only is electoral politics first and foremost a means of legitimating those very power structures voters would seek to change, but belief in leadership is furthermore a tool to enforce conformity among voters, since following leaders is a form of conformity.  Conformists don’t bring about social change.

That few officials, out of humility, demur that they are not “leaders” but, rather, public servants, offers an important glimpse into a profoundly disturbing dynamic underlying the facade of “politics as usual.”

Many politicians are literally sociopaths. Compare the behavioral profile of the sociopath with the actions and attitudes of the typical politician: sociopaths don’t have normal moral reservations about manipulating people like objects; this is precisely how politicians get elected. Sociopaths understand little about human emotion beyond ego gratification; the prestige of high office satisfies this desire for the politician. Sociopaths wear a facade of normalcy and are often charming, but lie compulsively. Politicians speak in polite terms while plotting to stab their colleagues in the back. If they’re not telling outright lies, they’re “spinning” facts to suit their needs. Sociopaths don’t feel guilt or remorse or empathy; no US official to date has apologized for invading Iraq on false pretenses, turning five million Iraqis into refugees, pumping Fallujah full of depleted uranium, or engaging in torture.  Nobody in government has publicly investigated the Bush Administration’s use of torture or civil liberties violations. Sociopaths are glib, superficial, impulsive; their goal is the creation of a dependent, willing victim.  Elected office is the ideal job description for a sociopath.  The desire to attain office should disqualify a person from holding such a position.

The term “sociopath” is imprecise.  Often, “sociopath” is used interchangeably with “psychopath,” whereas other times, “psychopath” is used to designate a genetic predisposition, and “sociopath” a set of learned behaviors.  Either way, the prevalence of this sort of anti-social personality disorder among the general population is estimated at between 1-4%.

It may not be a coincidence that 1% of the population controls some 40% of the wealth in the US, and that the top 5% controls close to 70% of the wealth.   Competitive society is in many ways optimized to benefit those who exhibit sociopathic personality traits, and it reinforces sociopathic tendencies among the general population as a behavioral adaptation.
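Taking the cited concentration figures at face value, the per-capita disparity they imply is stark; the sketch below simply does the division:

```python
# Per-capita wealth disparity implied by "the top 1% holds 40% of the wealth".
# Shares are the figures cited in the text, taken at face value.

top_share, top_pop = 0.40, 0.01      # 1% of people hold 40% of wealth
rest_share, rest_pop = 1 - top_share, 1 - top_pop

avg_top = top_share / top_pop        # average wealth per person, top 1%
avg_rest = rest_share / rest_pop     # average wealth per person, other 99%
print(f"A top-1% member holds ~{avg_top / avg_rest:.0f}x the average of everyone else")
```

On these figures the average member of the top 1% holds about 66 times the wealth of the average member of the remaining 99%.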

In competitive society, people are trained by sociopaths to think like sociopaths.  The public relations and marketing firms employed by both commercial and political interests train people to be opportunistic and calculating, to always be on the lookout for ways to treat other people as means that can be manipulated to various ends.  People are taught to be individualistic and egocentric rather than compassionate and cooperative.  Much of the advertising with which individuals are daily inundated promotes impulsive behavior and acculturates individuals to the distortions of reality that characterize most advertising and marketing.  As young people are brought into the fold, they become adults who are active participants in this process of training others to think like sociopaths — to think in the terms expounded by commercial marketers and political spin doctors — to such an extent that genuinely different worldviews become completely incoherent, in virtue of a sociopathic lack of empathy.

Beyond accommodating the lies and distortions that characterize so much advertising, marketing, and political posturing, individuals are, in numerous other ways, trained to think like sociopaths.  The aesthetic appreciation of violence in films, TV, and video games is an obvious example; a less obvious example is the popularity of “funniest home video” programs.

While slapstick comedy may be the cultural context in which "funniest home video" programs are appreciated, these programs contain none of the observational humor or physical ingenuity that characterize most slapstick. The "funniest home video" programs are not, in any substantive sense, products of creativity or skill. They harvest moments of trauma from among the general population, and, through their presentation, they train audiences to override natural empathy responses and to find humor in the misfortune of others.

Without an awareness of these dynamics, little can be done about them. It is hard to criticize or correct a social trend that cannot even be named. But such contemporary developments as the imposition of "austerity measures" or the renewed effort to disrupt labor organization and revoke "collective bargaining rights" can be understood in a precise historical context; to the extent that ordinary citizens support such measures, these citizens are being manipulated by criminal sociopaths.

In the Second Treatise of Civil Government, John Locke wrote, "he that in the state of society would take away the freedom belonging to those in that society or commonwealth must be supposed to design to take away from them everything else, and so be looked on as in a state of war" (¶19). John Locke is not some fringe figure; the Declaration of Independence is more or less a summary of Locke's basic ideas on legitimate authority. What is happening today has happened before; it has been studied, and named, and diagnosed already. In the past, monarchs caused civil unrest; today it is powerful sociopaths who have rigged the game to serve their own ends, who create for themselves an aura of respectability, and who thus wrest from citizens their assent to a degenerate state of affairs.

Written by Indigo Jones

October 19, 2011 at 4:28 pm
