HANG THE JURY

If a juror believes a defendant is guilty of breaking the law, but believes also that the law itself is not just, she has the right to vote with her conscience and not with the law.

A single juror has the ability to acquit a defendant in a trial for any reason. Even if the juror believes the defendant is guilty. This is called jury nullification. This is not a loophole. Nor is it illegal. But it’s a secret and it shouldn’t be.

With that said, let’s begin.

A cursory review of prison statistics illustrates the nightmare that is African Americans’ experience with our criminal “justice” system. There are currently more than 7 million Americans caught up at some point in the prison system between probation, incarceration and parole. Incredibly, 40 percent of our prisoners are black even though African Americans comprise only 13 percent of the total U.S. population. I live in a state where that number is closer to 50 percent. All told, America has 25 percent of the world’s incarcerated population despite only having 5 percent of the world’s population. This makes the sheer number of blacks in the prison system today even more overwhelming.

If you think there’s something wrong with this picture, continue reading, as there’s something that you can do about it. If you think this is because black people commit crime at a higher rate than white people do, then there’s a special place for you in hell or, worse, Congress.

Half of the prisoners in the United States are serving time for non-violent drug-related charges and 80 percent of those charges are for possession. Advocates and activists throughout the nation are attempting to reverse this trend, as the mass incarceration of black men specifically has become an epidemic. Despite the best efforts of groups such as the NAACP and the ACLU to reverse the trend, the problem persists unabated with most feeling helpless to change the system in a meaningful way.

But something can be done. By understanding your rights as a citizen to participate in the legal system, change can occur. Simply performing a civic obligation and reporting for jury duty gives every American the ability to weigh in on this issue.

Few people who are arrested on drug possession charges ever make it to trial for two reasons. One is that most cases are settled with a plea deal that a defendant often learns of for the first time while standing in front of a judge. The court-appointed attorney is basically there just to explain the plea to the defendant. The second reason is that plea deals are often considerably more attractive than the potential of losing in a trial and being sentenced by a judge, who is obligated to hand down sentences in strict accordance with the law. In states with mandatory minimum sentencing requirements, the risks are enormous.

But for those rare cases that do make it to trial, most people would be surprised to know that the most powerful person in the room is not an attorney or even the judge, for that matter. It’s the juror. One dissenting juror has the ability to decide whether or not a defendant should be set free no matter how the facts are presented. If a juror believes a defendant is guilty of breaking the law, but believes also that the law itself is not just, she has the right to vote with her conscience and not with the law.

Whether or not a judge has an obligation to inform a jury of this right has been battled over for two and a half centuries. As it stands now, judges are not required to inform a jury of their right to nullify a verdict; therefore, most do not.

Intrigued? Incredulous? Inspired? If you are brave enough to defy injustice and provide the last line of sane defense in an insane world, it’s best to arm yourself with an understanding of how we arrived at this point in history and your constitutional right to turn the tide.

The Modern “Middle Passage”
In order to properly describe the extent to which our criminal justice system is inherently and overwhelmingly racist, we must learn to speak about it with a new language. The current language, inculcated into the population by the government and corporate media over several decades, includes phrases such as “tough on crime,” “zero tolerance,” and “three strikes.” This type of rhetoric has been delivered repeatedly and enthusiastically since President Ronald Reagan declared the official start to the War on Drugs in 1982. Thirty years and a billion episodes of Law & Order later, we are all fluent in the language of narcotics.

Unfortunately, most of us have turned a blind eye to the mass incarceration of young black men in America during this time. Most of us shrugged it off. Most of us have failed to comprehend the rise of the prison industrial complex. Most of us, but not all of us.

In her book, The New Jim Crow, Michelle Alexander speaks to both the sociological and institutional aspects of racism in the American legal system. Since its publication in 2010, her book has been gradually galvanizing members of the black community around the concept of incarceration as a new form of slavery. And because of the efforts of outspoken leaders such as Dr. Cornel West, tireless advocacy from grassroots drug and prison reform groups and the comprehensive analysis provided by Alexander, the nation is beginning to speak about incarceration with a new language.

Rev. Roger Williams, pastor of the First Baptist Church in Glen Cove, N.Y., and president of the local NAACP chapter, says the reaction in the black community has been “multifaceted.” He says Alexander’s book has certainly inspired debate, with some putting “all of the onus on the black community,” others who have a “balanced understanding,” and “then you have those who feel like white folks are coming for you.” In every case, says Williams, “it’s almost like shoveling smoke trying to get a consensus, but it’s certainly stirring leadership.”

Fred Brewington, a prominent New York attorney and activist, has lectured frequently on this issue and even given sermons on The New Jim Crow, as he lives it every day in the criminal justice system.
“Unfortunately, the system has become the norm,” says Brewington. He shares Williams’ view that the book hasn’t necessarily filtered through the black community, but it has started to take root. “It’s not as though everyone is waking up and saying, ‘Where are all our African American men?’” But he calls Alexander’s book a “wonderful compilation of information that is there for the use of front-line advocates.”

Alexander’s book boldly equates the effects of today’s punitive drug laws to those of the Jim Crow laws that legalized segregation and unequal treatment under the law with respect to race. Specifically, she addresses the mass incarceration of black men in America under draconian drug statutes. For those who believe her analogy is a stretch, Alexander has a powerful weapon at her disposal: statistics.

Our modern journey to enslavement begins in 1972 in the years immediately following stark gains made during the Civil Rights movement. The prison population was around 350,000 as compared to 2.2 million people today. In 1972, violent crime had already peaked and was on the decline in the United States. The reason for the peak during the prior years was arguably the result of the Baby Boomers being between 18 and 25 years old—the prime adolescent years of criminal agitation—mixed with civil unrest and protests during the Vietnam era.

But by the mid to late ’70s, conscription had formally ended, the Boomers were more worried about getting jobs than getting high and violent crime was precipitously declining. As Alexander notes in The New Jim Crow, the National Advisory Commission on Criminal Justice Standards and Goals recommended as early as 1973, “no new institutions for adults should be built and existing institutions for juveniles should be closed.”

Sociologists and criminologists had come to realize that punitive punishments and long-term sentences had little to no positive impact on crime statistics and that rehabilitation and treatment were more appropriate measures for all but the most violent criminals. Plus, the numbers were on their side. Despite a difficult economy, violent crime was falling—not only in the United States, but also around the globe. Given these circumstances, it was somewhat surprising that President Reagan declared an official “War on Drugs” in 1982, only two years into his first term. Surprising also because America didn’t really have a drug problem in 1982.

Ask enough people from a black neighborhood where “crack” came from, and it won’t take long for someone to tell you it was the CIA. This point has been hotly debated for years. But the fact remains that the period during which cocaine first began flooding the streets of American cities coincides precisely with the start of CIA operations in Central America, specifically Nicaragua. In the early 1980s guerrilla fighters in Nicaragua were suddenly flush with cash from American drug dealers—cash used to purchase American weapons in their fight against the Sandinistas, the Marxist government that aligned itself with Cuba.

In 1982, the U.S. Attorney General drafted a Memorandum of Understanding to the CIA establishing the United States’ interest in overthrowing the Sandinista government in Nicaragua; the same year the Reagan administration declared the War on Drugs. But crack cocaine had yet to reach the streets. It would take another three years for crack to begin appearing in the black neighborhoods; crack derived from cocaine funneled from Nicaragua. Call it a conspiracy or an incredible coincidence, but the timing is irrefutable. In the meantime, however, the Reagan administration didn’t sit idly by and wait for crack to become an epidemic. It had laws to change and a paradigm to shift. It didn’t take long.

Despite the downward trend of violent crime and no evidence yet of a rampant drug problem, the Reagan administration increased anti-drug funding for the FBI, Department of Defense and the Drug Enforcement Administration tenfold between 1980 and 1984; almost the exact size of the funding decrease to federal drug treatment, rehabilitation and education programs. Cocaine funneled from Central America hit the streets in 1985 in the form of crack and was deemed an epidemic by the media by 1986. By the end of 1986 the country had already adopted mandatory minimum sentencing requirements for drug-related felonies.

In less than five years a crisis had been fully manufactured in our cities and federal, state and local law enforcement agencies were given incentives in the form of military arsenals and cash to increase the number of arrests. Police departments were suddenly competing for cash grants, assault weapons and air power. The government’s sudden change of course and willingness to fund anything related to drug crimes also created an opportunity for private industry, which was only too anxious to jump into the fray.

In 1983, Corrections Corporation of America (CCA), the first privately held prison corporation, was formed. Despite the historically low prison population, the government’s drug war prompted private industry to suddenly jump into the incarceration game. Today, CCA is a nearly $2 billion (and growing) corporation with more than 90,000 “beds” under its control.

Allowing for privatization of our prisons is one of the more egregious examples of how divorced our policymakers are from common sense in this country. The goal of a private penal corporation is to profit from high and extended rates of “occupancy.” (CCA literally speaks in these terms as though it was part of the hospitality industry.) The private prison lobby in America has pressured lawmakers over the years to maintain harsh minimum sentencing requirements as corporations have little financial incentive to encourage rehabilitation of prisoners. As far as the private prison industry is concerned, the only useful felon is one who is incarcerated, not reformed.

Reagan’s “war” saw a clean population getting hooked on drugs. During this “war,” rehabilitation was replaced with recidivism. Treatment was abandoned in favor of solitary confinement. Education was upended by “stop and frisk.” Prevention was sacrificed in the name of incarceration. The result? Half of all inmates today are in prison for drug-related crimes, of which 80 percent are related to possession of marijuana. To say the black community bore the brunt of this war is an understatement. To wit, more black American men are in the prison system today than there were slaves just prior to the Civil War. Present the statistics any way you please. There’s no pretty picture to paint. Black America is once again in chains.

The System
Each year, hundreds of thousands of “stop-and-frisk” acts are performed in black neighborhoods. They are rarely, if ever, conducted in white neighborhoods, office complexes or college campuses. Nevertheless, politicians point to the success of “stop and frisk” in the absolute number of people arrested for carrying drugs instead of the miniscule percentage of people found carrying drugs who were searched. I’m no mathematician, but logic would dictate that if you only stop and search people in black neighborhoods, then when you find drugs on someone the chances are that person is going to be black.

The reasoning behind “stop and frisk” is so specious and the process itself so unconstitutional it defies logic. And yet, it’s generally upheld in court. In 2012, 533,000 people were subjected to “stop and frisk” by the NYPD, according to the NY Civil Liberties Union (NYCLU). Once again, even though blacks comprise 25 percent of the city’s population, they made up 55 percent of those who were stopped and frisked.

Many officers are unhappy with the “stop-and-frisk” protocol but are caught up in the nightmare due to pressure that comes from the top. Recently, the New York Daily News reported on a case where NYPD Officer Pedro Serrano testified against the department after taping his supervisor, Deputy Inspector Christopher McCormack, telling him to target “male blacks. And I told you at roll call, and I have no problem [to] tell you this, male blacks 14 to 21.” These kinds of orders are not unique. They stem from quotas that are often handed down from the police brass. And officers such as Serrano who speak out against these practices are often shunned by their colleagues.

But wrestling with one’s conscience and struggling to maintain police quotas is nothing compared to the hell that awaits a young black man swept up into the web of “stop and frisk.” Once in court, the odds are stacked against him. In a recent conversation, Brewington described the harrowing the process of being caught by the police and ushered through the “system.”

Those with a prior arrest who are brought in on possession charges may meet an attorney such as Brewington in the holding cell. They’re actually one of the lucky ones, as a staggering number of accused felons make it all the way to sentencing in front of a judge without ever having spoken to an attorney. A far cry from what happens on TV. Brewington describes the encounter as something less than a conversation, as he advises his client to answer simply “yes” or “no” because everyone around him has an incentive to use his words against him in their own plea deal.

Time is of the essence, as he is typically carrying an offer from the D.A. that is set to expire quickly. Whether they want to go free is not a question he will raise. They’re in the system now. The only question is, how long? Risking an appearance in front of a jury means risking a much longer sentence.

“The fear is that you’re going to get a jury that’s really not of your peers,” says Brewington, who is loath to advise a jury trial. He says many of the young men he encounters “have not acquired the requisite skills to appear sympathetic” in front of a jury “that looks at you as though you must have done something wrong.”

The confusing whirlwind of circumstances between being frisked by law enforcement officials and accepting a plea deal is just the start, a piece of the legacy from Reagan’s “War on Drugs.”
But if Ronald Reagan was responsible for putting so many black people behind bars, it was Bill Clinton who was most responsible for keeping them there. In an effort to make Democrats appear “tough on crime,” the Clinton administration institutionalized punitive measures outside of the system, such as lifetime bans on some forms of welfare including access to food stamps, government jobs and public housing. Parolees, now branded as felons for life, were suddenly unable to leave their district while being forbidden from returning home, accessing food and gaining employment in the public sector.

“If the initiative is to eradicate the drug trade,” says Williams, the opposite occurred. “What you’re doing is inducing the necessary anger on the inside that will be accentuated when they come back. And the only thing that will accept them back is the game.”


Throughout the ’90s, recidivism spiked and parolees came face to face with President Clinton’s most punitive anti-crime measures—the “Three Strikes” rule and mandatory minimums. Under Clinton, life sentences were mandated for any third-time felon, or felon convicted of multiple counts, regardless of the nature or severity of the crime. Mandatory minimum sentences for even the lowest level drug offenders were implemented as outrage finally began to creep into American consciousness. Black churches and organizations were up in arms. Some judges resigned. Alexander even recounts the story of a notoriously harsh judge who wept when forced to hand down a 10-year sentence “for what appeared to be a minor mistake in judgment in having given a ride to a drug dealer for a meeting with an undercover agent.”

Beyond the practical hindrances a felon faces in attempting to re-enter society, there’s an emotional burden and stigma that is carried forever; a burden that extends to the family as well. Dr. Jeffrey Reynolds, president of the Long Island Council on Alcoholism and Drug Dependence, runs programs to counsel children of incarcerated parents. While their parents are on the inside, the kids “suffer guilt, shame and isolation,” says Reynolds, adding, “Seventy percent of kids of incarcerated parents, without intervention, wind up incarcerated themselves.” But he speaks to the effectiveness of intervention, saying, “None of our kids have been incarcerated. With a little bit of help and a little bit of energy, it makes a huge difference.”

Even those who are released carry with them the shame of having been on the inside and the painful memories that accompany incarceration. Horrifically, more than 70,000 prisoners are raped every year. Additionally, tens of thousands of prisoners are locked in solitary confinement at any given time in the United States, a punishment usually employed by totalitarian regimes that was all but outlawed in the United States prior to Reagan’s War on Drugs and the emergence of the modern prison industrial complex.

Nullification is a “Juror’s Prerogative”
“Unjust laws exist; shall we be content to obey them, or shall we endeavor to amend them, and obey them until we have succeeded, or shall we transgress them at once?”
—Henry David Thoreau, Civil Disobedience

You don’t have to agree that the “War on Drugs” was an intentional war on the poor, disenfranchised people of color in this country to understand that this was the result. Thinking, feeling people know these laws must be changed. And while we, as citizens, must indeed protest, engage in civil disobedience and write to Congress, there is more that can be done and it begins with understanding your rights.

In a New York Times op-ed last year, Alexander floated a question raised to her by a woman named Susan Burton. Her question was simple, but brilliant: What if there was a movement to convince “thousands, even hundreds of thousands, of people charged with crimes to refuse to play the game, to refuse to plea out?” Her supposition was that this would theoretically crash the criminal justice system. She’s right. But the risk would be enormous given the potential and very legal retribution the system provides for.

But if the black community is examining this option and weighing the risks of such a strategy, it is incumbent upon the white liberal community to do the same on the opposite side of the equation. In this scenario, African Americans have everything to lose and white people have nothing to lose. So to possess this knowledge, have nothing to lose and still refuse to be an “upstander” is to be silently complicit in modern-day slavery.

Most white Americans have only a casual relationship with our legal system. Their understanding of what is just and what is legal generally comes from watching television crime shows and movies. This is why most people have the impression that the sole responsibility of a juror is to deliver a verdict based upon legal facts and that his or her personal feelings of fairness and justice cannot be considered.

This is patently false.

If you manage to get by “voir dire,” the process of questioning jurors to sit for a particular trial, and are fortunate enough to be selected, you can participate in a revolutionary movement. You can hang a jury without ever having to explain why. Jurors such as this are referred to as “stealth jurors.” Quiet activists who are guided by conscience not convention, or as Fred Brewington says, “The jury becomes the advocate for society.”
But first, you have to be in the position to do so. The key to getting through voir dire is to answer honestly without revealing anything ideologically. There is a science to voir dire and cases are often determined by how adroit an attorney is at selecting a jury. So remember these simple facts:

1) Less is more: You cannot misrepresent yourself by exercising restraint during voir dire.
2) You are not the one on trial.
3) Your goal is to get on that jury.

Serving on a jury is tedious, time-consuming and may even be financially detrimental. There is nothing romantic about the inner-workings of our legal system, no matter how glorified it is on television. Moreover, only a handful of Americans will actually be selected for a trial that involves drug possession charges for the reasons I stated in the opening of this piece. The goal here is to make enough people aware that the reason our system was designed to have trials decided by a “jury of one’s peers” was to prevent unjust laws from unfairly condemning citizens to incarceration or any form of punishment.

Like I said, the chance of being picked for a jury that involves drug possession charges is extremely remote. But our ability to disseminate a simple message of civil obedience to encourage defiance in the face of injustice has never been greater. If millions of Americans know who Joseph Kony is and know how to dance “Gangnam-Style” then they can at least understand their legal right and moral obligation to hang a jury in the case of drug possession charges.

Twitter. Facebook. Smoke signals. Whatever your preferred method of communication, it’s time to spread the word and find the “one in twelve” willing to hang the jury.

 

This article is an excerpt from Jed Morey’s forthcoming book titled The Great American Disconnect: Five Fundamental Threats to our Republic

“America in Chains” Illustration by Jon Moreno
“Dissenting Juror” Illustration by Jon Sasala
“Hang The Jury” Video by Rashed Mian
www.hangthejury.com created by Michael Conforti

In Sprecher We Trust. Hopefully.

Jeffrey Sprecher built a better mousetrap. But a mousetrap big enough to catch a whale? Apparently so.

Jeffrey Sprecher built a better mousetrap. But a mousetrap big enough to catch a whale? Apparently so. Sprecher is the founder and president of the Intercontinental Exchange (ICE) based in Atlanta. For all practical purposes he is the poster-boy of electronic trading and the man responsible for the meteoric rise of commodities trading. He’s also about to become the owner of the New York Stock Exchange. Do I have your attention yet?

In little more than a decade the commodities market has gone from $10 billion– a speck on the trading horizon – to more than half a trillion dollars. Nathaniel Popper’s front-page story in the business section of the New York Times today pulls the veil back on Sprecher the man and describes how he grew a little-known southern exchange into a juggernaut capable of purchasing the vaunted New York Stock Exchange. As Popper himself writes, “It sounds preposterous.”

That’s because it is.

Popper’s piece brings forward a story that few people know. Most have no idea that trading exchanges are even for-profit businesses. And while he does a worthy job demystifying the business of exchanges he overlooks the planet-sized regulatory loopholes that allowed Sprecher to convert a small energy futures trading exchange into a global Franken-exchange that is buying the biggest, most well known exchange on Earth.

Sprecher was even a bridesmaid recently when he nearly scuttled a merger between the Chicago Board of Trade (CBOT) and the Chicago Mercantile Exchange by coming in with a higher bid for the CBOT. The Chicago trading establishment was so freaked out by Sprecher’s surprise bid that they put their legendary differences aside and came to a deal faster than might otherwise have occurred had he not been breathing down everyone’s neck.

Though he was unsuccessful in his last minute bid, Sprecher moved deftly like a great white shark through the rocky financial seas in search of his next prey. Never sleeping, always moving, forever hungry.

To call Sprecher an opportunist would be technically accurate but cheap and intellectually dishonest. He understood the inevitability of electronic trading and the superior potential it held. If the Bloomberg terminal revolution was in providing information quickly and precisely then the Sprecher ICE revolution was in giving traders (and the houses they worked for) the ability to act upon information in the same fashion. My criticism of Sprecher – and Popper for that matter – is the way in which the story of the ICE has come to be told and accepted.

Missing from the brief history of the ICE are the loopholes that gave it life and the ability to flourish beyond imagination. It was the oft-spoken of – but rarely understood – “Enron Loophole” that gave corporations the legal right to trade energy futures even if the corporation itself was in the business of energy. This is the simplest way to convey its net result. The second loophole (and more meaningful for the ICE) was a maneuver by the Bush administration that granted the ICE foreign status as an exchange despite being based in Atlanta. This initiated a massive shift of trading dollars, and influx of new ones, onto the ICE for one reason: this singular move placed the ICE outside the purview of U.S. regulators at the Commodities Futures and Trading Commission (CFTC). Essentially, corporations could now trade energy futures electronically through the ICE without oversight or disclosure.

Sprecher has often stated that one of the great benefits of electronic trading is its inherent transparency. Theoretically, performing trades between parties on a screen reduces the likelihood of transactions being rigged. In some ways he’s right. We are unlikely to witness an old school “corner” where one party dupes all others into trading with it until it controls the vast majority, or position, of the item being traded. Electronic trading moves too quickly and there are too many players involved. But speed does not imply market transparency and openness.

Moreover, the mere fact that the founding investors of the ICE are some of the world’s largest investment banks and oil companies (Morgan Stanley, Goldman Sachs and BP) speaks to how little transparency there truly is. The fact that some of these banks (Morgan Stanley in particular) own and control oil companies and oil companies operate trading desks outside U.S. jurisdiction demonstrates how little need there is for small-time corners. Why pull off a two-bit corner when you have already cornered the entire marketplace?

Now, as Sprecher prepares to close on this historic transaction, investors, citizens and the government are about to be one step further removed from any realistic shot at transparency and oversight.

This in no way takes away from Sprecher’s genius as a businessman. It simply illustrates how willfully ignorant we are to the business of Wall Street and therefore how frightfully far away we are from properly regulating it. Everything Sprecher has done is legal and ethical; to the extent there is an ethos on Wall Street. Where all of this hits home for the consumer is at places like the gas pump and supermarket. The most important and direct relationship most of us have to Jeff Sprecher’s mousetrap is the high cost of the gas we pump and food we consume. Banks and oil companies have a vested interest in Sprecher’s success and in increasing their own revenues. Both are perfectly, mutually aligned. So far they have been able to grow profits with alacrity, free from federal oversight and bolstered by our collective ignorance of the process.

We’ve all been caught in Jeffrey Sprecher’s mousetrap. Now the question is will he “catch and release” or dispose of us in search of his next conquest. I hope he’s as nice and down-to-earth as Popper suggests.

 

Image: From 2008 Long Island Press cover story explaining the rise of the ICE and how Morgan Stanley became one of the largest oil companies in the world. For more on this story view the video below:

 

Drone Strikes and the Definition of War

The legality of an unmanned drone strike is subordinate to the morality of it. Further, it challenges our ability to define war; somehow the connection between direct human action and murder codifies the nature of true conflict.

Marines are trained to fire in unison at the enemy. It erases individual culpability by establishing a psychological barrier between the shooter and the target. Sharing the responsibility for a “kill” assuages personal guilt and allows soldiers to better compartmentalize traumatic events, or so the theory goes.

 This type of rationalization is made even more powerful (or palatable) by the remoteness that unmanned aerial vehicles (UAVs), commonly known as “drones,” provide. For most of the past decade UAVs have hammered away at al-Qaeda and Taliban insurgents hiding in the mountainous terrain of Pakistan that borders Afghanistan. And though there was little, if any, talk of controversial drone strikes during the presidential election, the use of UAVs has reached a tipping point in global politics.

The legality of an unmanned drone strike is subordinate to the morality of it. Further, it challenges our ability to define war; somehow the connection between direct human action and murder codifies the nature of true conflict. The struggle to define this type of faceless modern warfare suggests that we are moving away from a discussion of immorality and toward amorality; exactly the point our democratic ideals of “purposeful” and defensive war devolves into outright nihilism.

The anonymity and precision of drone strikes uses our military resources efficiently while wreaking havoc on our enemies abroad. They also enable the United States to carry out an offensive in a country like Pakistan when we are technically not at war with its government. In fact, we are operating with its tacit approval. For now. But if every strike was carried out directly by human hands, there would be little doubt we are indeed at war as it is conventionally defined. Now, in its second term, the Obama administration is wrestling with whether to declassify the drone program that everyone already knows about because it would put us firmly at odds with international law.

Unmanned drones were conceived and perfected by the George W. Bush administration but they were used far more sparingly compared to the Obama administration. Terrorism, or the threat of it, continues to be the raison d’etat that justifies our aggression and the use of drones. In this, the administrations are aligned. A terrorist killed with little collateral damage and zero American bloodshed is enticing but illusory because the technology is portable and easily replicable. It will undoubtedly be developed and deployed by other nations free to define targets by their own standards.

The tacit approval of drones by the Pakistani government does not erase the fact that we are threatening our national security in the long run; we are establishing an international precedent that we will someday be forced to confront.

To begin, many of the militants we target abroad have sought refuge in other nations such as Yemen and Somalia. And our drones have followed. Yet if the government of Yemen, were it capable and so inclined, bombed a US-based manufacturing plant that produced parts for UAVs, they would technically be justified in doing so by our own standards. If China decided to send drones into Tibet, or if Russia targeted Georgia, the same logic would hold true.


The New York Times reporter Scott Shane revealed in an article Sunday concerns within the Obama administration over what they call an “amorphous” policy; this worry increased prior to the election for fear of leaving an open-ended policy to an incoming Romney administration. According to Shane, victory has allowed the White House to take its foot off of the accelerator for the moment, but it remains an important part of the president’s agenda.

But this kind of sudden realization that current policy might become permanent and out-of-control has become a troubling hallmark of the Obama presidency. Clear evidence of this is found in Obama’s refusal to fight the “indefinite detention” provision in the 2012 National Defense Authorization Act. Critics fear that the language of this provision was so murky that it theoretically gives the government license to detain American citizens without due process. Instead of eliminating this verbiage and the conflict that surrounds it, Obama attached a signing statement to the bill that directly addresses the detention provision and essentially says that while he is aware of the fear it engenders, he would never use it to detain a US citizen. The very existence of the signing statement, however, is an admission that it is indeed open to interpretation; future presidents are not bound to Obama’s statement, but the law itself.

Understanding the psychology of the Obama administration or establishing a clear policy regarding drone strikes ultimately does nothing to more clearly delineate the nature of modern, human-less aggression. Carl von Clausewitz, who contributed as much to the understanding of our relationship with war as any writer on the subject, suggests in his defining work, On War, published in 1832, that: “The act of War can only be of two kinds; either the conquest of some small or moderate portion of the enemy’s country, or the defence (sic) of our own until better times.”

This was a practical analysis befitting the times that endured to the end of the last millennium. It defined conflict between nations but not necessarily between enemies as they are presently constituted. Post-9/11 warfare has pitted America, which relies on borders and nationalism, against roving mercenaries whose only allegiance is to a higher authority we cannot overcome. Clausewitz allows for wiggle room in his conventional theory, however.

“The third case, which is probably the most common, is when neither party has anything definite to look for from the future when therefore it furnishes no motive for decision. In this case the offensive War is plainly imperative upon him who is politically the aggressor.”

President Obama appears to be hedging his bet by placing a chip on each of the cases above. Furthermore, his reliance upon UAVs is loosely justified by its purported success thus far. But it also presents a persistent and impossible conundrum that assails our conventional understanding of war.

Somehow in this mess, this fog of invisible war, we must extricate ourselves from establishing precedent before it hardens into accepted global policy. If not, this dangerous game of cat and mouse will haunt us as it disperses our enemies while strengthening their resolve. Only by bolstering ties and intelligence in this region through financial support and diplomatic incentives will we assemble a righteous strategy for the future. Moreover, a retreat from this policy preserves our right to punish our enemies authoritatively with the support of our allies, while regaining the moral high ground. 

To walk softly and carry a big stick implies restraint, and restraint implies strength and confidence. These are characteristics closer to what the president exudes, which begs the question as to why he has tethered himself to policies that are so cowardly.

Wall Street Regulation

Glass-Steagall has made somewhat of a comeback with help from the Occupy movement and rising political stars like Elizabeth Warren… The only two political insiders you won’t catch talking about reinstating Glass-Steagall both happen to be running for president.

Part 4 of the Special “Off The Reservation” Election Series in the Long Island Press.

The Banking Act of 1933, commonly known as Glass-Steagall, was established to tame the harmful speculative behavior of an industry run amok in the early part of the 20th century; behavior many observers at the time credited for the market crash that precipitated the Great Depression. For some, the repeal of Glass-Steagall, by the Gramm-Leach-Bliley Act of 1999, was the deathblow to financial prudence on Wall Street.

 In reality it was simply the formal recognition of careless financial practices that were largely in place already. Since the near-collapse of the banking industry in 2008, Glass-Steagall has made somewhat of a comeback with help from the Occupy movement and rising political stars like Elizabeth Warren, the former federal consumer protection advocate now running for Senate in Massachusetts. The only two political insiders you won’t catch talking about reinstating Glass-Steagall both happen to be running for president.

Wall Street reform is as important as it was in 2008 but both President Obama and Gov. Mitt Romney have taken great pains to avoid talking about it too much. For his part, President Obama seems content to rest on the laurels of the Dodd-Frank Act, Congress’s attempt to rein in Wall Street excess, which had enough support to pass but not enough to be properly funded or enforced. According to Romney’s platform, he would “Repeal Dodd-Frank and replace with streamlined, modern regulatory framework.” That’s the extent of his vision for the future of Wall Street according his platform. Ten words.

So while the rest of the country is suddenly talking about a law enacted almost 80 years ago, these guys aren’t going anywhere near it. The truth is, Wall Street reform and, more specifically Glass-Steagall, is more complicated, making it easy for Obama and Romney to be evasive.

So let’s answer two questions. What would actual Wall Street reform look like and what exactly was Glass-Steagall?

The purpose of the original act was to establish a barrier between traditional banks and the risk-taking investment firms, denying investment banks access to consumer deposits and secure, interest-bearing loans. The unwritten effect of Glass-Steagall, however, was to establish a culture of prudency in the consumer and business banking realm, leaving sophisticated professional investments to more savvy financiers who had the ability to calculate the inherent risk of a financial instrument. For decades to follow, the merits of Glass-Steagall would continue to be debated, but it nevertheless drew a marked distinction between the function of a consumer bank and an investment bank.

Today reinstating Glass-Steagall is a common rallying cry among those who decry the bad behavior of Wall Street. Its repeal has become the fulcrum of nearly every debate surrounding deregulation. Actually accomplishing this, of course, is easier said than done.

The best way to reconcile the debate over whether to reinstate Glass-Steagall is to appreciate that the culture of Glass-Steagall was more important than the act itself. Over time the restrictions placed on bankers under the act were chipped away, but the culture that governed the banking industry endured beyond its measures. Eventually, savvy bankers and politicians found ways to loosen its screws and interpret the act to their own benefit.

Don’t Just Blame Republicans

In 1978, President Jimmy Carter oversaw the passage of the International Banking Act, a bill that should probably receive as much, if not more attention than Gramm-Leach-Bliley. Essentially, the act allowed foreign banks or entities that engaged in “banking-like activities” to participate in domestic financial markets. For the first time, foreign investment firms were able to make competitive loans so long as they didn’t compete for consumer deposits; initially individual states could determine whether their regulatory structure could support this new activity. The government would go on to loosen restrictions governing the competition for consumer deposits and allowing bank holding companies to treat money markets like checking accounts.

In his book “End This Depression Now,” economist Paul Krugman argues that perhaps the most influential step with respect to the banking sector came with Carter’s passage of the “Monetary Control Act of 1980, which ended regulations that had prevented banks from paying interest on many kinds of deposits. Unfortunately, banking is not like trucking, and the effect of deregulation was not so much to encourage efficiency as to encourage risk taking.”

 By 1987 the bank holding companies, including foreign companies allowed to operate within the U.S. banking system, were granted access to mortgages to create a package of investments called mortgage-backed securities; the threshold for the amount of investing activity in instruments such as these was also increased, paving the way for the growth of investments backed by the strength (or weakness) of the consumer market.

During that same year, members of the Federal Reserve began calling for the repeal of Glass-Steagall as then-chairman Paul Volcker was providing the tie-breaking resistance. But this was a mere formality because by this time, Glass-Steagall was effectively over.

Yet even though most of the threads of regulation had been pulled from the overcoat that protected consumers from risky banking practices, the culture of prudent banking still existed to an extent; maintaining the Glass-Steagall Act on the books was an indication of this sentiment. Throughout the decades when regulations were steadily eroding, powerful national figures such as Paul Volcker under Carter and Reagan, and Treasury Secretary Nicholas Brady under George H.W. Bush managed to temper the enthusiasm of the movement.

That George Bush Senior heeded their admonitions was an admission that the public’s appetite for deregulation was actually beginning to wane in the post-Reagan hangover. Richard Berke’s New York Times article of Dec. 11, 1988, on the eve of the Bush presidency, encapsulated this feeling. Berke wrote, “Lawmakers and analysts say the pressure is fed by a heightened public uneasiness about deregulatory shortcomings that touch the daily lives of millions of Americans: from delays at airports and strains on the air traffic control system to the presence of hazardous chemicals in the workplace to worries about the safety of money deposited in savings institutions.” Alas, these four years would prove to be a momentary hiccup in the deregulation movement.

During the Clinton years, the nation’s leadership was largely comprised of proponents of deregulation. In fact, by his second term, Clinton was almost entirely surrounded by rabid free market enthusiasts. A former chairman at Goldman Sachs, Robert Rubin, was Secretary of the Treasury, Alan Greenspan was still at the helm of the Federal Reserve and Phil Gramm was the head of the powerful Senate Banking Committee. All of these men had close ties to Wall Street and made no secret of their intention to release bankers from the burdensome shackles of regulation and oversight.

Reforming Reform

In 2008, economist Joseph Stiglitz warned of the enduring negative consequences of deregulation. At a hearing held in front of the House Committee on Financial Services, Stiglitz invoked Adam Smith saying, “Even he recognized that unregulated markets will try to restrict competition, and without strong competition markets will not be efficient.” One of Stiglitz’s solutions was to restore transparency to investments and the markets themselves by restricting “banks’ dealing with criminals, unregulated and non-transparent hedge funds, and off-shore banks that do not conform to regulatory and accounting standards of our highly regulated financial entities.”

For emphasis he noted, “We have shown that we can do this when we want, when terrorism is the issue.”

Still, the nagging question remains as to what reform might look like. After all, not all deregulation is irresponsible. Most of the discussion in the media surrounding deregulation revolves around the concept that our banking institutions are “too big to fail.” Thus the rallying cry for reinstating Glass-Steagall and separating banks from investment banks. I’m in tepid agreement with the underlying principle, but the reality of the situation is far more complicated. The fact is banking has gone global and the deregulation genie is out of the bottle.

As I said earlier, Glass-Steagall was as much about instilling a culture of prudency to the banking world as it was about erecting a barrier between commercial banks and investment banks. Advocates like Elizabeth Warren like to say that prior to 1999 and the repeal of Glass-Steagall, the economy functioned through periods of both prosperity and recession since 1934 without the banking sector once collapsing. It’s a fair, but oversimplified assertion that overlooks the fact that Glass-Steagall was on a ventilator in 1978 and dead by 1980. A 30-year run of prosperity from 1978 to 2008, with a few brief recessions in between, is nothing to sneeze at.

Restoring balance to the banking sector does not necessarily require separating the banks. Not yet at least. It begins with transparency and reestablishing the culture of prudency that has been conspicuously absent over the past decade. After all, you cannot value what you cannot see; nor can you mitigate risk unless you first manage reward.

What this really boils down to is accountability, which is ultimately a behavioral issue. Allowing investors to actually see how a bank behaves by viewing the size and scope of their transactions would theoretically assuage their appetite for risk. Given these conclusions, it’s easier to make the case that our current president would provide more accountability and inspire behavioral changes on Wall Street, particularly given Romney’s intransigence when it comes to considering financial reform. But tough talk against Wall Street has all but disappeared from Obama’s rhetoric leaving little hope that a second term will elicit any further positive change. So this week, while neither man seems serious about financial reform, the status quo is better than further deregulation and letting bankers rule the roost.

Tie goes to the incumbent.

To Spend or Not To Spend

To examine the effect the stimulus had on the economy, it’s necessary to understand the economic philosophy behind it while parsing the figures. The conflict between Democrats and Republicans on this issue is largely a debate over the economic theories of two men: Milton Friedman and John Maynard Keynes.

Mitt Romney called it “the biggest, most careless one-time expenditure by the federal government in history.” Paul Ryan characterized it as “a case of political patronage, corporate welfare, and cronyism at their worst.”

“It” was the American Recovery and Reinvestment Act of 2009, colloquially known as the “Obama Stimulus.” The Republican narrative is that Americans would have been better off not taking on more debt and allowing the omniscient markets to work themselves out. (This argument was noticeably absent in 2008 when President George W. Bush signed a stimulus bill for more than $150 billion.) Before  Obama signed his stimulus bill into law, House Republicans had voted against it. Every single one of them. In the Senate, only three Republicans approved the bill.

So we know where the parties stood in 2009—pretty much where they stand today. Democrats largely believe that the stimulus prevented the complete, Depression-like collapse of the economy. Republicans believe it had no effect on the economy and, furthermore, the additional debt will be our ultimate undoing. Republicans are correct to say that the stimulus had few offsetting revenues and blew yet another enormous hole in the budget deficit. They did not make this argument, however, when our country decided to wage two decade-long wars abroad while simultaneously reducing taxes. But the reality of the unfunded stimulus expense exists. So the question remains: Did the stimulus work?

Both Democrats and Republicans point to FDR’s New Deal to answer this question historically. Republicans take the short view that FDR’s programs had little effect on the nation’s economy as the economy double-dipped in 1937. Democrats take the long view that this date coincided with the Roosevelt administration’s decision to back off federal spending and that a resurgence of federal funding ultimately mitigated the decline. There is general consensus that the tipping point that put the nation back on a path toward prosperity was World War II and the wartime economy. Despite this philosophical harmony, Republicans are still loath to admit that the top marginal income tax rate in 1941 was 81 percent, and by 1945 it was 94 percent. That’s how you pay for war.

So while it can be instructive to look back and apply historical lessons to the present, the picture is incomplete because the circumstances are vastly different. To examine the effect the stimulus had on the economy, it’s necessary to understand the economic philosophy behind it while parsing the figures. The conflict between Democrats and Republicans on this issue is largely a debate over the economic theories of two men: Milton Friedman and John Maynard Keynes.

Born in 1912, Friedman would come to be recognized as one of the great economic minds of the modern era. A Nobel Prize-winning economist who taught at the University of Chicago, Friedman held a wide range of core libertarian views and is often credited as one of the principals of the ideology. Throughout his career he argued the benefits of monetary policy and the folly of fiscal policy. Think TARP versus stimulus. In other words, maneuvering liquidity through the system in a centralized fashion was an appropriate measure of government intervention whereas providing government funding for programs via the Treasury was not.

This is not to say that Friedman would have approved of President George W. Bush’s TARP “bailout” of the banks (Friedman died in 2006 before the financial world came unraveled) or even of the Federal Reserve itself. In a perfect world, Friedman would have abolished the Federal Reserve altogether, which is a common rallying cry among Libertarians who also promote a return to the gold standard no matter how economically or politically impossible this would be. Again, the theory being that private markets would be more efficient, accurate and apolitical with respect to pegging the value of currency in real time.

But if Friedman’s economic policies have dominated the years since Gerald Ford was in the White House, it was English economist John Maynard Keynes who dominated the years prior, beginning in 1933 with his paper, “The Means to Prosperity.” Keynes’ recommendations for dealing with recessions and depressions would fundamentally alter Europe and America’s approach to the Great Depression. Keynes’ first assumption, considered revolutionary at the time, was called the “paradox of thrift.” Simply put, if businesses and consumers collectively tighten their belts during difficult times, the effect would be a downward spiral in the demand for goods and services.

Under Keynes’ theory, this self-perpetuating loop of plunging demand would necessarily result in a decline of both profitability and confidence. Keynes believed the antidote was government spending. Specifically, the further the funding went down the economic chain the better. Businesses and consumers, those with the greatest need for liquidity, were likely to circulate government funds through the economy faster than institutions such as banks that might be more prone to hold onto liquidity. The net result, due to what Keynes coined the “multiplier effect,” would be spending that works its way through the normal economic channels via the purchase of goods and services at the consumer level, labor and equipment at the business level.

A great deal of attention is paid to the short-term effects of spending on infrastructure as large public works projects during the Depression became the most visible and lasting testaments to Keynesian economy theory during the Roosevelt era. But many Keynesian theorists argue that these types of projects also contribute to the long-term health of the economy, with the best possible result being partnership with, and ultimately transition to, private industry. A great example of this is the Tennessee Valley Authority (TVA) established under FDR, which ultimately became a private utility. But long-term infrastructure projects don’t have the immediate effect of direct government spending at the bottom levels of the economy.

Larry Summers, the notoriously prickly economist, has had a remarkable career serving in both the Clinton and Obama administrations (Summers was Treasury Secretary briefly under Clinton) and as one-time president of Harvard University. Tapped to join Obama’s transition team, he is credited with determining the strategy for bailing out the faltering American economy. In his book, The New New Deal, Time magazine senior staffer Michael Grunwald writes, “At Brookings, [Summers] proposed a technocratic approach to Keynesian stimulus that has dominated the debate ever since. A stimulus package, he argued, should be timely, targeted, and temporary.”

This guiding philosophy would result in a three-tiered approach to Obama’s stimulus. The first would be accomplished through tax breaks for the vast majority of Americans. The second would be through entitlement spending such as extending unemployment benefits and prolonging health insurance coverage for laid-off workers. It also provided direct aid to states to help plug budget gaps to prevent the layoffs of teachers and reductions to Medicaid. The third was investment in programs deemed “shovel-ready.”

This last point is somewhat controversial because few, if any, infrastructure projects can begin work at a moment’s notice. But on this, Obama was clear that funds would be found to target America’s aging infrastructure and invest in new projects on the drawing board, even if their timetables weren’t immediate.

Keynesian economists such as Joseph Stiglitz quickly lauded Obama’s plan, though most of them  believed the $787 billion package was only about half of what was required to properly address the crisis. Another Keynes disciple, Nobel Prize-winning economist and columnist for The New York Times, Paul Krugman, has been extremely vocal that the stimulus, while swift and necessary, was “woefully inadequate.” Nearly everyone on Obama’s transition team would concur, but the thought of a stimulus package topping $1 trillion was politically radioactive. Besides, almost everyone involved at the time hoped for a second crack at stimulus funding in Obama’s first term. And while most of Obama’s political advisors understood how difficult this would be, no one could have predicted how hard the Republican Party was preparing to fight against any new proposal from the Democrats.

Perhaps the most astounding revelation in Grunwald’s book is how Obama’s inner circle — especially the most cynical among them like the explosive Rahm Emanuel or acerbic Larry Summers — understood that the package was political suicide. In fact, they were prescient in this regard as the stimulus provided the freshly-routed GOP with a rallying cry and a strategy to take back control of the House of Representatives during the 2010 mid-term elections.

In reality, the Recovery Act did more than just pump taxpayer dollars temporarily into the economy and drive up the national debt. It put federal funds into the hands of agencies and consumers who had the ability to spend them in a timely fashion: tax cuts for the middle class, extended unemployment benefits and medical coverage, state aid to shore up endangered Medicaid programs, and investments in healthcare and student loans. It was the ultimate return to Keynesian philosophy.

Opposition to blanket stimulus funding isn’t fundamentally misguided. After all, no government can sustain unlimited subsidies without someday having to recoup these costs. This brings us to the second half of Keynes’ theory. If the government is supposed to aid a recovery during a recession by pouring funds through the economy, then it must likewise rebuild its finances during the boom times that follow. There are only two ways to do this: raise taxes or cut spending. Or both. The problem is that we haven’t meaningfully done either in decades.

While cutting spending is very much a part of the Republican narrative, increasing taxes is anything but. In a perfect world of no government intervention or regulation, the argument goes, the markets would simply figure it out and restore balance, because recessions and depressions are, after all, bad for business in the long run. That said, this type of boom-and-bust behavior offers great potential for short-term gain, as volatility is a savvy investor’s best friend. But Keynes never meant to eliminate the boom-and-bust nature of the economy; his policies were intended to soften both the troughs and the peaks.

Shedding all government spending and letting the markets work it out was precisely the advice President Hoover received from Treasury Secretary Andrew Mellon after the market crashed in 1929. Hoover didn’t actually follow that advice. Instead, he set in motion many of the public works projects and federal spending plans later continued by Franklin Roosevelt. The Depression was hung around Hoover’s neck in part because he chose to project an aura of calm and confidence even though Rome was indeed burning.

Hoover fought vigorously behind the scenes for some of the programs that would make Roosevelt one of the most popular presidents of all time. Hoover’s biggest problem was actually Roosevelt. Because Hoover rarely took the opportunity to point out that the economy collapsed as a result of his predecessor’s policies and then failed to defend himself against Roosevelt’s subsequent attacks, he became unfairly synonymous with the Great Depression. This little bit of history was not lost on Obama.

Today, comparisons abound between the circumstances surrounding the Great Depression and the (dare I say) current depression. Politicians and historians will forever debate their similarities and how each came about. But there are also contemporary comparisons we can draw relating to Keynes’ paradox. In Europe today, where austerity is the mainstay of the economic recovery attempt, unemployment remains untenably high; in both Spain and Greece it hovers around a bruising 24 percent. Before the stimulus, the U.S. economy shed 800,000 jobs in January of 2009 alone and GDP growth was negative. Since the beginning of 2010, America has added an average of 143,000 jobs every month and experienced positive GDP growth, although everyone acknowledges it’s a slog. This kind of forward momentum amply defends the stimulus.

Beyond facts and figures, be sure to listen closely for what you cannot hear. Perhaps the most incredible aspect of the stimulus is the lack of fraud associated with the spending. The oversight has been so rigorous and the process so astoundingly transparent that almost no one questions the integrity of the disbursements. Instead, opponents gnash their teeth and shout at the rain about Solyndra, the failed California solar panel manufacturer, at every turn. And that’s about it. Forget the fact that the mechanism for funding Solyndra was established in 2005 and that Solyndra was selected to participate in the program in 2007; if opponents of the stimulus want to make this their Alamo, so be it. Out of nearly $800 billion invested, one failed solar manufacturer is all you’ve got? Even Bain Capital would have relished this level of success.

So, did it work? I side with Krugman: the stimulus package was a good start, but it should have been bigger. Nearly all of those involved in creating the stimulus recognized at the time that it would prevent catastrophe but fall short of prosperity. Unfortunately, our politics are so poisoned today that uttering the phrase “should’ve been bigger” is truly the third rail. There is no more room for a reasoned debate in America. But the fact remains that without the stimulus several state budgets would have collapsed, all but bankrupting Medicaid; far more roads and bridges would have fallen further into disrepair; middle-class Americans would have had less in each paycheck; and millions more people would have fallen off the unemployment rolls and into poverty.

All told, Ryan’s claims of “patronage” and “cronyism” fell apart the moment he lobbied to divert federal funds to his district. And Romney’s claim that the stimulus was “careless” betrays either a deep misunderstanding of the shrewd, tactical and successful nature of the program or a belief that no person, corporation or municipality deserves financial support, even under the most severe economic circumstances. Romney’s recent disdainful comments about “47 percent of Americans” may lend weight to the latter, which should give us all pause.

Fracking: The Ultimate Scam Revealed

By touting natural gas as the clean-burning fossil fuel that is cheaper to use and helps reduce our dependence on foreign oil, the industry has nailed the PR trifecta: cheaper, cleaner and patriotic.

One of the great joys of writing, as in science, is the accidental discovery. To wit: penicillin. And while this entry hardly ranks near Alexander Fleming’s pharmaceutical breakthrough, it does relieve a particular itch that has been nagging my brain. For months I have been vexed by the discrepancy in pricing between crude oil and natural gas. (Wait, I know how tedious commodities can be, but I promise you this column is worth sticking with.) Unable to settle on any fundamental market-based explanation, I placed the issue on the mental backburner. It was only when I decided to update a series of articles on the role of speculation in the commodities markets that I happened upon the most plausible solution to this puzzle.

First, a little context. Over the past couple of years New York State has been flirting with the idea of hydraulic fracturing, or “fracking.” The discovery of enormous pockets of natural gas in the Marcellus Shale formation, which runs from West Virginia through Pennsylvania and New York and as far as Ohio, has led to a modern-day gold rush in the region, with Pennsylvania several years ahead of New York. While the gas has always been there, it wasn’t until the turn of the millennium, when controversial chemical enhancements invented by Halliburton were added to a difficult horizontal drilling technique, that accessing this gas became feasible.

Almost immediately, however, environmental concerns began to mount. Stories of contaminated groundwater, intense air pollution and, most recently, a ruptured fault line and mini-earthquake in Youngstown, Ohio, on Dec. 31 have begun leaking into public consciousness. Gasland, a documentary by Josh Fox; increasingly agitated environmental organizations; and high-profile activists such as actor Mark Ruffalo have helped push fracking to a tipping point in the media. Once seen as a panacea for rural landowners in depressed parts of the country, fracking has become a pariah in the environmental community, setting the stage for yet another battle between the oil and gas industry and environmentalists. Caught in the middle of the entire fiasco at the moment is Gov. Andrew Cuomo, who is cautiously moving toward legalizing fracking in New York, though his public reticence highlights how tenuous this decision truly is.

Early on, I came down firmly against fracking in New York, and the Long Island Press was in the vanguard of reporting on it downstate. So I’m on record quite clearly as to why I believe fracking to be a disaster for New York, or anywhere else for that matter. No need to rehash this position. Still, one piece of the broader issue was missing—until now.

Here’s the issue: Fracking is expensive. The prolonged low market price of natural gas is the most logical deterrent to increased drilling, because it barely pays to pull the gas out of the ground. Moreover, the U.S. Energy Information Administration projects that natural gas demand in the United States should rise only 11 percent over the next 25 years, compared to a projected rise of more than 300 percent in China over the same period.

Here’s where the market rationale gets murky. Analysts point to increased demand for fossil fuel in developing economies as the primary reason behind the steady rise in oil prices. Goldman Sachs’ most recent forecast for Brent crude oil, one of the benchmark light sweet crudes, is $120 a barrel for 2012, with most market analysts following suit. A weak dollar, the ongoing crisis and uncertainty in the Eurozone, a burgeoning conflict between the U.S. and Iran, and continued growth in China, India and Brazil are the oft-given reasons behind these prognostications.

Historically, natural gas and oil prices have generally moved in tandem, and with natural gas gaining momentum as the fossil fuel of choice, it would only make sense for them to continue their mirrored trajectory. Instead, the opposite has occurred: crude oil remains stubbornly high, creeping ever higher, while natural gas remains depressed.
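To see just how wide the split is, compare the two fuels on energy content rather than in their customary trading units. (The barrel-to-Btu conversion is standard; the price levels are my own approximations of early-2012 quotes, not figures from this reporting.)

% A back-of-the-envelope energy-parity check, assuming crude oil
% near $100 per barrel and Henry Hub natural gas near $3 per
% million Btu (MMBtu). One barrel of crude holds roughly 5.8 MMBtu.
\[
\text{oil: } \frac{\$100/\text{bbl}}{5.8\ \text{MMBtu}/\text{bbl}} \approx \$17\ \text{per MMBtu}
\qquad
\text{gas: } \approx \$3\ \text{per MMBtu}
\]
% Measured by raw energy content, oil fetches nearly six times the
% price of gas, an extreme gap for two fuels whose prices have long
% moved roughly in tandem.

If the markets were pricing energy alone, a gap that wide should attract buyers to gas and narrow on its own; its persistence is exactly what demands an explanation.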

A closer look reveals that the world has record stockpiles of both fuels, and has developed incredible potential for new sources such as the Marcellus Shale play or the tar sands in Canada. Then there are the yet-to-be-developed fields in Iraq that, according to the New York Times, are “expected to ramp up oil production faster than any other country in the next 25 years, with a capacity…more than traditional leaders like Saudi Arabia.” Or, if you prefer, the real reason we went to war in Iraq.

Excess supply, new discoveries, and sluggish demand—and yet only natural gas is acting appropriately in the markets. This behavior is undeniable proof that the invisible hand of speculation is at work, which naturally raises the question of why traders would suppress the price of gas but not oil.

For this answer we must turn back the clock once again and revisit several acts of Congress over the past two decades that made it possible for banks to merge with investment banks and to trade commodities without limits and without transparency. Much of this trading is done on the Intercontinental Exchange, a trading platform founded with the backing of Morgan Stanley, Goldman Sachs and BP. When you understand that markets today are dominated by investment banks and oil companies, which are at times one and the same (Morgan Stanley’s direct holdings in oil companies, fossil fuel infrastructure and transportation companies make it one of the largest oil companies in America), it is possible to fully comprehend the psychology behind natural gas pricing. Oil companies and investment banks have the ability to move the market by forecasting prices and investing in their own products through opaque exchanges of their own making, so no matter where prices go, they make money.

Now you’re ready for the secret behind the fracking con job.

As previously mentioned, domestic natural gas is difficult to procure. The process is devastating to human health and the environment, and the effects are irreversible. To gain momentum and influence public opinion, the oil and gas companies have launched an ingenious propaganda assault on America. By touting natural gas as the clean-burning fossil fuel that is cheaper to use and helps reduce our dependence on foreign oil, the industry has nailed the PR trifecta: cheaper, cleaner and patriotic. And with an earnest pitchman like T. Boone Pickens, who wouldn’t believe it?

The problem is that none of the above is true. First, natural gas might burn cleaner than oil, but the process of extracting it is so harmful that it doesn’t matter. Second, because the same companies that control the product also control the pricing, once they sew up the drilling rights they can simply jack up the price. That leaves the final argument, the one wrapped in the American flag and served with a side of apple pie: reducing dependence on foreign oil for the sake of the union.

For the truth, let’s check in with the rest of the world to see what they say. (This was the happy accident that prompted this column.)

According to India’s leading daily business newspaper, the Business Standard, “the increasing shale gas production in the U.S. has led to a surplus, likely to increase in the coming years. The U.S. is, therefore, eyeing export to countries like China, Japan, Korea and India… In the past, the U.S. has been an importer of gas.” The article goes on to quote A. K. Balyan, chief executive officer of Petronet LNG, India’s largest liquefied natural gas importer, who states, “With an increase in U.S. gas production, the gas receiving terminals need to be converted to exporting terminals.”

Ta-dah!

The average life of a fracking site is seven years. At best. The environmental and human health catastrophe is forever. All of the current talk of job creation and reducing dependence on foreign oil is a sham. Our natural gas stockpiles are higher than ever, and demand for natural gas, by our own government’s admission, will remain basically flat through 2035. The oil and gas companies are planning to export gas from the Marcellus Shale region to the very developing economies we’re supposed to be competing against. How’s that for homeland security?

The real insult? American oil and gas companies are willing to risk the health and welfare of our own citizens by fracking on our land in order to export fuel they claim is more beneficial to the environment. Normally, our companies are busy screwing up other countries in pursuit of their natural resources for our own consumption. As if this isn’t bad enough, they are finally committing the cardinal sin of shitting where they eat.

Let’s do the right thing for once: Ban fracking now. There’s no other way.

JANUARY 11th – FINAL DAY FOR PUBLIC COMMENT ON DEC WEBSITE.

Main photo: AP photograph, April 22, 1970, the first Earth Day.
Long Island Press cover image: original art by Jon Sasala.
T. Boone Pickens: AP Photo.

This article was published in the January 5th, 2012 edition of the Long Island Press.

The Empire Strikes Back

My election plans almost went off without a hitch with my posterior comfortably settled into the perfectly formed groove in the corner of my couch. Beside me, my wife, my home phone, my BlackBerry, my laptop, my Blue Point Toasted Lager, a bowl of popcorn and a dog with a broken leg and a cast the size of his body (long story) were all neatly in their places for the evening. The only thing missing from my election night space capsule was a pair of Depends. Everything was perfect except for one detail. By the time all of my communication devices were fired up and News12 was tuned in, it was 10 minutes past 9 p.m. Ten minutes from the moment the polls closed throughout New York. Sometime during those 600 seconds I missed the gubernatorial election.

The New York Times had declared Andrew Cuomo our next Governor at 9:01 p.m. with Newsday following suit six minutes later. At 9:10 p.m. I felt like the last lonely boy invited to the dance.

My faithful reporters were on hand at candidate headquarters from Islandia to Manhattan busily reporting, tweeting, blogging and conferring with one another while I stared absently at the computer, my wife stared absently at me, and the dog stared absently at the enormous cast on his leg. By the time I recovered and touched base with the first of our reporters, Sen. Chuck Schumer was giving his victory speech and national pundits were talking about the overwhelming message delivered to Barack Obama and attempting to quantify the “Tea Party Effect.”

All of this with zero precincts reporting in from any Board of Elections in the state.

Life is moving too quickly, and frankly I’m not sure what to make of it. Earlier in the evening I was on a dinner date with my 7-year-old daughter. In between talking about school, friends and funny things her little sister says, I mentioned that it was Election Day again and Daddy would be up late talking to his friends from work. She knows Daddy likes Election Day. But when I mentioned this, a perplexed look came over her face and she asked me, quite casually, “Is it time to get rid of Barack Obama already? Hasn’t it only been like two years?”

Did I mention she’s 7?

Stunned, I sat back in my chair and stared absently at her inquiring little face, and tried to formulate a cogent response. (Little did I know my absent expression would return so frequently throughout the night.) Collecting myself, I stammered through some benign, meandering explanation of federal and state governments, election cycles and the importance of voting. Then I gave her a stern look and said emphatically, “And by the way, we don’t ‘get rid’ of our elected officials, young lady. We need to have more respect for our public servants than to talk of discarding them so callously—irrespective of your opinion of them.” One day, of course, she will question everything I ever told her after she’s dug up yellowed copies of the Long Island Press and perused my vituperative political diatribes. She has plenty of time to reach the jaded pinnacle of life her father occupies now. Until then she should breathe deeply because the air is as thin up here as my patience.

Where was I? Right, 9:30 p.m. Since the world had careened by me in the past half hour and I could only bog down our reporters with inane questions, I settled into my normal caveman routine, obsessively navigating BOE websites and watching television coverage. Since candidates were declaring victory before any votes were tabulated, I assumed the new voting machines were so stealthy they auto-tweeted the results and bypassed the media. The only thing left was to watch the flurry of victory and concession speeches, and call it a night.

And then the waiting began.

Sometime in the 11 o’clock hour, after watching the News12 anchors stumble through the broadcast—despite the valiant attempts of the field reporters, Jerry Kremer and Mike Dawidziak, to salvage it—my wife gave up and went to bed. My phone stopped ringing and e-mails ceased shortly thereafter. Even the dog limped away from me and fell asleep somewhere around midnight. By 1:30 a.m. the results were still trickling in, with some local and statewide candidates declaring victory; others would have to wait a few more hours, or even endure a lengthy recount. Either way, the early-evening mania was a distant memory by this time, and no one seemed to know why the results were taking so long.

Much of the uncertainty was put to rest today, and there were few surprises. New Yorkers thought better of Carl Paladino and otherwise returned to their pre-Obama voting habits, complete with the state Senate delegation (almost) back in Republican control. Democrats and moderate Republicans outside of New York were abused, and the House tipped dramatically to the right, while Senate Dems held on for dear life. The real story is the Tea Party newbies and whether anger-fueled rhetoric will convert to policy and reform, or wind up in gridlock and rancor. My guess is the latter, because Washington, D.C., is about to be overtaken by too many rookie politicians who are probably mouthing Robert Redford’s immortal words from The Candidate: “What do we do now?”

Taking Climate's Temperature

This Guy's Really Smart and Really Old

This weekend’s NY Times Magazine cover story is about the oft-quoted Freeman Dyson, a scientist who has come to embody the anti-climate-change argument. The problem he presents is that he has typically been regarded as a leftist politically. Thus he’s quite the conundrum for global warming theorists, because he feels the issue has been blown out of proportion; a dramatic departure from what one would assume his political and emotional leanings to be. At 85, after a brilliant career in the sciences, Dyson is in danger of being painted as a single-issue mad scientist because of his views and the attention the global warming movement has garnered. He also seems old enough and personally comfortable enough not to really give a shit. It looks rather freeing, quite honestly.

(He’s the opposite of my recollection of meeting Dr. Atkins at a dinner party just a couple of years before he passed away. When introduced to him I made the immature mistake of saying something like, “Oh, the man behind the diet!” I can tell you that this is not how he wanted to be remembered. But I digress…)

I happen to believe that humans are having a significant impact on climate. But I wouldn’t want to debate Dyson. I’m not equipped for that scientific conversation, but I do have eyes, senses and a memory. I can see that my immediate world looks different than I remember it as a kid: trees struggling to determine when to bud, geese hanging around for prolonged winter stretches, fucked-up storm patterns and fewer snow days. Things are, well, different. It’s difficult to determine who is in the right scientifically, but frankly I think the argument itself is a waste of time. For some reason the very topic of global warming sparks debate and polarizes an otherwise important discussion.

The biggest boost to green living and the climate change movement would be to stop focusing on the scientific debate: to steer the conversation away from theories of rising oceans and dying polar bears and toward the tangible aspects of the problem. Asthma, increased cancer rates and the deteriorating health of our children can be linked to poor air quality from dirty manufacturing processes, a poisonous food supply, backwards farming practices, the disappearance of important ecosystems, and more. These are largely the same agents that drive global warming. Sick children are hard to argue against. Nothing against the polar bears, but my marketing instincts tell me that Rush Limbaugh, Al Gore, Kim Jong Il and Freeman Dyson would all agree that we have compromised our children’s ability to live natural and healthy lives.

Sometimes it’s not what you say. It’s how you say it.