
What Happened to the American Declaration of War?

Another outstanding article from George Friedman which touches on some of the deeper aspects of American governance with regard to the current engagement in Libya.

By George Friedman

In my book “The Next Decade,” I spend a good deal of time considering the relation of the American Empire to the American Republic and the threat the empire poses to the republic. If there is a single point where these matters converge, it is in the constitutional requirement that Congress approve wars through a declaration of war and in the abandonment of this requirement since World War II. This is the point where the burdens and interests of the United States as a global empire collide with the principles and rights of the United States as a republic.

World War II was the last war the United States fought with a formal declaration of war. The wars fought since have had congressional approval, both in the sense that resolutions were passed and that Congress appropriated funds, but the Constitution is explicit in requiring a formal declaration. It does so for two reasons, I think. The first is to prevent the president from taking the country to war without the consent of the governed, as represented by Congress. Second, by providing for a specific path to war, it provides the president power and legitimacy he would not have without that declaration; it both restrains the president and empowers him. Not only does it make his position as commander in chief unassailable by authorizing military action, it creates shared responsibility for war. A declaration of war informs the public of the burdens they will have to bear by leaving no doubt that Congress has decided on a new order — war — with how each member of Congress voted made known to the public.

Almost all Americans have heard Franklin Roosevelt’s speech to Congress on Dec. 8, 1941: “Yesterday, Dec. 7, 1941 — a date which will live in infamy — the United States of America was suddenly and deliberately attacked by naval and air forces of the Empire of Japan … I ask that the Congress declare that since the unprovoked and dastardly attack by Japan on Sunday, Dec. 7, a state of war has existed between the United States and the Japanese Empire.”

It was a moment of majesty and sobriety, and with Congress’ affirmation, represented the unquestioned will of the republic. There was no going back, and there was no question that the burden would be borne. True, the Japanese had attacked the United States, making getting the declaration easier. But that’s what the founders intended: Going to war should be difficult; once at war, the commander in chief’s authority should be unquestionable.

Forgoing the Declaration

It is odd, therefore, that presidents who need that authorization badly should forgo pursuing it. Not doing so has led to seriously failed presidencies: Harry Truman in Korea, unable to seek another term; Lyndon Johnson in Vietnam, also unable to seek a new term; George W. Bush in Afghanistan and Iraq, completing his terms but enormously unpopular. There was more to this than undeclared wars, but that the legitimacy of each war was questioned and became a contentious political issue certainly is rooted in the failure to follow constitutional pathways.

In understanding how war and constitutional norms became separated, we must begin with the first major undeclared war in American history (the Civil War was not a foreign war), Korea. When North Korea invaded South Korea, Truman took recourse to the new U.N. Security Council. He wanted international sanction for the war and was able to get it because the Soviet representatives happened to be boycotting the Security Council over other issues at the time.

Truman’s view was that U.N. sanction for the war superseded the requirement for a declaration of war in two ways. First, it was not a war in the strict sense, he argued, but a “police action” under the U.N. Charter. Second, the U.N. Charter constituted a treaty, therefore implicitly binding the United States to go to war if the United Nations so ordered. Whether Congress’ authorization to join the United Nations obligated the United States to wage war at U.N. behest, obviating the need for declarations of war because Congress had already authorized police actions, is an interesting question. Whatever the answer, Truman set a precedent that wars could be waged without congressional declarations of war and that other actions — from treaties to resolutions to budgetary authorizations — mooted declarations of war.

If this was the founding precedent, the deepest argument for the irrelevancy of the declaration of war is to be found in nuclear weapons. Starting in the 1950s, paralleling the Korean War, was the increasing risk of nuclear war. It was understood that if nuclear war occurred, either through an attack by the Soviets or a first strike by the United States, time and secrecy made a prior declaration of war by Congress impossible. In the expected scenario of a Soviet first strike, there would be only minutes for the president to authorize counterstrikes and no time for constitutional niceties. In that sense, it was argued fairly persuasively that the Constitution had become irrelevant to the military realities facing the republic.

Nuclear war was seen as the most realistic war-fighting scenario, with all other forms of war trivial in comparison. Just as nuclear weapons came to be called “strategic weapons” with other weapons of war occupying a lesser space, nuclear war became identical with war in general. If that was so, then constitutional procedures that could not be applied to nuclear war were simply no longer relevant.

Paradoxically, while nuclear warfare represented the highest level of warfare, covert operations developed at the lowest level. Apart from the nuclear confrontation with the Soviets, there was an intense covert war, from back alleys in Europe to the Congo, Indochina to Latin America. Indeed, it was waged everywhere precisely because the threat of nuclear war was so terrible: Covert warfare became a prudent alternative. All of these operations had to be deniable. An attempt to assassinate a Soviet agent or raise a secret army to face a Soviet secret army could not be validated with a declaration of war. The Cold War was a series of interconnected but discrete operations, fought with secret forces whose very principle was deniability. How could declarations of war be expected in operations so small in size that they had to be kept secret from Congress anyway?

There was then the need to support allies, particularly in sending advisers to train their armies. These advisers were not there to engage in combat but to advise those who did. In many cases, this became an artificial distinction: The advisers accompanied their students on missions, and some died. But this was not war in any conventional sense of the term. And therefore, the declaration of war didn’t apply.

By the time Vietnam came up, the transition from military assistance to advisers to advisers in combat to U.S. forces at war was so subtle that there was no single moment you could point to and say that the country was now in a state of war where previously it wasn’t. Rather than ask for a declaration of war, Johnson used an incident in the Tonkin Gulf to get a congressional resolution that he interpreted as being the equivalent of war. The problem here was that it was not clear that had he asked for a formal declaration of war he would have gotten one. Johnson didn’t take that chance.

What Johnson did was use Cold War precedents, from the Korean War, to nuclear warfare, to covert operations to the subtle distinctions of contemporary warfare in order to wage a substantial and extended war based on the Tonkin Gulf resolution — which Congress clearly didn’t see as a declaration of war — instead of asking for a formal declaration. And this represented the breakpoint. In Vietnam, the issue was not some legal or practical justification for not asking for a declaration. Rather, it was a political consideration.

Johnson did not know whether he could get a declaration; the public might not have been prepared to go to war. For this reason, rather than ask for a declaration, he used all the prior precedents to simply go to war without one. In my view, that was the moment the declaration of war as a constitutional imperative collapsed. And in my view, so did the Johnson presidency. In hindsight, he needed a declaration badly, and if he could not get it, Vietnam would have been lost, and so might his presidency have been. Since Vietnam was lost anyway for lack of public consensus, his decision was a mistake. But it set the stage for everything that came after — war by resolution rather than by formal constitutional process.

After the war, Congress created the War Powers Act in recognition that wars might commence before congressional approval could be given. However, rather than returning to the constitutional method of the declaration of war, which can be issued after the commencement of hostilities if necessary (consider World War II), Congress chose to bypass declarations of war in favor of resolutions allowing wars. Their reason was the same as the president’s: It was politically safer to authorize a war already under way than to invoke declarations of war.

All of this arose within the assertion that the president’s powers as commander in chief authorized him to engage in warfare without a congressional declaration of war, an idea that came in full force in the context of nuclear war and then was extended to the broader idea that all wars were at the discretion of the president. From my simple reading, the Constitution is fairly clear on the subject: Congress is given the power to declare war. At that moment, the president as commander in chief is free to prosecute the war as he thinks best. But constitutional law and the language of the Constitution seem to have diverged. It is a complex field of study, obviously.

An Increasing Tempo of Operations

All of this came just before the United States emerged as the world’s single global power — a global empire — that by definition would be waging war at an increased tempo, from Kuwait, to Haiti, to Kosovo, to Afghanistan, to Iraq, and so on in an ever-increasing number of operations. And now in Libya, we have reached the point that even resolutions are no longer needed.

It is said that there is no precedent for fighting al Qaeda, for example, because it is not a nation but a subnational group. Therefore, Bush could not reasonably have been expected to ask for a declaration of war. But there is precedent: Thomas Jefferson asked for and received a declaration of war against the Barbary pirates. This authorized Jefferson to wage war against a subnational group of pirates as if they were a nation.

Had Bush requested a declaration of war on al Qaeda on Sept. 12, 2001, I suspect it would have been granted overwhelmingly, and the public would have understood that the United States was now at war for as long as the president thought wise. The president would have been free to carry out operations as he saw fit. Roosevelt did not have to ask for special permission to invade Guadalcanal, send troops to India, or invade North Africa. In the course of fighting Japan, Germany and Italy, it was understood that he was free to wage war as he thought fit. In the same sense, a declaration of war on Sept. 12 would have freed Bush to fight al Qaeda wherever they were or to move to block them wherever he saw fit.

Leaving aside the military wisdom of Afghanistan or Iraq, the legal and moral foundations would have been clear — so long as the president as commander in chief saw an action as needed to defeat al Qaeda, it could be taken. Similarly, as commander in chief, Roosevelt usurped constitutional rights for citizens in many ways, from censorship to internment camps for Japanese-Americans. Prisoners of war not adhering to the Geneva Conventions were shot after a military tribunal, or without one. In a state of war, different laws and expectations exist than during peace. Many of the arguments against Bush-era intrusions on privacy also could have been made against Roosevelt. But Roosevelt had a declaration of war and full authority as commander in chief during war. Bush did not. He worked in a twilight between war and peace.

One of the dilemmas that could have been avoided was the massive confusion of whether the United States was engaged in hunting down a criminal conspiracy or waging war on a foreign enemy. If the former, then the goal is to punish the guilty. If the latter, then the goal is to destroy the enemy. Imagine that after Pearl Harbor, FDR had promised to hunt down every pilot who attacked Pearl Harbor and bring them to justice, rather than calling for a declaration of war against a hostile nation and all who bore arms on its behalf regardless of what they had done. The goal in war is to prevent the other side from acting, not to punish the actors.

The Importance of the Declaration

A declaration of war, I am arguing, is an essential aspect of war fighting particularly for the republic when engaged in frequent wars. It achieves a number of things. First, it holds both Congress and the president equally responsible for the decision, and does so unambiguously. Second, it affirms to the people that their lives have now changed and that they will be bearing burdens. Third, it gives the president the political and moral authority he needs to wage war on their behalf and forces everyone to share in the moral responsibility of war. And finally, by submitting it to a political process, many wars might be avoided. When we look at some of our wars after World War II it is not clear they had to be fought in the national interest, nor is it clear that the presidents would not have been better remembered if they had been restrained. A declaration of war both frees and restrains the president, as it was meant to do.

I began by talking about the American empire. I won’t make the argument on that here, but simply assert it. What is most important is that the republic not be overwhelmed in the course of pursuing imperial goals. The declaration of war is precisely the point at which imperial interests can overwhelm republican prerogatives.

There are enormous complexities here. Nuclear war has not been abolished. The United States has treaty obligations to the United Nations and other countries. Covert operations are essential, as is military assistance, both of which can lead to war. I am not arguing that constant accommodations to reality do not have to be made. I am arguing that the suspension of Section 8 of Article I, as if it were possible to amend the Constitution with a wink and a nod, represents a mortal threat to the republic. If this can be done, what can’t be done?

My readers will know that I am far from squeamish about war. I have questions about Libya, for example, but I am open to the idea that it is a low-cost, politically appropriate measure. But I am not open to the possibility that quickly after the commencement of hostilities the president need not receive authority to wage war from Congress. And I am arguing that neither the Congress nor the president has the authority to substitute resolutions for declarations of war. Nor should either want to. Politically, this has too often led to disaster for presidents. Morally, committing the lives of citizens to waging war requires meticulous attention to the law and proprieties.

As our international power and interests surge, it would seem reasonable that our commitment to republican principles would surge. These commitments appear inconvenient. They are meant to be. War is a serious matter, and presidents and particularly Congresses should be inconvenienced on the road to war. Members of Congress should not be able to hide behind ambiguous resolutions only to turn on the president during difficult times, claiming that they did not mean what they voted for. A vote on a declaration of war ends that. It also prevents a president from acting as king by default. Above all, it prevents the public from pretending to be victims when their leaders take them to war. The possibility of war will concentrate the mind of a distracted public like nothing else. It turns voting into a life-or-death matter, a tonic for our adolescent body politic.

This report is republished with permission of STRATFOR.

Cloning: Not a viable business model

There has been a bit of talk over the last few decades about cloning technologies and the idea that we are technically capable of human cloning at the present time. One way of generating public interest in the mass media when there isn’t much to talk about is to resort to the scary-technology-future schtick. While the achievement of human cloning would be noteworthy from a technical standpoint, visions of a utopian/nightmare scenario in which vast numbers of human clones are produced to further a societal, military or economic end are simply not based in reality.

Humans have evolved the unique ability to adapt our environments to ourselves, the opposite of what other organisms do: adapt themselves to their environments. That capability is built on the back of a number of significant human traits. To name a few: opposable thumbs, high intelligence, conscious imagination, multiple-layered memory, the ability to codify and classify our imaginings, complex inter-organism communications, high-order emotional responses and memory, and the critical ability to self-organize into super-organisms. It reads a bit like a product feature list, or more interestingly, a list of Unix-style package dependencies. There is no single trait which can grant the ability to do what humans spend most of their time doing, and there is no magic formula which can accurately model and explain human behavior.

The evolutionary pressures necessary to produce humanity in its present form are varied, complex and largely unknowable at the present time. That humans have ultimately come out of the process is nothing short of miraculous — at least by our present understanding. (On the other hand, strict observation of the anthropic principle forces us to abandon the notion that what has happened on Earth could not have happened elsewhere — and carrying this to a logical conclusion, if the universe is in fact infinite (or, stated another way, if the multiverse is infinitely multifaceted), then it must have occurred somewhere else any number of times. Whether the universe/multiverse/innerverse/whatever-verse is infinite is, of course, a subject of debate.)

Cloning, in essence, locks in whatever changes have occurred in the target organism indefinitely. This sets the cloned product outside of the world of evolutionary pressure and places it directly into the world of pure economic product — which is subject to the forces of supply and demand. At the present time people enjoy reading emotionally charged imaginings about mass clone scenarios, and yet the same people enjoy reading emotionally charged imaginings about the supposed overpopulation of the Earth — in both cases produced and marketed by the same media organizations (whose business is marketing their product, not understanding applied technology).

If the world is overpopulated then we have no need for clones, because the expense of cloning will not provide a benefit any greater than that of recruiting existing humans who were produced at no burden to whoever the employer is in the scenario. Leaving the burden of (re)production, rearing, education, etc. on a family model (be it nuclear, polygamist, polyamorous, broken home, hooker bastard spawn, whatever) provides available humans at an enormous discount compared to any commercial cloning operation and is therefore the correct market option. This leaves the only commercially viable cloning options as niche in nature at best. Rich men who really want to buy exactly 5 copies of their favorite shower girl may provide a tiny market of this nature, but there is no guarantee that all five clones will agree with whatever the job at hand winds up being, that the purchaser will be alive and remain interested in the project long enough to see it come to fruition (over a decade), or that the nature of the market will not change enormously before completion. (The ready availability of multiple-birth natural clones (twins, triplets, etc.) has not produced a similar market in any case outside of a very small niche in adult services, and that market already appears to be saturated. It turns out that variety tends to be the greatest male aphrodisiac anyway.)

So this leaves what? Very little market for one of the few proposed uses of clones.

The military has no use for clones beyond what it already gains from mass screenings of naturally evolved humans, who do not come with the large overhead of a human cloning program attached. The idea that the military wants identical soldiers is flawed to begin with, however. The U.S. Army has a deep recognition of the benefits of having a hugely diverse fighting force and would not want to sacrifice those advantages in exchange for another budgetary drain and the institutional burden of becoming Dad for a large number of clones — who may decide that they have better things to do than serve Washington once they have all the big guns anyway. War is a highly emotional experience, and the support soldiers provide one another and the culture that has evolved within the military because of this are almost as complex to understand as the phenomenon of humanity itself. Trying to replicate or replace such a complex system that already exists, works well and is free with one which does not yet exist and might fail at enormous cost would be a very difficult thing to pitch to taxpayers.

Once again, this leaves very little potential market where the imagination has a fun time seeing one.

The only viable cloning market for the foreseeable future would be in organ production and harvesting. There are a few reasons why human clones will never be viable products in this market either, however. Once again, the expense and time required to produce a clone already equal what it took to produce the human who needs the bio replacement in the first place, the primary difference between the clone and the natural human being that the existing human is already rich and well enfranchised enough to be in a position to order a clone from which to harvest his needed spare parts (and the clone, obviously, is not). This conjures up images of a really fun movie from a few years ago, “The Island”, which told the story of two clones produced for the purpose of organ replacement suddenly realizing what they are and deciding that such a short future wasn’t really for them. But that is the movies. Back in the world of reality we already have the technology to clone human organs, and these organ clones do not require fully human hosts. It is possible to grow a human ear on the back of a lab rat, a human heart inside of a pig, and likely other parts on other hosts which are faster and far cheaper to maintain and harvest than human clones would be.

Once again, no market here, either.

Medical testing is another area where I’ve heard talk of mass human cloning. Perfect test subjects, so some claim. But these are only perfect test subjects on the surface. Identical people are not perfect test subjects in the slightest when it comes to medical testing. The most important aspect of medical testing (drug, allergy, ergonomics, athletic tolerance, etc.) is the statistical significance of the test group. The word “group” here is everything. Testing clones would merely provide the same person for testing a number of times, which amounts to just testing the same person ad nauseam at enormous expense for no gain. Humanity is a varied and evolving thing, and medical advancements must take that into account or else those advancements themselves become useless and thereby unmarketable.
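To make the statistical point concrete, here is a minimal toy simulation (the numbers, names and the simple noise model are invented purely for illustration, not taken from any real trial). It shows that a trial run on clones only ever re-measures the donor’s personal response, so its average never converges on the population average the trial is supposed to estimate.

```python
import random

random.seed(42)

# Hypothetical toy model: each person has an individual drug response drawn
# from a population with mean 10.0 and substantial person-to-person spread.
POP_MEAN, POP_SD = 10.0, 4.0   # between-person variation (made up for illustration)
NOISE_SD = 1.0                 # measurement noise within a single person
N = 100                        # subjects per trial

def measure(person_effect):
    """One measured response: the person's true effect plus measurement noise."""
    return random.gauss(person_effect, NOISE_SD)

# Trial A: a varied group of 100 different people.
varied = [measure(random.gauss(POP_MEAN, POP_SD)) for _ in range(N)]

# Trial B: 100 clones of a single donor -- one true effect, measured 100 times.
donor = random.gauss(POP_MEAN, POP_SD)
clones = [measure(donor) for _ in range(N)]

mean = lambda xs: sum(xs) / len(xs)
print(f"population mean to estimate: {POP_MEAN:.2f}")
print(f"varied-group estimate:       {mean(varied):.2f}")
print(f"clone-group estimate:        {mean(clones):.2f} (the donor's own response: {donor:.2f})")
```

However many clones are added, the clone-group estimate only becomes more precise about the donor, not about humanity — which is exactly the “testing the same person ad nauseam” problem described above.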

Sorry sci-fi fans, no market here, either.

For the same reasons that medical testing on clones is useless, so is an entire society created from clones. A clone society is instantly susceptible to lethal mass epidemics from every vector. It is very likely that a flu that kills one person would kill them all, whereas natural humanity, taken as a whole, tends to be largely resistant to every pathogen in nature (and even engineered ones). Though humans may suffer to varying degrees independently of one another due to individual variations, those individual variations, when combined and spread across the masses of humanity, provide an overwhelmingly powerful insurance against the mass extinction of humanity. A cloned society removes this ultimate protection at its root and leaves the population totally naked as a whole. Contemplating these realities means contemplating one’s own mortality and relative insignificance, and I imagine that is a likely reason why people don’t think about such things when they see scary stories on TV or the internet about future dystopic scenarios of a planned Earth-wide all-clone society (a la some Illuminati conspiracy variants).
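The fragility argument can be sketched with an equally crude toy model (the population size, the number of immune profiles and the one-profile-per-strain assumption are all invented for illustration; this is not epidemiology): in a diverse population, the worst single strain only reaches the people who happen to share one immune profile, while in a clone population that same worst case is everyone.

```python
import random
from collections import Counter

random.seed(7)

POP_SIZE = 100_000
PROFILES = 50   # hypothetical number of distinct immune profiles in a natural population

# Diverse society: immune profiles scattered across many genotypes.
diverse = [random.randrange(PROFILES) for _ in range(POP_SIZE)]
# Clone society: everyone shares a single profile.
clones = [0] * POP_SIZE

def worst_case_deaths(population):
    """Deaths from the single worst strain: one that defeats the most common profile."""
    _profile, carriers = Counter(population).most_common(1)[0]
    return carriers

print(f"diverse society: worst single strain kills {worst_case_deaths(diverse):,} of {POP_SIZE:,}")
print(f"clone society:   worst single strain kills {worst_case_deaths(clones):,} of {POP_SIZE:,}")
```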

So all-clone society? Just not workable. Not just economically unviable, a downright unsafe way to try to manage humanity.

So why all the fuss? Mainly because there is a big market in generating public drama around new technologies which the general public does not yet fully understand or have any practical contact with. The technologies required to achieve a human clone are significant, but they will manifest themselves in the market (and already do) in very different ways than current popular media proposes.