Wednesday, July 31, 2013

When You Need It NOW! (You Shoulda Got It THEN!)

A huge portion of the emails I get at Put This On are about men who NEED IT NOW! They’ve just been invited to a black tie gala, they’re headed to a summer wedding this weekend, they have a state funeral to attend, they finally got a job interview with the firm they’ve been targeting. So they want to know: how can they save money and buy something great today?

The truth is: it’s impossible. You can go to Barneys or Nordstrom or Brooks Brothers, beg for on-the-spot alterations, and walk out with something that works, but let me assure you: you will pay full price. And I’ll add that if you don’t live within easy access of those stores, you may well be plumb out of luck.

So the solution is pretty simple: be prepared. Not for every eventuality, but for the few that you’re almost certain to encounter.

If you have black dress shoes, a solid gray suit, a white shirt and both a navy and black tie, you’re all set for almost any occasion: a wedding, a funeral, a job interview.

These should be conservative, and fit. You can thrift them, eBay them, buy them on sale or buy them at full price. But if you’re a grown man, you will need these things. Often on short notice.

If your lifestyle means black tie is a regular occurrence - say once a year or more - then a black tie rig is worth owning as well. Give yourself the time you need to find exactly what you want at the price you want to pay, but do it now, not later.

Great-Uncles don’t die on your schedule, and once-in-a-lifetime job interviews don’t happen right when you expect them. So be prepared.


The Most Lutheran Man in the World

Sin boldly, my friends...

Not the Most Lutheran Man in the World but a Reasonable Facsimile

He felt the spirit move him once.
His confessions make Satan cry.
His baptismal certificate reads “Dinkelaker is not adiaphora.”
He once punched a guppy for being a pietist.
The Old Adam sued him for entrapment.
He sued Original Sin for copyright infringement.
Twitter allows him 1517 characters.
Google Maps includes where he stands.
He’s…the Most Lutheran Man in the World.
“Sin boldly, my friends…”


3,000-Year-Old Text Sheds Light on Biblical History

Archaeologist Eilat Mazar shows off her 3,000-year-old Biblical find. (Key to David's City/Youtube)

A few characters on the side of a 3,000-year-old earthenware jug dating back to the time of King David have stumped archaeologists until now -- and a fresh translation may have profound ramifications for our understanding of the Bible.

Experts had suspected the fragmentary inscription was written in the language of the Canaanites, a biblical people who lived in present-day Israel. Not so, says one expert who claims to have cracked the code: The mysterious language is actually the oldest form of written Hebrew, placing the ancient Israelites in Jerusalem earlier than previously believed.

"Hebrew speakers were controlling Jerusalem in the 10th century, which biblical chronology points to as the time of David and Solomon," said Douglas Petrovich, an expert in ancient Near Eastern history and biblical studies.

"Whoever they were, they were writing in Hebrew like they owned the place," he said.

First discovered near the Temple Mount in Jerusalem last year, the 10th century B.C. fragment has been labeled the Ophel Inscription. It likely bears the name of the jug's owners and its contents.

If Petrovich's analysis proves true, it would be evidence of the accuracy of Old Testament tales. If Hebrew as a written language existed in the 10th century, as he says, the ancient Israelites were recording their history in real time as opposed to writing it down several hundred years later. That would make the Old Testament a historical account of real-life events.

According to Petrovich, archaeologists are unwilling to call it Hebrew to avoid conflict.

"It's just the climate among scholars that they want to attribute as little as possible to the ancient Israelites," he said.

Needless to say, his claims are stirring up controversy among those who do not like to mix the hard facts of archaeology -- dirt, stone and bone -- with stories from the Bible.

Tel Aviv University archaeologist Israel Finkelstein said that the Ophel Inscription is critical to the early history of Israel. But romantic notions of the Bible shouldn't cloud scientific methods -- a message he pushed in 2008 when a similar inscription was found at a site many now call one of King David's palaces.

At the time, he warned the Associated Press against the "revival in the belief that what's written in the Bible is accurate like a newspaper."

Today, he says the Ophel Inscription speaks to "the expansion of Jerusalem from the Temple Mount, and shows us the growth of Jerusalem and the complexity of the city during that time." But the Bible? Maybe, maybe not.

RELATED: Ancient Roman Road Found in Israel

Professor Aren Maeir of Bar Ilan University agrees that some archaeologists are simply relying too heavily on the Bible itself as a source of evidence.

"[Can we] raise arguments about the kingdom of David and Solomon? That seems to me a grandiose upgrade," he told Haaretz recently.

In the past decade, there has been a renaissance in Israel of archaeologists looking for historical evidence of biblical stories. Several excavations this year alone have claimed to prove a variety of stories from the Bible.

Most recently, a team led by archaeologist Yossi Garfinkel wrapped up a ten-year excavation of the possible palace of King David, overlooking the valley where the Hebrew king victoriously smote the giant Goliath.

Garfinkel has another explanation as to the meaning behind the Ophel Inscription.

"I think it's like a [cellphone] text," Garfinkel said. "If someone takes a text from us 3,000 years from now, he will not be able to understand it."

In his opinion, the writing on the fragmented jug is a type of shorthand used by 10th-century farmers, not an official form of communication that was passed down.

"What's more important is that there is a revolution in this type of inscription being found," Garfinkel said. Several inscriptions from the same time period have been found across Israel in the past five years.

"When we find more and more of these inscriptions, maybe not until the next generation, we may have a breakthrough," he said.

Monday, July 29, 2013

A Christian Tragedy in the Muslim World

Few people realize that we are today living through the largest persecution of Christians in history, worse even than the famous attacks under ancient Roman emperors like Diocletian and Nero. Estimates of the number of Christians under assault range from 100 million to 200 million. According to one estimate, a Christian is martyred every five minutes. And most of this persecution is taking place at the hands of Muslims. Of the top fifty countries persecuting Christians, forty-two either have a Muslim majority or have sizeable Muslim populations.


The extent of this disaster, its origins, and the reasons why it has been met with a shrug by most of the Western media are the topics of Raymond Ibrahim’s Crucified Again. Ibrahim is a Shillman Fellow at the David Horowitz Freedom Center and an associate fellow of the Middle East Forum. Fluent in Arabic, he has been tracking what he calls “one of the most dramatic stories” of our time in the reports and witnesses that appear in Arabic newspapers, news shows, and websites, but that rarely get translated into English or picked up by the Western press. What he documents in this meticulously researched and clearly argued book is a human rights disaster of monumental proportions.

In Crucified Again, Ibrahim performs two invaluable functions for educating people about the new “Great Persecution,” to use the label of the Roman war against Christians. First, he documents hundreds of specific examples from across the Muslim world. By doing so, he shows the extent of the persecution, and forestalls any claims that it is a marginal problem. Additionally, Ibrahim commemorates the forgotten victims, refusing to allow their suffering to be lost because of the indifference or inattention of the media and government officials.

Second, he provides a cogent explanation for why these attacks are concentrated in Muslim nations. In doing so, he corrects the delusional wishful thinking and apologetic spin that mars much of the current discussion of Islamic-inspired violence.

Ibrahim’s copious reports of violence against Christians range across the whole Muslim world, including countries such as Indonesia, which is frequently characterized as “moderate” and “tolerant.” Such attacks are so frequent because they result not just from the jihadists that some Westerners dismiss as “extremists,” but from mobs of ordinary people, and from government policy and laws that discriminate against Christians. Rather than ad hoc reactions to local grievances, then, these attacks reveal a consistent ideology of hatred and contempt that transcends national, geographical, and ethnic differences.

In Afghanistan, for example, where American blood and treasure liberated Afghans from murderous fanatics, a court order in March 2010 led to the destruction of the last Christian church in that country. In Iraq, also free because of America’s sacrifice, half of the Christians have fled; in 2010, Our Lady of Salvation Church in Baghdad was bombed during mass, with fifty-eight killed and hundreds wounded.

In Kuwait, likewise, the beneficiary of American power, the Kuwait City Municipal Council rejected a permit for building a Greek Catholic church. A few years later, a member of parliament said he would submit a law to prohibit all church construction. A delegation of Kuwaitis was then sent to Saudi Arabia––which legally prohibits any Christian worship–– to consult with the Grand Mufti, the highest authority on Islamic law in the birthplace of Islam, the Arabian Peninsula.

The Mufti announced that it is “necessary to destroy all the churches of the region,” a statement ignored in the West until Ibrahim reported it. Imagine the media’s vehement outrage and condemnation if the Pope in Rome had called for the destruction of all the mosques in Italy. The absence of any Western condemnation or even reaction to the Mufti’s statement was stunning. Is there no limit to our tolerance of Islam?

Moreover, it is in Egypt––yet another beneficiary of American money and support–– that the harassment and murder of Christians are particularly intense. Partly this reflects the large number of Coptic Christians, some sixteen million descendants of the Egyptian Christians who were conquered by Arab armies in 640 A.D. Since the fall of Mubarak, numerous Coptic churches have been attacked by Muslim mobs. Most significant is the destruction of St. George’s church in Edfu in September 2011. Illustrating the continuity of mob violence with government policy, the chief of Edfu’s intelligence unit was observed directing the mob that destroyed the church. The governor who originally approved the permit to renovate the building went on television to announce that the “Copts made a mistake” in seeking to repair the church, “and had to be punished, and Muslims did nothing but set things right.”

The destruction of St. George’s precipitated a Christian protest against government-sanctioned violence against Christians and their churches in the Cairo suburb of Maspero in October 2011. As Muslim mobs attacked the demonstrators to shouts of “Allahu Akbar” and “kill the infidels,” the soldiers sent to keep order helped the attackers. Snipers fired on demonstrators, and armored vehicles ran over several. Despite the gruesome photographs showing the crushed heads of Copts, the Egyptian military denied the charges, but then claimed that Copts had hijacked the vehicles and run over their co-religionists.

False media reports of Copts murdering soldiers fed the violence. Twenty-eight Christians were killed and several hundred wounded. In the aftermath, thirty-four Copts were detained, including several who had not even been at the demonstration. Later, two Coptic priests had to stand trial. Meanwhile, despite an abundance of video evidence, the Minister of Justice closed an investigation because of a “lack of identification of the culprits.”

The scope of such persecution, the similarity of the attacks and the attackers’ motives despite national and ethnic differences, and the role of government officials in abetting them all cry out for explanation. Ibrahim clearly lays out the historical and theological roots of Muslim intolerance in the book’s most important chapter, “Lost History.” Contrary to the apologists who attribute these attacks to poverty, political oppression, the legacy of colonialism, or the unresolved Israeli-Arab conflict, Ibrahim shows that intolerance of other religions and the use of violence against them reflect traditional Islamic theology and jurisprudence.

First Ibrahim corrects a misconception of history that has abetted this misunderstanding. During the European colonial presence in the Middle East, oppression of Christians and other religious minorities was proscribed. This was also the period in which many Muslims, recognizing how much more powerful the Europeans were than they, began to emulate the political and social mores and institutions of the colonial powers.

Thus they abolished the discriminatory sharia laws that set out how “dhimmis,” the Christians and Jews living under Muslim authority, were to be treated. In 1856, for example, the Ottomans under pressure from the European powers issued a decree that said non-Muslims should be treated equally and guaranteed freedom of worship. This roughly century-long period of relative tolerance Ibrahim calls the Christian “Golden Age” in the Middle East.

Unfortunately, as Ibrahim writes, the century-long flourishing of Middle Eastern Christians “has created chronological confusions and intellectual pitfalls for Westerners” who take the “hundred-year lull in persecution” as the norm. In fact, that century was an anomaly, and after World War I, traditional Islamic attitudes and doctrines began to reassert themselves, a movement that accelerated in the 1970s. The result is the disappearance of Christianity in the land of its birth. In 1900, twenty percent of the Middle East was Christian. Today, less than two percent is.

Having corrected our distorted historical perspective, Ibrahim then lays out the justifying doctrines of Islam that have made such persecution possible during the fourteen centuries of Muslim encounters with non-Muslims. The foundations can be found in the Koran, which Muslims take to be the words of God. There “infidels” are defined as “they who say Allah is one of three” or “Allah is the Christ, [Jesus] son of Mary”––that is, explicitly Christian. As such, according to the Koran, they must be eliminated or subjugated. The most significant verse that guides Muslim treatment of Christians and Jews commands Muslims to wage war against infidels until they are conquered, pay tribute, and acknowledge their humiliation and submission.

In the seventh century, the second Caliph, Omar bin al-Khattab, promulgated the “Conditions of Omar” that specified in more detail how Christians should be treated. These conditions proscribe building churches or repairing existing ones, performing religious processions in public, exhibiting crosses, praying near Muslims, proselytizing, and preventing conversion to Islam, in addition to rules governing how Christians dress, comport themselves, and treat Muslims.

“If they refuse this,” Omar said, “it is the sword without leniency.” These rules have consistently determined treatment of Christians for fourteen centuries, and Muslims regularly cite violations of these rules as the justifying motives for their attacks. As a Saudi Sheikh said recently in a mosque sermon, “If they [Christians] violate these conditions, they have no protection.” From Morocco to Indonesia, Christians are attacked and murdered because they allegedly have tried to renovate a church, proselytized among Muslims, or blasphemed against Mohammed––all reasons consistent with Koranic injunctions codified in laws and the curricula of school textbooks.

Both Islamic doctrine and history show the continuity of motive behind today’s persecution of Christians. As Ibrahim writes, “The same exact patterns of persecution are evident from one end of the Islamic world to the other––in lands that do not share the same language, race, or culture––that share only Islam.” But received wisdom in the West today denies this obvious truth. The reasons for this attitude of denial would fill another book. As Ibrahim points out, the corruption of history in the academy and in elementary school textbooks has replaced historical truth with various melodramas in which Western colonialists and imperialists have oppressed Muslims.

These and other prejudices have led American media outlets to ignore or distort Islamic-inspired violence, as can be seen in the coverage of the Nigerian jihadist movement Boko Haram. These jihadists have publicly announced their aim of cleansing Nigeria of Christians and establishing sharia law, yet Western media coverage consistently ignores this aim and casts the conflict as a “cycle of violence” in which both sides are equally guilty.

As Ibrahim concludes, even when Western media report on violence against Christians, “they employ an arsenal of semantic games, key phrases, convenient omissions, and moral relativism” to promote the anti-Western narrative that “Muslim violence and intolerance are products of anything and everything––poverty, political and historical grievances, or territorial disputes––except Islam.”

Within the global Muslim community, there is a civil war between those who want to adapt their faith to the modern world, and those who want to wage war in order to recreate a lost past of Muslim dominance. We do the former no favor by indulging Islam’s more unsavory aspects, since those aspects are exactly what need to be changed if Muslims want to enjoy the freedom and prosperity that come from political orders founded on human rights and inclusive tolerance. Raymond Ibrahim’s Crucified Again is an invaluable resource for telling the truth that could promote such change.

Bruce S. Thornton is a research fellow at the Hoover Institution. He received his BA in Latin in 1975 and his PhD in comparative literature–Greek, Latin, and English–in 1983, both from the University of California, Los Angeles. Thornton is currently a professor of classics and humanities at California State University in Fresno, California. He is the author of nine books and numerous essays and reviews on Greek culture and civilization and their influence on Western civilization. His latest book, published in March 2011, is titled The Wages of Appeasement: Ancient Athens, Munich, and Obama's America.

Sunday, July 28, 2013

How did America’s Police Become a Military Force on the Streets?

Editor's Note: In a remarkable speech at the National Defense University in May, President Barack Obama signaled an end to the war on terrorism; maybe not an end, it turns out, but a winding down of the costly deployments, the wholesale use of drone warfare, and even the very rhetoric of war.
Are cops constitutional?
In a 2001 article for the Seton Hall Constitutional Law Journal, the legal scholar and civil liberties activist Roger Roots posed just that question. Roots, a fairly radical libertarian, believes that the U.S. Constitution doesn’t allow for police as they exist today. At the very least, he argues, police departments’ powers and practices today violate the document’s spirit and intent. “Under the criminal justice model known to the framers, professional police officers were unknown,” Roots writes.
Civil liberties activists say our nation's police forces have become too militaristic—like this SWAT team participating in a drill in October—and are deployed even in nonviolent situations. Photo by AP/Elaine Thompson.
The founders and their contemporaries would probably have seen even the early-19th-century police forces as a standing army, and a particularly odious one at that. Just before the American Revolution, it wasn’t the stationing of British troops in the colonies that irked patriots in Boston and Virginia; it was England’s decision to use the troops for everyday law enforcement. This wariness of standing armies was born of experience and a study of history—early American statesmen like Madison, Washington and Adams were well-versed in the history of such armies in Europe, especially in ancient Rome.
If even the earliest attempts at centralized police forces would have alarmed the founders, today’s policing would have terrified them. Today in America SWAT teams violently smash into private homes more than 100 times per day. The vast majority of these raids are to enforce laws against consensual crimes. In many cities, police departments have given up the traditional blue uniforms for “battle dress uniforms” modeled after soldier attire.
Police departments across the country now sport armored personnel carriers designed for use on a battlefield. Some have helicopters, tanks and Humvees. They carry military-grade weapons. Most of this equipment comes from the military itself. Many SWAT teams today are trained by current and former personnel from special forces units like the Navy SEALs or Army Rangers. National Guard helicopters now routinely swoop through rural areas in search of pot plants and, when they find something, send gun-toting troops dressed for battle rappelling down to chop and confiscate the contraband. But it isn’t just drugs. Aggressive, SWAT-style tactics are now used to raid neighborhood poker games, doctors’ offices, bars and restaurants, and head shops—despite the fact that the targets of these raids pose little threat to anyone. This sort of force was once reserved as the last option to defuse a dangerous situation. It’s increasingly used as the first option to apprehend people who aren’t dangerous at all.


The Third Amendment reads, in full: “No soldier shall, in time of peace, be quartered in any house without the consent of the owner, nor in time of war, but in a manner to be prescribed by law.”
You might call it the runt piglet of the Bill of Rights amendments—short, overlooked, sometimes the butt of jokes. The Supreme Court has yet to hear a case that turns on the Third Amendment, and only one such case has reached a federal appeals court. There have been a few periods in American history when the government probably violated the amendment [the War of 1812, the Civil War and on the Aleutian Islands during World War II], but those incursions into quartering didn’t produce any significant court challenges. Not surprisingly, then, Third Amendment scholarship is a thin field, comprising just a handful of law review articles, most of which either look at the amendment’s history or pontificate on its obsolescence.
Given the apparent irrelevance of the amendment today, we might ask why the framers found it so important in the first place. One answer [lies in] the “castle doctrine.” If you revere the principle that a man’s home is his castle, it hardly seems just to force him to share a portion of it with soldiers—particularly when the country isn’t even at war. But the historical context behind the Third Amendment shows that the framers were worried about something more profound than fat soldier hands stripping the country’s larders.
At the time the Third Amendment was ratified, the images and memories of British troops in Boston and other cities were still fresh, and the clashes with colonists that drew the country into war still evoked strong emotions. What we might call the “symbolic Third Amendment” wasn’t just a prohibition on peacetime quartering, but a more robust expression of the threat that standing armies pose to free societies. It represented a long-standing, deeply ingrained resistance to armies patrolling American streets and policing American communities.
And, in that sense, the spirit of the Third Amendment is anything but anachronistic.
As with the castle doctrine, colonial America inherited its aversion to quartering from England. And as with the castle doctrine, England wasn’t nearly as respectful of the principle in the colonies as it was at home. The first significant escalation of the issue came in the 1750s, when the British sent over thousands of troops to fight the Seven Years’ War (known in the United States as the French and Indian War). In the face of increasing complaints from the colonies about the soldiers stationed in their towns, Parliament responded with more provocation. The Quartering Act of 1765 required the colonists to house, feed and supply British soldiers (albeit in public facilities). Parliament also helpfully provided a funding mechanism with the hated Stamp Act.
Protests erupted throughout the colonies, [and] some spilled over into violence, most notably the Boston Massacre in 1770. England only further angered the colonists by responding with even more restrictions on trade and imports. Parliament then passed a second Quartering Act in 1774, this time specifically authorizing British generals to put soldiers in colonists’ homes. The law was aimed squarely at correcting the colonies’ insubordination. England then sent troops to emphasize the point.
Using general warrants, British soldiers were allowed to enter private homes, confiscate what they found, and often keep the bounty for themselves. The policy was reminiscent of today’s civil asset forfeiture laws, which allow police to seize and keep for their departments cash, cars, luxury goods and even homes, often under only the thinnest allegation of criminality.


After the American Revolution, the leaders of the new American republic had some difficult decisions to make. They debated whether the abuses that British soldiers had visited upon colonial America were attributable to quartering alone or to the general aura of militarism that came with maintaining standing armies in peacetime—and whether restricting, prohibiting or providing checks on either practice would prevent the abuses they feared.
Antifederalists like George Mason, Patrick Henry, Sam Adams and Elbridge Gerry opposed any sort of national army. They believed that voluntary, civilian militias should handle issues of national security. To a degree, the federalists were sympathetic to this idea. John Adams, Thomas Jefferson and James Madison had all written on the threat to liberty posed by a permanent army. But the federalists still believed that the federal government needed the power to raise an army.
In the end, the federalists won the argument. There would be a standing army. But protection from its potential threats would come in an amendment contained in the Bill of Rights that created an individual right against quartering in peacetime. Even during wartime, quartering would need to be approved by the legislature, the branch more answerable to the people than the executive.
Taken together, the Second, Third and Tenth amendments indicate the founders’ desire for the power to enforce laws and maintain order to be primarily left with the states. As a whole, the Constitution embodies the rough consensus at the time that there would be occasions when federal force might be necessary to carry out federal law and dispel violence or disorder that threatened the stability of the republic, but that such endeavors were to be undertaken cautiously, and only as a last resort.
More important, the often volatile debate between the federalists and the antifederalists shows that the Third Amendment itself represented much more than the sum of its words. The amendment was in some ways a compromise, but it reflects the broader sentiment—shared by both sides—about militarism in a free society. Ultimately, the founders decided that a standing army was a necessary evil, but that the role of soldiers would be only to dispel foreign threats, not to enforce laws against American citizens.


Before the Bill of Rights could even be ratified, however, a rebellion led by a bitter veteran tested those principles. Daniel Shays was part of the Massachusetts militia during the Revolutionary War. He was wounded in action and received a decorative sword from the French general the Marquis de Lafayette in recognition of his service.
After the war ended, Shays returned to his farm in Massachusetts. It wasn’t long before he began receiving court summonses to account for the debts he had accumulated while he was off fighting the British. Shays went broke. He even sold the sword from Lafayette to help pay his debts. Other veterans were going through the same thing.
The debt collectors weren’t exactly villains either. Businesses too had taken on debt to support the war. They set about collecting those debts to avoid going under. Shays and other veterans attempted to get relief from the state legislature in the form of debtor protection laws or the printing of more money, but the legislature balked.
In the fall of 1786, Shays assembled a group of 800 veterans and supporters to march on Boston. The movement subsequently succeeded in shutting down some courtrooms, and some began to fear that it threatened to erupt into a full-scale rebellion.
In January 1787, Massachusetts Gov. James Bowdoin asked the Continental Congress to raise troops to help put down the rebels, but under the Articles of Confederation the federal government didn’t have the power. So Bowdoin instead assembled a small army of mercenaries paid for by the same creditors who were hounding men like Shays. After a series of skirmishes, the rebellion had been broken by the following summer.
Shays’ Rebellion was never a serious threat to overthrow the Massachusetts government—much less that of the United States—and it was put down relatively quickly, without the use of federal troops and with little loss of life beyond the rebels themselves. But its success in temporarily shutting down courthouses in Boston convinced many political leaders in early America that a stronger federal government was needed. Inadvertently, Shays spurred momentum for what became the 1787 Constitutional Convention in Philadelphia.
The impact of Shays’ Rebellion didn’t end, however, at Philadelphia. Memories of the rebellion and fears that something like it could destabilize the new republic blunted memories of the abuses suffered at the hands of British troops and made many in the new government more comfortable with the use of federal force to put down domestic uprisings.
In 1792, just five years after the ratification of the Bill of Rights, Congress passed the Calling Forth Act. The new law gave the president the authority to unilaterally call up and command state militias to repel insurrections, fend off attacks from hostile American Indian tribes, and address other threats that presented themselves while Congress wasn’t in session. In addition to the concerns raised by Shays’ Rebellion, growing discontent over one of the country’s first federal taxes—a tax on whiskey—was also making the law’s supporters anxious. Two years later, in 1794, President George Washington used the act to call up a militia to put down the Whiskey Rebellion in western Pennsylvania.
So ideas about law and order were already evolving. The young republic had gone from a country of rebels lashing out at the British troops in their midst to a country with a government unafraid to use its troops to put down rebellions. But American presidents had still generally adhered to the symbolic Third Amendment. For the first 50 years or so after ratification of the Constitution, military troops were rarely, if ever, used for routine law enforcement. But, over time, that would change.


The Civil War and Reconstruction rekindled historic antipathy toward the use of military troops in the streets. And four major wars during the 20th century kept militarization in its intended context—protecting Americans by fighting overseas.
But as the Vietnam War abated, policymakers turned the war footing inward, transforming law enforcement against illegal drugs into a “war.” There was nothing secretive about this transformation. President Richard Nixon declared a “war on drugs” in June 1971. But as that war has unfolded over several decades, we seem not to have noticed its implications.
On Feb. 11, 2010, in Columbia, Mo., the police department’s SWAT team served a drug warrant at the home of Jonathan Whitworth, his wife and their 7-year-old son. Police claimed that eight days earlier they had received a tip from a confidential informant that Whitworth had a large supply of marijuana in his home. They then conducted a trash pull, which turned up marijuana “residue” in the family’s garbage. That was the basis for a violent, nighttime, forced-entry raid on the couple’s home. The cops stormed in screaming, swearing and firing their weapons; and within seconds of breaking down the door they intentionally shot and killed one of the family’s dogs, a pit bull. At least one bullet ricocheted and struck the family’s pet corgi. The wounded dogs whimpered in agony. Upon learning that the police had killed one of his pets, Whitworth burst into tears.
The Columbia Police Department SWAT team recorded many of its drug raids for training purposes, including this one. After battling with the police over its release, a local newspaper was finally able to get the video through state open records laws and posted it to the Internet. It quickly went viral, climbing to over 1 million YouTube views within a week. People were outraged.
The video also made national headlines. On Fox News, Bill O’Reilly discussed it with newspaper columnist and pundit Charles Krauthammer, who assured O’Reilly’s audience that botched raids like the one in the video were unusual; he warned viewers not to judge the war on drugs based on the images coming out of Columbia. Krauthammer was wrong. This was not a “botched” raid. In fact, the only thing unusual about the raid was that it was recorded. Everything else—from the relatively little evidence to the lack of a corroborating investigation, the killing of the dog, the fact that the raid was for nothing more than pot, the police misfiring and their unawareness that a child was in the home—was fairly standard. The police raided the house they intended to raid, and they even found some pot. The problem for them was that possession of small amounts of pot in Columbia had been decriminalized. They did charge Whitworth with possession of drug paraphernalia for the pipe they found near the marijuana—a $300 fine.
Most Americans still believe we live in a free society and revere its core values. These principles are pretty well-known: freedom of speech, religion and the press; the right to a fair trial; representative democracy; equality before the law; and so on. These aren’t principles we hold sacred because they’re enshrined in the Constitution, or because they were cherished by the founders. These principles were enshrined in the Constitution and cherished by the framers precisely because they’re indispensable to a free society. How did we get here? How did we evolve from a country whose founding statesmen were adamant about the dangers of armed, standing government forces—a country that enshrined the Fourth Amendment in the Bill of Rights and revered and protected the age-old notion that the home is a place of privacy and sanctuary—to a country where it has become acceptable for armed government agents dressed in battle garb to storm private homes in the middle of the night—not to apprehend violent fugitives or thwart terrorist attacks, but to enforce laws against nonviolent, consensual activities?
How did a country pushed into a revolution by protest and political speech become one where protests are met with flash grenades, pepper spray and platoons of riot teams dressed like RoboCops? How did we go from a system in which laws were enforced by the citizens—often with noncoercive methods—to one in which order is preserved by armed government agents too often conditioned to see streets and neighborhoods as battlefields and the citizens they serve as the enemy?
Although there are plenty of anecdotes about bad cops, there are plenty of good cops. The fact is that we need cops, and there are limited situations in which we need SWAT teams. If anything, bad cops are the product of bad policy. And policy is ultimately made by politicians. A bad system loaded with bad incentives will unfailingly produce bad cops. The good ones will never enter the field in the first place, or they will become frustrated and leave police work, or they’ll simply turn bad. At best, they’ll have unrewarding, unfulfilling jobs. There are consequences to having cops who are too angry and too eager to kick down doors, and who approach their jobs with entirely the wrong mindset. But we need to keep an eye toward identifying and changing the policies that allow such people to become cops in the first place—and that allow them to flourish in police work.


Betty Taylor still remembers the night it all hit her.
As a child, Taylor had always been taught that police officers were the good guys. She learned to respect law enforcement, as she puts it, “all the time, all the way.” She went on to become a cop because she wanted to help people, and that’s what cops did. She wanted to fight sexual assault, particularly predators who take advantage of children. To go into law enforcement—to become one of the good guys—seemed like the best way to accomplish that. By the late 1990s, she’d risen to the rank of detective in the sheriff’s department of Lincoln County, Mo., a sparsely populated farming community about an hour northwest of St. Louis. She eventually started a sex crimes unit within the department. But it was a small department with a tight budget. When she couldn’t get the money she needed, Taylor gave speeches and wrote her own proposals to keep her program operating.
What troubled her was that while the sex crimes unit had to find funding on its own, the SWAT team was always flush with cash. “The SWAT team, the drug guys, they always had money,” Taylor says. “There were always state and federal grants for drug raids. There was always funding through asset forfeiture.” Taylor never quite understood that disparity. “When you think about the collateral effects of a sex crime—of how it can affect an entire family, an entire community—it just didn’t make sense. The drug users weren’t really harming anyone but themselves. Even the dealers, I found much of the time they were just people with little money, just trying to get by.”
The SWAT team eventually co-opted her as a member. As the only woman in the department, she was asked to go along on drug raids in the event there were any children inside. “The perimeter team would go in first. They’d throw all of the adults on the floor until they had secured the building. Sometimes the kids too. Then they’d put the kids in a room by themselves and the search team would go in. They’d come to me, point to where the kids were and say, ‘You deal with them.’ ” Taylor would then stay with the children until family services arrived, at which point they’d be placed with a relative.
Taylor’s moment of clarity came during a raid on an autumn evening in November 2000. Narcotics investigators had made a controlled drug buy a few hours earlier and were laying plans to raid the suspect’s home. “The drug buy was in town, not at the home,” Taylor says. “But they’d always raid the house anyway. They could never just arrest the guy on the street. They always had to kick down doors.”
With just three hours between the drug buy and the raid, the police hadn’t done much surveillance at all. The SWAT team would often avoid raiding a house if they knew there were children inside, but Taylor was troubled by how little effort they put into seeking out that sort of information. “Three hours is nowhere near enough time to investigate your suspect, to find out who might be inside the house. It just isn’t enough time for you to know the range of things that could happen.”
That afternoon the police had bought drugs from the stepfather of two children, ages 8 and 6. Both were in the house at the time of the raid. The stepfather wasn’t.
“They did their thing,” Taylor says. “Everybody on the floor, guns and yelling. Then they put the two kids in the bedroom, did their search, then sent me in to take care of the kids.”
Taylor made her way inside to see them. When she opened the door, the 8-year-old girl assumed a defense posture, putting herself between Taylor and her little brother. She looked at Taylor and said, half fearful, half angry, “What are you going to do to us?”
Taylor was shattered. “Here I come in with all my SWAT gear on, dressed in armor from head to toe, and this little girl looks up at me, and her only thought is to defend her little brother. I thought, ‘How can we be the good guys when we come into the house looking like this, screaming and pointing guns at the people they love? How can we be the good guys when a little girl looks up at me and wants to fight me? And for what? What were we accomplishing with all of this? Absolutely nothing.’ ”
Taylor was later appointed police chief of the small town of Winfield, Mo. Winfield was too small for its own SWAT team, even in the 2000s, but Taylor says she’d have quit before she ever created one. “Good police work has nothing to do with dressing up in black and breaking into houses in the middle of the night. And the mentality changes when they get put on the SWAT team. I remember a guy I was good friends with; it just completely changed him. The us-versus-them mentality takes over. You see that mentality in regular patrol officers too. But it’s much, much worse on the SWAT team. They’re more concerned with the drugs than they are with innocent bystanders. Because when you get into that mentality, there are no innocent people. There’s us and there’s the enemy. Children and dogs are always the easiest casualties.”
Taylor recently ran into the little girl who changed the way she thought about policing. Now in her 20s, the girl told Taylor that she and her brother had nightmares for years after the raid. They slept in the same bed until the boy was 11. “That was a difficult day at work for me,” she says. “But for her, this was the most traumatic, defining moment of this girl’s life. Do you know what we found? We didn’t find any weapons. No big drug operation. We found three joints and a pipe.”


By the mid-1990s, the Byrne Formula Grant Program that Congress had started in 1988 had pushed police departments across the country to prioritize drug crimes over other investigations. When applying for grants, departments are rewarded with funding for statistics such as the number of overall arrests, the number of warrants served or the number of drug seizures. Those priorities, then, are passed down to police officers themselves and are reflected in how they’re evaluated, reviewed and promoted.
Perversely, actual success in reducing crime is generally not rewarded with federal money, on the presumption that the money ought to go where it’s most needed—high-crime areas. So the grants reward police departments for making lots of easy arrests (i.e., low-level drug offenders) and lots of seizures (regardless of size) and for serving lots of warrants. When it comes to tapping into federal funds, whether any of that actually reduces crime or makes the community safer is irrelevant—and in fact, successfully fighting crime could hurt a department’s ability to rake in federal money.
But the most harmful product of the Byrne grant program may be its creation of hundreds of regional and multijurisdictional narcotics task forces. That term—narcotics task force—pops up frequently in case studies and horror stories. There’s a reason for that. While the Reagan and [first] Bush administrations had set up a number of drug task forces in border zones, the Byrne grant program established similar task forces all across the country. They seemed particularly likely to pop up in rural areas that didn’t yet have a paramilitary police team (what few were left).
The task forces are staffed with local cops drawn from the police agencies in the jurisdictions where the task force operates. Some squads loosely report to a state law enforcement agency, but oversight tends to be minimal to nonexistent. Because their funding comes from the federal government—and whatever asset forfeiture proceeds they reap from their investigations—local officials can’t even control them by cutting their budget. This organizational structure makes some task forces virtually unaccountable, and certainly not accountable to any public official in the region they cover.
As a result, we have roving squads of drug cops loaded with SWAT gear who get more money if they conduct more raids, make more arrests and seize more property, and they are virtually immune to accountability if they get out of line. In 2009 the U.S. Department of Justice attempted a cost-benefit analysis of these task forces but couldn’t even get to the point of crunching the numbers. The task forces weren’t producing any numbers to crunch. “Not only were data insufficient to estimate what task forces accomplished,” the report read, “data were inadequate to even tell what the task forces did for routine work.”
Not surprisingly, the proliferation of heavily armed task forces that have little accountability and are rewarded for making lots of busts has resulted in some abuse.


The most notorious scandal involving these task forces came in the form of a massive drug sting in the town of Tulia, Texas. On July 23, 1999, the task force donned black ski-mask caps and full SWAT gear to conduct a series of coordinated predawn raids across Tulia. By 4 a.m., six white people and 40 blacks—10 percent of Tulia’s black population—were in handcuffs. The Tulia Sentinel declared: “We do not like these scumbags doing business in our town. [They are] a cancer in our community; it’s time to give them a major dose of chemotherapy behind bars.” The paper followed up with the headline “Tulia’s Streets Cleared of Garbage.”
The raids were based on the investigative work of Tom Coleman, a sort of freelance cop who, it would later be revealed, had simply invented drug transactions that had never occurred.
The first trials resulted in convictions—based entirely on the credibility of Coleman. The defendants received long sentences. For those who were arrested but still awaiting trial, plea bargains that let them avoid prison time began to look attractive, even if they were innocent. Coleman was even named Texas lawman of the year.
But there were some curious details about the raids. For such a large drug bust, the task force hadn’t recovered any actual drugs. Or any weapons, for that matter. And it wasn’t for a lack of looking: The task force cops had all but destroyed the interiors of the homes they raided. Then some cases started falling apart. One woman Coleman claimed sold him drugs could prove she was in Oklahoma City at the time. Coleman had described another woman as six months’ pregnant—she wasn’t. Another suspect could prove he was at work during the alleged drug sale. By 2004, nearly all of the 46 suspects were either cleared or pardoned by Texas Gov. Rick Perry. The jurisdictions the task force served eventually settled a lawsuit with the defendants for $6 million. In 2005 Coleman was convicted of perjury. He received 10 years’ probation and was fined $7,500.
In the following years, there were numerous other corruption scandals, botched raids, sloppy police work, and other allegations of misconduct against the federally funded task forces in Texas. Things got so bad that by the middle of the 2000s Perry began diverting state matching funds away from the task forces to other programs. The cut in funding forced many task forces to shut down. The stream of lawsuits shut down or limited the operations of others. In 2001 the state had 51 federally funded task forces. By the spring of 2006, it was down to 22.
Funding for the Byrne grant program had held steady at about $500 million through most of the Clinton administration. The Bush administration began to pare the program down—to about $170 million by 2008. This was more out of an interest in limiting federal influence on law enforcement than concern for police abuse or drug war excesses.
But the reaction from law enforcement was interesting. In March 2008, Byrne-funded task forces across the country staged a series of coordinated drug raids dubbed Operation Byrne Blitz. The intent was to make a series of large drug seizures to demonstrate how important the Byrne grants were to fighting the drug war. In Kentucky alone, for example, task forces uncovered 23 methamphetamine labs, seized more than 2,400 pounds of marijuana, and arrested 565 people for illegal drug use. Of course, if police in a single state could simply go out and find 23 meth labs and 2,400 pounds of marijuana in 24 hours just to make a political point about drug war funding, that was probably a good indication that 20 years of Byrne grants and four decades of drug warring hadn’t really accomplished much.
During the 2008 presidential campaign, Barack Obama criticized [George W.] Bush and the Republicans for cutting Byrne, a federal police program beloved by his running mate Joe Biden. Despite Tulia … and a growing pile of bodies from botched drug raids, and the objections of groups as diverse as the ACLU, the Heritage Foundation, La Raza and the Cato Institute, Obama promised to restore full funding to the program, which, he said, “has been critical to creating the anti-gang and anti-drug task forces our communities need.”
He kept his promise. The 2009 American Recovery and Reinvestment Act resuscitated the Byrne grants with a whopping $2 billion infusion, by far the largest budget in the program’s 20-year history.


Police militarization would accelerate in the 2000s. The first half of the decade brought a new and lucrative source of funding and equipment: homeland security. In response to the terrorist attacks of Sept. 11, 2001, on the World Trade Center in New York City and the Pentagon in Washington, the federal government opened a new spigot of funding in the name of fighting terrorism. Terrorism would also provide new excuses for police agencies across the country to build up their arsenals and for yet smaller towns to start up yet more SWAT teams.
The second half of the decade also saw more mission creep for SWAT teams and more pronounced militarization, even outside of drug policing. The 1990s trend of government officials using paramilitary tactics and heavy-handed force to make political statements or to make an example of certain classes of nonviolent offenders would continue, especially in response to political protests. The battle gear and aggressive policing would also start to move into more mundane crimes—SWAT teams have recently been used even for regulatory inspections.
But the last few years have also seen some trends that could spur some movement toward reform. Technological advances in personal electronic devices have armed a large percentage of the public with the power to hold police more accountable with video and audio recordings. The rise of social media has enabled citizens to get accounts of police abuses out and quickly disseminated. This has led to more widespread coverage of botched raids and spread awareness of how, how often and for what purpose this sort of force is being used.
Over just the last six years, media accounts of drug raids have become less deferential to police. Reporters have become more willing to ask questions about the appropriateness of police tactics and more likely to look at how a given raid fits into broader policing trends, both locally and nationally. Internet commenters on articles about incidents in which police may have used excessive force also seem to have grown more skeptical about police actions, particularly in botched drug raids.
It’s taken nearly a half-century to get from those Supreme Court decisions [upholding questionable searches and police tactics] in the mid-1960s to where we are today—police militarization has happened gradually, over decades. We tend not to take notice of such long-developing trends, even when they directly affect us. The first and perhaps largest barrier to halting police militarization has probably been awareness. And that at least seems to be changing.
Whether it leads to any substantive change may be the theme of the current decade.


Warring Against Crime
In a remarkable speech at the National Defense University in May, President Barack Obama signaled an end to the war on terrorism; maybe not an end, it turns out, but a winding down of the costly deployments, the wholesale use of drone warfare, and even the very rhetoric of war.
Prompted by the odious attacks on New York City and Washington, D.C., in 2001, he said, we took the battle, for better or worse, to Afghanistan and to Iraq and, surreptitiously, to Pakistan to punish those deemed responsible. We moved on the home front, as well; perhaps too quickly, some would argue—“hardening targets, tightening transportation security, giving law enforcement new tools to prevent terror,” as the president described the domestic defense agenda.
Some of this hardening and tightening was obvious. Surveillance cameras became as ubiquitous as concrete barriers. Office buildings tightened security. Passengers were screened for weapons before boarding planes. But in local law enforcement some of the “new tools” made available to even the smallest police departments helped accelerate changes in policing, changes that some say altered the way police departments behave.
Today, police departments—or some of their key enforcement operations—appear to be on a war footing. Many dress in commando black, instead of the traditional blue. They own military-grade weapons, armored personnel carriers, helicopters and Humvees. Their training is military. Their approach is military. They are in a war against crime and violence and terror that they argue never ends. Just ask those at the finish line of the Boston Marathon on April 15.
In his new book, Rise of the Warrior Cop, journalist Radley Balko points out that this militarization of police departments had taken hold several decades before 9/11. He argues, in the following excerpt, that a few appropriate applications of those tactics and weaponry have obscured their routine use each day, against U.S. citizens accused of ordinary crimes, in ways that would have been repugnant to the nation’s founders. “To say a military tactic is legal, or even effective, is not to say it is wise or moral in every instance,” the president noted in his recent speech. “For the same human progress that gives us the technology to strike half a world away also demands the discipline to constrain that power—or risk abusing it.”
Whether or not you agree with him, it is an issue that Balko has been chronicling for years at the local and national levels. And in this particular moment of national introspection about the efficacy of traditional warfare against the threat of determined terrorists, Balko poses the question about its efficacy against common crime.

—The Editors
Radley Balko is an investigative journalist who writes about civil liberties, police, prosecutors and the broader criminal justice system. He is a senior writer for the Huffington Post.

Mass is Interrupted by Abortion Activists

Abortion activists interrupted Mass at the Cathedral of the Chilean capital Santiago the evening of July 25, destroying confessionals and defacing several side altars with blasphemous graffiti.

“We were celebrating the feast of St. James the Apostle, with the mayor in attendance, and offering thanks to so many Catholics who serve the public, in an atmosphere of peace and recollection when protestors suddenly came in,” said Bishop Pedro Ossandón Buljevic, an auxiliary bishop of the Santiago de Chile archdiocese.

“The truth is that we are always for dialogue, for civilized debate.  We believe in the God-given gift of reason.”

“Therefore we invite everyone to protest in whichever way they wish, but that they do so with respect for the law, for democracy, and for the dignity of others.”

Archbishop Ricardo Ezzati Andrello was saying Mass on the eve of the feast of St. James, the city's patron and namesake, when the activists unexpectedly stormed the cathedral at the conclusion of a pro-abortion march.
The faithful present at the Mass, including Santiago’s mayor, Carolina Toha, prevented the activists from reaching the main altar.

With help from the faithful, police in riot gear were able to remove the protestors from the cathedral, dispersing the crowd outside as well and making several arrests. The protestors had barricaded themselves in with pews.

The partial translation of the graffiti in the picture reads "I sh*t on God", speaking volumes about the soullessness of the people involved.

And don't think for a minute that this sort of radicalism is confined to idiots in South America:
The Texas Department of Public Safety confiscated 18 jars of feces from pro-abortion protestors planning to disrupt state senate proceedings, along with a jar of urine, bricks, bottles of paint, tampons and glitter, The Texas Tribune reports.
Evil is on the prowl.

Let there be no mistake about it.

Tolerance of the Intolerant is now the New Definition of Tolerance

Saturday, July 27, 2013

Minnesota Churches Prepare for Gay Weddings

MINNEAPOLIS — Karl Starr and Christopher Haug pledged their love for each other on the altar of Central Lutheran Church — preparing for the day they will walk into Minnesota history.

The two men will be among the first wave of gay couples to wed legally in a church after Aug. 1, when same-sex couples can start getting married in Minnesota. On Wednesday, Starr and Haug rehearsed their Sept. 14 ceremony at the downtown Minneapolis church with the Rev. D. Foy Christopherson.

“We’ve both grown up Lutheran, and we’re not willing or ready to give that up … this is our community,” said Haug, 56. “A lot of our friends are people we know through church. So we want them to be able to rejoice with us as well.”

Houses of worship that recognize same-sex unions are adjusting to the redefinition of marriage in Minnesota. The new dynamics require new language — no more “I now pronounce you man and wife” — and ceremonial changes.

For Christopherson, using the term “marriage” will be one of the biggest changes. Central Lutheran pastors have held religious “blessing” ceremonies for same-sex couples since about 2000.

The right choice of words is demanding attention by other churches as well.

At Edina Morningside Community Church, a United Church of Christ congregation, Pastor Rosemary Rocha will be performing her first same-sex marriage at the church next month. When talking to the two men who want to be married by her, Rocha says she asked them, “Do I pronounce you husband and husband? … I think we’re looking at I’m to pronounce them ‘married.’ ”

“I’ve been learning along the way,” Rocha added. “Because we don’t have a big population of LGBT people in our church, it’s important for me to educate and familiarize myself with some of the issues. You can’t just go assuming … there are some things that might be the same for same-gender and male-female weddings. But what does it mean for a gay couple … who have been in love, cared about people, and been denied this? ”

Not all churches agree

Religious ceremonies are typically not governed by state law, so Catholics and other churches that don’t recognize same-sex marriage are not required to marry gay couples — or any couples, for that matter. While gay couples can seek a civil marriage outside of religious venues, others want a legal marriage ceremony in a church.

Some of the more liberal-leaning mainstream Protestant churches recognize same-sex unions and have been performing religious “blessing” ceremonies for gay couples for years. But even within those denominations, there is not always consensus about whether to perform same-sex marriages.

In Minnesota, the Evangelical Lutheran Church of America (ELCA) is the state’s second-largest denomination with close to 800,000 followers. Bishops who lead the St. Paul and Minneapolis ELCA synods issued statements to congregations emphasizing that it’s up to congregations to decide whether they want to perform gay weddings. The bishop of the Episcopal Church in Minnesota issued a similar message.

Bishop Bruce Caldwell, who leads St. Mark’s Episcopal Cathedral in Minneapolis, said the church set up a booth in Loring Park during the Pride Festival in June advertising that it would perform gay weddings. So far, about eight same-sex weddings are scheduled at the church, he said.

“These folks really do want to get married and commit their lives to one another in a faithful union before God,” Caldwell said.

Churches adapting

John Green, a professor at the University of Akron in Ohio who has written extensively about politics and religion, said churches that recognize same-sex unions will adapt as time goes on.

“This often involves a process of education, since not all members of these congregations are equally comfortable with same-sex marriage,” he said.

“Other congregations are more subtle, allowing such ceremonies but without calling special attention to particular nuptials. Sometimes the clergy will perform the ceremony but not in a church building.

“There are lots of ways that churches who recognize these kinds of unions adjust to new laws.”

Doris Ferguson and Lorna Lapp have been together nearly 19 years and plan to wed in October at Gloria Dei Lutheran Church in St. Paul. Ferguson, 55, intends to wear a mint green dress and will be escorted down the aisle by her brother. Lapp, 63, wearing a black dress with a sash matching the color of Ferguson’s dress, will be escorted by the couple’s son.

As for the traditional words to present the married couple, “Obviously at the end of the ceremony, you’re not going to say, ‘This man and woman,’ ” Lapp said. “We want him to say ‘partners in life.’ ”

Ferguson said it was important she and Lapp be married at Gloria Dei because they “feel so much a part of that community.

“I suppose we could have done [a blessing ceremony] before the eyes of God years ago,” Ferguson said. “But I guess there was something about the legislation that we wanted. We both just felt it in our heart that was important.”

Making the Church Attractive...

In an exchange between Pope Benedict XVI and a journalist, the Pope was asked what Catholics could do “to make the Church more ‘attractive’ to the modern world.”

“The Holy Father responded: ‘I would say that a Church that seeks to be particularly attractive is already on the wrong path. Because the Church does not work for her own ends, she does not work to increase numbers and thus power,’”

The Pope spoke of the unchanging truth that is the Church's proclamation and the reason for her existence.  She is charged with speaking this unchanging truth in season and out of season, when popular and accepted and when unpopular and rejected, when fashionable and when out of fashion.  She is judged not by the transient standards of relevance but by the unchanging standard of faithfulness.

As a parent I fully understand both the temptation and the answer given by B16.  As a parent I want my children to think like I do, to come to the same conclusions I have, to make the same decisions I make.  Every parent does.  But even more than this, as a parent I want to be loved and liked.  The great temptation of the parent is not that you want what is good and best and right for your children (or that you want the good, best, and right to be whatever you define them to be).  No, the great temptation is to sacrifice what is good, best, and right only to be loved, appreciated, and liked.  To put it bluntly, we are tempted not to behave as parents because we want to be our children's friend.  We believe that if we just explain ourselves enough, our kids will agree with us, do things our way, and we will not have to be "hard".  We forget that our kids resent the explanations about as much as they do our "no" to their "yes."

B16 is suggesting to the Church what every parent knows on some level.  It is more important to be the parent than to be friend to your kids.  It is more important to be respected by them than to be liked.  The Church's great temptation is to want to be liked.  We want people outside the Church to find us attractive, fun, and exciting.  We want the world around us to think of us as really great, fun loving people.  Our Achilles' heel is that desire for people's affection when our purpose, our identity, and our calling is to be faithful.  That faithfulness often, more often than we want, requires us to say "no" to the yes of the heart, to speak honestly when people would settle for a pleasant lie, and to address wrong when we all would prefer to live in the fairy world of "you're okay, I'm okay."

Let me say upfront something B16 did not say but could have.  There are people in the Church who are permanently discontent, who make everything an argument, and who frown before the world instead of showing the holy joy of our Lord.  No one is saying we need to be disagreeable just for the sake of being disagreeable.  What B16 and every wise Christian knows is that if given a choice between the unpleasantness of being faithful and rejected or being silent and loved, we must always choose being faithful.

Honestly, there are some in Lutheranism, some very smart people and Pastors, who are disagreeable, hard to get along with, and unrelenting about everything.  They make every hill a hill to die on and cannot accept or trust any answer but theirs.  Some people think I am like that.  Speaking just as honestly, there are Lutherans who treat their faith as a flavor of the day for those who happen to like it.  They die on no hill because everything is always negotiable, and they never say a discouraging word.  Neither of them does justice to the Great Reformers, who risked all when being quiet might have been a whole lot easier.  Lutheranism is not disagreeable.  It is positive and positively joyful.  But it is also faithful and chooses faithfulness over silence, truth over lies, every time.

The funny thing is this.  If we were more faithful, we might also be more attractive to the world.  The world is not attracted by lies, even the ones we like.  The world respects conviction and needs (even if it does not always desire) truth.  The whole goal of Lutheran confession is to speak the truth in love, not self-righteously but as servants of the Servant Lord.  We have had a couple of generations of tilting more to the "I'd rather be liked" end of the spectrum and it has gotten us little.  We have bled off ministries and people and done less with less.  Maybe B16 is exactly right.  Maybe we need to try being faithful.

Behind all the missional, church growth stuff is the conclusion that the world does not like us as we are, does not think we are attractive, and does not find us "fun".  Therefore we need to change to be more what the folks outside the Church would class as winsome, attractive, and fun.  It is a methodology that proceeds from an inherent insecurity about who we are.  It is always doomed to failure.  Every success story comes from those who play from their strength.  Our strength is the Gospel -- full, true, and undiluted.  Maybe we ought to try leading with the full counsel of God's Word -- the sad truth and the joyful truth.  They are not two truths but the one saving truth that is Jesus Christ.  We get nowhere by skipping the Gospel and speaking the Law and we get nowhere by speaking the Gospel and skipping the Law.  The only winning solution for us is to be faithful, joyful but faithful.


A Loss of Cultural Identity and Self-Confidence

I've written about the deceptive and arrogant Islamic supremacist Reza Aslan's book elsewhere; this is not about him, but about the Jesuit priests who encouraged him to return to Islam. For Catholic priests, Christian clerics, to do such a thing bespeaks a loss of confidence in (and belief in) their own religion, tradition and heritage: one might have expected them to point Aslan in the direction of defenses of Christianity (of which there are plenty to respond to the puerile objections that Aslan advances), but instead, full of enlightened Leftism, relativism, and indifferentism, they encouraged him to return to a religion that has been the historic enemy of the Church.

Even if these Jesuits believed that Islam was no longer the enemy of the Church, or were ignorant enough to think that it never had been, or thought that the idea of having any enemies at all was an unenlightened relic of the past that had to be rejected, their action epitomized the West's present loss of self-confidence and suicidal eagerness to abase its own history and traditions. And these Jesuits are not singular in any way. If these priests told anyone in their circles what they had done, every one of their fellow Jesuits and Leftist clerics of all kinds is certain to have applauded their actions. In fact, they would probably have been criticized if they had tried to restore Aslan's flagging commitment to Christianity.

And they will keep on acting this way, right up to the moment when Aslan's more bloody-minded coreligionists knock on the door of the Jesuit house and tell them the game is up.

"Another Jesuit Success Story," by Rod Dreher in The American Conservative, July 15 (thanks to The Cardinal Newman Society):
During my years as a Catholic, more than a few times I would meet someone who had left the faith, and would credit their Jesuit education for having opened their eyes. Just now, I heard the Muslim scholar Reza Aslan on Fresh Air, talking about his new book. Terry Gross mentioned that he (Aslan) had been born into Islam, but his parents fled with him from the Iranian Revolution. In America, his father became an atheist, but Aslan became an Evangelical Christian. His mother followed him into Christianity. But then, studying at the Jesuit-run Santa Clara University, Aslan encountered Jesuit priests who encouraged him to go deeper into Islam, the religion of his forefathers.

Aslan did, and subsequently renounced Christianity to return to Islam.

Posted by Robert Spencer

Thursday, July 25, 2013

Heritage Stockman by Ariat

The Heritage Stockman boots are similar to the classic roper style boot, but they include a saddle vamp over the arch of the foot. These boots are made from durable, full-grain leather that is easy to care for and will last through almost any job. They are also leather lined, which allows your feet to breathe and the boots to mold to your feet as you wear them. They feature a round toe and low heel that is functional and comfortable. Ariat's exclusive rubber Duratread outsole is longer lasting than traditional rubber, provides maximum wear resistance, is flexible, and highly resistant to barnyard acids. The ATS sole technology offers optimal stability and all-day comfort for performance you can depend on.

Good looking, hard working, top performing cowboy boot. ATS™ technology holds up all day easing the load on the feet. Duratread™ outsole lasts. Traditional styling with a little flair, a narrower U toe, six-row stitch pattern, and classic colors are ready for some steppin' out.

Shaft Height : 11"
Technology : ATS
Toe Shape : U
Heel Height : 1.375"
Gender : M
Outsole Composition: Rubber
Style #: 10002247

Full-grain leather foot and shaft
Leather lined
Duratread rubber outsole
Single stitch welt around the sole
Ariat's round U-toe
Six-row stitch pattern on the shaft
ATS sole technology
11" Shaft height
Heel height 1 1/8"

Toe Type: Round
Outsole: Duratread
Lifestyle: Western
Heel Type: Riding
Lining: leather
Upper Material: Leather
Insole: ATS Technology
Shaft: 11"
Material: Leather
Heel Height: Medium
Closure: Pull-On

Sizing and Fit
Where can I find Size Charts?
Refer to our Ariat Sizing Chart for assistance.

How can I tell if my boots fit properly?
Pull out the footbed and place your foot from heel to toe on the footbed. If you place your thumb at the toe area and it fits comfortably, this should ensure a proper fit, allowing enough slippage in the heel.

How do I know if my Western Pull On fits correctly?
Refer to our size charts for assistance. When you slide your foot into a boot, use boot pulls to pull it on while sitting, or stand up into the boots. A well-fitting boot should fit your heel snugly. Expect a small amount of movement (lift) in the heel as you walk, because a boot that fits properly should flex as you walk, and your heel should lift a little, but not too much. New boots will have a stiffer sole. As you break in the boot, the sole will flex to a comfortable level and most slippage should stop.

Boot Care
How do I care for my Ariat boots?
Proper care is one of the best things you can do to prolong the life of your favorite Ariat boots. We recommend the Cadillac and Kiwi brands. Please note, however, that some leather care products may alter the color of the leather and will smooth the nap of a nubuck or distressed leather. We recommend that you test the product on a small portion of your boot, e.g., the inner ankle area, before treating your entire boot.

Can I waterproof my boots?
You may enhance the level of water resistance in your boots with waterproofing sprays offered by the Kiwi or Scotchgard brands. Please review the product instructions before application. We recommend that you test the spray on a small portion of your boot, like the inner ankle area, before spraying your entire boot, as some of these products may alter the color of the leather and will smooth the nap of a nubuck or distressed leather.

The Failures of U.S. International Religious Freedom Policy

Dr. Thomas Farr of Georgetown’s Berkley Center is one of the true Good Guys on the Washington scene. His June 13 testimony before the National Security Subcommittee of the House Committee on Oversight and Government Reform was a thoughtful, sobering reflection on the failures of U.S. international religious freedom policy.

Farr speaks with authority, for he was the first director of the State Department’s Office of International Religious Freedom in 1999-2003 and has been deeply engaged in the battle for religious freedom around the world ever since. He has done so both from conviction—this is the right thing to do—and from prudential policy judgment: religious freedom advances the cause of peace, for countries that violate the first freedom internally are, in the main, countries “whose internal stability, economic policies, and foreign policies are of substantial concern to the United States.”

So if the promotion of religious freedom abroad (like its defense at home) is both the right play and the smart play, why does the United States do it so badly?

Why, to cite Farr’s testimony, is it “difficult to name a single country in the world over the past fifteen years where American religious freedom policy has helped to reduce religious persecution or to increase religious freedom in any substantial or sustained way?” The opposite is true: “in most of the countries where the United States has in recent years poured blood, treasure, and diplomatic resources (such as Iraq, Afghanistan, Pakistan, Egypt, China, Saudi Arabia, and Russia), levels of religious freedom are declining and religious persecution is rising.”

Farr suggests several structural reasons why.

First, the U.S. approach to international religious freedom is largely rhetorical: annual reports are issued, speeches are made, lists of egregious persecutors are published. None of this, however, has much effect on the persecutors.

That, in turn, suggests another structural reason why the effort to promote religious freedom internationally, mandated by Congress, hasn’t worked: it hasn’t been thought through strategically. Or as Farr put it, forbearing to mince words, no president or secretary of state has made a concerted, sustained effort to “integrate the advancement of religious freedom into the foreign policy of the United States” since the International Religious Freedom Act was passed in 1998.

Because of that, most professionals in the U.S. Foreign Service don’t take religious freedom seriously as a foreign policy concern; indeed, Tom Farr testified, “our diplomats are not being trained to know what religious freedom is and why it is important, let alone how to advance it.” And if their superiors in the White House and at Foggy Bottom don’t insist that strategic policy planning include religious freedom issues, the “deep-seated skepticism in our foreign-policy establishment that religious freedom is in fact important for individuals and societies” (a skepticism that reinforces the faux-realist view that religious freedom is not “real foreign policy”) will remain the default position in the Foreign Service.

Which leads us to another, related structural problem. The Office of International Religious Freedom was established in the State Department by congressional mandate; State’s permanent bureaucracy, like other permanent bureaucracies, is exceptionally skillful at hermetically sealing off anything it regards as an alien body from the serious policy-planning action. Thus the office and the U.S. special ambassador for international religious freedom (a post also mandated by Congress) have often been isolated within State, underfunded, and cut off from access to the Secretary of State and other officials with real policy-making authority.

As Dr. Farr concludes after surveying this dismal landscape, “It is hardly surprising that American diplomats and foreign governments do not see religious freedom as a priority for U.S. foreign policy. It is not surprising that religious freedom programs play little or no role in U.S. strategies to stabilize key struggling democracies such as Iraq or Pakistan, encourage economic growth in places like Egypt or Nigeria, or undermine the religion-related terrorism that is still being incubated in many nations of the broader Middle East.”

Religious freedom is right. Religious freedom works. But promoting it remains marginal to U.S. foreign policy. Not good; not smart, either.

George Weigel is Distinguished Senior Fellow of Washington’s Ethics and Public Policy Center. His previous “On the Square” articles can be found here. Photo credit: Pew.

By George Weigel, originally on the First Things website.

Wednesday, July 24, 2013

Intermittent Fasting: A Healthy Choice

The advice to have five or six small meals daily has become common in recent years. I am 69 years old and don't recall ever hearing this as a child and seldom as a young adult, but by the 1980s it seemed to be everywhere. Today, it is close to nutritional dogma. It is not surprising that an online search of the phrase "eat many small meals" returns over 275,000 results.
The usual justification for eating extra meals is that it keeps the metabolism "revved up" so that weight loss is easier. There is, however, very little hard evidence that supports this idea, and a fair amount that disputes it. For example, a research analysis published in the British Journal of Nutrition concluded that "any effects of meal pattern on the regulation of body weight" appear to be negligible, and what matters is total food intake.
Worse, the "eat many small meals" advice has two clearly negative effects:
  • In practice, those extra meals usually aren't vegetable-intensive, home-cooked ones. These days, they are likely to be "energy bars" (a euphemism for candy bars), snack mixes, and so on. In other words, high-glycemic-load processed snacks.
  • When people are told to "eat many small meals," what they may actually hear is "eat all the time," making them likely to respond with some degree of compulsive overeating. It's no coincidence, I think, that obesity rates began rising rapidly in the 1980s more or less in tandem with this widespread endorsement of more frequent meals. (The other major culprit was the government's scientifically shaky "low-fat" dietary recommendation that led to rampant overconsumption of carbohydrates.) In my travels around the world, I am often struck by how rarely I see people eating in their cars, or while strolling down the street, or otherwise outside the traditional time and space boundaries of a meal. In the U.S., these behaviors are ubiquitous.

So the time has come to explore the opposite idea: regularly allowing greater-than-normal amounts of time to pass between meals, a practice known as "intermittent fasting," or IF. Frankly, today in America, simply eating three meals with no snacks might be called a form of IF, if only by way of contrast. If we were to return to this once-common practice, I believe we would be healthier for it.
The basic premise of IF is to enjoy better health via repeatedly fasting for longer periods than is typical on a daily breakfast-lunch-dinner schedule. Variations are endless. Some proponents skip breakfast; others, dinner. Others fast all day every other day, every third day, once per week, or once per month. A friend who travels for work six to eight times annually always fasts on the first and last days of his trips, reasoning that airline food is awful anyway. (Fasting, it should be pointed out, means abstaining from solid food; all sensible IF plans allow hydration with water, tea or other no- or low-calorie beverages.)
An IF regime works, proponents say, because it aligns with our evolutionary history. Over the 250,000 years that Homo sapiens have been around, food supply has waxed and waned. We evolved to take advantage of this fact, building muscle and fatty tissue during times of abundance, then paring it back during lean ones. Fasting periods accelerate the clearing-out of waste left by dead and damaged cells, a process known as autophagy. A failure of autophagy to keep up with accumulated cellular debris is believed by many scientists to be one of the major causes of the chronic diseases associated with aging.
Occasional fasting also seems to boost activity and growth of certain types of cells, especially neurons. This may seem odd, but consider it from an evolutionary perspective -- when food is scarce, natural selection would favor those whose memories ("Where have we found food before?") and cognition ("How can we get it again?") became sharper.
Research indicates that the benefits of IF may be similar to those of caloric restriction (CR) in which there are regular meals, but portions are smaller than normal. The advantage of IF, proponents say, is that it's easier to feel sharp hunger occasionally rather than the mild hunger of CR virtually all the time.
The positive effects of IF have been chronicled in a variety of animal and human studies, starting with a seminal experiment in 1946, when University of Chicago researchers discovered that denying food every third day boosted rats' lifespans by 20 percent in males, 15 percent in females. A 2007 review by University of California, Berkeley, researchers concluded that alternate-day fasting may:
  • Decrease cardiovascular disease risk.
  • Decrease cancer risk.
  • Lower diabetes risk (at least in animals; data on humans were less clear, possibly because the trial periods in the studies were not long enough to show an effect).
  • Improve cognitive function.
  • Protect against some effects of Alzheimer's and Parkinson's diseases.
What should we make of this?
I don't recommend IF for everyone. Children under 18 should not fast, nor should diabetics, nor pregnant or lactating women. Some health conditions -- such as severe gastrointestinal reflux disease, or GERD -- are easier to manage when food intake is more regular.
But I do think the evidence for the health benefits of IF should make us rethink what seems to be a modern cultural imperative: to avoid hunger at all costs. To the contrary, getting hungry now and then is clearly a healthy thing to do as long as overall caloric intake stays high enough to maintain a healthy weight. (Fasting, like every other healthy activity, must be done sensibly and in moderation.) Many people who follow IF regimes report both physical and mental benefits, including improved energy and concentration, better sleep, and an overall feeling of well-being.
If you practice IF, please share your experiences in the comments below -- what's your eating-fasting pattern, and what health effects have you noticed?

By Andrew Weil, M.D., founder and director of the Arizona Center for Integrative Medicine and the editorial director of