Blog

08 Dec

In Giving We Receive

If you spend any time online, chances are you’ve heard of Giving Tuesday, a phenomenon that emerged out of a non-profit organization’s effort to counterbalance the consumer-oriented post-Thanksgiving splurges of “Black Friday” and “Cyber Monday” with a day devoted to generosity toward others. Giving Tuesday is but one example of what can be accomplished by eliciting the charitable impulse that is inherent in the human person.

The idea for dedicating the Tuesday after Thanksgiving to charitable giving was born just nine years ago, but it has already achieved massive recognition. A 2015 survey found that only 18 percent of Americans were aware of it, but that number is surely much higher now. The totals for 2020 have not yet been finalized, but more than $500 million was given in 2019, and predictions see 2020 exceeding that mark. Corporations have bolstered the effort by promoting it in various ways; most prominently, Facebook matches donations made through its platform.

Giving Tuesday is but one small piece of a worldwide philanthropic movement that is gigantic in both scale and effect. In the United States alone, donors contribute more than $400 billion per year to non-profit organizations. Removing corporations and foundations from the mix still leaves more than $2 billion given by individuals. Skeptics may rightly point out that the US tax code incentivizes such giving, but the fact is that most taxpayers take the standard deduction and thus are not rewarded for their donations. In addition, these charitable totals do not reflect in-kind gifts or volunteer hours that are not tracked as charitable giving. If these were added, the total of (non-tax-related) contributions would doubtless balloon by billions more.

In other words, it is common for people to help other people, and money is only one way it happens. This helpfulness is obvious whenever a high-profile natural disaster hits: volunteers and donations come pouring in from around the world. But it also happens constantly in ways that rarely receive publicity: helping a struggling student with school; caring for a child or an elderly person; giving someone a ride to the store or to church; assisting a neighbor with yard work; offering a meal to someone in need.

In these and countless other ways, we demonstrate solidarity, which recent popes have defined as “a sense of responsibility on the part of everyone with regard to everyone.” It is natural to feel and exhibit concern for those in our immediate sphere, our friends and family. And it is true that we have a primary obligation to care for those nearest to us. But the principle of solidarity urges us to look beyond those with whom we interact regularly and to consider the needs of a wider world. Without denying diversity, solidarity thus points to what is common in humankind. It stresses the unity of human persons as sons and daughters of one Father. When we give—monetarily or otherwise—to charitable causes within or beyond our local communities, we express and affirm this unity. We recognize the reciprocal obligations we have to each other as fellow creatures of a benevolent Creator.

The non-profit world is vast and various, and there are surely some causes and organizations with which we would disagree, perhaps strongly. But Giving Tuesday nonetheless represents a positive impulse, one that we do well to encourage. Going outside of ourselves is a healthy practice, for selflessness breeds self-content. This venerable spiritual paradox is conveyed in the words of the Prayer of St. Francis: “It is in giving that we receive.” As a devout Christian, St. Francis subscribed to an even more profound truth that gives further impetus to our generosity. “If anyone has material possessions and sees a brother or sister in need but has no pity on them, how can the love of God be in that person?” (1 John 3: 17). The things of this world are passing; let us use them, while we may, to build up the Kingdom.

03 Dec

Do We Still Value Liberty?

In a recent speech at a Federalist Society conference, Supreme Court Justice Samuel Alito asserted that “the pandemic has resulted in previously unimaginable restrictions on individual liberty.” He hedged this claim with caveats, such as, “I am not diminishing the severity of the virus’s threat to public health”; and, “I’m not saying anything about the legality of COVID restrictions”; and, “Nor am I saying anything about whether any of these restrictions represent good public policy.” He was simply drawing attention to “disturbing trends that were already present before the virus struck,” including “the dominance of lawmaking by executive fiat rather than by legislation,” and “in certain quarters,” religious liberty’s “fast becoming a disfavored right.”

In other words, without concluding that any particular pandemic-related measure was or was not necessary or justified, Alito was sounding an alarm: We should be concerned about how easily government officials and agencies can suppress liberties that have long been considered fundamental to American life. Of particular concern is religious liberty, which had already been attenuated by developments such as the Court’s own Obergefell decision on same-sex marriage.

For this, Slate called him a “bitter partisan,” an NBC News headline announced that he had “openly joined the culture wars,” and CNN described him as “ireful” and “infuriated.” This “ireful” response from the organs of the American Left illustrates a trend that has been in evidence for some time. An ideological divide has emerged over terms that were once considered a core element of American unity: liberty and freedom.

From the Liberty Bell to the slogan “Give me liberty or give me death,” the language and symbolism of liberty have permeated American political culture since its inception. Until recent decades, social movements strove to attach themselves to that tradition, because identification with freedom virtually ensured success. “Free at last! Free at last! Thank God almighty we are free at last!” were the unforgettable final words of Martin Luther King Jr.’s momentous “I Have a Dream” speech. No one—left, right, or center—wanted to be associated with oppression or infringement of liberty.

This is not to say that there were no vigorous debates or conflicts concerning the meaning of liberty and whether or how it applied to various questions. Americans have always been fractured and fractious. But there was at least some semblance of a shared value that could be appealed to by people of every persuasion. Freedom, we all agreed, is a good thing.

That can no longer be assumed. The terminology of liberty can still be found on the left, especially among older institutions (e.g., the American Civil Liberties Union), but there’s little question that talk of freedom has become mostly a mark of the right. Few left-leaning organizations, websites, or publications formed in the twenty-first century would incorporate the words “freedom” or “liberty” in their titles. Instead, “progress,” “justice,” and “equal” are the preferred concepts. On the right, meanwhile, usage of “freedom” and “liberty” abounds. This causes a snowball effect: As the terms liberty and freedom become associated with conservative politics, they become increasingly suspect on the left.

The terminological divide is important because it’s a manifestation of an ideological divide. There is no identity between terminology and reality, of course, but the two aren’t unrelated either. As a vast swath of the American population becomes suspicious of the language of liberty, the tendency increases to become skeptical of the concept itself. It’s not clear, in fact, which direction the cause-and-effect relationship runs, but that there is some correlation seems beyond question. Evidence abounds. Among young people, for example, a group that is disproportionately liberal in politics, there is also lack of enthusiasm for freedom of speech. This is something new. The freedoms enshrined in the First Amendment—speech, press, assembly, religion—used to be sacrosanct to virtually all Americans of all political stripes. Again, that doesn’t mean there weren’t fights over their meaning, only that there was to some extent a shared set of values, reflected in common parlance.

This is dangerous territory. Heritage Foundation scholar Ryan Anderson called attention to the problem three years ago, when he wrote: “A presumption of liberty has been replaced with a presumption of regulation. Citizens used to think that liberty was primary and the government had to justify its coercive regulation. Now people assume that government regulations are the neutral starting point and citizens must justify their liberty.”

That was before COVID. The presumption of the government’s prerogative to override citizens’ freedom has gained strength during the pandemic. Justice Alito is calling attention to a matter that ought to concern all Americans of every political and religious variety. If the concern is not intense and widespread, then it raises a question that cuts to the heart of American society: Do we still value liberty?

23 Nov

Trade-offs are Unavoidable

“You can’t have it all” is a cliché we’ve all heard, probably to our annoyance. Like all clichés, it may be true or not depending on the context and the specific meaning of its terms. In the sphere of public policy, though, it does convey an important truth, one that we should keep in mind whenever politicians promise bigger, better, and more.

The limits faced by policy decision-makers can be understood by reference to two interrelated principles from economics: trade-offs and opportunity cost. The concept of trade-offs simply means that every decision involves a choice not only to do something but also not to do something else. The opportunity cost is what is sacrificed by not doing that something else.

Consider a city council that has a million dollars to spend. The options before them are building a new park or replacing a failing bridge. Each project costs a million dollars. They can’t do both. If they choose the park, some will complain that they are endangering residents by neglecting the bridge. If they choose the bridge, some will complain that they are undermining the welfare of children by not providing a safe place to play. The council members can study both projects carefully and fully appreciate the benefits and drawbacks of each, and yet in the end they must make a decision to go one way or the other. There is a trade-off.

There is also an opportunity cost. If they choose the bridge, the property values that would have been enhanced by the park will not rise, and so additional property tax revenue will be lost. If they choose the park, the expense of replacing the bridge may rise as its condition continues to deteriorate. There will be other financial opportunity costs either way, and there will also be costs that are not financial.
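The arithmetic behind this idea can be made concrete with a minimal sketch. The dollar figures below are purely illustrative assumptions, not estimates from the article; the point is only that the opportunity cost of a choice is the net benefit of the best alternative forgone.

```python
# Hypothetical net benefits (in dollars) of each project the council could fund.
# These numbers are illustrative assumptions, not real estimates.
options = {
    "park": 1_200_000,    # assumed net benefit of building the park
    "bridge": 1_500_000,  # assumed net benefit of replacing the bridge
}

def opportunity_cost(chosen: str, options: dict) -> int:
    """Return the net benefit of the best alternative forgone by the choice."""
    alternatives = [benefit for name, benefit in options.items() if name != chosen]
    return max(alternatives)

print(opportunity_cost("park", options))    # 1500000: the forgone bridge benefit
print(opportunity_cost("bridge", options))  # 1200000: the forgone park benefit
```

Under these assumed numbers, whichever project the council picks, the other project’s forgone benefit is the opportunity cost; no choice makes it zero.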

Some will claim they have the solution to this dilemma: Raise taxes! The city can pay for both, and all this hard decision-making can be avoided.

Wrong. Raising more revenue does not eliminate trade-offs and opportunity costs; it merely shifts them elsewhere. Taking more money out of the pockets of city residents means that they have less to spend at restaurants, shops, and other businesses in the city. The city has made a trade-off between a park and a bridge on the one hand and a more vibrant local economy on the other. Opportunity costs might include lower income tax revenue as employees are let go or wages decrease in response to lowered demand, or even the closure of businesses that cannot sustain the downturn.

Understanding the reality of trade-offs helps protect against the inflation of politics into a war on which cosmic justice depends. Yes, policy debates do sometimes involve life-or-death matters or questions of fundamental moral importance. But usually they don’t, even though they are portrayed that way in a rhetorical attempt by one side or the other to claim the moral high ground. Whether and how to provide a path to citizenship for illegal immigrants is an important matter that has moral dimensions, but it is not a battle of right versus wrong; it is a choice among many options that all entail pros and cons—that involve trade-offs. Similarly, whether an income tax regime should be progressive or flat is a policy question about which moral arguments invoking justice, fairness, and equality can be made, but it is not a Manichean choice between good and evil. There are advantages and disadvantages to either approach, and decent, reasonable people can disagree as to which choice is superior.

Trade-offs are a reflection of the fact that, in this world, knowledge, resources, time, and other goods are not infinite. In Christian eschatology, we hope for a “new heaven and a new earth” (Rev. 21:1), where the limitations we now experience are overwhelmed by perfect happiness. Until such time, we can’t have it all—even if politicians pretend otherwise.

13 Nov

A “Quiet and Useful Life”

In Sir Walter Scott’s swashbuckling historical novel Rob Roy, the merchant Bailie Jarvie plays an important part by inducing the title character to intervene in the main plot. When his role there is finished, the author bids him adieu with the line, “I do not know that there was any other incident of his quiet and useful life worthy of being particularly recorded.” That is to say, Jarvie was interesting to Scott (and to us readers) as long as he was involved in political, martial, and romantic intrigue. When he returned to the mundane life of business, there was nothing worth reporting.

So it is with real life. Media coverage focuses on threats, disasters, scandals, and controversies. Political and military events demand attention, as do the personal lives and opinions of celebrities in the fields of entertainment and sports. Even within the field of business, it’s the stars who get the attention. Books are written about big personalities at the top of big companies—the brash, the quirky, or the simply too important to ignore: Jack Welch, Lee Iacocca, Steve Jobs, Warren Buffett, Bill Gates. Or, to reach the very top of the media circus, the combination of flashy business and populist politics: Donald Trump.

Meanwhile, important activities and developments in the more quotidian domains of life go largely unnoticed. This problem is to some extent natural and unavoidable, a result of media consumer demand. Most people just aren’t that interested in an advance that makes a car engine more efficient, an invention that makes a lift truck safer to use, or a new technique that makes grain farming more productive.

We can see this phenomenon reflected even in the Gospels, which largely pass over the decades of Jesus’ home life—the daily work—to bring us the drama of his birth, passion, death, and resurrection. “Jesus worked with his hands in daily contact with the matter created by God, to which he gave form by his craftsmanship,” Pope Francis writes in Laudato Si’ (2015). “It is striking that most of his life was dedicated to this task in a simple life which awakened no admiration at all.… In this way, he sanctified human labor.” When Jesus did emerge into public life, people were skeptical and asked, “Is not this the carpenter?” What did they have to learn from this local boy who pounded wood and lacked wealth, status, and education? “And they took offense at him” (Mk. 6: 2–3).

We need to remind ourselves that it’s the people involved in day-to-day work, not politicians and entertainers, who really make the world go round. Sometimes, we’re forced to remember: for example, when COVID knocks out workers at a few meat processing plants and the price of beef skyrockets. The people putting food on our table, keeping our utilities working, building our homes, and feeding, clothing, and teaching the next generation are living “quiet and useful” lives. They are not the kinds of lives that attract book or movie deals, but they are indeed very useful.

In the midst of election season and related debates about the president, Congress, and the Supreme Court, it might be good to bear this truth in mind. To keep society running smoothly, what is really needed is millions of people doing their daily duty: putting in the work (paid or not) that supports themselves and their families and creates the goods and services to enable others to do the same. Political officials and the policies they make are important only indirectly because they can make that daily life more or less burdensome. The public drama is in politics, but the real action is in the office, the factory, and the home.

12 Nov

Smart Charity

Most people of goodwill agree that there is an obligation to assist those in need. For Christians, the responsibility is intrinsic to our faith. Commands to help the widow, the orphan, and the poor pervade Old and New Testaments. The writings of the Church Fathers are full of admonitions to give to the less fortunate. Basil of Caesarea (d. 370), who founded the world’s first hospital, put the obligation in stark but not uncharacteristic terms: “The bread in your cupboard belongs to the hungry man; the coat hanging in your closet belongs to the man who needs it; the shoes rotting in your closet belong to the man who has no shoes; the money which you put into the bank belongs to the poor. You do wrong to everyone you could help but fail to help.”

To know that we have an obligation to help the needy, however, is not to know how that obligation should be discharged. Many people assume that the move from principle to practice is a seamless one. Helping the poor is easy, and lots of opportunities present themselves. Pay taxes. Throw some change in the kettle. Put a dollar in the collection basket. Give the panhandler a few bucks. If we do these things, we’ve done what’s necessary.

Doing these things may (or may not) be actually helpful to those in need, but what is certain is that they are not adequate to meet our duty to others. If we are genuinely committed to acting in charity—if we truly desire the good of our neighbor—we must go far beyond what is simple and easy. Proper motivation—genuine love—is the primary element, but even that is not enough. We must also apply our intelligence.

The importance of reflection, consideration, and evidence in the practice of charity can be illustrated by a story that appeared in a local newspaper a few months ago. The report relates the development and eventual abandonment of a project to address homelessness in a small Midwestern city.

In 2015, in response to the demands of community leaders, the mayor formed a task force on homelessness. A local church volunteered to operate a warming shelter on winter’s coldest nights. Despite extensive publicity and widespread invitations, few people availed themselves of the shelter. Critics suggested that the location was wrong; the shelter needed to be downtown.

A downtown church agreed to host the shelter. Volunteers staffed the rooms and prepared meals. Hot showers were offered to induce visitors. The result: “Many nights, the volunteers ate the dinner they had prepared so that the food would not go to waste, and then went home when no one came to the shelter.” One potential client showed up but left when he discovered there was no television to watch.

Critics now proposed that the homeless stayed away because they weren’t comfortable at a church. The shelter was moved to a non-sectarian community center, a place already well known as a resource for those in need. The numbers did not increase.

The task force made one last try, this time enlisting the charitable expertise of the Salvation Army. During the winter of 2019-2020, the shelter was open seventeen nights, and not a single person spent a single night. The mayor concluded, “Perhaps it’s time to simply realize that we can spend our time in ways that produce better results.”

Just so. This was a project conceived by good, well-intentioned people to tackle what appeared to be a serious problem in the community. The effort was locally initiated and brought together churches, non-profits, and government in a collaborative endeavor. So far, so good.

But none of these positive qualities guaranteed its success. Perhaps the exact nature of the problem had been misunderstood. Perhaps the solution to the problem was more complicated than initially recognized. Whatever the case, it’s clear that the elements of good intentions, resources, and effort were not enough to ensure effective charity. Better analysis of how to address the problem was needed. Laudably, the task force in this case had the humility to admit that their experiment had failed. They can now learn from the experience and move on. This failure can be the catalyst for more effective charitable activity in the future.

The travesty comes when government or non-profit welfare programs gain a momentum of their own, regardless of the consequences. Big charities and state and federal programs are the most likely to fall into this trap. The more separation there is between donors, administrators, and beneficiaries, the more room there is for ignoring the effectiveness of aid efforts. When the metric of success is money spent rather than results achieved, the original goal of the program has been subsumed by other interests. That is not smart charity.

12 Nov

Fake Diversity

Of the many abused terms in our society, diversity is a prominent one. In the minds of many, diversity is similar to taking a stroll at the local zoo to observe various species of animals or sitting comfortably at home to enjoy your aquarium. This taxonomical understanding of diversity is impoverished and tenuous. It is leading our society to lose respect for the person as we drown in an expansive yet shallow sea of color.

I’m a black Puerto Rican and my wife is African American. In contemporary diversity terms, we are covered! Years ago, my oldest daughter graduated in the top five percent of the class at a rather rigorous Catholic high school. (Don’t hate me for bragging about her a little.) It was exciting for her to start applying for college and one of her top choices sent her an invitation to a “scholars’ night.” You can imagine her happiness and pride in being recognized as a great student.

My wife joined her for a weekend at the university. The initial event was an evening reception and as they entered a large room, they noticed dozens of students already there. As the evening proceeded more students joined, but my daughter noticed something curious: only black and brown students were there. She blurted to my wife, “There are no white scholars at this university!”

“Cultural features do not exist merely as badges of identity to which we have some emotional attachment. They exist to meet the necessities and forward the purposes of human life.” —Dr. Thomas Sowell

Well, they soon discovered that most students there had extremely low grade-point averages. The “scholars’ night” was merely a ploy by “diversology” bureaucrats to meet their quota of us negroes there. In other words, they did not see her. They did not see my daughter as a unique and unrepeatable person whose academic effort deserved a hearing. All the bureaucrats saw was the right skin color. Boxed into a taxonomy, her moral worth dissipated. I am proud of her and my wife: They packed and left immediately—right after expressing to the organizers the sadness they felt at the disrespect and the soft bigotry of low expectations.[1]

Although such efforts are often couched in the rhetoric of justice for a deprived people and rationalized as a desire to prop up those whose starting line was far behind, they accomplish only the severing of the connection between reward and accomplishment, the very connection that makes success possible through struggle in spite of obstacles. Offering brownie points to “endangered communities” is little more than degrading and condescending paternalism, however well-intentioned.

The fact remains that we are not as different from each other as the diversity industry claims. There is no profound cultural divide in America that is not the product of the constant drumbeat insisting that there is one. Yes, there are some differences in customs, accents, and ways of doing things, but they are balkanizing only if we accept the progressive ideology behind the allegations of radical difference—an ideology that is not native to the culture of those deemed to be in cultural diaspora and must be imposed on them to convince them of this sad state of affairs.

The diversiphile worldview insists that we are not a melting pot because its proponents want to conjure a salad bowl rooted in ideology.[2] As America was an experiment in ideas, they want to impose a new set of ideas to convince minorities that our land is far away, in a foreign Valhalla. They reject America as a common enterprise of common values. By instilling in our minds that there are fundamental differences at play, they can convince us that diversity necessitates a complete rearrangement of the body politic. The insistence on skin-deep diversity—that is, race—gives the diversity party a way to highlight some differences and ignore many commonalities—in the process filling the agnostic space they have cleared in our minds with gnostic knowledge of some imaginary “real” blackness or “Hispanicness.” If cultural features exist merely as badges of honor used to create barriers to unity, the creation of a diverse community is an impossibility. Culture becomes a weapon, and diversity becomes an empty word.

What would be a clear and obvious instance of diversity? Well, about 14% of Americans are of Hispanic descent. That is a fact. The fact shows a difference. But is that difference necessarily divisive? No. It becomes problematic when it is used as a tool, as a weapon to sever society into discrete groups demanding “proportional representation” everywhere. What is merely a beautiful expression of uniqueness within a society informed by inclusive monocultural features based on a common set of principles is transformed into a radically divisive tool for power. Groups compete for the same resources, and one group is dominant over the others, making the struggle a dialectical one. As benefit derives from lumping individuals together, society is not unified but thinly pasted together with a substance that was from the beginning a lie to perpetuate the dominance of the majority group. This ghettoization of the entire body politic leads of necessity to separatism as the only solution.

Diversity also lumps people together when they often have very little in common. Take, for example, the terms “Asian” and “Latino,” which cover a variegated array of ethnicities, cultures, and religions. But the effort to divide us into warring camps now includes made-up words based on “gender,” and those of us with Hispanic ancestry are suddenly informed that we are “Latinx.” Rearranging the whole society for the victims is offered to whites as their way of redemption for a past that is placed before them eternally. That past is a ticket to redemption for them and a Pyrrhic benefit for us. Minorities become “endangered,” “different,” “special” because the status of victim confers benefit at the expense of being seen in the full scope of their personhood. The demands on them are daunting, and the ones we place on ourselves, minimal. How else, for example, can we understand the attention given to cases of homicide by police in comparison with the virtual oblivion of the avalanche of internal community violence sweeping the inner city? If we can always place internal community problems in the context of systemic victimization, there is not much to do but become political activists.

My friends, this is all an ideological ploy based on the dialectics of history that must find culprits and victims, oppressors and oppressed, expropriated and expropriators. In reality, diversity would be unsettling to the Marxists because when we, for example, study ancient cultures around the world they tend (with some exceptions) to be patriarchal and very traditional in mores and cultural and sexual ethics. This is one reason why liberation theologies developed in old Europe and are based on the thought of an elderly, anti-Semitic Prussian! The young seminarians at Tübingen and Rome concocted a theory that was in many ways at odds with the culture of the people they claimed to represent. With focused zeal, they descended into the barrios and favelas of Central and South America to snatch the culture and paint it with a veneer of ideology. After their indoctrination was all but complete, they suddenly claimed that the now-ideologized version of culture they imposed had to be respected by the colonial powers. We must reject this false diversity.

[1] The phrase was coined by President George W. Bush in a speech to the NAACP launching the No Child Left Behind Act. “Discrimination is still a reality, even when it takes different forms. Instead of Jim Crow, there’s racial redlining and profiling. Instead of separate but equal, there is separate and forgotten” (George W. Bush’s Speech to the NAACP, 2000).
[2] Peter Wood, Diversity: The Invention of a Concept (San Francisco: Encounter Books, 2003), 22–28.

11 Nov

Religion in Government

The nomination of a devoutly Catholic federal judge to fill the latest vacancy on the United States Supreme Court has brought back into the spotlight a question that has roiled American public life since the nation’s founding: What is the proper role of religion in our nation’s government?

Many people persist in thinking there’s an easy answer to that question. “None!” some say. Others insist that the US is a “Christian nation” and our government should reflect that fact.

In reality, the answer isn’t as simple as either response implies.

The founders were well aware of the problematic nature of religion as it relates to the government. Europe’s post-Reformation history was fresh in their memories. In various contexts, Catholics, Lutherans, Calvinists, Anabaptists, and Jews had been persecuted and sometimes killed by government fiat. The framers tried hard to defuse the problem by enshrining in the Constitution (including its Bill of Rights) the principle of separation of church and state: no religious tests to hold public office; no establishment of a national church; and free exercise of religion for all.

By the standards of historical experience, it was an excellent effort. Any reasonable observer would agree that the United States has an exemplary record of protecting religious minorities, has been home to an extraordinary religious pluralism, and has avoided a widespread, ongoing, violent religious conflict of the kind that has plagued many other parts of the world.

But that doesn’t mean that the problem has been solved, once and for all. Our history has witnessed deadly clashes over religion (e.g., the Philadelphia Bible Riots of 1844) as well as government persecution of religious groups (e.g., Missouri’s 1838 Extermination Order against Mormons). Even if we optimistically believe that such religion-related violence is a thing of the past, softer forms of such conflict remain common. There have been numerous instances in recent years of vandalism directed against mosques, synagogues, and churches alike.

Determining exactly how the government ought to treat religion, moreover, remains a challenge. Complete neutrality is illusory. When government aid is available for social welfare programs and schools, and tax credits are offered for adoptions, educational expenses, or charitable donations, the line between “favoring religion” and “discriminating against religion” becomes hard to discern.

Sometimes difficult tradeoffs must be made between religious freedom and public order. No one could reasonably claim that the late Justice Antonin Scalia was "anti-religious," but he wrote the majority opinion rejecting the religious freedom claim in the 1990 Employment Division v. Smith case, which denied a religious exemption from Oregon's controlled substances regulations.

So partisans on both sides of the religious divide need to be careful. It's true that the "high wall" of separation introduced by the Supreme Court in the 1940s was rooted neither in the law nor in the historical experience of the American polity. Trying to eliminate religion from politics is a bad idea, in part because it's impossible. At the same time, those who belong to the dominant religious group must guard against unwitting discrimination against minority views and practices. Finally, politicians and voters alike must avoid the temptation of using religion as a tool for political gain.

Following these guidelines will help, but, in the arena of church-state relations, dilemmas cannot be entirely avoided. In the 1870s, the Supreme Court effectively outlawed what had been a pillar of the Mormon faith, the practice of polygamy. It was deemed fundamentally incompatible with the values and institutions of American life. The Latter-Day Saints survived and thrived, but there’s no gainsaying the fact that this was a case of government intrusion into religion. The orthodox Christian who embraces both the sanctity of monogamous marriage and the inviolability of religious freedom will likely view the episode with some ambivalence.

In the 1830s, Alexis de Tocqueville found it remarkable how, in the American experience, the "spirit of religion" and the "spirit of freedom" were "intimately united." Our challenge is to continue striking the balance that has characterized American political culture and preserved this unity. Our government should strive to create an environment favorable to religious belief and religious institutions, without favoring any particular religious view or body. The offices of government should be closed to neither agnostics nor believers, to neither secular Jews nor devout Catholics—to neither Ruth Bader Ginsburg nor Amy Coney Barrett.

15 Oct

What is Self-Reliance?

At the Freedom and Virtue Institute (FVI), one of our core values is self-reliance. The term is featured in the name of our flagship program: self-reliance clubs. Does this mean that we advocate “rugged individualism,” where everyone is on his own and only the fittest survive? Obviously not. If we did, then the idea of a “self-reliance club” would itself be an oxymoron.

What, then, do we mean by self-reliance? Consider the description that appears on our website. It may seem odd that FVI's statement on self-reliance begins with a claim about reliance: "Our most fundamental reliance is on God." This claim is critical because it recognizes at the outset that absolute human self-reliance is impossible. "God shows the capacities of reason and choice to perfection. We rely on our capacities because we were made in his image." Even the rugged individualist is compelled to use what Another has provided to create the illusion of "providing for himself."


Nor does self-reliance imply complete independence from other human persons. For one, we are called to employ the gifts that we've been given in a useful fashion. "We use the gifts [God] gave us to recreate our environment and produce positive outcomes for ourselves, our families, and our community." This employment comes about through relationships—relationships that are familial, fraternal, social, economic, and political. Each of these spheres involves interpersonal relations somewhat different in character from those in the other spheres, but all are important and contribute to forming the complex web of relations that makes up a healthy society. Thus FVI, far from advocating an isolating individualism, insists that "we rely on others and cooperate because others also are made in the image of God—we are called to communion!"

Freedom and voluntary association with each other are not only compatible; they are necessary companions. Alexis de Tocqueville, in his famous reflection on American democracy, wrote that “Sentiments and ideas renew themselves, the heart is enlarged, and the human mind is developed only by the reciprocal action of men upon one another.” Where people enjoyed freedom and independence, it would therefore be necessary to form associations that could accomplish this civilizing aim. Tocqueville concluded his treatment of associations this way: “In order that men remain civilized or become so, the art of associating must be developed and perfected among them in the same ratio as equality of conditions increases.”

It may seem that our view at this point is tilting away from self-reliance altogether. Here it helps to keep in mind what is being avoided. FVI, the website statement concludes, “encourages replacing the attitudes of entitlement and dependency by teaching individuals that they must become accountable for their own lives and well-being.” Although we are to some degree dependent on others, we should not enter into a relationship of dependency. That kind of relation is appropriate for a parent-infant relationship, but in able adults it is only found in the dysfunctional dialectics of master-slave or patron-client.


All the pieces are now in place for a balanced understanding of self-reliance. It is not the delusional belief that one can thrive without the assistance of others, nor the willingness to accept the sustaining support of others as a replacement for personal responsibility. It is not the refusal to relate to others in relations of mutual charity and benevolence, nor the exploitation of others as instruments of material or psychological comfort. It is expressed in neither independence nor dependence but instead in interdependence, which is the mutually beneficial relationship of equals. In this way, self-reliance is a form of charity. When we strive to be self-reliant, we seek to minimize the burden placed on others and maximize the burden placed on ourselves. We look for ways to contribute rather than to extract. We strive to build up the capacities of others rather than take advantage of their weaknesses. Where these attitudes are dominant, there is a flourishing social group—be that a family, a business enterprise, or a society. 

12 Oct

The Quintessential Americans

The founders' generation undertook a remarkable effort: creating a new nation by severing connections with the past. By bringing together various peoples from afar, they attempted to interweave threads historically deemed incompatible with social cohesion. Nation-states had previously been created by homogeneous populations through many generations of common existence. Ethnic identity was crucial to the building of national identities.

The Americans, however, undertook a curious experiment whose end was to create a union woven, despite ancestral disparities, around a set of universal principles. Distilled into a common vision, a variegated multitude was to become a unified whole. E pluribus unum. Writing in his journal, the great essayist Ralph Waldo Emerson declared that America was destined to become an "asylum of all nations," in which "the energy of Irish, Germans, Swedes, Poles, and Cossacks, and all the European tribes — of the Africans and of the Polynesians, will construct a new race… as vigorous as the new Europe which came out of the melting pot of the Dark Ages."[2] George Washington spoke of a home "open…to the oppressed and persecuted of all Nations and religions."[3] It is rather apparent that Washington was not referring only to white Christians.

“We have it in our power to begin the world again” 
     ―Thomas Paine[1]

Similarly, John Quincy Adams invited the multitudes to come to our shores to build a futuristic polity—more than a nation, a new world. He once baffled Baron von Furstenwaerther by adding, in a letter, the caveat that to become an American a German had to cease to be one. "They must cast off the European skin, never to resume it. They must look forward to their posterity rather than backward to their ancestors."[4]

A Frenchman among us saw the struggle of creating one people from a “promiscuous breed.”[5] Astonished and delighted by what he saw in America, his letter back to friends in France exclaimed, “Imagine, my dear friend, if you can, a society formed of all the nations of the world … people having different languages, beliefs, opinions: in a word, a society without roots, without memories, without prejudices, without routines, without common ideas, without character and yet a hundred times happier than our own.”[6]

In effusive praise that may have reflected hope more than fact, Tocqueville described a quest never before attempted but seemingly moving forward. There was no originating purpose of racial supremacy in the effort to build our nation. All the founders had was a desire to start the world anew, no matter the daunting challenges, by leaving behind old prejudices rooted in an old way of looking at reality and instead affirming natural rights lived out in new ways. They were creating a society that depends on intermediary institutions between the individual and the state while, at the same time, telling everyone to mind their own business. It was in the messy midst of such revolutionary ideas that prejudice and human fallibility put up a fight to derail the glorious experiment. Or are we to imagine that the timber of frail human character would not resist? Why would we expect that the conflict between ancient prejudices and a vigorous leap forward was not also reflected within individuals' own hearts?

It seems rather obvious to me that the arrival at our shores of vile slave ships in 1619 cannot in any fashion represent the founding of our nation because slavery belongs to the ancient and ghastly realities of an old realm that affirmed the types of differences the American experiment was rejecting, albeit through a struggle with a world that refused to fade. It is evident that Crevecoeur first, then Tocqueville, and later men like James Bryce and Gunnar Myrdal, did not have the African slave primarily in mind when reporting on the building of an American character that seemed to be, in the words of Bryce, “quickly dissolving and assimilating the foreign bodies that are poured into her mass.”[7] Yet, the solvent of a new world with new values and new hopes and possibilities was active even in the lives of those suffering the cruelest degradations.

Yes, slavery and racial prejudice toward a rather uniquely different people were unity's greatest foes. But the quest to vanquish them gave us something amazing, forged in fire: the black American, the quintessential American. No one more than Africans had to give up any hope of rejoining ancestral kinfolk and ways of living; no one more than they had to endure hardships; no one more than they had to overcome such centrifugal forces: first, the internal divisions among slaves—as African tribal rivalries and variations were real—and, then, the need to unite with the other Americans, pale and threatening. But they did! They ceased to be Africans and became Americans in spite of the brutal force of the enslaver's blade. And today we are not, in any fashion, Africans in the diaspora. We are home.

Crushing coffee in Suriname, old illustration. Created by Worms after Bray, published on L'Illustration, Journal Universel, Paris, 185

I marvel at them, these forebears. When despair made sense, they hoped. When pain abounded, they sang. When frustration and hard labor threatened their very lives, they moved forward with quiet pride, with resilient determination. No one deserves the title of American more.

Tocqueville's account of how Americans became Americans—through civic participation and the exercise of political rights—was incomplete. There was a group of people who were deprived of those options and yet became Americans through sweat, blood, and tears—refining elements, martyred victory.

That martyred victory is at risk today, not primarily from racist forces still at battle with the glorious experiment of creating a new world but from a radically new understanding of America that tries to sever African Americans from their true identity and make them bow down to foreign ideologies arising from an alien vision of the human person. Its proponents insist that blacks are not quintessential Americans but eternal victims, people who do not belong and who will be rescued, of all people, by the thought of a bourgeois German born in 1818.


[1] Thomas Paine, Common Sense, 23.

[2] Joel Porte, Emerson in His Journals (Cambridge, 1982), cited in Arthur Schlesinger, The Disuniting of America (New York: W.W. Norton & Company, 1992), 24.

[3] J.C. Fitzpatrick, ed., Writings (Washington, 1938), xxvii, 252.

[4] In Werner Sollors, Beyond Ethnicity (New York, 1986), 4.

[5] The apt phrase comes from J. Hector St. John de Crevecoeur's Letters from an American Farmer, first published in 1782, a work in some ways considered a precursor to Tocqueville's Democracy in America.

[6] Alexis de Tocqueville, Democracy in America, vol. 1 (1836), chap. xiv.

[7] James Bryce, The American Commonwealth, vol. 2 (London, 1888), 709, 328. Cited in Arthur Schlesinger, The Disuniting of America (New York: W.W. Norton & Company, 1992), 26.

08 Oct

The Luring Tide of Victimization

The narrative inundating our public life and academic work is one of black oppression. We are told that black powerlessness is the direct result of the oppressive activity of the dominant white group. White America is guilty of “racism, classism, ableism, heterosexism, regionalism, sexism, ethnocentrism and ageism.”[1] In this view, if we are to understand the reality of oppression we must understand the nature of American society as prejudiced.[2] This idea is not a novelty: it has been alive among radicals for generations, and in recent decades has seeped into social work education and sociology. Now, however, it is mainstream. 

William Ryan coined a phrase that has become both a cudgel and an alibi: “blaming the victim.” In the classic version of the ideology of victimhood, it is customary to reject the proposition that the personal behavior of the presumed victims has much to do with their condition. Ryan called the assignment of personal responsibility “a brilliant ideology for justifying a perverse form of social action designed to change, not society, as one might expect, but rather society’s victim.”[3]

The field of social work especially has become a hub for creating activists with the “blaming the victim” mindset. Herbert and Irene Rubin, for example, tell us that “blaming the victim is a form of social control that disempowers by denying people a legitimate focus for complaint.” They deplore the fact that “people often blame themselves for the bad things that happen to them.”[4] Instead, what scholar William G. Brueggemann calls “institutional deviance” is at play. Society is at fault and social activism is the cure. The question is not why a person engages in deviant behavior but why society brings him to that place.

For these ideologues, the answer is found in Hegel's idea of alienation, which comes to them via Karl Marx's materialistic interpretation of Hegel's dialectic. The oppressor class creates impersonal institutions whose purpose is to perpetuate the power relationship of oppressor and oppressed. These institutions and ideas constitute a superstructure. Among these elements we find the nuclear family, the churches and religion, ethics, morality, cultural trends, banks, laws, and also social expectations or virtues such as personal responsibility. All are established not because they have intrinsic value or are conducive to human flourishing, but because they are useful for maintaining the balance of power.

According to Brueggemann, victim blamers shift responsibility from the capitalist system to the individual, or they may blame the anointed reformers—“socialists, pacifists, union organizers, social activists, community organizers, and civil rights activists”—instead of blaming the capitalist oppressors. They may also target a minority group—gays, the poor, blacks—as blameworthy for a given social ill.[5]

The solution these people propose begins with Marx again: the development of insight into the inexorable historical forces at play and the role of the victim in acquiring "class consciousness." The victim must be empowered, and enlightenment "takes place as people recognize that they are victims of problems that are shared by many others."[6] Through "consciousness-raising sessions" people come to realize that "their problems are caused by a broader social structure and occur because they redound to the advantage of others." This gnostic "Victims-R-Us" system of thought provides a powerful reason to disengage from productive activity and to revel in activism. From scenery in the drama of the oppressor, we are called to become scenery in the drama of forces outside our control—forces whispering softly into our ears that our debased condition, after all, is "their" fault.

Some scholars who blame society for individual misfortune point to low self-esteem as a by-product of victimization. “Persons who experience blame, shame, and stigma often assimilate this negativity into their self-image.… In general, feelings of powerlessness increase, often resulting in low self-esteem, alienation, and despair.”[7] But this theory does not fit the evidence. As Orlando Patterson demonstrates, studies of self-esteem amply show that, while up to the 1960s African-Americans exhibited low levels of ethnic and individual self-esteem, that trend has been reversed.

The black underclass exhibits higher "self-regard" but lower "feelings of personal efficacy." Afro-Americans experience a lower sense of internal control but high self-esteem.[8] It seems as if the identity of victim has been internalized, with individuals accepting personal failure as the result of external forces. This pattern is corrosive to the fabric of a people. Individuals tend to devalue the areas where they personally fail, since such failure can easily be attributed to these forces. If my educational attainment is low, then it is because of white oppression, and I need not pay much attention to my education—after all, I cannot be expected to excel until "whitey" fixes the problem.

As Patterson shows, the problem is no longer that individuals feel bad about themselves but that they exhibit a “sense of positive regard … from their commitment to blaming the system.… Lower-class Afro-Americans, with the full support of their leaders and professional psychologists, have come to respect themselves because they have no autonomy.” When individuals see important areas of self-development as unimportant and abandon a commitment to improve them, the results are devastating.[9]

We are truly hurting our generation with these senseless and destructive theories. Abandoning the appetite for the alibi of victimization is our priority as a people. If we fail, a dreadful and painful journey lies ahead.


[1] See Karla Krogsrud Miley, Michael O’Melia, and Brenda DuBois, Generalist Social Work Practice: An Empowering Approach (Boston: Allyn & Bacon, 2004), 89.

[2] See Philip R. Popple and Leslie Leighninger, The Policy-Based Profession: An Introduction to Social Welfare Policy for Social Workers (Boston: Allyn & Bacon, 1998), 98. There we read: “In fact, many policies, such as affirmative action and minority scholarships, are often proposed specifically for this purpose [to ameliorate the effects of societal racism, sexism, etc.]. On the other hand, individuals and groups often oppose social welfare policies and, although they generally don’t admit this, the reason for the opposition is often directly a result of racism and sexism.”

[3] William Ryan, Blaming the Victim (New York: Vintage Books, 1976), 78.

[4] Herbert J. Rubin and Irene S. Rubin, Community Organizing and Development, 3rd ed. (Boston: Allyn & Bacon, 2001), 81.

[5] William G. Brueggemann, The Practice of Macro Social Work (Belmont, CA: Brooks/Cole, 2002), 41–43.

[6] Rubin and Rubin, Community Organizing and Development, 89.

[7] Miley, O’Melia, and DuBois, Generalist Social Work Practice, 90.

[8] Orlando Patterson, The Ordeal of Integration: Progress & Resentment in America’s “Racial” Crisis (New York: Basic Civitas, 1997), 88.

[9] Patterson, Ordeal of Integration, 90–91.
