Reasonable Sacrifice
Thankfulness ought not to be merely an inwardly felt affection; rather, it is to be manifested in acts of obedient sacrifice. Preparing a feast, raising children, supporting a ministry financially, caring for aging parents, protecting your nation from invasion, and feeding the impoverished all require your material expense and physical exertion.
Sacrifice is reasonable (Rom. 12:1). But the mindset of most people is that self-preservation is more reasonable. We think that sparing ourselves difficulty and discomfort is sensible. We’ve built a framework that incentivizes selfishness. From the smorgasbord of the entertainment industry, to the twisting of the medical field to drug and carve and indulge the patient’s imagined vision for themselves, we are a culture consumed with self. But this is unreasonable, like trying to grow a crop of corn by planting popcorn.
Both in Moses’ Law and throughout the Psalms we see that thanksgiving is expressed through sacrifice. The sacrificial system was the way in which Israelites demonstrated their gratitude for God’s covenant mercies. The Psalms further revealed the ethical reality that thankfulness is demonstrated by sacrifice (Ps. 116:17).
If we put this together with Paul’s instruction to offer our bodies as a living sacrifice, we can see the necessity of the material discomfort of obedience.
You Might also like
The Counterfeit, Anti-Biblical Epistemologies of Postmodernism and Critical Theory
Though the devotees of postmodernism and critical theory love to sport epistemological terminology, they betray its essence at every turn. Truth is a fiction. Justification is a waste of time. Belief is purely optional. Their epistemology is counterfeit.
Among the world currencies, some are strong, others weak. And yes, there are the parasitic counterfeits. These pretenders can do a lot of damage, trading on another’s good name. Albert Talton is a case in point: using only a standard inkjet printer in the early 2000s, he managed to produce seven million dollars’ worth of phony one-hundred-dollar bills, circulating many of them before going to jail in 2009. Unfortunately—even tragically—postmodernism and critical theory have generated epistemological counterfeits that have beguiled and bankrupted much of our culture.
Knowledge
Treasury agents are trained to spot counterfeits by first scrutinizing the real thing, and so we shall begin with the classic definition of “knowledge”—what it is, how you get it, and how you can be confident you have it—the subject of epistemology. The formula traces back to Plato, who, in the Theaetetus, has Socrates identifying it as “correct belief” together with “an account” of why the judgment is made.[1] Socrates hesitated to endorse it, since, as worded, it was circular, including knowledge of supporting evidence in the definition of “knowledge.” But the core notion endured, thanks in large measure to the identification of the need for and availability of foundational, epistemic premises, whether empirical or rationalistic. So, we press on with the ancient characterization, today expressed as “justified true belief.”
Of course, all sorts of philosophical analyses have challenged and refined the definition. For instance, we contrast “knowledge that” (propositional) with “knowledge how” (e.g., knowing how to ride a bike), and a fellow named Edmund Gettier came up with an ingenious counterexample in the 1960s, where all three elements were present, but still no knowledge—prompting philosophers to rise in defense of the received concept.[2] But there is a strange new assault on it, mounted by purveyors of postmodernism and critical theory.
Just as Christian Science is neither Christian nor scientific, critical theory is hostile to critical thinking, and it commends a posture, not a theory. A genuine theory, such as plate tectonics, generates testable/falsifiable hypotheses, in this instance seabed fissures oozing magma and continual earthquakes along the “Ring of Fire.” But the “theory” in critical theory is a snide conceit, immune—yea hostile—to rational pushback. It’s the very antithesis of judicious inquiry, the practice that has prospered the Judeo-Christian West. Indeed, it attempts to lay the ax at the roots of the best in our civilization, nullifying the truths of the created order laid out in the opening chapters of Genesis.
So, back to the definition, as it relates to a given proposition:
If it’s true and warranted, but I don’t believe it, then I don’t know it. (Think of an atheist actor mouthing the lines of a faithfully biblical sermon.)
If it’s true and I believe it, but I lack good reasons for my belief, then I don’t know it. (A hypochondriacal hysteric can get things right now and then, even when his self-diagnosis is based on the flimsiest of evidence.)
If my belief is warranted, but it turns out to be false, then I didn’t know it. (Such is the case when I’m deceived by a typically reliable, but currently addled, source.)
So, again: Justified. True. Belief. Sad to say, these three are cast aside today by cultural patricians and plebeians alike under the postmodernist spell.
Anti-Knowledge
So what is casting these spells?
Postmodernism
As Gene Veith demonstrated in his 1994 book, Postmodern Times,[3] postmodernism boils down to relativism and pluralism, which have replaced modernism, whose god was the latest deliverances of scientific materialism. The chaos has now been nurtured by new technologies, a topic Veith takes up in Post Christian: “Individuals can latch onto the ‘truths’ (often put into quotation marks today) that they want to believe in or that accords with their will to power (the will taking the place of the intellect; power taking the place of reason).”[4]
Postmodern Times discussed the sexual revolution in terms of extramarital sex; now the issues are homosexuality, pornography, and sex robots. In the 1990s we were deconstructing literature; in the twenty-first century we are deconstructing marriage. In the 1990s we were constructing ideas; in the twenty-first century we are constructing the human body. In the 1990s we had feminism; in the twenty-first century we have transgenderism. In the 1990s we were urged to embrace multiculturalism; in the twenty-first century we are warned about committing cultural appropriation. Pluralism has given way to identity politics. Relativism has given way to speech codes. Humanism has given way to transhumanism, the union of human beings and machines.[5]
In the confusion, social commentators are scrambling to coin new terms to catch up with developments, e.g., “post-postmodernism,” “metamodernism,” “transpostmodernism,” “altermodernism,” and “performatism,” but all are fruit of relativism.[6]
Venturing outside the evangelical camp, we find substantial testimony to complement Veith’s portrayal. British professor Zygmunt Bauman (a Polish, Jewish expatriate) construed postmodernism in these terms:
The mistrust of human spontaneity, of drives, impulses and inclinations resistant to prediction and rational justification, has been all but replaced by the mistrust of unemotional, calculating reason. Dignity has been returned to emotions; legitimacy to the “inexplicable,” nay irrational, sympathies and loyalties which cannot “explain themselves” in terms of their usefulness and purpose… [In the postmodern world] things may happen that have no cause which made them necessary; and people do things which would hardly pass the test of accountable, let alone “reasonable,” purpose… We learn again to respect ambiguity, to feel regard for human emotions, to appreciate actions without purpose and calculable rewards. We accept that not all actions, and particularly not all among the most important of actions, need to justify and explain themselves to be worthy of our esteem.[7]
Of course, there is a place of honor in Christianity for emotions, spontaneity, and mystery, but when these are the ruling criteria, contemptuous of reasonableness, then we gut the faith “once for all delivered to the saints” as well as “the whole counsel of God.”
Unfortunately, postmodern relativism produces thuggery rather than a joyous festival down at Vanity Fair. Ohio State professor Brian McHale plays off Jean-François Lyotard’s characterization of postmodernism as “incredulity toward the master narratives of Western culture” as he presents Thomas Pynchon’s novel, Gravity’s Rainbow, as “a test case of postmodern incredulity, relentlessly questioning, opposing, and undermining cultural narratives about scientific knowledge and technological progress, about the nation and the people, about liberalism and democracy.” Its “[c]haracters’ epistemological quests succumb to ontological uncertainty in a world—a plurality of worlds—where nothing is stable or reliably knowable.” Rather, he says we need to put our faith in “little narratives” which support “small-scale separatist cultural enclaves.”[8] And so, armed with postmodern tools, academic departments, media empires, and even the military are bullied into honoring heretofore-considered-degenerate “cultural enclaves” as wonderful giftings and exemplars of treasured diversity, protected under pain of penalty.
Earlier, I mentioned Socrates’ reservation over the definition “justified true belief.” The problem was that you had to assume knowledge of certain things (items you raise in justification, e.g., “I’m sure the accused was in the mall that afternoon. I saw him there.”) in order to demonstrate that you knew other things, and so looms the threat of circularity. Well, indeed, there needs to be external grounding for our claims, items philosopher Alvin Plantinga has called “properly basic.” If we can’t agree on those matters, then we reach an impasse, and this destroys perhaps the main tool of analytical reasoning, the reductio ad absurdum (“reduction to absurdity”). On this model, a thinker will advance a fact-claim or alleged principle, and then his interlocutors will jump in to trace the implications. If these prove to be laughable or grotesque, then the assertion must be retooled or discarded for another try. The problem comes when the parties involved are unable to agree on what is laughable or grotesque. Take, for instance, the rejoinder to the claim that people can self-identify with a gender at odds with the chromosomal facts. When you show that this could mean that a young man might compete in women’s events at the Olympics, sane people would agree that you’ve blown up the transgender conceit. But there are those who would ask, “What’s your point? I don’t see a problem there.” And that is where we are today. A rare madness has fallen upon our nation, whereby unmasked fools are standing their ground and making public policy.
Critical Theory
American English professor Lois Tyson provides a crisp and enthusiastic account of critical theory’s realm and ethos:
Simply speaking, when we interpret a literary text, we are doing literary criticism; when we examine the criteria upon which our interpretation rests, we are doing critical theory… Of course, when we apply critical theories that involve a desire to change the world for the better—such as feminism, Marxism, African American criticism, lesbian/gay/queer criticism, and postcolonial criticism—we will sometimes find a literary work flawed in terms of its deliberate or inadvertent promotion of, for example, sexist, classist, racist, heterosexist, or colonialist values. But even in these cases, the flawed work has value because we can use it to understand how these repressive ideologies operate.[9]
She continues by working from the thought of Jacques Derrida, the French postmodernist who dismissed “structuralists,” those who saw universal commonalities in the way we grasp and construe the world (the sort of thing that could reflect and point to a created order). Rather, he magnified the variations, licensing human language (rather than the logos of John 1:1) to make a mockery of overarching accounts of reality.
[A]ll systems of Western philosophy derive from and are organized around one ground principle from which we believe we can figure out the meaning of existence… While these ground concepts produce our understanding of the dynamic evolving world around us—and of our dynamic, evolving selves as well—the concepts themselves remain stable. Unlike everything they explain, they are not dynamic and evolving… They are “out of play,” as Derrida would put it. This type of philosophy—in short, all Western philosophy—Derrida calls logocentric because it places at the center (centric) of this understanding of the world a concept (logos) that organizes and explains the world for us while remaining outside of the world it organizes and explains. But for Derrida, this is Western philosophy’s greatest illusion. Given that each grounding concept —Plato’s Forms, Descartes’ cogito, structuralism’s innate structures of human consciousness, and so on—is itself a human concept and therefore a product of human language, how can it be outside the ambiguities of language? That is, how can any concept be outside the dynamic, evolving, ideologically saturated operations of the language that produced it?
For Derrida, the answer is that no concept is beyond the dynamic instability of language, which disseminates (as a flower scatters its seed on the wind) an infinite number of possible meanings with each written or spoken utterance. For deconstruction, then, language is the ground of being, but that ground is not out of play; it is itself as dynamic, evolving, problematical, and ideologically saturated as the worldviews it produces. For this reason, there is no center to our understanding of existence; there are, instead, an infinite number of vantage points from which to view it, and each of these vantage points has a language of its own, which deconstruction calls its discourse. For example, there is the discourse of modern physics, the discourse of Christian fundamentalism, the discourse of liberal arts education in the 1990s, the discourse of nineteenth-century American medicine, and so on… For deconstruction, if language is the ground of being, then the world is infinite text, that is, an infinite chain of signifiers always in play.[10]
Again, relativism, albeit a tendentious and aggressive relativism.
Truth
With this in mind, let’s return to the three-part definition of knowledge, taking a closer look at how these elements have been undermined and dismissed in our culture. For starters, the traditional standard of truth is correspondence with reality, and it’s propositional: “The cat is on the mat” is true if the cat is on the mat.
So what’s the problem? Well, as Philip Dow, a Cambridge-educated administrator of a Christian school in Kenya, explains, postmodernism makes the pursuit of knowledge pointless:
Relativistic openness…undermines progress for the simple reason that progress assumes a goal. We only know we are making progress when we are getting closer to that goal. Take away the goal of truth and any talk of advancing becomes meaningless. All our attempts at moral, scientific or spiritual improvement simply become nonsense unless we believe that there are targets we are shooting for.[11]
Furthermore, it makes us prey to the notions of “my truth” and “your truth,” casting aside the sensible concept of the truth. Nevertheless, Middlebury professor Heidi Grasswick is all in on jettisoning objective knowledge, in effect dismissing Kepler’s notion that, in our studies, we should be concerned with “thinking God’s thoughts after Him”:
Analysis of testimony has formed one of the largest and most active areas of discussion in contemporary social epistemology. Feminists’ attention to the role of social power relations in the economics of credibility has provided a distinct angle from which to develop insightful descriptive and normative assessments of testimony across differently situated agents…The basic idea of socially situated knowing amounts to a denial of the traditional framing of the epistemic point of view as a “view from nowhere,” embracing instead the idea that knowing is inherently perspectival, with perspectives being tied to our materially and socially grounded position in the world.[12]
Biblical Regard for Truth
It’s obvious to any student of the Bible that truth is a non-negotiable feature of Christianity, from its grounding in Old Testament prophecy (where Amos pictures God holding a plumb line accusingly beside Israel’s morally crooked wall) on through the Gospels (where, in the Sermon on the Mount, Jesus repeatedly uses “truly” and “you have heard it said, but I say . . .” to set the record straight), the epistles (where, in 2 Timothy 3, Paul compares current enemies of the gospel to the truth-opposing Jannes and Jambres of Moses’s day), and Revelation 21 (where liars are consigned to “the lake that burns with fire and sulphur”). And, of course, we have Jesus’ explanation in John 8, that the devil is “the father of lies,” his declaration in John 14, “I am the way, and the truth, and the life,” and Paul’s teaching in 1 Corinthians 13, “Love…rejoices with the truth.” Scriptural testimony to the reality and value of truth is manifold.
Meaning
Of course, the possibility of a proposition’s being true depends upon the meaning of the words. When you say that the whale is a mammal, you need to have a reliable, exacting definition of “mammal.” And fastidiousness must extend beyond the glossary to punctuation, as underscored in the book title, Eats, Shoots, and Leaves.[13] (As it stands, you have a gunfighter extracting himself from a hostile saloon. Drop the commas, and you’re talking about a panda.)
Knowing that pesky matters of truth and falsity can wreck their enterprise, postmodernists and critical theorists can simply queer (in both senses) the issue upstream. Commandeer the language, and you avoid accountability. Consider the expression “begs the question.” It’s typically used as though it meant “raises the question,” as in “The advance of the polar ice sheet this year begs the question, ‘Is anthropogenic global warming a reality?’” However, the concept refers classically to unfairly front-end-loading the conclusion, often in the form of a “question-begging epithet”—a slur that rigs the conversation. Imagine, for instance, a survey that asks, “Do you oppose the tyrannical Texas law, robbing women of their right to choose their own path to reproductive health?” It seems as though the right answer would be Yes. But more dispassionate wording might shift the results. If you spoke more clinically about a fetal-heartbeat red line, you’d see more No’s.
Notice that both nouns (“health”) and adjectives (“tyrannical”) do heavy lifting in the original question. No, there’s nothing wrong per se in the use of highly charged words. No one should object to the sentence, “In territories under his control, the despotic Adolf Hitler implemented a policy of genocide against the Jews.” The problem comes when you assume the very thing you’re trying to demonstrate, either through specious definitions or super-charged modifiers. And both are stock-in-trade for critical theory.
A favorite suffix, serving both nouns and adjectives, derives from the Greek word for fear, phobos. It shows up in “homophobia” and “homophobic” and signals a malady. Consider the poor fellow who stays cooped up in his home, terrified of normal contact with folks at the mall (“agoraphobia”); who insists upon the statistically more dangerous highway for long trips, refusing to fly (“aerophobia”); or who clicks past Channel 13, feeling much safer watching Channel 14 (“triskaidekaphobia”). Even when the danger may be real in certain circumstances, e.g., for the “germaphobe,” the subject’s fear is judged irrational, ideally addressed by therapy. But when you label as a “phobia” a phenomenon warranting concern, revulsion, or indignation, you speak viciously, not judiciously. If, for instance, you raise the alarm over the erasure of gender identity and the abominable public policy implications that follow from it (e.g., with boys self-identifying as girls in the girls’ locker room), you’re dismissed as a “phobe” rather than a “guide,” a distinction whose soundness should be in play, not something to be bulldozed by raw stipulation.
One of the most breathtaking examples of linguistic bulldozing involves the construal of “racism” as beyond the capability of disadvantaged people. The traditional and plausible understanding of the term disparages those who judge people by the color of their skin rather than the content of their character (cf. Martin Luther King’s “I Have a Dream” speech). But when the prejudice flows upward rather than downward, it’s excused—whether from a financially struggling Malay toward the prosperous Chinese immigrant with a shop in the atrium; from a black custodian living on Chicago’s Near West Side toward the white building manager who enjoys better lodging on the city’s North Shore; or from Filipino contract workers serving as housekeepers in shimmering, high-rise condos in Dubai toward their Arab employers. This curious definition gives “underdogs” a blank check to despise, indiscriminately, Chinese, Anglos, and Arabs for being Chinese, Anglo, and Arab. Guilt-free racism, utterly un-Christian, yet touted even by some who call themselves Christian.
The list goes on and on: disagreement-discourse called “hate speech;” dispute-free zones called “safe-spaces;” straightforward speech labeled a “dog whistle,” implying subterfuge; “We need to have a conversation,” meaning “You need to meekly receive my authoritative lecture;” and “Just listen,” implying, “Just alter your behavior to accommodate my feelings and convictions,” as in “They doesn’t listen to me.” Of course, on many of these matters, we’ve been listening for centuries, even millennia, and those suggesting that we’ve not done our civilizational homework or are suffering from ethical and logical malformation are likely trading in insult and specious implication.
As the account goes, if you don’t “just listen,” you’re guilty of “testimonial injustice.” This “occurs when prejudice on the part of the hearer leads to the speaker receiving less credibility than he or she deserves.” And some would cast this offense as a failure of distributive justice: “If we think of credibility as a good (like wealth, healthcare, education or information), then it is natural to think that testimonial injustice consists in an unjust (or unfair) distribution of this good…”[14] Of course, that kicks the can down the road. You still have to determine whether the speaker is sagacious, befuddled, or mendacious. But the postmodernists have an answer: If and only if he’s marginalized, his account is important, and to ignore it is evil. For them, it’s obvious that you must grant some sort of epistemological equity to all voices so that no one is denied a seat of honor at the roundtable of adepts.
On the contrary, it’s reasonable to think that much marginalization is due to the bad epistemological choices the marginalized have made. That sounds harsh, but everyone—postmodernists included—must make such value choices. Consider the counsel of Tasmanian philosopher David Coady. He begins with a veneer of dispassionate wisdom, but then shows his esteem for the deliverances of wanton sexual passion:
Read More
What is Orthodox Protestantism? A Brief Response to Rod Dreher
Written by Carl R. Trueman | Monday, January 9, 2023
Institutional unity is important as a witness to the truth. I for one do think it ridiculous that in the USA alone there are numerous Presbyterian denominations who hold substantially the same doctrinal position but exist as separate institutional bodies. Yet even so, the problem of defining Protestant orthodoxy is not simply a Protestant problem.
Taking his cue from my recent article at First Things, Rod Dreher asks a most reasonable question: what is orthodox Protestantism?
The problem with defining the term is that orthodox Protestantism is, in one sense, an abstraction. It correlates with no single institution. Thus, the Roman Catholic is here at an advantage, at least in theory: orthodox Catholicism is what the Roman Catholic Church upholds as true and practices in her worship. The unity of the institution makes the question straightforward. As there is no single orthodox Protestant church, the question is inevitably more challenging.
The way I was using the term in the article was with reference to the points of consensus of the Protestant confessions of the sixteenth and seventeenth centuries. Thus, when one compares, say, the Lutheran Book of Concord with the various Reformed confessions, significant points of agreement emerge: on the Trinity, on the Incarnation, on the uniqueness and sufficiency of Christ for salvation. We might summarize this as agreement upon the creedal faith of the early church, refracted through the debates over sacraments, salvation, and ecclesiology in the Reformation. Significant points of antithesis do exist within Reformation Protestantism, particularly on the Lord’s Supper as a point of division between Lutherans and Reformed, but aside from this significant issue, there is a high degree of fundamental commonality.
When one looks specifically at the Reformed confessions, the consensus is even stronger. E.F.K. Müller’s collection of Reformed confessional documents, Die Bekenntnisschriften der reformierten Kirche, is fascinating in this regard: the documents are drawn from across Europe and represent the productions of churches in a wide variety of linguistic, political, and cultural contexts. Yet there is substantial unity on all major topics. From the doctrine of God through the Incarnation to grace, justification, the word of God, the church, sacraments, and the afterlife, a clear core of orthodox Protestant teaching is there, despite the diversity of contexts–a diversity arguably much greater than that represented by the bespoke diversities of today, given the sixteenth century’s lack of information technology, easy and efficient transportation, and pop-cultural unity (no international Manchester United Supporters’ Club in Luther’s day), things that are now commonplace in our globalized world.
Catholics will no doubt respond that I am offering a false unity here. I have chosen those texts that reflect the core of Christian belief I myself prefer and, by privileging them as normative, have granted Protestant orthodoxy a coherence that it did not possess then and does not possess now.
Read More
Praying When I Don’t Feel like It
Is it so mysterious that I am not growing? Couldn’t the main reason be my prayerlessness? We need to become focused on the encouragement of warm, loving times with our patient Savior—the One who does not cease whispering our names into the ears of His Father. We must deal with this besetting sin. We are not going to grow in Christlikeness until we meet regularly in the secret place with Jesus.
All Christians are being tested in their responsiveness and obedience to their own consciences. God has provided within them that great monitor of their conduct and behavior. So how is it between us and our consciences? Are we infinitely particular about paying attention to what our consciences say? Are we careful to educate those consciences because any conscience can be in darkness?
There are many Christian consciences that, in the words of Thomas Boston, are “too pernickety.” They condemn what God’s Word does not condemn, and Christians must educate their consciences. There are other consciences that can let everything pass, even those attitudes that God’s Word condemns. They are not as sensitive as they should be; they are too broad and too open. Our consciences must constantly stand under the scrutiny of God’s Word. The conscience must always be open to commending what God’s Word commends and condemning what God’s Word condemns. The Puritan Thomas Manton reminds us:
Conscience must be satisfied with something. So professing Christians can please themselves with giving to God as much obedience as is least contrary to their feelings and inclinations. Like a servant who obeys his master when he sends him on a mission to a fair or a feast, but deceives in errands that are more demanding. This man is satisfying self, his own inclinations. Such men are not so much serving God as their own interests.
Let us suppose that every reader of Tabletalk has an enlightened conscience. What allegiance are we showing to it? Are we careful to obey it when we are emotionally disinclined, when we lie in the depth of depression, or when we’re wallowing in self-pity? We know that there is a duty to attend to, but it is very unpleasant and unattractive, an unwelcome responsibility. Do we have the maturity to stand over our emotions and in the face of our feelings determine to attend to what God commands us to do, even though we are emotionally disinclined to do so?
There is no greater peril in the Christian life than to make our emotions the touchstone of our duties—in other words, to wait for the moment of inspiration before we obey our Lord, who has told us always to pray and not to faint.
Read More