What Does God Listen For?
Out of all the innumerable sounds in heaven and earth, God pays special attention to the voice of his people. Psalm 34 is not simply a theological statement of this fact—it is the personal testimony of David, when he was a fugitive running for his life. He celebrates his own experience of God hearing his cry for help.
Have you ever considered all the things you hear in the course of one day? This morning, I heard birds singing outside, and the voices of my family. I heard the coffee machine and the clink of plates and cutlery at breakfast. Right now, I’m hearing the noise of construction above the ever-present sounds of traffic and the occasional gust of wind. I haven’t even had lunch yet. There will be plenty more to fill my ears before this day is finished.
Have you ever considered all the things that God hears? The creator of sound waves hears the unceasing worship of angels before his throne. He hears the swirling wind of Jupiter and the ice that melts on Mars. He hears beyond what is audible to us—the ultrasonic songs of katydids and the footsteps of aphids. He hears beyond the limits of location—
You Might Also Like
Innovation Exists by God’s Design
He may use you to discover the genetic cure for cancer, or he may use you to weaponize a superravager, but he disposes of every innovator as he pleases in his wisdom. We are each accountable for our volitional decisions and our sins. But make no mistake—each and every one of us finally fulfills the Creator’s purpose for our lives.
Innovators Create by Divine Appointment
Many of the sharpest Christians, who rightly celebrate God’s providential governance over all things, tend to wrongly assume (in practice) that his reign ends somewhere around the boundary lines of Silicon Valley. In reality, innovators—both virtuous ones and nefarious ones—are created by God. Scripture protects us from the myth that God is trying his best to stifle and subdue the unwieldiness of human technology. No, for his own purposes God creates blacksmiths and warriors, both welders and wielders of new tools. Our most powerful innovators exist by divine appointment.
More troubling, many of the world’s most powerful technologists imagine that they have transcended their need for God. And it is their common agnosticism or atheism that explains why Christians today often adopt a negative view of technology. The godlessness of Elon Musk reminds us that the closer you approach Silicon Valley, the fewer Christians you’ll find. The percentage of professing evangelical adults in the US (25.4 percent) drops in California (20 percent) and plummets in San Francisco (10 percent). And the percentage of adults who read Scripture at least once a week in the US (35 percent) sinks in California (30 percent) and plunges in San Francisco (18 percent).1 We assume that God must be withdrawn from such a place. But Isaiah corrects this assumption. The pagan societies where the ancient smith and ravager operated make San Francisco look like it’s part of the Bible Belt.
The rejection of God and the accumulation of innovative brilliance don’t give you the power to operate apart from God, like a queen on a chessboard who thinks she can move anywhere she wants, impervious to the Master’s ultimate plan. Your innovative brilliance is how God is choosing to wield you in the world.
How Hitler and a Boring Sermon Awakened C.S. Lewis’s Demons
If you’ve never read The Screwtape Letters, I’d encourage you to grab a copy. The book consists of 31 letters from a senior demon, Screwtape, to his junior-demon nephew, Wormwood. This is arguably Lewis’s most influential work. You can read it in a month, just doing one letter a day. The letters give powerful insights into what it feels like to be tempted in a fallen world and the glory that awaits believers on the other side. I’m starting a new podcast this month called “Mere Caffeination.”
It was a hot, dry summer in 1940 in Oxford, England. That ended in July, when the heavens opened with deluging rainfall. It must have been a wet Saturday evening when C.S. Lewis, the man who would himself take to speaking over the radio in the very near future, turned his own radio on and tuned in to listen to an influential political speech. History was being made in more ways than one.
“In looking back upon the last ten months we are all struck by the grace of Providence that has allowed us to succeed in our great work,” the speaker’s voice proclaimed through the crackly speakers. “Providence has blessed our great resolves and guided us in our difficult matters. As for myself, I am deeply moved, realizing that Providence has called on me to restore to my people their freedom and honor.”
Lewis admitted to being affected by the rhetoric. “I don’t know if I’m weaker than other people,” Lewis said, “but it is a positive revelation to me how, while the speech lasts, it is impossible not to waver just a little.” Lewis wrote these words describing how it felt to hear what has been described as Hitler’s last appeal to Britain to remove itself from the war before he promised to unleash hell. Within a couple of months it would be far more than rain falling from the English sky.
In his speech, Hitler claimed to be the voice of reason pleading for common sense. It was Churchill who was evil and illogical, Hitler claimed, referencing the United Kingdom’s Prime Minister no less than fourteen times in the address.
The Counterfeit, Anti-Biblical Epistemologies of Postmodernism and Critical Theory
Though the devotees of postmodernism and critical theory love to sport epistemological terminology, they betray its essence at every turn. Truth is a fiction. Justification is a waste of time. Belief is purely optional. Their epistemology is counterfeit.
Among the world’s currencies, some are strong, others weak. And yes, there are the parasitic counterfeits. These pretenders can do a lot of damage, trading on another’s good name. Albert Talton is a case in point: using only a standard inkjet printer in the early 2000s, he managed to produce seven million dollars’ worth of phony one-hundred-dollar bills, circulating many of them before going to jail in 2009. Unfortunately—even tragically—postmodernism and critical theory have generated epistemological counterfeits that have beguiled and bankrupted much of our culture.
Knowledge
Treasury agents are trained to spot counterfeits by first scrutinizing the real thing, and so we shall begin with the classic definition of “knowledge”—what it is, how you get it, and how you can be confident you have it—the subject of epistemology. The formula traces back to Plato, who, in the Theaetetus, has Socrates identifying it as “correct belief” together with “an account” of why the judgment is made.[1] Socrates hesitated to endorse it, since, as worded, it was circular, including knowledge of supporting evidence in the definition of “knowledge.” But the core notion endured, thanks in large measure to the identification of the need for and availability of foundational, epistemic premises, whether empirical or rationalistic. So, we press on with the ancient characterization, today expressed as “justified true belief.”
Of course, all sorts of philosophical analyses have challenged and refined the definition. For instance, we contrast propositional “knowledge that” with practical “knowledge how” (e.g., knowing how to ride a bike), and a fellow named Edmund Gettier came up with ingenious counterexamples in the 1960s, in which all three elements were present but knowledge was still absent—prompting philosophers to rise in defense of the received concept.[2] But there is a strange new assault on it, mounted by purveyors of postmodernism and critical theory.
Just as Christian Science is neither Christian nor scientific, critical theory is hostile to critical thinking, and it commends a posture, not a theory. A genuine theory, such as plate tectonics, generates testable, falsifiable hypotheses, in this instance seabed fissures oozing magma and continual earthquakes along the “Ring of Fire.” But the “theory” in critical theory is a snide conceit, immune—yea, hostile—to rational pushback. It’s the very antithesis of judicious inquiry, the practice that has prospered the Judeo-Christian West. Indeed, it attempts to lay the ax to the root of the best in our civilization, nullifying the truths of the created order laid out in the opening chapters of Genesis.
So, back to the definition, as it relates to a given proposition:
If it’s true and warranted, but I don’t believe it, then I don’t know it. (Think of an atheist actor mouthing the lines of a faithfully biblical sermon.)
If it’s true and I believe it, but I lack good reasons for my belief, then I don’t know it. (A hypochondriacal hysteric can get things right now and then, even when his self-diagnosis is based on the flimsiest of evidence.)
If my belief is warranted, but the proposition turns out to be false, then I didn’t know it. (Such is the case when I’m deceived by a typically reliable, but currently addled, source.)
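In the shorthand of epistemology textbooks (a standard notation, not this author’s own), the classical analysis compresses into a single biconditional, with the three cases above corresponding to striking out one conjunct at a time:

$$K_S\,p \iff p \;\wedge\; B_S\,p \;\wedge\; J_S\,p$$

Here $K_S\,p$ reads “S knows that p,” $B_S\,p$ “S believes that p,” and $J_S\,p$ “S is justified in believing that p.”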
So, again: Justified. True. Belief. Sad to say, these three are cast aside today by cultural patricians and plebeians alike under the postmodernist spell.
Anti-Knowledge
So what is casting these spells?
Postmodernism
As Gene Veith demonstrated in his 1994 book, Postmodern Times,[3] postmodernism boils down to relativism and pluralism, which have replaced modernism, whose god was the latest deliverance of scientific materialism. The chaos has now been nurtured by new technologies, a topic Veith takes up in Post Christian: “Individuals can latch onto the ‘truths’ (often put into quotation marks today) that they want to believe in or that accords with their will to power (the will taking the place of the intellect; power taking the place of reason).”[4]
Postmodern Times discussed the sexual revolution in terms of extramarital sex; now the issues are homosexuality, pornography, and sex robots. In the 1990s we were deconstructing literature; in the twenty-first century we are deconstructing marriage. In the 1990s we were constructing ideas; in the twenty-first century we are constructing the human body. In the 1990s we had feminism; in the twenty-first century we have transgenderism. In the 1990s we were urged to embrace multiculturalism; in the twenty-first century we are warned about committing cultural appropriation. Pluralism has given way to identity politics. Relativism has given way to speech codes. Humanism has given way to transhumanism, the union of human beings and machines.[5]
In the confusion, social commentators are scrambling to coin new terms to catch up with developments, e.g., “post-postmodernism,” “metamodernism,” “transpostmodernism,” “altermodernism,” and “performatism,” but all are fruit of relativism.[6]
Venturing outside the evangelical camp, we find substantial testimony to complement Veith’s portrayal. British professor Zygmunt Bauman (a Polish, Jewish expatriate) construed postmodernism in these terms:
The mistrust of human spontaneity, of drives, impulses and inclinations resistant to prediction and rational justification, has been all but replaced by the mistrust of unemotional, calculating reason. Dignity has been returned to emotions; legitimacy to the “inexplicable,” nay irrational, sympathies and loyalties which cannot “explain themselves” in terms of their usefulness and purpose… [In the postmodern world] things may happen that have no cause which made them necessary; and people do things which would hardly pass the test of accountable, let alone “reasonable,” purpose… We learn again to respect ambiguity, to feel regard for human emotions, to appreciate actions without purpose and calculable rewards. We accept that not all actions, and particularly not all among the most important of actions, need to justify and explain themselves to be worthy of our esteem.[7]
Of course, there is a place of honor in Christianity for emotions, spontaneity, and mystery, but when these are the ruling criteria, contemptuous of reasonableness, then we gut the faith “once for all delivered to the saints” as well as “the whole counsel of God.”
Unfortunately, postmodern relativism produces thuggery rather than a joyous festival down at Vanity Fair. Ohio State professor Brian McHale plays off Jean-François Lyotard’s characterization of postmodernism as “incredulity toward the master narratives of Western culture” as he presents Thomas Pynchon’s novel, Gravity’s Rainbow, as “a test case of postmodern incredulity, relentlessly questioning, opposing, and undermining cultural narratives about scientific knowledge and technological progress, about the nation and the people, about liberalism and democracy.” Its “[c]haracters’ epistemological quests succumb to ontological uncertainty in a world—a plurality of worlds—where nothing is stable or reliably knowable.” Rather, McHale says, we need to put our faith in “little narratives” which support “small-scale separatist cultural enclaves.”[8] And so, armed with postmodern tools, academic departments, media empires, and even the military are bullied into honoring heretofore-considered-degenerate “cultural enclaves” as wonderful giftings and exemplars of treasured diversity, protected under pain of penalty.
Earlier, I mentioned Socrates’ reservation over the definition “justified true belief.” The problem was that you had to claim to know certain things (items you raise in justification, e.g., “I’m sure the accused was in the mall that afternoon. I saw him there.”) in order to demonstrate that you knew other things, and so looms the threat of circularity. Well, indeed, there needs to be external grounding for our claims, items philosopher Alvin Plantinga has called “properly basic.” If we can’t agree on those matters, then we reach an impasse, and this destroys perhaps the main tool of analytical reasoning, the reductio ad absurdum (“reduction to absurdity”). On this model, a thinker will advance a fact-claim or alleged principle, and then his interlocutors will jump in to trace the implications. If these prove to be laughable or grotesque, then the assertion must be retooled or discarded for another try. The problem comes when the parties involved are unable to agree on what is laughable or grotesque. Take, for instance, the rejoinder to the claim that people can self-identify with a gender at odds with the chromosomal facts. When you show that this could mean that a young man might compete in women’s events at the Olympics, sane people would agree that you’ve blown up the transgender conceit. But there are those who would ask, “What’s your point? I don’t see a problem there.” And that is where we are today. A rare madness has fallen upon our nation, whereby unmasked fools are standing their ground and making public policy.
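Spelled out in logical shorthand (my formalization, not the author’s), the reductio just described is modus tollens run on an agreed absurdity:

$$(P \rightarrow A) \;\wedge\; \neg A \;\vdash\; \neg P$$

where $P$ is the advanced claim, $A$ the implication traced from it, and $\neg A$ the shared judgment that $A$ is absurd. The inference stalls at the second premise when interlocutors refuse to grant that $A$ is absurd, which is precisely the impasse described above.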
Critical Theory
American English professor Lois Tyson provides a crisp and enthusiastic account of critical theory’s realm and ethos:
Simply speaking, when we interpret a literary text, we are doing literary criticism; when we examine the criteria upon which our interpretation rests, we are doing critical theory… Of course, when we apply critical theories that involve a desire to change the world for the better—such as feminism, Marxism, African American criticism, lesbian/gay/queer criticism, and postcolonial criticism—we will sometimes find a literary work flawed in terms of its deliberate or inadvertent promotion of, for example, sexist, classist, racist, heterosexist, or colonialist values. But even in these cases, the flawed work has value because we can use it to understand how these repressive ideologies operate.[9]
She continues by working from the thought of Jacques Derrida, the French postmodernist who dismissed “structuralists,” those who saw universal commonalities in the way we grasp and construe the world (the sort of thing that could reflect and point to a created order). Rather, he magnified the variations, licensing human language (rather than the logos of John 1:1) to make a mockery of overarching accounts of reality.
[A]ll systems of Western philosophy derive from and are organized around one ground principle from which we believe we can figure out the meaning of existence… While these ground concepts produce our understanding of the dynamic evolving world around us—and of our dynamic, evolving selves as well—the concepts themselves remain stable. Unlike everything they explain, they are not dynamic and evolving… They are “out of play,” as Derrida would put it. This type of philosophy—in short, all Western philosophy—Derrida calls logocentric because it places at the center (centric) of this understanding of the world a concept (logos) that organizes and explains the world for us while remaining outside of the world it organizes and explains. But for Derrida, this is Western philosophy’s greatest illusion. Given that each grounding concept—Plato’s Forms, Descartes’ cogito, structuralism’s innate structures of human consciousness, and so on—is itself a human concept and therefore a product of human language, how can it be outside the ambiguities of language? That is, how can any concept be outside the dynamic, evolving, ideologically saturated operations of the language that produced it?
For Derrida, the answer is that no concept is beyond the dynamic instability of language, which disseminates (as a flower scatters its seed on the wind) an infinite number of possible meanings with each written or spoken utterance. For deconstruction, then, language is the ground of being, but that ground is not out of play; it is itself as dynamic, evolving, problematical, and ideologically saturated as the worldviews it produces. For this reason, there is no center to our understanding of existence; there are, instead, an infinite number of vantage points from which to view it, and each of these vantage points has a language of its own, which deconstruction calls its discourse. For example, there is the discourse of modern physics, the discourse of Christian fundamentalism, the discourse of liberal arts education in the 1990s, the discourse of nineteenth-century American medicine, and so on… For deconstruction, if language is the ground of being, then the world is infinite text, that is, an infinite chain of signifiers always in play.[10]
Again, relativism, albeit a tendentious and aggressive relativism.
Truth
With this in mind, let’s return to the three-part definition of knowledge, taking a closer look at how these elements have been undermined and dismissed in our culture. For starters, the traditional standard of truth is correspondence with reality, and it’s propositional: “The cat is on the mat” is true if, and only if, the cat is on the mat.
So what’s the problem? Well, as Philip Dow, a Cambridge-educated administrator of a Christian school in Kenya, explains, postmodernism makes the pursuit of knowledge pointless:
Relativistic openness…undermines progress for the simple reason that progress assumes a goal. We only know we are making progress when we are getting closer to that goal. Take away the goal of truth and any talk of advancing becomes meaningless. All our attempts at moral, scientific, or spiritual improvement simply become nonsense unless we believe that there are targets we are shooting for.[11]
Furthermore, it makes us prey to the notions of “my truth” and “your truth,” casting aside the sensible concept of the truth. Nevertheless, Middlebury professor Heidi Grasswick is all in on jettisoning objective knowledge, in effect dismissing Kepler’s notion that, in our studies, we should be concerned with “thinking God’s thoughts after Him”:
Analysis of testimony has formed one of the largest and most active areas of discussion in contemporary social epistemology. Feminists’ attention to the role of social power relations in the economics of credibility has provided a distinct angle from which to develop insightful descriptive and normative assessments of testimony across differently situated agents…The basic idea of socially situated knowing amounts to a denial of the traditional framing of the epistemic point of view as a “view from nowhere,” embracing instead the idea that knowing is inherently perspectival, with perspectives being tied to our materially and socially grounded position in the world.[12]
Biblical Regard for Truth
It’s obvious to any student of the Bible that truth is a non-negotiable feature of Christianity, from its grounding in Old Testament prophecy (where Amos pictures God holding a plumb line accusingly beside Israel’s morally crooked wall) on through the Gospels (where, in the Sermon on the Mount, Jesus repeatedly uses “truly” and “you have heard it said, but I say . . .” to set the record straight), the epistles (where, in 2 Timothy 3, Paul compares current enemies of the gospel to the truth-opposing Jannes and Jambres of Moses’s day), and Revelation 21, where liars are consigned to “the lake that burns with fire and sulphur.” And, of course, we have Jesus’ explanation in John 8, that the devil is “the father of lies,” his declaration in John 14, “I am the way, and the truth, and the life,” and Paul’s teaching in 1 Corinthians 13, “Love…rejoices with the truth.” Scriptural testimony to the reality and value of truth is manifold.
Meaning
Of course, the possibility of a proposition’s being true depends upon the meaning of the words. When you say that the whale is a mammal, you need to have a reliable, exacting definition of “mammal.” And fastidiousness must extend beyond the glossary to punctuation, as underscored in the book title Eats, Shoots & Leaves.[13] (As it stands, you have a gunfighter extracting himself from a hostile saloon. Drop the comma, and you’re talking about a panda.)
Knowing that pesky matters of truth and falsity can wreck their enterprise, postmodernists and critical theorists simply queer (in both senses) the issue upstream. Commandeer the language, and you avoid accountability. Consider the expression “begs the question.” It’s typically cast as “raises the question,” as in “The advance of the polar ice sheet this year begs the question, ‘Is anthropogenic global warming a reality?’” However, the concept classically refers to unfairly front-loading the conclusion, often in the form of a “question-begging epithet”—a slur that rigs the conversation. Imagine, for instance, a survey that asks, “Do you oppose the tyrannical Texas law, robbing women of their right to choose their own path to reproductive health?” It seems as though the right answer would be Yes. But more dispassionate wording might shift the results. If you spoke more clinically about a fetal-heartbeat red line, you’d see more No’s.
Notice that both nouns (“health”) and adjectives (“tyrannical”) do heavy lifting in the original question. No, there’s nothing wrong per se in the use of highly charged words. No one should object to the sentence, “In territories under his control, the despotic Adolf Hitler implemented a policy of genocide against the Jews.” The problem comes when you assume the very thing you’re trying to demonstrate, either through specious definitions or super-charged modifiers. And both are stock-in-trade for critical theory.
A favorite suffix, serving both nouns and adjectives, derives from the Greek word for fear, phobos. It shows up in “homophobia” and “homophobic” and signals a malady. Consider the poor fellow who stays cooped up in his home, terrified of normal contact with folks at the mall (“agoraphobia”); who insists upon the statistically more dangerous highway for long trips, refusing to fly (“aerophobia”); or who clicks past Channel 13, feeling much safer watching Channel 14 (“triskaidekaphobia”). Even when the danger may be real in certain circumstances, e.g., for the “germaphobe,” the subject’s fear is judged irrational, ideally addressed by therapy. But when you label as a “phobia” a phenomenon warranting concern, revulsion, or indignation, you speak viciously, not judiciously. If, for instance, you raise the alarm over the erasure of gender identity and the abominable public policy implications that follow from it (e.g., with boys self-identifying as girls in the girls’ locker room), you’re dismissed as a “phobe” rather than a “guide,” a distinction whose soundness should be in play, not something to be bulldozed by raw stipulation.
One of the most breathtaking examples of linguistic bulldozing involves the construal of “racism” as something of which disadvantaged people are incapable. The traditional and plausible understanding of the term disparages those who judge people by the color of their skin rather than the content of their character (cf. Martin Luther King’s “I Have a Dream” speech). But when the prejudice flows upward rather than downward, it’s excused—whether from a financially struggling Malay toward the prosperous Chinese immigrant with a shop in the atrium; from a black custodian living on Chicago’s Near West Side toward the white building manager who enjoys better lodging on the city’s North Shore; or from Filipino contract workers serving as housekeepers in shimmering, high-rise condos in Dubai toward their Arab employers. This curious definition gives “underdogs” a blank check to despise, indiscriminately, Chinese, Anglos, and Arabs for being Chinese, Anglo, and Arab. It is guilt-free racism, utterly un-Christian, yet touted even by some who call themselves Christian.
The list goes on and on: disagreement-discourse called “hate speech”; dispute-free zones called “safe spaces”; straightforward speech labeled a “dog whistle,” implying subterfuge; “We need to have a conversation,” meaning “You need to meekly receive my authoritative lecture”; and “Just listen,” implying, “Just alter your behavior to accommodate my feelings and convictions,” as in “They don’t listen to me.” Of course, on many of these matters, we’ve been listening for centuries, even millennia, and those suggesting that we’ve not done our civilizational homework or are suffering from ethical and logical malformation are likely trading in insult and specious implication.
As the account goes, if you don’t “just listen,” you’re guilty of “testimonial injustice.” This “occurs when prejudice on the part of the hearer leads to the speaker receiving less credibility than he or she deserves.” And some would cast this offense as a failure of distributive justice: “If we think of credibility as a good (like wealth, healthcare, education or information), then it is natural to think that testimonial injustice consists in an unjust (or unfair) distribution of this good…”[14] Of course, that kicks the can down the road. You still have to determine whether the speaker is sagacious, befuddled, or mendacious. But the postmodernists have an answer: If and only if he’s marginalized, his account is important, and to ignore it is evil. For them, it’s obvious that you must grant some sort of epistemological equity to all voices so that no one is denied a seat of honor at the roundtable of adepts.
On the contrary, it’s reasonable to think that much marginalization is due to the bad epistemological choices the marginalized have made. That sounds harsh, but everyone—postmodernists included—must make such value choices. Consider the counsel of Tasmanian philosopher David Coady. He begins with a veneer of dispassionate wisdom, but then shows his esteem for the deliverances of wanton sexual passion: