The dirty (big) secret of capital

1. In the Confessions, Rousseau famously describes his secret desire as a child of eight for the punishment given to him by a nursemaid, whose hand “determined my tastes, my desires, my passions, myself for the rest of my life”, and recounts that, when he entered puberty, “tormented for a long time without knowing by what, I devoured beautiful women with an ardent eye; solely to make use of them in my fashion, and to make so many Mlle Lamberciers out of them”. After the first instance, Rousseau “required all the truth of that affection [for Mme. Lambercier] and all my natural goodness to keep me from seeking the repetition of the same treatment by deserving it: for I had found in the suffering, even in the shame, an admixture of sensuality which had left me with more desire than fear to experience it a second time from the same hand”. The spanking would occur only one other time, after which Rousseau and his brother, who had previously slept in her room, were sent to sleep in a separate room, an honor with which, regrettably, he “could very well have dispensed”: that of “being treated by her as a big boy”.

Rousseau’s infatuation with older women would continue into his teenage years when, at about the age of sixteen or seventeen, inflamed by desire and fantasies of women, and yet unwilling to act, he would instead skulk in “dark alleys [and] hidden nooks where I could expose myself from afar to persons of the opposite sex”. However, Rousseau immediately notes that he “would not dream” of flashing them the “obscene object”; rather, they saw “the ridiculous object”, which had been spanked as a child, and “the foolish pleasure I had in displaying it to their eyes cannot be described. There was only one step to take from that to feeling the desired treatment, and I do not doubt that some bold one would have given me this amusement while passing by, if I had had the audacity to wait” (one can only imagine Rousseau giggling and scurrying away).

Rousseau wants for no audacity in these confessions, admitting that the memory of pissing, as a child, into the cooking pot of a neighbor while she was at church “still makes me laugh”. Rousseau understands that, as Foucault argues, those who enjoin us to confess “what one is and what one does … what one is thinking and what one thinks he is not thinking—are [not] speaking to us of freedom”. Unlike the priestly confession, however, Rousseau’s confessions lack the sacramental seal of shame and humility and, thus, the “shimmering mirage” (Foucault) of the truth between the confessor’s words. There are only the words and a defiant smirk; Rousseau never becomes “the subject of the statement” for one who prescribes the ritual of confession and who is thus liberated by it (compare, for example, the objections to the misunderstandings of his work in the Reveries and Dialogues). Rousseau, of course, was fully aware of the dialectic of liberation and subjection (e.g., in the famous statement of bondage in The Social Contract) and affirms their identity-in-difference by his insistence that the truth of his confessions lies not in what is meant by his words but simply in what is said (“I have nothing to hide”).

2. In an essay made famous by Auerbach, Montaigne admits that “I very rarely repent, and that my conscience is satisfied with itself, not as the conscience of an angel, or that of a horse, but as the conscience of a man”. The angel’s will is immovable, Aquinas says, and so the virtues that satisfy us would hold no interest for a higher nature. Sin, “which is lodged in us as in its own proper habitation”, thus admits of no true repentance: “one may disown and retract the vices that surprise us, and to which we are hurried by passions; but those which by a long habit are rooted in a strong and vigorous will are not subject to contradiction [and thus no repentance]. Repentance is no other but a recanting of the will and an opposition to our fancies, which lead us which way they please” (emphasis added). Thus the true moral dictate is not that of repentance but of sincerity, particularly in the face of the contingencies of our nature and our fate. We cannot reveal ourselves in our essential truth:

I cannot fix my object; ‘tis always tottering and reeling by a natural giddiness … I do not paint its being, I paint its passage … I must accommodate my history to the hour: I may presently change, not only by fortune but also by intention. ‘Tis a counterpart of various and changeable accidents, and of irresolute imaginations, and, as it falls out, sometimes contrary: whether it be that I am then another self, or that I take subjects by other circumstances and considerations: so it is that I may peradventure contradict myself, but, as Demades said, I never contradict the truth. Could my soul once take footing, I would not essay but resolve: but it is always learning and making trial [emphases added].

The self-representation that Montaigne offers – as a representation of the human condition or “my universal being” – therefore admits of no “inner” truth whose general form is inaccessible to others. That which is admired or reviled of our public semblance is of less consequence than the mundane habits of our private life (no one is a hero to the chambermaid, Montaigne observes). The truth of a life lies not in its honors, deeds, or ideals – and much less in its approbations and validations – in short, not in its truth but in its inanity. The most for which one can hope is neither rightness nor redemption but the sincerity of speaking of one’s “ill-fashioned” nature, of which Montaigne says that “if I had to model him anew, I should certainly make something else than what he is but that’s past recalling” – spoken not from the regret of what might have been but from the tranquility of an ordinary life.

3. What Montaigne never saw, however, are the conditions of modern life that not only generate the compulsory demands of truth but also erect the structures that render the most ordinary truths about ourselves at once unspeakable and alienating, even as those truths express perhaps the fundamental truth of capital.

Lazzarato has described the asignifying semiotics of the economy that “act on things. They connect an organ, a system of perception, an intellectual activity, and so on, directly to a machine, procedures, and signs, bypassing the representations of a subject … Stock market indices, unemployment statistics, scientific diagrams and functions, and computer languages produce neither discourses nor narratives” and act directly on the material flows that comprise the fundamental ontology of capital, bypassing the classical subjects of knowledge or labor. Lazzarato’s analysis thus indicates that to grasp the truth of capital we must look neither to its meaning nor to its content (e.g., in alienation) but to its form:

what matters to capitalism is controlling the asignifying semiotic apparatuses (economic, scientific, technical, stock-market, etc.) through which it aims to depoliticize and depersonalize power relations. The strength of asignifying semiotics lies in the fact that, on the one hand, they are forces of ‘automatic’ evaluation and measurement and, on the other hand, they unite and make ‘formally’ equivalent heterogeneous spheres of asymmetrical force and power by integrating them into and rationalizing them for economic accumulation.

Individuals are thus de-subjectivized and dissolved by these apparatuses; “if our societies are no longer based on individuals, they are not based on language either” (as Nietzsche observed, we have rid ourselves of neither God nor our selves because we still believe in grammar).

Lazzarato’s insight can be generalized: the autonomy of capital from the individual is at once ontological, semiotic, and logical. This truth of capital is one that can be neither represented nor spoken in the language of capitalism except through the cultural (hence “unofficial”) prohibitions on revealing the most ordinary and ubiquitous facts about ourselves. We are enjoined, for example, never to ask what someone else makes nor to volunteer that information; we are compelled to hide the truth. Of course, this practice serves familiar bourgeois interests of management and preserves the importance of pecuniary conspicuousness described by Veblen. But, more than that, this fact about ourselves can only be expressed as both a confession and a penitence, given that, no matter what our answer, we must face the shame that it is insufficient or the guilt that it is too much. We can never give a right answer since, of course, the truth that we are obliged to reveal is not a truth about us at all; it is a truth about the indifference of capital to the value of a human life, which cannot be expressed by capitalism and yet must be constructed as the only truth about the individual that matters (“what do you do?”), since it is the only truth that can be encoded into the signifying apparatuses of its machines. As Foucault observed, rather than being a rebellion against the repressive demand to stay silent, our confession produces the structures of power that render the truth unspeakable in the first place. The intolerable presumption of capital is that it foists its secret upon us while demanding at every turn that we wear it on our sleeves; unlike Rousseau, however, we do not have the luxury of insolence.

From farce to tragedy

1a. In a remarkable letter written five years after his presidency, Madison praises the state of Kentucky for its commitment to the provision of public education, observing—in language that precedes by thirty years the more famous phrase of Marx (and Engels’ independent formulation)—that “a popular government, without popular information, or the means of acquiring it, is but a prologue to a farce or a tragedy; or, perhaps both”. Madison’s enthusiasm is directed specifically at the fact that the state is constructing a plan for education

“embracing every class of citizens, and every grade and department of knowledge. No error is more certain than the one proceeding from a hasty and superficial view of the subject: [i.e.,] that the people at large have no interest in the establishment of academies, colleges, and universities, where a few only, and those not of the poorer classes can obtain for their sons the advantages of superior education. It is thought to be unjust that all should be taxed for the benefit of a part, and that too the part least needing it.”

Here Madison has his vision fixed a century into the future since, prior to the beginning of the twentieth century, fewer than two percent of the population received schooling beyond high school (these being, naturally, the sons of wealthy landowners). He continues:

“If provision were not made at the same time for every part [of society], the objection would be a natural one. … It is better for the poorer classes to have the aid of the richer by a general tax on property, than that every parent should provide at his own expense for the education of his children. It is certain that every class is interested in establishments which give to the human mind its highest improvements, and to every country its truest and most durable celebrity. Learned institutions ought to be favorite objects with every free people. They throw that light over the public mind which is the best security against crafty and dangerous encroachments on the public liberty.”

It should come as no surprise, then, that the most vociferous opponents of higher education today are also those in the process of retracting the promises of civil liberties for which our predecessors suffered in their very lives and bodies, whether by proposing the largest cuts to state funding for education in the history of this country or by explicit denunciations of “the academic left” (which follow an easily identifiable historical genealogy from the infamous McCarthy hearings).

But in addition to the arguments advanced a century later by Dewey to the effect that the possibility of democracy is predicated on an educated citizenry, Madison also observes that such governments require not mere politicians but statesmen:

“[Schools] multiply the educated individuals from among whom the people may elect a due portion of their public agents of every description; more especially of those who are to frame the laws; by the perspicuity, the consistency, and the stability, as well as by the just and equal spirit of which the great social purposes are to be answered.”

The democratic provision of the public good, however, requires more than the presently favorable desires of majority opinion. Representation is not of majority opinion; rather, majority opinion ends at representation, and the task of the statesman is to deliberate about the possibilities of justice in the face of present needs. Here Madison agrees with Plato: the statesman requires a specific form—and not a specific content—of knowledge, which had, broadly speaking, been the task entrusted to liberal education: not the reception of information but the capacity to ask, frame, and understand important (viz., ethical and political) questions. (One of the primary complaints of contemporary educators is the inability of students to “think critically”, i.e., to frame appropriate questions, identify their stakes, and establish criteria for their resolution.) Instead of the “right to have an opinion”, education demands that the right be earned by the capacity to know how to ask the right questions.**

**This too was Dewey’s point in an address to a conference of scientists: “the trouble with much of what is called popularization of knowledge is that it is content with diffusion of information, in diluted form, merely as information [think of the “intelligence” required to participate in Jeopardy!]. It needs to be organized and presented in its bearing upon action” (i.e., as system). That, Dewey insisted to the end of his career, is the “supreme intellectual obligation”: to mobilize knowledge as knowledge and not mere information for moral and social improvement. If there is anything pragmatism understood correctly—and what its critics have misunderstood—it is that knowledge is useful when it is true (it is not, as the more decadent pragmatists would say, true because it is useful).

This critical capacity, Madison argues, must be acquired broadly under pain of plutocracy:

“Without such institutions, the more costly of which can scarcely be provided by individual means, none but the few whose wealth enables them to support their sons abroad can give them the fullest education; and in proportion as this is done, the influence is monopolized which superior information everywhere possesses. … Whilst those who are without property, or with but little, must be peculiarly interested in a system which unites with the more learned institutions, a provision for diffusing through the entire society the education needed for the common purposes of life.”

Madison proceeds, again, to address a future he could not have foreseen, viz., one in which the US lags far behind several western European countries in economic mobility, with education as the one decisive factor (45% of people in the bottom fifth of the economy who do not graduate from college remain in their present economic location, whereas only 16% of those who graduate remain):

“Why should it be necessary in this case [of the provision of education] to distinguish the society into classes according to their property? When it is considered that the establishment and endowment of academies, colleges, and universities are a provision, not merely for the existing generation, but for succeeding ones also; that in governments like ours a constant rotation of property results from the free scope to industry [an observation unfortunately disqualified by the succeeding history of the republic] … and when it is considered moreover, how much of the exertions and privations of all are meant not for themselves, but for their posterity, there can be little ground for objections from any class, to plans of which every class must have its turn of benefits. The rich man, when contributing to a permanent plan for the education of the poor, ought to reflect that he is providing for that of his own descendants; and the poor man who concurs in a provision for those who are not poor that at no distant day it may be enjoyed by descendants from himself. It does not require a long life to witness these vicissitudes of fortune.”

Yet at no point does Madison advert to the propensity of education to improve the material lot of oneself or one’s family. At best, as Adler cogently argued, the material benefits of education are corollary or subsidiary: they are not its primary function. Madison again:

“Throughout the civilized world, nations are courting the praise of fostering science and the useful arts, and are opening their eyes to the principles and blessings of representative government. The American people owe it to themselves, and to the cause of free government [emphasis added], to prove by their establishments for the advancement and diffusion of knowledge, that their political institutions, which are attracting observation from every quarter … are as favorable to the intellectual and moral improvement of man as they are conformable to his individual and social rights. What spectacle can be more edifying or more seasonable, than that of liberty and learning, each leaning on the other for their mutual and surest support?”

If Madison is right about the mutual constitution of liberty and education, then the continuing and persistent degradation of liberty (ironically in the name of liberty itself, recognizable as such only to those who can no longer distinguish between reality and illusion) should come as no surprise. An analysis of transcripts from presidential debates found that, whereas the 1858 debates between Lincoln and Douglas were conducted at an eleventh- to twelfth-grade literacy level, the Gore-Bush debates of 2000 occurred at a sixth- (Bush) to seventh- (Gore) grade level. Political speech, in other words, is addressed to adults with the literate capacity of children.

1b. Madison ends his letter with the remark that, in addition to reading, writing, and arithmetic, provision should be made for the study of geography and astronomy since “no information seems better calculated to expand the mind and gratify curiosity than what would thus be imparted. This is especially the case, with what relates to the globe we inhabit, the nations among which it is divided, and the characters and customs which distinguish them. An acquaintance with foreign countries in this mode, has a kindred effect with that of seeing them as travelers, which never fails, in uncorrupted minds, to weaken local prejudices, and enlarge the sphere of benevolent feelings”. Against the clichés of humanistic education that claim to provide insight into “discovering oneself”, Madison’s point here is that we must always understand ourselves as situated in the world and that ours is one among many ways of seeing, doing, acting, and living. Absent cognizance of the world and its other inhabitants, we are easily tempted by the narcissism of enjoyment. “Any reading not of a vicious species,” Madison concludes, “must be a good substitute for the amusements too apt to fill up the leisure of the laboring classes”. The vulgarity of such amusements (in large part what contemporary theory calls “spectacle”) is intrinsic not to any particular content but to their familiar effects: e.g., the silencing of discourse, the banalization of injustice, and the sublimation (in the chemical sense) of ethics into enjoyment (i.e., the reverse of Freudian sublimation).

The two activities of leisure in both ancient Greek and western bourgeois society were none other than politics and education. Both required a certain autonomy from economic and material necessity. But rather than the reward of such freedom being the ability to “do nothing”, leisure imposed a grave duty, against which the ideology of “free time” has supplied seemingly inescapable means and opportunities for squandering it for the sake of enjoyment.

2a. In the Critique of Everyday Life, Lefebvre analyzed the ways in which the everyday, as the structural condition for life, is at the same time the principal way in which the modern individual is alienated from her life. While Lefebvre was encumbered by the simultaneous mobilization of the everyday as both an ontological and a sociological category, the Critique remains the standard account of the simultaneous collapse of leisure into the temporal repetitions of the everyday and the idealization of leisure as an escape from the everyday.

On the one hand, Lefebvre shows that the everyday is never simply given but constituted through the accretion of social and cultural signification.*** But he also shows (as Adorno and Horkheimer had also pointed out) that “the town and the factory complement one another by both conforming to the technical object [which Lefebvre, in the middle of the twentieth century, had already observed simply defined the everyday mode of existence]. An identical process makes work easy and passive, and life outside work fairly comfortable and boring. Thus everyday life at work and outside work become indistinguishable, governed as they are by systems of signals”. The word “signal” here is deliberate and appropriate: a signal, unlike a sign proper, has a meaning incapable of higher-order signification and functions as the structural equivalent of a Pavlovian response.

***I disagree, however, with a remark by one of my own teachers that, given this aspect of the everyday, as that which organizes experience and the world through certain spatio-temporal forms, it “becomes harder to endow it with an intrinsic political content. The everyday is robbed of much of its portentous symbolic meaning” (Felski). While on the one hand I accept her general corrective to the “hermeneutics of suspicion” endemic to cultural and critical theory, intrinsic to critical philosophy since Kant is the conviction that the primary (and perhaps only) task of thought is not to take its conditions as necessary or as (enabling) limits.

Lefebvre finds examples of such a network of signals and conditioned responses in mass media (again, remembering that he is writing these particular words in the late 1950s):

“Day in and day out, news, signs and significations roll over [the individual] like a succession of waves, churned out and repeated and already indistinguishable by the simple fact that they are pure spectacle: they are overpowering, they are hypnotic. The ‘news’ submerges viewers in a monotonous sea of newness and topicality which blunts sensitivity and wears down the desire to know. Certainly, people are becoming more cultivated. Vulgar encyclopedism is all the rage. The [sociological] observer may well suspect that when communication becomes incorporated in private life to this degree it becomes non-communication.”

Setting aside current concerns about “attention saturation” in cognitive psychology, Lefebvre goes on to describe the mechanisms of the alienation that results from the uncoupling of signification from significance:

“Radio and television do not penetrate the everyday solely in terms of the viewer. They go looking for it at its source: personalized (but superficial) anecdotes, trivial incidents, familiar little family events. They set out from an implicit principle: ‘Everything, in other words, anything at all, can become interesting and even enthralling, provided that it is presented …’ The art of presenting the everyday by taking it from its context, emphasizing it, making it appear unusual or picturesque and overloading it with meaning, has become highly skillful [Lefebvre has, in fact, described reality TV forty years before it existed]. … At the extreme looms the shadow of what we will call ‘the great pleonasm’: the unmediated passing immediately into the unmediated and the everyday recorded just as it is in the everyday—the event grasped, pulverized and transmitted as rapidly as light and consciousness—the repetition of the identical in a wild whirling dance devoid of Dionysian rapture, since the ‘news’ never contains anything really new.”

Lefebvre thought that this “extreme point” of closure between communication and information was “still a long way away”. It turns out, however, that thirty or forty years is not so long. “At one and the same time the mass media have unified and broadcast the everyday; they have disintegrated it by integrating it with ‘world’ current events in a way which is both too real and utterly superficial. What is more or less certain is that they are dissociating an acquired, traditional culture, the culture of books, from written discourse and Logos. We cannot say what the outcome of this destructuring process will be.” But it seems that we can: the impossibility of philistinism because of the total absence of a culture about which to be literate (a parody is no longer a parody when it cannot be understood as such).

The obsession with difference after May ’68 in French thought can be interpreted as a refusal of this eternal repetition of the same on which mass culture insists as both the cause and the cure for existential boredom. It is for this reason that Lefebvre calls for the critique of the everyday because “to know the everyday is to want to transform it. Thought can only grasp it and define it by applying itself to a project or programme of radical transformation. To study everyday life and to use that study as the guideline for gaining knowledge of modernity is to search for whatever has the potential to be metamorphosed … it is to understand the real by seeing it in terms of what is possible, as an implication of what is possible”. Despite the disagreements between Lefebvre and Goldmann, so too the latter would insist that “the possible is the fundamental category for comprehending human history. The great difference between positivist and dialectical sociology consists precisely in the fact that whereas the former is content to develop the most exact and meticulous possible photography of the existing society, the latter tries to isolate the potential consciousness in the society it studies: the potential [virtuelles], developing tendencies oriented toward overcoming that society. In short, the first tries to give an account of the functioning of existing structuration, and the second centers on the possibilities of varying and transforming social consciousness and reality”. Of course, these two enterprises are not opposed; the second is the consequence of the first, which shows us the necessity of such an overcoming. As Foucault would say—a point that critics of postmodernism such as Furedi have never grasped—the moment power/knowledge is grasped as historically constituted, it is recognized in its contingency, and the possibility of political action and change (Foucault’s word is “destruction”) is realized.

2b. Kant contra Hegel (and Nietzsche). In a series of what are generally regarded as minor texts, Kant anticipates the stark differences that would separate him from the idealism he resisted in Fichte and from what would become the absolutism of Hegel on the notion of history. Kant insists that history is not the continuous improvement of humanity; in short, we cannot say as a matter of fact that humanity is always improving. Rather, the perfectibility of humanity is a sort of regulative ideal of practical action: we must assume that the improvement of humanity is possible or else, if we were to believe that every triumph of virtue is simply negated by a corresponding tragedy, “it may perhaps be moving and instructive to watch such a drama for a while; but the curtain must eventually descend. For in the long run, it becomes a farce [emphasis added]. And even if the actors do not tire of it—for they are fools—the spectator does, for any single act will be enough for him if he can reasonably conclude from it that the never-ending play will go on in the same way for ever” (Kant rejects, in short, the doctrine of amor fati).

What neither Kant nor Nietzsche anticipated, however, were the ways in which nihilism would be made not only tolerable but the primary object of desire for civilizations in which no other alternatives are presented as either possible or necessary. Against the popular maxim, there are actually three inevitabilities: death, taxes, and inevitability itself, parceled out in distraction and enjoyment.

2c. In Kierkegaardian terms, Kant tries to establish within the structure of practical reason itself the priority of the ethical over the aesthetic. There is no existential decision to be made for Kant because the moral law is simply a fact of reason. On the one hand, Kierkegaard accepts Kant’s rejection of heteronomy: “the person who says that he wants to enjoy life always posits a condition that either lies outside the individual or is within the individual in such a way that it is not there by virtue of the individual himself”. But Kierkegaardian authenticity has nothing of the character of Kantian autonomy, if for no other reason than the singularity of the “infinitely concrete” self that does not exist prior to the absolute choice to be who one is. What leftist critics of Kierkegaard (and existentialism generally) resisted was the propensity for the certitude of authenticity to remain inner, in the complicity of the ethical self with an aestheticized existence, even if such aestheticism is transformed into the spiritual immolation of guilt.

Ethical guilt leads in the opposite direction from political action, which is predicated not on the identity of the subject but, rather, on the dereliction of subjective pride before the suffering of others (even if one is oneself the subject of oppression) in what Lévinas and Derrida have nominated as “responsibility”. The standard political distinction between responsibility and obligation consists simply in the fact that responsibility is not chosen and that my responsibility extends beyond my power of knowledge or even of satisfaction, e.g., in the fact that I can be responsible for injustices I never intended to commit. In a certain sense, then, the autonomy of my ethical responsibility is conditioned by the absolute heteronomy of my identity as one implicated prior to my decisions, since those decisions must be made within a situation I have inherited.

3. Just as we have inherited the world of our predecessors, the critical political task is to be conscious of the futures we both prohibit and create. In this light, the fundamental imperative of education, Adorno said, is that Auschwitz should never happen again. What he meant, of course, is that education must form minds that are not pliable to the forces that lead us to fascism. What his hyperbolic statement has unfortunately made possible, however, is complacency with any injustice not commensurate with the most radical evil in recorded history (Abu Ghraib, for example, just “wasn’t as bad”). In a sense, politics always happens too late and the mistake of utopianism is to posit the possibility of redemption as the end of political action.

What criticism must resist at all personal and material costs is the reduction of politics to farce and the tragedy of recognizing that the necessity of criticism comes too late, i.e., when “the unthinkable” remains unthinkable because it has already become our modus operandi and when injustice can be recognized only in the cold****, ironic laughter of those who can be persuaded that “it’s all good”. The real meaning of freedom (or Kant’s “autonomy”) is nothing other than a separation from reality and the given: “truth has no place other than the will to resist the lie of opinion. Thought … proves itself in the liquidation of opinion: literally the dominant opinion. This opinion is not due simply to people’s inadequate knowledge but rather is imposed upon them by the overall structure of society and hence by relations of domination. How widespread these relations are provides an initial index of falsity: it shows how far the control of thought through domination extends. Its signature is banality. … The banal cannot be true” (Adorno).

****We should not forget that Adorno explicitly claimed that Auschwitz was made possible by those without the capacity for love.

À la Lefebvre, though, it is not simply the content of opinion that is false but the very structure of opinion that criticism must interrogate. The fundamental insight of critical philosophy is that the given (the everyday) is never merely given but always (socially) mediated (this was, incidentally, Fichte’s insight into the possibility of ethics, which preceded Hegel’s formulation of the state as the “ethical substance” of the subject): the habits and routines of everyday life are both sedimentations of cultural meanings and, ipso facto, a necessary condition for (self-)identity. The relation between the everyday and the extraordinary, as Felski argues, cannot be reduced to the opposition between the material and the ideal if only because the everyday is the materialization of the ideal. There is, therefore, no single “everyday” experience apart from specific histories, which constitute such experiences as gendered, economic, etc. The everyday, consequently, cannot serve as the final court of appeal against the demands of the extraordinary but, like the state, must also be subjected to unrelenting critique precisely because it is a condition of life. As Felski points out, the everyday is necessarily caught in a fundamental ambivalence: it is disdained and even mistrusted for the ways in which political, economic, and biopolitical forms of power have normalized the inequalities of reality, while at the same time our subjection is also that which creates our possibilities as subjects.

The everyday thus presents us with the perennial choice between immanence and transcendence: Foucault and Deleuze represent the most radical attempts at an immanent critique of the given. Contemporary criticism, however, has learned that, properly speaking, our choice is not “between” immanence and transcendence since, as both Derrida and Badiou have shown (despite being otherwise irreconcilable), immanence only manifests through a presentation of the transcendental. The chiasm from the immanent to the transcendent passes through the unpresentable singularity of that which, from the side of the immanent, can never be given “all at once” and, from the side of the transcendental, exceeds the circulation of discourse (e.g., Derrida’s transcendental signifier or, equivalently, his notion of justice as the undeconstructible condition of deconstruction). The sympathy of criticism, politics, education, and art consists in their recognition of the insufficiency and contingency of the present and of what is presented, a recognition that is affirmative in character.

The book-fetish

The reactions to Encyclopedia Britannica’s cessation of a print edition have oscillated between nostalgia and a reverent fear for the future of literacy. “Won’t our children know what a book is anymore?” Behind that question, however, is not merely a luddite rejection of technology but a fetishism of the book that undermines the very concerns these sentiments express. It is appropriate, moreover, that they should re-appear on the occasion of the loss of this particular sort of book for, if the opponents of e-readers have forgotten that the book itself is a product of technology, we have also forgotten the origins of the very notion of an encyclopedia and, if we were to recall the ends of such a project, might not be so reluctant to mourn its transformation.

The idea of an encyclopedia is a stereotypically nineteenth-century invention that entailed not only a compilation of all the “available facts of human history, collected over the widest areas” (preface to the ninth edition) but also that they are “carefully coordinated and grouped together, in the hope of ultimately evolving the laws of progress, moral and material, which underlie them, and which will help to connect and interpret the whole movement of the race” (emphasis added). Whether in its British or German versions, intrinsic to the encyclopedic project is the possibility of unifying the results of human inquiry under a systematic orientation toward the idea of a final synthesis of knowledge. Such a synthesis requires, among other things, the one thing the lessons of twentieth-century colonialism have taught us to regard with suspicion if not simply to reject: a single, substantive notion of “human rationality” across all history and cultures. The encyclopedic project is thus caught on the horns of an impossible dilemma: either the facts thus collected can be unified into a systematic whole—which presupposes precisely the conception of universal historical progress and singular rationality that has resulted in the subjugation of both women and “savages”—or we renounce the possibility of a systematic unity of the information gathered, in which case what we have is both unending and meaningless since it is not thereby knowledge (if it were, then the IBM computer that was able to defeat Jeopardy! contestants by having a lot of facts at its disposal would be the smartest thing in the world). An “encyclopedia” of facts without a principle of order and selection is trivial, but any such principle renders certain things visible at the expense of others.

While the current dismay at the loss of the printed edition of the Encyclopedia Britannica seems directed not at the loss of the encyclopedia itself—which, despite everything, continues to be an unfortunately inexorable presence in the pursuit of knowledge—but at the printed version of it, the objection rests on the same confusion of the paths to knowledge. If, at minimum, the encyclopedic impulse results from a fetishism of facts, which leads us to mistake the collection of facts for knowledge, the fetishism of the book leads us to reduce the pursuit of knowledge to particular objects. While knowledge is inseparable from material conditions, it is not beholden to any particular material conditions; indeed, strictly speaking, it is not “beholden” to material conditions at all because knowledge and its material conditions are co-constitutive.

No one would have taken seriously a similar outcry over the loss of the scroll on the advent of the book. By now the intellectual advances made possible by that particular technological transformation are banal. What should strike us as strange is the failure to recognize a similar transformation here. Marx once remarked that humanity sets itself only such problems as it is capable of solving, because something is not even recognized as a problem to be solved unless we are somehow capable of the solution. In this instance, however, our technology has lagged far behind our needs. The sheer amount of information and data produced can no longer be recorded on paper because there simply isn’t enough paper to do so; we’re all aware of the temporal limitations of the publishing industry in cases like textbooks that, because they are published so long after they are actually written, are obsolete the moment they are purchased (which contributes to the exorbitant price of textbooks). The curious phenomenon here is that the resistance to the digital transformation of the book is a resistance to the fulfillment of the needs we have created and a regressive tendency toward forms of technology that are no longer capable of meeting those needs.

But rejecting nostalgia for the book need not consist merely in an automatic allegiance to the development of technology (since we might still remain agnostic on whether or not the needs to which such technology is responding are actually beneficial). The fetishism of the book confuses the end of knowledge with the means (i.e., technology): that knowledge is (“in”) the book* (and, of course, it is but a short step toward thinking that, therefore, an encyclopedia can “collect” knowledge).

*This is the same confusion of thinking that the piece of music is “in” the score.

If there is anything that digital technology has shown, it is that knowledge is not “in” anything; book fetishism is perhaps the most exemplary form of fetishism (attributing human powers to objects).** Wikipedia has shown, for example, how the transmission of knowledge is not only collective but communicative: the collective effort of writing and editing a Wikipedia entry allows for a sort of interaction with information that was not possible with a printed encyclopedia.

**This is the same error behind the cliché “at least people are reading a book instead of watching TV”, as if reading were intrinsically better than watching TV. One finds it difficult to see how reading trash like a romance novel is intellectually “better” than watching Mythbusters, for example. If the charge goes “TV rots your brain”, so does Twilight.

While digital technology has not yet given us the means to replace the book completely—for serious academic purposes, for example, e-readers still fall consistently short—there is no indication that eventually it will not do so nor that it is undesirable for it to do so.

The smiling Doppelgänger; or: the fall of the fourth estate

1. As a result of the debate between Lippmann and Dewey, what we know as journalism has been instrumental in the simultaneous marriage and autonomy of economics and politics. Journalism began as a means of communicating information that might affect prices and trade—which quickly turned out to be anything and everything—but their debate was predicated on the idea that the very possibility of a people governing themselves required a robust and rigorous media, free from distortion and misinformation (this was Lippmann’s point in Liberty and the News after WWI). In his philosophy of education, Dewey showed that we must be able to think well; Lippmann insisted not only that we must be able to think but that we must always think about something and that our capacity to think is limited not only by subjective but also by objective possibilities.

Lippmann knew, of course, that journalistic practice does not consist of getting “just the facts”. The essential problem for both journalists and audiences, he said, is that between the individual and her environment is what he called a “pseudo-environment”, which consists of the habits and fictions that orient our behavior—put simply, our beliefs about the facts (or, as Nietzsche had said, there are never “mere facts” but facts interpreted as facts).

In 1919, the Washington Post ran a story that in the Adriatic a US Rear Admiral had apparently received orders from the British via the War Council. It seemed, the article concluded, that American naval forces could be commanded by foreign powers under the new League of Nations without the knowledge or consent of American commanders. Republican senators immediately expressed indignation at the possibility of American military operations being conducted without the consent of Congress, adding this news to support their opposition to the League of Nations. It turned out that no orders had come from the British and that the American forces had landed in Italy at the request of the Italians for protection, acting under established international practice that had nothing to do with the League of Nations.

Lippmann’s point in this example is not only that the Washington Post “got the facts wrong”, which is indeed true, but also that it illustrates the way in which we form opinions and convictions, and consequently act, on the basis of information gathered not only from our environment but from our pseudo-environment. How, for example, is it possible for two people—or even two nations—to enter a conflict, both fully convinced that they are acting in self-defense?

2. Without having to make any decisions on what constitutes the “facts” of an event, what distinguished journalism from other forms of popular media for Lippmann was not simply a dedication to the facts but, rather, its civic duty. His recommendations of now standard editorial practices and “journalistic ethics” were predicated on the principle that the journalist’s responsibility was quite literally to be the medium from citizen to world.

It did not take long, however, for the culture industry to corrode this sense of duty. On the left, for example, The Daily Show has explicitly erased the distinction between journalism and entertainment (with all the ironic consequences that have followed in its critical impotence); on the right, Fox News asks viewers to vote for which story they would like to see just as American Idol asks audiences to vote for which singer they would like to hear.

A Yahoo! News story reported that a receipt showing a $1.33 tip from a banker on a $133 restaurant bill (with the sentence “get a real job” written next to the word “tip”), which had provoked outrage across the Internet, may have been digitally altered. The restaurant claimed to have found the merchant’s copy of the receipt, which shows a standard tip ($7) for a smaller bill ($33) without the accompanying insult.

Quite apart from the question of whether this event counts as significant news, the Yahoo! story ends with the reporter asking what has become an obligatory query addressed to the audience: “what do you think? Who’s telling the truth?” What is at stake in this story is the fact that the outrage over the original story is (likely) directed at a false source. If there is a story here, it is that our outrage over callous wealthy privilege is unfounded (at least in this case) and that the facts of the matter do not justify such a response. But the reporter’s final question makes the truth irrelevant: the truth of the matter seems not to be the point—the work of establishing it has not been done by the reporter—but only what I think about it.

What are the possible responses to that question? 1) “I think the original story is true and the restaurant is lying about the original receipt.” – Then the facts don’t matter. 2) “I think the receipt is a hoax.” – Then the story has not gone far enough in collecting the relevant evidence to allow us to come to a reasonable conclusion. 3) “I suspend judgment.” – Then what I think is irrelevant since I should precisely think nothing (notice that this is the only reasonable response to give).

Among the objective failures of journalism, this question “What do you think?” and the compulsion to “register” (to whom?) an opinion on anything and everything signify the decadent subjective failure of civic duty. If Lippmann had entrusted to journalists the responsibility to the truth, Dewey had asked us to remember that democratic politics demands not opinion but thought, i.e., not simply that we insist on our “right to have an opinion” but that we have a responsibility to think about our opinions.*

*Incidentally, recently I claimed that the intellectual dereliction of the left was one of the only two things about which Rand was right. This is the second: that the appropriate converse of a closed mind is not an open but an active mind.

On perjury and consequences

1a. “Our perspective of life has passed into an ideology which conceals the fact that there is life no longer,” Adorno wrote at the start of one of the most remarkable texts of early critical theory. How is it possible, he asks, from* the false world of a “damaged life”, to speak truth? Aristotle had asked a similar question: is it possible to be virtuous in a wicked society, when moral habits require both subjective and objective conditions of possibility?

*The English translation of the title is extremely infelicitous here. The reflections are, yes, on damaged life but they are at the same time from or out of it [aus dem beschädigten Leben].

But perhaps the most remarkable trope of our present state is the Christian notion of original sin. The interesting aspect of original sin is not its hereditary nature. As Calvin points out in the Institutes, for example, “… Augustine, though he frequently calls it the sin of another, the more clearly to indicate its transmission to us by propagation [against the Pelagians], yet, at the same time, also asserts it properly belongs to each individual” (emphasis added); not only, moreover, to each person but to every creature, groaning under the weight of a burden it neither chose nor incurred (Rom 8:20,4). The unchosen responsibility for a guilt that defines our very mode of existence—and our fate—is the task that we can no longer ignore under the auspices of Enlightenment naivety.

1b. What the Enlightenment finds so unpalatable about original sin is its apparent fatalism. Similarly, Adorno and Weber are often dismissed for their unremitting pessimism: is there not good in this world, after all? Should we not affirm, as a certain bumper sticker proclaims, “life is good” or that we should “look on the bright side”?**

**I was once asked by a student why critical theorists and modern (avant-garde) artists were so “depressing” and why they couldn’t just take a moment to see the beauty in the world.

The scandal of the modern world is that what appears as good necessarily makes the suffering at its root invisible. Benjamin had famously remarked that every document of civilization is simultaneously a document of barbarism and, as common wisdom goes, that history is written by the victors. The present situation is worse, however, than even he had imagined: it is reality itself that is created by those with the power to do so. Should we celebrate the fact that we now have access to exotic grains from around the world at Whole Foods when the very fact that we are importing quinoa from Bolivia is raising prices so that natives who depend on the crop for food can no longer afford it and are being driven into malnutrition while obesity continues to rise in America? How many factory workers have to die or be poisoned, underage teenagers exploited, or rare minerals mined in war-torn countries to produce our “unlimited” iPads and e-readers? By how much do we mortgage future generations so we can drive on average thirty miles a day? And while everyone was worrying about emissions and thought they were being green by buying nice electric cars, no one noticed that the environmental damage from the production of those cars is (or has been) greater than that of conventional cars (or that the original electric-car batteries were more toxic to dispose of than nuclear waste).

Benjamin’s concern was that the conditions for the existence of evil would be forgotten and that the critic’s task was to rescue the missed and forgotten possibilities in the laughter of those who were now dead at the hands of a history that must march forward. As Arendt has shown, however, we are already too late: evil is now banal. Banality is the brother of irony: what the ironist accepts as unavoidable the other simply doesn’t notice because it is taken for granted: a radio announcer can just assume that women want to lose weight, for example, and proceed to offer special deals “for the ladies” or the culture industry can continue to feed off audiences’ demand for the ornaments of affirmative culture while works like the Thälmann Variations—written to offer hope for the future of the people—remain unpublished and unavailable.

The optimism of the 90s, when this ideology of “the good life” found its final expression, is no longer tenable. Neoliberals and conservatives alike continue to ensure not only that the very conditions that caused the financial collapse and its continuing global repercussions remain the status quo but also that those conditions continue to blind us to the lie behind the notion that “life is good”.

2a. Justice demands not only action but the tenacity to refuse the ideology of hope: that what was once an honest attempt has proven itself to be among the most catastrophic failures of recorded history. In one of the most reasonable things Zizek has said in recent years, “perhaps it is time to step back, think and say the right thing”; to do so, however, we must first render visible what the ideology of “the good life” denies existence. To borrow a Heideggerean sentence: what most calls for thinking is the fact that, despite everything, we are (still) not thinking. Justice must wield not only the sword but also the scales.

2b. And this is the present task of thought, which is imposed not only by the objective conditions of existence but from within thought itself. In short, this is the Kantian point of no return: there is no metaphilosophy. The material and social conditions of thought are either subject to philosophical method (which concerns the possibility of thought as such) or there is something transcendent to philosophy. To put it perversely, il n’y a pas hors de l’histoire.

Appeals and incriminations

1. The primrose path. The split between philosophy and science has rendered philosophy vulnerable to two equivalent and damning accusations disguised as genuine questions: “what are the facts of the matter?” and “what is your ontology?” When, for example, cognitive and neuropsychology are busy re-creating the Kantian picture of cognition (including the opacity of the transcendental ego) or when sociology agrees with Aristotle’s insight into what we now call “crowdsourcing”, it seems that science has given philosophy empirical verification. Against the consequent threat of redundancy, philosophy (particularly in its idealist and crypto-idealist varieties) has generally responded with some doctrine of method: “philosophy provides an account of what a fact is in the first place”. Of course, we should be wary of any such tendency toward absolute idealism after witnessing the misfortunes of a system that attempts to deduce being from the idea. But equally problematic is an ethical idealism that insists on the role of philosophy in arbitrating between facts and values (which are, by definition, outside the domain of ontology): such a solution simply reduces philosophy to literature and makes it possible to speak of “my” and “your” philosophy since, after all, if values are not facts there is no court of appeal other than my “yes”.

1a. The discourse bubble. Values, of course, are discursive (as Nietzsche insisted against the metaphysicians). “We must reflect on and discuss our values.” But to whom do we speak? Confronted with the towering black obelisk of technology, for example, philosophy quarantines itself in a mode of discourse that appeals to Aristotle and Heidegger instead of Lanier. The objection to such discursive naïveté (at best; bad faith at worst) is not that it simply lacks reference to a “real” world outside discourse but, rather, that a discourse that intends only itself is self-defeating.

2. Whither the moral world? Is it possible to be moral in an immoral world? We face here an inverted image of the doctrine of original sin. Bourgeois ideology refuses, for example, to decide between the “right” of a chemist to create a better non-smearing lipstick and the creation of HIV medication. The democratic paradox is that we must at once affirm the separation of ethical injunctions from political right while at the same time recognizing that it is this very distinction that creates the immoral world within which we must impose on ourselves the choice to be moral.

2a. Discourse and praxis. Philosophy faces a similar paradox. Faced with the separation of philosophy and politics (which Marx famously wanted to overcome), philosophy both recognizes and refuses its task in the face of injustice. Philosophy has the responsibility and the capacity to incite us to the recognition of injustice—including the fact that its current existence in academic institutions is predicated on unjust socioeconomic practices—but it will not discharge that responsibility by researching which passages of Hobbes Leibniz was reading in which years (although, in fairness, such research is arguably not philosophy at all but its decadent imposter).

The paralysis of discourse

1. Bergson identifies laughter as the repetition of the past, i.e., as an interruption in the novelty of life. Moreover, as a social institution, comedy “lies midway between art and life. … By organizing laughter, comedy accepts social life as a natural environment … And in this respect it turns its back upon art, which is a breaking away from society and a return to pure nature”. On the one hand, comic laughter inhibits the movement of vital forces by the sublimation of desire into the affirmation of the present as the presence of what is missing. Life itself, as pure difference (that which differs from itself), never appears. But, on the other hand, laughter condenses into a single, unstable moment two tendencies, which are by nature opposed—(simple) negation and the reflexivity of a subject present-to-itself—resulting in the confusion of life and enjoyment.

Yet, as a relaxation or pause in the impetus of life, laughter finds itself on the side of neither language nor action. There can be, of course, no real hiatus in life, yet this illusion of laughter, Bergson says, is akin to the illusion of dreams: “the behavior of the intellect in a dream [is this:] … the mind, enamored of itself, now seeks in the outer world nothing more than a pretext for realizing its imaginations”. It is for this reason that laughter is the expression of irony par excellence (see “Irony and Criticism”) and, further, why laughter can serve no critical function. Because laughter is neither language—we can laugh at false reasoning or bad logic, which serve as staples of comedy—nor action, laughter is simply a refusal of criticism.

Comedy, therefore, like camp, is not only incapable of criticism but actively serves to neutralize criticism. If, as Ross claims, camp consists in the recovery of cultural productions whose sense is no longer dominated by the demands of capital, camp threatens quickly to collapse into parody or imitation and thereby acquires a sort of “zombie life”. For both camp and irony, the price paid for enjoyment is simply the loss of the objective world: anything can be enjoyed by the perfect solipsist for whom there is no ethical demand to recognize anything as genuinely demeaning, offensive, violent, or banal. There is only the subject-for-itself, baptized in enjoyment.

We see the same phenomenon in the parody of children’s play. The child who mimics adult telephone conversations engages in precisely the same parodic act as the laughter of those uninitiated into various forms of discourse (for example, mocking a foreign language or the derision of jargon) or in caricature (for example, the “seventh meditation”), both of which mark the death of criticism.

2. On the other hand, the failure of criticism has been the assumption that the mode appropriate to it is that of discourse or, alternatively, that the choice facing politics is that between theory and action. Those impatient for action who want to “cut through the bullshit” of theory refuse the entreaties of discourse to see the intolerance in tolerance or the reactionary in the revolutionary. The call for theory is therefore not simply to remind us of our history but, as Zizek has called it, a search for “lost causes” as neither a mode of historical inquiry nor one of hermeneutics (Ricoeur, for example, uses the text as a model for action whereas we might say Zizek proclaims the inverse). Ricoeur’s “critical hermeneutics” requires a dialectic between inclusion and distanciation, which brings into discourse what is initially simply given as structure. But Ricoeur never escapes the vicious circle of subject and world: if we are to know the world to which a text refers, we must rely upon “imaginative variations” of the subject that only occur in a world constituted by discourse.

We are left, however, in a precarious position. The search for “lost causes” threatens not to dissolve the sense of discourse (as, for example, in parody) but to substitute meaning for intention: it is sufficient for discourse to appear as such in its illocutionary force (as a “call to action”, for example). The intention of discourse, it turns out, is irrelevant: as long as discourse retains consistency—even the consistency that obtains across parody as a derivative sense—it remains meaningful. At this zero-point, discourse is both sufficient and unnecessary: as Sartre said, intentions vanish and it no longer matters that we all agree on why we are storming the Bastille just as long as we’re doing it. Zizek tarries at this point where the pleasure of discourse is seduced on the one hand by the laughter of enjoyment and by the force of sovereignty on the other.