
robertogreco : nietzsche   25

Duke University Press - Baroque New Worlds
"Baroque New Worlds traces the changing nature of Baroque representation in Europe and the Americas across four centuries, from its seventeenth-century origins as a Catholic and monarchical aesthetic and ideology to its contemporary function as a postcolonial ideology aimed at disrupting entrenched power structures and perceptual categories. Baroque forms are exuberant, ample, dynamic, and porous, and in the regions colonized by Catholic Europe, the Baroque was itself eventually colonized. In the New World, its transplants immediately began to reflect the cultural perspectives and iconographies of the indigenous and African artisans who built and decorated Catholic structures, and Europe’s own cultural products were radically altered in turn. Today, under the rubric of the Neobaroque, this transculturated Baroque continues to impel artistic expression in literature, the visual arts, architecture, and popular entertainment worldwide.

Since Neobaroque reconstitutions necessarily reference the European Baroque, this volume begins with the reevaluation of the Baroque that evolved in Europe during the late nineteenth century and the early twentieth. Foundational essays by Friedrich Nietzsche, Heinrich Wölfflin, Walter Benjamin, Eugenio d’Ors, René Wellek, and Mario Praz recuperate and redefine the historical Baroque. Their essays lay the groundwork for the revisionist Latin American essays, many of which have not been translated into English until now. Authors including Alejo Carpentier, José Lezama Lima, Severo Sarduy, Édouard Glissant, Haroldo de Campos, and Carlos Fuentes understand the New World Baroque and Neobaroque as decolonizing strategies in Latin America and other postcolonial contexts. This collection moves between art history and literary criticism to provide a rich interdisciplinary discussion of the transcultural forms and functions of the Baroque.

Contributors. Dorothy Z. Baker, Walter Benjamin, Christine Buci-Glucksmann, José Pascual Buxó, Leo Cabranes-Grant, Haroldo de Campos, Alejo Carpentier, Irlemar Chiampi, William Childers, Gonzalo Celorio, Eugenio d’Ors, Jorge Ruedas de la Serna, Carlos Fuentes, Édouard Glissant, Roberto González Echevarría, Ángel Guido, Monika Kaup, José Lezama Lima, Friedrich Nietzsche, Mario Praz, Timothy J. Reiss, Alfonso Reyes, Severo Sarduy, Pedro Henríquez Ureña, Maarten van Delden, René Wellek, Christopher Winks, Heinrich Wölfflin, Lois Parkinson Zamora"
baroque  latinamerica  literature  counterconquest  europe  postcolonialism  transcultural  neobaroque  nietzsche  heinrichwölfflin  walterbenjamin  eugeniod'ors  renéwellek  mariopraz  alejocarpentier  josélezamalima  severosarduy  édouardglissant  haroldodecampos  carlosfuentes  dorothybaker  christinebuci-glucksmann  josépascualbuxó  leocabranes-grant  irlemarchiampi  williamchilders  gonzalocelorio  jorgeruedasdelaserna  robertogonzálezechevarría  ángelguido  monikakaup  timothyreiss  alfonsoreyes  pedrohenríquezureña  maartenvandelden  christopherwinks  loisparkinsonzamora 
7 weeks ago by robertogreco
School is Literally a Hellhole – Medium
"By continually privileging and training our eyes on a horizon “beyond the walls of the school” — whether that be achievement, authentic audiences, the real world, the future, even buzz or fame — have we inadvertently impoverished school of its value and meaning, turning it into a wind-swept platform where we do nothing but gaze into another world or brace ourselves for the inevitable? Here we have less and less patience for the platform itself, for learning to live with others who will be nothing more than competitors in that future marketplace."



"What would be possible if we instead were to wall ourselves up with one another, fostering community and care among this unlikely confluence of souls? Does privileging the proximate, present world render any critique of or contribution to the larger world impossible?

I don’t think so. Learning to protect, foster, and value the humans in our care will often automatically put us in direct conflict with the many forces that disrupt or diminish those values. More than reflecting the real world or the future or some outside standard or imperative, kids need to see themselves reflected and recognized in these rooms. This is true even in the most privileged of environments. Providing recognition means valuing students' perspectives and experiences, but also helping them gain critical consciousness of themselves and their world, which they often intuit.

These tasks aren’t disconnected from the outside world, but often need a smaller, more human-sized community in which to flourish. The impulse to test and measure continually intrudes upon this process. But so do other prying eyes, ones that cast our students as entrepreneurial, capitalistic, future-ready, self-motivated, passionate individuals — and that often shame those who can’t or won’t conform to this ideal.

We should ask ourselves to what extent those outside standards and ideals are antithetical to the values of education — civic discourse, collectivity, cooperation, care. I realize this post is short on specifics, but let’s be more cautious about always forcing one another out into the unforgiving gaze of others, commending the merits of a world beyond this one."
arthurchiaravalli  schools  schooling  schooliness  presence  unschooling  deschooling  education  learning  highschool  competition  coexistence  community  benjamindoxtdator  engagement  blogging  teaching  howweteach  howwelearn  personalbranding  innovation  johndewey  work  labor  nietzsche  collectivism  collectivity  cooperation  care  caring  merit  entrepreneurship  passion  2018  foucault  michelfoucault 
june 2018 by robertogreco
Take your time: the seven pillars of a Slow Thought manifesto | Aeon Essays
"In championing ‘slowness in human relations’, the Slow Movement appears conservative, while constructively calling for valuing local cultures, whether in food and agriculture, or in preserving slower, more biological rhythms against the ever-faster, digital and mechanically measured pace of the technocratic society that Neil Postman in 1992 called technopoly, where ‘the rate of change increases’ and technology reigns. Yet, it is preservative rather than conservative, acting as a foil against predatory multinationals in the food industry that undermine local artisans of culture, from agriculture to architecture. In its fidelity to our basic needs, above all ‘the need to belong’ locally, the Slow Movement founds a kind of contemporary commune in each locale – a convivium – responding to its time and place, while spreading organically as communities assert their particular needs for belonging and continuity against the onslaught of faceless government bureaucracy and multinational interests.

In the tradition of the Slow Movement, I hereby declare my manifesto for ‘Slow Thought’. This is the first step toward a psychiatry of the event, based on the French philosopher Alain Badiou’s central notion of the event, a new foundation for ontology – how we think of being or existence. An event is an unpredictable break in our everyday worlds that opens new possibilities. The three conditions for an event are: that something happens to us (by pure accident, no destiny, no determinism), that we name what happens, and that we remain faithful to it. In Badiou’s philosophy, we become subjects through the event. By naming it and maintaining fidelity to the event, the subject emerges as a subject to its truth. ‘Being there,’ as traditional phenomenology would have it, is not enough. My proposal for ‘evental psychiatry’ will describe both how we get stuck in our everyday worlds, and what makes change and new things possible for us."

"1. Slow Thought is marked by peripatetic Socratic walks, the face-to-face encounter of Levinas, and Bakhtin’s dialogic conversations"

"2. Slow Thought creates its own time and place"

"3. Slow Thought has no other object than itself"

"4. Slow Thought is porous"

"5. Slow Thought is playful"

"6. Slow Thought is a counter-method, rather than a method, for thinking as it relaxes, releases and liberates thought from its constraints and the trauma of tradition"

"7. Slow Thought is deliberate"
slow  slowthought  2018  life  philosophy  alainbadiou  neilpostman  time  place  conservation  preservation  guttormfløistad  cittaslow  carlopetrini  cities  food  history  urban  urbanism  mikhailbakhti  walking  emmanuellevinas  solviturambulando  walterbenjamin  play  playfulness  homoludens  johanhuizinga  milankundera  resistance  counterculture  culture  society  relaxation  leisure  artleisure  leisurearts  psychology  eichardrorty  wittgenstein  socrates  nietzsche  jacquesderrida  vincenzodinicola  joelelkes  giorgioagamben  garcíamárquez  michelfoucault  foucault  asjalacis  porosity  reflection  conviction  laurencesterne  johnmilton  edmundhusserl  jacqueslacan  dispacement  deferral  delay  possibility  anti-philosophy 
march 2018 by robertogreco
Will Self: Are humans evolving beyond the need to tell stories? | Books | The Guardian
"Neuroscientists who insist technology is changing our brains may have it wrong. What if we are switching from books to digital entertainment because of a change in our need to communicate?"



"A few years ago I gave a lecture in Oxford that was reprinted in the Guardian under the heading: “The novel is dead (this time it’s for real)”. In it I argued that the novel was losing its cultural centrality due to the digitisation of print: we are entering a new era, one with a radically different form of knowledge technology, and while those of us who have what Marshall McLuhan termed “Gutenberg minds” may find it hard to comprehend – such was our sense of the solidity of the literary world – without the necessity for the physical book itself, there’s no clear requirement for the art forms it gave rise to. I never actually argued that the novel was dead, nor that narrative itself was imperilled, yet whenever I discuss these matters with bookish folk they all exclaim: “But we need stories – people will always need stories.” As if that were an end to the matter.

Non-coincidentally, in line with this shift from print to digital there’s been an increase in the number of scientific studies of narrative forms and our cognitive responses to them. There’s a nice symmetry here: just as the technology arrives to convert the actual into the virtual, so other technologies arise, making it possible for us to look inside the brain and see its actual response to the virtual worlds we fabulate and confabulate. In truth, I find much of this research – which marries arty anxiety with techno-assuredness – to be self-serving, reflecting an ability to win the grants available for modish interdisciplinary studies, rather than some new physical paradigm with which to explain highly complex mental phenomena. Really, neuroscience has taken on the sexy mantle once draped round the shoulders of genetics. A few years ago, each day seemed to bring forth a new gene for this or that. Such “discoveries” rested on a very simplistic view of how the DNA of the human genotype is expressed in us poor, individual phenotypes – and I suspect many of the current discoveries, which link alterations in our highly plastic brains to cognitive functions we can observe using sophisticated equipment, will prove to be equally ill-founded.

The neuroscientist Susan Greenfield has been prominent in arguing that our new digital lives are profoundly altering the structure of our brains. This is undoubtedly the case – but then all human activities impact upon the individual brain as they’re happening; this by no means implies a permanent alteration, let alone a heritable one. After all, so far as we can tell the gross neural anatomy of the human has remained unchanged for hundreds of millennia, while the age of bi-directional digital media only properly dates – in my view – from the inception of wireless broadband in the early 2000s, hardly enough time for natural selection to get to work on the adaptive advantages of … tweeting. Nevertheless, pioneering studies have long since shown that licensed London cab drivers, who’ve completed the exhaustive “Knowledge” (which consists of memorising every street and notable building within a six mile radius of Charing Cross), have considerably enlarged posterior hippocampi.

This is the part of the brain concerned with way-finding, but it’s also strongly implicated in memory formation; neuroscientists are now discovering that at the cognitive level all three abilities – memory, location, and narration – are intimately bound up. This, too, is hardly surprising: key for humans, throughout their long pre-history as hunter-gatherers, has been the ability to find food, remember where food is and tell the others about it. It’s strange, of course, to think of Pride and Prejudice or Ulysses as simply elaborations upon our biologically determined inclination to give people directions – but then it’s perhaps stranger still to realise that sustained use of satellite navigation, combined with absorbing all our narrative requirements in pictorial rather than written form, may transform us into miserable and disoriented amnesiacs.

When he lectured on literature in the 1950s, Vladimir Nabokov would draw a map on the blackboard at the beginning of each session, depicting, for example, the floor plan of Austen’s Mansfield Park, or the “two ways” of Proust’s Combray. What Nabokov seems to have understood intuitively is what neuroscience is now proving: reading fiction enables a deeply memorable engagement with our sense of space and place. What the master was perhaps less aware of – because, as yet, this phenomenon was inchoate – was that throughout the 20th century the editing techniques employed in Hollywood films were being increasingly refined. This is the so-called “tyranny of film”: editing methods that compel our attention, rather than leaving us free to absorb the narrative in our own way. Anyone now in middle age will have an intuitive understanding of this: shots are shorter nowadays, and almost all transitions are effected by crosscutting, whereby two ongoing scenes are intercut in order to force upon the viewer the idea of their synchrony. It’s in large part this tyranny that makes contemporary films something of a headache for older viewers, to whom they can seem like a hypnotic swirl of action.

It will come as no surprise to Gutenberg minds to learn that reading is a better means of forming memory than watching films, as is listening to afternoon drama on Radio 4. This is the so-called “visualisation hypothesis” that proposes that people – and children in particular – find it harder not only to remember film as against spoken or written narratives, but also to come up with novel responses to them, because the amount of information they’re given, together with its determinate nature, forecloses imaginative response.

Almost all contemporary parents – and especially those of us who class themselves as “readers” – have engaged in the Great Battle of Screen: attempting to limit our children’s consumption of films, videos, computer games and phone-based social media. We feel intuitively that it can’t be doing our kids any good – they seem mentally distracted as well as physically fidgety: unable to concentrate as they often look from one handheld screen to a second freestanding one, alternating between tweezering some images on a touchscreen and manipulating others using a remote control. Far from admonishing my younger children to “read the classics” – an utterly forlorn hope – I often find myself simply wishing they’d put their phones down long enough to have their attention compelled by the film we’re watching.

If we take seriously the conclusions of these recent neuroscientific studies, one fact is indisputable: whatever the figures for book sales (either in print or digital form), reading for pleasure has been in serious decline for over a decade. That this form of narrative absorption (if you’ll forgive the coinage) is closely correlated with high attainment and wellbeing may tell us nothing about the underlying causation, but the studies do demonstrate that the suite of cognitive aptitudes needed to decipher text and turn it into living, breathing, visible and tangible worlds seem to wither away once we stop turning the pages and start goggling at virtual tales.

Of course, the sidelining of reading narrative (and along with it the semi-retirement of all those narrative forms we love) is small potatoes compared with the loss of our capacity for episodic memory: would we be quite so quick to post those fantastic holiday photographs on Facebook if we knew that in so doing we’d imperil our ability to recall unaided our walk along the perfect crescent of sand, and our first ecstatic kiss? You might’ve thought that as a novelist who depends on fully attuned Gutenberg minds to read his increasingly complex and confusing texts I’d be dismayed by this craven new couch-based world; and, as a novelist, I am.

I began writing my books on a manual typewriter at around the same time wireless broadband became ubiquitous, sensing it was inimical not only to the act of writing, but that of reading as well: a novel should be a self-contained and self-explanatory world (at least, that’s how the form has evolved), and it needs to be created in the same cognitive mode as it’s consumed: the writer hunkering down into his own episodic memories, and using his own canonical knowledge, while imagining all the things he’s describing, rather than Googling them to see what someone else thinks they look like. I also sense the decline in committed reading among the young that these studies claim: true, the number of those who’ve ever been inclined “to get up in the morning in the fullness of youth”, as Nietzsche so eloquently put it, “and open a book” has always been small; but then it’s worth recalling the sting in the tail of his remark: “now that’s what I call vicious”.

And there is something vicious about all that book learning, especially when it had to be done by rote. There’s something vicious as well about the baby boomer generation, which, not content to dominate the cultural landscape, also demands that everyone younger than us survey it in the same way. For the past five years I’ve been working on a trilogy of novels that aim to map the connections between technological change, warfare and human psychopathology, so obviously I’m attempting to respond to the zeitgeist using this increasingly obsolete art form. My view is that we’re deluded if we think new technologies come into existence because of clearly defined human objectives – let alone benevolent ones – and it’s this that should shape our response to them. No, the history of the 20th century – and now the 21st – is replete with examples of technologies that were developed purely in order to facilitate the killing of people at … [more]
willself  communication  digital  writing  howwewrite  entertainment  books  socialmedia  neuroscience  2016  marshallmcluhan  gutenbergminds  print  change  singularity  videogames  gaming  games  poetry  novels  susangreenfield  rote  rotelearning  twitter  knowledge  education  brain  wayfinding  memory  location  narration  navigation  vladimirnabokov  proust  janeausten  film  video  attention  editing  reading  howweread  visualizationhypothesis  visualization  text  imagery  images  cognition  literacy  multiliteracies  memories  nietzsche  booklearning  technology  mobile  phones  mentalillness  ptsd  humans  humanity  digitalmedia  richardbrautigan  narrative  storytelling 
november 2016 by robertogreco
Against Interpretation
[before quoting the entirety, quoting one line:

"What is important now is to recover our senses. We must learn to see more, to hear more, to feel more."]

"“Content is a glimpse of something, an encounter like a flash. It’s very tiny - very tiny, content.”
- Willem De Kooning, in an interview

“It is only shallow people who do not judge by appearances. The mystery of the world is the visible, not the invisible.”
- Oscar Wilde, in a letter

1

The earliest experience of art must have been that it was incantatory, magical; art was an instrument of ritual. (Cf. the paintings in the caves at Lascaux, Altamira, Niaux, La Pasiega, etc.) The earliest theory of art, that of the Greek philosophers, proposed that art was mimesis, imitation of reality.

It is at this point that the peculiar question of the value of art arose. For the mimetic theory, by its very terms, challenges art to justify itself.

Plato, who proposed the theory, seems to have done so in order to rule that the value of art is dubious. Since he considered ordinary material things as themselves mimetic objects, imitations of transcendent forms or structures, even the best painting of a bed would be only an “imitation of an imitation.” For Plato, art is neither particularly useful (the painting of a bed is no good to sleep on), nor, in the strict sense, true. And Aristotle’s arguments in defense of art do not really challenge Plato’s view that all art is an elaborate trompe l’oeil, and therefore a lie. But he does dispute Plato’s idea that art is useless. Lie or no, art has a certain value according to Aristotle because it is a form of therapy. Art is useful, after all, Aristotle counters, medicinally useful in that it arouses and purges dangerous emotions.

In Plato and Aristotle, the mimetic theory of art goes hand in hand with the assumption that art is always figurative. But advocates of the mimetic theory need not close their eyes to decorative and abstract art. The fallacy that art is necessarily a “realism” can be modified or scrapped without ever moving outside the problems delimited by the mimetic theory.

The fact is, all Western consciousness of and reflection upon art have remained within the confines staked out by the Greek theory of art as mimesis or representation. It is through this theory that art as such - above and beyond given works of art - becomes problematic, in need of defense. And it is the defense of art which gives birth to the odd vision by which something we have learned to call “form” is separated off from something we have learned to call “content,” and to the well-intentioned move which makes content essential and form accessory.

Even in modern times, when most artists and critics have discarded the theory of art as representation of an outer reality in favor of the theory of art as subjective expression, the main feature of the mimetic theory persists. Whether we conceive of the work of art on the model of a picture (art as a picture of reality) or on the model of a statement (art as the statement of the artist), content still comes first. The content may have changed. It may now be less figurative, less lucidly realistic. But it is still assumed that a work of art is its content. Or, as it’s usually put today, that a work of art by definition says something. (“What X is saying is . . . ,” “What X is trying to say is . . .,” “What X said is . . .” etc., etc.)

2

None of us can ever retrieve that innocence before all theory when art knew no need to justify itself, when one did not ask of a work of art what it said because one knew (or thought one knew) what it did. From now to the end of consciousness, we are stuck with the task of defending art. We can only quarrel with one or another means of defense. Indeed, we have an obligation to overthrow any means of defending and justifying art which becomes particularly obtuse or onerous or insensitive to contemporary needs and practice.

This is the case, today, with the very idea of content itself. Whatever it may have been in the past, the idea of content is today mainly a hindrance, a nuisance, a subtle or not so subtle philistinism.

Though the actual developments in many arts may seem to be leading us away from the idea that a work of art is primarily its content, the idea still exerts an extraordinary hegemony. I want to suggest that this is because the idea is now perpetuated in the guise of a certain way of encountering works of art thoroughly ingrained among most people who take any of the arts seriously. What the overemphasis on the idea of content entails is the perennial, never consummated project of interpretation. And, conversely, it is the habit of approaching works of art in order to interpret them that sustains the fancy that there really is such a thing as the content of a work of art.

3

Of course, I don’t mean interpretation in the broadest sense, the sense in which Nietzsche (rightly) says, “There are no facts, only interpretations.” By interpretation, I mean here a conscious act of the mind which illustrates a certain code, certain “rules” of interpretation.

Directed to art, interpretation means plucking a set of elements (the X, the Y, the Z, and so forth) from the whole work. The task of interpretation is virtually one of translation. The interpreter says, Look, don’t you see that X is really - or, really means - A? That Y is really B? That Z is really C?

What situation could prompt this curious project for transforming a text? History gives us the materials for an answer. Interpretation first appears in the culture of late classical antiquity, when the power and credibility of myth had been broken by the “realistic” view of the world introduced by scientific enlightenment. Once the question that haunts post-mythic consciousness - that of the seemliness of religious symbols - had been asked, the ancient texts were, in their pristine form, no longer acceptable. Then interpretation was summoned, to reconcile the ancient texts to “modern” demands. Thus, the Stoics, to accord with their view that the gods had to be moral, allegorized away the rude features of Zeus and his boisterous clan in Homer’s epics. What Homer really designated by the adultery of Zeus with Leto, they explained, was the union between power and wisdom. In the same vein, Philo of Alexandria interpreted the literal historical narratives of the Hebrew Bible as spiritual paradigms. The story of the exodus from Egypt, the wandering in the desert for forty years, and the entry into the promised land, said Philo, was really an allegory of the individual soul’s emancipation, tribulations, and final deliverance. Interpretation thus presupposes a discrepancy between the clear meaning of the text and the demands of (later) readers. It seeks to resolve that discrepancy. The situation is that for some reason a text has become unacceptable; yet it cannot be discarded. Interpretation is a radical strategy for conserving an old text, which is thought too precious to repudiate, by revamping it. The interpreter, without actually erasing or rewriting the text, is altering it. But he can’t admit to doing this. He claims to be only making it intelligible, by disclosing its true meaning. 
However far the interpreters alter the text (another notorious example is the Rabbinic and Christian “spiritual” interpretations of the clearly erotic Song of Songs), they must claim to be reading off a sense that is already there.

Interpretation in our own time, however, is even more complex. For the contemporary zeal for the project of interpretation is often prompted not by piety toward the troublesome text (which may conceal an aggression), but by an open aggressiveness, an overt contempt for appearances. The old style of interpretation was insistent, but respectful; it erected another meaning on top of the literal one. The modern style of interpretation excavates, and as it excavates, destroys; it digs “behind” the text, to find a sub-text which is the true one. The most celebrated and influential modern doctrines, those of Marx and Freud, actually amount to elaborate systems of hermeneutics, aggressive and impious theories of interpretation. All observable phenomena are bracketed, in Freud’s phrase, as manifest content. This manifest content must be probed and pushed aside to find the true meaning - the latent content - beneath. For Marx, social events like revolutions and wars; for Freud, the events of individual lives (like neurotic symptoms and slips of the tongue) as well as texts (like a dream or a work of art) - all are treated as occasions for interpretation. According to Marx and Freud, these events only seem to be intelligible. Actually, they have no meaning without interpretation. To understand is to interpret. And to interpret is to restate the phenomenon, in effect to find an equivalent for it.

Thus, interpretation is not (as most people assume) an absolute value, a gesture of mind situated in some timeless realm of capabilities. Interpretation must itself be evaluated, within a historical view of human consciousness. In some cultural contexts, interpretation is a liberating act. It is a means of revising, of transvaluing, of escaping the dead past. In other cultural contexts, it is reactionary, impertinent, cowardly, stifling.

4

Today is such a time, when the project of interpretation is largely reactionary, stifling. Like the fumes of the automobile and of heavy industry which befoul the urban atmosphere, the effusion of interpretations of art today poisons our sensibilities. In a culture whose already classical dilemma is the hypertrophy of the intellect at the expense of energy and sensual capability, interpretation is the revenge of the intellect upon art.

Even more. It is the revenge of the intellect upon the world. To interpret is to impoverish, to deplete the world - in order to set up a shadow world of “meanings.” It is to turn … [more]
art  interpretation  philosophy  theory  essays  susansontag  plato  artistotle  film  representation  innocence  nietzsche  proust  kafka  tennesseewilliams  jean-lucgodard  rolandbarthes  erwinpanofsky  northropfrye  walterbenjamin  yasujirōozu  robertbresson  culture  thought  senses  oscarwilde  willemdekooning  content  appearances  aesthetics  invisibile  myth  antiquity  karlmarx  freud  jamesjoyce  rainermariarilke  andrégide  dhlawrence  jeancocteau  alainresnais  alainrobbe-grillet  ingmarbergman  ezrapund  tseliot  dgriffith  françoistruffaut  michelangeloantonioni  ermannoolmi  criticism  pierrefrancastel  mannyfarber  dorothyvanghent  rndalljarrell  waltwhitman  williamfaulkner 
july 2016 by robertogreco
What Is an "Existential Crisis”?: An Animated Video Explains What the Expression Really Means | Open Culture
[video: https://www.youtube.com/watch?v=aEzMwNBjkAU ]

"“Who am I?” many of us have wondered at some point in our lives, “What am I? Where am I?”… maybe even—while gazing in bewilderment at the pale blue dot and listening to the Talking Heads—“How did I get here?”

That feeling of unsettling and profound confusion, when it seems like the hard floor of certainty has turned into a black abyss of endless oblivion…. Thanks to modern philosophy, it has a handy name: an existential crisis. It’s a name, says Alain de Botton in his School of Life video above, that “touches on one of the major traditions of European philosophy,” a tradition “associated with ideas of five philosophers in particular: Kierkegaard, Camus, Nietzsche, Heidegger, and Sartre.”

What do these five have in common? The question is complicated, and we can’t really point to a “tradition.” As the Internet Encyclopedia of Philosophy notes, Existentialism is a “catch-all term” for a few continental philosophers from the 19th and 20th centuries, some of whom had little or no association with each other. Also, “most of the philosophers conventionally grouped under this heading either never used, or actively disavowed the term ‘existentialist.’” Camus, according to Richard Raskin, thought of Existentialism as a “form of philosophical suicide” and a “destructive mode of thought.” Even Sartre, who can be most closely identified with it, once said “Existentialism? I don’t know what it is.”

But labels aside, we can identify many common characteristics of the five thinkers de Botton names that apply to our paralyzing experiences of supreme doubt. The video identifies five such broad commonalities of the “existential crisis”:

1. “It’s a period when a lot that had previously seemed like common sense or normal reveals its contingent, chance, uncanny, and relative nature…. We are freer than we thought.”

2. “We recognize we’d been deluding ourselves about what had to be…. We come to a disturbing awareness that our ultimate responsibility is to ourselves, not the social world.”

3. “We develop a heightened awareness of death. Time is short and running out. We need to re-examine our lives, but the clock is ticking.”

4. “We have many choices, but are, by the nature of the human condition, denied the information we would need to choose with ultimate wisdom or certainty. We are forced to decide, but can never be assured that we’ve done so adequately. We are steering blind.”

5. This means that anxiety is a “basic feature” of all human existence.

All of this, de Botton admits, can “seem perilous and dispiriting,” and yet can also ennoble us when we consider that the private agonies we think belong to us alone are “fundamental features of the human condition.” We can dispense with the trivializing idea, propagated by advertisers and self-help gurus, that “intelligent choice might be possible and untragic… that perfection is within reach.” Yet de Botton himself presents Existentialist thought as a kind of self-help program, one that helps us with regret, since we realize that everyone bears the burdens of choice, mortality, and contingency, not just us.

However, in most so-called Existentialist philosophers, we also discover another pressing problem. Once we become untethered from pleasing fictions of pre-existing realities, “worlds-behind-the-scene,” as Nietzsche put it, or “being-behind-the-appearance,” in Sartre’s words, we no longer see a benevolent hand arranging things neatly, nor have absolute order, meaning, or purpose to appeal to.

We must confront the fact that we, and no one else, bear responsibility for our choices, even though we make them blindly. It’s not a comforting thought, hence the “crisis.” But many of us resolve these moments of shock with varying degrees of wisdom and experience. As we know from another great thinker, Eleanor Roosevelt, who was not an Existentialist philosopher, “Freedom makes a huge requirement of every human being…. For the person who is unwilling to grow up… this is a frightening prospect.”"
existentialcrises  existentialism  philosophy  video  2016  kierkegaard  camus  nietzsche  heidegger  sartre  jean-paulsartre  albertcamus  humancondition  alaindebotton 
july 2016 by robertogreco
Speed Kills: Fast is never fast enough - The Chronicle of Higher Education
"In the past 50 years, two economies that operate at two different speeds have emerged. In one, wealth is created by selling labor or stuff; in the other, by trading signs that are signs of other signs. The virtual assets scale at a speed much greater than the real assets. A worker can produce only so many motorcycles, a teacher can teach only so many students, and a doctor can see only so many patients a day. In high-speed markets, by contrast, billions of dollars are won or lost in billionths of a second. In this new world, wealth begets wealth at an unprecedented rate. No matter how many new jobs are created in the real economy, the wealth gap created by the speed gap will never be closed. It will continue to widen at an ever-faster rate until there is a fundamental change in values.

One of the most basic values that must be rethought is growth, which has not always been the standard by which economic success is measured. The use of the gross national product and gross domestic product to evaluate relative economic performance is largely the product of the Cold War. As the battleground between the United States and the Soviet Union expanded to include the economy, the question became whether capitalism or communism could deliver more goods faster."



"The problem is not only, as Michael Lewis argues in Flash Boys, finding a technological fix for markets that are rigged; the problem is that the entire system rests on values that have become distorted: individualism, utility, efficiency, productivity, competition, consumption, and speed. Furthermore, this regime has repressed values that now need to be cultivated: sustainability, community, cooperation, generosity, patience, subtlety, deliberation, reflection, and slowness. If psychological, social, economic, and ecological meltdowns are to be avoided, we need what Nietzsche aptly labeled a “transvaluation of values.”"



"The growing concern about the effectiveness of primary, secondary, and postsecondary education has led to a preoccupation with the evaluation of students and teachers. For harried administrators, the fastest and most efficient way to make these assessments is to adopt quantitative methods that have proved most effective in the business world. Measuring inputs, outputs, and throughputs has become the accepted way to calculate educational costs and benefits. While quantitative assessment is effective for some activities and subjects, many of the most important aspects of education cannot be quantified. When people believe that what cannot be measured is not real, education and, by extension, society loses its soul.

Today’s young people are not merely distracted—the Internet and video games are actually rewiring their brains. Neuroscientists have found significant differences in the brains of "addicted" adolescents and "healthy" users. The next edition of the standard Diagnostic and Statistical Manual of Mental Disorders will very likely specify Internet addiction as an area for further research. The epidemic of ADHD provides additional evidence of the deleterious effects of the excessive use of digital media. Physicians concerned about the inability of their patients to concentrate freely prescribe Ritalin, which is speed, while students staying up all night to study take Ritalin to give them a competitive advantage.

Rather than resisting these pressures, anxious parents exacerbate them by programming their kids for what they believe will be success from the time they are in prekindergarten. But the knowledge that matters cannot be programmed, and creativity cannot be rushed but must be cultivated slowly and patiently. As leading scientists, writers, and artists have long insisted, the most imaginative ideas often emerge in moments of idleness.

Many people lament the fact that young people do not read or write as much as they once did. But that is wrong—the issue is not how much they are reading and writing; indeed they are, arguably, reading and writing more than ever before. The problem is how they are reading and what they are writing. There is a growing body of evidence that people read and write differently online. Once again the crucial variable is speed. The claim that faster is always better is nowhere more questionable than when reading, writing, and thinking.

All too often, online reading resembles rapid information processing rather than slow, careful, deliberate reflection. Researchers have discovered what they describe as an "F-shaped pattern" for reading web content, in which as people read down a page, they scan fewer and fewer words in a line. When speed is essential, the shorter, the better; complexity gives way to simplicity, and depth of meaning is dissipated in surfaces over which fickle eyes surf. Fragmentary emails, flashy websites, tweets in 140 characters or less, unedited blogs filled with mistakes. Obscurity, ambiguity, and uncertainty, which are the lifeblood of art, literature, and philosophy, become decoding problems to be resolved by the reductive either/or of digital logic.

Finally, vocationalization. With the skyrocketing cost of college, parents, students, and politicians have become understandably concerned about the utility of higher education. Will college prepare students for tomorrow’s workplace? Which major will help get a job? Administrators and admission officers defend the value of higher education in economic terms by citing the increased lifetime earning potential for college graduates. While financial matters are not unimportant, value cannot be measured in economic terms alone. The preoccupation with what seems to be practical and useful in the marketplace has led to a decline in the perceived value of the arts and humanities, which many people now regard as impractical luxuries.

That development reflects a serious misunderstanding of what is practical and impractical, as well as the confusion between the practical and the vocational. As the American Academy of Arts and Sciences report on the humanities and social sciences, "The Heart of the Matter," insists, the humanities and liberal arts have never been more important than in today’s globalized world. Education focused on STEM disciplines is not enough—to survive and perhaps even thrive in the 21st century, students need to study religion, philosophy, art, languages, literature, and history. Young people must learn that memory cannot be outsourced to machines, and short-term solutions to long-term problems are never enough. Above all, educators are responsible for teaching students how to think critically and creatively about the values that guide their lives and inform society as a whole.

That cannot be done quickly—it will take the time that too many people think they do not have.

Acceleration is unsustainable. Eventually, speed kills. The slowing down required to delay or even avoid the implosion of interrelated systems that sustain our lives does not merely involve pausing to smell the roses or taking more time with one’s family, though those are important.

Within the long arc of history, it becomes clear that the obsession with speed is a recent development that reflects values that have become destructive. Not all reality is virtual, and the quick might not inherit the earth. Complex systems are not infinitely adaptive, and when they collapse, it happens suddenly and usually unexpectedly. Time is quickly running out."
speed  health  life  trends  2014  via:anne  marktaylor  filippomarinetti  futurists  futuristmanifesto  modernism  modernity  charliechaplin  efficiency  living  slow  thorsteinveblen  wealth  inequality  values  us  growth  economics  writing  finance  education  highered  highereducation  communication  internet  web  online  complexity  systemsthinking  systems  humanities  liberalarts  stem  criticalthinking  creativity  reflection  productivity  reading  howweread  howwewrite  thinking  schools  schooling  evaluation  assessment  quantification  standardization  standardizedtesting  society  interdisciplinary  professionalization  specialization  transdisciplinary  multidisciplinary  learning  howwelearn  howwethink  neuroscience  slowness  deliberation  patience  generosity  consumption  competition  competitiveness  subtlety  sustainability  community  cooperation  nietzsche  capitalism  latecapitalism 
october 2014 by robertogreco
Empires Revolution of the Present - marclafia
"The film and online project brings together international philosophers, scientists and artists to give description and analysis to the contemporary moment as defined by computational tools and networks.

It states that networks are not new and have been forever with us in the evolution of our cities, trade, communications and sciences, in our relations as businesses and nation states, in the circulation of money, food, arms and our shared ecology.

Yet something has deeply changed in our experience of time, work, community, the global. Empires looks deeply to unravel how we speak to the realities of the individual and the notion of the public and public 'good' in this new world at the confluence of money, cities, computation, politics and science."

[Film website: http://www.revolutionofthepresent.org/ ]

[Trailer: https://vimeo.com/34852940 ]
[First cut (2:45:05): https://vimeo.com/32734201 ]

[YouTube (1:21:47): https://www.youtube.com/watch?v=HaTw5epW_QI ]

"Join the conversation at http://www.revolutionofthepresent.org

Summary: The hope was that network technology would bring us together, create a "global village," make our political desires more coherent. But what's happened is that our desires have become distributed, exploded into images and over screens our eyes relentlessly drop to view.

REVOLUTION OF THE PRESENT examines the strange effects — on cities, economies, people — of what we might call accelerated capitalism. Set against a visually striking array of sounds and images, 15 international thinkers speak to the complexity and oddity of this contemporary moment as they discuss what is and what can be.

Documentary Synopsis:
Humanity seems to be stuck in the perpetual now that is our networked world. More countries are witnessing people taking to the streets in search of answers. Revolution of the Present, the film, features interviews with thought leaders designed to give meaning to our present and precarious condition. This historic journey allows us to re-think our presumptions and narratives about the individual and society, the local and global, our politics and technology. This documentary analyzes why the opportunity to augment the scope of human action has become so atomized and diminished. Revolution of the Present is an invitation to join the conversation and help contribute to our collective understanding.

As Saskia Sassen, the renowned sociologist, states at the outset of the film, 'we live in a time of unsettlement, so much so that we are even questioning the notion of the global, which is healthy.' One could say that our film raises more questions than it answers, but this is our goal. Asking the right questions and going back to beginnings may be the very thing we need to do to understand the present, and to move forward from it with a healthy skepticism.

Revolution of the Present is structured as an engaging dinner conversation: there is no narrator telling you what to think; it is not a film of fear of the end time or accusation; it is an invitation to sit at the table and join an in-depth conversation about our diverse and plural world."

[See also: http://hilariousbookbinder.blogspot.com/2014/09/rethinking-internet-networks-capitalism.html ]

[Previously:
https://pinboard.in/u:robertogreco/b:ec1d3463d74b
https://pinboard.in/u:robertogreco/b:9f60604ec3b3 ]
marclafia  networks  philosophy  politics  science  money  cities  scale  economics  capitalism  2014  kazysvarnelis  communication  communications  business  work  labor  psychology  greglindsay  saskiasassen  urban  urbanism  freedom  freewill  howardbloom  juanenríquez  michaelhardt  anthonypagden  danielisenberg  johnhenryclippinger  joséfernández  johannaschiller  douglasrushkoff  manueldelanda  floriancrammer  issaclubb  nataliejeremijenko  wendychun  geertlovink  nishantshah  internet  online  web  danielcoffeen  michaelchichi  jamesdelbourgo  sashasakhar  pedromartínez  miguelfernándezpauldocherty  alexandergalloway  craigfeldman  irenarogovsky  matthewrogers  globalization  networkedculture  networkculture  history  change  nationstates  citystates  sovreignty  empire  power  control  antonionegri  geopolitics  systems  systemsthinking  changemaking  meaningmaking  revolution  paradigmshifts  johnlocke  bourgeoisie  consumption  middleclass  class  democracy  modernity  modernism  government  governence  karlmarx  centralization  socialism  planning  urbanplanning  grass 
october 2014 by robertogreco
An Emphatic Umph: Death and the Afterlife
"The other day, I was spending time with a friend and every time I chuckled, she'd say, That's your brother! That's his laugh! Think about what an insane thing that is to say. I wasn't quite sure I knew what she meant at that juncture but I do know the experience of being possessed by my brother. Usually, I feel it when I'm holding forth. Oh, lord, when I was teaching, I'd be mid-lecture when all I could hear, all I could feel, was my brother spouting — sprouting — up through my mouth, a kind of Ouija board.

My brother lives in Manila, in the Philippines. But he also lives right here — in me, as me, with me, at least a little. My sister is dead and she, too, lives right here — in me, as me, with me. Death, the Philippines, across town, it doesn't matter: our possession of and by other people transcends time and space, transcends body and ego. This can, of course, be to our dismay. I have familial forces working in me that I'd like to dispel. In fact, in order not to be a total asshole of a father — the key word here being total — I have to wrestle, stifle, and muffle the paternal voices that live in me, that live as me, that haunt me all the time.

We live with ghosts. This is not some supernatural thing, some mystical claim. Events are not discrete. When something happens, it doesn't just begin then end. It continues to happen more or less. This is called, amongst other things, memory. Memory is not a card catalog of snapshots. Memory is the presence of the past, here and now. It's my tying my shoe, craving rice noodles for dinner, knowing the way to my son's school. It's also the smell of my childhood house; it's falling into a pile of dog shit at the ever-sad PS 165 playground and then my five-year-old ass being asked to strip for a bath by the Jamaican nanny I could never understand; it's the wide, radiant, true smile of my sister as well as her confused, sad, skinny face days before she died; it's the daily screaming of my parents that still echoes in my skull. It's everything that's ever happened to me and is still happening to me, right here, right now.

We are events, each of us. We continue just as the things that happen to us continue. Sure, they seem done and gone but they — but we — persist in various ways, as echoes and sentiments, as shadows and gestures, as scars and dreams."
danielcoffeen  douglain  death  2014  kierkegaard  ghosts  afterlife  religion  buddhism  meaning  meaningmaking  living  consciousness  williamsburroughs  nietzsche  foucault  jacquesderrida  paulricoeur  pauldeman  marclafia  memory  softarchitecture  lisarobertson  mortality  aubreydegrey  immortality  events  experience  time  memories  writing  transcendence  deleuze  plato  michelfoucault 
october 2014 by robertogreco
Meta is Murder - Mills Baker's Internet Haus of Cards
"One such principle is well phrased by Marilynne Robinson in her essay “When I was a Child,” in her collection When I Was a Child I Read Books:
"It may be mere historical conditioning, but when I see a man or a woman alone, he or she looks mysterious to me, which is only to say that for a moment I see another human being clearly."

The idea that a human seen clearly is a mystery is anathema to a culture of judgment —such as ours— which rests on a simple premise: humans can be understood by means of simple schema that map their beliefs or actions to moral categories. Moreover, because there are usually relatively few of these categories, and few important issues of discernment —our range of political concerns being startlingly narrow, after all— humans can be understood and judged at high speed in large, generalized groups: Democrats, Republicans, women, men, people of color, whites, Muslims, Christians, the rich, the poor, Generation X, millennials, Baby Boomers, and so on.

It should but does not go without saying that none of those terms describes anything with sufficient precision to support the kinds of observations people flatter themselves making. Generalization is rarely sound. No serious analysis, no serious effort to understand, describe, or change anything can contain much generalization, as every aggregation of persons introduces error. One can hardly describe a person in full, let alone a family, a city, a class, a state, a race. Yet we persist in doing so, myself included."



"One of the very best things Nietzsche ever wrote:
"The will to a system is a lack of integrity."

But to systematize is our first reaction to life in a society of scale, and our first experiment as literate or educated or even just “grown-up” persons with powers of apprehension, cogitation, and rhetoric. What would a person be online if he lacked a system in which phenomena could be traced to the constellation of ideas which constituted his firmament? What is life but the daily diagnosis of this or that bit of news as “yet another example of” an overarching system of absolutely correct beliefs? To have a system is proof of one’s seriousness, it seems —our profiles so often little lists of what we “believe,” or what we “are”— and we coalesce around our systems of thought just as our parents did around their political parties, though we of course consider ourselves mere rationalists following the evidence. Not surprisingly, the evidence always leads to the conclusion that many people in the world are horrible, stupid, even evil; and we are smart, wise, and good. It should be amusing, but it is not.

I hate this because I am doing this right now. I detest generalization because when I scan Twitter I generalize about what I see: “people today,” or “our generation,” I think, even though the people of today are as all people always have been, even though they are all just like me. I resent their judgments because I feel reduced by them and feel reality is reduced, so I reduce them with my own judgments: shallow thinkers who lack, I mutter, the integrity not to systematize. And I put fingers to keys to note this system of analysis, lacking all integrity, mocking my very position.

I want to maintain my capacity to view each as a mystery, as a human in full, whose interiority I cannot know. I want not to be full of hatred, so I seek to confess that my hatred is self-hatred: shame at the state of my intellectual reactivity and decay. I worry deeply that our systematizing is inevitable because when we are online we are in public: that these fora mandate performance, and worse, the kind of performance that asserts its naturalness, like the grotesquely beautiful actor who says, "Oh, me? I just roll out of bed in the morning and wear whatever I find lying about" as he smiles a smile so practiced it could calibrate the atomic clock. Every online utterance is an angling for approval; we write in the style of speeches: exhorting an audience, haranguing enemies, lauding the choir. People “remind” no one in particular of the correct ways to think, the correct opinions to hold. When I see us speaking like op-ed columnists, I feel embarrassed: it is like watching a lunatic relative address passers-by using the “royal we,” and, I feel, it is pitifully imitative. Whom are we imitating? Those who live in public: politicians, celebrities, “personalities.”

There is no honesty without privacy, and privacy is not being forbidden so much as rendered irrelevant; privacy is an invented concept, after all, and like all inventions must contend with waves of successive technologies or be made obsolete. The basis of privacy is the idea that judgment should pertain only to public acts —acts involving other persons and society— and not the interior spaces of the self. Society has no right to judge one’s mind; society hasn’t even the right to inquire about one’s mind. The ballot is secret; one cannot be compelled to testify or even talk in our criminal justice system; there can be no penalty for being oneself, however odious we may find given selves or whole (imagined) classes of selves.

This very radical idea has an epistemological basis, not a purely moral one: the self is a mystery. Every self is a mystery. You cannot know what someone really is, what they are capable of, what transformations of belief or character they might undergo, in what their identity consists, what they’ve inherited or appropriated, what they’ll abandon or reconsider; you cannot say when a person is who she is, at what point the “real” person exists or when a person’s journey through selves has stopped. A person is not, we all know, his appearance; but do we all know that she is not her job? Or even her politics?

But totalizing rationalism is emphatic: either something is known or it is irrelevant. Thus: the mystery of the self is a myth; there is no mystery at all. A self is valid or invalid, useful or not, correct or incorrect, and if someone is sufficiently different from you, if their beliefs are sufficiently opposed to yours, their way of life alien enough, they are to be judged and detested. Everyone is a known quantity; simply look at their Twitter bio and despise.

But this is nonsense. In truth, the only intellectually defensible posture is one of humility: all beliefs are misconceptions; all knowledge is contingent, temporary, erroneous; and no self is knowable, not truly, not to another. We can perhaps sense this in ourselves —although I worry that many of us are too happy to brag about our conformity to this or that scheme or judgment, to use labels that honor us as though we’ve earned ourselves rather than chancing into them— but we forget that this is true of every single other, too. This forgetting is the first step of the so-called othering process: forget that we are bound together in irreducibility, forget that we ought to be humble in all things, and especially in our judgments of one another.

Robinson once more:
"Only lonesomeness allows one to experience this sort of radical singularity, one’s greatest dignity and privilege."

Lonesomeness is what we’re all fleeing at the greatest possible speed, what our media now concern themselves chiefly with eliminating alongside leisure. We thus forget our radical singularity, a personal tragedy, an erasure, a hollowing-out, and likewise the singularity of others, which is a tragedy more social and political in nature, and one which seems to me truly and literally horrifying. Because more than any shared “belief system” or political pose, it is the shared experience of radical singularity that unites us: the shared experience of inimitability and mortality. Anything which countermands our duty to recognize and honor the human in the other is a kind of evil, however just its original intention."
millsbaker  canon  self  reality  empathy  humility  howwethink  2014  generalizations  morality  nietzsche  integrity  marilynnerobinson  mystery  grace  privacy  categorization  pigeonholing  singularity  lonesomeness  loneliness  leisure  artleisure  leisurearts  beliefs  belief  inimitability  humanism  judgement  familiarity  understanding 
august 2014 by robertogreco
BBC News - The slow death of purposeless walking
"A number of recent books have lauded the connection between walking - just for its own sake - and thinking. But are people losing their love of the purposeless walk?"
walking  thinking  2014  flaneur  wandering  charlesdickens  georgeorwell  patrickleigh  constantinbrancusi  thoreau  thomasdequincey  nassimtaleb  nietzsche  brucechatwin  wgebald  johnfrancis  fredericgros  geoffnicholson  merlincoverley  observation  attention  mindfulness  rebeccasolnit  finlorohrer  vladimirnabokov 
may 2014 by robertogreco
BOMB Magazine — Etel Adnan by Lisa Robertson
"EA: … Galleries wait for artists to be recognized and then they all solicit the same ones. That happened to me, but I had to say no, because I can’t produce. I can paint, but I can’t produce. I always have done that, even when I was younger. Visual art is big industry; lots of money moves around, which is okay, it’s vital. But it’s also a bit of a heartbreak—I wish this had happened, let’s say, twenty years ago. It’s a nice feeling to have your work appreciated, but it’s almost a fashion for women to be recognized late in life. Agnes Martin, for example. It’s a trend, but we hope it will change."



"LR I’ve been rereading your books in the past two weeks, three or four of them. I read this beautiful line in Seasons this morning: “Women are keepers of their own story therefore they are historians.” I put that in relation to images in your work. Lately, I have been thinking a lot about images—about how the image works in Baudelaire, for example. It’s not only a visual or optical event, it’s happening across all the senses. It’s a poly-sensual perceiving.

EA Yes!


LR So I have two questions. One is about the relationship between the image in poetry and the image in painting, and the other one, which might not be related to the first, is about women’s images. In an interview with Steve McQueen in The Guardian about his film 12 Years a Slave, he said, “Some images have never been seen before. I needed to see them.” It resonated for me in relationship to your work. You are making images that have not been seen. Some of that might have to do with the fact that you are making women’s images. Do you feel that?

EA Until now at least, a woman’s life, her psyche . . . we don’t like the word essence anymore. As women, of course, we are different from each other as people, but we are also different from men. Or we have been up until now. So we have our own images. We’ve had little girls’ lives, so we carry that. When I grew up in Beirut, there weren’t many sports for boys or girls, but certainly girls were aware of being little girls, of being in. This idea of the outside and the inside works very strongly in women’s lives. In fact, women are rooted somewhere, they are stronger physically. Women are containers—the baby is in their belly; making love is receiving. This container contains hearts and stomachs. Images are, in one way, what we receive, but they are also the tools with which we think. To make images, you think with them, somehow. You mentioned Baudelaire. For Baudelaire, images work not like shapes, but like ideas made visible. He was particularly interested in the encounter between what we call the inner world and the outer world. And poetry deals magnificently with that. It is one of the major definitions of poetry. It addresses that relationship between what we call the subject and the object, which melt in what we call consciousness. Sometimes we transcribe this state of mind into words and call it a poem or a text. The same is true for the other arts. Writing is a very mysterious activity. When you write, you say things that would not have occurred to your mind otherwise. I don’t know if the fact that we don’t use paper and ink anymore affects writing. On a computer it’s a new situation.

LR Do you write on a computer?

EA My poetry is not long. I write in little paragraphs and they pile up, so I do it by hand. But I am more and more obligated to answer letters or emails, so then I use a computer. But to go back to what an image is—

LR That’s my real question. (laughter)


[image: Afternoon Poem, 1968, ink and watercolor on paper, 8 1/2 × 96 inches]
EA For example, I look at this table in front of me. Somebody over there, however, may look at it and not see it. Seeing is an activity; it is not passive.

LR The last sentence I read before I got off the metro on my way here was, “Behind an image there’s the image.”

EA There are layers of images—that’s what I meant, very simply. There is thickness. Vision is multidimensional and simultaneous. You can think, see, see beyond: you can do all these things at the same time. Your psyche, your brain catches up. Some people today say that an image is not necessarily a clear figuration of something; it could be like a blurred abstract drawing, like a sliding door.

LR An event in perceiving.

EA Yes, an event. It is a speed that you catch. Images are not still. They are moving things. They come, they go, they disappear, they approach, they recede, and they are not even visual—ultimately they are pure feeling. They’re like something that calls you through a fog or a cloud.

LR So they are immaterial, in a way.

EA That’s it! They are immaterial in essence. But they could be strongly defined, or they could be fleeting, almost like a ghost of things or of feelings going by. So the word image is very elastic. It’s a very rich concept. Although we are bombarded with images, our culture is anti-image. We think we don’t like it; it’s not fashionable. That is why Surrealism exists: it intends to amplify the image, to force us to see it. Andy Warhol understood that we are surrounded by so many things, and people, that we do not see them. We are rather blinded by them. So he forced our attention on soup cans and Marilyn Monroe.

On another level, there are also different clarities. Some things are not meant to be clear; obscurity is their clarity. We should not underestimate obscurity. Obscurity is as rich as luminosity."



"EA I went to Catholic schools all my life. There were no other schools in Lebanon. We had religion around all the time. I’m lucky—I never believed in catechism or any of that. I was always a dissident without effort, at a distance from all the things the nuns were saying. I never liked saints. What touched me was their speaking of revelation, even the word itself. That always made sense to me. We owe life to the existence of the sun; therefore light is a very profound part of our makeup. It’s spiritual, in the way that even DNA is spiritual. What we call “spirit” is energy. It’s the definition of life, in one sense. Light, as an object, as a phenomenon, is magnificent. I am talking to you and the light coming in through the window has already changed. You go on the street and you look at the sky and it tells you what time it is. We are dealing with it constantly, and obscurity is also maybe its own light, because it shows you things. Obscurity is not lack of light. It is a different manifestation of light. It has its own illumination."



"LR One of the things I really appreciate in your poems is this very quick and subtle shift of register in the language. So many different idiolects enter into the stanzas or paragraphs that you write, which I actually think of as images in the way we were discussing.

EA What do you mean by “idiolects”?

LR Well, extreme colloquialisms right up against much more subtle, highly literary language.

EA Oh, I don’t realize that I’m doing that. That’s not a decision. I write as things come to my mind, maybe because I love philosophy, but I don’t love theory. There is a big difference. Not that I don’t respect theory, but I am incapable of writing it or even reading it."



"LR That is a beautiful book.

EA Howe manages to show how you should read a writer. The writer is unique, but is also part of a context. You can only approximate what a writer might have said. Philosophy is freer now, and for that reason Heidegger could say that the great philosophers were the poets. That a real, trained philosopher like Heidegger would come to that is very important to poets. Poets were afraid to think and philosophers were afraid to let go, to let loose and speak of themselves as part of their thinking. This boundary has been broken down. I love contemporary poetry because it moves between what we call poetry and what we call philosophy. It joins these fields and makes writing more natural, as in how it is lived in the person. We don’t separate thinking from feeling in real life, so why should we separate it in writing? The life of the mind is one and the boundaries and the categories are useful tools. We made them realities, but they are not realities—they are only tools, categories.

This existed before. In Hölderlin, for example, there is a lot of Romantic German thinking. I’d say Ezra Pound is more of a philosopher than we realize. There is a great presence of thinking in his poetry. Of course there is thinking when you write, but I mean thinking as such—

LR Approaching a problem.

EA That’s it! I find it in Pound. And there is political thinking in Charles Olson, whom I like very much. There is what they call proprioception, which comes very close to thinking—in Robert Creeley, for instance."



"LR The love of the world?

EA Yes. I don’t call it “nature”; I call it “the world.”

LR Well, what is the difference between them?

EA It’s historical. By nature we always mean landscapes. Language! The world is really the word; it’s the fact that it is.

LR Its isness.

EA It is and I love that. It distracted me from other forms of love. At the end of my life, I realize that the love of a person is a key to the world. Nothing matters more. To love a person in particular is the most difficult form of love, because it involves somebody else’s freedom. That is where misunderstandings come in; two people don’t have necessarily the same timing. You may love books and you may love paintings. They have their own technical difficulties, you fight with them, but you are the master of that fight.

LR Are you talking about time and timing? I mean, if you love a book or a painting, it’s more or less stable.

EA At least you are on top; it depends more on you. But a person has priorities, his or her problems, his or her character—you can’t control that and you don’t want to anyway. I mean, your freedom … [more]
eteladnan  lisarobertson  interviews  2014  obscurity  writing  light  art  gender  women  shadows  night  nighttime  joannekyger  philosophy  canon  idiolects  colloquialisms  language  literature  poetry  poems  susanhowe  nietzsche  heidegger  nature  balzac  baudelaire  love  friendship  time  timing  relationships  invention  making  images  thinking  howwethink  howwework  howwewrite  posthumanism  beirut  lebanon  paris  berkeley  ucberkeley 
april 2014 by robertogreco
Messages The City Wants Us To Hear – The New Inquiry
"Some excerpts from Timothy “Speed” Levitch’s Speedology (2002)

1. The Fastest Way to Adventure is to Stand Still

Boredom is an illusion. Boredom is the continuous state of not noticing that the unexpected is constantly arriving while the anticipated is never showing up. Boredom is anti-cruise propaganda.

2. The City as Autobiography

We are not visitors, tourists, nor inhabitants of New York City; we are New York City. The city is our moving self-portrait and a living art installation carved out on an island of rock, even the cracks of the sidewalk are crying out on the topic of our lives. The city is a profound opportunity to understand ourselves.

3. This is No Time for Historical Accuracy

Nothing I say can possibly be defended. I am not interested in being right or wrong; my priority is to be joyous.

As a tour guide, I approach history the same way Charlie Parker would approach a jazz standard. I am not here to recapitulate the notes exactly as they were composed but to find myself within the notes and collaborate with what has been before me to chase after everything I could ever be. My study of history is mostly an attempt to impress women.

4. Fear is Joy Paralyzed

Society— the greatest self-hatred the earth has ever witnessed— is a mediocre improv comedy piece we’re all living despite ourselves, one that would be impossible without fear effectively taking on ingenious disguises throughout the adventure of each and every day […] We do not have agendas, agendas have us.

5. Gregariousness is Great

New York City is a summoning of souls and a tribal ceremony of collected ancient agonies and conflicts brought to a new landscape for healing. A New Yorker is someone who runs wild with healing.

6. The Soul is the Only Landmark

Salvation is seeing everything as it already is.

7. Being Alive is Sexy

The world is an involuntary orgy.

8. What is Created is Destroyed

Many decry the destruction of Pennsylvania Station, the great Beaux-Arts railroad terminal that was knocked down and replaced by the fourth Madison Square Garden. They ask, “If the city is a great teacher, why would it destroy a great building and put a lousy one in its place?” The answer: Pennsylvania Station was too beautiful. The anecdote may be a catastrophe from a preservationist’s point of view, but it is a masterpiece from a dramatist’s. It’s just the way Tennessee Williams would have written it. Many will then ask, “Why is the city issuing forth these dramas?” The answer: the city wants to entertain us.

9. The Most Significant Thing About Suffering is That We’re All Doing It

[…]

10. Our True Selves Are the Greatest Parties Ever Thrown

You are a better party than you have ever been to. […] To live in a city is to realize that life is a procession of different versions of ourselves that we meet over time. Evolving is the meeting between who you were and who you just became.

11. Having Faith in Humanity is Supposed to be Fun

Fun is active faith. Faith is the celebration of “I don’t know.” The city is a bravely unfolding movie entertaining us so effectively we are hypnotized by it. The movie is a comedy about mammals in a movie taking the movie seriously and deciding it is a tragedy.

12. I Am Not Getting Laid

I want to make it clear, from the beginning, that I am not currently getting laid as I write this and this fact colors everything I say. It’s the one statement that makes perfect sense of Nietzsche’s work.

Bennett Miller’s 1998 documentary, The Cruise is one of the greatest films ever made about New York City."
boredom  cities  nyc  history  accuracy  fear  joy  society  life  living  2010  timothylevitch  speedology  2002  suffering  humanity  faith  nietzsche  bennettmiller  destruction  creativity 
november 2013 by robertogreco
To see is to forget the name of the thing one sees | Grand Strategy: The View from Oregon
"“To see is to forget the name of the thing one sees.” This is a quote frequently attributed to Paul Valéry, and the line has a quality that is at once both searching and poetic, making the attribution reasonable. I don’t know if Valéry actually said it (I can’t find the source of the quote), but I think of this line every once in a while: my mind returns to it as to an object of fascination. A good aphorism is perennially pregnant with meaning, and always repays further meditation.

If seeing is forgetting the name of the thing one sees, and mutatis mutandis for the aesthetic experiences that follow from the other senses — e.g., to taste is to forget the name of thing one tastes, and so forth — we may take the idea further and insist that it is the forgetting of not only the name but of all the linguistic (i.e., formal) accretions, all categorizations, and all predications, that enables us to experience the thing in itself (to employ a Kantian locution). What we are describing is the pursuit of prepredicative experience after the fact (to employ a Husserlian locution).

This is nothing other than the familiar theme of seeking a pure aesthetic experience unmediated by the intellect, undistracted by conceptualization, unmarred by thought — seeing without thinking the seen. In view of this, can we take the further step, beyond the generalization of naming, extending the conceit to all linguistic formalizations, so that we arrive at a pure aesthesis of thought? Can we say that to think is to forget the name of the thing one thinks?

The pure aesthesis of thought, to feel a thought as one feels an experience of the senses, would be thought unmediated by the conventions of naming, categories, predication, and all the familiar machinery of the intellect, i.e., thought unmediated by the accretions of consciousness. It would be thought without all that we usually think of as being thought. Is such thought even possible? Is this, perhaps, unconscious thought? Is Freud the proper model for a pure aesthesis of thought? Possible or not, conscious or not, Freudian or not, the pursuit of such thought would constitute an effort of thought that must enlarge our intellectual imagination, and the enlargement of our imagination is ultimately the enlargement of our world.

Wittgenstein famously wrote that the limits of my language are the limits of my world (Tractatus Logico-Philosophicus, 5.6 — this is another wonderful aphorism that always repays further meditation). But the limits of language can be extended; we can systematically seek to transcend the limits of our language and thus the limits of our world, or we can augment our language and thus augment our world. Russell, Wittgenstein’s mentor and one-time collaborator, rather than focusing on limits of the self, developed an ethic of impersonal self-enlargement, i.e., the transgression of limits. In the last chapter of his The Problems of Philosophy Russell wrote:
All acquisition of knowledge is an enlargement of the Self, but this enlargement is best attained when it is not directly sought. It is obtained when the desire for knowledge is alone operative, by a study which does not wish in advance that its objects should have this or that character, but adapts the Self to the characters which it finds in its objects. This enlargement of Self is not obtained when, taking the Self as it is, we try to show that the world is so similar to this Self that knowledge of it is possible without any admission of what seems alien. The desire to prove this is a form of self-assertion and, like all self-assertion, it is an obstacle to the growth of Self which it desires, and of which the Self knows that it is capable. Self-assertion, in philosophic speculation as elsewhere, views the world as a means to its own ends; thus it makes the world of less account than Self, and the Self sets bounds to the greatness of its goods. In contemplation, on the contrary, we start from the not-Self, and through its greatness the boundaries of Self are enlarged; through the infinity of the universe the mind which contemplates it achieves some share in infinity.

The obvious extension of this conception of impersonal self-enlargement to an ethics of thought enjoins the self-enlargement of the intellect, the transgression of the limits of the intellect. It is the exercise of imagination that enlarges the intellect, and a great many human failures that we put to failures of understanding and cognition are in fact failures of imagination.

The moral obligation of self-enlargement is a duty of intellectual self-transgression. As Nietzsche put it: “A very popular error: having the courage of one’s convictions; rather it is a matter of having the courage for an attack on one’s convictions!”"

[Came here today because https://twitter.com/rogre/status/403632186944790528 + https://twitter.com/rogre/status/403632476154626048 + https://twitter.com/rogre/status/403636512656334848
thus the tagging with Robert Irwin, Lawrence Weschler, and Clarice Lispector]
paulvaléry  wittgenstein  thought  language  aphorism  mind  memory  senses  familiarization  robertirwin  lawrenceweschler  naming  categorization  predication  freud  bertrandrussell  self  philosophy  claricelispector  knowledge  knowledgeacquisition  self-enlargement  nietzsche  brasil  brazil  literature 
november 2013 by robertogreco
Notebooks
"Burned all my notebooks
What good are notebooks
If they won't help me survive?

But a curiosity of my type remains after all the most agreeable of all vices --- sorry, I meant to say: the love of truth has its reward in heaven and even on earth." ---Nietzsche, Beyond Good and Evil, 45

'They're, well, notebooks --- things I find amusing, outrageous, strange or otherwise noteworthy; notes towards works-in-glacial-progress; hemi-demi-semi-rants; things I want to learn more about; lists of references; quotations from the Talking Heads where appropriate. If you can help with any of these, I'd be grateful; if you can tell me of anything I can profitably prune, I'd be even more grateful.

There is a list of frequently asked questions (FAQ), along with answers, and a colophon, which explains more than anyone would want to know about how these pages are put together. If your question isn't answered in either place, feel free to write, though, sadly, I can't promise a timely reply.'
notes  curiosity  nietzsche  commonplacebooks  notetaking  notebooks  via:selinjessa  cosmashalizi  unbook 
september 2012 by robertogreco
How Do We Identify Good Ideas? | Wired Science | Wired.com
"Nietzsche stressed this point. As he observed in his 1878 book Human, All Too Human:

"Artists have a vested interest in our believing in the flash of revelation, the so-called inspiration…shining down from heavens as a ray of grace. In reality, the imagination of the good artist or thinker produces continuously good, mediocre or bad things, but his judgment, trained and sharpened to a fine point, rejects, selects, connects…All great artists and thinkers are great workers, indefatigable not only in inventing, but also in rejecting, sifting, transforming, ordering.""
2012  imagination  editing  rejection  ideas  nietzsche  sifting  sorting  creativity  thinking  artists  jonahlehrer 
january 2012 by robertogreco
Nietzsche, Use and Abuse of History (e-text)
[Google cache of: http://records.viu.ca/~johnstoi/Nietzsche/history.htm ]

"This is a parable for every individual among us. He must organize the chaos in himself by recalling in himself his own real needs. His honesty, his more courageous and more genuine character, must at some point or other struggle against what will only be constantly repeated, relearned, and imitated. He begins then to grasp that culture can still be something other than a decoration of life, that is, basically always only pretence and disguise; for all ornamentation covers over what is decorated. So the Greek idea of culture reveals itself to him, in opposition to the Roman, the idea of culture as a new and improved nature, without inner and outer, without pretence and convention, culture as a unanimous sense of living, thinking, appearing, and willing. Thus, he learns out of his own experience that it was the higher power of moral nature through which the Greeks attained their victory over all other cultures and that each increase of truthfulness must also be…"
nietzsche  history  goethe  culture  greeks  romans  youth  honesty  morality  toread  via:timcarmody 
july 2011 by robertogreco
Conan O’Brien’s Dartmouth Commencement Address ... - AUSTIN KLEON : TUMBLR
"whole address is so good, but I keep coming back to… [part] about how failure to perfectly copy our heroes leads to finding our own voice…

"Way back in the 1940s there was a very, very funny man named Jack Benny. He was a giant star, easily one of the greatest comedians of his generation. And a much younger man named Johnny Carson wanted very much to be Jack Benny. In some ways he was, but in many ways he wasn’t. He emulated Jack Benny, but his own quirks and mannerisms, along with a changing medium, pulled him in a different direction. And yet his failure to completely become his hero made him the funniest person of his generation. David Letterman wanted to be Johnny Carson, and was not, and as a result my generation of comedians wanted to be David Letterman. And none of us are. My peers and I have all missed that mark in a thousand different ways. But the point is this : It is our failure to become our perceived ideal that ultimately defines us and makes us unique.""
conano'brien  dartmouth  creativity  voice  identity  humor  2011  change  mannerisms  johnnycarson  davidletterman  jackbenny  failure  copying  mimicry  quirkiness  personality  mutations  babyboomers  uniqueness  success  nietzsche  disappointment  socialmedia  innovation  spontaneity  satisfaction  convictions  fear  reinvention  perceivedfailure  self-defintion  clarity  originality  commencementspeeches  boomers  commencementaddresses 
june 2011 by robertogreco
The Philosophy of Insomnia - The Chronicle Review - The Chronicle of Higher Education
"Insomnia has intrigued thinkers since the ancients, an interest that continues today, especially in Europe. What light does philosophy's exploration of the dark of night shine on insomnia, particularly for that quintessential insomniac, the scholar?…

The first thing you learn about insomnia is that it sees in the dark. The second is that it sees nothing. Nada, nichts, néant. The French philosopher Maurice Blanchot said in The Writing of the Disaster (1980), "In the night, insomnia is discussion, not the work of arguments bumping against other arguments, but the extreme shuddering of no thoughts, percussive stillness."

[via: http://tumble77.com/post/5041107129/the-philosophy-of-insomnia ]
philosophy  sleep  insomnia  religion  willisregier  aristotle  nietzsche  plato  emilcioran 
may 2011 by robertogreco
Sartre, Heidegger, Nietzsche: Three Philosophers in Three Hours | Open Culture
"“Human, All Too Human” is a three-hour BBC series from 1999, about the lives and work of Friedrich Nietzsche, Martin Heidegger, and Jean-Paul Sartre. The filmmakers focus heavily on politics and historical context — the Heidegger hour, for example, focuses almost exclusively on his troubling relationship with Nazism.

The most engaging chapter is “Jean-Paul Sartre: The Road to Freedom,” in part because the filmmakers had so much archival footage and interview material (Check out a still lovely Simone de Beauvoir at minute 9:00, giggling that Sartre was the ugliest, dirtiest, most unshaven student at the Sorbonne).

A note on Part 2: Thinking the Unthinkable. We linked to the YouTube version, which has a slight whistle in the background. Catch a cleaner version here on Google Video while you still can."
culture  philosophy  video  towatch  jean-paulsartre  sartre  heidegger  nietzsche  via:javierarbona  simonedebouvoir  documentary 
april 2011 by robertogreco
Social contract - Wikipedia
"This raises the question of whether social contractarianism, as a central plank of liberal thought, is reconcilable with the Christian religion, and particularly with Catholicism and Catholic social teaching. The individualist and liberal approach has also been criticized since the 19th century by thinkers such as Marx, Nietzsche & Freud, and afterward by structuralist and post-structuralist thinkers, such as Lacan, Althusser, Foucault, Deleuze or Derrida."
socialcontract  philosophy  politics  social  history  karlmarx  marxism  nietzsche  freud  deleuze  foucault  louisalthusser  lacan  christianity  individualism  liberalthought  post-structuralism  stucturalism  religion  jacquesderrida  gillesdeleuze  michelfoucault  althusser 
april 2011 by robertogreco
The Virtues Of Play | Wired Science | Wired.com
"Nietzsche said it best: “The struggle of maturity is to recover the seriousness of the child at play.” While parents might be tempted to enroll their kids in preschools that seem the most “academic,” that’s probably a mistake. There is nothing frivolous about play."
education  play  children  psychology  games  reggioemilia  montessori  kindergarten  preschool  unschooling  deschooling  jonahlehrer  nietzsche  learning  academics  reading  math  tcsnmy  schools  damagedbyschools  cognition  parenting 
march 2011 by robertogreco
On the pleasures of reading Kant. « The Pinocchio Theory
"Some philosophers are such great writers and stylists that they are a pleasure to read — even in translation. Plato and Nietzsche are the most obvious examples, though I’d also include Spinoza, Hume, and Wittgenstein, at the very least, on my short list of great philosophical stylists. And the rhetorical effects of style are a big part of what attracts readers to such philosophers — Nietzsche, especially, seduces more on account of his style than on account of his actual arguments. This is not necessarily a bad thing; it’s a delusion, in any case, to think that you can separate logic from rhetoric, or content from style. Even mathematicians value “elegant” proofs. In things less cut and dried than mathematics — like metaphysics and ethics — style and rhetoric are even more important…"
philosophy  kant  rhetoric  stylists  writing  style  wittgenstein  nietzsche  hume  spinoza  plato  socrates 
march 2011 by robertogreco
International Philosophy Sketch from Monty Python
"The Germans playing 4-2-4, Leibniz in goal, back four Kant, Hegel, Schopenhauer and Schelling, front-runners Schlegel, Wittgenstein, Nietzsche and Heidegger, and the mid-field duo of Beckenbauer and Jaspers. Beckenbauer obviously a bit of a surprise there."
humor  philosophy  football  satire  film  montypython  wittgenstein  kant  nietzsche  heidegger  hegel  leibniz  plato  socrates  aristotle  archimedes  sophocles  ancientgreece  soccer  sports  futbol 
march 2011 by robertogreco
potlatch: blogging and its opposite
"So while academic text is littered with references and footnotes (Johnson 2004), perhaps blogging text should be littered with confessions of physio- and psychological weakness (I Will Davies am getting balder, fatter and consequently more irritable)."
blogging  thinking  critique  writing  academia  nietzsche  blogs  footnotes  psychology  via:blackbeltjones 
july 2008 by robertogreco