
nhaliday : history   1627

As We May Think - Wikipedia
"As We May Think" is a 1945 essay by Vannevar Bush which has been described as visionary and influential, anticipating many aspects of information society. It was first published in The Atlantic in July 1945 and republished in an abridged version in September 1945—before and after the atomic bombings of Hiroshima and Nagasaki. Bush expresses his concern for the direction of scientific efforts toward destruction, rather than understanding, and explicates a desire for a sort of collective memory machine with his concept of the memex that would make knowledge more accessible, believing that it would help fix these problems. Through this machine, Bush hoped to transform an information explosion into a knowledge explosion.[1]

https://twitter.com/michael_nielsen/status/979193577229004800
https://archive.is/FrF8Q
https://archive.is/19hHT
https://archive.is/G7yLl
https://archive.is/wFbbj
A few notes on Vannevar Bush's amazing essay, "As We May Think", from the 1945(!) @TheAtlantic :

https://twitter.com/andy_matuschak/status/1147928384510390277
https://archive.is/tm6fB
https://archive.is/BIok9
When I first read As We May Think* as a teenager, I was astonished by how much it predicted of the computer age in 1945—but recently I’ve been feeling wistful about some pieces it predicts which never came to pass. [thread]

* http://ceasarbautista.com/posts/memex_meetup_2.html
wiki  org:mag  essay  big-peeps  history  mostly-modern  classic  ideas  worrydream  exocortex  thinking  network-structure  graphs  internet  structure  notetaking  design  skunkworks  multi  techtariat  twitter  social  discussion  reflection  backup  speedometer  software  org:junk  michael-nielsen 
10 weeks ago by nhaliday
What did Max Weber mean by the ‘spirit’ of capitalism? | Aeon Ideas
There was another kind of disintegration besides that of traditional ethics. The proliferation of knowledge and reflection on knowledge had made it impossible for any one person to know and survey it all. In a world which could not be grasped as a whole, and where there were no universally shared values, most people clung to the particular niche to which they were most committed: their job or profession. They treated their work as a post-religious calling, ‘an absolute end in itself’, and if the modern ‘ethic’ or ‘spirit’ had an ultimate foundation, this was it. One of the most widespread clichés about Weber’s thought is to say that he preached a work ethic. This is a mistake. He personally saw no particular virtue in sweat – he thought his best ideas came to him when relaxing on a sofa with a cigar – and had he known he would be misunderstood in this way, he would have pointed out that a capacity for hard work was something that did not distinguish the modern West from previous societies and their value systems. However, the idea that people were being ever more defined by the blinkered focus of their employment was one he regarded as profoundly modern and characteristic.

rec'd by Garett Jones
news  org:mag  org:popup  rhetoric  summary  philosophy  ideology  big-peeps  sociology  social-science  zeitgeist  modernity  history  early-modern  labor  economics  capitalism  individualism-collectivism  social-norms  psychology  social-psych  telos-atelos  garett-jones 
october 2019 by nhaliday
2019 Growth Theory Conference - May 11-12 | Economics Department at Brown University
Guillaume Blanc (Brown) and Romain Wacziarg (UCLA and NBER), "Change and Persistence in the Age of Modernization: Saint-Germain-d’Anxure, 1730-1895"

Figure 4.1.1.1 – Fertility
Figure 4.2.1.1 – Mortality
Figure 5.1.0.1 – Literacy

https://twitter.com/GarettJones/status/1127999888359346177
https://archive.is/1EnZg
Short pre-modern lives weren't overwhelmingly about infant mortality:

From this weekend's excellent Deep Roots conference at @Brown_Economics, new evidence from a small French town, an ancestral home of coauthor Romain Wacziarg:
--
European Carpe Diem poems made a lot more sense when 20-year-olds were halfway done with life:
...
--
...
N.B. that's not a correction at all, it's telling the same story as the above figure:

Conditional on surviving childhood, people usually lived to less than 50 years in total, both in 1750s France and in medieval times.
study  economics  broad-econ  cliometrics  demographics  history  early-modern  europe  gallic  fertility  longevity  mobility  human-capital  garett-jones  writing  class  data  time-series  demographic-transition  regularizer  lived-experience  gender  gender-diff  pro-rata  trivia  cocktail  econotariat  twitter  social  backup  commentary  poetry  medieval  modernity  alien-character 
september 2019 by nhaliday
The Scholar's Stage: Book Notes—Strategy: A History
https://twitter.com/Scholars_Stage/status/1151681120787816448
https://archive.is/Bp5eu
Freedman's book is something of a shadow history of Western intellectual thought between 1850 and 2010. Marx, Tolstoy, Foucault, game theorists, economists, business law--it is all in there.

Thus the thoughts prompted by this book have surprisingly little to do with war.
Instead I am left with questions about the long-term trajectory of Western thought. Specifically:

* Has America really dominated Western intellectual life in the post-45 world as much as English speakers seem to think it has?
* Has the professionalization/credentialization of Western intellectual life helped or harmed our ability to understand society?
* Will we ever recover from the 1960s?
wonkish  unaffiliated  broad-econ  books  review  reflection  summary  strategy  war  higher-ed  academia  social-science  letters  organizing  nascent-state  counter-revolution  rot  westminster  culture-war  left-wing  anglosphere  usa  history  mostly-modern  coordination  lens  local-global  europe  gallic  philosophy  cultural-dynamics  anthropology  game-theory  industrial-org  schelling  flux-stasis  trends  culture  iraq-syria  MENA  military  frontier  info-dynamics  big-peeps  politics  multi  twitter  social  commentary  backup  defense 
july 2019 by nhaliday
The Existential Risk of Math Errors - Gwern.net
How big is this upper bound? Mathematicians have often made errors in proofs. But it’s rarer for ideas to be accepted for a long time and then rejected. We can divide errors into 2 basic cases corresponding to type I and type II errors:

1. Mistakes where the theorem is still true, but the proof was incorrect (type I)
2. Mistakes where the theorem was false, and the proof was also necessarily incorrect (type II)

Before someone comes up with a final answer, a mathematician may have many levels of intuition in formulating & working on the problem, but we’ll consider the final end-product where the mathematician feels satisfied that he has solved it. Case 1 is perhaps the most common case, with innumerable examples; this is sometimes due to mistakes in the proof that anyone would accept is a mistake, but many of these cases are due to changing standards of proof. For example, when David Hilbert discovered errors in Euclid’s proofs which no one noticed before, the theorems were still true, and the gaps more due to Hilbert being a modern mathematician thinking in terms of formal systems (which of course Euclid did not think in). (David Hilbert himself turns out to be a useful example of the other kind of error: his famous list of 23 problems was accompanied by definite opinions on the outcome of each problem and sometimes timings, several of which were wrong or questionable.[5]) Similarly, early calculus used ‘infinitesimals’ which were sometimes treated as being 0 and sometimes treated as an indefinitely small non-zero number; this was incoherent and strictly speaking, practically all of the calculus results were wrong because they relied on an incoherent concept - but of course the results were some of the greatest mathematical work ever conducted[6] and when later mathematicians put calculus on a more rigorous footing, they immediately re-derived those results (sometimes with important qualifications), and doubtless as modern math evolves other fields have sometimes needed to go back and clean up the foundations and will in the future.[7]

...

Isaac Newton, incidentally, gave two proofs of the same solution to a problem in probability, one via enumeration and the other more abstract; the enumeration was correct, but the other proof totally wrong and this was not noticed for a long time, leading Stigler to remark:

...

TYPE I > TYPE II?
“Lefschetz was a purely intuitive mathematician. It was said of him that he had never given a completely correct proof, but had never made a wrong guess either.”
- Gian-Carlo Rota[13]

Case 2 is disturbing, since it is a case in which we wind up with false beliefs and also false beliefs about our beliefs (we no longer know that we don’t know). Case 2 could lead to extinction.

...

Except, errors do not seem to be evenly & randomly distributed between case 1 and case 2. There seem to be far more case 1s than case 2s, as already mentioned in the early calculus example: far more than 50% of the early calculus results were correct when checked more rigorously. Richard Hamming attributes to Ralph Boas a comment that, while editing Mathematical Reviews, “of the new results in the papers reviewed most are true but the corresponding proofs are perhaps half the time plain wrong”.

...

Gian-Carlo Rota gives us an example with Hilbert:

...

Olga labored for three years; it turned out that all mistakes could be corrected without any major changes in the statement of the theorems. There was one exception, a paper Hilbert wrote in his old age, which could not be fixed; it was a purported proof of the continuum hypothesis, you will find it in a volume of the Mathematische Annalen of the early thirties.

...

Leslie Lamport advocates for machine-checked proofs and a more rigorous style of proofs similar to natural deduction, noting a mathematician acquaintance guesses at a broad error rate of 1/3[29] and that he routinely found mistakes in his own proofs and, worse, believed false conjectures[30].

[more on these "structured proofs":
https://academia.stackexchange.com/questions/52435/does-anyone-actually-publish-structured-proofs
https://mathoverflow.net/questions/35727/community-experiences-writing-lamports-structured-proofs
]
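
For flavor, here is a tiny sketch (my addition, not from Lamport) in the spirit of his hierarchical proof style, where every step gets a level, a number, and its own justification:

    ⟨1⟩1. ASSUME: √2 = p/q with p, q positive integers, gcd(p, q) = 1
          PROVE:  FALSE
      ⟨2⟩1. p² = 2q²    PROOF: square the assumption and multiply through by q².
      ⟨2⟩2. p is even   PROOF: by ⟨2⟩1, p² is even, and odd numbers have odd squares.
      ⟨2⟩3. q is even   PROOF: substitute p = 2k into ⟨2⟩1; then 4k² = 2q², so q² = 2k².
      ⟨2⟩4. QED         PROOF: ⟨2⟩2 and ⟨2⟩3 contradict gcd(p, q) = 1.

The point of the numbering is that each leaf proof is small enough to check mechanically, and any step can be expanded into sub-steps without disturbing the rest.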

We can probably add software to that list: early software engineering work found that, dismayingly, bug rates seem to be simply a function of lines of code, and one would expect diseconomies of scale. So one would expect that in going from the ~4,000 lines of code of the Microsoft DOS operating system kernel to the ~50,000,000 lines of code in Windows Server 2003 (with full systems of applications and libraries being even larger: the comprehensive Debian repository in 2007 contained ~323,551,126 lines of code) that the number of active bugs at any time would be… fairly large. Mathematical software is hopefully better, but practitioners still run into issues (eg Durán et al 2014, Fonseca et al 2017) and I don’t know of any research pinning down how buggy key mathematical systems like Mathematica are or how much published mathematics may be erroneous due to bugs. This general problem led to predictions of doom and spurred much research into automated proof-checking, static analysis, and functional languages[31].
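
As a back-of-the-envelope illustration of that scaling claim (my sketch; the defect rate below is an assumed round figure from industry folklore, not a measurement), even a flat per-line bug density predicts enormous absolute counts, and the diseconomies of scale mentioned above would only push these higher:

    # Hypothetical sketch: assumes a constant defect density per KLOC.
    DEFECTS_PER_KLOC = 10  # assumption; commonly quoted ballparks run ~1-25

    for name, loc in [
        ("MS-DOS kernel", 4_000),
        ("Windows Server 2003", 50_000_000),
        ("Debian repository (2007)", 323_551_126),
    ]:
        print(f"{name}: ~{loc / 1000 * DEFECTS_PER_KLOC:,.0f} expected defects")
    # MS-DOS kernel: ~40; Windows Server 2003: ~500,000; Debian: ~3,235,511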

[related:
https://mathoverflow.net/questions/11517/computer-algebra-errors
I don't know any interesting bugs in symbolic algebra packages but I know a true, enlightening and entertaining story about something that looked like a bug but wasn't.

Define $\operatorname{sinc}(x) = (\sin x)/x$.

Someone found the following result in an algebra package: $\int_0^\infty \operatorname{sinc}(x)\,dx = \pi/2$
They then found the following results:

...

So of course when they got:

$$\int_0^\infty \operatorname{sinc}(x)\,\operatorname{sinc}(x/3)\,\operatorname{sinc}(x/5)\cdots\operatorname{sinc}(x/15)\,dx = \frac{467807924713440738696537864469}{935615849440640907310521750000}\,\pi$$

hmm:
Which means that nobody knows Fourier analysis nowadays. Very sad and discouraging story... – fedja Jan 29 '10 at 18:47
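
[These are the Borwein integrals: the π/2 pattern holds as long as 1/3 + 1/5 + ⋯ + 1/(2n+1) < 1, and 1/15 is exactly where the running sum first exceeds 1, so the value dips just below π/2. A quick sanity check of the quoted coefficient (my addition):

    # Verify the "bug": the exact coefficient is slightly less than 1/2.
    from fractions import Fraction

    coeff = Fraction(467807924713440738696537864469,
                     935615849440640907310521750000)
    gap = Fraction(1, 2) - coeff
    print(gap)         # the exact shortfall, as a fraction
    print(float(gap))  # ~7.4e-12: the result misses pi/2 only in the 12th digit

Exact rational arithmetic makes the point unambiguous: the computer algebra system was right, and the "bug" was a genuine, barely visible mathematical effect.]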

--

Because the most popular systems are all commercial, they tend to guard their bug database rather closely -- making them public would seriously cut their sales. For example, for the open source project Sage (which is quite young), you can get a list of all the known bugs from this page. 1582 known issues on Feb.16th 2010 (which includes feature requests, problems with documentation, etc).

That is an order of magnitude less than the commercial systems. And it's not because it is better, it is because it is younger and smaller. It might be better, but until SAGE does a lot of analysis (about 40% of CAS bugs are there) and a fancy user interface (another 40%), it is too hard to compare.

I once ran a graduate course whose core topic was studying the fundamental disconnect between the algebraic nature of CAS and the analytic nature of what it is mostly used for. There are issues of logic -- CASes work more or less in an intensional logic, while most of analysis is stated in a purely extensional fashion. There is no well-defined 'denotational semantics' for expressions-as-functions, which strongly contributes to the deeper bugs in CASes.]

...

Should such widely-believed conjectures as P≠NP or the Riemann hypothesis turn out to be false, then because they are assumed by so many existing proofs, a far larger math holocaust would ensue[38] - and our previous estimates of error rates will turn out to have been substantial underestimates. But it may be a cloud with a silver lining, if it doesn’t come at a time of danger.

https://mathoverflow.net/questions/338607/why-doesnt-mathematics-collapse-down-even-though-humans-quite-often-make-mista

more on formal methods in programming:
https://www.quantamagazine.org/formal-verification-creates-hacker-proof-code-20160920/
https://intelligence.org/2014/03/02/bob-constable/

https://softwareengineering.stackexchange.com/questions/375342/what-are-the-barriers-that-prevent-widespread-adoption-of-formal-methods
Update: measured effort
In the October 2018 issue of Communications of the ACM there is an interesting article about Formally verified software in the real world with some estimates of the effort.

Interestingly (based on OS development for military equipment), it seems that producing formally proved software requires 3.3 times more effort than with traditional engineering techniques. So it's really costly.

On the other hand, it requires 2.3 times less effort to get high security software this way than with traditionally engineered software if you add the effort to make such software certified at a high security level (EAL 7). So if you have high reliability or security requirements there is definitely a business case for going formal.

WHY DON'T PEOPLE USE FORMAL METHODS?: https://www.hillelwayne.com/post/why-dont-people-use-formal-methods/
You can see examples of how all of these look at Let’s Prove Leftpad. HOL4 and Isabelle are good examples of “independent theorem” specs, SPARK and Dafny have “embedded assertion” specs, and Coq and Agda have “dependent type” specs.[6]

If you squint a bit it looks like these three forms of code spec map to the three main domains of automated correctness checking: tests, contracts, and types. This is not a coincidence. Correctness is a spectrum, and formal verification is one extreme of that spectrum. As we reduce the rigour (and effort) of our verification we get simpler and narrower checks, whether that means limiting the explored state space, using weaker types, or pushing verification to the runtime. Any means of total specification then becomes a means of partial specification, and vice versa: many consider Cleanroom a formal verification technique, which primarily works by pushing code review far beyond what’s humanly possible.
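
To make the mapping concrete, here is a toy illustration (mine, not from the post) of the three partial-specification forms living in one piece of Python: types, an embedded assertion acting as a contract, and a point test:

    def merge_sorted(xs: list[int], ys: list[int]) -> list[int]:  # types
        out = sorted(xs + ys)
        # contract: a runtime-checked postcondition (weaker than a proof)
        assert all(a <= b for a, b in zip(out, out[1:]))
        return out

    def test_merge_sorted() -> None:
        # test: samples a single point of the input space
        assert merge_sorted([3, 1], [2]) == [1, 2, 3]

Each form checks the same intent with decreasing rigour: the type holds everywhere by construction, the assertion holds on every executed call, and the test holds at one input.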

...

The question, then: “is 90/95/99% correct significantly cheaper than 100% correct?” The answer is very yes. We all are comfortable saying that a codebase we’ve well-tested and well-typed is mostly correct modulo a few fixes in prod, and we’re even writing more than four lines of code a day. In fact, the vast… [more]
ratty  gwern  analysis  essay  realness  truth  correctness  reason  philosophy  math  proofs  formal-methods  cs  programming  engineering  worse-is-better/the-right-thing  intuition  giants  old-anglo  error  street-fighting  heuristic  zooming  risk  threat-modeling  software  lens  logic  inference  physics  differential  geometry  estimate  distribution  robust  speculation  nonlinearity  cost-benefit  convexity-curvature  measure  scale  trivia  cocktail  history  early-modern  europe  math.CA  rigor  news  org:mag  org:sci  miri-cfar  pdf  thesis  comparison  examples  org:junk  q-n-a  stackex  pragmatic  tradeoffs  cracker-prog  techtariat  invariance  DSL  chart  ecosystem  grokkability  heavyweights  CAS  static-dynamic  lower-bounds  complexity  tcs  open-problems  big-surf  ideas  certificates-recognition  proof-systems  PCP  mediterranean  SDP  meta:prediction  epistemic  questions  guessing  distributed  overflow  nibble  soft-question  track-record  big-list  hmm  frontier  state-of-art  move-fast-(and-break-things)  grokkability-clarity  technical-writing  trust 
july 2019 by nhaliday
Computer latency: 1977-2017
If we look at overall results, the fastest machines are ancient. Newer machines are all over the place. Fancy gaming rigs with unusually high refresh-rate displays are almost competitive with machines from the late 70s and early 80s, but “normal” modern computers can’t compete with thirty to forty year old machines.
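
One reason high refresh-rate displays close part of the gap is simple arithmetic: on average, an update waits half a refresh period before it can even reach the screen. A back-of-the-envelope sketch (my addition, not from the post):

    # Average latency contributed just by waiting for the next display refresh.
    for hz in (30, 60, 144, 240):
        frame_ms = 1000 / hz
        print(f"{hz:>3} Hz: frame {frame_ms:5.1f} ms, avg wait {frame_ms / 2:4.1f} ms")
    # 60 Hz adds ~8.3 ms on average; 240 Hz cuts that to ~2.1 ms.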

...

If we exclude the game boy color, which is a different class of device than the rest, all of the quickest devices are Apple phones or tablets. The next quickest device is the blackberry q10. Although we don’t have enough data to really tell why the blackberry q10 is unusually quick for a non-Apple device, one plausible guess is that it’s helped by having actual buttons, which are easier to implement with low latency than a touchscreen. The other two devices with actual buttons are the gameboy color and the kindle 4.

After the iphones and non-kindle button devices, we have a variety of Android devices of various ages. At the bottom, we have the ancient palm pilot 1000 followed by the kindles. The palm is hamstrung by a touchscreen and display created in an era with much slower touchscreen technology and the kindles use e-ink displays, which are much slower than the displays used on modern phones, so it’s not surprising to see those devices at the bottom.

...

Almost every computer and mobile device that people buy today is slower than common models of computers from the 70s and 80s. Low-latency gaming desktops and the ipad pro can get into the same range as quick machines from thirty to forty years ago, but most off-the-shelf devices aren’t even close.

If we had to pick one root cause of latency bloat, we might say that it’s because of “complexity”. Of course, we all know that complexity is bad. If you’ve been to a non-academic non-enterprise tech conference in the past decade, there’s a good chance that there was at least one talk on how complexity is the root of all evil and we should aspire to reduce complexity.

Unfortunately, it's a lot harder to remove complexity than to give a talk saying that we should remove complexity. A lot of the complexity buys us something, either directly or indirectly. When we looked at the input of a fancy modern keyboard vs. the apple 2 keyboard, we saw that using a relatively powerful and expensive general purpose processor to handle keyboard inputs can be slower than dedicated logic for the keyboard, which would both be simpler and cheaper. However, using the processor gives people the ability to easily customize the keyboard, and also pushes the problem of “programming” the keyboard from hardware into software, which reduces the cost of making the keyboard. The more expensive chip increases the manufacturing cost, but considering how much of the cost of these small-batch artisanal keyboards is the design cost, it seems like a net win to trade manufacturing cost for ease of programming.

...

If you want a reference to compare the kindle against, a moderately quick page turn in a physical book appears to be about 200 ms.

https://twitter.com/gravislizard/status/927593460642615296
almost everything on computers is perceptually slower than it was in 1983
https://archive.is/G3D5K
https://archive.is/vhDTL
https://archive.is/a3321
https://archive.is/imG7S

linux terminals: https://lwn.net/Articles/751763/
techtariat  dan-luu  performance  time  hardware  consumerism  objektbuch  data  history  reflection  critique  software  roots  tainter  engineering  nitty-gritty  ui  ux  hci  ios  mobile  apple  amazon  sequential  trends  increase-decrease  measure  analysis  measurement  os  systems  IEEE  intricacy  desktop  benchmarks  rant  carmack  system-design  degrees-of-freedom  keyboard  terminal  editors  links  input-output  networking  world  s:**  multi  twitter  social  discussion  tech  programming  web  internet  speed  backup  worrydream  interface  metal-to-virtual  latency-throughput  workflow  form-design  interface-compatibility  org:junk  linux 
july 2019 by nhaliday
Lindy effect - Wikipedia
The Lindy effect is a theory that the future life expectancy of some non-perishable things like a technology or an idea is proportional to their current age, so that every additional period of survival implies a longer remaining life expectancy.[1] Where the Lindy effect applies, mortality rate decreases with time. In contrast, living creatures and mechanical things follow a bathtub curve where, after "childhood", the mortality rate increases with time. Because life expectancy is probabilistically derived, a thing may become extinct before its "expected" survival. In other words, one needs to gauge both the age and "health" of the thing to determine continued survival.
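
A worked example (my addition) making "proportional to their current age" precise: under a Pareto survival law with tail index $\alpha > 1$,

$$S(t) = \Pr[T > t] = (t_0/t)^{\alpha}, \qquad t \ge t_0,$$

the expected remaining lifetime at age $t$ is

$$\mathbb{E}[T - t \mid T > t] = \int_t^{\infty} \frac{S(u)}{S(t)}\,du = \int_t^{\infty} (t/u)^{\alpha}\,du = \frac{t}{\alpha - 1},$$

which grows linearly with age, while the hazard rate $h(t) = \alpha/t$ falls over time, the reverse of a bathtub curve.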
wiki  reference  concept  metabuch  ideas  street-fighting  planning  comparison  time  distribution  flux-stasis  history  measure  correlation  arrows  branches  pro-rata  manifolds  aging  stylized-facts  age-generation  robust  technology  thinking  cost-benefit  conceptual-vocab  methodology  threat-modeling  efficiency  neurons  tools  track-record  ubiquity 
june 2019 by nhaliday
One week of bugs
If I had to guess, I'd say I probably work around hundreds of bugs in an average week, and thousands in a bad week. It's not unusual for me to run into a hundred new bugs in a single week. But I often get skepticism when I mention that I run into multiple new (to me) bugs per day, and that this is inevitable if we don't change how we write tests. Well, here's a log of one week of bugs, limited to bugs that were new to me that week. After a brief description of the bugs, I'll talk about what we can do to improve the situation. The obvious answer is to spend more effort on testing, but everyone already knows we should do that and no one does it. That doesn't mean it's hopeless, though.

...

Here's where I'm supposed to write an appeal to take testing more seriously and put real effort into it. But we all know that's not going to work. It would take 90k LOC of tests to get Julia to be as well tested as a poorly tested prototype (falsely assuming linear complexity in size). That's two person-years of work, not even including time to debug and fix bugs (which probably brings it closer to four or five years). Who's going to do that? No one. Writing tests is like writing documentation. Everyone already knows you should do it. Telling people they should do it adds zero information[1].

Given that people aren't going to put any effort into testing, what's the best way to do it?

Property-based testing. Generative testing. Random testing. Concolic Testing (which was done long before the term was coined). Static analysis. Fuzzing. Statistical bug finding. There are lots of options. Some of them are actually the same thing because the terminology we use is inconsistent and buggy. I'm going to arbitrarily pick one to talk about, but they're all worth looking into.
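
A minimal sketch of the property-based flavor, using the Hypothesis library discussed further below (pip install hypothesis); the run-length codec and its round-trip property are my toy example, not from the post:

    from hypothesis import given, strategies as st

    def rle_encode(s: str) -> list[tuple[str, int]]:
        """Run-length encode: "aaab" -> [("a", 3), ("b", 1)]."""
        out: list[tuple[str, int]] = []
        for ch in s:
            if out and out[-1][0] == ch:
                out[-1] = (ch, out[-1][1] + 1)
            else:
                out.append((ch, 1))
        return out

    def rle_decode(pairs: list[tuple[str, int]]) -> str:
        return "".join(ch * n for ch, n in pairs)

    @given(st.text())
    def test_round_trip(s: str) -> None:
        # Hypothesis generates many inputs and shrinks any failing case to a
        # minimal counterexample, instead of a few hand-picked examples.
        assert rle_decode(rle_encode(s)) == s

Instead of asserting outputs for three inputs you thought of, you state an invariant and let the machine hunt for violations (run with pytest, or call test_round_trip() directly); the same idea underlies generative, random, and fuzz testing.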

...

There are a lot of great resources out there, but if you're just getting started, I found this description of types of fuzzers to be one of the most helpful (and simplest) things I've read.

John Regehr has a Udacity course on software testing. I haven't worked through it yet (Pablo Torres just pointed to it), but given the quality of Dr. Regehr's writing, I expect the course to be good.

For more on my perspective on testing, there's this.

Everything's broken and nobody's upset: https://www.hanselman.com/blog/EverythingsBrokenAndNobodysUpset.aspx
https://news.ycombinator.com/item?id=4531549

https://hypothesis.works/articles/the-purpose-of-hypothesis/
From the perspective of a user, the purpose of Hypothesis is to make it easier for you to write better tests.

From my perspective as the primary author, that is of course also a purpose of Hypothesis. I write a lot of code, it needs testing, and the idea of trying to do that without Hypothesis has become nearly unthinkable.

But, on a large scale, the true purpose of Hypothesis is to drag the world kicking and screaming into a new and terrifying age of high quality software.

Software is everywhere. We have built a civilization on it, and it’s only getting more prevalent as more services move online and embedded and “internet of things” devices become cheaper and more common.

Software is also terrible. It’s buggy, it’s insecure, and it’s rarely well thought out.

This combination is clearly a recipe for disaster.

The state of software testing is even worse. It’s uncontroversial at this point that you should be testing your code, but it’s a rare codebase whose authors could honestly claim that they feel its testing is sufficient.

Much of the problem here is that it’s too hard to write good tests. Tests take up a vast quantity of development time, but they mostly just laboriously encode exactly the same assumptions and fallacies that the authors had when they wrote the code, so they miss exactly the same bugs that you missed when you wrote the code.

Preventing the Collapse of Civilization [video]: https://news.ycombinator.com/item?id=19945452
- Jonathan Blow

NB: DevGAMM is a game industry conference

- loss of technological knowledge (Antikythera mechanism, aqueducts, etc.)
- hardware driving most gains, not software
- software's actually less robust, often poorly designed and overengineered these days
- *list of bugs he's encountered recently*:
https://youtu.be/pW-SOdj4Kkk?t=1387
- knowledge of trivia becomes [ed.: missing the word "valued" here, I think?] more than general, deep knowledge
- does at least acknowledge value of DRY, reusing code, abstraction saving dev time
techtariat  dan-luu  tech  software  error  list  debugging  linux  github  robust  checking  oss  troll  lol  aphorism  webapp  email  google  facebook  games  julia  pls  compilers  communication  mooc  browser  rust  programming  engineering  random  jargon  formal-methods  expert-experience  prof  c(pp)  course  correctness  hn  commentary  video  presentation  carmack  pragmatic  contrarianism  pessimism  sv  unix  rhetoric  critique  worrydream  hardware  performance  trends  multiplicative  roots  impact  comparison  history  iron-age  the-classics  mediterranean  conquest-empire  gibbon  technology  the-world-is-just-atoms  flux-stasis  increase-decrease  graphics  hmm  idk  systems  os  abstraction  intricacy  worse-is-better/the-right-thing  build-packaging  microsoft  osx  apple  reflection  assembly  things  knowledge  detail-architecture  thick-thin  trivia  info-dynamics  caching  frameworks  generalization  systematic-ad-hoc  universalism-particularism  analytical-holistic  structure  tainter  libraries  tradeoffs  prepping  threat-modeling  network-structure  writing  risk  local-glob 
may 2019 by nhaliday
Braves | West Hunter
If Amerindians had a lot fewer serious infectious diseases than Old Worlders, something else had to limit population – and it wasn’t the Pill.

Surely there was more death by violence. In principle they could have sat down and quietly starved to death, but I doubt it. Better to burn out than fade away.
west-hunter  scitariat  reflection  ideas  usa  farmers-and-foragers  history  medieval  iron-age  europe  comparison  asia  civilization  peace-violence  martial  selection  ecology  disease  parasites-microbiome  pop-diff  incentives  malthus  equilibrium 
may 2019 by nhaliday
The Architect as Totalitarian: Le Corbusier’s baleful influence | City Journal
Le Corbusier was to architecture what Pol Pot was to social reform. In one sense, he had less excuse for his activities than Pol Pot: for unlike the Cambodian, he possessed great talent, even genius. Unfortunately, he turned his gifts to destructive ends, and it is no coincidence that he willingly served both Stalin and Vichy.
news  org:mag  right-wing  albion  gnon  isteveish  architecture  essay  rhetoric  critique  contrarianism  communism  comparison  aphorism  modernity  authoritarianism  universalism-particularism  europe  gallic  history  mostly-modern  urban-rural  revolution  art  culture 
april 2019 by nhaliday
Flammarion engraving - Wikipedia
A traveller puts his head under the edge of the firmament in the original (1888) printing of the Flammarion engraving.
art  classic  wiki  history  philosophy  science  enlightenment-renaissance-restoration-reformation  mystic  religion  christianity  eden-heaven  sky  myth  tip-of-tongue 
march 2019 by nhaliday
Verbal Edge: Borges & Buckley | Eamonn Fitzgerald: Rainy Day
At one point, Borges said that he found English “a far finer language” than Spanish and Buckley asked “Why?”

Borges: There are many reasons. Firstly, English is both a Germanic and a Latin language, those two registers.

...

And then there is another reason. And the reason is that I think that of all languages, English is the most physical. You can, for example, say “He loomed over.” You can’t very well say that in Spanish.

Buckley: Asomo?
Borges: No; they’re not exactly the same. And then, in English, you can do almost anything with verbs and prepositions. For example, to “laugh off,” to “dream away.” Those things can’t be said in Spanish.

http://www.oenewsletter.org/OEN/print.php/essays/toswell43_1/Array
J.L.B.: "You will say that it's easier for a Dane to study English than for a Spanish-speaking person to learn English or an Englishman Spanish; but I don't think this is true, because English is a Latin language as well as a Germanic one. At least half the English vocabulary is Latin. Remember that in English there are two words for every idea: one Saxon and one Latin. You can say 'Holy Ghost' or 'Holy Spirit,' 'sacred' or 'holy.' There's always a slight difference, but one that's very important for poetry, the difference between 'dark' and 'obscure' for instance, or 'regal' and 'kingly,' or 'fraternal' and 'brotherly.' In the English language almost all words representing abstract ideas come from Latin, and those for concrete ideas from Saxon, but there aren't so many concrete ideas." (P. 71) [2]

In his own words, then, Borges was fascinated by Old English and Old Norse.
interview  history  mostly-modern  language  foreign-lang  anglo  anglosphere  culture  literature  writing  mediterranean  latin-america  germanic  roots  comparison  quotes  flexibility  org:junk  multi  medieval  nordic  lexical  parallax 
february 2019 by nhaliday
Timothy Heath - China's New Governing Party Paradigm - YouTube
https://twitter.com/GarettJones/status/1079807448741863425
https://archive.is/NnO9U
What percentage of CCP elites sincerely believe in the official ideology?

https://twitter.com/BennettJonah/status/1153757516867633152
https://archive.is/PI3QS
One of the most useful things to aid understanding is reading the other side in their own words, rather than reading yet more vague analyses about "what the Chinese are up to."

Which is why you need to read this Xi Jinping speech:
https://palladiummag.com/2019/05/31/xi-jinping-in-translation-chinas-guiding-ideology/
--
I like this speech because it is a clear expression of Marxism as an "organizing philosophy of the state" - nothing about equality, barely even anything about "workers"
video  presentation  china  asia  government  institutions  communism  polisci  ideology  technocracy  leviathan  management  science  polanyi-marx  economics  growth-econ  multi  twitter  social  discussion  speculation  backup  realness  revolution  history  mostly-modern  poll  impetus  garett-jones  quotes  statesmen 
february 2019 by nhaliday
T. Greer on Twitter: "Genesis 1st half of Exodus Basic passages of the Deuteronomic Covenant Select scenes from Numbers-Judges Samuel I-II Job Ecclesiastes Proverbs Select Psalms Select passages of Isiah, Jeremiah, and Ezekiel Jonah 4 Gospels+Acts Romans
https://archive.is/YtwVb
I would pair letters from Paul with Flannery O'Connor's "A Good Man is Hard to Find."

I designed a hero's journey course that included Gilgamesh, Odyssey, and Gawain and the Green Knight. Before reading Gawain you'd read the Sermon on the Mount + few parts of gospels.
The idea with that last one being that Gawain was an attempt to make a hero who (unlike Odysseus) accorded with Christian ethics. As one of its discussion points, the class can debate over how well it actually did that.
...
So I would preface Lord of the Flies with a stylized account of Hobbes and Rousseau, and we would read a great deal of Genesis alongside LOTF.

Same approach was taken to Greece and Rome. Classical myths would be paired with poems from the 1600s-1900s that alluded to them.
...
Genesis
1st half of Exodus
Basic passages of the Deuteronomic Covenant
Select scenes from Numbers-Judges
Samuel I-II
Job
Ecclesiastes
Proverbs
Select Psalms
Select passages of Isiah, Jeremiah, and Ezekiel
Jonah
4 Gospels+Acts
Romans
1 Corinthians
Hebrews
Revelation
twitter  social  discussion  backup  literature  letters  reading  canon  the-classics  history  usa  europe  the-great-west-whale  religion  christianity  ideology  philosophy  ethics  medieval  china  asia  sinosphere  comparison  culture  civilization  roots  spreading  multi 
february 2019 by nhaliday
Perseus Digital Library
This is actually really useful.

Features:
- Load English translation side-by-side if available.
- Click on any word and see the best guess for definition+inflection given context.

this interface allows search by lemma/POS: http://perseus.uchicago.edu/
tools  reference  history  iron-age  mediterranean  the-classics  canon  foreign-lang  linguistics  database  quixotic  stoic  syntax  lexical  exocortex  aggregator  search  multi 
february 2019 by nhaliday
"Humankind is unique in its incapacity to learn from experience" | New Humanist
Your new book claims atheism is a “closed system of thought”. Why so?
--
Because atheists of a certain kind imagine that by rejecting monotheistic beliefs they step out of a monotheistic way of thinking. Actually, they have inherited all of its rigidities and assumptions. Namely, the idea that there is a universal history; that there is something like a collective human agent; or a universal way of life. These are all Christian ideals. Christianity itself is also a much more complex belief system than most contemporary atheists allow for. But then most of these atheists know very little about the history of religion.

Particularly, you argue, Sam Harris and Richard Dawkins. What is your disagreement with them?
--
They treat religion as a kind of intellectual error; something only the crudest of Enlightenment thinkers believed. Not every human being has a religious sensibility, but pretty much all human cultures do. Neither Dawkins nor Harris is interesting enough to discuss this at length.

Dawkins is really not worth discussing or engaging with at all. He is an ideologue of Darwinism and knows very little about religion, treating it as a kind of a priori notion, rather than the complex social, and anthropological set of ideas which religion usually entails. Harris is partially interesting, in that he talks about how all human values can be derived from science. But I object strongly to that idea.

...

You are hugely critical of modern liberalism: what is your main problem with the ideology?
--
That it’s immune to empirical evidence. It’s a form of dogmatic faith. If you are a monotheist it makes sense – I myself am not saying it’s true or right – to say that there is only one way of life for all of humankind. And so you should try and convert the rest of humanity to that faith.

But if you are not a monotheist, and you claim to be an atheist, it makes no sense to claim that there is only one way of life. There may be some good and bad ways of living. And there may be some forms of barbarism, where human societies cannot flourish for very long. But there is no reason for thinking that there is only one way of life: the ones that liberal societies practice.

Why the liberal West is a Christian creation: https://www.newstatesman.com/dominion-making-western-mind-tom-holland-review
Christianity is dismissed as a fairy tale but its assumptions underpin the modern secular world.
- John Gray

Secular liberals dismiss Christianity as a fairy tale, but their values and their view of history remain essentially Christian. The Christian story tells of the son of God being put to death on a cross. In the Roman world, this was the fate of criminals and those who challenged imperial power. Christianity brought with it a moral revolution. The powerless came to be seen as God’s children, and therefore deserving of respect as much as the highest in society. History was a drama of sin and redemption in which God – acting through his son – was on the side of the weak.

Dominion: The Making of the Western Mind
Tom Holland
Little, Brown & Co, 624pp, £25

The Origin of the Secular Species: https://kirkcenter.org/reviews/the-origin-of-the-secular-species/
Reviewed by Ben Sixsmith

A great strength of Holland’s book is how it takes the reader back to when Christianity was not institutional and traditional but new and revolutionary. “[Corinth] had a long tradition of hosting eccentrics,” Holland writes in one wry passage:

> Back in the time of Alexander, the philosopher Diogenes had notoriously proclaimed his contempt for the norms of society by living in a large jar and masturbating in public. Paul, though, demanded a far more total recalibration of their most basic assumptions.

Christianity came not with a triumphant warrior wielding his sword, but with a traveling carpenter nailed to a cross; it came not with God as a distant and unimaginable force but with God as man, walking among his followers; it came not with promises of tribal dominance but with the hope of salvation across classes and races.

...

This may sound more pragmatic than liberal but it does reflect a strange, for the time, confidence in the power of education to shape the beliefs of the common man. Holland is keen to emphasize these progressive elements of history that he argues, with some justice, have helped to shape the modern world. Charity became enshrined in legislation, for example, as being able to access the necessities of life became “in a formulation increasingly deployed by canon lawyers” a human “right.”

...

This is, I think, a simplification of Galatians 3:28 that makes it more subversive than it actually is. Adolescents and octogenarians are equally eligible for salvation, in the Christian faith, but that does not mean that they have equal earthly functions.

Holland’s stylistic talents add a great deal to the book. His portraits of Boniface, Luther, and Calvin are vivid, evocative, and free of romanticization or its opposite. Some of his accounts of episodes in religious history are a little superficial—he could have read Helen Andrews for a more complicated portrait of Bartolomé de las Casas, for example—but a sweeping historical narrative without superficial aspects would be like an orchard with no bruising on the fruit. It is only natural.

...

We have to look not just at what survives of Christianity but what has been lost. I agree with Holland that the natural sciences can be aligned with Christian belief, but the predominant explanatory power of secular authorities has inarguably weakened the faith. The abandonment of metaphysics, on which Christian scholarship was founded, was another grievous blow. Finally, the elevation of choice to the highest principles of culture indulges worldly desire over religious adherence. Christianity, in Holland’s book, is a genetic relic.

Still, the tension of Dominion is a haunting one: the tension, that is, between the revolutionary and conservative implications of the Christian faith. On the British right, we—and especially those of us who are not believers—sometimes like to think of Christianity in a mild Scrutonian sense, as a source of wonder, beauty, and social cohesion. What hums throughout Dominion, though, is the intense evangelical spirit of the faith. The most impressive person in the book is St. Paul, striding between cities full of spiritual vigor. Why? Because it was God’s will. And because, as Jean Danielou wrote in his striking little book Prayer as a Political Problem:

> Christ has come to save all that has been made. Redemption is concerned with all creation …

This is not to claim that true Christians are fanatical. Paul himself, as Holland writes, was something of a realist. But the desire to spread the faith is essential to it—the animated evidence of its truth.
news  org:mag  religion  christianity  theos  ideology  politics  polisci  philosophy  westminster  government  uniqueness  diversity  putnam-like  homo-hetero  number  anthropology  morality  values  interview  cycles  optimism  pessimism  nihil  realness  noble-lie  reason  science  europe  EU  enlightenment-renaissance-restoration-reformation  utopia-dystopia  civil-liberty  multi  anglo  big-peeps  books  review  summary  fiction  gedanken  gibbon  history  iron-age  mediterranean  the-classics  egalitarianism-hierarchy  nietzschean  optimate  aristos  culture-war  identity-politics  kumbaya-kult  universalism-particularism  absolute-relative  ethics  formal-values  houellebecq  org:anglo  journos-pundits  albion  latin-america  age-of-discovery  conquest-empire  expansionism  dignity  justice 
october 2018 by nhaliday
Science - Wikipedia
In Northern Europe, the new technology of the printing press was widely used to publish many arguments, including some that disagreed widely with contemporary ideas of nature. René Descartes and Francis Bacon published philosophical arguments in favor of a new type of non-Aristotelian science. Descartes emphasized individual thought and argued that mathematics rather than geometry should be used in order to study nature. Bacon emphasized the importance of experiment over contemplation. Bacon further questioned the Aristotelian concepts of formal cause and final cause, and promoted the idea that science should study the laws of "simple" natures, such as heat, rather than assuming that there is any specific nature, or "formal cause," of each complex type of thing. This new modern science began to see itself as describing "laws of nature". This updated approach to studies in nature was seen as mechanistic. Bacon also argued that science should aim for the first time at practical inventions for the improvement of all human life.

Age of Enlightenment

...

During this time, the declared purpose and value of science became producing wealth and inventions that would improve human lives, in the materialistic sense of having more food, clothing, and other things. In Bacon's words, "the real and legitimate goal of sciences is the endowment of human life with new inventions and riches", and he discouraged scientists from pursuing intangible philosophical or spiritual ideas, which he believed contributed little to human happiness beyond "the fume of subtle, sublime, or pleasing speculation".[72]
article  wiki  reference  science  philosophy  letters  history  iron-age  mediterranean  the-classics  medieval  europe  the-great-west-whale  early-modern  ideology  telos-atelos  ends-means  new-religion  weird  enlightenment-renaissance-restoration-reformation  culture  the-devil  anglo  big-peeps  giants  religion  theos  tip-of-tongue  hmm  truth  dirty-hands  engineering  roots  values  formal-values  quotes  causation  forms-instances  technology  logos 
august 2018 by nhaliday
State (polity) - Wikipedia
https://en.wikipedia.org/wiki/State_formation
In the medieval period (500-1400) in Europe, there were a variety of authority forms throughout the region. These included feudal lords, empires, religious authorities, free cities, and other authorities.[42] Often dated to the 1648 Peace of Westphalia, there began to be the development in Europe of modern states with large-scale capacity for taxation, coercive control of their populations, and advanced bureaucracies.[43] The state became prominent in Europe over the next few centuries before the particular form of the state spread to the rest of the world via the colonial and international pressures of the 19th century and 20th century.[44] Other modern states developed in Africa and Asia prior to colonialism, but were largely displaced by colonial rule.[45]

...

Two related theories are based on military development and warfare, and the role that these forces played in state formation. Charles Tilly developed an argument that the state developed largely as a result of "state-makers" who sought to increase the taxes they could gain from the people under their control so they could continue fighting wars.[42] According to Tilly, the state makes war and war makes states.[49] In the constant warfare of the centuries in Europe, coupled with expanded costs of war with mass armies and gunpowder, warlords had to find ways to finance war and control territory more effectively. The modern state presented the opportunity for them to develop taxation structures, the coercive structure to implement that taxation, and finally the guarantee of protection from other states that could get much of the population to agree.[50] Taxes and revenue raising have been repeatedly pointed out as a key aspect of state formation and the development of state capacity. Economist Nicholas Kaldor emphasized the importance of revenue raising and warned about the dangers of the dependence on foreign aid.[51] Tilly argues that state making is similar to organized crime because it is a "quintessential protection racket with the advantage of legitimacy."[52]

State of nature: https://en.wikipedia.org/wiki/State_of_nature
Thomas Hobbes
The pure state of nature or "the natural condition of mankind" was deduced by the 17th century English philosopher Thomas Hobbes, in Leviathan and in his earlier work On the Citizen.[4] Hobbes argued that all humans are by nature equal in faculties of body and mind (i.e., no natural inequalities are so great as to give anyone a "claim" to an exclusive "benefit"). From this equality and other causes in human nature, everyone is naturally willing to fight one another: so that "during the time men live without a common power to keep them all in awe, they are in that condition which is called warre; and such a warre as is of every man against every man". In this state every person has a natural right or liberty to do anything one thinks necessary for preserving one's own life; and life is "solitary, poor, nasty, brutish, and short" (Leviathan, Chapters XIII–XIV). Hobbes described this natural condition with the Latin phrase bellum omnium contra omnes (meaning war of all against all), in his work De Cive.

Within the state of nature there is neither personal property nor injustice since there is no law, except for certain natural precepts discovered by reason ("laws of nature"): the first of which is "that every man ought to endeavour peace, as far as he has hope of obtaining it" (Leviathan, Ch. XIV); and the second is "that a man be willing, when others are so too, as far forth as for peace and defence of himself he shall think it necessary, to lay down this right to all things; and be contented with so much liberty against other men as he would allow other men against himself" (loc. cit.). From here Hobbes develops the way out of the state of nature into political society and government, by mutual contracts.

According to Hobbes the state of nature exists at all times among independent countries, over whom there is no law except for those same precepts or laws of nature (Leviathan, Chapters XIII, XXX end). His view of the state of nature helped to serve as a basis for theories of international law and relations.[5]

John Locke
John Locke considers the state of nature in his Second Treatise on Civil Government written around the time of the Exclusion Crisis in England during the 1680s. For Locke, in the state of nature all men are free "to order their actions, and dispose of their possessions and persons, as they think fit, within the bounds of the law of nature." (2nd Tr., §4). "The state of Nature has a law of Nature to govern it", and that law is reason. Locke believes that reason teaches that "no one ought to harm another in his life, liberty, or property" (2nd Tr., §6); and that transgressions of this may be punished. Locke describes the state of nature and civil society to be opposites of each other, and the need for civil society comes in part from the perpetual existence of the state of nature.[6] This view of the state of nature is partly deduced from Christian belief (unlike Hobbes, whose philosophy is not dependent upon any prior theology).

Although it may be natural to assume that Locke was responding to Hobbes, Locke never refers to Hobbes by name, and may instead have been responding to other writers of the day, like Robert Filmer.[7] In fact, Locke's First Treatise is entirely a response to Filmer's Patriarcha, and takes a step by step method to refuting Filmer's theory set out in Patriarcha. The conservative party at the time had rallied behind Filmer's Patriarcha, whereas the Whigs, scared of another persecution of Anglicans and Protestants, rallied behind the theory set out by Locke in his Two Treatises of Government as it gave a clear theory as to why the people would be justified in overthrowing a monarchy which abuses the trust they had placed in it.

...

Jean-Jacques Rousseau
Hobbes' view was challenged in the eighteenth century by Jean-Jacques Rousseau, who claimed that Hobbes was taking socialized people and simply imagining them living outside of the society in which they were raised. He affirmed instead that people were neither good nor bad, but were born as a blank slate, and later society and the environment influence which way we lean. In Rousseau's state of nature, people did not know each other enough to come into serious conflict and they did have normal values. The modern society, and the ownership it entails, is blamed for the disruption of the state of nature which Rousseau sees as true freedom.[9]

https://en.wikipedia.org/wiki/Sovereignty
Ulpian's statements were known in medieval Europe, but sovereignty was an important concept in medieval times.[1] Medieval monarchs were not sovereign, at least not strongly so, because they were constrained by, and shared power with, their feudal aristocracy.[1] Furthermore, both were strongly constrained by custom.[1]

Sovereignty existed during the Medieval period as the de jure rights of nobility and royalty, and in the de facto capability of individuals to make their own choices in life.

...

Reformation

Sovereignty reemerged as a concept in the late 16th century, a time when civil wars had created a craving for stronger central authority, when monarchs had begun to gather power onto their own hands at the expense of the nobility, and the modern nation state was emerging. Jean Bodin, partly in reaction to the chaos of the French wars of religion, presented theories of sovereignty calling for strong central authority in the form of absolute monarchy. In his 1576 treatise Les Six Livres de la République ("Six Books of the Republic") Bodin argued that it is inherent in the nature of the state that sovereignty must be:[1]

- Absolute: On this point he said that the sovereign must not be hedged in with obligations and conditions, must be able to legislate without his (or its) subjects' consent, must not be bound by the laws of his predecessors, and could not, because it is illogical, be bound by his own laws.
- Perpetual: Not temporarily delegated as to a strong leader in an emergency or to a state employee such as a magistrate. He held that sovereignty must be perpetual because anyone with the power to enforce a time limit on the governing power must be above the governing power, which would be impossible if the governing power is absolute.

Bodin rejected the notion of transference of sovereignty from people to the ruler (also known as the sovereign); natural law and divine law confer upon the sovereign the right to rule. And the sovereign is not above divine law or natural law. He is above (ie. not bound by) only positive law, that is, laws made by humans. He emphasized that a sovereign is bound to observe certain basic rules derived from the divine law, the law of nature or reason, and the law that is common to all nations (jus gentium), as well as the fundamental laws of the state that determine who is the sovereign, who succeeds to sovereignty, and what limits the sovereign power. Thus, Bodin’s sovereign was restricted by the constitutional law of the state and by the higher law that was considered as binding upon every human being.[1] The fact that the sovereign must obey divine and natural law imposes ethical constraints on him. Bodin also held that the lois royales, the fundamental laws of the French monarchy which regulated matters such as succession, are natural laws and are binding on the French sovereign.

...

Age of Enlightenment
During the Age of Enlightenment, the idea of sovereignty gained both legal and moral force as the main Western description of the meaning and power of a State. In particular, the "Social contract" as a mechanism for establishing sovereignty was suggested and, by 1800, widely accepted, especially in the new United States and France, though also in Great Britain to a lesser extent.

Thomas Hobbes, in Leviathan (1651) arrived at a conception of sovereignty … [more]
concept  conceptual-vocab  wiki  reference  leviathan  elite  government  institutions  politics  polisci  philosophy  antidemos  spatial  correlation  intersection-connectedness  geography  matching  nationalism-globalism  whole-partial-many  big-peeps  the-classics  morality  ethics  good-evil  order-disorder  history  iron-age  mediterranean  medieval  feudal  europe  the-great-west-whale  occident  china  asia  sinosphere  n-factor  democracy  authoritarianism  property-rights  civil-liberty  alien-character  crosstab  law  maps  lexical  multi  allodium 
august 2018 by nhaliday
Roman naming conventions - Wikipedia
The distinguishing feature of Roman nomenclature was the use of both personal names and regular surnames. Throughout Europe and the Mediterranean, other ancient civilizations distinguished individuals through the use of single personal names, usually dithematic in nature. Consisting of two distinct elements, or "themes", these names allowed for hundreds or even thousands of possible combinations. But a markedly different system of nomenclature arose in Italy, where the personal name was joined by a hereditary surname. Over time, this binomial system expanded to include additional names and designations.[1][2]

https://en.wikipedia.org/wiki/Gens
In ancient Rome, a gens (/ˈɡɛns/ or /ˈdʒɛnz/), plural gentes, was a family consisting of all those individuals who shared the same nomen and claimed descent from a common ancestor. A branch of a gens was called a stirps (plural stirpes). The gens was an important social structure at Rome and throughout Italy during the period of the Roman Republic. Much of an individual's social standing depended on the gens to which he belonged. Certain gentes were considered patrician, others plebeian, while some had both patrician and plebeian branches. The importance of membership in a gens declined considerably in imperial times.[1][2]

...

The word gens is sometimes translated as "race" or "nation", meaning a people descended from a common ancestor (rather than sharing a common physical trait). It can also be translated as "clan" or "tribe", although the word tribus has a separate and distinct meaning in Roman culture. A gens could be as small as a single family, or could include hundreds of individuals. According to tradition, in 479 BC the gens Fabia alone were able to field a militia consisting of three hundred and six men of fighting age. The concept of the gens was not uniquely Roman, but was shared with communities throughout Italy, including those who spoke Italic languages such as Latin, Oscan, and Umbrian as well as the Etruscans. All of these peoples were eventually absorbed into the sphere of Roman culture.[1][2][3][4]

...

Persons could be adopted into a gens and acquire its nomen. A libertus, or "freedman", usually assumed the nomen (and sometimes also the praenomen) of the person who had manumitted him, and a naturalized citizen usually took the name of the patron who granted his citizenship. Freedmen and newly enfranchised citizens were not technically part of the gentes whose names they shared, but within a few generations it often became impossible to distinguish their descendants from the original members. In practice this meant that a gens could acquire new members and even new branches, either by design or by accident.[1][2][7]

Ancient Greek personal names: https://en.wikipedia.org/wiki/Ancient_Greek_personal_names
Ancient Greeks usually had one name, but another element was often added in semi-official contexts or to aid identification: a father’s name (patronym) in the genitive case, or in some regions as an adjectival formulation. A third element might be added, indicating the individual’s membership in a particular kinship or other grouping, or city of origin (when the person in question was away from that city). Thus the orator Demosthenes, while proposing decrees in the Athenian assembly, was known as "Demosthenes, son of Demosthenes of Paiania"; Paiania was the deme or regional sub-unit of Attica to which he belonged by birth. If Americans used that system, Abraham Lincoln would have been called "Abraham, son of Thomas of Kentucky" (where he was born). On some rare occasions, if a person was illegitimate or fathered by a non-citizen, they might use their mother's name (metronym) instead of their father's. Ten days after a birth, relatives on both sides were invited to a sacrifice and feast called dekátē (δεκάτη), 'tenth day'; on this occasion the father formally named the child.[3]

...

In many contexts, etiquette required that respectable women be spoken of as the wife or daughter of X rather than by their own names.[6] On gravestones or dedications, however, they had to be identified by name. Here, the patronymic formula "son of X" used for men might be replaced by "wife of X", or supplemented as "daughter of X, wife of Y".

Many women bore forms of standard masculine names, with a feminine ending substituted for the masculine. Many standard names related to specific masculine achievements had a common feminine equivalent; the counterpart of Nikomachos, "victorious in battle", would be Nikomachē. The taste mentioned above for giving family members related names was one motive for the creation of such feminine forms. There were also feminine names with no masculine equivalent, such as Glykera "sweet one"; Hedistē "most delightful".
wiki  history  iron-age  mediterranean  the-classics  conquest-empire  culture  language  foreign-lang  social-norms  kinship  class  legacy  democracy  status  multi  gender  syntax  protocol-metadata 
august 2018 by nhaliday
WHO | Priority environment and health risks
also: http://www.who.int/heli/risks/vectors/vector/en/

Environmental factors are a root cause of a significant disease burden, particularly in developing countries. An estimated 25% of death and disease globally, and nearly 35% in regions such as sub-Saharan Africa, is linked to environmental hazards. Some key areas of risk include the following:

- Unsafe water, poor sanitation and hygiene kill an estimated 1.7 million people annually, particularly as a result of diarrhoeal disease.
- Indoor smoke from solid fuels kills an estimated 1.6 million people annually due to respiratory diseases.
- Malaria kills over 1.2 million people annually, mostly African children under the age of five. Poorly designed irrigation and water systems, inadequate housing, poor waste disposal and water storage, deforestation and loss of biodiversity, all may be contributing factors to the most common vector-borne diseases including malaria, dengue and leishmaniasis.
- Urban air pollution generated by vehicles, industries and energy production kills approximately 800 000 people annually.
- Unintentional acute poisonings kill 355 000 people globally each year. In developing countries, where two-thirds of these deaths occur, such poisonings are associated strongly with excessive exposure to, and inappropriate use of, toxic chemicals and pesticides present in occupational and/or domestic environments.
- Climate change impacts including more extreme weather events, changed patterns of disease and effects on agricultural production, are estimated to cause over 150 000 deaths annually.

ed.:
Note the high point at human origin (Africa, Middle East) and Asia. Low points in New World and Europe/Russia. Probably key factor in explaining human psychological variation (Haidt axes, individualism-collectivism, kinship structure, etc.). E.g., compare Islam/Judaism (circumcision, food preparation/hygiene rules) and Christianity (orthodoxy more than orthopraxy, no arbitrary practices for group-marking).

I wonder if the dietary and hygiene laws of Christianity get up-regulated in higher parasite load places (the US South, Middle Eastern Christianity, etc.)?

Also, the reason for this variation probably boils down to how long local microbes have had to adapt to the human immune system.

obv. correlation: https://pinboard.in/u:nhaliday/b:074ecdf30c50

Tropical disease: https://en.wikipedia.org/wiki/Tropical_disease
Tropical diseases are diseases that are prevalent in or unique to tropical and subtropical regions.[1] The diseases are less prevalent in temperate climates, due in part to the occurrence of a cold season, which controls the insect population by forcing hibernation. However, many were present in northern Europe and northern America in the 17th and 18th centuries before modern understanding of disease causation. The initial impetus for tropical medicine was to protect the health of colonialists, notably in India under the British Raj.[2] Insects such as mosquitoes and flies are by far the most common disease carrier, or vector. These insects may carry a parasite, bacterium or virus that is infectious to humans and animals. Most often disease is transmitted by an insect "bite", which causes transmission of the infectious agent through subcutaneous blood exchange. Vaccines are not available for most of the diseases listed here, and many do not have cures.

cf. Galton: https://pinboard.in/u:nhaliday/b:f72f8e03e729
org:gov  org:ngo  trivia  maps  data  visualization  pro-rata  demographics  death  disease  spreading  parasites-microbiome  world  developing-world  africa  MENA  asia  china  sinosphere  orient  europe  the-great-west-whale  occident  explanans  individualism-collectivism  n-factor  things  phalanges  roots  values  anthropology  cultural-dynamics  haidt  scitariat  morality  correlation  causation  migration  sapiens  history  antiquity  time  bio  EEA  eden-heaven  religion  christianity  islam  judaism  theos  ideology  database  list  tribalism  us-them  archaeology  environment  nature  climate-change  atmosphere  health  fluid  farmers-and-foragers  age-of-discovery  usa  the-south  speculation  questions  flexibility  epigenetics  diet  food  sanctity-degradation  multi  henrich  kinship  gnon  temperature  immune  investing  cost-benefit  tradeoffs  org:davos 
july 2018 by nhaliday
Does left-handedness occur more in certain ethnic groups than others?
Yes. In some Aboriginal tribes in Australia, about 70% of the population is left-handed. The share is also above 50% in some South American tribes.

The reason is the same in both cases: a recent past of extreme aggression against other tribes. Left-handedness is caused by recessive genes, but being left-handed is an advantage in hand-to-hand combat against a right-handed opponent: since right-handedness is genetically dominant and therefore the majority disposition in most human populations, a right-hander has usually trained extensively against other right-handers and lacks experience against left-handers. Should a particular tribe go through too many periods of war, its proportion of left-handers will naturally rise. But since the enemy tribe’s proportion of left-handers rises as well, there comes a point at which the natural fighting advantage dissipates, and the proportion can only climb higher if the tribe continuously finds new, majority right-handed groups to fight.

...

So the natural question is: given their advantage in one-on-one combat, why doesn’t the percentage grow all the way up to 50% or slightly higher? Because there are COSTS associated with being left-handed: our neural wiring is apparently biased toward right-handedness, which shows up as a reduced life expectancy for lefties. A mathematical model was therefore proposed to explain the distribution of left-handers across different societies:

THE FIGHTING HYPOTHESIS: STABILITY OF POLYMORPHISM IN HUMAN HANDEDNESS

http://gepv.univ-lille1.fr/downl...
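
[ed.: A minimal replicator-dynamics sketch of that frequency-dependent logic, in Python. The benefit b and cost c below are hypothetical round numbers picked for illustration, not the paper’s fitted parameters; the point is just that a fighting advantage which shrinks as lefties become common, combined with a fixed cost, stabilizes at an interior equilibrium p* = (b − c) / (2b), about 10% for these values, rather than at 0% or 50%.]

def step(p, b=0.05, c=0.04):
    # One generation of discrete replicator dynamics.
    # Lefties' combat payoff scales with the share of right-handed opponents
    # (1 - p), who lack practice against them; righties get the mirror-image
    # payoff b * p; lefties also pay a frequency-independent cost c.
    w_left = 1.0 + b * (1.0 - p) - c
    w_right = 1.0 + b * p
    return p * w_left / (p * w_left + (1.0 - p) * w_right)

p = 0.30  # start from a high, "wartime" frequency
for _ in range(2000):
    p = step(p)
print(f"equilibrium left-handed share: {p:.3f}")  # ~0.100 = (b - c) / (2b)

[ed.: Raising b (more frequent combat) pushes p* up toward 50%, which would be this model’s account of the unusually left-handed warlike tribes above.]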

Further, it appears that the average rate of left-handedness in humans (~10%) hasn’t changed in thousands of years, judging by hand paintings in caves:

Frequency-dependent maintenance of left handedness in humans.

Handedness frequency over more than 10,000 years

[ed.: Compare with Julius Evola's "left-hand path".]
q-n-a  qra  trivia  cocktail  farmers-and-foragers  history  antiquity  race  demographics  bio  EEA  evolution  context  peace-violence  war  ecology  EGT  unintended-consequences  game-theory  equilibrium  anthropology  cultural-dynamics  sapiens  data  database  trends  cost-benefit  strategy  time-series  art  archaeology  measurement  oscillation  pro-rata  iteration-recursion  gender  male-variability  cliometrics  roots  explanation  explanans  correlation  causation  branches 
july 2018 by nhaliday
Dying and Rising Gods - Dictionary definition of Dying and Rising Gods | Encyclopedia.com: FREE online dictionary
https://en.wikipedia.org/wiki/Dying-and-rising_deity
While the concept of a "dying-and-rising god" has a longer history, it was significantly advocated by Frazer's Golden Bough (1906–1914). At first received very favourably, the idea was attacked by Roland de Vaux in 1933, and was the subject of controversial debate over the following decades.[31] One of the leading scholars in the deconstruction of Frazer's "dying-and-rising god" category was Jonathan Z. Smith, whose 1969 dissertation discusses Frazer's Golden Bough,[32] and who in Mircea Eliade's 1987 Encyclopedia of religion wrote the "Dying and rising gods" entry, where he dismisses the category as "largely a misnomer based on imaginative reconstructions and exceedingly late or highly ambiguous texts", suggesting a more detailed categorisation into "dying gods" and "disappearing gods", arguing that before Christianity, the two categories were distinct and gods who "died" did not return, and those who returned never truly "died".[33][34] Smith gave a more detailed account of his views specifically on the question of parallels to Christianity in Drudgery Divine (1990).[35] Smith's 1987 article was widely received, and during the 1990s, scholarly consensus seemed to shift towards his rejection of the concept as oversimplified, although it continued to be invoked by scholars writing about Ancient Near Eastern mythology.[36] As of 2009, the Encyclopedia of Psychology and Religion summarizes the current scholarly consensus as ambiguous, with some scholars rejecting Frazer's "broad universalist category" and preferring to emphasize the differences between the various traditions, while others continue to view the category as applicable.[9] Gerald O'Collins states that surface-level application of analogous symbolism is a case of parallelomania which exaggerates the importance of trifling resemblances, long abandoned by mainstream scholars.[37]

Beginning with an overview of the Athenian ritual of growing and withering herb gardens at the Adonis festival, in his book The Gardens of Adonis Marcel Detienne suggests that rather than being a stand-in for crops in general (and therefore the cycle of death and rebirth), these herbs (and Adonis) were part of a complex of associations in the Greek mind that centered on spices.[38] These associations included seduction, trickery, gourmandizing, and the anxieties of childbirth.[39] From his point of view, Adonis's death is only one datum among the many that must be used to analyze the festival, the myth, and the god.[39][40]
wiki  reference  myth  ritual  religion  christianity  theos  conquest-empire  intricacy  contrarianism  error  gavisti  culture  europe  mediterranean  history  iron-age  the-classics  MENA  leadership  government  gender  sex  cycles  death  mystic  multi  sexuality  food  correlation  paganism 
june 2018 by nhaliday
Why read old philosophy? | Meteuphoric
(This story suggests that physics students may be missing out on learning the styles of thought that produce progress in physics. My guess is that they instead learn them in grad school, when they are doing research themselves, by emulating their supervisors, and that the helpfulness of this might partially explain why Nobel prizewinner advisors beget Nobel prizewinner students.)

The story I hear about philosophy—and I actually don’t know how much of it is true—is that as bits of philosophy come to acquire any methodological tools other than ‘think about it’, they break off and become their own sciences. This would explain philosophy’s lone status in studying old thinkers rather than impersonal methods—philosophy is the lone ur-discipline with no impersonal methods, only thinking.

This suggests a research project: try summarizing what Aristotle is doing rather than Aristotle’s views. Then write a nice short textbook about it.
ratty  learning  reading  studying  prioritizing  history  letters  philosophy  science  comparison  the-classics  canon  speculation  reflection  big-peeps  iron-age  mediterranean  roots  lens  core-rats  thinking  methodology  grad-school  academia  physics  giants  problem-solving  meta:research  scholar  the-trenches  explanans  crux  metameta  duplication  sociality  innovation  quixotic  meta:reading  classic 
june 2018 by nhaliday
Dividuals – The soul is not an indivisible unit and has no unified will
Towards A More Mature Atheism: https://dividuals.wordpress.com/2015/09/17/towards-a-more-mature-atheism/
Human intelligence evolved as a social intelligence, for the purposes of social cooperation, social competition and social domination. It evolved to make us efficient at cooperating to remove obstacles, especially the kinds of obstacles that tend to fight back, i.e., at warfare. If you have ever studied strategy or tactics, or just played really good board games, you have probably found that your brain seems strangely well suited to specifically this kind of intellectual activity. It’s not necessarily easier than studying physics, and yet it somehow feels more natural. Physics is like swimming; strategy and tactics are like running. The reason is that our brains truly evolved to be strategic, tactical, diplomatic computers, not physics computers. The question our brains are REALLY good at answering is “Just what does this guy really want?”

...

Thus, a very basic failure mode of the human brain is to overdetect agency.

I think this is partially what SSC wrote about in Mysticism And Pattern-Matching too. But instead of mystical experiences, my focus is on our brains claiming to detect agency where there is none. Thus my view is closer to Richard Carrier’s definition of the supernatural: it is the idea that some mental things cannot be reduced to nonmental things.

...

Meaning actually means will and agency. It took me a while to figure that one out. When we look for the meaning of life, a meaning in life, or a meaningful life, we look for a will or agency generally outside our own.

...

I am a double oddball – kind of autistic, but still far more interested in human social dynamics, such as history, than in natural sciences or technology. As a result, I do feel a calling to religion – the human world, as opposed to outer space, the human city, the human history, is such a perfect fit for a view like that of Catholicism! The reason is that Catholicism is the pinnacle of human intellectual efforts dealing with human agency. Ideas like Augustine’s three failure modes of the human brain (greed, lust, and the desire for power and status) come about as close to correct psychological theories as anything formulated before the scientific method was discovered. Just read your Chesterbelloc and Lewis. And of course, because the agency radars of Catholics run at full burst, they overdetect it and thus believe in a god behind the universe. My brain, due to my deep interest in human agency and its consequences, would also like to be religious: wouldn’t it be great if the universe were made by something we could talk to? Everything else I am interested in, from field generals to municipal governments, is an entity I can talk to.

...

I also dislike that atheists often refuse to propose a falsifiable theory because they claim the burden of proof is not on them. Strictly speaking it can be true, but it is still good form to provide one.

Since I am something like a "nontheistic Catholic" anyway (e.g. I believe in original sin from the practical, political angle; I just think it has natural, not supernatural, causes: evolution, the move from hunting-gathering to agriculture, etc.), all one would need to do to make me fully so is to plug a God concept into my mind.

If you can convince me that my brain is not actually overdetecting agency when I feel a calling to religion, if you can convince me that my brain and most human brains detect agency just about right, there will be no reason for me not to believe in God. Because if there were any sort of agency behind the universe, the smartest bet would be that this agency is the God of Thomas Aquinas’ Summa. That guy was quite simply a genius.

How to convince me my brain is not overdetecting agency? The simplest way is to convince me that magic, witchcraft, or superstition in general is real, and real in the supernatural sense (I do know Wiccans who cast spells and claim they are natural, not supernatural: divination spells make the brain more aware of hidden details, healing spells recruit the healing processes of the body etc.) You see, Catholics generally do believe in magic and witchcraft, as in: “These really do something, and they do something bad, so never practice them.”

The Strange Places the “God of the Gaps” Takes You: https://dividuals.wordpress.com/2018/05/25/the-strange-places-the-god-of-the-gaps-takes-you/
I assume people are familiar with the God of the Gaps argument. Well, it is usually just an accusation, but Newton for instance really pulled one.

But natural science is inherently different from the humanities, because in natural science you build a predictive model of which you are not part. You are just a point-like neutral observer.

You cannot do that with other human minds, because you just don’t have the computing power to simulate a roughly similarly intelligent mind and still have enough left over to actually work with your model. So you put yourself into the predictive model; you make yourself a part of the model itself. You use a certain empathic kind of understanding, a “what would I do in that guy’s shoes?”, and generate your predictions that way.

...

Which means that while natural science is relatively new and strongly correlates with technological progress, this empathic, self-programming mode of the humanities could be practiced millennia ago as well: you don’t need math or tools for it, and you probably cannot expect anything like straight-line progress from it. Maybe some of the wisdom people figure out this way is really timeless and we just keep on rediscovering it.

So imagine, say, Catholicism as a large set of humanities. Sociology, social psychology, moral philosophy in the pragmatic, scientific sense (“What morality makes a society not collapse and actually prosper?”), life wisdom and all that. Basically just figuring out how people tick, how societies tick and how to make them tick well.

...

What do? Well, the obvious move is to pull a Newton and inject a God of the Gaps into your humanities. We tick like that because God. We must do so and so to tick well because God.

...

What I am saying is that we are at some point probably going to prove pretty much all of the this-worldly, pragmatic (moral, sociological, psychological etc.) aspects of Catholicism correct by something like evolutionary psychology.

And I am saying that while it will dramatically increase our respect for religion, this will also be probably a huge blow to theism. I don’t want that to happen, but I think it will. Because eliminating God from the gaps of natural science does not hurt faith much. But eliminating God from the gaps of the humanities and yes, religion itself?

My Kind of Atheist: http://www.overcomingbias.com/2018/08/my-kind-of-athiest.html
I think I’ve mentioned somewhere in public that I’m now an atheist, even though I grew up in a very Christian family, and I even joined a “cult” at a young age (against disapproving parents). The proximate cause of my atheism was learning physics in college. But I don’t think I’ve ever clarified in public what kind of an “atheist” or “agnostic” I am. So here goes.

The universe is vast and most of it is very far away in space and time, making our knowledge of those distant parts very thin. So it isn’t at all crazy to think that very powerful beings exist somewhere far away out there, or far before us or after us in time. In fact, many of us hope that we now can give rise to such powerful beings in the distant future. If those powerful beings count as “gods”, then I’m certainly open to the idea that such gods exist somewhere in space-time.

It also isn’t crazy to imagine powerful beings that are “closer” in space and time, but far away in causal connection. They could be in parallel “planes”, in other dimensions, or in “dark” matter that doesn’t interact much with our matter. Or they might perhaps have little interest in influencing or interacting with our sort of things. Or they might just “like to watch.”

But to most religious people, a key emotional appeal of religion is the idea that gods often “answer” prayer by intervening in their world. Sometimes intervening in their head to make them feel different, but also sometimes responding to prayers about their test tomorrow, their friend’s marriage, or their aunt’s hemorrhoids. It is these sort of prayer-answering “gods” in which I just can’t believe. Not that I’m absolutely sure they don’t exist, but I’m sure enough that the term “atheist” fits much better than the term “agnostic.”

These sort of gods supposedly intervene in our world millions of times daily to respond positively to particular prayers, and yet they do not noticeably intervene in world affairs. Not only can we find no physical trace of any machinery or system by which such gods exert their influence, even though we understand the physics of our local world very well, but the history of life and civilization shows no obvious traces of their influence. They know of terrible things that go wrong in our world, but instead of doing much about those things, these gods instead prioritize not leaving any clear evidence of their existence or influence. And yet for some reason they don’t mind people believing in them enough to pray to them, as they often reward such prayers with favorable interventions.
gnon  blog  stream  politics  polisci  ideology  institutions  thinking  religion  christianity  protestant-catholic  history  medieval  individualism-collectivism  n-factor  left-wing  right-wing  tribalism  us-them  cohesion  sociality  ecology  philosophy  buddhism  gavisti  europe  the-great-west-whale  occident  germanic  theos  culture  society  cultural-dynamics  anthropology  volo-avolo  meaningness  coalitions  theory-of-mind  coordination  organizing  psychology  social-psych  fashun  status  nationalism-globalism  models  power  evopsych  EEA  deep-materialism  new-religion  metameta  social-science  sociology  multi  definition  intelligence  science  comparison  letters  social-structure  existence  nihil  ratty  hanson  intricacy  reflection  people  physics  paganism 
june 2018 by nhaliday