nhaliday : logic   47

The Existential Risk of Math Errors - Gwern.net
How big is this upper bound? Mathematicians have often made errors in proofs. But it’s rarer for ideas to be accepted for a long time and then rejected. We can divide errors into 2 basic cases corresponding to type I and type II errors:

1. Mistakes where the theorem is still true, but the proof was incorrect (type I)
2. Mistakes where the theorem was false, and the proof was also necessarily incorrect (type II)

Before someone comes up with a final answer, a mathematician may have many levels of intuition in formulating & working on the problem, but we’ll consider the final end-product where the mathematician feels satisfied that he has solved it. Case 1 is perhaps the most common case, with innumerable examples; this is sometimes due to mistakes in the proof that anyone would accept as a mistake, but many of these cases are due to changing standards of proof. For example, when David Hilbert discovered errors in Euclid’s proofs which no one had noticed before, the theorems were still true, and the gaps were more due to Hilbert being a modern mathematician thinking in terms of formal systems (which of course Euclid did not think in). (David Hilbert himself turns out to be a useful example of the other kind of error: his famous list of 23 problems was accompanied by definite opinions on the outcome of each problem and sometimes timings, several of which were wrong or questionable.[5]) Similarly, early calculus used ‘infinitesimals’, which were sometimes treated as being 0 and sometimes treated as an indefinitely small non-zero number; this was incoherent, and strictly speaking practically all of the calculus results were wrong because they relied on an incoherent concept - but of course the results were some of the greatest mathematical work ever conducted[6], and when later mathematicians put calculus on a more rigorous footing, they immediately re-derived those results (sometimes with important qualifications), and doubtless as modern math evolves other fields have sometimes needed to go back and clean up the foundations and will in the future.[7]

...

Isaac Newton, incidentally, gave two proofs of the same solution to a problem in probability, one via enumeration and the other more abstract; the enumeration was correct, but the other proof was totally wrong, and this was not noticed for a long time, leading Stigler to remark:

...

TYPE I > TYPE II?
“Lefschetz was a purely intuitive mathematician. It was said of him that he had never given a completely correct proof, but had never made a wrong guess either.”
- Gian-Carlo Rota[13]

Case 2 is disturbing, since it is a case in which we wind up with false beliefs and also false beliefs about our beliefs (we no longer know that we don’t know). Case 2 could lead to extinction.

...

Except, errors do not seem to be evenly & randomly distributed between case 1 and case 2. There seem to be far more case 1s than case 2s, as already mentioned in the early calculus example: far more than 50% of the early calculus results were correct when checked more rigorously. Richard Hamming attributes to Ralph Boas a comment that while editing Mathematical Reviews that “of the new results in the papers reviewed most are true but the corresponding proofs are perhaps half the time plain wrong”.

...

Gian-Carlo Rota gives us an example with Hilbert:

...

Olga labored for three years; it turned out that all mistakes could be corrected without any major changes in the statement of the theorems. There was one exception, a paper Hilbert wrote in his old age, which could not be fixed; it was a purported proof of the continuum hypothesis, you will find it in a volume of the Mathematische Annalen of the early thirties.

...

Leslie Lamport advocates for machine-checked proofs and a more rigorous style of proofs similar to natural deduction, noting a mathematician acquaintance guesses at a broad error rate of 1/3[29] and that he routinely found mistakes in his own proofs and, worse, believed false conjectures[30].

[more on these "structured proofs":
https://mathoverflow.net/questions/35727/community-experiences-writing-lamports-structured-proofs
]
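Lamport’s point is easiest to see in a proof assistant, where every step must be justified or the file is rejected. A minimal illustration in Lean 4 (toy examples of my own for illustration, not drawn from Lamport’s paper):

```lean
-- Minimal machine-checked proofs (illustrative toy examples).
-- The checker, not the reader, verifies every step.
example : 2 + 2 = 4 := rfl   -- true by computation

-- A small proof decomposed into named subgoals, loosely in the
-- hierarchical style Lamport advocates for paper proofs:
example (p q : Prop) (hp : p) (hq : q) : p ∧ q := by
  constructor   -- split the conjunction into two subgoals
  · exact hp    -- subgoal 1: p
  · exact hq    -- subgoal 2: q
```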

We can probably add software to that list: early software engineering work found that, dismayingly, bug rates seem to be simply a function of lines of code, and one would expect diseconomies of scale. So one would expect that in going from the ~4,000 lines of code of the Microsoft DOS operating system kernel to the ~50,000,000 lines of code in Windows Server 2003 (with full systems of applications and libraries being even larger: the comprehensive Debian repository in 2007 contained ~323,551,126 lines of code) that the number of active bugs at any time would be… fairly large. Mathematical software is hopefully better, but practitioners still run into issues (eg Durán et al 2014, Fonseca et al 2017) and I don’t know of any research pinning down how buggy key mathematical systems like Mathematica are or how much published mathematics may be erroneous due to bugs. This general problem led to predictions of doom and spurred much research into automated proof-checking, static analysis, and functional languages[31].
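The scaling claim can be made concrete with a back-of-the-envelope estimate. The defect density below is an illustrative assumption (industry folklore puts delivered-defect rates somewhere in the range of 1–25 per thousand lines), not a measurement of the systems named above:

```python
# Back-of-the-envelope: if bugs grow at least linearly with code size,
# then even a modest per-KLOC defect density implies a huge absolute count.
# The density figure is an illustrative assumption, not a measured value.
def estimated_bugs(lines_of_code: int, bugs_per_kloc: float = 10.0) -> int:
    """Naive linear model: defects ~ density * KLOC."""
    return round(bugs_per_kloc * lines_of_code / 1000)

print(estimated_bugs(4_000))        # DOS-kernel-sized codebase: ~40
print(estimated_bugs(50_000_000))   # Windows-Server-sized codebase: ~500,000
```

And this linear model is the optimistic case; diseconomies of scale would make the large-codebase figure worse.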

[related:
https://mathoverflow.net/questions/11517/computer-algebra-errors
I don't know any interesting bugs in symbolic algebra packages but I know a true, enlightening and entertaining story about something that looked like a bug but wasn't.

Define sinc(x) = sin(x)/x.

Someone found the following result in an algebra package: ∫₀^∞ sinc(x) dx = π/2
They then found the following results:

...

So of course when they got:

∫₀^∞ sinc(x) sinc(x/3) sinc(x/5) ⋯ sinc(x/15) dx = (467807924713440738696537864469/935615849440640907310521750000) π

hmm:
Which means that nobody knows Fourier analysis nowadays. Very sad and discouraging story... – fedja Jan 29 '10 at 18:47

--

Because the most popular systems are all commercial, they tend to guard their bug database rather closely -- making them public would seriously cut their sales. For example, for the open source project Sage (which is quite young), you can get a list of all the known bugs from this page: 1582 known issues as of Feb. 16th, 2010 (which includes feature requests, problems with documentation, etc).

That is an order of magnitude less than the commercial systems. And it's not because it is better; it is because it is younger and smaller. It might be better, but until Sage does a lot of analysis (about 40% of CAS bugs are there) and a fancy user interface (another 40%), it is too hard to compare.

I once ran a graduate course whose core topic was studying the fundamental disconnect between the algebraic nature of CAS and the analytic nature of what it is mostly used for. There are issues of logic -- CASes work more or less in an intensional logic, while most of analysis is stated in a purely extensional fashion. There is no well-defined 'denotational semantics' for expressions-as-functions, which strongly contributes to the deeper bugs in CASes.]
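The base case of the story is easy to reproduce symbolically. A sketch using SymPy as the algebra package (the anecdote does not identify which commercial system was involved; the full Borwein-style products are much slower to evaluate, so only the first integral is shown):

```python
# Symbolically verify the base case of the anecdote:
# the integral of sin(x)/x over (0, oo) equals pi/2.
import sympy as sp

x = sp.symbols('x', positive=True)
result = sp.integrate(sp.sin(x) / x, (x, 0, sp.oo))
print(result)  # pi/2
```

The surprise in the story is that the analogous products of sinc factors keep evaluating to π/2 only up to sinc(x/13); at sinc(x/15) the pattern breaks, which looks like a bug but is a genuine (Borwein-integral) result.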

...

Should such widely-believed conjectures as P≠NP or the Riemann hypothesis turn out to be false, then because they are assumed by so many existing proofs, a far larger math holocaust would ensue[38] - and our previous estimates of error rates will turn out to have been substantial underestimates. But it may be a cloud with a silver lining, if it doesn’t come at a time of danger.

https://mathoverflow.net/questions/338607/why-doesnt-mathematics-collapse-down-even-though-humans-quite-often-make-mista

more on formal methods in programming:
https://www.quantamagazine.org/formal-verification-creates-hacker-proof-code-20160920/
https://intelligence.org/2014/03/02/bob-constable/

Update: measured effort
In the October 2018 issue of Communications of the ACM there is an interesting article about Formally verified software in the real world with some estimates of the effort.

Interestingly (based on OS development for military equipment), it seems that producing formally proved software requires 3.3 times more effort than with traditional engineering techniques. So it's really costly.

On the other hand, it requires 2.3 times less effort to get high security software this way than with traditionally engineered software if you add the effort to make such software certified at a high security level (EAL 7). So if you have high reliability or security requirements there is definitely a business case for going formal.
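The two ratios combine into a quick cost comparison. The 3.3x and 2.3x multipliers are the article's estimates as quoted above; the baseline unit of effort is arbitrary:

```python
# Cost comparison sketch from the quoted CACM estimates.
# Baseline: 1 unit = traditional engineering effort, uncertified.
traditional = 1.0
formal = 3.3 * traditional             # formally verified from the start
traditional_certified = 2.3 * formal   # traditional route + EAL 7 certification
print(formal)                          # 3.3
print(round(traditional_certified, 2)) # 7.59
```

So on these numbers formal methods cost more than plain engineering but less than engineering-then-certifying, which is the claimed business case.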

WHY DON'T PEOPLE USE FORMAL METHODS?: https://www.hillelwayne.com/post/why-dont-people-use-formal-methods/
You can see examples of how all of these look at Let’s Prove Leftpad. HOL4 and Isabelle are good examples of “independent theorem” specs, SPARK and Dafny have “embedded assertion” specs, and Coq and Agda have “dependent type” specs.[6]

If you squint a bit it looks like these three forms of code spec map to the three main domains of automated correctness checking: tests, contracts, and types. This is not a coincidence. Correctness is a spectrum, and formal verification is one extreme of that spectrum. As we reduce the rigour (and effort) of our verification we get simpler and narrower checks, whether that means limiting the explored state space, using weaker types, or pushing verification to the runtime. Any means of total specification then becomes a means of partial specification, and vice versa: many consider Cleanroom a formal verification technique, which primarily works by pushing code review far beyond what’s humanly possible.

...

The question, then: “is 90/95/99% correct significantly cheaper than 100% correct?” The answer is very yes. We all are comfortable saying that a codebase we’ve well-tested and well-typed is mostly correct modulo a few fixes in prod, and we’re even writing more than four lines of code a day. In fact, the vast… [more]
ratty  gwern  analysis  essay  realness  truth  correctness  reason  philosophy  math  proofs  formal-methods  cs  programming  engineering  worse-is-better/the-right-thing  intuition  giants  old-anglo  error  street-fighting  heuristic  zooming  risk  threat-modeling  software  lens  logic  inference  physics  differential  geometry  estimate  distribution  robust  speculation  nonlinearity  cost-benefit  convexity-curvature  measure  scale  trivia  cocktail  history  early-modern  europe  math.CA  rigor  news  org:mag  org:sci  miri-cfar  pdf  thesis  comparison  examples  org:junk  q-n-a  stackex  pragmatic  tradeoffs  cracker-prog  techtariat  invariance  DSL  chart  ecosystem  grokkability  heavyweights  CAS  static-dynamic  lower-bounds  complexity  tcs  open-problems  big-surf  ideas  certificates-recognition  proof-systems  PCP  mediterranean  SDP  meta:prediction  epistemic  questions  guessing  distributed  overflow  nibble  soft-question  track-record  big-list  hmm  frontier  state-of-art  move-fast-(and-break-things)  grokkability-clarity  technical-writing  trust
july 2019 by nhaliday
Lateralization of brain function - Wikipedia
Language
Language functions such as grammar, vocabulary and literal meaning are typically lateralized to the left hemisphere, especially in right handed individuals.[3] While language production is left-lateralized in up to 90% of right-handers, it is more bilateral, or even right-lateralized, in approximately 50% of left-handers.[4]

Broca's area and Wernicke's area, two areas associated with the production of speech, are located in the left cerebral hemisphere for about 95% of right-handers, but about 70% of left-handers.[5]:69

Auditory and visual processing
The processing of visual and auditory stimuli, spatial manipulation, facial perception, and artistic ability are represented bilaterally.[4] Numerical estimation, comparison and online calculation depend on bilateral parietal regions[6][7] while exact calculation and fact retrieval are associated with left parietal regions, perhaps due to their ties to linguistic processing.[6][7]

...

Depression is linked with a hyperactive right hemisphere, with evidence of selective involvement in "processing negative emotions, pessimistic thoughts and unconstructive thinking styles", as well as vigilance, arousal and self-reflection, and a relatively hypoactive left hemisphere, "specifically involved in processing pleasurable experiences" and "relatively more involved in decision-making processes".

Chaos and Order; the right and left hemispheres: https://orthosphere.wordpress.com/2018/05/23/chaos-and-order-the-right-and-left-hemispheres/
In The Master and His Emissary, Iain McGilchrist writes that a creature like a bird needs two types of consciousness simultaneously. It needs to be able to focus on something specific, such as pecking at food, while it also needs to keep an eye out for predators which requires a more general awareness of environment.

These are quite different activities. The Left Hemisphere (LH) is adapted for a narrow focus. The Right Hemisphere (RH) for the broad. The brains of human beings have the same division of function.

The LH governs the right side of the body, the RH, the left side. With birds, the left eye (RH) looks for predators, the right eye (LH) focuses on food and specifics. Since danger can take many forms and is unpredictable, the RH has to be very open-minded.

The LH is for narrow focus, the explicit, the familiar, the literal, tools, mechanism/machines and the man-made. The broad focus of the RH is necessarily more vague and intuitive and handles the anomalous, novel, metaphorical, the living and organic. The LH is high resolution but narrow, the RH low resolution but broad.

The LH exhibits unrealistic optimism and self-belief. The RH has a tendency towards depression and is much more realistic about a person’s own abilities. LH has trouble following narratives because it has a poor sense of “wholes.” In art it favors flatness, abstract and conceptual art, black and white rather than color, simple geometric shapes and multiple perspectives all shoved together, e.g., cubism. Particularly RH paintings emphasize vistas with great depth of field and thus space and time,[1] emotion, figurative painting and scenes related to the life world. In music, LH likes simple, repetitive rhythms. The RH favors melody, harmony and complex rhythms.

...

Schizophrenia is a disease of extreme LH emphasis. Since empathy is RH and the ability to notice emotional nuance facially, vocally and bodily expressed, schizophrenics tend to be paranoid and are often convinced that the real people they know have been replaced by robotic imposters. This is at least partly because they lose the ability to intuit what other people are thinking and feeling – hence they seem robotic and suspicious.

Oswald Spengler’s The Decline of the West as well as McGilchrist characterize the West as awash in phenomena associated with an extreme LH emphasis. Spengler argues that Western civilization was originally much more RH (to use McGilchrist’s categories) and that all its most significant artistic (in the broadest sense) achievements were triumphs of RH accentuation.

The RH is where novel experiences and the anomalous are processed and where mathematical, and other, problems are solved. The RH is involved with the natural, the unfamiliar, the unique, emotions, the embodied, music, humor, understanding intonation and emotional nuance of speech, the metaphorical, nuance, and social relations. It has very little speech, but the RH is necessary for processing all the nonlinguistic aspects of speaking, including body language. Understanding what someone means by vocal inflection and facial expressions is an intuitive RH process rather than explicit.

...

RH is very much the center of lived experience; of the life world with all its depth and richness. The RH is “the master” from the title of McGilchrist’s book. The LH ought to be no more than the emissary; the valued servant of the RH. However, in the last few centuries, the LH, which has tyrannical tendencies, has tried to become the master. The LH is where the ego is predominantly located. In split brain patients where the LH and the RH are surgically divided (this is done sometimes in the case of epileptic patients) one hand will sometimes fight with the other. In one man’s case, one hand would reach out to hug his wife while the other pushed her away. One hand reached for one shirt, the other another shirt. Or a patient will be driving a car and one hand will try to turn the steering wheel in the opposite direction. In these cases, the “naughty” hand is usually the left hand (RH), while the patient tends to identify herself with the right hand governed by the LH. The two hemispheres have quite different personalities.

The connection between LH and ego can also be seen in the fact that the LH is competitive, contentious, and agonistic. It wants to win. It is the part of you that hates to lose arguments.

Using the metaphor of Chaos and Order, the RH deals with Chaos – the unknown, the unfamiliar, the implicit, the emotional, the dark, danger, mystery. The LH is connected with Order – the known, the familiar, the rule-driven, the explicit, and light of day. Learning something means to take something unfamiliar and making it familiar. Since the RH deals with the novel, it is the problem-solving part. Once understood, the results are dealt with by the LH. When learning a new piece on the piano, the RH is involved. Once mastered, the result becomes a LH affair. The muscle memory developed by repetition is processed by the LH. If errors are made, the activity returns to the RH to figure out what went wrong; the activity is repeated until the correct muscle memory is developed in which case it becomes part of the familiar LH.

Science is an attempt to find Order. It would not be necessary if people lived in an entirely orderly, explicit, known world. The lived context of science implies Chaos. Theories are reductive and simplifying and help to pick out salient features of a phenomenon. They are always partial truths, though some are more partial than others. The alternative to a certain level of reductionism or partialness would be to simply reproduce the world which of course would be both impossible and unproductive. The test for whether a theory is sufficiently non-partial is whether it is fit for purpose and whether it contributes to human flourishing.

...

Analytic philosophers pride themselves on trying to do away with vagueness. To do so, they tend to jettison context which cannot be brought into fine focus. However, in order to understand things and discern their meaning, it is necessary to have the big picture, the overview, as well as the details. There is no point in having details if the subject does not know what they are details of. Such philosophers also tend to leave themselves out of the picture even when what they are thinking about has reflexive implications. John Locke, for instance, tried to banish the RH from reality. All phenomena having to do with subjective experience he deemed unreal and once remarked about metaphors, a RH phenomenon, that they are “perfect cheats.” Analytic philosophers tend to check the logic of the words on the page and not to think about what those words might say about them. The trick is for them to recognize that they and their theories, which exist in minds, are part of reality too.

The RH test for whether someone actually believes something can be found by examining his actions. If he finds that he must regard his own actions as free, and, in order to get along with other people, must also attribute free will to them and treat them as free agents, then he effectively believes in free will – no matter his LH theoretical commitments.

...

We do not know the origin of life. We do not know how or even if consciousness can emerge from matter. We do not know the nature of 96% of the matter of the universe. Clearly all these things exist. They can provide the subject matter of theories but they continue to exist as theorizing ceases or theories change. Not knowing how something is possible is irrelevant to its actual existence. An inability to explain something is ultimately neither here nor there.

If thought begins and ends with the LH, then thinking has no content – content being provided by experience (RH), and skepticism and nihilism ensue. The LH spins its wheels self-referentially, never referring back to experience. Theory assumes such primacy that it will simply outlaw experiences and data inconsistent with it; a profoundly wrong-headed approach.

...

Gödel’s Theorem proves that not everything true can be proven to be true. This means there is an ineradicable role for faith, hope and intuition in every moderately complex human intellectual endeavor. There is no one set of consistent axioms from which all other truths can be derived.

Alan Turing’s proof of the halting problem proves that there is no effective procedure for finding effective procedures. Without a mechanical decision procedure, (LH), when it comes to … [more]
gnon  reflection  books  summary  review  neuro  neuro-nitgrit  things  thinking  metabuch  order-disorder  apollonian-dionysian  bio  examples  near-far  symmetry  homo-hetero  logic  inference  intuition  problem-solving  analytical-holistic  n-factor  europe  the-great-west-whale  occident  alien-character  detail-architecture  art  theory-practice  philosophy  being-becoming  essence-existence  language  psychology  cog-psych  egalitarianism-hierarchy  direction  reason  learning  novelty  science  anglo  anglosphere  coarse-fine  neurons  truth  contradiction  matching  empirical  volo-avolo  curiosity  uncertainty  theos  axioms  intricacy  computation  analogy  essay  rhetoric  deep-materialism  new-religion  knowledge  expert-experience  confidence  biases  optimism  pessimism  realness  whole-partial-many  theory-of-mind  values  competition  reduction  subjective-objective  communication  telos-atelos  ends-means  turing  fiction  increase-decrease  innovation  creative  thick-thin  spengler  multi  ratty  hanson  complex-systems  structure  concrete  abstraction  network-s
september 2018 by nhaliday
Moravec's paradox is the discovery by artificial intelligence and robotics researchers that, contrary to traditional assumptions, high-level reasoning requires very little computation, but low-level sensorimotor skills require enormous computational resources. The principle was articulated by Hans Moravec, Rodney Brooks, Marvin Minsky and others in the 1980s. As Moravec writes, "it is comparatively easy to make computers exhibit adult level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility".[1]

Similarly, Minsky emphasized that the most difficult human skills to reverse engineer are those that are unconscious. "In general, we're least aware of what our minds do best", he wrote, and added "we're more aware of simple processes that don't work well than of complex ones that work flawlessly".[2]

...

One possible explanation of the paradox, offered by Moravec, is based on evolution. All human skills are implemented biologically, using machinery designed by the process of natural selection. In the course of their evolution, natural selection has tended to preserve design improvements and optimizations. The older a skill is, the more time natural selection has had to improve the design. Abstract thought developed only very recently, and consequently, we should not expect its implementation to be particularly efficient.

As Moravec writes:

Encoded in the large, highly evolved sensory and motor portions of the human brain is a billion years of experience about the nature of the world and how to survive in it. The deliberate process we call reasoning is, I believe, the thinnest veneer of human thought, effective only because it is supported by this much older and much more powerful, though usually unconscious, sensorimotor knowledge. We are all prodigious olympians in perceptual and motor areas, so good that we make the difficult look easy. Abstract thought, though, is a new trick, perhaps less than 100 thousand years old. We have not yet mastered it. It is not all that intrinsically difficult; it just seems so when we do it.[3]

A compact way to express this argument would be:

- We should expect the difficulty of reverse-engineering any human skill to be roughly proportional to the amount of time that skill has been evolving in animals.
- The oldest human skills are largely unconscious and so appear to us to be effortless.
- Therefore, we should expect skills that appear effortless to be difficult to reverse-engineer, but skills that require effort may not necessarily be difficult to engineer at all.
concept  wiki  reference  paradox  ai  intelligence  reason  instinct  neuro  psychology  cog-psych  hardness  logic  deep-learning  time  evopsych  evolution  sapiens  the-self  EEA  embodied  embodied-cognition  abstraction  universalism-particularism  gnosis-logos  robotics
june 2018 by nhaliday
Theological differences between the Catholic Church and the Eastern Orthodox Church - Wikipedia
Did the Filioque Ruin the West?: https://contingentnotarbitrary.com/2017/06/15/the-filioque-ruined-the-west/
The theology of the filioque makes the Father and the Son equal as sources of divinity. Flattening the hierarchy implicit in the Trinity does away with the Monarchy of the Father: the family relationship becomes less patriarchal and more egalitarian. The Son, with his humanity, mercy, love and sacrifice, is no longer subordinate to the Father, while the Father – the God of the Old Testament, law and tradition – is no longer sovereign. Looks like the change would elevate egalitarianism, compassion, humanity and self-sacrifice while undermining hierarchy, rules, family and tradition. Sound familiar?
article  wiki  reference  philosophy  backup  religion  christianity  theos  ideology  comparison  nitty-gritty  intricacy  europe  the-great-west-whale  occident  russia  MENA  orient  letters  epistemic  truth  science  logic  inference  guilt-shame  volo-avolo  causation  multi  gnon  eastern-europe  roots  explanans  enlightenment-renaissance-restoration-reformation  modernity  egalitarianism-hierarchy  love-hate  free-riding  cooperate-defect  gender  justice  law  tradition  legacy  parenting  ascetic  altruism  farmers-and-foragers  protestant-catholic  exegesis-hermeneutics
april 2018 by nhaliday
Contingent, Not Arbitrary | Truth is contingent on what is, not on what we wish to be true.
A vital attribute of a value system of any kind is that it works. I consider this a necessary (but not sufficient) condition for goodness. A value system, when followed, should contribute to human flourishing and not produce results that violate its core ideals. This is a pragmatic, I-know-it-when-I-see-it definition. I may refine it further if the need arises.

I think that the prevailing Western values fail by this standard. I will not spend much time arguing this; many others have already. If you reject this premise, this blog may not be for you.

I consider old traditions an important source of wisdom: they have proven their worth over centuries of use. Where they agree, we should listen. Where they disagree, we should figure out why. Where modernity departs from tradition, we should be wary of the new.

Tradition has one nagging problem: it was abandoned by the West. How and why did that happen? I consider this a central question. I expect the reasons to be varied and complex. Understanding them seems necessary if we are to fix what may have been broken.

In short, I want to answer these questions:

1. How do values spread and persist? An ideology does no good if no one holds it.
2. Which values do good? Sounding good is worse than useless if it leads to ruin.

The ultimate hope would be to find a way to combine the two. Many have tried and failed. I don’t expect to succeed either, but I hope I’ll manage to clarify the questions.

Christianity Is The Schelling Point: https://contingentnotarbitrary.com/2018/02/22/christianity-is-the-schelling-point/
Restoring true Christianity is both necessary and sufficient for restoring civilization. The task is neither easy nor simple but that’s what it takes. It is also our best chance of weathering the collapse if that’s too late to avoid.

Christianity is the ultimate coordination mechanism: it unites us with a higher purpose, aligns us with the laws of reality and works on all scales, from individuals to entire civilizations. Christendom took over the world and then lost it when its faith faltered. Historically and culturally, Christianity is the unique Schelling point for the West – or it would be if we could agree on which church (if any) was the true one.

Here are my arguments for true Christianity as the Schelling point. I hope to demonstrate these points in subsequent posts; for now I’ll just list them.

- A society of saints is the most powerful human arrangement possible. It is united in purpose, ideologically stable and operates in harmony with natural law. This is true independent of scale and organization: from military hierarchy to total decentralization, from persecuted minority to total hegemony. Even democracy works among saints – that’s why it took so long to fail.
- There is such a thing as true Christianity. I don’t know how to pinpoint it but it does exist; that holds from both secular and religious perspectives. Our task is to converge on it the best we can.
- Don’t worry too much about the existence of God. I’m proof that you don’t need that assumption in order to believe – it helps but isn’t mandatory.

Pascal’s Wager never sat right with me. Now I know why: it’s a sucker bet. Let’s update it.

If God exists, we must believe because our souls and civilization depend on it. If He doesn’t exist, we must believe because civilization depends on it.

I agree with this
gnon  todo  blog  stream  religion  christianity  theos  morality  ethics  formal-values  philosophy  truth  is-ought  coordination  cooperate-defect  alignment  tribalism  cohesion  nascent-state  counter-revolution  epistemic  civilization  rot  fertility  intervention  europe  the-great-west-whale  occident  telos-atelos  multi  ratty  hanson  big-picture  society  culture  evolution  competition  🤖  rationality  rhetoric  contrarianism  values  water  embedded-cognition  ideology  deep-materialism  moloch  new-religion  patho-altruism  darwinian  existence  good-evil  memetics  direct-indirect  endogenous-exogenous  tradition  anthropology  cultural-dynamics  farmers-and-foragers  egalitarianism-hierarchy  organizing  institutions  protestant-catholic  enlightenment-renaissance-restoration-reformation  realness  science  empirical  modernity  revolution  inference  parallax  axioms  pragmatic  zeitgeist  schelling  prioritizing  ends-means  degrees-of-freedom  logic  reason  interdisciplinary  exegesis-hermeneutics  o
april 2018 by nhaliday
Theories of humor - Wikipedia
There are many theories of humor which attempt to explain what humor is, what social functions it serves, and what would be considered humorous. Among the prevailing types of theories that attempt to account for the existence of humor, there are psychological theories, the vast majority of which consider humor to be very healthy behavior; there are spiritual theories, which consider humor to be an inexplicable mystery, very much like a mystical experience.[1] Although various classical theories of humor and laughter may be found, in contemporary academic literature, three theories of humor appear repeatedly: relief theory, superiority theory, and incongruity theory.[2] Among current humor researchers, there is no consensus about which of these three theories of humor is most viable.[2] Proponents of each one originally claimed their theory to be capable of explaining all cases of humor.[2][3] However, they now acknowledge that although each theory generally covers its own area of focus, many instances of humor can be explained by more than one theory.[2][3][4][5] Incongruity and superiority theories, for instance, seem to describe complementary mechanisms which together create humor.[6]

...

Relief theory
Relief theory maintains that laughter is a homeostatic mechanism by which psychological tension is reduced.[2][3][7] Humor may thus for example serve to facilitate relief of the tension caused by one's fears.[8] Laughter and mirth, according to relief theory, result from this release of nervous energy.[2] Humor, according to relief theory, is used mainly to overcome sociocultural inhibitions and reveal suppressed desires. It is believed that this is the reason we laugh whilst being tickled, due to a buildup of tension as the tickler "strikes".[2][9] According to Herbert Spencer, laughter is an "economical phenomenon" whose function is to release "psychic energy" that had been wrongly mobilized by incorrect or false expectations. The latter point of view was supported also by Sigmund Freud.

Superiority theory
The superiority theory of humor traces back to Plato and Aristotle, and Thomas Hobbes' Leviathan. The general idea is that a person laughs at the misfortunes of others (so-called schadenfreude), because these misfortunes assert the person's superiority against the background of the shortcomings of others.[10] Socrates was reported by Plato as saying that the ridiculous was characterized by a display of self-ignorance.[11] For Aristotle, we laugh at inferior or ugly individuals because we feel joy at being superior to them.[12]

Incongruous juxtaposition theory
The incongruity theory states that humor is perceived at the moment of realization of incongruity between a concept involved in a certain situation and the real objects thought to be in some relation to the concept.[10]

Since the main point of the theory is not the incongruity per se, but its realization and resolution (i.e., putting the objects in question into the real relation), it is often called the incongruity-resolution theory.[10]

...

Detection of mistaken reasoning
In 2011, three researchers, Hurley, Dennett and Adams, published a book that reviews previous theories of humor and many specific jokes. They propose the theory that humor evolved because it strengthens the ability of the brain to find mistakes in active belief structures, that is, to detect mistaken reasoning.[46] This is somewhat consistent with the sexual selection theory, because, as stated above, humor would be a reliable indicator of an important survival trait: the ability to detect mistaken reasoning. However, the three researchers argue that humor is fundamentally important because it is the very mechanism that allows the human brain to excel at practical problem solving. Thus, according to them, humor did have survival value even for early humans, because it enhanced the neural circuitry needed to survive.

Misattribution is one theory of humor that describes an audience's inability to identify exactly why they find a joke to be funny. The formal theory is attributed to Zillmann & Bryant (1980) in their article, "Misattribution Theory of Tendentious Humor", published in Journal of Experimental Social Psychology. They derived the critical concepts of the theory from Sigmund Freud's Wit and Its Relation to the Unconscious (note: from a Freudian perspective, wit is separate from humor), originally published in 1905.

Benign violation theory
The benign violation theory (BVT) was developed by researchers A. Peter McGraw and Caleb Warren.[47] The BVT integrates seemingly disparate theories of humor to predict that humor occurs when three conditions are satisfied: 1) something threatens one's sense of how the world "ought to be", 2) the threatening situation seems benign, and 3) a person sees both interpretations at the same time.

From an evolutionary perspective, humorous violations likely originated as apparent physical threats, like those present in play fighting and tickling. As humans evolved, the situations that elicit humor likely expanded from physical threats to other violations, including violations of personal dignity (e.g., slapstick, teasing), linguistic norms (e.g., puns, malapropisms), social norms (e.g., strange behaviors, risqué jokes), and even moral norms (e.g., disrespectful behaviors). The BVT suggests that anything that threatens one's sense of how the world "ought to be" will be humorous, so long as the threatening situation also seems benign.

...

Sense of humor, sense of seriousness
One must have a sense of humor and a sense of seriousness to distinguish what is supposed to be taken literally or not. An even keener sense is needed when humor is used to make a serious point.[48][49] Psychologists have studied how humor can be intended to carry serious meaning, as when court jesters used humor to convey serious information. Conversely, when humor is not intended to be taken seriously, bad taste in humor may cross a line after which it is taken seriously, though not intended.[50]

Philosophy of humor bleg: http://marginalrevolution.com/marginalrevolution/2017/03/philosophy-humor-bleg.html

Inside Jokes: https://mitpress.mit.edu/books/inside-jokes
humor as reward for discovering inconsistency in inferential chain

https://en.wikipedia.org/wiki/Humour
People of all ages and cultures respond to humour. Most people are able to experience humour—be amused, smile or laugh at something funny—and thus are considered to have a sense of humour. The hypothetical person lacking a sense of humour would likely find the behaviour inducing it to be inexplicable, strange, or even irrational.

...

Ancient Greece
Western humour theory begins with Plato, who attributed to Socrates (as a semi-historical dialogue character) in the Philebus (p. 49b) the view that the essence of the ridiculous is an ignorance in the weak, who are thus unable to retaliate when ridiculed. Later, in Greek philosophy, Aristotle, in the Poetics (1449a, pp. 34–35), suggested that an ugliness that does not disgust is fundamental to humour.

...

China
Neo-Confucian orthodoxy, with its emphasis on ritual and propriety, has traditionally looked down upon humour as subversive or unseemly. The Confucian "Analects" itself, however, depicts the Master as fond of humorous self-deprecation, once comparing his wanderings to the existence of a homeless dog.[10] Early Daoist philosophical texts such as the "Zhuangzi" pointedly make fun of Confucian seriousness and make Confucius himself a slow-witted figure of fun.[11] Joke books containing a mix of wordplay, puns, situational humor, and play with taboo subjects like sex and scatology remained popular over the centuries. Local performing arts, storytelling, vernacular fiction, and poetry offer a wide variety of humorous styles and sensibilities.

...

Physical attractiveness
90% of men and 81% of women, all college students, report that having a sense of humour is a crucial characteristic sought in a romantic partner.[21] Humour and honesty were ranked as the two most important attributes in a significant other.[22] It has since been recorded that humour becomes more evident and significantly more important as the level of commitment in a romantic relationship increases.[23] Recent research suggests expressions of humour in relation to physical attractiveness are two major factors in the desire for future interaction.[19] Women regard physical attractiveness less highly than men do when it comes to dating, a serious relationship, and sexual intercourse.[19] However, women rate humorous men as more desirable than nonhumorous men for a serious relationship or marriage, but only when these men are physically attractive.[19]

Furthermore, humorous people are perceived by others to be more cheerful but less intellectual than nonhumorous people. Self-deprecating humour has been found to increase the desirability of physically attractive others for committed relationships.[19] The results of a study conducted at McMaster University suggest humour can positively affect one's desirability for a specific relationship partner, but this effect is most likely to occur when men use humour and are evaluated by women.[24] No evidence was found that men prefer women with a sense of humour as partners, nor that women prefer other women with a sense of humour as potential partners.[24] When women were given the forced-choice design in the study, they chose funny men as potential … [more]
article  list  wiki  reference  psychology  cog-psych  social-psych  emotion  things  phalanges  concept  neurons  instinct  👽  comedy  models  theory-of-mind  explanans  roots  evopsych  signaling  humanity  logic  sex  sexuality  cost-benefit  iq  intelligence  contradiction  homo-hetero  egalitarianism-hierarchy  humility  reinforcement  EEA  eden  play  telos-atelos  impetus  theos  mystic  philosophy  big-peeps  the-classics  literature  inequality  illusion  within-without  dennett  dignity  social-norms  paradox  parallax  analytical-holistic  multi  econotariat  marginal-rev  discussion  speculation  books  impro  carcinisation  postrat  cool  twitter  social  quotes  commentary  search  farmers-and-foragers  🦀  evolution  sapiens  metameta  insight  novelty  wire-guided  realness  chart  beauty  nietzschean  class  pop-diff  culture  alien-character  confucian  order-disorder  sociality  🐝  integrity  properties  gender  gender-diff  china  asia  sinosphere  long-short-run  trust  religion  ideology  elegance  psycho-atoms
april 2018 by nhaliday
Diving into Chinese philosophy – Gene Expression
Back when I was in college one of my roommates was taking a Chinese philosophy class for a general education requirement. A double major in mathematics and economics (he went on to get an economics Ph.D.), he found the lack of formal rigor in the field rather maddening. I thought this was fair, but I suggested to him that the this-worldly and often non-metaphysical orientation of much of Chinese philosophy made it less amenable to formal and logical analysis.

...

IMO the much more problematic thing about premodern Chinese political philosophy from the point of view of the West is its lack of interest in constitutionalism and the rule of law, stemming from a generally less rationalist approach than the Classical West, rather than any sort of inherent anti-individualism or collectivism or whatever. For someone like Aristotle the constitutional rule of law was the highest moral good in itself and the definition of justice; very much not so for Confucius or for Zhu Xi. They still believed in Justice in the sense of people getting what they deserve, but they didn't really consider the written rule of law an appropriate way to conceptualize it. OG Confucius leaned more towards the unwritten traditions and rituals passed down from the ancestors, and Neoconfucianism leaned more towards a sort of Universal Reason that could be accessed by the individual's subjective understanding but which need not necessarily be written down (although unlike Kant/the Enlightenment it basically implies that such subjective reasoning will naturally lead one to reaffirming the ancient traditions). In left-right political spectrum terms IMO this leads to a well-defined right and left and a big old hole in the center where classical republicanism would be in the West. This resonates pretty well with modern East Asian political history, IMO.

Is logos a proper noun?
Or, is Aristotelian Logic translatable into Chinese?
gnxp  scitariat  books  recommendations  discussion  reflection  china  asia  sinosphere  philosophy  logic  rigor  rigidity  flexibility  leviathan  law  individualism-collectivism  analytical-holistic  systematic-ad-hoc  the-classics  canon  morality  ethics  formal-values  justice  reason  tradition  government  polisci  left-wing  right-wing  order-disorder  eden-heaven  analogy  similarity  comparison  thinking  summary  top-n  n-factor  universalism-particularism  duality  rationality  absolute-relative  subjective-objective  the-self  apollonian-dionysian  big-peeps  history  iron-age  antidemos  democracy  institutions  darwinian  multi  language  concept  conceptual-vocab  inference  linguistics  foreign-lang  mediterranean  europe  germanic  mostly-modern  gallic  culture
march 2018 by nhaliday
[1410.0369] The Universe of Minds
kinda dumb, don't think this guy is anywhere close to legit (e.g., he claims set of mind designs is countable, but gives no actual reason to believe that)
papers  preprint  org:mat  ratty  miri-cfar  ai  intelligence  philosophy  logic  software  cs  computation  the-self
march 2018 by nhaliday
Transcendentals - Wikipedia
The transcendentals (Latin: transcendentalia) are the properties of being that correspond to three aspects of the human field of interest and are their ideals: science (truth), the arts (beauty), and religion (goodness).[citation needed] The philosophical disciplines that study them are logic, aesthetics, and ethics.

Parmenides first inquired of the properties co-extensive with being.[1] Socrates, spoken through Plato, then followed (see Form of the Good).

Aristotle's substance theory (being a substance belongs to being qua being) has been interpreted as a theory of transcendentals.[2] Aristotle discusses only unity ("One") explicitly because it is the only transcendental intrinsically related to being, whereas truth and goodness relate to rational creatures.[3]

In the Middle Ages, Catholic philosophers elaborated the thought that there exist transcendentals (transcendentalia) and that they transcended each of the ten Aristotelian categories.[4] A doctrine of the transcendentality of the good was formulated by Albert the Great.[5] His pupil, Saint Thomas Aquinas, posited five transcendentals: res, unum, aliquid, bonum, verum; or "thing", "one", "something", "good", and "true".[6] Saint Thomas derives the five explicitly as transcendentals,[7] though in some cases he follows the typical list of the transcendentals consisting of the One, the Good, and the True. The transcendentals are ontologically one and thus they are convertible: e.g., where there is truth, there is beauty and goodness also.

In Christian theology the transcendentals are treated in relation to theology proper, the doctrine of God. The transcendentals, according to Christian doctrine, can be described as the ultimate desires of man. Man ultimately strives for perfection, which takes form through the desire for perfect attainment of the transcendentals. The Catholic Church teaches that God is Himself truth, goodness, and beauty, as indicated in the Catechism of the Catholic Church.[8] Each transcends the limitations of place and time, and is rooted in being. The transcendentals are not contingent upon cultural diversity, religious doctrine, or personal ideologies, but are the objective properties of all that exists.
concept  conceptual-vocab  wiki  reference  philosophy  europe  the-great-west-whale  the-classics  history  iron-age  mediterranean  gavisti  religion  christianity  theos  truth  science  beauty  art  logic  aesthetics  morality  ethics  formal-values  metameta  virtu  egalitarianism-hierarchy  roots  deep-materialism  new-religion  big-peeps  canon  polarization  reason  occident  meaningness  courage  gnosis-logos  good-evil  subjective-objective  absolute-relative  parallax  forms-instances  is-ought  essence-existence  elegance
march 2018 by nhaliday
Soft analysis, hard analysis, and the finite convergence principle | What's new
It is fairly well known that the results obtained by hard and soft analysis respectively can be connected to each other by various “correspondence principles” or “compactness principles”. It is however my belief that the relationship between the two types of analysis is in fact much closer[3] than just this; in many cases, qualitative analysis can be viewed as a convenient abstraction of quantitative analysis, in which the precise dependencies between various finite quantities have been efficiently concealed from view by use of infinitary notation. Conversely, quantitative analysis can often be viewed as a more precise and detailed refinement of qualitative analysis. Furthermore, a method from hard analysis often has some analogue in soft analysis and vice versa, though the language and notation of the analogue may look completely different from that of the original. I therefore feel that it is often profitable for a practitioner of one type of analysis to learn about the other, as they both offer their own strengths, weaknesses, and intuition, and knowledge of one gives more insight[4] into the workings of the other. I wish to illustrate this point here using a simple but not terribly well known result, which I shall call the “finite convergence principle” (thanks to Ben Green for suggesting this name; Jennifer Chayes has also suggested the “metastability principle”). It is the finitary analogue of an utterly trivial infinitary result – namely, that every bounded monotone sequence converges – but sometimes, a careful analysis of a trivial result can be surprisingly revealing, as I hope to demonstrate here.
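A rough statement of the principle the excerpt describes, paraphrased from memory of the post rather than quoted verbatim (the infinitary version says a bounded monotone sequence converges; the finitary version trades the limit for "ε-stability on a window of length F(M)", with M bounded uniformly in the sequence):

```latex
% Infinitary form: every monotone sequence 0 <= x_1 <= x_2 <= ... <= 1 converges.
% Finitary ("finite convergence" / metastability) form, roughly:
\[
\forall \varepsilon > 0,\ \forall F : \mathbb{N} \to \mathbb{N},\
\exists M = M(\varepsilon, F)
\]
\[
\text{such that for every } 0 \le x_1 \le x_2 \le \cdots \le 1
\text{ there is } 1 \le m \le M \text{ with }
\]
\[
|x_n - x_{n'}| \le \varepsilon
\quad \text{for all } m \le n, n' \le m + F(m).
\]
```

The key quantitative point is that M depends only on ε and F, not on the particular sequence, which is what makes this a genuinely finitary statement.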
gowers  mathtariat  math  math.CA  expert  reflection  philosophy  meta:math  logic  math.CO  lens  big-picture  symmetry  limits  finiteness  nibble  org:bleg  coarse-fine  metameta  convergence  expert-experience
january 2017 by nhaliday
In Computers We Trust? | Quanta Magazine
As math grows ever more complex, will computers reign?

Shalosh B. Ekhad is a computer. Or, rather, it is any of a rotating cast of computers used by the mathematician Doron Zeilberger, from the Dell in his New Jersey office to a supercomputer whose services he occasionally enlists in Austria. The name — Hebrew for “three B one” — refers to the AT&T 3B1, Ekhad’s earliest incarnation.

“The soul is the software,” said Zeilberger, who writes his own code using a popular math programming tool called Maple.
news  org:mag  org:sci  popsci  math  culture  academia  automation  formal-methods  ai  debate  interdisciplinary  rigor  proofs  nibble  org:inst  calculation  bare-hands  heavyweights  contrarianism  computation  correctness  oss  replication  logic  frontier  state-of-art  technical-writing  trust
january 2017 by nhaliday
soft question - A Book You Would Like to Write - MathOverflow
- The Differential Topology of Loop Spaces
- Knot Theory: Kawaii examples for topological machines
- An Introduction to Forcing (for people who don't care about foundations.)
writing  math  q-n-a  discussion  books  list  synthesis  big-list  overflow  soft-question  techtariat  mathtariat  exposition  topology  open-problems  logic  nibble  fedja  questions
october 2016 by nhaliday