
nhaliday : wiki   1228

As We May Think - Wikipedia
"As We May Think" is a 1945 essay by Vannevar Bush which has been described as visionary and influential, anticipating many aspects of information society. It was first published in The Atlantic in July 1945 and republished in an abridged version in September 1945—before and after the atomic bombings of Hiroshima and Nagasaki. Bush expresses his concern for the direction of scientific efforts toward destruction, rather than understanding, and explicates a desire for a sort of collective memory machine with his concept of the memex that would make knowledge more accessible, believing that it would help fix these problems. Through this machine, Bush hoped to transform an information explosion into a knowledge explosion.[1]

https://twitter.com/michael_nielsen/status/979193577229004800
https://archive.is/FrF8Q
https://archive.is/19hHT
https://archive.is/G7yLl
https://archive.is/wFbbj
A few notes on Vannevar Bush's amazing essay, "As We May Think", from the 1945(!) @TheAtlantic :

https://twitter.com/andy_matuschak/status/1147928384510390277
https://archive.is/tm6fB
https://archive.is/BIok9
When I first read As We May Think* as a teenager, I was astonished by how much it predicted of the computer age in 1945—but recently I’ve been feeling wistful about some pieces it predicts which never came to pass. [thread]

* http://ceasarbautista.com/posts/memex_meetup_2.html
wiki  org:mag  essay  big-peeps  history  mostly-modern  classic  ideas  worrydream  exocortex  thinking  network-structure  graphs  internet  structure  notetaking  design  skunkworks  multi  techtariat  twitter  social  discussion  reflection  backup  speedometer  software  org:junk  michael-nielsen 
4 weeks ago by nhaliday
Japanese sound symbolism - Wikipedia
Japanese has a large inventory of sound symbolic or mimetic words, known in linguistics as ideophones.[1][2] Sound symbolic words are found in written as well as spoken Japanese.[3] Known popularly as onomatopoeia, these words are not just imitative of sounds but cover a much wider range of meanings;[1] indeed, many sound-symbolic words in Japanese are for things that don't make any noise originally, most clearly demonstrated by shiinto (しいんと), meaning "silently".
language  foreign-lang  trivia  wiki  reference  audio  hmm  alien-character  culture  list  objektbuch  japan  asia  writing 
october 2019 by nhaliday
Ask HN: Favorite note-taking software? | Hacker News
Ask HN: What is your ideal note-taking software and/or hardware?: https://news.ycombinator.com/item?id=13221158

my wishlist as of 2019:
- web + desktop macOS + mobile iOS (at least viewing on the last but ideally also editing)
- sync across all those
- open-source data format that's easy to manipulate for scripting purposes
- flexible organization: mostly tree hierarchical (subsuming linear/unorganized) but with the option for directed (acyclic) graph (possibly a second layer of structure/linking)
- can store plain text, LaTeX, diagrams, sketches, and raster/vector images (video prob not necessary except as links to elsewhere)
- full-text search
- somehow digest/import data from Pinboard, Workflowy, Papers 3/Bookends, Skim, and iBooks/e-readers (esp. Kobo), ideally absorbing most of their functionality
- so, eg, track notes/annotations side-by-side w/ original PDF/DjVu/ePub documents (to replace Papers3/Bookends/Skim), and maybe web pages too (to replace Pinboard)
- OCR of handwritten notes (how to handle equations/diagrams?)
- various forms of NLP analysis of everything (topic models, clustering, etc)
- maybe version control (less important than export)

candidates?:
- Evernote prob ruled out due to heavy use of proprietary data formats (unless I can find some way to export with tolerably clean output)
- Workflowy/Dynalist are good but only cover a subset of functionality I want
- org-mode doesn't interact w/ mobile well (and I haven't evaluated it in detail otherwise)
- TiddlyWiki/Zim are in the running, but not sure about mobile
- idk about vimwiki but I'm not that wedded to vim and it seems less widely used than org-mode/TiddlyWiki/Zim so prob pass on that
- Quiver/Joplin/Inkdrop look similar and cover a lot of bases, TODO: evaluate more
- Trilium looks especially promising, tho read-only mobile and for macOS desktop look at this: https://github.com/zadam/trilium/issues/511
- RocketBook is interesting scanning/OCR solution but prob not sufficient due to proprietary data format
- TODO: many more candidates, eg, TreeSheets, Gingko, OneNote (macOS?...), Notion (proprietary data format...), Zotero, Nodebook (https://nodebook.io/landing), Polar (https://getpolarized.io), Roam (looks very promising)

Ask HN: What do you use for you personal note taking activity?: https://news.ycombinator.com/item?id=15736102

Ask HN: What are your note-taking techniques?: https://news.ycombinator.com/item?id=9976751

Ask HN: How do you take notes (useful note-taking strategies)?: https://news.ycombinator.com/item?id=13064215

Ask HN: How to get better at taking notes?: https://news.ycombinator.com/item?id=21419478

Ask HN: How do you keep your notes organized?: https://news.ycombinator.com/item?id=21810400

Ask HN: How did you build up your personal knowledge base?: https://news.ycombinator.com/item?id=21332957
nice comment from math guy on structure and difference between math and CS: https://news.ycombinator.com/item?id=21338628
useful comment collating related discussions: https://news.ycombinator.com/item?id=21333383
highlights:
Designing a Personal Knowledge base: https://news.ycombinator.com/item?id=8270759
Ask HN: How to organize personal knowledge?: https://news.ycombinator.com/item?id=17892731
Do you use a personal 'knowledge base'?: https://news.ycombinator.com/item?id=21108527
Ask HN: How do you share/organize knowledge at work and life?: https://news.ycombinator.com/item?id=21310030
Managing my personal knowledge base: https://news.ycombinator.com/item?id=22000791
The sad state of personal data and infrastructure: https://beepb00p.xyz/sad-infra.html
Building personal search infrastructure for your knowledge and code: https://beepb00p.xyz/pkm-search.html

How to annotate literally everything: https://beepb00p.xyz/annotating.html
Ask HN: How do you organize document digests / personal knowledge?: https://news.ycombinator.com/item?id=21642289
Ask HN: Good solution for storing notes/excerpts from books?: https://news.ycombinator.com/item?id=21920143
Ask HN: What's your cross-platform pdf / ePub reading workflow?: https://news.ycombinator.com/item?id=22170395
some related stuff in the reddit links at the bottom of this pin

https://beepb00p.xyz/grasp.html
How to capture information from your browser and stay sane

Ask HN: Best solutions for keeping a personal log?: https://news.ycombinator.com/item?id=21906650

other stuff:
plain text: https://news.ycombinator.com/item?id=21685660

https://www.getdnote.com/blog/how-i-built-personal-knowledge-base-for-myself/
Tiago Forte: https://www.buildingasecondbrain.com

hn search: https://hn.algolia.com/?query=notetaking&type=story

Slant comparison commentary: https://news.ycombinator.com/item?id=7011281

good comparison of options here in comments here (and Trilium itself looks good): https://news.ycombinator.com/item?id=18840990

https://en.wikipedia.org/wiki/Comparison_of_note-taking_software

stuff from Andy Matuschak and Michael Nielsen on general note-taking:
https://twitter.com/andy_matuschak/status/1202663202997170176
https://archive.is/1i9ep
Software interfaces undervalue peripheral vision! (a thread)
https://twitter.com/andy_matuschak/status/1199378287555829760
https://archive.is/J06UB
This morning I implemented PageRank to sort backlinks in my prototype note system. Mixed results!
https://twitter.com/andy_matuschak/status/1211487900505792512
https://archive.is/BOiCG
https://archive.is/4zB37
One way to dream up post-book media to make reading more effective and meaningful is to systematize "expert" practices (e.g. How to Read a Book), so more people can do them, more reliably and more cheaply. But… the most erudite people I know don't actually do those things!

the memex essay and comments from various people including Andy on it: https://pinboard.in/u:nhaliday/b:1cddf69c0b31

some more stuff specific to Roam below, and cf "Why books don't work": https://pinboard.in/u:nhaliday/b:b4d4461f6378

wikis:
https://www.slant.co/versus/5116/8768/~tiddlywiki_vs_zim
https://www.wikimatrix.org/compare/tiddlywiki+zim
http://tiddlymap.org/
https://www.zim-wiki.org/manual/Plugins/BackLinks_Pane.html
https://zim-wiki.org/manual/Plugins/Link_Map.html

apps:
Roam: https://news.ycombinator.com/item?id=21440289
https://www.reddit.com/r/RoamResearch/
https://twitter.com/hashtag/roamcult
https://twitter.com/search?q=RoamResearch%20fortelabs
https://twitter.com/search?q=from%3AQiaochuYuan%20RoamResearch&src=typd
https://twitter.com/vgr/status/1199391391803043840
https://archive.is/TJPQN
https://archive.is/CrNwZ
https://www.nateliason.com/blog/roam
https://twitter.com/andy_matuschak/status/1190102757430063106
https://archive.is/To30Q
https://archive.is/UrI1x
https://archive.is/Ww22V
Knowledge systems which display contextual backlinks to a node open up an interesting new behavior. You can bootstrap a new node extensionally (rather than intensionally) by simply linking to it from many other nodes—even before it has any content.
https://twitter.com/michael_nielsen/status/1220197017340612608
Curious: what are the most striking public @RoamResearch pages that you know? I'd like to see examples of people using it for interesting purposes, or in interesting ways.
https://acesounderglass.com/2019/10/24/epistemic-spot-check-the-fate-of-rome-round-2/
https://twitter.com/andy_matuschak/status/1206011493495513089
https://archive.is/xvaMh
If I weren't doing my own research on questions in knowledge systems (which necessitates tinkering with my own), and if I weren't allergic to doing serious work in webapps, I'd likely use Roam instead!
https://talk.dynalist.io/t/roam-research-new-web-based-outliner-that-supports-transclusion-wiki-features-thoughts/5911/16
http://forum.eastgate.com/t/roam-research-interesting-approach-to-note-taking/2713/10
interesting app: http://www.eastgate.com/Tinderbox/
https://www.theatlantic.com/notes/2016/09/labor-day-software-update-tinderbox-scrivener/498443/

intriguing but probably not appropriate for my needs: https://www.sophya.ai/

Inkdrop: https://news.ycombinator.com/item?id=20103589

Joplin: https://news.ycombinator.com/item?id=15815040
https://news.ycombinator.com/item?id=21555238

MindForgr: https://news.ycombinator.com/item?id=22088175
one comment links to this, mostly on Notion: https://tkainrad.dev/posts/managing-my-personal-knowledge-base/

https://wreeto.com/

Leo Editor (combines tree outlining w/ literate programming/scripting, I think?): https://news.ycombinator.com/item?id=17769892

Frame: https://news.ycombinator.com/item?id=18760079

https://www.reddit.com/r/TheMotte/comments/cb18sy/anyone_use_a_personal_wiki_software_to_catalog/
https://archive.is/xViTY
Notion: https://news.ycombinator.com/item?id=18904648
https://coda.io/welcome
https://news.ycombinator.com/item?id=15543181

accounting: https://news.ycombinator.com/item?id=19833881
Coda mentioned

https://www.reddit.com/r/slatestarcodex/comments/ap437v/modified_cornell_method_the_optimal_notetaking/
https://archive.is/e9oHu
https://www.reddit.com/r/slatestarcodex/comments/bt8a1r/im_about_to_start_a_one_month_journaling_test/
https://www.reddit.com/r/slatestarcodex/comments/9cot3m/question_how_do_you_guys_learn_things/
https://archive.is/HUH8V
https://www.reddit.com/r/slatestarcodex/comments/d7bvcp/how_to_read_a_book_for_understanding/
https://archive.is/VL2mi

Anki:
https://www.reddit.com/r/Anki/comments/as8i4t/use_anki_for_technical_books/
https://www.freecodecamp.org/news/how-anki-saved-my-engineering-career-293a90f70a73/
https://www.reddit.com/r/slatestarcodex/comments/ch24q9/anki_is_it_inferior_to_the_3x5_index_card_an/
https://archive.is/OaGc5
maybe not the best source for a review/advice

interesting comment(s) about tree outliners and spreadsheets: https://news.ycombinator.com/item?id=21170434
https://lightsheets.app/

tablet:
https://www.inkandswitch.com/muse-studio-for-ideas.html
https://www.inkandswitch.com/capstone-manuscript.html
https://news.ycombinator.com/item?id=20255457
hn  discussion  recommendations  software  tools  desktop  app  notetaking  exocortex  wkfly  wiki  productivity  multi  comparison  crosstab  properties  applicability-prereqs  nlp  info-foraging  chart  webapp  reference  q-n-a  retention  workflow  reddit  social  ratty  ssc  learning  studying  commentary  structure  thinking  network-structure  things  collaboration  ocr  trees  graphs  LaTeX  search  todo  project  money-for-time  synchrony  pinboard  state  duplication  worrydream  simplification-normalization  links  minimalism  design  neurons  ai-control  openai  miri-cfar  parsimony  intricacy  meta:reading  examples  prepping  new-religion  deep-materialism  techtariat  review  critique  mobile  integration-extension  interface-compatibility  api  twitter  backup  vgr  postrat  personal-finance  pragmatic  stay-organized  project-management  news  org:mag  epistemic  steel-man  explore-exploit  correlation  cost-benefit  convexity-curvature  michael-nielsen  hci  ux  oly  skunkworks  europe  germanic 
october 2019 by nhaliday
Linus's Law - Wikipedia
Linus's Law is a claim about software development, named in honor of Linus Torvalds and formulated by Eric S. Raymond in his essay and book The Cathedral and the Bazaar (1999).[1][2] The law states that "given enough eyeballs, all bugs are shallow".

--

In Facts and Fallacies about Software Engineering, Robert Glass refers to the law as a "mantra" of the open source movement, but calls it a fallacy due to the lack of supporting evidence and because research has indicated that the rate at which additional bugs are uncovered does not scale linearly with the number of reviewers; rather, there is a small maximum number of useful reviewers, between two and four, and additional reviewers above this number uncover bugs at a much lower rate.[4] While closed-source practitioners also promote stringent, independent code analysis during a software project's development, they focus on in-depth review by a few and not primarily the number of "eyeballs".[5][6]

Although detection of even deliberately inserted flaws[7][8] can be attributed to Raymond's claim, the persistence of the Heartbleed security bug in a critical piece of code for two years has been considered as a refutation of Raymond's dictum.[9][10][11][12] Larry Seltzer suspects that the availability of source code may cause some developers and researchers to perform less extensive tests than they would with closed source software, making it easier for bugs to remain.[12] In 2015, the Linux Foundation's executive director Jim Zemlin argued that the complexity of modern software has increased to such levels that specific resource allocation is desirable to improve its security. Regarding some of 2014's largest global open source software vulnerabilities, he says, "In these cases, the eyeballs weren't really looking".[11] Large scale experiments or peer-reviewed surveys to test how well the mantra holds in practice have not been performed.

Given enough eyeballs, all bugs are shallow? Revisiting Eric Raymond with bug bounty programs: https://academic.oup.com/cybersecurity/article/3/2/81/4524054

https://hbfs.wordpress.com/2009/03/31/how-many-eyeballs-to-make-a-bug-shallow/
wiki  reference  aphorism  ideas  stylized-facts  programming  engineering  linux  worse-is-better/the-right-thing  correctness  debugging  checking  best-practices  security  error  scale  ubiquity  collaboration  oss  realness  empirical  evidence-based  multi  study  info-econ  economics  intricacy  plots  manifolds  techtariat  cracker-prog  os  systems  magnitude  quantitative-qualitative  number  threat-modeling 
october 2019 by nhaliday
Shuffling - Wikipedia
The Gilbert–Shannon–Reeds model provides a mathematical model of the random outcomes of riffling, that has been shown experimentally to be a good fit to human shuffling[2] and that forms the basis for a recommendation that card decks be riffled seven times in order to randomize them thoroughly.[3] Later, mathematicians Lloyd M. Trefethen and Lloyd N. Trefethen authored a paper using a tweaked version of the Gilbert–Shannon–Reeds model showing that the minimum number of riffles for total randomization could also be 5, if the method of defining randomness is changed.[4][5]
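[ed.: The GSR model is easy to simulate: cut at a Binomial(n, 1/2) position, then drop cards from each packet with probability proportional to the packet's remaining size. A minimal Python sketch of my own, not from the article:]

```python
import random

def gsr_riffle(deck, rng=random):
    """One riffle under the Gilbert-Shannon-Reeds model."""
    n = len(deck)
    # Cut point is Binomial(n, 1/2).
    cut = sum(rng.random() < 0.5 for _ in range(n))
    left, right = deck[:cut], deck[cut:]
    out, i, j = [], 0, 0
    while i < len(left) or j < len(right):
        nl, nr = len(left) - i, len(right) - j
        # Drop the next card from a packet with probability proportional to its size.
        if rng.random() < nl / (nl + nr):
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out

def shuffle_deck(deck, riffles=7, rng=random):
    """Seven riffles is the usual recommendation for a 52-card deck."""
    for _ in range(riffles):
        deck = gsr_riffle(deck, rng)
    return deck
```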
nibble  tidbits  trivia  cocktail  wiki  reference  games  howto  random  models  math  applications  probability  math.CO  mixing  markov  sampling  best-practices  acm 
august 2019 by nhaliday
Unix philosophy - Wikipedia
1. Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".
2. Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.
3. Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.
4. Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them.
wiki  concept  philosophy  lens  ideas  design  system-design  programming  engineering  systems  unix  subculture  composition-decomposition  coupling-cohesion  metabuch  skeleton  hi-order-bits  summary  list  top-n  quotes  aphorism  minimalism  minimum-viable  best-practices  intricacy  parsimony  protocol-metadata 
august 2019 by nhaliday
Call graph - Wikipedia
I've found both static and dynamic versions useful (former mostly when I don't want to go thru pain of compiling something)

best options AFAICT:

C/C++ and maybe Go: https://github.com/gperftools/gperftools
https://gperftools.github.io/gperftools/cpuprofile.html

static: https://github.com/Vermeille/clang-callgraph
https://stackoverflow.com/questions/5373714/how-to-generate-a-calling-graph-for-c-code
I had to go through some extra pain to get this to work:
- if you use Homebrew LLVM (it's slightly incompatible w/ macOS's c++filt), make sure to pass the -n flag
- similarly macOS sed needs two extra backslashes for each escape of the angle brackets

another option: doxygen

Go: https://stackoverflow.com/questions/31362332/creating-call-graph-in-golang
both static and dynamic in one tool

Java: https://github.com/gousiosg/java-callgraph
both static and dynamic in one tool

Python:
https://github.com/gak/pycallgraph
more up-to-date forks: https://github.com/daneads/pycallgraph2 and https://github.com/YannLuo/pycallgraph
old docs: https://pycallgraph.readthedocs.io/en/master/
I've had some trouble getting nice output from this (even just getting the right set of nodes displayed, not even taking into account layout and formatting).
- Argument parsing syntax is idiosyncratic. Just read `pycallgraph --help`.
- Options -i and -e take glob patterns (see pycallgraph2/{tracer,globbing_filter}.py), which are applied to the function names qualified w/ module paths.
- Functions defined in the script you are running receive no module path. There is no easy way to filter for them using the -i and -e options.
- The --debug option gives you the graphviz for your own use instead of just writing the final image produced.

static: https://github.com/davidfraser/pyan
more up-to-date fork: https://github.com/itsayellow/pyan/
one way to good results: `pyan -dea --format yed $MODULE_FILES > output.graphml`, then open up in yEd and use hierarchical layout

various: https://github.com/jrfonseca/gprof2dot

I believe all the dynamic tools listed here support weighting nodes and edges by CPU time/samples (inclusive and exclusive of descendants) and discrete calls. In the case of the gperftools and the Java option you probably have to parse the output to get the latter, tho.

IIRC Dtrace has probes for function entry/exit. So that's an option as well.

old pin: https://github.com/nst/objc_dep
Graph the import dependencies in an Objective-C project
concept  wiki  reference  tools  devtools  graphs  trees  programming  code-dive  let-me-see  big-picture  libraries  software  recommendations  list  top-n  links  c(pp)  golang  python  javascript  jvm  stackex  q-n-a  howto  yak-shaving  visualization  dataviz  performance  structure  oss  osx  unix  linux  static-dynamic  repo  cocoa 
july 2019 by nhaliday
An Eye Tracking Study on camelCase and under_score Identifier Styles - IEEE Conference Publication
One main difference is that subjects were trained mainly in the underscore style and were all programmers. While results indicate no difference in accuracy between the two styles, subjects recognize identifiers in the underscore style more quickly.

ToCamelCaseorUnderscore: https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.158.9499
An empirical study of 135 programmers and non-programmers was conducted to better understand the impact of identifier style on code readability. The experiment builds on past work of others who study how readers of natural language perform such tasks. Results indicate that camel casing leads to higher accuracy among all subjects regardless of training, and those trained in camel casing are able to recognize identifiers in the camel case style faster than identifiers in the underscore style.

https://en.wikipedia.org/wiki/Camel_case#Readability_studies
A 2009 study comparing snake case to camel case found that camel case identifiers could be recognised with higher accuracy among both programmers and non-programmers, and that programmers already trained in camel case were able to recognise those identifiers faster than underscored snake-case identifiers.[35]

A 2010 follow-up study, under the same conditions but using an improved measurement method with use of eye-tracking equipment, indicates: "While results indicate no difference in accuracy between the two styles, subjects recognize identifiers in the underscore style more quickly."[36]
study  psychology  cog-psych  hci  programming  best-practices  stylized-facts  null-result  multi  wiki  reference  concept  empirical  evidence-based  efficiency  accuracy  time  code-organizing  grokkability  protocol-metadata  form-design  grokkability-clarity 
july 2019 by nhaliday
Factorization of polynomials over finite fields - Wikipedia
In mathematics and computer algebra the factorization of a polynomial consists of decomposing it into a product of irreducible factors. This decomposition is theoretically possible and is unique for polynomials with coefficients in any field, but rather strong restrictions on the field of the coefficients are needed to allow the computation of the factorization by means of an algorithm. In practice, algorithms have been designed only for polynomials with coefficients in a finite field, in the field of rationals or in a finitely generated field extension of one of them.

All factorization algorithms, including the case of multivariate polynomials over the rational numbers, reduce the problem to this case; see polynomial factorization. It is also used for various applications of finite fields, such as coding theory (cyclic redundancy codes and BCH codes), cryptography (public key cryptography by the means of elliptic curves), and computational number theory.

As the reduction of the factorization of multivariate polynomials to that of univariate polynomials does not have any specificity in the case of coefficients in a finite field, only polynomials with one variable are considered in this article.

...

In the algorithms that follow, the complexities are expressed in terms of number of arithmetic operations in Fq, using classical algorithms for the arithmetic of polynomials.

[ed.: Interesting choice...]

...

Factoring algorithms
Many algorithms for factoring polynomials over finite fields include the following three stages:

Square-free factorization
Distinct-degree factorization
Equal-degree factorization
An important exception is Berlekamp's algorithm, which combines stages 2 and 3.
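[ed.: Stage 1 rests on the classic gcd trick: gcd(f, f′) collects the repeated factors, so f / gcd(f, f′) is the product of the distinct irreducible factors (valid when f′ ≠ 0 and no multiplicity is divisible by p). A small pure-Python sketch over GF(p), p prime; all helper names are mine:]

```python
def trim(a, p):
    """Reduce coefficients mod p and drop trailing zeros (lowest degree first)."""
    a = [c % p for c in a]
    while a and a[-1] == 0:
        a.pop()
    return a

def polydivmod(a, b, p):
    """Quotient and remainder of a by b over GF(p)."""
    a = trim(a, p)
    inv = pow(b[-1], p - 2, p)  # inverse of b's leading coefficient (p prime)
    q = [0] * max(len(a) - len(b) + 1, 1)
    while len(a) >= len(b):
        coef = a[-1] * inv % p
        deg = len(a) - len(b)
        q[deg] = coef
        for i, c in enumerate(b):
            a[deg + i] = (a[deg + i] - coef * c) % p
        a = trim(a, p)
    return trim(q, p), a

def polygcd(a, b, p):
    """Monic gcd via the Euclidean algorithm."""
    while b:
        _, r = polydivmod(a, b, p)
        a, b = b, r
    inv = pow(a[-1], p - 2, p)
    return [c * inv % p for c in a]

def deriv(a, p):
    return trim([i * c for i, c in enumerate(a)][1:], p)

def squarefree_part(f, p):
    """Product of the distinct irreducible factors of f (assumes f' != 0)."""
    g = polygcd(f, deriv(f, p), p)
    q, r = polydivmod(f, g, p)
    assert not r
    return q
```

e.g. f = x³ + 2x² + x = x(x+1)² over GF(5): squarefree_part([0, 1, 2, 1], 5) gives [0, 1, 1], i.e. x² + x = x(x+1).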

Berlekamp's algorithm
Main article: Berlekamp's algorithm
Berlekamp's algorithm is historically important as the first factorization algorithm that works well in practice. However, it contains a loop over the elements of the ground field, which implies that it is practical only over small finite fields. For a fixed ground field, its time complexity is polynomial, but for general ground fields the complexity is exponential in the size of the ground field.

[ed.: This actually looks fairly implementable.]
wiki  reference  concept  algorithms  calculation  nibble  numerics  math  algebra  math.CA  fields  polynomials  levers  multiplicative  math.NT 
july 2019 by nhaliday
Mutation testing - Wikipedia
Mutation testing involves modifying a program in small ways.[1] Each mutated version is called a mutant and tests detect and reject mutants by causing the behavior of the original version to differ from the mutant. This is called killing the mutant. Test suites are measured by the percentage of mutants that they kill. New tests can be designed to kill additional mutants.
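[ed.: The whole idea fits in a few lines. The mutants below are hand-written just to illustrate the scoring; real tools (e.g. mutmut for Python, PIT for Java) generate them automatically:]

```python
def clamp(x, lo, hi):
    """Function under test."""
    return max(lo, min(x, hi))

# Each "mutant" is the original with one small syntactic change.
mutants = [
    lambda x, lo, hi: max(lo, max(x, hi)),  # min -> max
    lambda x, lo, hi: min(lo, min(x, hi)),  # max -> min
    lambda x, lo, hi: max(hi, min(x, lo)),  # lo/hi swapped
]

tests = [((5, 0, 10), 5), ((-3, 0, 10), 0), ((42, 0, 10), 10)]

def killed(mutant):
    """A mutant is killed if some test observes different behavior."""
    return any(mutant(*args) != want for args, want in tests)

score = sum(map(killed, mutants)) / len(mutants)
print(f"mutation score: {score:.0%}")  # 100% here: every mutant is killed
```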
wiki  reference  concept  mutation  selection  analogy  programming  checking  formal-methods  debugging  random  list  libraries  links  functional  haskell  javascript  jvm  c(pp)  python  dotnet  oop  perturbation  static-dynamic 
july 2019 by nhaliday
PythonSpeed/PerformanceTips - Python Wiki
some are obsolete, but I think, eg, the tip about using local vars over globals is still applicable
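[ed.: The local-vars tip in one example; the win comes from replacing a global + attribute lookup per iteration with a single fast local lookup:]

```python
import math

def slow(n):
    total = 0.0
    for i in range(n):
        total += math.sqrt(i)  # LOAD_GLOBAL math + LOAD_ATTR sqrt, every iteration
    return total

def fast(n, sqrt=math.sqrt):   # bind once, at function-definition time
    total = 0.0
    for i in range(n):
        total += sqrt(i)       # a single fast local lookup
    return total
```

Both return identical results; `python -m timeit` shows the second measurably faster on CPython, though the gap has narrowed in recent versions, so measure before micro-optimizing.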
wiki  reference  cheatsheet  objektbuch  list  programming  python  performance  pls  local-global 
june 2019 by nhaliday
Lindy effect - Wikipedia
The Lindy effect is a theory that the future life expectancy of some non-perishable things like a technology or an idea is proportional to their current age, so that every additional period of survival implies a longer remaining life expectancy.[1] Where the Lindy effect applies, mortality rate decreases with time. In contrast, living creatures and mechanical things follow a bathtub curve where, after "childhood", the mortality rate increases with time. Because life expectancy is probabilistically derived, a thing may become extinct before its "expected" survival. In other words, one needs to gauge both the age and "health" of the thing to determine continued survival.
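[ed.: The canonical distribution with this property is the Pareto. A quick check of my own, not from the article:]

```latex
S(t) = \Pr(T > t) = \left(\frac{t_0}{t}\right)^{\alpha}, \qquad t \ge t_0,\ \alpha > 1.
% Conditional on T > t, the distribution is again Pareto with scale t, so
E[T \mid T > t] = \frac{\alpha t}{\alpha - 1}
\quad\Longrightarrow\quad
E[T - t \mid T > t] = \frac{t}{\alpha - 1} \;\propto\; t,
% and the hazard rate \alpha/t is decreasing in t, matching the article.
```

i.e. expected remaining life grows linearly with current age, and mortality rate decreases with time, as described.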
wiki  reference  concept  metabuch  ideas  street-fighting  planning  comparison  time  distribution  flux-stasis  history  measure  correlation  arrows  branches  pro-rata  manifolds  aging  stylized-facts  age-generation  robust  technology  thinking  cost-benefit  conceptual-vocab  methodology  threat-modeling  efficiency  neurons  tools  track-record  ubiquity 
june 2019 by nhaliday
Solution concept - Wikipedia
In game theory, a solution concept is a formal rule for predicting how a game will be played. These predictions are called "solutions", and describe which strategies will be adopted by players and, therefore, the result of the game. The most commonly used solution concepts are equilibrium concepts, most famously Nash equilibrium.

Many solution concepts, for many games, will result in more than one solution. This puts any one of the solutions in doubt, so a game theorist may apply a refinement to narrow down the solutions. Each successive solution concept presented in the following improves on its predecessor by eliminating implausible equilibria in richer games.

nice diagram
concept  conceptual-vocab  list  wiki  reference  acm  game-theory  inference  equilibrium  extrema  reduction  sub-super 
may 2019 by nhaliday
Fossil: Home
VCS w/ builtin issue tracking and wiki used by SQLite
tools  devtools  software  vcs  wiki  debugging  integration-extension  oss  dbs 
may 2019 by nhaliday
What every computer scientist should know about floating-point arithmetic
Floating-point arithmetic is considered an esoteric subject by many people. This is rather surprising, because floating-point is ubiquitous in computer systems: Almost every language has a floating-point datatype; computers from PCs to supercomputers have floating-point accelerators; most compilers will be called upon to compile floating-point algorithms from time to time; and virtually every operating system must respond to floating-point exceptions such as overflow. This paper presents a tutorial on the aspects of floating-point that have a direct impact on designers of computer systems. It begins with background on floating-point representation and rounding error, continues with a discussion of the IEEE floating point standard, and concludes with examples of how computer system builders can better support floating point.

Float Toy: http://evanw.github.io/float-toy/
https://news.ycombinator.com/item?id=22113485

https://stackoverflow.com/questions/2729637/does-epsilon-really-guarantees-anything-in-floating-point-computations
"you must use an epsilon when dealing with floats" is a knee-jerk reaction of programmers with a superficial understanding of floating-point computations, for comparisons in general (not only to zero).

This is usually unhelpful because it doesn't tell you how to minimize the propagation of rounding errors, it doesn't tell you how to avoid cancellation or absorption problems, and even when your problem is indeed related to the comparison of two floats, it doesn't tell you what value of epsilon is right for what you are doing.
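[ed.: The standard illustration; math.isclose (Python ≥ 3.5) is the stdlib's relative-tolerance comparison:]

```python
import math

a = 0.1 + 0.2
print(a)         # 0.30000000000000004: both operands already carry rounding error
print(a == 0.3)  # False

# A fixed absolute epsilon is scale-dependent:
eps = 1e-9
print(abs(a - 0.3) < eps)              # True, but only because the values are ~1
print(abs(1e16 - (1e16 + 2.0)) < eps)  # False, even though 1e16 + 2.0 is the
                                       # very next representable double

# A relative tolerance scales with the operands:
print(math.isclose(a, 0.3))  # True (default rel_tol=1e-9)
```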

...

Regarding the propagation of rounding errors, there exists specialized analyzers that can help you estimate it, because it is a tedious thing to do by hand.

https://www.di.ens.fr/~cousot/projects/DAEDALUS/synthetic_summary/CEA/Fluctuat/index.html

This was part of HW1 of CS24:
https://en.wikipedia.org/wiki/Kahan_summation_algorithm
In particular, simply summing n numbers in sequence has a worst-case error that grows proportional to n, and a root mean square error that grows as √n for random inputs (the roundoff errors form a random walk).[2] With compensated summation, the worst-case error bound is independent of n, so a large number of values can be summed with an error that only depends on the floating-point precision.[2]
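[ed.: Kahan's trick in full; the correction term c recovers the low-order bits each addition discards:]

```python
def kahan_sum(xs):
    s = 0.0  # running sum
    c = 0.0  # running compensation for lost low-order bits
    for x in xs:
        y = x - c        # fold the previous correction into the next term
        t = s + y        # big + small: low-order digits of y are lost here...
        c = (t - s) - y  # ...(t - s) recovers the part of y that survived,
                         # so c holds minus the lost part
        s = t
    return s
```

Summing [0.01] * 1000 naively drifts away from 10.0 by something on the order of 1e-13; kahan_sum stays within a few ulps (compare against math.fsum, which is exactly rounded).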

cf:
https://en.wikipedia.org/wiki/Pairwise_summation
In numerical analysis, pairwise summation, also called cascade summation, is a technique to sum a sequence of finite-precision floating-point numbers that substantially reduces the accumulated round-off error compared to naively accumulating the sum in sequence.[1] Although there are other techniques such as Kahan summation that typically have even smaller round-off errors, pairwise summation is nearly as good (differing only by a logarithmic factor) while having much lower computational cost—it can be implemented so as to have nearly the same cost (and exactly the same number of arithmetic operations) as naive summation.

In particular, pairwise summation of a sequence of n numbers xn works by recursively breaking the sequence into two halves, summing each half, and adding the two sums: a divide and conquer algorithm. Its worst-case roundoff errors grow asymptotically as at most O(ε log n), where ε is the machine precision (assuming a fixed condition number, as discussed below).[1] In comparison, the naive technique of accumulating the sum in sequence (adding each xi one at a time for i = 1, ..., n) has roundoff errors that grow at worst as O(εn).[1] Kahan summation has a worst-case error of roughly O(ε), independent of n, but requires several times more arithmetic operations.[1] If the roundoff errors are random, and in particular have random signs, then they form a random walk and the error growth is reduced to an average of O(ε√(log n)) for pairwise summation.[2]
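[ed.: Pairwise summation is a five-liner; real implementations (NumPy's sum, for instance) switch to a plain loop below a block size to amortize the recursion overhead:]

```python
def pairwise_sum(xs, block=8):
    n = len(xs)
    if n <= block:   # small base case: plain left-to-right summation
        s = 0.0
        for x in xs:
            s += x
        return s
    mid = n // 2     # split in half and add the two partial sums
    return pairwise_sum(xs[:mid], block) + pairwise_sum(xs[mid:], block)
```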

A very similar recursive structure of summation is found in many fast Fourier transform (FFT) algorithms, and is responsible for the same slow roundoff accumulation of those FFTs.[2][3]

https://eng.libretexts.org/Bookshelves/Electrical_Engineering/Book%3A_Fast_Fourier_Transforms_(Burrus)/10%3A_Implementing_FFTs_in_Practice/10.8%3A_Numerical_Accuracy_in_FFTs
However, these encouraging error-growth rates only apply if the trigonometric “twiddle” factors in the FFT algorithm are computed very accurately. Many FFT implementations, including FFTW and common manufacturer-optimized libraries, therefore use precomputed tables of twiddle factors calculated by means of standard library functions (which compute trigonometric constants to roughly machine precision). The other common method to compute twiddle factors is to use a trigonometric recurrence formula—this saves memory (and cache), but almost all recurrences have errors that grow as O(√n), O(n), or even O(n²), which lead to corresponding errors in the FFT.

...

There are, in fact, trigonometric recurrences with the same logarithmic error growth as the FFT, but these seem more difficult to implement efficiently; they require that a table of Θ(log n) values be stored and updated as the recurrence progresses. Instead, in order to gain at least some of the benefits of a trigonometric recurrence (reduced memory pressure at the expense of more arithmetic), FFTW includes several ways to compute a much smaller twiddle table, from which the desired entries can be computed accurately on the fly using a bounded number (usually < 3) of complex multiplications. For example, instead of a twiddle table with n entries ω_n^k, FFTW can use two tables with Θ(√n) entries each, so that ω_n^k is computed by multiplying an entry in one table (indexed with the low-order bits of k) by an entry in the other table (indexed with the high-order bits of k).
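The two-table trick is easy to demonstrate. The following is my own sketch of the idea described above (function and table names are hypothetical, not FFTW's): split the index k of ω_n^k = exp(−2πik/n) into high and low parts and multiply one entry from each table.

```python
import cmath
import math

def make_twiddle_tables(n):
    """Two tables of ~sqrt(n) entries each, replacing one table of n entries."""
    t = math.isqrt(n - 1) + 1  # t ~ ceil(sqrt(n))
    # hi[j] = omega_n^(j*t)  (indexed by the high-order part of k)
    hi = [cmath.exp(-2j * math.pi * j * t / n) for j in range((n + t - 1) // t)]
    # lo[r] = omega_n^r      (indexed by the low-order part of k)
    lo = [cmath.exp(-2j * math.pi * r / n) for r in range(t)]
    return t, hi, lo

def twiddle(k, n, tables):
    """omega_n^k reconstructed with a single complex multiplication."""
    t, hi, lo = tables
    return hi[k // t] * lo[k % t]  # omega_n^((k//t)*t) * omega_n^(k mod t)
```

Each table entry is computed to roughly machine precision by the library exponential, and the reconstruction adds at most one extra rounding, which is why the on-the-fly values stay accurate.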

[ed.: Nicholas Higham's "Accuracy and Stability of Numerical Algorithms" seems like a good reference for this kind of analysis.]
nibble  pdf  papers  programming  systems  numerics  nitty-gritty  intricacy  approximation  accuracy  types  sci-comp  multi  q-n-a  stackex  hmm  oly-programming  accretion  formal-methods  yak-shaving  wiki  reference  algorithms  yoga  ground-up  divide-and-conquer  fourier  books  tidbits  chart  caltech  nostalgia  dynamic  calculator  visualization  protocol-metadata  identity 
may 2019 by nhaliday
its-not-software - steveyegge2
You don't work in the software industry.

...

So what's the software industry, and how do we differ from it?

Well, the software industry is what you learn about in school, and it's what you probably did at your previous company. The software industry produces software that runs on customers' machines — that is, software intended to run on a machine over which you have no control.

So it includes pretty much everything that Microsoft does: Windows and every application you download for it, including your browser.

It also includes everything that runs in the browser, including Flash applications, Java applets, and plug-ins like Adobe's Acrobat Reader. Their deployment model is a little different from the "classic" deployment models, but it's still software that you package up and release to some unknown client box.

...

Servware

Our industry is so different from the software industry, and it's so important to draw a clear distinction, that it needs a new name. I'll call it Servware for now, lacking anything better. Hardware, firmware, software, servware. It fits well enough.

Servware is stuff that lives on your own servers. I call it "stuff" advisedly, since it's more than just software; it includes configuration, monitoring systems, data, documentation, and everything else you've got there, all acting in concert to produce some observable user experience on the other side of a network connection.
techtariat  sv  tech  rhetoric  essay  software  saas  devops  engineering  programming  contrarianism  list  top-n  best-practices  applicability-prereqs  desktop  flux-stasis  homo-hetero  trends  games  thinking  checklists  dbs  models  communication  tutorial  wiki  integration-extension  frameworks  api  whole-partial-many  metrics  retrofit  c(pp)  pls  code-dive  planning  working-stiff  composition-decomposition  libraries  conceptual-vocab  amazon  system-design  cracker-prog  tech-infrastructure  blowhards  client-server  project-management 
may 2019 by nhaliday
Delta debugging - Wikipedia
good overview of with examples: https://www.csm.ornl.gov/~sheldon/bucket/Automated-Debugging.pdf

Not as useful for my use cases (mostly contest programming) as QuickCheck: input is generally pretty structured, and I don't have a long history of code in VCS. And when I do have the latter, git-bisect is probably enough.
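For reference, the core of delta debugging can be sketched in a few lines. This is a simplified version of Zeller's ddmin idea (my own sketch, not taken from the linked overview): repeatedly try removing chunks of the failing input, at finer and finer granularity, keeping any smaller input that still fails.

```python
def ddmin(fails, inp):
    """Simplified ddmin: shrink a failing input to a smaller one that still fails.
    fails(subset) -> True iff the subset still triggers the bug."""
    n = 2  # current granularity: number of chunks
    while len(inp) >= 2:
        chunk = len(inp) // n
        reduced = False
        for i in range(n):
            # try the complement of chunk i
            candidate = inp[:i * chunk] + inp[(i + 1) * chunk:]
            if candidate and fails(candidate):
                inp = candidate
                n = max(n - 1, 2)  # coarsen slightly after a success
                reduced = True
                break
        if not reduced:
            if n >= len(inp):
                break              # already at finest granularity
            n = min(n * 2, len(inp))
    return inp
```

With a failure condition like "element 7 is present", this shrinks a 16-element input down to just `[7]`; the full algorithm adds subset (not just complement) testing and a 1-minimality guarantee.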

good book tho: http://www.whyprogramsfail.com/toc.php
WHY PROGRAMS FAIL: A Guide to Systematic Debugging
wiki  reference  programming  systems  debugging  c(pp)  python  tools  devtools  links  hmm  formal-methods  divide-and-conquer  vcs  git  search  yak-shaving  pdf  white-paper  multi  examples  stories  books  unit  caltech  recommendations  advanced  correctness 
may 2019 by nhaliday
List of languages by total number of speakers - Wikipedia
- has both L1 (native speakers) and L2 (second-language speakers)
- I'm guessing most of Mandarin's L2 speakers are Chinese natives; the country has many dialects and distinct regional languages (e.g. Cantonese).
wiki  reference  data  list  top-n  ranking  population  scale  language  linguistics  anglo  china  asia  foreign-lang  objektbuch  india  MENA  europe  gallic  demographics  cost-benefit 
march 2019 by nhaliday
Flammarion engraving - Wikipedia
A traveller puts his head under the edge of the firmament in the original (1888) printing of the Flammarion engraving.
art  classic  wiki  history  philosophy  science  enlightenment-renaissance-restoration-reformation  mystic  religion  christianity  eden-heaven  sky  myth  tip-of-tongue 
march 2019 by nhaliday
ellipsis - Why is the subject omitted in sentences like "Thought you'd never ask"? - English Language & Usage Stack Exchange
This is due to a phenomenon that occurs in intimate conversational spoken English called "Conversational Deletion". It was discussed and exemplified quite thoroughly in a 1974 PhD dissertation in linguistics at the University of Michigan that I had the honor of directing.

Thrasher, Randolph H. Jr. 1974. Shouldn't Ignore These Strings: A Study of Conversational Deletion, Ph.D. Dissertation, Linguistics, University of Michigan, Ann Arbor

...

"The phenomenon can be viewed as erosion of the beginning of sentences, deleting (some, but not all) articles, dummies, auxiliaries, possessives, conditional if, and [most relevantly for this discussion -jl] subject pronouns. But it only erodes up to a point, and only in some cases.

"Whatever is exposed (in sentence initial position) can be swept away. If erosion of the first element exposes another vulnerable element, this too may be eroded. The process continues until a hard (non-vulnerable) element is encountered." [ibidem p.9]

Dad calls this and some similar omissions "Kiplinger style": https://en.wikipedia.org/wiki/Kiplinger
q-n-a  stackex  anglo  language  writing  speaking  linguistics  thesis  trivia  cocktail  parsimony  compression  multi  wiki  organization  technical-writing  protocol-metadata  simplification-normalization 
march 2019 by nhaliday
Citizendium, the Citizens' Compendium
That Wikipedia alternative by Jimmy Wales's nerdy, spurned co-founder (Larry Sanger). Unfortunately it looks rather empty.
wiki  reference  database  search  comparison  organization  duplication  socs-and-mops  the-devil  god-man-beast-victim  guilt-shame 
november 2018 by nhaliday
Lateralization of brain function - Wikipedia
Language
Language functions such as grammar, vocabulary and literal meaning are typically lateralized to the left hemisphere, especially in right handed individuals.[3] While language production is left-lateralized in up to 90% of right-handers, it is more bilateral, or even right-lateralized, in approximately 50% of left-handers.[4]

Broca's area and Wernicke's area, two areas associated with the production of speech, are located in the left cerebral hemisphere for about 95% of right-handers, but about 70% of left-handers.[5]:69

Auditory and visual processing
The processing of visual and auditory stimuli, spatial manipulation, facial perception, and artistic ability are represented bilaterally.[4] Numerical estimation, comparison and online calculation depend on bilateral parietal regions[6][7] while exact calculation and fact retrieval are associated with left parietal regions, perhaps due to their ties to linguistic processing.[6][7]

...

Depression is linked with a hyperactive right hemisphere, with evidence of selective involvement in "processing negative emotions, pessimistic thoughts and unconstructive thinking styles", as well as vigilance, arousal and self-reflection, and a relatively hypoactive left hemisphere, "specifically involved in processing pleasurable experiences" and "relatively more involved in decision-making processes".

Chaos and Order; the right and left hemispheres: https://orthosphere.wordpress.com/2018/05/23/chaos-and-order-the-right-and-left-hemispheres/
In The Master and His Emissary, Iain McGilchrist writes that a creature like a bird needs two types of consciousness simultaneously. It needs to be able to focus on something specific, such as pecking at food, while it also needs to keep an eye out for predators which requires a more general awareness of environment.

These are quite different activities. The Left Hemisphere (LH) is adapted for a narrow focus. The Right Hemisphere (RH) for the broad. The brains of human beings have the same division of function.

The LH governs the right side of the body, the RH, the left side. With birds, the left eye (RH) looks for predators, the right eye (LH) focuses on food and specifics. Since danger can take many forms and is unpredictable, the RH has to be very open-minded.

The LH is for narrow focus, the explicit, the familiar, the literal, tools, mechanism/machines and the man-made. The broad focus of the RH is necessarily more vague and intuitive and handles the anomalous, novel, metaphorical, the living and organic. The LH is high resolution but narrow, the RH low resolution but broad.

The LH exhibits unrealistic optimism and self-belief. The RH has a tendency towards depression and is much more realistic about a person’s own abilities. LH has trouble following narratives because it has a poor sense of “wholes.” In art it favors flatness, abstract and conceptual art, black and white rather than color, simple geometric shapes and multiple perspectives all shoved together, e.g., cubism. Particularly RH paintings emphasize vistas with great depth of field and thus space and time,[1] emotion, figurative painting and scenes related to the life world. In music, LH likes simple, repetitive rhythms. The RH favors melody, harmony and complex rhythms.

...

Schizophrenia is a disease of extreme LH emphasis. Since empathy is RH and the ability to notice emotional nuance facially, vocally and bodily expressed, schizophrenics tend to be paranoid and are often convinced that the real people they know have been replaced by robotic imposters. This is at least partly because they lose the ability to intuit what other people are thinking and feeling – hence they seem robotic and suspicious.

Oswald Spengler’s The Decline of the West as well as McGilchrist characterize the West as awash in phenomena associated with an extreme LH emphasis. Spengler argues that Western civilization was originally much more RH (to use McGilchrist’s categories) and that all its most significant artistic (in the broadest sense) achievements were triumphs of RH accentuation.

The RH is where novel experiences and the anomalous are processed and where mathematical, and other, problems are solved. The RH is involved with the natural, the unfamiliar, the unique, emotions, the embodied, music, humor, understanding intonation and emotional nuance of speech, the metaphorical, nuance, and social relations. It has very little speech, but the RH is necessary for processing all the nonlinguistic aspects of speaking, including body language. Understanding what someone means by vocal inflection and facial expressions is an intuitive RH process rather than explicit.

...

RH is very much the center of lived experience; of the life world with all its depth and richness. The RH is “the master” from the title of McGilchrist’s book. The LH ought to be no more than the emissary; the valued servant of the RH. However, in the last few centuries, the LH, which has tyrannical tendencies, has tried to become the master. The LH is where the ego is predominantly located. In split brain patients where the LH and the RH are surgically divided (this is done sometimes in the case of epileptic patients) one hand will sometimes fight with the other. In one man’s case, one hand would reach out to hug his wife while the other pushed her away. One hand reached for one shirt, the other another shirt. Or a patient will be driving a car and one hand will try to turn the steering wheel in the opposite direction. In these cases, the “naughty” hand is usually the left hand (RH), while the patient tends to identify herself with the right hand governed by the LH. The two hemispheres have quite different personalities.

The connection between LH and ego can also be seen in the fact that the LH is competitive, contentious, and agonistic. It wants to win. It is the part of you that hates to lose arguments.

Using the metaphor of Chaos and Order, the RH deals with Chaos – the unknown, the unfamiliar, the implicit, the emotional, the dark, danger, mystery. The LH is connected with Order – the known, the familiar, the rule-driven, the explicit, and light of day. Learning something means to take something unfamiliar and making it familiar. Since the RH deals with the novel, it is the problem-solving part. Once understood, the results are dealt with by the LH. When learning a new piece on the piano, the RH is involved. Once mastered, the result becomes a LH affair. The muscle memory developed by repetition is processed by the LH. If errors are made, the activity returns to the RH to figure out what went wrong; the activity is repeated until the correct muscle memory is developed in which case it becomes part of the familiar LH.

Science is an attempt to find Order. It would not be necessary if people lived in an entirely orderly, explicit, known world. The lived context of science implies Chaos. Theories are reductive and simplifying and help to pick out salient features of a phenomenon. They are always partial truths, though some are more partial than others. The alternative to a certain level of reductionism or partialness would be to simply reproduce the world which of course would be both impossible and unproductive. The test for whether a theory is sufficiently non-partial is whether it is fit for purpose and whether it contributes to human flourishing.

...

Analytic philosophers pride themselves on trying to do away with vagueness. To do so, they tend to jettison context which cannot be brought into fine focus. However, in order to understand things and discern their meaning, it is necessary to have the big picture, the overview, as well as the details. There is no point in having details if the subject does not know what they are details of. Such philosophers also tend to leave themselves out of the picture even when what they are thinking about has reflexive implications. John Locke, for instance, tried to banish the RH from reality. All phenomena having to do with subjective experience he deemed unreal and once remarked about metaphors, a RH phenomenon, that they are “perfect cheats.” Analytic philosophers tend to check the logic of the words on the page and not to think about what those words might say about them. The trick is for them to recognize that they and their theories, which exist in minds, are part of reality too.

The RH test for whether someone actually believes something can be found by examining his actions. If he finds that he must regard his own actions as free, and, in order to get along with other people, must also attribute free will to them and treat them as free agents, then he effectively believes in free will – no matter his LH theoretical commitments.

...

We do not know the origin of life. We do not know how or even if consciousness can emerge from matter. We do not know the nature of 96% of the matter of the universe. Clearly all these things exist. They can provide the subject matter of theories but they continue to exist as theorizing ceases or theories change. Not knowing how something is possible is irrelevant to its actual existence. An inability to explain something is ultimately neither here nor there.

If thought begins and ends with the LH, then thinking has no content – content being provided by experience (RH), and skepticism and nihilism ensue. The LH spins its wheels self-referentially, never referring back to experience. Theory assumes such primacy that it will simply outlaw experiences and data inconsistent with it; a profoundly wrong-headed approach.

...

Gödel’s Theorem proves that not everything true can be proven to be true. This means there is an ineradicable role for faith, hope and intuition in every moderately complex human intellectual endeavor. There is no one set of consistent axioms from which all other truths can be derived.

Alan Turing’s proof of the halting problem proves that there is no effective procedure for finding effective procedures. Without a mechanical decision procedure, (LH), when it comes to … [more]
gnon  reflection  books  summary  review  neuro  neuro-nitgrit  things  thinking  metabuch  order-disorder  apollonian-dionysian  bio  examples  near-far  symmetry  homo-hetero  logic  inference  intuition  problem-solving  analytical-holistic  n-factor  europe  the-great-west-whale  occident  alien-character  detail-architecture  art  theory-practice  philosophy  being-becoming  essence-existence  language  psychology  cog-psych  egalitarianism-hierarchy  direction  reason  learning  novelty  science  anglo  anglosphere  coarse-fine  neurons  truth  contradiction  matching  empirical  volo-avolo  curiosity  uncertainty  theos  axioms  intricacy  computation  analogy  essay  rhetoric  deep-materialism  new-religion  knowledge  expert-experience  confidence  biases  optimism  pessimism  realness  whole-partial-many  theory-of-mind  values  competition  reduction  subjective-objective  communication  telos-atelos  ends-means  turing  fiction  increase-decrease  innovation  creative  thick-thin  spengler  multi  ratty  hanson  complex-systems  structure  concrete  abstraction  network-s 
september 2018 by nhaliday
Science - Wikipedia
In Northern Europe, the new technology of the printing press was widely used to publish many arguments, including some that disagreed widely with contemporary ideas of nature. René Descartes and Francis Bacon published philosophical arguments in favor of a new type of non-Aristotelian science. Descartes emphasized individual thought and argued that mathematics rather than geometry should be used in order to study nature. Bacon emphasized the importance of experiment over contemplation. Bacon further questioned the Aristotelian concepts of formal cause and final cause, and promoted the idea that science should study the laws of "simple" natures, such as heat, rather than assuming that there is any specific nature, or "formal cause," of each complex type of thing. This new modern science began to see itself as describing "laws of nature". This updated approach to studies in nature was seen as mechanistic. Bacon also argued that science should aim for the first time at practical inventions for the improvement of all human life.

Age of Enlightenment

...

During this time, the declared purpose and value of science became producing wealth and inventions that would improve human lives, in the materialistic sense of having more food, clothing, and other things. In Bacon's words, "the real and legitimate goal of sciences is the endowment of human life with new inventions and riches", and he discouraged scientists from pursuing intangible philosophical or spiritual ideas, which he believed contributed little to human happiness beyond "the fume of subtle, sublime, or pleasing speculation".[72]
article  wiki  reference  science  philosophy  letters  history  iron-age  mediterranean  the-classics  medieval  europe  the-great-west-whale  early-modern  ideology  telos-atelos  ends-means  new-religion  weird  enlightenment-renaissance-restoration-reformation  culture  the-devil  anglo  big-peeps  giants  religion  theos  tip-of-tongue  hmm  truth  dirty-hands  engineering  roots  values  formal-values  quotes  causation  forms-instances  technology  logos 
august 2018 by nhaliday
State (polity) - Wikipedia
https://en.wikipedia.org/wiki/State_formation
In the medieval period (500-1400) in Europe, there were a variety of authority forms throughout the region. These included feudal lords, empires, religious authorities, free cities, and other authorities.[42] Often dated to the 1648 Peace of Westphalia, there began to be the development in Europe of modern states with large-scale capacity for taxation, coercive control of their populations, and advanced bureaucracies.[43] The state became prominent in Europe over the next few centuries before the particular form of the state spread to the rest of the world via the colonial and international pressures of the 19th century and 20th century.[44] Other modern states developed in Africa and Asia prior to colonialism, but were largely displaced by colonial rule.[45]

...

Two related theories are based on military development and warfare, and the role that these forces played in state formation. Charles Tilly developed an argument that the state developed largely as a result of "state-makers" who sought to increase the taxes they could gain from the people under their control so they could continue fighting wars.[42] According to Tilly, the state makes war and war makes states.[49] In the constant warfare of the centuries in Europe, coupled with expanded costs of war with mass armies and gunpowder, warlords had to find ways to finance war and control territory more effectively. The modern state presented the opportunity for them to develop taxation structures, the coercive structure to implement that taxation, and finally the guarantee of protection from other states that could get much of the population to agree.[50] Taxes and revenue raising have been repeatedly pointed out as a key aspect of state formation and the development of state capacity. Economist Nicholas Kaldor emphasized on the importance of revenue raising and warned about the dangers of the dependence on foreign aid.[51] Tilly argues, state making is similar to organized crime because it is a "quintessential protection racket with the advantage of legitimacy."[52]

State of nature: https://en.wikipedia.org/wiki/State_of_nature
Thomas Hobbes
The pure state of nature or "the natural condition of mankind" was deduced by the 17th century English philosopher Thomas Hobbes, in Leviathan and in his earlier work On the Citizen.[4] Hobbes argued that all humans are by nature equal in faculties of body and mind (i.e., no natural inequalities are so great as to give anyone a "claim" to an exclusive "benefit"). From this equality and other causes in human nature, everyone is naturally willing to fight one another: so that "during the time men live without a common power to keep them all in awe, they are in that condition which is called warre; and such a warre as is of every man against every man". In this state every person has a natural right or liberty to do anything one thinks necessary for preserving one's own life; and life is "solitary, poor, nasty, brutish, and short" (Leviathan, Chapters XIII–XIV). Hobbes described this natural condition with the Latin phrase bellum omnium contra omnes (meaning war of all against all), in his work De Cive.

Within the state of nature there is neither personal property nor injustice since there is no law, except for certain natural precepts discovered by reason ("laws of nature"): the first of which is "that every man ought to endeavour peace, as far as he has hope of obtaining it" (Leviathan, Ch. XIV); and the second is "that a man be willing, when others are so too, as far forth as for peace and defence of himself he shall think it necessary, to lay down this right to all things; and be contented with so much liberty against other men as he would allow other men against himself" (loc. cit.). From here Hobbes develops the way out of the state of nature into political society and government, by mutual contracts.

According to Hobbes the state of nature exists at all times among independent countries, over whom there is no law except for those same precepts or laws of nature (Leviathan, Chapters XIII, XXX end). His view of the state of nature helped to serve as a basis for theories of international law and relations.[5]

John Locke
John Locke considers the state of nature in his Second Treatise on Civil Government written around the time of the Exclusion Crisis in England during the 1680s. For Locke, in the state of nature all men are free "to order their actions, and dispose of their possessions and persons, as they think fit, within the bounds of the law of nature." (2nd Tr., §4). "The state of Nature has a law of Nature to govern it", and that law is reason. Locke believes that reason teaches that "no one ought to harm another in his life, liberty, and or property" (2nd Tr., §6) ; and that transgressions of this may be punished. Locke describes the state of nature and civil society to be opposites of each other, and the need for civil society comes in part from the perpetual existence of the state of nature.[6] This view of the state of nature is partly deduced from Christian belief (unlike Hobbes, whose philosophy is not dependent upon any prior theology).

Although it may be natural to assume that Locke was responding to Hobbes, Locke never refers to Hobbes by name, and may instead have been responding to other writers of the day, like Robert Filmer.[7] In fact, Locke's First Treatise is entirely a response to Filmer's Patriarcha, and takes a step by step method to refuting Filmer's theory set out in Patriarcha. The conservative party at the time had rallied behind Filmer's Patriarcha, whereas the Whigs, scared of another prosecution of Anglicans and Protestants, rallied behind the theory set out by Locke in his Two Treatises of Government as it gave a clear theory as to why the people would be justified in overthrowing a monarchy which abuses the trust they had placed in it.[citation needed]

...

Jean-Jacques Rousseau
Hobbes' view was challenged in the eighteenth century by Jean-Jacques Rousseau, who claimed that Hobbes was taking socialized people and simply imagining them living outside of the society in which they were raised. He affirmed instead that people were neither good nor bad, but were born as a blank slate, and later society and the environment influence which way we lean. In Rousseau's state of nature, people did not know each other enough to come into serious conflict and they did have normal values. The modern society, and the ownership it entails, is blamed for the disruption of the state of nature which Rousseau sees as true freedom.[9]

https://en.wikipedia.org/wiki/Sovereignty
Ulpian's statements were known in medieval Europe, but sovereignty was an important concept in medieval times.[1] Medieval monarchs were not sovereign, at least not strongly so, because they were constrained by, and shared power with, their feudal aristocracy.[1] Furthermore, both were strongly constrained by custom.[1]

Sovereignty existed during the Medieval period as the de jure rights of nobility and royalty, and in the de facto capability of individuals to make their own choices in life.[citation needed]

...

Reformation

Sovereignty reemerged as a concept in the late 16th century, a time when civil wars had created a craving for stronger central authority, when monarchs had begun to gather power onto their own hands at the expense of the nobility, and the modern nation state was emerging. Jean Bodin, partly in reaction to the chaos of the French wars of religion, presented theories of sovereignty calling for strong central authority in the form of absolute monarchy. In his 1576 treatise Les Six Livres de la République ("Six Books of the Republic") Bodin argued that it is inherent in the nature of the state that sovereignty must be:[1]

- Absolute: On this point he said that the sovereign must not be hedged in with obligations and conditions, must be able to legislate without his (or its) subjects' consent, must not be bound by the laws of his predecessors, and could not, because it is illogical, be bound by his own laws.
- Perpetual: Not temporarily delegated as to a strong leader in an emergency or to a state employee such as a magistrate. He held that sovereignty must be perpetual because anyone with the power to enforce a time limit on the governing power must be above the governing power, which would be impossible if the governing power is absolute.

Bodin rejected the notion of transference of sovereignty from people to the ruler (also known as the sovereign); natural law and divine law confer upon the sovereign the right to rule. And the sovereign is not above divine law or natural law. He is above (i.e. not bound by) only positive law, that is, laws made by humans. He emphasized that a sovereign is bound to observe certain basic rules derived from the divine law, the law of nature or reason, and the law that is common to all nations (jus gentium), as well as the fundamental laws of the state that determine who is the sovereign, who succeeds to sovereignty, and what limits the sovereign power. Thus, Bodin’s sovereign was restricted by the constitutional law of the state and by the higher law that was considered as binding upon every human being.[1] The fact that the sovereign must obey divine and natural law imposes ethical constraints on him. Bodin also held that the lois royales, the fundamental laws of the French monarchy which regulated matters such as succession, are natural laws and are binding on the French sovereign.

...

Age of Enlightenment
During the Age of Enlightenment, the idea of sovereignty gained both legal and moral force as the main Western description of the meaning and power of a State. In particular, the "Social contract" as a mechanism for establishing sovereignty was suggested and, by 1800, widely accepted, especially in the new United States and France, though also in Great Britain to a lesser extent.

Thomas Hobbes, in Leviathan (1651), arrived at a conception of sovereignty … [more]
concept  conceptual-vocab  wiki  reference  leviathan  elite  government  institutions  politics  polisci  philosophy  antidemos  spatial  correlation  intersection-connectedness  geography  matching  nationalism-globalism  whole-partial-many  big-peeps  the-classics  morality  ethics  good-evil  order-disorder  history  iron-age  mediterranean  medieval  feudal  europe  the-great-west-whale  occident  china  asia  sinosphere  n-factor  democracy  authoritarianism  property-rights  civil-liberty  alien-character  crosstab  law  maps  lexical  multi  allodium 
august 2018 by nhaliday
Roman naming conventions - Wikipedia
The distinguishing feature of Roman nomenclature was the use of both personal names and regular surnames. Throughout Europe and the Mediterranean, other ancient civilizations distinguished individuals through the use of single personal names, usually dithematic in nature. Consisting of two distinct elements, or "themes", these names allowed for hundreds or even thousands of possible combinations. But a markedly different system of nomenclature arose in Italy, where the personal name was joined by a hereditary surname. Over time, this binomial system expanded to include additional names and designations.[1][2]

https://en.wikipedia.org/wiki/Gens
In ancient Rome, a gens (/ˈɡɛns/ or /ˈdʒɛnz/), plural gentes, was a family consisting of all those individuals who shared the same nomen and claimed descent from a common ancestor. A branch of a gens was called a stirps (plural stirpes). The gens was an important social structure at Rome and throughout Italy during the period of the Roman Republic. Much of an individual's social standing depended on the gens to which he belonged. Certain gentes were considered patrician, others plebeian, while some had both patrician and plebeian branches. The importance of membership in a gens declined considerably in imperial times.[1][2]

...

The word gens is sometimes translated as "race" or "nation", meaning a people descended from a common ancestor (rather than sharing a common physical trait). It can also be translated as "clan" or "tribe", although the word tribus has a separate and distinct meaning in Roman culture. A gens could be as small as a single family, or could include hundreds of individuals. According to tradition, in 479 BC the gens Fabia alone were able to field a militia consisting of three hundred and six men of fighting age. The concept of the gens was not uniquely Roman, but was shared with communities throughout Italy, including those who spoke Italic languages such as Latin, Oscan, and Umbrian as well as the Etruscans. All of these peoples were eventually absorbed into the sphere of Roman culture.[1][2][3][4]

...

Persons could be adopted into a gens and acquire its nomen. A libertus, or "freedman", usually assumed the nomen (and sometimes also the praenomen) of the person who had manumitted him, and a naturalized citizen usually took the name of the patron who granted his citizenship. Freedmen and newly enfranchised citizens were not technically part of the gentes whose names they shared, but within a few generations it often became impossible to distinguish their descendants from the original members. In practice this meant that a gens could acquire new members and even new branches, either by design or by accident.[1][2][7]

Ancient Greek personal names: https://en.wikipedia.org/wiki/Ancient_Greek_personal_names
Ancient Greeks usually had one name, but another element was often added in semi-official contexts or to aid identification: a father’s name (patronym) in the genitive case, or in some regions as an adjectival formulation. A third element might be added, indicating the individual’s membership in a particular kinship or other grouping, or city of origin (when the person in question was away from that city). Thus the orator Demosthenes, while proposing decrees in the Athenian assembly, was known as "Demosthenes, son of Demosthenes of Paiania"; Paiania was the deme or regional sub-unit of Attica to which he belonged by birth. If Americans used that system, Abraham Lincoln would have been called "Abraham, son of Thomas of Kentucky" (where he was born). On rare occasions, if a person was illegitimate or fathered by a non-citizen, they might use their mother's name (metronym) instead of their father's. Ten days after a birth, relatives on both sides were invited to a sacrifice and feast called dekátē (δεκάτη), 'tenth day'; on this occasion the father formally named the child.[3]

...

In many contexts, etiquette required that respectable women be spoken of as the wife or daughter of X rather than by their own names.[6] On gravestones or dedications, however, they had to be identified by name. Here, the patronymic formula "son of X" used for men might be replaced by "wife of X", or supplemented as "daughter of X, wife of Y".

Many women bore forms of standard masculine names, with a feminine ending substituted for the masculine. Many standard names related to specific masculine achievements had a common feminine equivalent; the counterpart of Nikomachos, "victorious in battle", would be Nikomachē. The taste mentioned above for giving family members related names was one motive for the creation of such feminine forms. There were also feminine names with no masculine equivalent, such as Glykera "sweet one"; Hedistē "most delightful".
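The masculine-to-feminine substitution described above is essentially a mechanical suffix rule, as in Nikomachos → Nikomachē. A minimal sketch (illustrative only; real Greek morphology has more endings and many exceptions than this single -os case):

```python
# Illustrative sketch of the feminine-ending substitution described above:
# swap the masculine -os ending for the feminine -e (eta), as in
# Nikomachos -> Nikomache. Only the -os case is handled here.
def feminine_form(name: str) -> str:
    if name.endswith("os"):
        return name[:-2] + "ē"
    return name  # names without -os are left unchanged in this sketch

print(feminine_form("Nikomachos"))  # Nikomachē
print(feminine_form("Glykera"))     # Glykera (no masculine equivalent)
```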
wiki  history  iron-age  mediterranean  the-classics  conquest-empire  culture  language  foreign-lang  social-norms  kinship  class  legacy  democracy  status  multi  gender  syntax  protocol-metadata 
august 2018 by nhaliday
Reconsidering epistemological scepticism – Dividuals
I blogged before about how I consider an epistemological scepticism fully compatible with being conservative/reactionary. By epistemological scepticism I mean the worldview where concepts, categories, names, classes aren’t considered real, just useful ways to categorize phenomena, but entirely mental constructs, basically just tools. I think you can call this nominalism as well. The nominalism-realism debate was certainly about this. What follows is the pro-empirical worldview where logic and reasoning are considered highly fallible: hence you don’t think and don’t argue too much; you actually look and check things instead. You rely on experience, not reasoning.

...

Anyhow, the argument is that there are classes, which are indeed artificial, and there are kinds, which are products of natural forces, products of causality.

...

And the deeper – Darwinian – argument, unspoken but obvious, is that any being with a model of reality that does not conform to such real clumps, gets eaten by a grue.

This is impressive. It seems I have to extend my one-variable epistemology to a two-variable epistemology.

My former epistemology was that we generally categorize things according to their uses or dangers for us. So “chair” is – very roughly – defined as “anything we can sit on”. Similarly, we can categorize “predator” as “something that eats us or the animals that are useful for us”.

The unspoken argument against this is that the universe or the biosphere exists neither for us nor against us. A fox can eat your rabbits and a lion can eat you, but they don’t exist just for the sake of making your life difficult.

Hence, if you interpret phenomena only from the viewpoint of their uses or dangers for humans, you get only half the picture right. The other half is what it really is and where it came from.

Copying is everything: https://dividuals.wordpress.com/2015/12/14/copying-is-everything/
Philosophy professor Ruth Millikan’s insight that everything that gets copied from an ancestor has a proper function or teleofunction: it is whatever feature or function that made it and its ancestor selected for copying, in competition with all the other similar copiable things. This would mean Aristotelean teleology is correct within the field of copyable things, replicators, i.e. within biology, although in physics still obviously incorrect.

Darwinian Reactionary drew attention to it two years ago and I still don’t understand why it didn’t generate a bigger buzz. It is an extremely important insight.

I mean, this is what we were waiting for, a proper synthesis of science and philosophy, and a proper way to rescue Aristotelean teleology, which leads to such excellent common-sense predictions that intuitively it cannot be very wrong, yet modern philosophy always denied it.

The result from that is the bridging of the fact-value gap and the burying of the naturalistic fallacy: we CAN derive values from facts: a thing is good if it is well suited for its natural purpose, teleofunction or proper function, which is the purpose it was selected for and copied for, the purpose and the suitability for the purpose that made the ancestors of this thing selected for copying, instead of all the other potential, similar ancestors.

...

What was humankind selected for? I am afraid, the answer is kind of ugly.

Men were selected to compete between groups, to cooperate within groups largely for coordinating for the sake of this competition, and to have a low-key competition inside the groups as well for status and leadership. I am afraid, intelligence is all about organizing elaborate tribal raids: “coalitionary arms races”. The most civilized case, least brutal but still expensive, is arms races in prestige status, not dominance status: when Ancient Athens built pretty buildings and modern France built the TGV and America sent a man to the Moon in order to gain “gloire” i.e. the prestige type respect and status amongst the nations, the larger groups of mankind. If you are the type who doesn’t like blood, you should probably focus on these kinds of civilized, prestige-project competitions.

Women were selected for bearing children, for having strong and intelligent sons and therefore having these heritable traits themselves (HBD kind of contradicts the more radically anti-woman aspects of RedPillery: marry a weak and stupid but attractive silly-blondie type woman and your sons won't be that great either), for pleasuring men and, in some rarer but existing cases, to be true companions and helpers of their husbands.

https://en.wikipedia.org/wiki/Four_causes
- Matter: a change or movement's material cause, is the aspect of the change or movement which is determined by the material that composes the moving or changing things. For a table, that might be wood; for a statue, that might be bronze or marble.
- Form: a change or movement's formal cause, is a change or movement caused by the arrangement, shape or appearance of the thing changing or moving. Aristotle says for example that the ratio 2:1, and number in general, is the cause of the octave.
- Agent: a change or movement's efficient or moving cause, consists of things apart from the thing being changed or moved, which interact so as to be an agency of the change or movement. For example, the efficient cause of a table is a carpenter, or a person working as one, and according to Aristotle the efficient cause of a boy is a father.
- End or purpose: a change or movement's final cause, is that for the sake of which a thing is what it is. For a seed, it might be an adult plant. For a sailboat, it might be sailing. For a ball at the top of a ramp, it might be coming to rest at the bottom.

https://en.wikipedia.org/wiki/Proximate_and_ultimate_causation
A proximate cause is an event which is closest to, or immediately responsible for causing, some observed result. This exists in contrast to a higher-level ultimate cause (or distal cause) which is usually thought of as the "real" reason something occurred.

...

- Ultimate causation explains traits in terms of evolutionary forces acting on them.
- Proximate causation explains biological function in terms of immediate physiological or environmental factors.
gnon  philosophy  ideology  thinking  conceptual-vocab  forms-instances  realness  analytical-holistic  bio  evolution  telos-atelos  distribution  nature  coarse-fine  epistemic  intricacy  is-ought  values  duplication  nihil  the-classics  big-peeps  darwinian  deep-materialism  selection  equilibrium  subjective-objective  models  classification  smoothness  discrete  schelling  optimization  approximation  comparison  multi  peace-violence  war  coalitions  status  s-factor  fashun  reputation  civilization  intelligence  competition  leadership  cooperate-defect  within-without  within-group  group-level  homo-hetero  new-religion  causation  direct-indirect  ends-means  metabuch  physics  axioms  skeleton  wiki  reference  concept  being-becoming  essence-existence  logos  real-nominal 
july 2018 by nhaliday
Dying and Rising Gods - Dictionary definition of Dying and Rising Gods | Encyclopedia.com: FREE online dictionary
https://en.wikipedia.org/wiki/Dying-and-rising_deity
While the concept of a "dying-and-rising god" has a longer history, it was significantly advocated by Frazer's Golden Bough (1906–1914). At first received very favourably, the idea was attacked by Roland de Vaux in 1933, and was the subject of controversial debate over the following decades.[31] One of the leading scholars in the deconstruction of Frazer's "dying-and-rising god" category was Jonathan Z. Smith, whose 1969 dissertation discusses Frazer's Golden Bough,[32] and who in Mircea Eliade's 1987 Encyclopedia of religion wrote the "Dying and rising gods" entry, where he dismisses the category as "largely a misnomer based on imaginative reconstructions and exceedingly late or highly ambiguous texts", suggesting a more detailed categorisation into "dying gods" and "disappearing gods", arguing that before Christianity, the two categories were distinct and gods who "died" did not return, and those who returned never truly "died".[33][34] Smith gave a more detailed account of his views specifically on the question of parallels to Christianity in Drudgery Divine (1990).[35] Smith's 1987 article was widely received, and during the 1990s, scholarly consensus seemed to shift towards his rejection of the concept as oversimplified, although it continued to be invoked by scholars writing about Ancient Near Eastern mythology.[36] As of 2009, the Encyclopedia of Psychology and Religion summarizes the current scholarly consensus as ambiguous, with some scholars rejecting Frazer's "broad universalist category", preferring to emphasize the differences between the various traditions, while others continue to view the category as applicable.[9] Gerald O'Collins states that surface-level application of analogous symbolism is a case of parallelomania which exaggerates the importance of trifling resemblances, long abandoned by mainstream scholars.[37]

Beginning with an overview of the Athenian ritual of growing and withering herb gardens at the Adonis festival, in his book The Gardens of Adonis Marcel Detienne suggests that rather than being a stand-in for crops in general (and therefore the cycle of death and rebirth), these herbs (and Adonis) were part of a complex of associations in the Greek mind that centered on spices.[38] These associations included seduction, trickery, gourmandizing, and the anxieties of childbirth.[39] From his point of view, Adonis's death is only one datum among the many that must be used to analyze the festival, the myth, and the god.[39][40]
wiki  reference  myth  ritual  religion  christianity  theos  conquest-empire  intricacy  contrarianism  error  gavisti  culture  europe  mediterranean  history  iron-age  the-classics  MENA  leadership  government  gender  sex  cycles  death  mystic  multi  sexuality  food  correlation  paganism 
june 2018 by nhaliday
Moravec's paradox - Wikipedia
Moravec's paradox is the discovery by artificial intelligence and robotics researchers that, contrary to traditional assumptions, high-level reasoning requires very little computation, but low-level sensorimotor skills require enormous computational resources. The principle was articulated by Hans Moravec, Rodney Brooks, Marvin Minsky and others in the 1980s. As Moravec writes, "it is comparatively easy to make computers exhibit adult level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility".[1]

Similarly, Minsky emphasized that the most difficult human skills to reverse engineer are those that are unconscious. "In general, we're least aware of what our minds do best", he wrote, and added "we're more aware of simple processes that don't work well than of complex ones that work flawlessly".[2]

...

One possible explanation of the paradox, offered by Moravec, is based on evolution. All human skills are implemented biologically, using machinery designed by the process of natural selection. In the course of their evolution, natural selection has tended to preserve design improvements and optimizations. The older a skill is, the more time natural selection has had to improve the design. Abstract thought developed only very recently, and consequently, we should not expect its implementation to be particularly efficient.

As Moravec writes:

Encoded in the large, highly evolved sensory and motor portions of the human brain is a billion years of experience about the nature of the world and how to survive in it. The deliberate process we call reasoning is, I believe, the thinnest veneer of human thought, effective only because it is supported by this much older and much more powerful, though usually unconscious, sensorimotor knowledge. We are all prodigious olympians in perceptual and motor areas, so good that we make the difficult look easy. Abstract thought, though, is a new trick, perhaps less than 100 thousand years old. We have not yet mastered it. It is not all that intrinsically difficult; it just seems so when we do it.[3]

A compact way to express this argument would be:

- We should expect the difficulty of reverse-engineering any human skill to be roughly proportional to the amount of time that skill has been evolving in animals.
- The oldest human skills are largely unconscious and so appear to us to be effortless.
- Therefore, we should expect skills that appear effortless to be difficult to reverse-engineer, but skills that require effort may not necessarily be difficult to engineer at all.
concept  wiki  reference  paradox  ai  intelligence  reason  instinct  neuro  psychology  cog-psych  hardness  logic  deep-learning  time  evopsych  evolution  sapiens  the-self  EEA  embodied  embodied-cognition  abstraction  universalism-particularism  gnosis-logos  robotics 
june 2018 by nhaliday
Eliminative materialism - Wikipedia
Eliminative materialism (also called eliminativism) is the claim that people's common-sense understanding of the mind (or folk psychology) is false and that certain classes of mental states that most people believe in do not exist.[1] It is a materialist position in the philosophy of mind. Some supporters of eliminativism argue that no coherent neural basis will be found for many everyday psychological concepts such as belief or desire, since they are poorly defined. Rather, they argue that psychological concepts of behaviour and experience should be judged by how well they reduce to the biological level.[2] Other versions entail the non-existence of conscious mental states such as pain and visual perceptions.[3]

Eliminativism about a class of entities is the view that that class of entities does not exist.[4] For example, materialism tends to be eliminativist about the soul; modern chemists are eliminativist about phlogiston; and modern physicists are eliminativist about the existence of luminiferous aether. Eliminative materialism is the relatively new (1960s–1970s) idea that certain classes of mental entities that common sense takes for granted, such as beliefs, desires, and the subjective sensation of pain, do not exist.[5][6] The most common versions are eliminativism about propositional attitudes, as expressed by Paul and Patricia Churchland,[7] and eliminativism about qualia (subjective interpretations about particular instances of subjective experience), as expressed by Daniel Dennett and Georges Rey.[3] These philosophers often appeal to an introspection illusion.

In the context of materialist understandings of psychology, eliminativism stands in opposition to reductive materialism which argues that mental states as conventionally understood do exist, and that they directly correspond to the physical state of the nervous system.[8] An intermediate position is revisionary materialism, which will often argue that the mental state in question will prove to be somewhat reducible to physical phenomena—with some changes needed to the common sense concept.

Since eliminative materialism claims that future research will fail to find a neuronal basis for various mental phenomena, it must necessarily wait for science to progress further. One might question the position on these grounds, but other philosophers like Churchland argue that eliminativism is often necessary in order to open the minds of thinkers to new evidence and better explanations.[8]
concept  conceptual-vocab  philosophy  ideology  thinking  metameta  weird  realness  psychology  cog-psych  neurons  neuro  brain-scan  reduction  complex-systems  cybernetics  wiki  reference  parallax  truth  dennett  within-without  the-self  subjective-objective  absolute-relative  deep-materialism  new-religion  identity  analytical-holistic  systematic-ad-hoc  science  theory-practice  theory-of-mind  applicability-prereqs  nihil  lexical 
april 2018 by nhaliday
Theological differences between the Catholic Church and the Eastern Orthodox Church - Wikipedia
Did the Filioque Ruin the West?: https://contingentnotarbitrary.com/2017/06/15/the-filioque-ruined-the-west/
The theology of the filioque makes the Father and the Son equal as sources of divinity. Flattening the hierarchy implicit in the Trinity does away with the Monarchy of the Father: the family relationship becomes less patriarchal and more egalitarian. The Son, with his humanity, mercy, love and sacrifice, is no longer subordinate to the Father, while the Father – the God of the Old Testament, law and tradition – is no longer sovereign. Looks like the change would elevate egalitarianism, compassion, humanity and self-sacrifice while undermining hierarchy, rules, family and tradition. Sound familiar?
article  wiki  reference  philosophy  backup  religion  christianity  theos  ideology  comparison  nitty-gritty  intricacy  europe  the-great-west-whale  occident  russia  MENA  orient  letters  epistemic  truth  science  logic  inference  guilt-shame  volo-avolo  causation  multi  gnon  eastern-europe  roots  explanans  enlightenment-renaissance-restoration-reformation  modernity  egalitarianism-hierarchy  love-hate  free-riding  cooperate-defect  gender  justice  law  tradition  legacy  parenting  ascetic  altruism  farmers-and-foragers  protestant-catholic  exegesis-hermeneutics 
april 2018 by nhaliday
Theory of Self-Reproducing Automata - John von Neumann
Fourth Lecture: THE ROLE OF HIGH AND OF EXTREMELY HIGH COMPLICATION

Comparisons between computing machines and the nervous systems. Estimates of size for computing machines, present and near future.

Estimates for size for the human central nervous system. Excursus about the “mixed” character of living organisms. Analog and digital elements. Observations about the “mixed” character of all componentry, artificial as well as natural. Interpretation of the position to be taken with respect to these.

Evaluation of the discrepancy in size between artificial and natural automata. Interpretation of this discrepancy in terms of physical factors. Nature of the materials used.

The probability of the presence of other intellectual factors. The role of complication and the theoretical penetration that it requires.

Questions of reliability and errors reconsidered. Probability of individual errors and length of procedure. Typical lengths of procedure for computing machines and for living organisms--that is, for artificial and for natural automata. Upper limits on acceptable probability of error in individual operations. Compensation by checking and self-correcting features.

Differences of principle in the way in which errors are dealt with in artificial and in natural automata. The “single error” principle in artificial automata. Crudeness of our approach in this case, due to the lack of adequate theory. More sophisticated treatment of this problem in natural automata: The role of the autonomy of parts. Connections between this autonomy and evolution.

- 10^10 neurons in brain, 10^4 vacuum tubes in largest computer at time
- machines faster: 5 ms from neuron potential to neuron potential, 10^-3 ms for vacuum tubes
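The two bullet-point figures imply a trade-off worth making explicit: the natural automaton has about a million times more elements, while the artificial elements switch thousands of times faster. A back-of-the-envelope check, using only the numbers quoted above:

```python
# Pure arithmetic on the figures from the lecture notes above
# (illustrative comparison, not a claim about modern hardware).
neurons = 10**10         # neurons in the human brain
tubes = 10**4            # vacuum tubes in the largest computer of the era
neuron_cycle_ms = 5      # ms from neuron potential to neuron potential
tube_cycle_ms = 1e-3     # ms switching time for a vacuum tube

size_ratio = neurons / tubes                    # brain ~10^6x more elements
speed_ratio = neuron_cycle_ms / tube_cycle_ms   # tubes ~5000x faster

print(f"size ratio (brain/machine): {size_ratio:.0e}")
print(f"speed ratio (tube vs neuron): {speed_ratio:.0f}x")
```

So the natural automaton wins on component count by six orders of magnitude while the artificial one wins on component speed by between three and four, which is the discrepancy von Neumann's lecture then tries to interpret.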

https://en.wikipedia.org/wiki/John_von_Neumann#Computing
pdf  article  papers  essay  nibble  math  cs  computation  bio  neuro  neuro-nitgrit  scale  magnitude  comparison  acm  von-neumann  giants  thermo  phys-energy  speed  performance  time  density  frequency  hardware  ems  efficiency  dirty-hands  street-fighting  fermi  estimate  retention  physics  interdisciplinary  multi  wiki  links  people  🔬  atoms  duplication  iteration-recursion  turing  complexity  measure  nature  technology  complex-systems  bits  information-theory  circuits  robust  structure  composition-decomposition  evolution  mutation  axioms  analogy  thinking  input-output  hi-order-bits  coding-theory  flexibility  rigidity  automata-languages 
april 2018 by nhaliday
John Dee - Wikipedia
John Dee (13 July 1527 – 1608 or 1609) was an English mathematician, astronomer, astrologer, occult philosopher,[5] and advisor to Queen Elizabeth I. He devoted much of his life to the study of alchemy, divination, and Hermetic philosophy. He was also an advocate of England's imperial expansion into a "British Empire", a term he is generally credited with coining.[6]

Dee straddled the worlds of modern science and magic just as the former was emerging. One of the most learned men of his age, he had been invited to lecture on the geometry of Euclid at the University of Paris while still in his early twenties. Dee was an ardent promoter of mathematics and a respected astronomer, as well as a leading expert in navigation, having trained many of those who would conduct England's voyages of discovery.

Simultaneously with these efforts, Dee immersed himself in the worlds of magic, astrology and Hermetic philosophy. He devoted much time and effort in the last thirty years or so of his life to attempting to commune with angels in order to learn the universal language of creation and bring about the pre-apocalyptic unity of mankind. However, Robert Hooke suggested, in the chapter Of Dr. Dee's Book of Spirits, that John Dee made use of Trithemian steganography to conceal his communication with Elizabeth I.[7] A student of the Renaissance Neo-Platonism of Marsilio Ficino, Dee did not draw distinctions between his mathematical research and his investigations into Hermetic magic, angel summoning and divination. Instead he considered all of his activities to constitute different facets of the same quest: the search for a transcendent understanding of the divine forms which underlie the visible world, which Dee called "pure verities".

In his lifetime, Dee amassed one of the largest libraries in England. His high status as a scholar also allowed him to play a role in Elizabethan politics. He served as an occasional advisor and tutor to Elizabeth I and nurtured relationships with her ministers Francis Walsingham and William Cecil. Dee also tutored and enjoyed patronage relationships with Sir Philip Sidney, his uncle Robert Dudley, 1st Earl of Leicester, and Edward Dyer. He also enjoyed patronage from Sir Christopher Hatton.

https://twitter.com/Logo_Daedalus/status/985203144044040192
https://archive.is/h7ibQ
mind meld

Leave Me Alone! Misanthropic Writings from the Anti-Social Edge
people  big-peeps  old-anglo  wiki  history  early-modern  britain  anglosphere  optimate  philosophy  mystic  deep-materialism  science  aristos  math  geometry  conquest-empire  nietzschean  religion  christianity  theos  innovation  the-devil  forms-instances  god-man-beast-victim  gnosis-logos  expansionism  age-of-discovery  oceans  frontier  multi  twitter  social  commentary  backup  pic  memes(ew)  gnon  🐸  books  literature 
april 2018 by nhaliday
Christian ethics - Wikipedia
Christian ethics is a branch of Christian theology that defines virtuous behavior and wrong behavior from a Christian perspective. Systematic theological study of Christian ethics is called moral theology, possibly with the name of the respective theological tradition, e.g. Catholic moral theology.

Christian virtues are often divided into four cardinal virtues and three theological virtues. Christian ethics includes questions regarding how the rich should act toward the poor, how women are to be treated, and the morality of war. Christian ethicists, like other ethicists, approach ethics from different frameworks and perspectives. The approach of virtue ethics has also become popular in recent decades, largely due to the work of Alasdair MacIntyre and Stanley Hauerwas.[2]

...

The seven Christian virtues are from two sets of virtues. The four cardinal virtues are Prudence, Justice, Restraint (or Temperance), and Courage (or Fortitude). The cardinal virtues are so called because they are regarded as the basic virtues required for a virtuous life. The three theological virtues are Faith, Hope, and Love (or Charity).

- Prudence: also described as wisdom, the ability to judge between actions with regard to appropriate actions at a given time
- Justice: also considered as fairness, the most extensive and most important virtue[20]
- Temperance: also known as restraint, the practice of self-control, abstention, and moderation tempering the appetition
- Courage: also termed fortitude, forbearance, strength, endurance, and the ability to confront fear, uncertainty, and intimidation
- Faith: belief in God, and in the truth of His revelation as well as obedience to Him (cf. Rom 1:5; 16:26)[21][22]
- Hope: expectation of and desire of receiving; refraining from despair and capability of not giving up. The belief that God will be eternally present in every human's life and never giving up on His love.
- Charity: a supernatural virtue that helps us love God and our neighbors, the same way as we love ourselves.

Seven deadly sins: https://en.wikipedia.org/wiki/Seven_deadly_sins
The seven deadly sins, also known as the capital vices or cardinal sins, is a grouping and classification of vices of Christian origin.[1] Behaviours or habits are classified under this category if they directly give birth to other immoralities.[2] According to the standard list, they are pride, greed, lust, envy, gluttony, wrath, and sloth,[2] which are also contrary to the seven virtues. These sins are often thought to be abuses or excessive versions of one's natural faculties or passions (for example, gluttony abuses one's desire to eat).

originally:
1 Gula (gluttony)
2 Luxuria/Fornicatio (lust, fornication)
3 Avaritia (avarice/greed)
4 Superbia (pride, hubris)
5 Tristitia (sorrow/despair/despondency)
6 Ira (wrath)
7 Vanagloria (vainglory)
8 Acedia (sloth)

Golden Rule: https://en.wikipedia.org/wiki/Golden_Rule
The Golden Rule (which can be considered a law of reciprocity in some religions) is the principle of treating others as one would wish to be treated. It is a maxim that is found in many religions and cultures.[1][2] The maxim may appear as _either a positive or negative injunction_ governing conduct:

- One should treat others as one would like others to treat oneself (positive or directive form).[1]
- One should not treat others in ways that one would not like to be treated (negative or prohibitive form).[1]
- What you wish upon others, you wish upon yourself (empathic or responsive form).[1]
The Golden Rule _differs from the maxim of reciprocity captured in do ut des—"I give so that you will give in return"—and is rather a unilateral moral commitment to the well-being of the other without the expectation of anything in return_.[3]

The concept occurs in some form in nearly every religion[4][5] and ethical tradition[6] and is often considered _the central tenet of Christian ethics_[7] [8]. It can also be explained from the perspectives of psychology, philosophy, sociology, human evolution, and economics. Psychologically, it involves a person empathizing with others. Philosophically, it involves a person perceiving their neighbor also as "I" or "self".[9] Sociologically, "love your neighbor as yourself" is applicable between individuals, between groups, and also between individuals and groups. In evolution, "reciprocal altruism" is seen as a distinctive advance in the capacity of human groups to survive and reproduce, as their exceptional brains demanded exceptionally long childhoods and ongoing provision and protection even beyond that of the immediate family.[10] In economics, Richard Swift, referring to ideas from David Graeber, suggests that "without some kind of reciprocity society would no longer be able to exist."[11]

...

hmm, Meta-Golden Rule already stated:
Seneca the Younger (c. 4 BC–65 AD), a practitioner of Stoicism (c. 300 BC–200 AD), expressed the Golden Rule in his essay regarding the treatment of slaves: "Treat your inferior as you would wish your superior to treat you."[23]

...

The "Golden Rule" was given by Jesus of Nazareth, who used it to summarize the Torah: "Do to others what you want them to do to you." and "This is the meaning of the law of Moses and the teaching of the prophets"[33] (Matthew 7:12 NCV, see also Luke 6:31). The common English phrasing is "Do unto others as you would have them do unto you". A similar form of the phrase appeared in a Catholic catechism around 1567 (certainly in the reprint of 1583).[34] The Golden Rule is _stated positively numerous times in the Hebrew Pentateuch_ as well as the Prophets and Writings. Leviticus 19:18 ("Forget about the wrong things people do to you, and do not try to get even. Love your neighbor as you love yourself."; see also Great Commandment) and Leviticus 19:34 ("But treat them just as you treat your own citizens. Love foreigners as you love yourselves, because you were foreigners one time in Egypt. I am the Lord your God.").

The Old Testament Deuterocanonical books of Tobit and Sirach, accepted as part of the Scriptural canon by Catholic Church, Eastern Orthodoxy, and the Non-Chalcedonian Churches, express a _negative form_ of the golden rule:

"Do to no one what you yourself dislike."

— Tobit 4:15
"Recognize that your neighbor feels as you do, and keep in mind your own dislikes."

— Sirach 31:15
Two passages in the New Testament quote Jesus of Nazareth espousing the _positive form_ of the Golden rule:

Matthew 7:12
Do to others what you want them to do to you. This is the meaning of the law of Moses and the teaching of the prophets.

Luke 6:31
Do to others what you would want them to do to you.

...

The passage in the book of Luke then continues with Jesus answering the question, "Who is my neighbor?", by telling the parable of the Good Samaritan, indicating that "your neighbor" is anyone in need.[35] This extends to all, including those who are generally considered hostile.

Jesus' teaching goes beyond the negative formulation of not doing what one would not like done to themselves, to the positive formulation of actively doing good to another that, if the situations were reversed, one would desire that the other would do for them. This formulation, as indicated in the parable of the Good Samaritan, emphasizes the needs for positive action that brings benefit to another, not simply restraining oneself from negative activities that hurt another. Taken as a rule of judgment, both formulations of the golden rule, the negative and positive, are equally applicable.[36]

The Golden Rule: Not So Golden Anymore: https://philosophynow.org/issues/74/The_Golden_Rule_Not_So_Golden_Anymore
Pluralism is the most serious problem facing liberal democracies today. We can no longer ignore the fact that cultures around the world are not simply different from one another, but profoundly so; and the most urgent area in which this realization faces us is in the realm of morality. Western democratic systems depend on there being at least a minimal consensus concerning national values, especially in regard to such things as justice, equality and human rights. But global communication, economics and the migration of populations have placed new strains on Western democracies. Suddenly we find we must adjust to peoples whose suppositions about the ultimate values and goals of life are very different from ours. A clear lesson from events such as 9/11 is that disregarding these differences is not an option. Collisions between worldviews and value systems can be cataclysmic. Somehow we must learn to manage this new situation.

For a long time, liberal democratic optimism in the West has been shored up by suppositions about other cultures and their differences from us. The cornerpiece of this optimism has been the assumption that whatever differences exist they cannot be too great. A core of ‘basic humanity’ surely must tie all of the world’s moral systems together – and if only we could locate this core we might be able to forge agreements and alliances among groups that otherwise appear profoundly opposed. We could perhaps then shelve our cultural or ideological differences and get on with the more pleasant and productive business of celebrating our core agreement. One cannot fail to see how this hope is repeated in order to buoy optimism about the Middle East peace process, for example.

...

It becomes obvious immediately that no matter how widespread we want the Golden Rule to be, there are some ethical systems that we have to admit do not have it. In fact, there are a few traditions that actually disdain the Rule. In philosophy, the Nietzschean tradition holds that the virtues implicit in the Golden Rule are antithetical to the true virtues of self-assertion and the will-to-power. Among religions, there are a good many that prefer to emphasize the importance of self, cult, clan or tribe rather than of general others; and a good many other religions for whom large populations are simply excluded from goodwill, being labeled as outsiders, heretics or … [more]
article  letters  philosophy  morality  ethics  formal-values  religion  christianity  theos  n-factor  europe  the-great-west-whale  occident  justice  war  peace-violence  janus  virtu  list  sanctity-degradation  class  lens  wealth  gender  sex  sexuality  multi  concept  wiki  reference  theory-of-mind  ideology  cooperate-defect  coordination  psychology  cog-psych  social-psych  emotion  cybernetics  ecology  deep-materialism  new-religion  hsu  scitariat  aphorism  quotes  stories  fiction  gedanken  altruism  parasites-microbiome  food  diet  nutrition  individualism-collectivism  taxes  government  redistribution  analogy  lol  troll  poast  death  long-short-run  axioms  judaism  islam  tribalism  us-them  kinship  interests  self-interest  dignity  civil-liberty  values  homo-hetero  diversity  unintended-consequences  within-without  increase-decrease  signum  ascetic  axelrod  guilt-shame  patho-altruism  history  iron-age  mediterranean  the-classics  robust  egalitarianism-hierarchy  intricacy  hypocrisy  parable  roots  explanans  crux  s 
april 2018 by nhaliday
Society of Mind - Wikipedia
A core tenet of Minsky's philosophy is that "minds are what brains do". The society of mind theory views the human mind and any other naturally evolved cognitive systems as a vast society of individually simple processes known as agents. These processes are the fundamental thinking entities from which minds are built, and together produce the many abilities we attribute to minds. The great power in viewing a mind as a society of agents, as opposed to the consequence of some basic principle or some simple formal system, is that different agents can be based on different types of processes with different purposes, ways of representing knowledge, and methods for producing results.

This idea is perhaps best summarized by the following quote:

What magical trick makes us intelligent? The trick is that there is no trick. The power of intelligence stems from our vast diversity, not from any single, perfect principle. —Marvin Minsky, The Society of Mind, p. 308

https://en.wikipedia.org/wiki/Modularity_of_mind

The modular organization of human anatomical brain networks: Accounting for the cost of wiring: https://www.mitpressjournals.org/doi/pdfplus/10.1162/NETN_a_00002
Brain networks are expected to be modular. However, existing techniques for estimating a network’s modules make it difficult to assess the influence of organizational principles such as wiring cost reduction on the detected modules. Here we present a modification of an existing module detection algorithm that allowed us to focus on connections that are unexpected under a cost-reduction wiring rule and to identify modules from among these connections. We applied this technique to anatomical brain networks and showed that the modules we detected differ from those detected using the standard technique. We demonstrated that these novel modules are spatially distributed, exhibit unique functional fingerprints, and overlap considerably with rich clubs, giving rise to an alternative and complementary interpretation of the functional roles of specific brain regions. Finally, we demonstrated that, using the modified module detection approach, we can detect modules in a developmental dataset that track normative patterns of maturation. Collectively, these findings support the hypothesis that brain networks are composed of modules and provide additional insight into the function of those modules.
books  ideas  speculation  structure  composition-decomposition  complex-systems  neuro  ai  psychology  cog-psych  intelligence  reduction  wiki  giants  philosophy  number  cohesion  diversity  systematic-ad-hoc  detail-architecture  pdf  study  neuro-nitgrit  brain-scan  nitty-gritty  network-structure  graphs  graph-theory  models  whole-partial-many  evopsych  eden  reference  psych-architecture  article  coupling-cohesion  multi 
april 2018 by nhaliday
Ultimate fate of the universe - Wikipedia
The fate of the universe is determined by its density. The preponderance of evidence to date, based on measurements of the rate of expansion and the mass density, favors a universe that will continue to expand indefinitely, resulting in the "Big Freeze" scenario below.[8] However, observations are not conclusive, and alternative models are still possible.[9]

Big Freeze or heat death
Main articles: Future of an expanding universe and Heat death of the universe
The Big Freeze is a scenario under which continued expansion results in a universe that asymptotically approaches absolute zero temperature.[10] This scenario, in combination with the Big Rip scenario, is currently gaining ground as the most important hypothesis.[11] It could, in the absence of dark energy, occur only under a flat or hyperbolic geometry. With a positive cosmological constant, it could also occur in a closed universe. In this scenario, stars are expected to form normally for 10^12 to 10^14 (1–100 trillion) years, but eventually the supply of gas needed for star formation will be exhausted. As existing stars run out of fuel and cease to shine, the universe will slowly and inexorably grow darker. Eventually black holes will dominate the universe, which themselves will disappear over time as they emit Hawking radiation.[12] Over infinite time, there would be a spontaneous entropy decrease by the Poincaré recurrence theorem, thermal fluctuations,[13][14] and the fluctuation theorem.[15][16]

A related scenario is heat death, which states that the universe goes to a state of maximum entropy in which everything is evenly distributed and there are no gradients—which are needed to sustain information processing, one form of which is life. The heat death scenario is compatible with any of the three spatial models, but requires that the universe reach an eventual temperature minimum.[17]
physics  big-picture  world  space  long-short-run  futurism  singularity  wiki  reference  article  nibble  thermo  temperature  entropy-like  order-disorder  death  nihil  bio  complex-systems  cybernetics  increase-decrease  trends  computation  local-global  prediction  time  spatial  spreading  density  distribution  manifolds  geometry  janus 
april 2018 by nhaliday