
nhaliday : blowhards   51

How I Choose What To Read — David Perell
unaffiliated  advice  reflection  checklists  metabuch  learning  studying  info-foraging  skeleton  books  heuristic  contrarianism  ubiquity  time  track-record  thinking  blowhards  bret-victor  worrydream  list  top-n  recommendations  arbitrage  trust  aphorism  meta:reading  prioritizing  judgement 
november 2019 by nhaliday
Modules Matter Most | Existential Type
note a comment from gasche (a significant OCaml contributor) on the modules-vs-typeclasses tradeoff:
I also think you’re unfair to type classes. You’re right that they are not completely satisfying as a modularity tool, but your presentation makes them sound bad in all aspects, which is certainly not true. The limitation of only having one instance per type may be a strong one, but it allows for a level of implicitness that is just nice. There is a reason why, for example, monads are relatively nice to use in Haskell, while using monads represented as modules in SML/OCaml programs is a real pain.

It’s a fact that type-classes are widely adopted and used in Haskell circles, while modules/functors are only used for relatively coarse-grained modularity in the ML community. It should tell you something useful about those two features: there’s something that current modules miss (or maybe a trade-off between flexibility and implicitness that plays against modules for “modularity in the small”), and it’s dishonest and rude to explain the adoption difference by “people don’t know any better”.
nibble  org:bleg  techtariat  programming  pls  plt  ocaml-sml  functional  haskell  types  composition-decomposition  coupling-cohesion  engineering  structure  intricacy  arrows  matching  network-structure  degrees-of-freedom  linearity  nonlinearity  span-cover  direction  multi  poast  expert-experience  blowhards  static-dynamic  protocol-metadata  cmu 
july 2019 by nhaliday
When to use C over C++, and C++ over C? - Software Engineering Stack Exchange
You pick C when
- you need portable assembler (which is what C is, really) for whatever reason,
- your platform doesn't provide C++ (a C compiler is much easier to implement),
- you need to interact with other languages that can only interact with C (usually the lowest common denominator on any platform) and your code consists of little more than the interface, so it isn't worthwhile to lay a C interface over C++ code,
- you're hacking on an Open Source project (many of which, for various reasons, stick to C),
- you don't know C++.
In all other cases you should pick C++.


At the same time, I have to say that @Toll's answers (for one obvious example) have things just about backwards in most respects. Reasonably written C++ will generally be at least as fast as C, and often at least a little faster. Readability is generally much better, if only because you don't get buried in an avalanche of all the code for even the most trivial algorithms and data structures, all the error handling, etc.


As it happens, C and C++ are fairly frequently used together on the same projects, maintained by the same people. This allows something that's otherwise quite rare: a study that directly, objectively compares the maintainability of code written in the two languages by people who are equally competent overall (i.e., the exact same people). At least in the linked study, one conclusion was clear and unambiguous: "We found that using C++ instead of C results in improved software quality and reduced maintenance effort..."


(Side-note: Check out Linus Torvalds' rant on why he prefers C to C++. I don't necessarily agree with his points, but it gives you insight into why people might choose C over C++; rather, people who agree with him might choose C for these reasons.)

Why would anybody use C over C++? [closed]:
Joel's answer gives good reasons why you might have to use C, though there are a few others:
- You must meet industry guidelines, which are easier to prove and test for in C.
- You have tools to work with C, but not C++ (think not just about the compiler, but all the support tools, coverage, analysis, etc)
- Your target developers are C gurus
- You're writing drivers, kernels, or other low level code
- You know the C++ compiler isn't good at optimizing the kind of code you need to write
- Your app not only doesn't lend itself to being object-oriented, but would be harder to write in that form

In some cases, though, you might want to use C rather than C++:
- You want the performance of assembler without the trouble of coding in assembler (C++ is, in theory, capable of 'perfect' performance, but the compilers aren't as good at seeing optimizations a good C programmer will see)
- The software you're writing is trivial, or nearly so: whip out the tiny C compiler, write a few lines of code, compile, and you're all set – no need to open a huge editor with helpers, no need to write practically empty and useless classes, deal with namespaces, etc. You can do nearly the same thing with a C++ compiler and simply use the C subset, but the C++ compiler is slower, even for tiny programs.
- You need extreme performance or small code size, and know the C++ compiler will actually make it harder to accomplish due to the size and performance of the libraries
- You contend that you could just use the C subset and compile with a C++ compiler, but you'll find that if you do that you'll get slightly different results depending on the compiler.

Regardless, if you're doing that, you're using C. Is your question really "Why don't C programmers use C++ compilers?" If it is, then you either don't understand the language differences, or you don't understand compiler theory.


- Because they already know C
- Because they're building an embedded app for a platform that only has a C compiler
- Because they're maintaining legacy software written in C
- You're writing something on the level of an operating system, a relational database engine, or a retail 3D video game engine.
q-n-a  stackex  programming  engineering  pls  best-practices  impetus  checklists  c(pp)  systems  assembly  compilers  hardware  embedded  oss  links  study  evidence-based  devtools  performance  rant  expert-experience  types  blowhards  linux  git  vcs  debate  rhetoric  worse-is-better/the-right-thing  cracker-prog  multi  metal-to-virtual  interface-compatibility 
may 2019 by nhaliday
its-not-software - steveyegge2
You don't work in the software industry.


So what's the software industry, and how do we differ from it?

Well, the software industry is what you learn about in school, and it's what you probably did at your previous company. The software industry produces software that runs on customers' machines — that is, software intended to run on a machine over which you have no control.

So it includes pretty much everything that Microsoft does: Windows and every application you download for it, including your browser.

It also includes everything that runs in the browser, including Flash applications, Java applets, and plug-ins like Adobe's Acrobat Reader. Their deployment model is a little different from the "classic" deployment models, but it's still software that you package up and release to some unknown client box.



Our industry is so different from the software industry, and it's so important to draw a clear distinction, that it needs a new name. I'll call it Servware for now, lacking anything better. Hardware, firmware, software, servware. It fits well enough.

Servware is stuff that lives on your own servers. I call it "stuff" advisedly, since it's more than just software; it includes configuration, monitoring systems, data, documentation, and everything else you've got there, all acting in concert to produce some observable user experience on the other side of a network connection.
techtariat  sv  tech  rhetoric  essay  software  saas  devops  engineering  programming  contrarianism  list  top-n  best-practices  applicability-prereqs  desktop  flux-stasis  homo-hetero  trends  games  thinking  checklists  dbs  models  communication  tutorial  wiki  integration-extension  frameworks  api  whole-partial-many  metrics  retrofit  c(pp)  pls  code-dive  planning  working-stiff  composition-decomposition  libraries  conceptual-vocab  amazon  system-design  cracker-prog  tech-infrastructure  blowhards  client-server  project-management 
may 2019 by nhaliday
Harnessing Evolution - with Bret Weinstein | Virtual Futures Salon - YouTube
- ways to get out of Malthusian conditions: expansion to new frontiers, new technology, redistribution/theft
- some discussion of existential risk
- wants to change humanity's "purpose" to one that would be safe in the long run; important thing is it has to be ESS (maybe he wants a singleton?)
- not too impressed by transhumanism (wouldn't identify with a brain emulation)
video  interview  thiel  expert-experience  evolution  deep-materialism  new-religion  sapiens  cultural-dynamics  anthropology  evopsych  sociality  ecology  flexibility  biodet  behavioral-gen  self-interest  interests  moloch  arms  competition  coordination  cooperate-defect  frontier  expansionism  technology  efficiency  thinking  redistribution  open-closed  zero-positive-sum  peace-violence  war  dominant-minority  hypocrisy  dignity  sanctity-degradation  futurism  environment  climate-change  time-preference  long-short-run  population  scale  earth  hidden-motives  game-theory  GT-101  free-riding  innovation  leviathan  malthus  network-structure  risk  existence  civil-liberty  authoritarianism  tribalism  us-them  identity-politics  externalities  unintended-consequences  internet  social  media  pessimism  universalism-particularism  energy-resources  biophysical-econ  politics  coalitions  incentives  attention  epistemic  biases  blowhards  teaching  education  emotion  impetus  comedy  expression-survival  economics  farmers-and-foragers  ca 
april 2018 by nhaliday
The Flynn effect for verbal and visuospatial short-term and working memory: A cross-temporal meta-analysis
Specifically, the Flynn effect was found for forward digit span (r = 0.12, p < 0.01) and forward Corsi block span (r = 0.10, p < 0.01). Moreover, an anti-Flynn effect was found for backward digit span (r = − 0.06, p < 0.01) and for backward Corsi block span (r = − 0.17, p < 0.01). Overall, the results support co-occurrence theories that predict simultaneous secular gains in specialized abilities and declines in g. The causes of the differential trajectories are further discussed.
study  psychology  cog-psych  psychometrics  iq  trends  dysgenics  flynn  psych-architecture  meta-analysis  multi  albion  scitariat  summary  commentary  blowhards  mental-math  science-anxiety  news  org:sci 
- the genetic book of the dead [Dawkins]
- complementarity [Frank Wilczek]
- relative information
- effective theory [Lisa Randall]
- affordances [Dennett]
- spontaneous symmetry breaking
- relatedly, equipoise [Nicholas Christakis]
- case-based reasoning
- population reasoning (eg, common law)
- criticality [Cesar Hidalgo]
- Haldane's law of the right size (!SCALE!)
- polygenic scores
- non-ergodic
- ansatz
- state [Aaronson]:
- transfer learning
- effect size
- satisficing
- scaling
- the breeder's equation [Greg Cochran]
- impedance matching

- reciprocal altruism
- life history [Plomin]
- intellectual honesty [Sam Harris]
- coalitional instinct (interesting claim: building coalitions around "rationality" actually makes it more difficult to update on new evidence as it makes you look like a bad person, eg, the Cathedral)
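One of the items above is compact enough to state outright. The breeder's equation relates the per-generation response to selection $R$ of a trait to the selection differential $S$ (mean of the selected parents minus the population mean) via the narrow-sense heritability $h^2$:

```latex
R = h^2 S
```

For example, with $h^2 = 0.5$, parents averaging 10 units above the population mean produce offspring averaging about 5 units above it.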
basically same:


interesting timing. how woke is this dude?
org:edge  2017  technology  discussion  trends  list  expert  science  top-n  frontier  multi  big-picture  links  the-world-is-just-atoms  metameta  🔬  scitariat  conceptual-vocab  coalitions  q-n-a  psychology  social-psych  anthropology  instinct  coordination  duty  power  status  info-dynamics  cultural-dynamics  being-right  realness  cooperate-defect  westminster  chart  zeitgeist  rot  roots  epistemic  rationality  meta:science  analogy  physics  electromag  geoengineering  environment  atmosphere  climate-change  waves  information-theory  bits  marginal  quantum  metabuch  homo-hetero  thinking  sapiens  genetics  genomics  evolution  bio  GT-101  low-hanging  minimum-viable  dennett  philosophy  cog-psych  neurons  symmetry  humility  life-history  social-structure  GWAS  behavioral-gen  biodet  missing-heritability  ergodic  machine-learning  generalization  west-hunter  population-genetics  methodology  blowhards  spearhead  group-level  scale  magnitude  business  scaling-tech  tech  business-models  optimization  effect-size  aaronson  state  bare-hands  problem-solving  politics 
may 2017 by nhaliday
Could you explain the character of Fat Tony in Antifragile by Taleb? - Quora
Dr. John can make gigantic errors that affect other people by ignoring reality in favor of assumptions. Fat Tony makes smaller errors that affect only himself, but more seriously (they kill him).
q-n-a  qra  aphorism  jargon  analogy  narrative  blowhards  outcome-risk  noise-structure 
may 2017 by nhaliday
Sustainability | West Hunter
There have been societies that functioned for a long time, thousands of years. They had sustainable demographic patterns. That means that they had enough children to replace themselves – not necessarily in every generation, but over the long haul. But sustainability requires more than that. Long-lived civilizations [ones with cities, literacy, governments, and all that] had a pattern of natural selection that didn’t drastically decrease intelligence – in some cases, one that favored it, at least in some subgroups. There was also ongoing selection against mutational accumulation – which meant that individuals with more genetic load than average were significantly less likely to survive and reproduce. Basically, this happened through high child mortality, and in some cases by lower fitness in lower socioeconomic classes [starvation]. There was nothing fun about it.

Modern industrialized societies are failing on all three counts. Every population that can make a decent cuckoo clock has below-replacement fertility. The demographic pattern also selects against intelligence, something like one IQ point a generation. And, even if people at every level of intelligence had the same number of children, so that there was no selection against IQ, we would still be getting more and more messed up, because there’s not enough selection going on to counter ongoing mutations.
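As a rough sanity check on the "one IQ point a generation" figure (the heritability value below is a conventional ballpark, not something the post asserts): by the breeder's equation $R = h^2 S$, a response of $R = -1$ point per generation at narrow-sense heritability $h^2 \approx 0.5$ corresponds to a selection differential of

```latex
S = \frac{R}{h^2} = \frac{-1}{0.5} = -2 \ \text{points},
```

i.e., the parents of each generation averaging about two points below their cohort's mean; at roughly four generations per century, that compounds to on the order of four points per century.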

It is possible that some country, or countries, will change in a way that avoids civilizational collapse. I doubt if this will happen by voluntary action. Some sort of technological solution might also arise – but it has to be soon.

Bruce Charlton, Victorian IQ, Episcopalians, military officers:
Again, I don’t believe a word of it. As for the declining rate of innovation, you have to have a really wide-ranging understanding of modern science and technology to have any feeling for what the underlying causes are. I come closer than most, and I probably don’t know enough. You don’t know enough. Let me tell you one thing: if genetic potential for IQ had dropped 1 std, we’d see the end of progress in higher mathematics, and that has not happened at all.

Moreover, the selective trends disfavoring IQ all involve higher education among women and apparently nothing else – a trend which didn’t really get started until much more recently.

Not long enough, nor is dysgenic selection strong enough.

ranting on libertarians:
About 40% of those Americans with credit cards keep a balance on their credit cards and pay ridiculous high interest. But that must be the right decision!
“Then that is their decision” – that’s fucking obvious. The question is whether they tend to make decisions that work very well – saying “that is their decision” is exactly the kind of crap I was referring to. As for “they probably have it coming” – if I’m smarter than you, which I surely am, using those smarts to rook you in every possible way must be just peachy. In fact, I’ll bet I could manage it even after warning you in advance.

On average, families in this country have paid between 10% and 14% of their income in debt service over the past few decades. That fraction averages considerably higher in low-income families – more like 18%. A quarter of those low income families are putting over 40% of their income into debt service. That’s mostly stuff other than credit-card debt.

Is this Straussian?

Examining Arguments Made by Interest Rate Cap Advocates:
Interest rate caps on $1,000 installment loans, by US state, today and in 1935
west-hunter  civilization  dysgenics  fertility  legacy  risk  mutation  genetic-load  discussion  rant  iq  demographics  gnon  sapiens  trends  malthus  leviathan  long-short-run  science-anxiety  error  biodet  duty  s:*  malaise  big-picture  debt  randy-ayndy  recent-selection  demographic-transition  order-disorder  deep-materialism  🌞  age-generation  scitariat  rhythm  allodium  behavioral-gen  nihil  zeitgeist  rot  the-bones  prudence  darwinian  flux-stasis  counter-revolution  modernity  microfoundations  multi  poast  civil-liberty  is-ought  track-record  time-preference  temperance  patience  antidemos  money  compensation  class  coming-apart  pro-rata  behavioral-econ  blowhards  history  early-modern  britain  religion  christianity  protestant-catholic  gender  science  innovation  frontier  the-trenches  speedometer  military  elite  optimate  data  intervention  aphorism  alt-inst  ethics  morality  straussian  intelligence  class-warfare  authoritarianism  hari-seldon  interests  crooked  twitter  social  back 
march 2017 by nhaliday
Social Epistasis Amplifies the Fitness Costs of Deleterious Mutations, Engendering Rapid Fitness Decline Among Modernized Populations | SpringerLink
- Michael A. Woodley

We argue that in social species, interorganismal gene-gene interactions, which in previous literatures have been termed social epistasis, allow genomes carrying deleterious mutations to reduce via group-level pleiotropy the fitness of others, including noncarriers. This fitness reduction occurs by way of degradation of group-level processes that optimize the reproductive ecology of a population for intergroup competition through, among other mechanisms, suppression of free-riding.


Fitness indicators theory (Houle 2000; Miller 2000) predicts that the behavioral and physiological condition of prospective partners strongly influences female mate choice in particular, as these constitute honest indicators of underlying genetic quality. Furthermore, as deleterious mutations are pleiotropic (i.e., they can influence the development of multiple traits simultaneously), they are a source of genetic correlation among diverse behavioral and physiological domains, yielding a latent general fitness factor( f ). This optimizes the efficiency of sexual selection, as selection for quality with respect to one domain will increase the probability of selection for quality “across the board” (Houle 2000; Miller 2000). If purifying selection is primarily cryptic—working by virtue of those lower in f simply being less successful in competition for mates and therefore producing fewer offspring relative to those higher in the factor—then considerably less reproductive failure is needed to solve the mutation load paradox (19% instead of 88% based on simulations in Leseque et al. 2012).


Theoretical work involving humans suggests a loss of intrinsic fitness of around 1% per generation in the populations of modernized countries (Lynch 2016; Muller 1950). Thus, these might yet be undergoing mutational meltdown, albeit very gradually (i.e., over the course of centuries)
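"Very gradually" is easy to quantify: a 1% loss of intrinsic fitness per generation compounds multiplicatively, so after $n$ generations

```latex
w(n) = (1 - 0.01)^n \approx e^{-0.01\,n},
```

giving about 0.90 of the starting value after 10 generations (two to three centuries at 25–30 years per generation) and about 0.37 after 100.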


An interesting observation is that the fitness of the populations of modernized nations does appear to be rapidly decreasing—although not in a manner consonant with the direct action of deleterious mutations on the fitness of individuals (as per the mutation load paradox).


Increased education has furthermore encouraged individuals to trade fertility against opportunities to enhance their social status and earning power, with the largest fitness losses occurring among those with high status who potentially carry fewer deleterious mutations (i.e., by virtue of possessing higher levels of traits that exhibit some sensitivity to mutation load, such as general intelligence; Spain et al. 2015; Woodley of Menie et al. 2016a). Hitherto not considered is the possibility that the demographic transition represents a potential change in the fitness characteristics of the group-level extended phenotype of modernized populations, indicating that there might exist pathways through which deleterious mutations that accumulate due to ecological mildness could pathologically alter fertility tradeoffs in ways that might account for the maladaptive aspects of the fertility transition (e.g., subreplacement fertility; Basten, Lutz and Scherbov, 2013).


Cooperation, though offering significant fitness benefits to individual organisms and groups, involves some costs for cooperators in order to realize mutual gains for all parties. Free riders are individuals that benefit from cooperation without suffering any of the costs needed to sustain it. Hence, free riders enjoy a fitness advantage relative to cooperators via the former’s parasitism on the latter.


The balance of selection can alternate between the different levels depending on the sorts of selective challenges that a population encounters. For example, group selection may operate on human populations during times of intergroup conflict (i.e., warfare), whereas during times of peace, selection may tend to favor the fitness of individuals instead (Woodley and Figueredo 2013; Wilson 2002). A major factor that seems to permit group-level selection to be viable under certain ecological regimes is the existence of free-rider controls, i.e., features of the group’s social ecology that curb the reproductive fitness of the carriers of “selfish” genetic variants (MacDonald 1994; Wilson 2002).


High-status individuals participate in the generation and vertical cultural transmission of free-rider controls—these take the form of religious and ideological systems which make a virtue out of behaviors that overtly benefit the group, and a vice out of those that only favor individual-level fitness, via the promotion of ethnocentrism, martyrdom, and displays of commitment (MacDonald 1994, 2009, 2010; Wilson 2002). Humans are furthermore equipped with specialized mental adaptations for coordinating as part of a group, such as effortful control—the ability to override implicit behavioral drives via the use of explicit processing systems, which allow them to regulate their behavior based on what is optimal for the group (MacDonald 2008). The interaction between individuals of different degrees of status, i.e., those that generate and maintain cultural norms and those who are merely subject to them, therefore constitutes a form of social epistasis, as the complex patterns of interactions among genomes that characterize human culture have the effect of regulating both individual- and group-level (via the curbing of free-riding) fitness (MacDonald 2009, 2010).

Mutations that push the behavior of high-status individuals away from the promotion of group-selected norms may promote a breakdown of or otherwise alter these social epistatic interactions, causing dysregulation of the group’s reproductive ecology. Behavioral changes are furthermore a highly likely consequence of mutation accumulation, as “behavior” (construed broadly) is a large potential target for new mutations (Miller 2000; Lynch 2016) 1 owing to the fact that approximately 84% of all genes in the human genome are involved in some aspect of brain development and/or maintenance (Hawrylycz et al. 2012).

Consistent with the theorized role of group-level (cultural) regulatory processes in the maintenance of fitness optima, positive correlations exist between religiosity (a major freerider control; MacDonald 1994; Wilson 2002) and fertility, both at the individual differences and cross-cultural levels (Meisenberg 2011). Religiosity has declined in modernized nations—a process that has gone hand-in-hand with the rise of a values system called postmaterialism (Inglehart 1977), which is characterized by the proliferation of individualistic, secular, and antihierarchical values (Welzel 2013). The holding of these values is negatively associated with fertility, both at the individual level (when measured as political liberalism; Goldstone et al. 2011) and across time and cultures (Inglehart and Appel 1989). The rise of postmaterialist values is also associated with increasingly delayed onset of reproduction (Klien 1990) which directly increases the (population) mutation load.

Pathological Altruism

Some of the values embodied in postmaterialism have been linked to the pathological altruism phenomenon, i.e., forms of altruism that damage the intended recipients or givers of largesse (Oakley et al. 2012; Oakley 2013). Virtues associated with altruism such as kindness, fidelity, magnanimity, and heroism, along with quasi-moral traits associated with personality and mental health, may be under sexual selection and might therefore be sensitive, through the f factor, to the deleterious effects of accumulating mutations (Miller 2007).


Another form of pathologically altruistic behavior that Oakley (2013) documents is self-righteousness, which may be increasing, consistent with secular trend data indicating elevated levels of self-regarding behavior among Western populations (sometimes called the narcissism epidemic; Twenge and Campbell 2009). This sort of behavior constitutes a key component of the clever silly phenomenon in which the embrace of counterfactual beliefs is used to leverage social status via virtue signaling (e.g., the conflation of moral equality among individuals, sexes, and populations with biological equality) (Dutton and van der Linden 2015; Charlton 2009; Woodley 2010). There may be a greater number of influential persons inclined to disseminate such beliefs, in that the prevalence of phenotypes disposed toward egoistic behaviors may have increased in Western populations (per Twenge and coworkers’ research), and because egoists, specifically Machiavellians and narcissists, appear advantaged in the acquisition of elite societal stations (Spurk et al. 2015).

[Do Bad Guys Get Ahead or Fall Behind? Relationships of the Dark Triad of Personality With Objective and Subjective Career Success:

After controlling for other relevant variables (i.e., gender, age, job tenure, organization size, education, and work hours), narcissism was positively related to salary, Machiavellianism was positively related to leadership position and career satisfaction, and psychopathy was negatively related to all analyzed outcomes.]


By altering cultural norms, elite egoists may encourage the efflorescence of selfish behaviors against which some older and once highly influential cultural systems acted. For example, Christianity in various forms strongly promoted personal sacrifice for the good of groups and proscribed egoistic behaviors (Rubin 2015), but has declined significantly in terms of cultural power following modernization (Inglehart 1977). Thus, it is possible that a feedback loop exists wherein deleterious mutation accumulation raises population levels of egoism, either directly or indirectly, via the breakdown of developmental constraints on personality canalization; the resultantly greater number of egoists are then able to exploit relevant personality traits to attain positions of sociocultural influence; and through these … [more]
study  speculation  models  biodet  bio  sapiens  evolution  genetic-load  paternal-age  the-monster  slippery-slope  society  social-structure  free-riding  coordination  EGT  dynamical  🌞  fertility  dysgenics  eh  self-control  obesity  altruism  mutation  multi  twitter  social  commentary  perturbation  gnon  new-religion  science-anxiety  population-genetics  biophysical-econ  hmm  discipline  autism  scitariat  clown-world  epidemiology  malaise  sociology  demographic-transition  blowhards  model-organism  nonlinearity  civilization  expression-survival  universalism-particularism  order-disorder  trends  deep-materialism  values  ideology  domestication  cohesion  christopher-lasch  scale  patho-altruism  social-capital  behavioral-gen  madisonian  chart  nihil  aristos  piracy  theos  cultural-dynamics  roots  zeitgeist  rot  the-bones  counter-revolution  pdf  modernity  microfoundations  video  presentation  religion  christianity  health  longevity  ethnocentrism  genetic-correlation  👽  instinct 
march 2017 by nhaliday
Arthur Schopenhauer: Chapter XXIV, On Reading and Books
Ignorance degrades a man only when it is found in company with wealth. A poor man is subdued by his poverty and distress; with him his work takes the place of knowledge and occupies his thoughts. On the other hand, the wealthy who are ignorant live merely for their pleasures and are like animals, as can be seen every day. Moreover, there is the reproach that wealth and leisure have not been used for that which bestows on them the greatest possible value.

When we read, someone else thinks for us; we repeat merely his mental process. It is like the pupil who, when learning to write, goes over with his pen the strokes made in pencil by the teacher. Accordingly, when we read, the work of thinking is for the most part taken away from us. Hence the noticeable relief when from preoccupation with our thoughts we pass to reading. But while we are reading our mind is really only the playground of other people’s ideas; and when these finally depart, what remains? The result is that, whoever reads very much and almost the entire day but at intervals amuses himself with thoughtless pastime, gradually loses the ability to think for himself; just as a man who always rides ultimately forgets how to walk. But such is the case with very many scholars; they have read themselves stupid. [...]
pdf  essay  big-peeps  history  early-modern  europe  philosophy  info-foraging  rant  rhetoric  meta:rhetoric  signal-noise  prioritizing  literature  learning  info-dynamics  aristos  lol  aphorism  germanic  the-classics  attention  studying  blowhards  retention  class  realness  thinking  neurons  letters 
march 2017 by nhaliday
Neurodiversity | West Hunter
Having an accurate evaluation of a syndrome as a generally bad thing isn’t equivalent to attacking those with that syndrome. Being a leper is a bad thing, not just another wonderful flavor of humanity [insert hot tub joke], but that doesn’t mean that we have to spend our spare time playing practical jokes on lepers, tempting though that is. Leper hockey. We can cure leprosy, and we are right to do so. Preventing deafness through rubella vaccination was the right thing too – deafness sucks. And so on. As we get better at treating and preventing, humans are going to get more uniform – and that’s a good thing. Back to normalcy!

interesting discussion of mutational load:
I was thinking again about the consequences of having more small-effect deleterious mutations than average. I don’t think that they would push hard in a particular direction in phenotype space – I don’t believe they would make you look weird, but by definition they would be bad for you, reduce fitness. I remembered a passage in a book by Steve Stirling, in which our heroine felt as if her brain ‘was moving like a mechanism of jewels and steel precisely formed.’ It strikes me that a person with an extra dollop of this kind of genetic load wouldn’t feel like that. And of course that heroine did have low genetic load, being the product of millennia of selective breeding, not to mention an extra boost from the Invisible Crown.
Well, what does the distribution of fitness burden by frequency look like for deleterious mutations of a given fitness penalty?
It’s proportional to the mutation rate for that class. There is reason to believe that there are more ways to moderately or slightly screw up a protein than to really ruin it, which indicates that mild mutations make up most load in protein-coding sequences. More of the genome is made up of conserved regulatory sequences, but mutations there probably have even milder effects, since few mutations in non-coding sequences cause a serious Mendelian disease.
I have wondered if there was some sort of evolutionary tradeoff between muscles and brains over the past hundred thousand years through dystrophin’s dual role. There is some evidence of recent positive selection among proteins that interact with dystrophin, such as DTNBP1 and DTNA.

Any novel environment where higher intelligence can accrue more caloric energy than brute strength alone (see: the invention of the bow) should relax the selection pressure for muscularity. The Neanderthals didn’t fare so well with the brute strength strategy.
Sure: that’s what you might call an inevitable tradeoff, a consequence of the laws of physics. Just as big guys need more food. But because of the way our biochemistry is wired, there can be tradeoffs that exist but are not inevitable consequences of the laws of physics – particularly likely when a gene has two fairly different functions, as they often do.
west-hunter  discussion  morality  philosophy  evolution  sapiens  psychology  psychiatry  disease  neuro  scitariat  ideology  rhetoric  diversity  prudence  genetic-load  autism  focus  👽  multi  poast  mutation  equilibrium  scifi-fantasy  rant  🌞  paternal-age  perturbation  nibble  ideas  iq  quotes  aphorism  enhancement  signal-noise  blowhards  dysgenics  data  distribution  objektbuch  tradeoffs  embodied  speculation  metabolic  volo-avolo  degrees-of-freedom  race  africa  genetics  genomics  bio  QTL  population-genetics  stylized-facts  britain  history  early-modern  pre-ww2  galton  old-anglo  giants  industrial-revolution  neuro-nitgrit  recent-selection  selection  medicine  darwinian  strategy  egalitarianism-hierarchy  CRISPR  biotech  definition  reflection  poetry  deep-materialism  EGT  discrimination  conceptual-vocab  psycho-atoms 
february 2017 by nhaliday
The Art of Being Right - Wikipedia
The Art of Being Right: 38 Ways to Win an Argument (also Eristic Dialectic: The Art of Winning an Argument; German: Eristische Dialektik: Die Kunst, Recht zu behalten; 1831) is an acidulous treatise written by the German philosopher Arthur Schopenhauer in sarcastic deadpan.[1] In it, Schopenhauer examines a total of thirty-eight methods of showing up one's opponent in a debate.
history  early-modern  europe  germanic  philosophy  negotiation  dark-arts  meta:rhetoric  books  essay  wiki  persuasion  big-peeps  info-dynamics  aristos  blowhards 
february 2017 by nhaliday
Holocene selection for variants associated with cognitive ability: Comparing ancient and modern genomes. | bioRxiv
- Michael Woodley

Human populations living in Eurasia during the Holocene experienced significant evolutionary change. It has been predicted that the transition of Holocene populations into agrarianism and urbanization brought about culture-gene co-evolution that favoured via directional selection genetic variants associated with higher general cognitive ability (GCA).
These observations are consistent with the expectation that GCA rose during the Holocene.
study  preprint  bio  sapiens  genetics  genomics  GWAS  antiquity  trends  iq  dysgenics  recent-selection  aDNA  multi  hn  commentary  gwern  enhancement  evolution  blowhards  behavioral-gen 
february 2017 by nhaliday
2013 : WHAT *SHOULD* WE BE WORRIED ABOUT?
- Chinese eugenics [Geoffrey Miller. Pretty weird take. ("30 years running"? No.)]
- finance [Seth Lloyd]
- demographic collapse
- quantum mechanics [Lee Smolin]
- technology endangering democracy
- "idiocracy looming"
- the Two Culture and the nature-nurture debate [Simon Baron-Cohen]
- "the real risk factors for war" [Pinker]
org:edge  frontier  uncertainty  risk  discussion  list  top-n  multi  planning  big-picture  prediction  links  spearhead  blowhards  pinker  technology  fertility  dysgenics  trends  finance  culture-war  postmortem  2013  enhancement  aversion  democracy  q-n-a  metameta  zeitgeist  speedometer  questions 
january 2017 by nhaliday
Marginal Restoration | Veracity is the heart of morality
“The Great Liberal Death-Wish” (1966)
- Malcolm Muggeridge

The readiest explanation in years to come of this evident contradiction between the objectives and consequences of liberalism is likely, I should have thought, to be that, despite its seemingly sanguine and benevolent character, liberalism in reality represented a collective death-wish. Like individuals, civilizations in decline, consciously or unconsciously, want to be extinguished; liberalism is the primrose-path to extinction. There is a story (probably apocryphal) that in the days of the Third Reich a Nazi procession included a contingent of liberal intellectuals bearing the banner: ‘Down With Us!’ Had they but known it, they were speaking on behalf of all liberals everywhere.


Orwell, in his enchanting fable Animal Farm, in his brilliant analysis of double-speak and double-think as projected by the Ministry of Truth (based, as he told me, not on a Nazi or Fascist or Soviet model, but on the BBC), worked it all out superbly in imaginative detail. He made only one mistake. He envisaged the nightmare as being imposed by ruthlessly efficient power-maniacs, not realizing that it had been born and nourished in the finest, most civilized, and most humane minds of our time, including his own. For our Dark Ages, it is we ourselves who are turning out the lights, fondly supposing that we are turning them on.
I am a senior applied math major at Yale.
I’m interested in demography, sociology, economics, moral psychology, and political theory.
blog  stream  gnon  politics  culture-war  migration  wonkish  ideology  unaffiliated  right-wing  multi  tumblr  social  backup  quotes  essay  rhetoric  polisci  vitality  rot  zeitgeist  civil-liberty  big-peeps  journos-pundits  old-anglo  literature  fiction  civilization  occident  the-great-west-whale  people  track-record  blowhards  nietzschean  history  mostly-modern  cold-war  nihil  death  orwellian  britain  usa  anglosphere  morality  gender  sex  sexuality  democracy  tocqueville  duty 
december 2016 by nhaliday
Y-chromosome crash | West Hunter
there probably wasn't vast reproductive inequality ("17 to 1! woah") in the Bronze Age, and there wouldn't have to be to explain observed genetic patterns

comment on TFR gradients in Malthusian conditions:
“By contrast, the average number of surviving children for the majority of men was probably somewhere between zero and one – despite that they were having sex and babies.”

Fuck me, that’s obviously ridiculous. In real life, take a peasant village in England: if your model were correct, you’d have surname turnover every couple of generations. But that didn’t happen.

Here’s a model that’s at least in the ballpark: there was some class differential in fitness. The poorest, landless laborers had a TFR below replacement, but not by a tremendous amount: 1.6? Most peasants were close to break-even; upper farmers did better than break-even. Other groups were mostly too small in number or too urban (population sinks) to matter. Overall TFR was of course break-even over the moderately long haul, in a sloppy way, with occasional epidemics and crop failures.
west-hunter  sapiens  antiquity  regularizer  speculation  gavisti  explanation  thinking  🌞  sex  gender  male-variability  winner-take-all  inequality  pop-structure  science-anxiety  scitariat  nietzschean  sexuality  gender-diff  null-result  deep-materialism  EEA  history  multi  aDNA  archaeology  conquest-empire  china  asia  genetics  genomics  poast  fertility  medieval  britain  demographics  malthus  class  correlation  blowhards  traces 
november 2016 by nhaliday
Mandelbrot (and Hudson’s) The (mis)Behaviour of Markets: A Fractal View of Risk, Ruin, and Reward | EVOLVING ECONOMICS
If you have read Nassim Taleb’s The Black Swan you will have come across some of Benoit Mandelbrot’s ideas. However, Mandelbrot and Hudson’s The (mis)Behaviour of Markets: A Fractal View of Risk, Ruin, and Reward offers a much clearer critique of the underpinnings of modern financial theory (there are many parts of The Black Swan where I’m still not sure I understand what Taleb is saying). Mandelbrot describes and pulls apart the contributions of Markowitz, Sharpe, Black, Scholes and friends in a way likely understandable to the intelligent lay reader. I expect that might flow from science journalist Richard Hudson’s involvement in writing the book.

- interesting parable about lakes and markets (but power laws aren't memoryless...?)
- yeah I think that's completely wrong actually. the important property of power laws is the lack of finite higher-order moments.

based off I think he really did mean a power law (x = 100/sqrt(r) => pdf is p(x) ~ |dr/dx| = 2e4/x^3)

edit: ah I get it now, for X ~ p(x) = 2/x^3 on [1,inf), we have E[X|X > k] = 2k, so not memoryless, but rather subject to a "slippery slope"
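Spelling out that computation for the normalized density $p(x) = 2/x^3$ on $[1,\infty)$, to confirm the $2k$ claim:

```latex
% Tail probability and conditional expectation for p(x) = 2/x^3 on [1, \infty)
\[
  \Pr(X > k) \;=\; \int_k^\infty \frac{2}{x^3}\,dx \;=\; \frac{1}{k^2},
  \qquad
  \mathbb{E}[X \mid X > k]
  \;=\; \frac{\int_k^\infty x \cdot \frac{2}{x^3}\,dx}{\Pr(X > k)}
  \;=\; \frac{2/k}{1/k^2}
  \;=\; 2k .
\]
```

So the expected size, conditional on exceeding $k$, grows with $k$ — the "slippery slope" noted above, in contrast to the memoryless exponential where it would stay a constant offset past $k$.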
books  summary  finance  map-territory  tetlock  review  econotariat  distribution  parable  blowhards  multi  risk  decision-theory  tails  meta:prediction  complex-systems  broad-econ  power-law 
november 2016 by nhaliday
What You Can't Say
E Pur Si Muove:

Sam Altman and the fear of political correctness:
Earlier this year, I noticed something in China that really surprised me. I realized I felt more comfortable discussing controversial ideas in Beijing than in San Francisco. I didn’t feel completely comfortable—this was China, after all—just more comfortable than at home.

That showed me just how bad things have become, and how much things have changed since I first got started here in 2005.

It seems easier to accidentally speak heresies in San Francisco every year. Debating a controversial idea, even if you 95% agree with the consensus side, seems ill-advised.
And so it runs with shadow prices for speech, including rights to say things and to ask questions. Whatever you are free to say in America, you have said many times already, and the marginal value of exercising that freedom yet again doesn’t seem so high. But you show up in China, and wow, your pent-up urges are not forbidden topics any more. Just do be careful with your mentions of Uncle Xi, Taiwan, Tibet, Uighur terrorists, and disappearing generals. That said, in downtown Berkeley you can speculate rather freely on whether China will someday end up as a Christian nation, and hardly anybody will be offended.

For this reason, where we live typically seems especially unfree when it comes to speech. And when I am in China, I usually have so, so many new dishes I want to sample, including chestnuts and pumpkin.


Baidu's Robin Li is Helping China Win the 21st Century:
Therein lies the contradiction at the heart of China’s efforts to forge the future: the country has the world’s most severe restrictions on Internet freedom, according to advocacy group Freedom House. China employs a highly sophisticated censorship apparatus, dubbed the Great Firewall, to snuff out any content deemed critical or inappropriate. Google, Facebook and Twitter, as well as news portals like the New York Times, Bloomberg and TIME, are banned. Manned by an army of 2 million online censors, the Great Firewall gives outsiders the impression of deathly silence within.

But in fact, business thrives inside the firewall’s confines–on its guardians’ terms, of course–and the restrictions have not appeared to stymie progress. “It turns out you don’t need to know the truth of what happened in Tiananmen Square to develop a great smartphone app,” says Kaiser Kuo, formerly Baidu’s head of international communications and a co-host of Sinica, an authoritative podcast on China. “There is a deep hubris in the West about this.” The central government in Beijing has a fearsome capacity to get things done and is willing to back its policy priorities with hard cash. The benefits for companies willing or able to go along with its whims are clear. The question for Baidu–and for Li–is how far it is willing to go.

Silicon Valley would be wise to follow China’s lead:
The work ethic in Chinese tech companies far outpaces their US rivals

The declaration by Didi, the Chinese ride-hailing company, that delivery business Meituan’s decision to launch a rival service would spark “the war of the century”, throws the intensive competition between the country’s technology companies into stark relief.

The call to arms will certainly act as a spur for Didi employees, although it is difficult to see how they can work even harder. But what it does reveal is the striking contrast between working life in China’s technology companies and their counterparts in the west.

In California, the blogosphere has been full of chatter about the inequity of life. Some of this, especially for women, is true and for certain individuals their day of reckoning has been long overdue. But many of the soul-sapping discussions seem like unwarranted distractions. In recent months, there have been complaints about the political sensibilities of speakers invited to address a corporate audience; debates over the appropriate length of paternity leave or work-life balances; and grumbling about the need for a space for musical jam sessions. These seem like the concerns of a society that is becoming unhinged.


While male chauvinism is still common in the home, women have an easier time gaining recognition and respect in China’s technology workplaces — although they are still seriously under-represented in the senior ranks. Many of these high-flyers only see their children — who are often raised by a grandmother or nanny — for a few minutes a day. There are even examples of husbands, eager to spend time with their wives, who travel with them on business trips as a way to maintain contact.
What I learned from 5 weeks in Beijing + Shanghai:

- startup creation + velocity dwarfs anything in SF
- no one in China I met is remotely worried about U.S. or possibly even cares
- access to capital is crazy
- scale feels about 20x of SF
- endless energy
- not SV jaded
Western values are freeriding on Western innovation.
Comparatively unimpeded pursuit of curiosity into innovation is a Western value that pays the carriage fare.
True. A lot of values are worthwhile in certain contexts but should never have been scaled.

Diversity, "social mobility", iconoclasm
but due to military and technological victory over its competitors
There's something to be said for Western social trust as well, though that's an institution more than an idea
essay  yc  culture  society  philosophy  reflection  contrarianism  meta:rhetoric  thiel  embedded-cognition  paulg  water  🖥  techtariat  barons  info-dynamics  realness  truth  straussian  open-closed  preference-falsification  individualism-collectivism  courage  orwellian  multi  backup  econotariat  marginal-rev  commentary  links  quotes  hard-tech  skunkworks  enhancement  genetics  biotech  sv  tech  trends  civil-liberty  exit-voice  longevity  environment  innovation  frontier  politics  identity-politics  zeitgeist  china  asia  sinosphere  censorship  news  org:lite  org:biz  debate  twitter  social  social-norms  gender  sex  sexuality  org:med  blowhards  drama  google  poll  descriptive  values  rot  humility  tradeoffs  government  the-great-west-whale  internet  occident  org:rec  org:anglo  venture  vitality  gibbon  competition  investing  martial  discussion  albion  journos-pundits  europe  ideology  free-riding  degrees-of-freedom  land  gnon  peace-violence  diversity  mobility  tradition  reason  curiosity  trust  n-factor  institutions  th 
october 2016 by nhaliday
command center: Prints
rob pike on the ephemerality of the digital and the reliability of the physical
planning  futurism  photography  lifestyle  reflection  rsc  techtariat  blowhards  cracker-prog  flux-stasis  spreading  nihil  bits  interface-compatibility  time  sequential 
july 2016 by nhaliday
command center: The byte order fallacy
Whenever I see code that asks what the native byte order is, it's almost certain the code is either wrong or misguided. And if the native byte order really does matter to the execution of the program, it's almost certain to be dealing with some external software that is either wrong or misguided. If your code contains #ifdef BIG_ENDIAN or the equivalent, you need to unlearn about byte order.
programming  systems  techtariat  rsc  c(pp)  blowhards  cracker-prog  nitty-gritty  tip-of-tongue  rant  direction 
august 2014 by nhaliday
Rob Pike: Notes on Programming in C
Issues of typography
Sometimes they care too much: pretty printers mechanically produce pretty output that accentuates irrelevant detail in the program, which is as sensible as putting all the prepositions in English text in bold font. Although many people think programs should look like the Algol-68 report (and some systems even require you to edit programs in that style), a clear program is not made any clearer by such presentation, and a bad program is only made laughable.
Typographic conventions consistently held are important to clear presentation, of course - indentation is probably the best known and most useful example - but when the ink obscures the intent, typography has taken over.


Finally, I prefer minimum-length but maximum-information names, and then let the context fill in the rest. Globals, for instance, typically have little context when they are used, so their names need to be relatively evocative. Thus I say maxphysaddr (not MaximumPhysicalAddress) for a global variable, but np not NodePointer for a pointer locally defined and used. This is largely a matter of taste, but taste is relevant to clarity.


C is unusual in that it allows pointers to point to anything. Pointers are sharp tools, and like any such tool, used well they can be delightfully productive, but used badly they can do great damage (I sunk a wood chisel into my thumb a few days before writing this). Pointers have a bad reputation in academia, because they are considered too dangerous, dirty somehow. But I think they are powerful notation, which means they can help us express ourselves clearly.
Consider: When you have a pointer to an object, it is a name for exactly that object and no other.


A delicate matter, requiring taste and judgement. I tend to err on the side of eliminating comments, for several reasons. First, if the code is clear, and uses good type names and variable names, it should explain itself. Second, comments aren't checked by the compiler, so there is no guarantee they're right, especially after the code is modified. A misleading comment can be very confusing. Third, the issue of typography: comments clutter code.
But I do comment sometimes. Almost exclusively, I use them as an introduction to what follows.


Most programs are too complicated - that is, more complex than they need to be to solve their problems efficiently. Why? Mostly it's because of bad design, but I will skip that issue here because it's a big one. But programs are often complicated at the microscopic level, and that is something I can address here.
Rule 1. You can't tell where a program is going to spend its time. Bottlenecks occur in surprising places, so don't try to second guess and put in a speed hack until you've proven that's where the bottleneck is.

Rule 2. Measure. Don't tune for speed until you've measured, and even then don't unless one part of the code overwhelms the rest.

Rule 3. Fancy algorithms are slow when n is small, and n is usually small. Fancy algorithms have big constants. Until you know that n is frequently going to be big, don't get fancy. (Even if n does get big, use Rule 2 first.) For example, binary trees are always faster than splay trees for workaday problems.

Rule 4. Fancy algorithms are buggier than simple ones, and they're much harder to implement. Use simple algorithms as well as simple data structures.

The following data structures are a complete list for almost all practical programs:

linked list
hash table
binary tree
Of course, you must also be prepared to collect these into compound data structures. For instance, a symbol table might be implemented as a hash table containing linked lists of arrays of characters.
Rule 5. Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming. (See The Mythical Man-Month: Essays on Software Engineering by F. P. Brooks, page 102.)

Rule 6. There is no Rule 6.

Programming with data.
One of the reasons data-driven programs are not common, at least among beginners, is the tyranny of Pascal. Pascal, like its creator, believes firmly in the separation of code and data. It therefore (at least in its original form) has no ability to create initialized data. This flies in the face of the theories of Turing and von Neumann, which define the basic principles of the stored-program computer. Code and data are the same, or at least they can be. How else can you explain how a compiler works? (Functional languages have a similar problem with I/O.)

Function pointers
Another result of the tyranny of Pascal is that beginners don't use function pointers. (You can't have function-valued variables in Pascal.) Using function pointers to encode complexity has some interesting properties.
Some of the complexity is passed to the routine pointed to. The routine must obey some standard protocol - it's one of a set of routines invoked identically - but beyond that, what it does is its business alone. The complexity is distributed.

There is this idea of a protocol, in that all functions used similarly must behave similarly. This makes for easy documentation, testing, growth and even making the program run distributed over a network - the protocol can be encoded as remote procedure calls.

I argue that clear use of function pointers is the heart of object-oriented programming. Given a set of operations you want to perform on data, and a set of data types you want to respond to those operations, the easiest way to put the program together is with a group of function pointers for each type. This, in a nutshell, defines class and method. The O-O languages give you more of course - prettier syntax, derived types and so on - but conceptually they provide little extra.


Include files
Simple rule: include files should never include include files. If instead they state (in comments or implicitly) what files they need to have included first, the problem of deciding which files to include is pushed to the user (programmer) but in a way that's easy to handle and that, by construction, avoids multiple inclusions. Multiple inclusions are a bane of systems programming. It's not rare to have files included five or more times to compile a single C source file. The Unix /usr/include/sys stuff is terrible this way.
There's a little dance involving #ifdef's that can prevent a file being read twice, but it's usually done wrong in practice - the #ifdef's are in the file itself, not the file that includes it. The result is often thousands of needless lines of code passing through the lexical analyzer, which is (in good compilers) the most expensive phase.

Just follow the simple rule.
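The recommended layout can be sketched as a two-file arrangement (file names are hypothetical). Note where the guard lives — in the includer, not the header — so the lexer never sees the file a second time:

```c
/* dep.h -- no guard inside; states its needs in a comment */
/* needs: <stddef.h> included first */
typedef struct Dep { size_t n; } Dep;

/* user.c -- the includer does the #ifdef dance, once, at the
 * point of inclusion, so the file is never even opened twice */
#include <stddef.h>
#ifndef DEP_H_INCLUDED
#define DEP_H_INCLUDED
#include "dep.h"
#endif
```

This is a sketch of the convention, not a drop-in pattern: it only works if every includer plays along, which is the discipline Pike's "simple rule" asks for.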

First, I don't think it actually is true: in many compilers, most time is not spent in lexing source code. For example, in C++ compilers (e.g. g++), most time is spent in semantic analysis, in particular in overload resolution (trying to find out what implicit template instantiations to perform). Also, in C and C++, most time is often spent in optimization (creating graph representations of individual functions or the whole translation unit, and then running long algorithms on these graphs).

When comparing lexical and syntactical analysis, it may indeed be the case that lexical analysis is more expensive. This is because both use state machines, i.e. there is a fixed number of actions per element, but the number of elements is much larger in lexical analysis (characters) than in syntactical analysis (tokens).
programming  systems  philosophy  c(pp)  summer-2014  intricacy  engineering  rhetoric  contrarianism  diogenes  parsimony  worse-is-better/the-right-thing  data-structures  list  algorithms  stylized-facts  essay  ideas  performance  functional  state  pls  oop  gotchas  blowhards  duplication  compilers  syntax  lexical  checklists  metabuch  lens  notation  thinking  neurons  guide  pareto  heuristic  time  cost-benefit  multi  q-n-a  stackex  plt  hn  commentary  minimalism  techtariat  rsc  writing  technical-writing  cracker-prog  code-organizing  grokkability  protocol-metadata  direct-indirect  grokkability-clarity  latency-throughput 
august 2014 by nhaliday
