
optimization

java - Why is it faster to process a sorted array than an unsorted array? - Stack Overflow
Another answer mentions an interesting related optimization, loop-invariant hoisting: computations whose result does not change between iterations are moved out of the loop (a minimal sketch follows below).
programming  optimization 
6 hours ago by Babygun
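A minimal sketch of the hoisting idea mentioned in the note (in Python rather than the question's Java; the function and variable names are made up for illustration): a computation whose result does not change across iterations is moved out of the loop so it runs once instead of once per iteration.

import math

def slow_sum(values, scale):
    total = 0.0
    for v in values:
        # math.log(scale) is loop-invariant: it is needlessly recomputed every iteration
        total += v * math.log(scale)
    return total

def hoisted_sum(values, scale):
    log_scale = math.log(scale)  # hoisted out of the loop: computed once
    total = 0.0
    for v in values:
        total += v * log_scale
    return total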
[1811.05008] Choosing to grow a graph: Modeling network formation as discrete choice
We provide a framework for modeling social network formation through conditional multinomial logit models from discrete choice and random utility theory, in which each new edge is viewed as a "choice" made by a node to connect to another node, based on (generic) features of the other nodes available to make a connection. This perspective on network formation unifies existing models such as preferential attachment, triadic closure, and node fitness, which are all special cases, and thereby provides a flexible means for conceptualizing, estimating, and comparing models. The lens of discrete choice theory also provides several new tools for analyzing social network formation; for example, mixtures of existing models can be estimated by adapting known expectation-maximization algorithms, and the significance of node features can be evaluated in a statistically rigorous manner. We demonstrate the flexibility of our framework through examples that analyze a number of synthetic and real-world datasets. For example, we provide rigorous methods for estimating preferential attachment models and show how to separate the effects of preferential attachment and triadic closure. Non-parametric estimates of the importance of degree show a highly linear trend, and we expose the importance of looking carefully at nodes with degree zero. Examining the formation of a large citation graph, we find evidence for an increased role of degree when accounting for age.

--seems related to some of M.O. Jackson's constructions...
networks  dynamics  optimization  game_theory  via:clauset 
18 hours ago by rvenkat
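A toy illustration of the conditional-logit view (not the paper's code; the single log-degree feature and the theta values are assumptions for this sketch): each new edge is a choice over candidate nodes with probability proportional to exp(theta * feature), so theta = 1 reduces to linear preferential attachment and theta = 0 to uniform attachment.

import numpy as np

def choice_probabilities(degrees, theta):
    # P(choose node j) is proportional to exp(theta * log(degree_j)) = degree_j ** theta
    utilities = theta * np.log(np.maximum(degrees, 1))  # crude guard for degree-zero nodes (the paper treats these more carefully)
    weights = np.exp(utilities - utilities.max())       # numerically stable softmax
    return weights / weights.sum()

degrees = np.array([1, 2, 5, 10])
print(choice_probabilities(degrees, theta=1.0))  # proportional to degree: preferential attachment
print(choice_probabilities(degrees, theta=0.0))  # uniform attachment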
Endlich besser schlafen - Dossier - tagesanzeiger.ch
Why good sleep really matters and how you can optimize it.
health  sleep  dossier  tagi  optimization  life  living 
yesterday by navegador
ImageOptim/ImageOptim: GUI image optimizer for Mac
GUI image optimizer for Mac.
image  compression  optimization  macos 
yesterday by reorx
kornelski/ImageAlpha: Mac GUI for pngquant, pngnq and posterizer
Mac GUI for pngquant, pngnq and posterizer.
image  compression  optimization  macos 
yesterday by reorx
The Amory Lovins bottleneck – keeping simple
"If you start by improving power efficiency of air-conditioning – a good thing in itself – you cannot obtain the scale improvements that can be gained on the other end of the pipeline by reducing the activities that use power and generate heat. That is, if you can increase work-done/computational-steps you drive savings up the pipeline. And the kind of large scale savings Lovins achieves in other industrial processes seem plausible: if you reduce power demand at the work end enough to reduce the inputs of cooling needed so that a smaller air conditioning unit can be used, you have a potentially greater savings than by improving the efficiency of the air conditioning unit."
optimization  programming 
yesterday by mechazoidal
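A back-of-the-envelope sketch of the cascade described in the quote, with made-up numbers and an assumed cooling coefficient of performance (COP): every watt dissipated by the computation must also be removed by the air conditioner, so a reduction at the work end saves on both terms, while a better air conditioner only shrinks the cooling term.

def total_power(work_watts, cop=3.0):
    cooling_watts = work_watts / cop   # power the air conditioner needs to remove the heat
    return work_watts + cooling_watts

baseline = total_power(1000.0)                 # 1 kW of computation plus its cooling: ~1333 W
better_cooling = total_power(1000.0, cop=4.0)  # improve only the air conditioner:     ~1250 W
less_work = total_power(700.0)                 # do the same job in 30% fewer watts:    ~933 W
print(baseline, better_cooling, less_work)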
[1810.12281] Three Mechanisms of Weight Decay Regularization
Weight decay is one of the standard tricks in the neural network toolbox, but the reasons for its regularization effect are poorly understood, and recent results have cast doubt on the traditional interpretation in terms of L2 regularization. Literal weight decay has been shown to outperform L2 regularization for optimizers for which they differ. We empirically investigate weight decay for three optimization algorithms (SGD, Adam, and K-FAC) and a variety of network architectures. We identify three distinct mechanisms by which weight decay exerts a regularization effect, depending on the particular optimization algorithm and architecture: (1) increasing the effective learning rate, (2) approximately regularizing the input-output Jacobian norm, and (3) reducing the effective damping coefficient for second-order optimization. Our results provide insight into how to improve the regularization of neural networks.
neural-net  optimization  regularization  weight-decay  l2  roger-grosse 
2 days ago by arsyed
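A rough sketch of the distinction the abstract draws (not the paper's code): plain SGD with an explicit L2 penalty versus literal ("decoupled") weight decay. For vanilla SGD the two updates coincide; for adaptive optimizers such as Adam, the L2 gradient term gets rescaled by the preconditioner while the decay term does not, which is why they can behave differently there.

import numpy as np

def sgd_l2_step(w, grad, lr, lam):
    # L2 regularization: add lam * w to the gradient before the update
    return w - lr * (grad + lam * w)

def sgd_weight_decay_step(w, grad, lr, lam):
    # literal weight decay: shrink the weights directly, independently of the gradient
    return w - lr * grad - lr * lam * w

w = np.array([1.0, -2.0])
g = np.array([0.1, 0.3])
print(sgd_l2_step(w, g, lr=0.1, lam=0.01))
print(sgd_weight_decay_step(w, g, lr=0.1, lam=0.01))  # identical output for plain SGD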
How To Optimize Your Website For Voice Search Assistant? Tips & Tricks
Nowadays audiences increasingly find products and services via voice search with Alexa, Google Assistant, Cortana, or Siri rather than by typing into the search bar. In January 2018, Alpine.AI calculated that over one billion voice searches happen per month.
voice  search  website  optimization  bestpractice 
2 days ago by gilberto5757
