optimization
Tailwind: Pinterest & Instagram Scheduler, Analytics & Marketing Tool
pinterest instagram post social media marketing optimization content
8 hours ago by kpieper876
Save time scheduling to Pinterest and Instagram, post at the best times for engagement, grow together with Tribes, get more actionable analytics. Start free!
Vue.js App Performance Optimization
14 hours ago by angusm
Part 1 of a multipart series on VueJS performance optimization.
vuejs performance optimization howtos webdev
[1902.06720] Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent
yesterday by cshalizi
"A longstanding goal in deep learning research has been to precisely characterize training and generalization. However, the often complex loss landscapes of neural networks have made a theory of learning dynamics elusive. In this work, we show that for wide neural networks the learning dynamics simplify considerably and that, in the infinite width limit, they are governed by a linear model obtained from the first-order Taylor expansion of the network around its initial parameters. Furthermore, mirroring the correspondence between wide Bayesian neural networks and Gaussian processes, gradient-based training of wide neural networks with a squared loss produces test set predictions drawn from a Gaussian process with a particular compositional kernel. While these theoretical results are only exact in the infinite width limit, we nevertheless find excellent empirical agreement between the predictions of the original network and those of the linearized version even for finite practically-sized networks. This agreement is robust across different architectures, optimization methods, and loss functions."
to:NB neural_networks optimization
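The first-order Taylor expansion the abstract describes can be checked numerically on a toy one-hidden-layer network. The sketch below (plain NumPy; `toy_net` and all other names are invented) illustrates only the Taylor linearization in parameter space, not the paper's NTK training dynamics:

```python
import numpy as np

# Toy check of the linearization claim: for a small parameter step, the network
# output barely leaves its first-order Taylor expansion
#   f_lin(theta, x) = f(theta0, x) + J(theta0, x) . (theta - theta0).

rng = np.random.default_rng(0)
width = 512

W1 = rng.standard_normal((width, 1))               # input weights (linearized)
w2 = rng.standard_normal(width) / np.sqrt(width)   # output weights (frozen)
x = np.array([0.7])

def toy_net(W1_params):
    """Scalar output of a one-hidden-layer tanh network at fixed input x."""
    return w2 @ np.tanh(W1_params @ x)

# Analytic Jacobian of the output w.r.t. W1 at the initial parameters:
# d out / d W1[i, j] = w2[i] * sech^2(pre[i]) * x[j].
pre = W1 @ x
jac = np.outer(w2 * (1.0 - np.tanh(pre) ** 2), x)

delta = 0.01 * rng.standard_normal((width, 1))     # a small "training" step
exact = toy_net(W1 + delta)
linear = toy_net(W1) + np.sum(jac * delta)

print(abs(exact - linear))   # second-order remainder, tiny for a small step
```

The paper's stronger claim is that for wide networks this agreement persists over an entire gradient-descent trajectory, not just one small step.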
pierreablin/autoptim: Automatic differentiation + optimization
yesterday by arthegall
I'm a little confused -- weren't people *always* writing packages like this? (PyTorch + ADiff)
automatic-differentiation pytorch python software optimization
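The pattern autoptim packages — automatic differentiation feeding a generic optimizer — can be sketched with no dependencies at all: a minimal forward-mode AD via dual numbers driving plain gradient descent. `Dual`, `grad`, and `minimize` are invented names for this sketch; autoptim itself pairs PyTorch-computed gradients with scipy.optimize.

```python
class Dual:
    """Dual number a + b*eps with eps^2 = 0; .der carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __rsub__(self, o):
        return Dual(o) - self
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def grad(f, x):
    """Derivative of scalar f at x via one forward-mode AD pass."""
    return f(Dual(x, 1.0)).der

def minimize(f, x0, lr=0.1, steps=200):
    """Gradient descent where every gradient comes from autodiff."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(f, x)
    return x

# Minimize (x - 3)^2 + 1: the minimum is at x = 3.
xmin = minimize(lambda x: (x - 3) * (x - 3) + 1, x0=0.0)
print(round(xmin, 4))  # → 3.0
```

So yes, the pieces have long been available; the value of a package like autoptim is the glue (parameter flattening, bridging tensors to the optimizer's expected arrays), not the derivative machinery itself.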
GitHub - plasma-umass/Mesh: A memory allocator that automatically reduces the memory footprint of C/C++ applications.
2 days ago by ddribin
"Mesh is a drop in replacement for malloc(3) that compacts the heap without rewriting application pointers."
programming c c++ memory fragmentation optimization
[1902.04738] Mesh: Compacting Memory Management for C/C++ Applications
2 days ago by ddribin
"Programs written in C/C++ can suffer from serious memory fragmentation, leading to low utilization of memory, degraded performance, and application failure due to memory exhaustion. This paper introduces Mesh, a plug-in replacement for malloc that, for the first time, eliminates fragmentation in unmodified C/C++ applications. Mesh combines novel randomized algorithms with widely-supported virtual memory operations to provably reduce fragmentation, breaking the classical Robson bounds with high probability. Mesh generally matches the runtime performance of state-of-the-art memory allocators while reducing memory consumption; in particular, it reduces the memory consumption of Firefox by 16% and Redis by 39%."
programming c c++ memory fragmentation optimization
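Roughly, the meshing idea rests on a simple predicate: two spans holding objects of the same size class can be merged when their occupancy bitmaps are disjoint; the live objects are copied onto one physical page and the virtual pages are remapped onto it, so no application pointer changes. A toy model of that predicate (`can_mesh`, `page_a`, etc. are invented names, not Mesh's API):

```python
def can_mesh(bitmap_a, bitmap_b):
    """Two spans mesh iff no slot offset is live in both (empty intersection)."""
    return (bitmap_a & bitmap_b) == 0

def mesh(bitmap_a, bitmap_b):
    """The merged span keeps every live slot from both source spans."""
    assert can_mesh(bitmap_a, bitmap_b)
    return bitmap_a | bitmap_b

# Occupancy bitmaps for three 8-slot spans (bit i set => offset i is live).
page_a = 0b10010001  # live slots at offsets 0, 4, 7
page_b = 0b01001010  # live slots at offsets 1, 3, 6
page_c = 0b10000010  # offset 7 collides with page_a

print(can_mesh(page_a, page_b))   # True: disjoint, one physical page is freed
print(can_mesh(page_a, page_c))   # False: both spans use offset 7
print(bin(mesh(page_a, page_b)))  # combined occupancy of the meshed span
```

The randomization in the paper enters in where objects are placed within a span, which makes disjoint bitmaps likely enough for the probabilistic fragmentation bounds to hold; this sketch shows only the merge condition itself.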