machinelearning

Software commodities are eating interesting data science work | Yanir Seroussi
So where does this leave us? It seems to be a more general phenomenon. Essentially every problem that requires specialised knowledge and is valuable ends up attracting repeatable solutions that obviate the need for deep thinking and manual work. These solutions are software commodities. Deploying them is a matter of writing some glue code and fitting them into the overall system – an engineering problem. Implementing data science components to compete with commodities may be interesting and fun, but it’s usually a waste of time when there’s a generic solution that is good enough.
datascience  machinelearning  professional 
15 hours ago by mike
Software 2.0 - Andrej Karpathy - Medium
Neural networks are not just another classifier; they represent the beginning of a fundamental shift in how we write software. They are Software 2.0.

It turns out that a large portion of real-world problems have the property that it is significantly easier to collect the data (or more generally, identify a desirable behavior) than to explicitly write the program. In these cases, the programmers will split into two teams. The 2.0 programmers manually curate, maintain, massage, clean and label datasets; each labeled example literally programs the final system because the dataset gets compiled into Software 2.0 code via the optimization. Meanwhile, the 1.0 programmers maintain the surrounding tools, analytics, visualizations, labeling interfaces, infrastructure, and the training code.
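The idea that "each labeled example literally programs the final system" can be made concrete with a toy sketch (plain Python, not from Karpathy's post; all names here are illustrative). Instead of writing the rule explicitly, we supply labeled examples and let optimization "compile" them into parameters:

```python
# Software 1.0: the rule, written explicitly by a programmer.
def rule_1_0(x):
    return 2.0 * x + 1.0

# Software 2.0: the curated, labeled dataset *is* the program source.
dataset = [(x, rule_1_0(x)) for x in range(-5, 6)]

# "Compilation": gradient descent on mean squared error recovers the rule.
w, b = 0.0, 0.0
lr = 0.01
for _ in range(2000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in dataset) / len(dataset)
    grad_b = sum(2 * (w * x + b - y) for x, y in dataset) / len(dataset)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # parameters learned from data, not hand-coded
```

Here the 2.0 programmer's job is curating `dataset`; the optimizer turns it into the weights `w, b` that define the final behavior.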
ai  software  programming  machinelearning  thought-piece  article 
21 hours ago by tobym
On-device AI is "new normal" – Arm claims mega-jump in edge AI with new chip combo
Its new Cortex-M55 processor and Ethos-U55 NPU will deliver a 480-fold leap in ML performance to microcontrollers, the company said. The Ethos-U55 is the first micro NPU for the Cortex-M line of silicon blueprints. The pair are geared to small, power-constrained IoT and embedded devices.
machinelearning  arm  chip  hardware  iot  ovum 
yesterday by yorksranter
GitHub - google/trax: Trax — your path to advanced deep learning
Structure

Trax code is structured in a way that allows you to understand deep learning from scratch. We start with basic maths and go through layers, models, supervised and reinforcement learning. We get to advanced deep learning results, including recent papers such as Reformer - The Efficient Transformer, selected for oral presentation at ICLR 2020.

The main steps needed to understand deep learning correspond to sub-directories in Trax code:

math/ — basic math operations and ways to accelerate them on GPUs and TPUs (through JAX and TensorFlow)
layers/ are the basic building blocks of neural networks; here you'll find how they are built, along with all the layers you'll need
models/ contains all basic models (MLP, ResNet, Transformer, ...) and a number of new research models
optimizers/ is a directory with optimizers needed for deep learning
supervised/ contains the utilities needed to run supervised learning and the Trainer class
rl/ contains our work on reinforcement learning
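How those sub-directories fit together can be sketched in plain NumPy (this is an illustrative analogue, not actual Trax code): layers build models, an optimizer updates them, and a supervised training loop ties everything up.

```python
import numpy as np

rng = np.random.default_rng(0)

# layers/ role: a basic building block.
class Dense:
    def __init__(self, n_in, n_out):
        self.w = rng.normal(scale=0.1, size=(n_in, n_out))
        self.b = np.zeros(n_out)
    def __call__(self, x):
        return x @ self.w + self.b

# models/ role: compose layers into a model (here, a one-layer "MLP").
model = Dense(3, 1)

# supervised/ role: a tiny training loop; optimizers/ role: plain SGD updates.
x = rng.normal(size=(64, 3))
true_w = np.array([[1.0], [-2.0], [0.5]])
y = x @ true_w
for _ in range(500):
    err = model(x) - y
    grad_w = x.T @ err / len(x)   # gradient of MSE w.r.t. weights
    grad_b = err.mean(axis=0)
    model.w -= 0.1 * grad_w       # SGD step
    model.b -= 0.1 * grad_b

print(np.round(model.w.ravel(), 2))  # recovers the true weights
```

Trax itself runs the same loop on JAX/TensorFlow backends (the math/ role), which is what makes GPU/TPU acceleration a swap of backend rather than a rewrite.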
ai  google  machinelearning 
yesterday by euler
google/trax: Trax — your path to advanced deep learning
Trax helps you understand and explore advanced deep learning. We focus on making Trax code clear while pushing advanced models like Reformer to their limits. Trax is actively used and maintained in the Google Brain team. Give it a try, talk to us or open an issue if needed.
machinelearning  AI  Google 
yesterday by neuralmarket
google/trax: Trax — your path to advanced deep learning
Trax helps you understand and explore advanced deep learning. We focus on making Trax code clear while pushing advanced models like Reformer to their limits. Trax is actively used and maintained in the Google Brain team. Give it a try, talk to us or open an issue if needed.
google  ml  ai  machinelearning  numpy 
yesterday by wjy
Google Trax
Trax helps you understand and explore advanced deep learning. We focus on making Trax code clear while pushing advanced models like Reformer to their limits. Trax is actively used and maintained in the Google Brain team.
machinelearning  python  google 
yesterday by jazwiecki
GitHub - google/trax: Trax — your path to advanced deep learning
Trax — your path to advanced deep learning. Contribute to google/trax development by creating an account on GitHub.
machinelearning 
yesterday by geetarista
