
imagenet

Twitter
The top three state of the art network architectures for are produced by , now i…
deeplearning  AI  AutoML  ImageNet  from twitter_favs
9 days ago by aratob
Prepare the ImageNet dataset — gluoncv 0.3.0 documentation
Includes instructions for other datasets, too, namely COCO, ADE20K, and PASCAL VOC.
imagenet  advice 
10 weeks ago by wpenman
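For reference, a minimal sketch of what loading the prepared dataset could look like afterwards, assuming the default ~/.mxnet/datasets/imagenet layout from the linked docs and the gluoncv.data.ImageNet class (both assumptions, not taken from this bookmark):

    # Minimal sketch: iterate over the ImageNet validation split once it has
    # been prepared under the default ~/.mxnet/datasets/imagenet directory.
    from mxnet.gluon.data import DataLoader
    from mxnet.gluon.data.vision import transforms
    from gluoncv.data import ImageNet

    # Standard single-crop evaluation preprocessing.
    transform = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])

    val_data = DataLoader(
        ImageNet(train=False).transform_first(transform),
        batch_size=64, shuffle=False, num_workers=4,
    )

    for images, labels in val_data:
        print(images.shape, labels.shape)  # e.g. (64, 3, 224, 224) (64,)
        break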
Exploring the Limits of Weakly Supervised Pretraining – Facebook Research
"State-of-the-art visual perception models for a wide range of tasks rely on supervised pretraining. ImageNet classification is the defacto pretraining task for these models. Yet, ImageNet is now nearly ten years old and is by modern standards small. Even so, relatively little is known about the behavior of pretraining with datasets that are multiple orders of magnitude larger. The reasons are obvious: such datasets are difficult to collect and annotate. In this paper, we present a unique study of transfer learning with large convolutional networks trained to predict hashtags on billions of social media images. Our experiments demonstrate that training for large-scale hashtag prediction leads to excellent results. We show improvements on several image classification and object detection tasks, and report the highest ImageNet-1k single-crop, top-1 accuracy to date: 85.4% (97.6% top-5). We also perform extensive experiments that provide novel empirical data on the relationship between large-scale pretraining and transfer learning performance."
neural-net  pretraining  transfer-learning  imagenet  computer-vision 
august 2018 by arsyed
[D] Have we overfit to ImageNet? : MachineLearning
Just came across yet another architecture search paper: https://arxiv.org/abs/1712.00559 Everyone is treating 0.1% improvements as...
overfitting  imagenet  deep-learning 
august 2018 by pmigdal