Authors: Ludovic Denoyer, Tom Veniat
DOI:
Keywords:
Abstract: We propose to focus on the problem of discovering neural network architectures that are efficient in terms of both prediction quality and cost. For instance, our approach is able to solve the following tasks: learn a neural network able to predict well in less than 100 milliseconds, or learn an efficient model that fits in 50 Mb of memory. Our contribution is a novel family of models called Budgeted Super Networks (BSN). They are learned using gradient descent techniques applied on a budgeted learning objective function which integrates a maximum authorized cost, while making no assumption on the nature of this cost. We present a set of experiments on computer vision problems and analyze the ability of our technique to deal with three different costs: the computation cost, the memory consumption cost, and a distributed computation cost. We particularly show that our model can discover architectures that have a better accuracy than ResNet and Convolutional Neural Fabrics on CIFAR-10 and CIFAR-100, at a lower cost.
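The abstract describes a budgeted learning objective that augments the usual task loss with a maximum authorized cost. As an illustration only, the sketch below shows one common way such an objective can be shaped: a task loss plus a hinge-style penalty on cost above the budget. The function name, the `lam` trade-off parameter, and the hinge penalty are assumptions for illustration, not the paper's exact formulation.

```python
def budgeted_objective(prediction_loss, expected_cost, max_cost, lam=1.0):
    """Toy budgeted objective (illustrative, not the paper's exact loss).

    prediction_loss : task loss of the sampled architecture (e.g. cross-entropy)
    expected_cost   : cost of that architecture (time in ms, memory in Mb, ...)
    max_cost        : the maximum authorized cost (the budget)
    lam             : assumed trade-off weight between accuracy and cost
    """
    # Penalize only the portion of the cost that exceeds the budget,
    # so architectures under the budget are judged on accuracy alone.
    overshoot = max(0.0, expected_cost - max_cost)
    return prediction_loss + lam * overshoot
```

Because the objective makes no assumption on the nature of the cost, the same shape applies whether `expected_cost` measures computation time, memory footprint, or a distributed-computation cost.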