Title: Advanced Topics in Neural Networks With Matlab. Parallel Computing, Optimize And Training
Author: Perez C.
Publisher: CreateSpace Independent Publishing Platform
ISBN: 1974082040
Year: 2017
Pages: 160
Language: English
Format: epub, azw3
Size: 14.0 MB

Neural networks are inherently parallel algorithms. Multicore CPUs, graphical processing units (GPUs), and clusters of computers with multiple CPUs and GPUs can all take advantage of this parallelism. Parallel Computing Toolbox, when used in conjunction with Neural Network Toolbox, enables neural network training and simulation to exploit each mode of parallelism: training and simulation can run across multiple CPU cores on a single PC, or across multiple CPUs on multiple networked computers using MATLAB Distributed Computing Server. Using multiple cores speeds up calculations, and using multiple computers lets you solve problems with data sets too big to fit in the RAM of a single machine; the only limit on problem size is the total RAM available across all computers. Distributed and GPU computing can also be combined to run calculations across multiple CPUs and/or GPUs on a single computer, or on a cluster with MATLAB Distributed Computing Server.

It is desirable to determine the optimal regularization parameters automatically. One approach is the Bayesian framework, in which the weights and biases of the network are assumed to be random variables with specified distributions. The regularization parameters are related to the unknown variances associated with these distributions and can then be estimated using statistical techniques.

It is very difficult to know in advance which training algorithm will be fastest for a given problem. It depends on many factors, including the complexity of the problem, the number of data points in the training set, the number of weights and biases in the network, the error goal, and whether the network is being used for pattern recognition (discriminant analysis) or function approximation (regression). This book compares the various training algorithms.

One of the problems that occurs during neural network training is overfitting: the error on the training set is driven to a very small value, but when new data is presented to the network the error is large. The network has memorized the training examples, but it has not learned to generalize to new situations.
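To give a concrete feel for these options, here is a minimal MATLAB sketch (not taken from the book) that trains a small feedforward network with the parallel/GPU flags of the train function and then with Bayesian regularization (trainbr). It assumes Neural Network Toolbox and Parallel Computing Toolbox are installed and uses the bodyfat_dataset sample data that ships with the toolbox; exact option support varies by MATLAB release.

% Minimal sketch, not from the book: parallel/GPU training plus Bayesian
% regularization with Neural Network Toolbox (assumes Parallel Computing
% Toolbox is installed).
[x, t] = bodyfat_dataset;            % sample regression data shipped with the toolbox
net = feedforwardnet(10);            % one hidden layer with 10 neurons

% Scaled conjugate gradient is the training function recommended for GPU work.
net.trainFcn = 'trainscg';
net = train(net, x, t, ...
    'useParallel', 'yes', ...        % spread training across local CPU cores
    'useGPU', 'yes', ...             % requires a supported CUDA GPU; drop if none
    'showResources', 'yes');         % report which cores/GPUs were actually used

% Alternative: let the Bayesian framework choose the regularization
% parameters automatically, which also helps against overfitting.
net.trainFcn = 'trainbr';
net = train(net, x, t);

y = net(x);                          % simulate the trained network
mse = perform(net, t, y)             % mean squared error on the training data

Dropping the 'useGPU' pair falls back to CPU-only parallel training, and setting 'useParallel' to 'no' runs everything serially, which is the easiest way to compare speed-ups.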

This book covers the following topics:
• “Neural Networks with Parallel and GPU Computing”
• “Deep Learning”
• “Optimize Neural Network Training Speed and Memory”
• “Improve Neural Network Generalization and Avoid Overfitting”
• “Create and Train Custom Neural Network Architectures”
• “Deploy Training of Neural Networks”
• “Perceptron Neural Networks”
• “Linear Neural Networks”
• “Hopfield Neural Network”
• “Neural Network Object Reference”
• “Neural Network Simulink Block Library”
• “Deploy Neural Network Simulink Diagrams”

Download Advanced Topics in Neural Networks With Matlab. Parallel Computing, Optimize And Training
Posted by: Ingvar16, 25-01-2019, 04:59