Author: Kaplunovich, A.; Yesha, Y.
Title: Automatic Hyperparameter Optimization for Arbitrary Neural Networks in Serverless AWS Cloud
ID: 6qqngwi7
Document date: 2021-01-01
Document: Deep Neural Networks are the most efficient method to solve many challenging problems. The importance of the subject can be demonstrated by the fact that the 2019 Turing Award was given to the godfathers of AI (and Neural Networks): Yoshua Bengio, Geoffrey Hinton, and Yann LeCun. In spite of the numerous advancements in the field, most models are still tuned manually. Accurate models became especially important during the novel coronavirus pandemic. Many day-to-day decisions depend on model predictions that affect billions of people. We implemented a flexible, automatic, real-time hyperparameter tuning approach for arbitrary DNN models written in Python and Keras, without manual steps. All of the existing tuning libraries (such as hyperopt, Scikit-Optimize, or SageMaker) require manual steps. We provide an innovative methodology to automate hyperparameter tuning for arbitrary Neural Network model source code, utilizing the Serverless Cloud and implementing microservices, security, interoperability, and orchestration. Our methodology can be used in numerous applications, including Information and Communication Systems. © 2021 IEEE.
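
The abstract does not include code, and the authors' serverless AWS pipeline is not shown here. As a hedged illustration of the kind of manual hyperparameter search the paper proposes to automate, below is a minimal Python/Keras sketch of a random-search tuning loop. The search space, toy dataset, model architecture, and trial budget are all illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' system): a manual random-search loop over
# Keras hyperparameters, i.e., the kind of tuning the paper automates in a
# serverless cloud setting. Data, search space, and budget are placeholders.
import random
import numpy as np
from tensorflow import keras

def build_model(units, learning_rate, dropout):
    """Build a small fully connected classifier for 784-feature, 10-class input."""
    model = keras.Sequential([
        keras.layers.Input(shape=(784,)),
        keras.layers.Dense(units, activation="relu"),
        keras.layers.Dropout(dropout),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# Illustrative search space and synthetic data (assumptions for the sketch).
space = {
    "units": [64, 128, 256],
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "dropout": [0.0, 0.2, 0.5],
}
x_train, y_train = np.random.rand(1000, 784), np.random.randint(0, 10, 1000)
x_val, y_val = np.random.rand(200, 784), np.random.randint(0, 10, 200)

best_acc, best_params = 0.0, None
for _ in range(10):  # fixed trial budget; a serverless system could run trials in parallel
    params = {name: random.choice(values) for name, values in space.items()}
    model = build_model(**params)
    model.fit(x_train, y_train, epochs=3, batch_size=32, verbose=0)
    _, acc = model.evaluate(x_val, y_val, verbose=0)
    if acc > best_acc:
        best_acc, best_params = acc, params

print("best validation accuracy:", best_acc, "with", best_params)
```

In the authors' methodology, steps like defining the search space and dispatching trials are derived automatically from the model source code and executed as serverless cloud functions, rather than being written by hand as above.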