In this talk, we will introduce a novel platform, Resource-Aware AutoML (RA-AutoML), which enables flexible and generalized algorithms for building machine learning models subject to resource and hardware constraints. RA-AutoML intelligently conducts Hyper-Parameter Search (HPS) as well as Neural Architecture Search (NAS) to build models that optimize predefined objectives.
RA-AutoML is a versatile framework that allows a human user to prescribe the many resource/hardware constraints and objectives demanded by the problem or business requirements. At its heart, RA-AutoML relies on our in-house search-engine algorithm, MOBOGA, which combines modified constraint-aware Bayesian optimization with a genetic algorithm to construct Pareto-optimal candidates. Our experiments on the CIFAR-10 dataset show accuracy competitive with state-of-the-art neural network models, even when subject to resource constraints in the form of model size.
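To make the idea of Pareto-optimal candidates under a resource constraint concrete, here is a minimal sketch (not the actual MOBOGA implementation; the candidate values and the `pareto_front` helper are hypothetical) of filtering sampled models by a hard size budget and then keeping only non-dominated (error, size) pairs:

```python
def dominates(a, b):
    """True if candidate a is at least as good as b on every objective
    and strictly better on at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates, size_budget_mb):
    """Drop candidates violating the size constraint, then keep only
    non-dominated (validation_error, model_size_mb) pairs."""
    feasible = [c for c in candidates if c[1] <= size_budget_mb]
    return [c for c in feasible
            if not any(dominates(other, c) for other in feasible if other != c)]

# Hypothetical (validation_error, model_size_mb) pairs for sampled architectures.
samples = [(0.08, 90.0), (0.10, 40.0), (0.12, 15.0), (0.11, 45.0), (0.09, 120.0)]
print(pareto_front(samples, size_budget_mb=100.0))
```

In a full search loop, a Bayesian-optimization surrogate would propose new candidates and a genetic algorithm would recombine members of this front; the sketch above shows only the constraint filtering and dominance check.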
Yao Yang (Accenture Labs)
A research data scientist at Accenture Labs. She has led R&D in various areas of AIOps to make machine learning models more scalable, accessible, transparent, and trustworthy.
An R&D senior analyst at Accenture Labs. He has contributed to work in democratizing and accelerating AI/ML by designing accessible platforms that automate designing, training, deploying, and governing models at scale.
An R&D data scientist at Accenture Labs. His main focus has been on addressing challenges associated with machine learning and AI at scale for enterprise purposes.