Learning Everywhere: Pervasive Machine Learning for Effective High-Performance Computation: Application Background. Fox, G., Glazier, J. A., Kadupitiya, J., Jadhao, V., Kim, M., Qiu, J., Sluka, J. P., Somogyi, E., Marathe, M., Adiga, A., Chen, J., Beckstein, O., & Jha, S. In 2019 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), pages 33–5, 2019. IEEE.
The convergence of HPC and data-intensive methodologies provides a promising approach to major performance improvements. This paper gives a general description of the interaction between traditional HPC and ML approaches and motivates the Learning Everywhere paradigm for HPC. We introduce the concept of effective performance that one can achieve by combining learning methodologies with simulation-based approaches, and distinguish it from traditional performance as measured by benchmark scores. To support the promise of integrating HPC and learning methods, this paper examines specific examples and opportunities across a series of domains. It concludes with a series of open computer science and cyberinfrastructure questions and challenges that the Learning Everywhere paradigm presents.
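A minimal sketch of the effective-performance idea mentioned in the abstract: benchmark-measured performance scaled by the time-to-solution speedup gained when learned surrogates replace part of a simulation. The multiplicative form, the function name, and all numbers below are illustrative assumptions, not the paper's definition.

    # Illustrative sketch (assumptions, not the paper's method): effective performance
    # is taken here as benchmark performance times the surrogate-induced speedup.

    def effective_performance(benchmark_flops: float, surrogate_speedup: float) -> float:
        """Benchmark-measured sustained performance scaled by the ML-surrogate speedup."""
        return benchmark_flops * surrogate_speedup

    if __name__ == "__main__":
        benchmark = 10e12   # assumed 10 TFLOPS sustained on the traditional simulation
        speedup = 1e3       # assumed 1000x time-to-solution gain from an ML surrogate
        eff = effective_performance(benchmark, speedup)
        # Prints 1.0e+16: a simulation-only system would need ~10 PFLOPS
        # to reach the same time-to-solution.
        print(f"Effective performance: {eff:.1e} FLOPS")
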
