Prasanna Balaprakash

Affiliation

Oak Ridge National Laboratory, Oak Ridge, Tennessee, USA

Topic

Neural Network, Deep Neural Network, High-performance Computing, Hyperparameter Search, Hyperparameter Tuning, Aleatoric Uncertainty, Bayesian Optimization, Catastrophic Forgetting, Epistemic Uncertainty, Graph Neural Networks, Machine Learning Models, Parallelization, Search Space, Training Data, Uncertainty Quantification, Alternative Models, Anomaly Detection, Bayesian Neural Network, Convolutional Neural Network, Deep Convolutional Neural Network, Dynamic Programming, Ensemble Model, Forward Pass, Foundation Model, Gradient Ascent, High-performance Computing Systems, Hyperparameter Configuration, Incremental Learning, Language Model, Large Language Models, Machine Learning, Model Parameters, Model Performance, Model Size, NVIDIA GPU, Neuronal Populations, Node Features, Previous Tasks, Scale Efficiency, Short-term Forecasting, Strong Scaling, System Logs, Traffic Forecasting, Transfer Learning, Uncertainty Estimation, 5G Networks, AI Models, Accuracy Of Model, Accuracy Scores, Acquisition Function

Biography

Prasanna Balaprakash received the B.S. degree in computer science engineering from Periyar University, Salem, India, the M.S. degree in computer science from Otto von Guericke University, Magdeburg, Germany, and the Ph.D. degree in engineering sciences from CoDE-IRIDIA (AI Lab), Université libre de Bruxelles, Brussels, Belgium.
He was a Marie Curie Fellow and later an FNRS Aspirant at the AI Lab. He is currently a Computer Scientist with a joint appointment in the Mathematics and Computer Science Division and the Leadership Computing Facility at Argonne National Laboratory. His research interests span artificial intelligence, machine learning, optimization, and high-performance computing. His current research focuses on the automated design and development of scalable algorithms for solving large-scale problems that arise in scientific data analysis and in automating application performance modeling and tuning.