Optimizing Neural Network Training: Smooth Functions, Performance Gains, and Advances in AI Models 

Koushik Biswas, Northwestern University
Seminar

In this talk, I will present novel smooth activation functions that approximate ReLU and its variants while addressing some of their critical limitations. The proposed functions are developed through mathematical approximations and consistently outperform ReLU, its variants, and other state-of-the-art smooth activation functions across a range of deep learning problems, including image classification, object detection, semantic segmentation, and machine translation.
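The specific functions are presented in the talk itself; purely as an illustration of the general idea, the sketch below uses the well-known scaled softplus, log(1 + exp(beta * x)) / beta, which is smooth everywhere and converges pointwise to ReLU, max(0, x), as the sharpness parameter beta grows. This example is an assumption for illustration only and is not the speaker's proposed function.

    # Illustrative sketch only: a classic smooth approximation to ReLU.
    import numpy as np

    def smooth_relu(x, beta=10.0):
        """Smooth, everywhere-differentiable approximation to ReLU.

        Computes log(1 + exp(beta * x)) / beta stably via logaddexp.
        As beta -> infinity this approaches max(0, x).
        """
        return np.logaddexp(0.0, beta * x) / beta

    x = np.linspace(-2.0, 2.0, 5)
    print(smooth_relu(x))        # approx. [0, 0, 0.069, 1.0, 2.0]
    print(np.maximum(x, 0.0))    # exact ReLU for comparison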


I will also provide a brief overview of my recent research, including semi-supervised learning and uncertainty estimation. This work aims to push the boundaries of deep learning by improving model efficiency, robustness, and scalability across multiple domains.

To add to calendar:
Click on: https://wordpress.cels.anl.gov/cels-seminars/
Enter your credentials.
Search for your seminar.
Click “Add to calendar.”