
Academic Exchange

[Academic Report] Notice of the 560th Session of the Graduate "Lingxi Academic Hall" Series: Lecture by Prof. Yangyang Xu

Published: September 18, 2020    Source: Graduate School

To all faculty and students:

The university will hold a graduate "Lingxi Academic Hall" lecture by Prof. Yangyang Xu on September 26, 2020. The relevant arrangements are announced as follows:

1. About the Lecture

Speaker: Prof. Yangyang Xu

Time: 10:00, Saturday, September 26, 2020

Venue: Tencent Meeting (Meeting ID: 688 912 696)

Title: Accelerating stochastic gradient methods

Abstract: Stochastic gradient methods have been extensively used to train machine learning models, in particular for deep learning. Various techniques, such as momentum acceleration and adaptive learning rates, have been applied to accelerate them, either numerically or theoretically. In this talk, I will present two ways to accelerate stochastic gradient methods. The first is to accelerate the popular adaptive (Adam-type) stochastic gradient method by asynchronous (async) parallel computing. Numerically, async-parallel computing can achieve significantly higher parallelization speed-up than its sync-parallel counterpart. Several previous works have studied async-parallel non-adaptive stochastic gradient methods; however, a non-adaptive stochastic gradient method often converges significantly more slowly than an adaptive one. I will show that our async-parallel adaptive stochastic gradient method can achieve near-linear speed-up on top of the fast convergence of an adaptive stochastic gradient method. In the second part, I will present a momentum-accelerated proximal stochastic gradient method, which has provably faster convergence than a standard proximal stochastic gradient method. I will also show experimental results demonstrating its superiority in training a sparse deep learning model.
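
For readers unfamiliar with the building blocks the abstract mentions, the following is a minimal Python sketch of the two standard ingredients: an Adam-type adaptive update, and a momentum-accelerated proximal stochastic gradient step for an l1-regularized (sparse) model. This is a textbook-style illustration under assumed update forms, not the speaker's actual algorithms; all function names and parameter values are illustrative.

```python
import numpy as np

def adam_step(x, m, v, grad, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam-type adaptive update: a momentum (first-moment) estimate plus
    # a coordinate-wise learning rate scaled by the second-moment estimate.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)   # bias correction; t counts iterations from 1
    v_hat = v / (1 - beta2 ** t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding); this is what
    # produces exact zeros when training a sparse model.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_sgd_momentum_step(x, vel, grad, lr=0.01, beta=0.9, lam=1e-3):
    # One illustrative momentum-accelerated proximal step: heavy-ball momentum
    # on the smooth loss, then a proximal step on the l1 term lam * ||x||_1.
    vel = beta * vel + grad
    return soft_threshold(x - lr * vel, lr * lam), vel

# Toy usage: minimize (1/2n) * ||A x - b||^2 + lam * ||x||_1 with one-sample
# stochastic gradients (synthetic data, purely for demonstration).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)
x, vel = np.zeros(50), np.zeros(50)
for _ in range(500):
    i = rng.integers(200)             # sample one data point
    g = (A[i] @ x - b[i]) * A[i]      # stochastic gradient of the smooth part
    x, vel = prox_sgd_momentum_step(x, vel, g)
```

The async-parallel scheme discussed in the first part of the talk would run updates of this kind from multiple workers without locking, on possibly stale iterates; the speed-up comes from removing the synchronization barrier between workers.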

2. Faculty and students from all schools are welcome to attend. During the lecture, please turn off your mobile phone or set it to silent mode.

Student Affairs Department of the Party Committee, Northwestern Polytechnical University

School of Mathematics and Statistics

Key Laboratory of Dynamics and Control of Complex Systems, Ministry of Industry and Information Technology

September 18, 2020

About the Speaker

Dr. Yangyang Xu (徐揚揚) is a tenure-track assistant professor in the Department of Mathematical Sciences at Rensselaer Polytechnic Institute. He received his B.S. in Computational Mathematics from Nanjing University in 2007, his M.S. in Operations Research from the Chinese Academy of Sciences in 2010, and his Ph.D. from the Department of Computational and Applied Mathematics at Rice University in 2014. His research interests lie in optimization theory and methods and their applications in machine learning, statistics, and signal processing. He has developed optimization algorithms for compressed sensing, matrix completion, and tensor factorization and learning. Recently, his research has focused on first-order methods, operator splitting, stochastic optimization methods, and high-performance parallel computing. He has published over 30 papers in prestigious journals and conference proceedings, and he was awarded a gold medal at the 2017 International Consortium of Chinese Mathematicians.
