Colloquia and Conferences
Three Experts from Sun Yat-sen University Reported Their Work at the School of Mathematical Sciences
Time: 2018-10-12 19:01:00
Multi-task Learning in Vector-valued Reproducing Kernel Banach Spaces with the L1 Norm
Speaker: Researcher Rongrong Lin, Sun Yat-sen University (SYSU)
Time: 2018-10-12 09:00–10:00
Location: Room 401, School of Mathematical Sciences
Introduction:
A Higher-Order Polynomial Method for SPECT Reconstruction
Speaker: Professor Si Li, Sun Yat-sen University (SYSU)
Time: 2018-10-12 10:00–11:00
Location: Room 401, School of Mathematical Sciences
Introduction:
Existing single-photon emission computed tomography (SPECT) reconstruction methods are mostly based on discrete models that may be viewed as piecewise constant approximations of an underlying continuous data acquisition process. Because piecewise constant approximations have a low accuracy order, traditional discrete models introduce irreducible model errors, which are a bottleneck for improving the quality of reconstructed images in clinical applications. To overcome this drawback, we develop a higher-order polynomial method for SPECT reconstruction. Specifically, we represent the data acquisition of SPECT imaging by an integral equation model; we approximate the solution of the underlying integral equation by higher-order piecewise polynomials, leading to a new discrete system; and, by exploiting a priori knowledge of the radiotracer distribution, we introduce two novel regularizers for the system that are suited to the approximation.
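The following minimal Python sketch is illustrative only, not the speakers' code: it contrasts a piecewise-constant with a piecewise-quadratic discretization of a one-dimensional integral operator, mimicking the accuracy gap the abstract describes. The kernel k, the test function f, and the grid sizes are all assumptions chosen for illustration.

```python
# Illustrative sketch only (not the speakers' code): it contrasts a
# piecewise-constant with a piecewise-quadratic discretization of a 1D
# integral operator (Kf)(x) = \int_0^1 k(x, y) f(y) dy.  The kernel k,
# the function f, and the grid sizes are arbitrary illustrative choices.
import numpy as np

def apply_operator_reference(k, f, x, n_quad=4000):
    # Fine trapezoidal quadrature, used as a stand-in for the continuous model.
    y = np.linspace(0.0, 1.0, n_quad)
    return np.trapz(k(x[:, None], y[None, :]) * f(y)[None, :], y, axis=1)

def apply_operator_piecewise(k, f, x, n_cells, degree):
    # Replace f by its piecewise polynomial interpolant of the given degree on
    # n_cells uniform cells, then integrate kernel-times-interpolant cell by
    # cell with a (degree+1)-point Gauss-Legendre rule.
    edges = np.linspace(0.0, 1.0, n_cells + 1)
    gauss_x, gauss_w = np.polynomial.legendre.leggauss(degree + 1)
    out = np.zeros_like(x)
    for a, b in zip(edges[:-1], edges[1:]):
        nodes = a + (b - a) * (np.arange(degree + 1) + 0.5) / (degree + 1)
        coeffs = np.polyfit(nodes, f(nodes), degree)   # local interpolant of f
        y = 0.5 * (b - a) * gauss_x + 0.5 * (a + b)    # mapped Gauss nodes
        w = 0.5 * (b - a) * gauss_w
        out += (k(x[:, None], y[None, :]) * np.polyval(coeffs, y)[None, :]) @ w
    return out

k = lambda x, y: np.exp(-(x - y) ** 2)   # smooth stand-in for the system kernel
f = lambda y: np.sin(2.0 * np.pi * y)    # smooth stand-in for the activity
x = np.linspace(0.0, 1.0, 50)
reference = apply_operator_reference(k, f, x)
for n in (8, 16, 32):
    err0 = np.max(np.abs(apply_operator_piecewise(k, f, x, n, 0) - reference))
    err2 = np.max(np.abs(apply_operator_piecewise(k, f, x, n, 2) - reference))
    print(f"cells={n:3d}  piecewise-constant={err0:.2e}  piecewise-quadratic={err2:.2e}")
```

On a smooth test problem like this one, the errors of the piecewise-quadratic discretization decay faster under mesh refinement than those of the piecewise-constant one, which is the model-error reduction the higher-order method aims at.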
High-Dimensional Sparse Grid Approximation Techniques and Their Applications to Integral Equations and Random Differential Equations
Speaker: Professor Ying Jiang, Sun Yat-sen University (SYSU)
Time: 2018-10-12 11:00–12:00
Location: Room 401, School of Mathematical Sciences
Introduction:
This talk is about a class of high-dimensional approximation techniques, called sparse grids, which are widely used in solving partial differential equations and integral equations, in designing high-dimensional quadrature formulas, in data mining, and elsewhere. Approximation schemes on sparse grids achieve quasi-linear computational cost, whereas schemes on full grids suffer from the "curse of dimensionality": their computational complexity grows exponentially with the dimension. At the same time, approximation schemes on sparse grids enjoy the same optimal approximation order as schemes on full grids do.
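As a concrete, hedged illustration of the sparse-grid idea, the Python sketch below implements a standard Smolyak sparse-grid quadrature on the unit cube; it is one classical instance of the technique, not necessarily the speaker's scheme. The one-dimensional rule (Gauss-Legendre with i points at level i) and the test integrand are assumptions chosen for illustration.

```python
# Hedged sketch of a standard Smolyak sparse-grid quadrature on [0,1]^dim,
# not necessarily the speaker's construction.  The 1D rule (Gauss-Legendre
# with i points at level i) and the test integrand are illustrative choices.
import itertools
from math import comb, e
import numpy as np

def gauss_1d(level):
    # Gauss-Legendre rule with `level` points, mapped from [-1, 1] to [0, 1].
    x, w = np.polynomial.legendre.leggauss(level)
    return 0.5 * (x + 1.0), 0.5 * w

def smolyak_quadrature(f, dim, q):
    # Smolyak combination formula: sum over multi-indices i (each i_j >= 1)
    # with q-dim+1 <= |i| <= q of  (-1)^(q-|i|) * C(dim-1, q-|i|)  times the
    # tensor product of the 1D rules at levels i_1, ..., i_dim.
    total = 0.0
    for i in itertools.product(range(1, q + 1), repeat=dim):
        s = sum(i)
        if not (q - dim + 1 <= s <= q):
            continue
        coeff = (-1) ** (q - s) * comb(dim - 1, q - s)
        pts, wts = zip(*(gauss_1d(level) for level in i))
        for idx in itertools.product(*(range(len(p)) for p in pts)):
            node = np.array([pts[j][idx[j]] for j in range(dim)])
            weight = np.prod([wts[j][idx[j]] for j in range(dim)])
            total += coeff * weight * f(node)
    return total

# Test: integrate exp(x_1 + ... + x_dim) over [0,1]^dim; exact value (e-1)^dim.
dim = 4
f = lambda x: np.exp(x.sum())
exact = (e - 1.0) ** dim
for q in range(dim, dim + 4):
    approx = smolyak_quadrature(f, dim, q)
    print(f"q={q}: approx={approx:.8f}  rel.err={abs(approx - exact) / exact:.2e}")
```

Only multi-indices with |i| ≤ q contribute to the combination formula, so the number of quadrature nodes stays far below that of the corresponding full tensor grid; this is the source of the quasi-linear cost mentioned above.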