My name is Yasunori Fujikoshi. I hold a Doctor of Science degree in statistics, specializing in multivariate statistical analysis.
The following two books are closely related to my research. One is "Multivariate Statistics: High-Dimensional and Large-Sample Approximations" (Wiley, Hoboken, 2010) by Y. Fujikoshi, V. V. Ulyanov, and R. Shimizu. The other is "Non-Asymptotic Analysis of Approximations for Multivariate Statistics" (Springer, 2020) by Y. Fujikoshi and V. V. Ulyanov.
In recent years, analyzing high-dimensional data, in addition to the usual multivariate data, has become important in multivariate analysis. The first book presents various high-dimensional methods alongside the traditional multivariate methods. In particular, in addition to classical large-sample approximations, various high-dimensional approximations are introduced.
The second book deals with multivariate approximations. An important problem related to such approximations concerns their errors. Most results give only so-called order estimates, but these do not provide information on the actual errors. Ideally, we need non-asymptotic, or computable, error bounds that relate to the actual errors, in addition to order estimates. This book gives various non-asymptotic bounds for high-dimensional and large-sample approximations.
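The distinction between an order estimate and an actual error can be made concrete with a toy example (not taken from the book). A classical large-sample fact is that the Student-t distribution with n degrees of freedom approaches the standard normal, with an error of order O(1/n); the sketch below, assuming SciPy is available, computes the actual maximum CDF error for a few values of n, which an order statement alone cannot give:

```python
import numpy as np
from scipy import stats

def max_cdf_error(df, grid=np.linspace(-5, 5, 2001)):
    """Largest absolute gap between the Student-t CDF (df degrees of
    freedom) and its standard normal approximation over a fine grid."""
    return float(np.max(np.abs(stats.t.cdf(grid, df) - stats.norm.cdf(grid))))

# Actual (computable) errors for increasing sample sizes:
errors = {n: max_cdf_error(n) for n in (5, 20, 80)}
```

The errors shrink roughly like 1/n, consistent with the order estimate, but only the computed values tell us how large the error is for a given finite n, which is the kind of information a non-asymptotic bound is designed to provide.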
One of my research interests concerns the selection of variables and models in multivariate analysis. One approach is to use model selection criteria such as AIC and BIC. However, it has been pointed out that this approach runs into computational problems, especially with high-dimensional multivariate data. To avoid this, there is a method due to Nishii et al. (1988), which decides "selection" or "no selection" for each variable by comparing the model with that variable removed against the full model. This method was recently named the KOO (knock-one-out) method. In Fujikoshi (2022, JMA) it was shown that such approaches are consistent in high-dimensional settings for the multivariate regression model and discriminant analysis.
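The appeal of the KOO idea is that it needs only p + 1 model fits (full model plus one per removed variable) rather than a search over 2^p subsets. The sketch below, a simplified illustration in a multivariate regression setting, compares the log-determinant of the residual covariance with and without each variable; the specific high-dimensional criteria and thresholds studied in the papers above are not reproduced here:

```python
import numpy as np

def koo_select(X, Y, threshold):
    """KOO-style screening: for each column j of X, drop x_j, refit,
    and measure how much log|residual covariance| increases relative
    to the full model.  Variables whose removal degrades the fit by
    more than `threshold` are retained.  (Illustrative sketch only.)"""
    n, p = X.shape

    def logdet_resid_cov(Xs):
        # Residual covariance of Y after least-squares regression on Xs.
        B, *_ = np.linalg.lstsq(Xs, Y, rcond=None)
        E = Y - Xs @ B
        return np.linalg.slogdet(E.T @ E / n)[1]

    full = logdet_resid_cov(X)
    koo_stats = np.array([logdet_resid_cov(np.delete(X, j, axis=1)) - full
                          for j in range(p)])
    return np.nonzero(koo_stats > threshold)[0], koo_stats

# Toy data: only the first two of six explanatory variables matter.
rng = np.random.default_rng(0)
n, p, q = 200, 6, 3
X = rng.standard_normal((n, p))
B_true = np.zeros((p, q))
B_true[0] = 2.0
B_true[1] = -1.5
Y = X @ B_true + 0.5 * rng.standard_normal((n, q))
selected, _ = koo_select(X, Y, threshold=0.1)
```

On this toy data the two truly active variables produce a large jump in the residual log-determinant when removed, while the irrelevant ones change it only negligibly, so a simple threshold separates them; the consistency results cited above concern how such thresholds behave when the dimension grows with the sample size.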