{"success":1,"msg":"","color":"rgb(28, 35, 49)","title":"Naive Penalized Spline Estimators of Derivatives Achieve Optimal Rates of Convergence","description":"webinar","title2":"","start":"2022-03-25 14:00","end":"2022-03-25 15:00","responsable":"Isabelle Beaudry","speaker":"John Staudenmayer (University of Massachusetts, Amherst)","id":"57","type":"webinar","timezone":"America\/Santiago","activity":"https:\/\/zoom.us\/j\/93416271268?pwd=Uk53d1lHTEJNa2xUMTRwNDBSMXU3Zz09\r\nPasscode: 258393","abstract":"Given data \\{x_i, y_i\\}_{i=1}^n sampled from y_i = f(x_i) + e_i, where f is an unknown function with p continuous derivatives and the e_i are iid with mean zero and constant variance, we are interested in estimating a derivative of f. While it is straightforward to compute nonparametric estimates of derivative functions, the challenge is that those estimates also require some sort of regularization to balance estimation bias and overfitting, and methods to choose that regularization are usually designed for estimating the function itself, not its derivatives. In this talk we review a few methods that address this problem and what is known about their asymptotic properties. We also present a new asymptotic result about penalized splines, showing that choosing a smoothing parameter to estimate the function itself and simply differentiating the estimate achieves the optimal L_2 rate of convergence. This is joint work with Bright Antwi Boasiako, a graduate student at the University of Massachusetts, Amherst."}