
This question came to me from several previous questions; sorry that all the previous links are necessary reading to understand it. The latest question is at the link:

Convergence on Norm vector space.

My understanding is that the problem of convergence with limited data may be tractable in a function space that allows countably many discontinuities. My question is: is there a function space already available for this purpose? It is understood that the required space is problem dependent.

The common class of problems I have in mind is parameter estimation. The issue is that the maximum likelihood method has a hidden assumption that the parameters of the model are independent. This may be true when we have many data points (equivalent to the expected value) but not with limited data. The simplest class of problem is estimation of a single parameter with noise. For single-parameter estimation we may not have a solution, as the structure may be linear, but for multiple parameters there may be structure, depending on the problem. Another useful application: suppose you want to measure the frequencies of a signal. One can perform a Fourier transform and find the frequencies, but with limited data we do not get good estimates.
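The frequency-estimation point above can be illustrated with a minimal sketch (my own example, not from the question; the sampling rate and true frequency are assumed values). Taking the peak bin of a discrete Fourier transform quantizes the estimate to multiples of fs/N, so with few samples the bin spacing is coarse and the estimate is poor, while a longer record pins the frequency down much more accurately.

```python
import cmath
import math

def dft_peak_frequency(signal, fs):
    """Return the frequency of the largest-magnitude DFT bin (naive O(N^2) DFT)."""
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):  # skip the DC bin and negative frequencies
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * fs / n  # bin index -> frequency; resolution is fs/n

fs = 100.0      # sampling rate in Hz (assumed for illustration)
f_true = 12.3   # true frequency in Hz (assumed)

for n in (32, 1024):  # limited data vs. plenty of data
    x = [math.sin(2 * math.pi * f_true * t / fs) for t in range(n)]
    print(n, dft_peak_frequency(x, fs))
```

With 32 samples the bin spacing is 100/32 ≈ 3.1 Hz, so the peak can only land on a coarse grid; with 1024 samples the spacing shrinks to about 0.1 Hz and the estimate improves accordingly.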

Creator
  • Then please write down the problem with limited data you have in mind. A possible solution has to rely on the structure of your problem. – daw May 12 '15 at 13:36
  • There is not enough information to understand what it is that you want. – copper.hat May 12 '15 at 14:38
  • @copper.hat As per your request I have added information. – Creator May 12 '15 at 19:42
  • @daw as per your request I have added information. – Creator May 12 '15 at 20:11
  • What do you mean by "hidden assumption that the parameters of the models are independent"---independent, in which sense? – kjetil b halvorsen May 13 '15 at 10:43
  • @kjetilbhalvorsen Imagine we have a likelihood function with two parameters. To find the maximum likelihood we take partial derivatives (PD), do we not? Performing PD means independence. – Creator May 13 '15 at 17:52
  • OK, so you mean functional independence, as I had guessed (privately). No method can get by without that assumption; in statistics it is called identifiability. With non-identifiable models there are no unique maximum likelihood estimates, but no unique estimates by any other method either. – kjetil b halvorsen May 13 '15 at 19:51
  • @kjetilbhalvorsen Thank you for your comment. I was under the impression that functional independence is inherently based on a linear space. Anyway, thank you for your interest. – Creator May 13 '15 at 20:06
  • @kjetilbhalvorsen Sorry to bother you again, but now it seems the problem does not defy functional independence, since the identifiability issue is described only for infinite data; see the wikipedia link http://en.wikipedia.org/wiki/Identifiability Am I missing something? Would you please comment? In other words, the model is identifiable, and the problem exists with limited data. – Creator May 13 '15 at 22:42
  • Identifiability means that two different parameter values give rise to different distributions for the data. But a model (with certain data) could be "weakly identifiable" in the sense that some different parameter values give rise to almost the same distribution. If such is the case, all methods will have problems. It is a problem with a certain model/data combination, and when it arises, all methods will have problems. Nothing specific to maximum likelihood. – kjetil b halvorsen May 13 '15 at 23:28
  • @kjetilbhalvorsen Exactly. The problem exists even with identifiable parameters. And the question I am asking here is: can't we assume a structure between them? This may be equivalent to treating the parameters as if they have a conditional distribution until we get infinite data. – Creator May 14 '15 at 00:27
  • The same distribution leads to unidentifiability only if one uses the distribution as input for estimation. – Creator May 14 '15 at 19:27
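The comments about partial derivatives and limited data can be made concrete with a minimal sketch (my own illustration, not the poster's model). For a Gaussian with unknown mean and variance, setting the partial derivatives of the log-likelihood to zero gives the closed-form MLEs: the sample mean, and the sum of squared deviations divided by n (not n-1). With limited data the variance estimate is therefore biased low by a factor of (n-1)/n, which vanishes as the sample grows.

```python
import math
import random

def gaussian_mle(xs):
    """Joint MLE for a Gaussian: solve dL/dmu = 0 and dL/dv = 0."""
    n = len(xs)
    mu = sum(xs) / n                          # dL/dmu = 0 -> sample mean
    v = sum((x - mu) ** 2 for x in xs) / n    # dL/dv = 0 -> divides by n, not n-1
    return mu, v

random.seed(0)
true_mu, true_v = 0.0, 1.0
n = 5            # limited data
trials = 20000   # average the variance MLE over many small samples

avg_v = 0.0
for _ in range(trials):
    xs = [random.gauss(true_mu, math.sqrt(true_v)) for _ in range(n)]
    avg_v += gaussian_mle(xs)[1]
avg_v /= trials

# The average settles near (n-1)/n * true_v = 0.8 rather than 1.0,
# showing how the MLE behaves well asymptotically but not for small n.
print(avg_v)
```

This does not contradict identifiability (the model here is identifiable); it just shows, in the simplest setting, how the maximum likelihood machinery that is justified in the large-data limit can misbehave with limited data.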

0 Answers