
Regularization of Positive Signal Nonparametric Filtering in Multiplicative Observation Model

  • Conference paper
Nonparametric Statistics

Part of the book series: Springer Proceedings in Mathematics & Statistics (PROMS, volume 175)


Abstract

A solution to the problem of extracting a useful random signal from a mixture with noise in a multiplicative observation model is proposed. Unlike conventional filtering tasks, in the problem under consideration the distribution (and the model) of the useful signal is assumed unknown. Therefore, one cannot apply well-known techniques such as the Kalman filter or the posterior Stratonovich–Kushner evolution equation. This paper continues and develops the author’s article, reported at the First ISNPS Conference (Halkidiki, 2012), where the filtering problem for a positive signal with unknown distribution was solved using the generalized filtering equation and nonparametric kernel techniques. The present study adds new findings concerning the construction of stable filtering procedures, the search for the optimal smoothing parameter in the multidimensional case, and convergence results for the proposed techniques. The main feature of the problem is the positive support of the signal distribution. In this case, the classical methods of nonparametric estimation with symmetric kernels are not applicable because of the large estimator bias at the support boundary. To overcome this drawback, we use asymmetric gamma kernel functions. To obtain stable estimators, we propose a regularization procedure with a data-driven optimal regularization parameter. Similar filtering algorithms can be used, for instance, for volatility estimation in statistical models of financial and actuarial mathematics.
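The asymmetric gamma kernels mentioned in the abstract can be sketched as follows. This is a minimal illustration of Chen-type gamma kernel density estimation on the positive half-line with shape \(\rho(x)=x/b+1\) (cf. [9]); the function names, the test density, and the bandwidth below are illustrative assumptions, not the paper’s implementation:

```python
import numpy as np
from math import lgamma

def gamma_kernel_pdf(t, rho, b):
    # Gamma(shape=rho, scale=b) density at points t > 0, computed on the log scale
    t = np.asarray(t, dtype=float)
    return np.exp((rho - 1.0) * np.log(t) - t / b - rho * np.log(b) - lgamma(rho))

def gamma_kernel_density(x, data, b):
    # Gamma kernel estimator of f(x) on [0, inf) with shape rho(x) = x/b + 1
    # (Chen-type kernel, cf. [9]); the kernel's support is [0, inf) by construction
    rho = x / b + 1.0
    return gamma_kernel_pdf(data, rho, b).mean()

# Illustrative check against a known positive-support density, Exp(1)
rng = np.random.default_rng(0)
data = rng.exponential(scale=1.0, size=20_000)
est = gamma_kernel_density(1.0, data, b=0.05)  # true value is f(1) = e^{-1}
```

Unlike a symmetric kernel, whose mass spills over the boundary and biases the estimate near zero, the gamma kernel assigns no mass to the negative half-line, so the estimate remains consistent at the support boundary.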


Notes

  1. For brevity, the arguments of some functions are omitted in what follows.

References

  1. Dobrovidov, A.V., Koshkin, G.M., Vasiliev, V.A.: Non-Parametric State Space Models. Kendrick Press, USA (2012)


  2. Lehmann, E.L.: Testing Statistical Hypotheses. Wiley, N.Y. (1959)


  3. Dobrovidov, A.V.: Nonparametric methods of nonlinear filtering of stationary random sequences. Automat. Remote Control 44(6), 757–768 (1983)


  4. Pensky, M.: A general approach to nonparametric empirical Bayes estimation. Statistics 29, 61–80 (1997)


  5. Pensky, M., Singh, R.S.: Empirical Bayes estimation of reliability characteristics for an exponential family. Can. J. Stat. 27, 127–136 (1999)


  6. Markovich, L.A.: The equation of optimal filtering, Kalman’s filter and theorem on normal correlation. In: Proceedings of the 11th International Vilnius Conference on Probability and Mathematical Statistics. Vilnius, Lithuania, 182, 30 June–4 July 2014. ISBN: 978-609-433-220-3


  7. Dobrovidov, A.V.: Stable nonparametric signal filtration in nonlinear models. In: Topics in Nonparametric Statistics: Proceedings of the First Conference of the International Society for Nonparametric Statistics, vol. XVI, pp. 61–74. Springer, New York (2014)


  8. Bouezmarni, T., Rombouts, J.V.K.: Nonparametric density estimation for multivariate bounded data. J. Stat. Plann. Infer. 140(1), 139–152 (2010)


  9. Chen, S.X.: Probability density function estimation using gamma kernels. Ann. Inst. Statist. Math. 52(3), 471–480 (2000)


  10. Markovich, L.A.: Gamma kernel estimation of multivariate density and its derivative on the nonnegative semi-axis by dependent data (2015). arXiv:1410.2507v2

  11. Dobrovidov, A.V., Markovich, L.A.: Nonparametric gamma kernel estimators of density derivatives on positive semi-axis. In: Proceedings of IFAC MIM 2013, pp. 1–6. Petersburg, Russia, 19–21 June 2013


  12. Hall, P., Marron, J.S., Park, B.U.: Smoothed cross-validation. Probab. Theory Relat. Fields 92, 1–20 (1992)


  13. Devroye, L., Gyorfi, L.: Nonparametric Density Estimation. The L1 View. John Wiley, New York, N.Y. (1985)


  14. Davydov, Y.A.: On convergence of distributions induced by stationary random processes. Theory Probab. Appl. 13(4), 730–737 (1968)


  15. Masry, E.: Probability density estimation from sampled data. IEEE Trans. Inf. Theory IT-29(5), 696–709 (1983)



Author information

Correspondence to Alexander V. Dobrovidov.


Appendix

Proof of Lemma 2.1.3. We give here only an outline of the proof. Under condition (1), it follows from formula (13) that for a sequence of statistically dependent random variables the variance of their sum is expressed through the covariance

$$\begin{aligned} C= & {} 2N^{-1} \sum \limits _{i=1}^{N-1} \left( 1-\frac{1}{N} \right) Cov\left( \tilde{K}(\mathbf x,\mathbf X_1), \tilde{K}(\mathbf x,\mathbf X_{1+i}) \right) . \end{aligned}$$
(41)

This covariance is bounded from above using Davydov’s inequality [14]:

$$\begin{aligned} \quad \quad |Cov\left( \tilde{K}(\mathbf x,\mathbf X_1),\tilde{K}(\mathbf x,\mathbf X_{1+i})\right) |\le 2\pi \alpha (i)^{1/r}\parallel \tilde{K}(\mathbf x,\mathbf X_1) \parallel _q \parallel \tilde{K}(\mathbf x,\mathbf X_{1+i}) \parallel _p, \end{aligned}$$

where \(p^{-1}+q^{-1}+r^{-1}=1\), \(\alpha (i)\) is the strong mixing coefficient of the process \((X_n)\), and \(\parallel \cdot \parallel _q\) is the norm in the space \(L_q\). This norm can be bounded by the following expression:

$$\begin{aligned} \parallel \tilde{K}(\mathbf x,\mathbf X_1) \parallel _q= & {} \left( \int \left( \prod \limits _{j=1}^dK_{{\rho _1(x_j),b_j}}\left( t_{j}\right) \right) ^q f(t_1^d)dt_1\ldots d t_{d}\right) ^{1/q} \nonumber \\= & {} \left( \mathsf E\left( \prod \limits _{j=1}^dK^{q-1}_{{\rho _1(x_j),b_j}}\left( \xi _{j}\right) f(\xi _1^d)\right) \right) ^{1/q}, \end{aligned}$$
(42)

where the kernel \(\prod _{j=1}^dK_{{\rho _1(x_j),b_j}}\left( \xi _{j}\right) \) is used as a density function and the random variables \(\xi _{j}\) follow the \(Gamma(\rho _1(x_j),b_j)\) distribution with expectation \(\mu _j=x_j\) and variance \(\sigma _{\xi }^2 = x_jb_j\). The expectation in parentheses in (42) is calculated by expanding the function \(f(\xi _1^d)\) in a Taylor series at the point \( \mathbf {\mu } = \mu _1^d \). After some algebra we get

$$\begin{aligned} |C|=|C(\hat{f}(\mathbf {x})|\leqslant \frac{D(\upsilon ,\mathbf x)}{N}b^{-d\frac{1+\upsilon }{2}} \int _{1}^{\infty }\alpha (\tau )^{\upsilon } d\tau ,\quad 0<\upsilon<1, \quad D(\upsilon ,\mathbf x)<\infty . \end{aligned}$$

Under condition (2), we use the proof technique from [15]. To do this, we split (41) into two terms \(|C|=(2/N)\sum _{i=1}^{N-1}(\cdot )=(2/N)\big (\sum _{i=1}^{c(N)}(\cdot )+\sum _{i=c(N)+1}^{N-1}(\cdot )\big )=I_1+I_2\) and estimate each as \(N\rightarrow \infty .\) As a result we get

$$\begin{aligned} I_1= O\left( \frac{c(N)b^{d/2}}{Nb^{d/2}}\right) ,\quad I_2= O\left( \frac{1}{Nb^{d/2}c(N)b^{d\upsilon /2}}\right) . \end{aligned}$$

It remains to find c(N) satisfying the conditions \(c(N)b^{d/2}\rightarrow 0\) and \(c(N)b^{d\upsilon /2}\rightarrow \infty .\) Let \(c(N)=b^{-\varepsilon }\). Then these conditions are met simultaneously if \( \frac{d}{2}\!>\!\varepsilon \!> \!\frac{d}{2}\upsilon >0, \quad 0< \upsilon < 1,\) and \(|C|=o\left( \left( Nb^{d/2}\right) ^{-1}\right) \). \(\square \)
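The admissible window for \(\varepsilon \) in \(c(N)=b^{-\varepsilon }\) can be checked numerically: for \(\frac{d}{2}>\varepsilon >\frac{d}{2}\upsilon \), the first term vanishes and the second diverges as \(b\rightarrow 0\). A small sketch (the values of d, υ, ε below are illustrative assumptions):

```python
# Illustrative check of the bandwidth-splitting condition c(N) = b^(-eps):
# we need d/2 > eps > d*upsilon/2; here 1 > 0.6 > 0.5 holds.
d, upsilon, eps = 2, 0.5, 0.6

def split_terms(b):
    # Return (c(N) * b^(d/2), c(N) * b^(d*upsilon/2)) for c(N) = b^(-eps)
    c = b ** (-eps)
    return c * b ** (d / 2), c * b ** (d * upsilon / 2)

# As b -> 0: the first component -> 0 (exponent d/2 - eps = 0.4 > 0),
# the second -> infinity (exponent d*upsilon/2 - eps = -0.1 < 0)
vals = [split_terms(b) for b in (1e-1, 1e-3, 1e-5)]
```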

Proof of Lemma 2.2.3. The proof of this lemma proceeds, in principle, in the same way as that of Lemma 2.1.3, but it occupies much more space because of the complex estimator expression containing special functions (see [10]). \(\square \)


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Dobrovidov, A.V. (2016). Regularization of Positive Signal Nonparametric Filtering in Multiplicative Observation Model. In: Cao, R., González Manteiga, W., Romo, J. (eds) Nonparametric Statistics. Springer Proceedings in Mathematics & Statistics, vol 175. Springer, Cham. https://doi.org/10.1007/978-3-319-41582-6_7
