Versatile Imaging System


Frank Edughom Ekpar

Abstract

This paper introduces a versatile imaging system that encompasses a wide variety of methods, devices and systems. The system is designed to facilitate the synthesis of representations of stimuli covering substantially all directions, or only a subset of directions, around a given reference or viewpoint. It comprises at least one grid of one or more focusing elements disposed on an N-dimensional, arbitrarily shaped surface; at least one grid of one or more sensor elements disposed on an N-dimensional, arbitrarily shaped surface; and, optionally, at least one grid of one or more stimulus guide elements disposed on an N-dimensional, arbitrarily shaped surface, where N can be chosen to be 1, 2, 3, or any other suitable quantity. A sampling of contemporary imaging systems that fall within the scope of the system is described, and pointers to architectures that take greater advantage of the features of the system are presented.
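The composition described in the abstract (grids of focusing, sensor and optional stimulus guide elements, each disposed on an N-dimensional, arbitrarily shaped surface) can be summarised as a simple data model. The Python sketch below is a minimal illustration under that reading of the abstract; the class names, fields and parametric-surface representation are assumptions introduced here for clarity, not constructs taken from the paper.

# Hypothetical sketch of the versatile imaging system's structure: grids of
# focusing, sensor and (optional) stimulus guide elements, each placed on an
# arbitrarily shaped N-dimensional surface. All names are illustrative only.
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

Point = Tuple[float, ...]  # an N-dimensional coordinate


@dataclass
class Element:
    position: Point   # location of the element on the surface
    kind: str         # "focusing", "sensor" or "stimulus_guide"


@dataclass
class Grid:
    # The surface is described parametrically: surface(u) maps an
    # N-dimensional parameter tuple to a point on the (possibly curved)
    # surface, so the surface shape remains arbitrary.
    surface: Callable[[Point], Point]
    elements: List[Element] = field(default_factory=list)

    def place(self, u: Point, kind: str) -> Element:
        # Place one element at the surface point corresponding to parameter u.
        elem = Element(position=self.surface(u), kind=kind)
        self.elements.append(elem)
        return elem


@dataclass
class VersatileImagingSystem:
    focusing_grid: Grid
    sensor_grid: Grid
    stimulus_guide_grid: Optional[Grid] = None  # the guide grid is optional


if __name__ == "__main__":
    # Example: elements distributed on a unit circle (a 1-D surface in 2-D).
    import math

    circle = lambda u: (math.cos(u[0]), math.sin(u[0]))
    focusing = Grid(surface=circle)
    sensors = Grid(surface=circle)
    for i in range(8):
        angle = 2.0 * math.pi * i / 8
        focusing.place((angle,), "focusing")
        sensors.place((angle,), "sensor")

    system = VersatileImagingSystem(focusing_grid=focusing, sensor_grid=sensors)
    print(len(system.focusing_grid.elements), "focusing elements placed")

Representing each surface as a parametric mapping keeps the surface shape arbitrary, and the dimensionality of the parameter tuple corresponds to the abstract's choice of N as 1, 2, 3, or any other suitable quantity.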


Keywords: Versatile Imaging System, Representations of Stimuli, Sensor Grid, Omnidirectional Imaging System, Lens-less Imaging




How to Cite
Ekpar, F. 2019. Versatile Imaging System. European Journal of Engineering and Technology Research. 4, 12 (Dec. 2019), 102-107. DOI: https://doi.org/10.24018/ejers.2019.4.12.1668.