Submitted: 09 August 2023
Posted: 11 August 2023
Abstract

Keywords:
1. Introduction
2. System Configuration
2.1. Hardware Configuration
2.2. Software Configuration
3. Camera Control System
3.1. Marker Recognition Process
- (1a) The camera parameters are set to the initial values $W = 500$ and $F = 150$, i.e., the zoom at its maximum and the focus at its nearest point. These settings are chosen because the maximum zoom extends the recognition range to more distant markers, and shifting the focus from near to far lets the scan be processed faster.
- (1b) Detect AR markers within a single frame of the image output from the camera.
- (1c) If no AR marker is found in the image, reduce the focus value $F$ by 15 so that the camera focuses on a more distant point.
- (1d) Repeat steps (1b)–(1c) until an AR marker is detected.
- (1e) When an AR marker is detected for the first time, obtain its initial position $\boldsymbol{p}_0$ and posture $\boldsymbol{\theta}_0$, which are passed as initial values to the following Iteration process.
- (2a) Receive the initial recognition values $\boldsymbol{p}_0$ and $\boldsymbol{\theta}_0$ of the AR marker from the Scanning process.
- (2b) Update the camera parameters based on the recognized depth distance $z$ of the AR marker.
- (2c) Obtain the next recognition values $\boldsymbol{p}_i$ and $\boldsymbol{\theta}_i$ with the updated camera parameters.
- (2d) Calculate the absolute errors between $\boldsymbol{p}_i$ and $\boldsymbol{p}_{i-1}$, and between $\boldsymbol{\theta}_i$ and $\boldsymbol{\theta}_{i-1}$. If the errors are larger than the thresholds $\varepsilon_p$ and $\varepsilon_\theta$, repeat steps (2b)–(2c).
- (2e) If the absolute errors calculated in step (2d) are smaller than the thresholds $\varepsilon_p$ and $\varepsilon_\theta$, output the latest $\boldsymbol{p}_i$ and $\boldsymbol{\theta}_i$ as the final recognition values $\boldsymbol{p}_f$ and $\boldsymbol{\theta}_f$ (a code sketch of this two-stage loop follows the list).
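Taken together, steps (1a)–(2e) form a scan-then-refine loop. Below is a minimal sketch of that loop in Python, assuming an OpenCV ArUco pipeline and a UVC camera whose zoom and focus are exposed through `cv2.CAP_PROP_ZOOM` and `cv2.CAP_PROP_FOCUS`. The `update_zoom` rule (clamping $C \cdot z$ to $[W_{\min}, W_{\max}]$), the placeholder calibration values, and the Rodrigues-vector posture representation are illustrative assumptions, not the paper's exact controller.

```python
import cv2
import numpy as np

# Values taken from the paper's parameter tables; update_zoom() itself
# is an illustrative assumption, not the paper's exact control law.
W_MAX, W_MIN = 500, 100       # zoom range
F_INIT, F_STEP = 150, 15      # initial focus value and scan decrement
C = 1000.0                    # conversion coefficient of zoom value (1/m)
EPS_P, EPS_TH = 1.0, 0.01     # error thresholds: 1.0 mm, 0.01 deg

# Placeholder intrinsics; replace with calibrated K and distortion values.
K = np.array([[1000.0, 0.0, 960.0], [0.0, 1000.0, 540.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def detect(cap):
    """(1b) Grab one frame; return the first marker's corners, or None."""
    ok, frame = cap.read()
    if not ok:
        return None
    corners, ids, _ = detector.detectMarkers(frame)
    return corners[0] if ids is not None else None

def estimate_pose(corners, marker_len=0.02):
    """Solve PnP for the square marker; position in mm, posture as a
    Rodrigues rotation vector converted to degrees (a simplification)."""
    h = marker_len / 2
    obj = np.array([[-h, h, 0], [h, h, 0], [h, -h, 0], [-h, -h, 0]], np.float32)
    _, rvec, tvec = cv2.solvePnP(obj, corners.reshape(4, 2), K, dist)
    return tvec.ravel() * 1000.0, np.degrees(rvec.ravel())

def update_zoom(z):
    """(2b) Hypothetical zoom rule: scale depth z (m) by C, clamp to range."""
    return float(np.clip(C * z, W_MIN, W_MAX))

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_ZOOM, W_MAX)   # (1a) zoom at its maximum
F = F_INIT
cap.set(cv2.CAP_PROP_FOCUS, F)      # (1a) focus at its nearest point

# --- Scanning process: sweep the focus from near to far ---
corners = detect(cap)
while corners is None and F - F_STEP >= 0:
    F -= F_STEP                     # (1c) focus on a more distant point
    cap.set(cv2.CAP_PROP_FOCUS, F)
    corners = detect(cap)           # (1b)/(1d)
assert corners is not None, "no marker found during scan"

# --- Iteration process: refine the pose under dynamic parameters ---
p_prev, th_prev = estimate_pose(corners)                  # (1e)/(2a)
while True:
    cap.set(cv2.CAP_PROP_ZOOM, update_zoom(p_prev[2] / 1000.0))  # (2b)
    corners = detect(cap)
    if corners is None:
        continue                    # marker lost; try another frame
    p, th = estimate_pose(corners)                        # (2c)
    if (np.abs(p - p_prev).max() < EPS_P and              # (2d)
            np.abs(th - th_prev).max() < EPS_TH):
        break
    p_prev, th_prev = p, th
p_f, th_f = p, th                                         # (2e) final values
```

Whether `CAP_PROP_ZOOM` and `CAP_PROP_FOCUS` take effect depends on the camera driver and capture backend, so the property calls above should be verified against the hardware in use.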
3.2. Dynamic Camera Parameter Controller
4. Experimental Evaluation
4.1. Experimental Setup
4.2. Experimental Conditions
4.3. Recognition Performance using Fixed Camera Parameters
4.4. Recognition Performance using Dynamic Focus Control
4.5. Recognition Performance using Dynamic Control of Both Focus and Zoom (Proposed Method)
4.5.1. Performance on Translational Position Recognition
4.5.2. Performance of the Marker’s Posture Recognition
5. Discussion
6. Conclusion
Author Contributions
Acknowledgments
Conflicts of Interest
References
- Miyahara, R.; Haraguchi, D. Object Handling System using Ultra-Small and High-Precision AR Markers (KRIS2023). 2023, p. 140.
- Brachmann, E.; Rother, C. Visual camera re-localization from RGB and RGB-D images using DSAC. IEEE Transactions on Pattern Analysis and Machine Intelligence 2021, 44, 5847–5865.
- Mathis, A.; Mamidanna, P.; Cury, K.M.; Abe, T.; Murthy, V.N.; Mathis, M.W.; Bethge, M. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience 2018, 21, 1281–1289.
- Madec, S.; Jin, X.; Lu, H.; Solan, B.D.; Liu, S.; Duyme, F.; Heritier, E.; Baret, F. Ear density estimation from high resolution RGB imagery using deep learning technique. Agricultural and Forest Meteorology 2019, 264, 225–234.
- Panteleris, P.; Oikonomidis, I.; Argyros, A. Using a Single RGB Frame for Real Time 3D Hand Pose Estimation in the Wild. 2018, pp. 436–445.
- Ichnowski, J.; Avigal, Y.; Kerr, J.; Goldberg, K. Dex-NeRF: Using a Neural Radiance Field to Grasp Transparent Objects. PMLR 2022, 164, 526–536.
- Kan, T.W.; Teng, C.H.; Chen, M.Y. QR Code Based Augmented Reality Applications. In Handbook of Augmented Reality; Furht, B., Ed.; Springer New York: New York, NY, 2011; pp. 339–354. https://doi.org/10.1007/978-1-4614-0064-6_16.
- Rekimoto, J. Matrix: a realtime object identification and registration method for augmented reality. In Proceedings of the 3rd Asia Pacific Computer Human Interaction (Cat. No.98EX110); 1998; pp. 63–68.
- Ruan, K.; Jeong, H. An Augmented Reality System Using QR Code as Marker in Android Smartphone. In Proceedings of the 2012 Spring Congress on Engineering and Technology; 2012; pp. 1–3.
- Ikeda, K.; Tsukada, K. CapacitiveMarker: novel interaction method using visual marker integrated with conductive pattern. 2015, pp. 225–226.
- Uranishi, Y.; Imura, M.; Kuroda, T. The Rainbow Marker: An AR marker with planar light probe based on structural color pattern matching. IEEE, 2016, pp. 303–304.
- ARToolKit SDKs download website. http://www.hitl.washington.edu/artoolkit/download/.
- Zhao, T.; Jiang, H. Landing system for AR.Drone 2.0 using onboard camera and ROS. IEEE, 2016, pp. 1098–1102.
- Qi, J.; Guan, X.; Lu, X. An Autonomous Pose Estimation Method of MAV Based on Monocular Camera and Visual Markers. 2018, pp. 252–257.
- Aoki, R.; Tanaka, H.; Izumi, K.; Tsujimura, T. Self-Position Estimation based on Road Sign using Augmented Reality Technology. 2018, pp. 39–42.
- Ababsa, F.E.; Mallem, M. Robust camera pose estimation using 2d fiducials tracking for real-time augmented reality systems. 2004, pp. 431–435.
- Kato, J.; Deguchi, G.; Inoue, J.; Iwase, M. Improvement of Performance of Navigation System for Supporting Independence Rehabilitation of Wheelchair-Bed Transfer. Journal of Physics: Conference Series 2020, 1487, 012041.
- Nakanishi, H.; Hashimoto, H. AR-Marker/IMU Hybrid Navigation System for Tether-Powered UAV. Journal of Robotics and Mechatronics 2018, 30, 76–85.
- Tsujimura, T.; Aoki, R.; Izumi, K. Geometrical optics analysis of projected-marker augmented reality system for robot navigation. IEEE, 2018, pp. 1–6.
- Yu, X.; Yang, G.; Jones, S.; Saniie, J. AR Marker Aided Obstacle Localization System for Assisting Visually Impaired. 2018, pp. 271–276.
- Romli, R.; Razali, A.F.; Ghazali, N.H.; Hanin, N.A.; Ibrahim, S.Z. Mobile Augmented Reality (AR) Marker-based for Indoor Library Navigation. IOP Conference Series: Materials Science and Engineering 2020, 767, 012062.
- Choi, C.; Christensen, H.I. Real-time 3D model-based tracking using edge and keypoint features for robotic manipulation. 2010, pp. 4048–4055.
- Pai, Y.S.; Yap, H.J.; Singh, R. Augmented reality-based programming, planning and simulation of a robotic work cell. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 2014, 229, 1029–1045.
- Raessa, M.; Chen, J.C.Y.; Wan, W.; Harada, K. Human-in-the-Loop Robotic Manipulation Planning for Collaborative Assembly. IEEE Transactions on Automation Science and Engineering 2020, 17, 1800–1813.
- Zhang, L.; Ye, M.; Chan, P.L.; Yang, G.Z. Real-time surgical tool tracking and pose estimation using a hybrid cylindrical marker. International Journal of Computer Assisted Radiology and Surgery 2017, 12, 921–930.
- Costanza, E.; Huang, J. Designable Visual Markers. Association for Computing Machinery, 2009, pp. 1879–1888.
- Douxchamps, D.; Chihara, K. High-accuracy and robust localization of large control markers for geometric camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence 2008, 31, 376–383.
- Yoon, J.H.; Park, J.S.; Kim, C. Increasing Camera Pose Estimation Accuracy Using Multiple Markers. Springer Berlin Heidelberg, 2006, pp. 239–248.
- Yu, R.; Yang, T.; Zheng, J.; Zhang, X. Real-Time Camera Pose Estimation Based on Multiple Planar Markers. 2009, pp. 640–645.
- Hayakawa, S.; Al-Falouji, G.; Schickhuber, G.; Mandl, R.; Yoshida, T.; Hangai, S. A Method of Toothbrush Position Measurement using AR Markers. IEEE, 2020, pp. 91–93.
- Uematsu, Y.; Saito, H. Improvement of Accuracy for 2D Marker-Based Tracking Using Particle Filter. 2007, pp. 183–189.
- Rubio, M.; Quintana, A.; Pérez-Rosés, H.; Quirós, R.; Camahort, E. Jittering reduction in marker-based augmented reality systems. Springer, 2006, pp. 510–517.
- Bergamasco, F.; Albarelli, A.; Rodolà, E.; Torsello, A. RUNE-Tag: A high accuracy fiducial marker with strong occlusion resilience. 2011, pp. 113–120.
- Bergamasco, F.; Albarelli, A.; Torsello, A. Pi-Tag: a fast image-space marker design based on projective invariants. Machine Vision and Applications 2013, 24, 1295–1310.
- Tanaka, H.; Sumi, Y.; Matsumoto, Y. A novel AR marker for high-accuracy stable image overlay. IEEE, 2012, pp. 217–218.
- Tanaka, H.; Sumi, Y.; Matsumoto, Y. A high-accuracy visual marker based on a microlens array. IEEE, 2012, pp. 4192–4197.
- Tanaka, H.; Sumi, Y.; Matsumoto, Y. A visual marker for precise pose estimation based on lenticular lenses. 2012, pp. 5222–5227.
- Tanaka, H.; Ogata, K.; Matsumoto, Y. Improving the accuracy of visual markers by four dots and image interpolation. 2016, pp. 178–183.
- Toyoura, M.; Aruga, H.; Turk, M.; Mao, X. Mono-spectrum marker: an AR marker robust to image blur and defocus. The Visual Computer 2014, 30, 1035–1044.
- OpenCV-Python Tutorials. http://labs.eecs.tottori-u.ac.jp/sd/Member/oyamada/OpenCV/html/py_tutorials/py_calib3d/py_calibration/py_calibration.html#.
- Inoue, M.; Ogata, M.; Izumi, K.; Tsujimura, T. Posture Estimation for Robot Navigation System based on AR Markers. 2021, pp. 628–633.

| Product name | Logicool BRIO C1000eR® |
|---|---|
| Output resolution | 1920 × 1080 (FHD) |
| Frame rate | 30 fps |
| Diagonal FOV | 90° |
| Digital zoom | 1× to 5× |
| Size (mm) | 102 × 27 × 27 |
| Parameter | Symbol | Value |
|---|---|---|
| Zoom value | $W$ | Variable ($W_{\min} \le W \le W_{\max}$) |
| Maximum zoom value | $W_{\max}$ | 500 |
| Minimum zoom value | $W_{\min}$ | 100 |
| Depth distance of AR marker | $z$ | Variable (m) |
| Minimum depth distance | $z_{\min}$ | 0.05 (m) |
| Conversion coefficient of zoom value | $C$ | 1000 (1/m) |
| Camera magnification | $n$ | Variable |
| Focus value | $F$ | Variable |
| Constant of focus value | | |
| Constant of focus value | | |
| Distortion coefficient matrix | $D$ | Determined by calibration |
| Radial distortion coefficients | $k_1, k_2, k_3$ | |
| Tangential distortion coefficients | $p_1, p_2$ | |
| Internal parameter matrix | $K$ | Determined by calibration |
| Focal length | $f_x, f_y$ | |
| Optical center | $c_x, c_y$ | |
| AR marker position | $\boldsymbol{p}$ | Iterative output variable |
| AR marker posture | $\boldsymbol{\theta}$ | Iterative output variable |
| Threshold of position error | $\varepsilon_p$ | Set arbitrarily |
| Threshold of posture error | $\varepsilon_\theta$ | Set arbitrarily |
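The entries marked "Determined by calibration" refer to standard camera calibration, as covered by the OpenCV-Python tutorial cited in the references. The sketch below shows one way to obtain them with OpenCV's chessboard calibration; the 9×6 inner-corner pattern and the `calib/*.png` image path are illustrative assumptions.

```python
import glob
import cv2
import numpy as np

# Chessboard calibration; the 9x6 inner-corner pattern and the
# "calib/*.png" image path are illustrative assumptions.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# K is the internal parameter matrix (focal lengths f_x, f_y and optical
# center c_x, c_y); dist packs the radial (k1, k2, k3) and tangential
# (p1, p2) distortion coefficients listed in the table above.
_, K, dist, _, _ = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```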
| Parameter | Symbol | Value |
|---|---|---|
| Zoom value | $W$ | 500 |
| Focus value | $F$ | 150 |
| Threshold of position error | $\varepsilon_p$ | 1.0 (mm) |
| Threshold of posture error | $\varepsilon_\theta$ | 0.01 (deg) |
