Abstract
Non-line-of-sight (NLOS) imaging aims to recover the shape and albedo of hidden objects. Despite recent advances, real-time video of complex, dynamic scenes remains a major challenge owing to the weak signal from multiply scattered light. Here we propose and demonstrate a framework of spectrum filtering and motion compensation that realizes high-quality NLOS video of room-sized scenes. Spectrum filtering leverages a wave-based model for denoising and deblurring in the frequency ___domain, enabling computational image reconstruction from a small number of sampling points. Motion compensation, tailored to an interleaved scanning scheme, computes high-resolution live video during the acquisition of low-quality image sequences. Together, these techniques deliver live NLOS video at 4 fps for a variety of dynamic real-life scenes. The results mark a substantial stride toward real-time, large-scale and low-power NLOS imaging and sensing applications.
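The core idea of denoising and deblurring in the frequency ___domain can be illustrated with a generic Wiener-style deconvolution. The sketch below is not the authors' spectrum-filtering algorithm; the synthetic transient signal, the Gaussian blur kernel and the assumed noise-to-signal ratio are all made up purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "transient": a sharp pulse at t = 200, blurred by a Gaussian
# kernel and corrupted by additive noise (all parameters are illustrative).
n = 512
t = np.arange(n)
signal = np.exp(-0.5 * ((t - 200) / 3.0) ** 2)
kernel = np.exp(-0.5 * ((t - n // 2) / 8.0) ** 2)
kernel /= kernel.sum()
K = np.fft.fft(np.fft.ifftshift(kernel))          # kernel spectrum, zero-centred
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * K))
noisy = blurred + 0.01 * rng.standard_normal(n)

# Wiener filtering in the frequency ___domain: invert the blur where the kernel
# response is strong, and attenuate frequencies where it is weak relative to
# an assumed noise-to-signal ratio (nsr). This simultaneously deblurs and
# suppresses noise, which is the general principle behind frequency-___domain
# denoising/deblurring.
nsr = 1e-3
wiener = np.conj(K) / (np.abs(K) ** 2 + nsr)
recovered = np.real(np.fft.ifft(np.fft.fft(noisy) * wiener))

# The recovered pulse should peak near the true position (t = 200).
print(int(np.argmax(recovered)))
```

The single scalar `nsr` stands in for a full noise-power model; a practical system would estimate it from the measured photon counts.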
Data availability
Source data for Figs. 2–6 are available with this paper. The datasets collected by the imaging system can be accessed from Code Ocean at https://doi.org/10.24433/CO.2487919.v2 (ref. 48).
Code availability
The code used in the current study can be accessed from Code Ocean at https://doi.org/10.24433/CO.2487919.v2 (ref. 48).
References
Altmann, Y. et al. Quantum-inspired computational imaging. Science 361, eaat2298 (2018).
Faccio, D., Velten, A. & Wetzstein, G. Non-line-of-sight imaging. Nat. Rev. Phys. 2, 318–327 (2020).
Kirmani, A., Hutchison, T., Davis, J. & Raskar, R. Looking around the corner using transient imaging. In 2009 IEEE 12th International Conference on Computer Vision 159–166 (IEEE, 2009).
Velten, A. et al. Recovering three-dimensional shape around a corner using ultrafast time-of-flight imaging. Nat. Commun. 3, 745 (2012).
Heide, F., Xiao, L., Heidrich, W. & Hullin, M. B. Diffuse mirrors: 3D reconstruction from diffuse indirect illumination using inexpensive time-of-flight sensors. In Proc. IEEE/CVF Conference on Computer Vision and Pattern Recognition 3222–3229 (IEEE, 2014).
Buttafava, M., Zeman, J., Tosi, A., Eliceiri, K. & Velten, A. Non-line-of-sight imaging using a time-gated single photon avalanche diode. Opt. Express 23, 20997–21011 (2015).
Laurenzis, M., Klein, J., Bacher, E. & Metzger, N. Multiple-return single-photon counting of light in flight and sensing of non-line-of-sight objects at shortwave infrared wavelengths. Opt. Lett. 40, 4815–4818 (2015).
O’Toole, M., Lindell, D. B. & Wetzstein, G. Confocal non-line-of-sight imaging based on the light-cone transform. Nature 555, 338–341 (2018).
Lindell, D. B., Wetzstein, G. & O’Toole, M. Wave-based non-line-of-sight imaging using fast f–k migration. ACM Trans. Graph. 38, 1–13 (2019).
Liu, X. et al. Non-line-of-sight imaging using phasor-field virtual wave optics. Nature 572, 620–623 (2019).
Xin, S. et al. A theory of Fermat paths for non-line-of-sight shape reconstruction. In Proc. IEEE/CVF Conference on Computer Vision and Pattern Recognition 6800–6809 (IEEE, 2019).
Pediredla, A., Dave, A. & Veeraraghavan, A. SNLOS: non-line-of-sight scanning through temporal focusing. In 2019 IEEE International Conference on Computational Photography 1–13 (IEEE, 2019).
Bertolotti, J. et al. Non-invasive imaging through opaque scattering layers. Nature 491, 232–234 (2012).
Katz, O., Heidmann, P., Fink, M. & Gigan, S. Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations. Nat. Photon. 8, 784–790 (2014).
Metzler, C. A. et al. Deep-inverse correlography: towards real-time high-resolution non-line-of-sight imaging. Optica 7, 63–71 (2020).
Mosk, A. P., Lagendijk, A., Lerosey, G. & Fink, M. Controlling waves in space and time for imaging and focusing in complex media. Nat. Photon. 6, 283–292 (2012).
Cao, R., de Goumoens, F., Blochet, B., Xu, J. & Yang, C. High-resolution non-line-of-sight imaging employing active focusing. Nat. Photon. 16, 462–468 (2022).
Klein, J., Peters, C., Martín, J., Laurenzis, M. & Hullin, M. B. Tracking objects outside the line of sight using 2D intensity images. Sci. Rep. 6, 32491 (2016).
Xu, F. et al. Revealing hidden scenes by photon-efficient occlusion-based opportunistic active imaging. Opt. Express 26, 9945–9962 (2018).
Saunders, C., Murray-Bruce, J. & Goyal, V. K. Computational periscopy with an ordinary digital camera. Nature 565, 472–475 (2019).
Rapp, J. et al. Seeing around corners with edge-resolved transient imaging. Nat. Commun. 11, 5929 (2020).
Willomitzer, F. et al. Fast non-line-of-sight imaging with high-resolution and wide field of view using synthetic wavelength holography. Nat. Commun. 12, 6647 (2021).
Nam, J. H. et al. Low-latency time-of-flight non-line-of-sight imaging at 5 frames per second. Nat. Commun. 12, 6526 (2021).
Chen, W., Wei, F., Kutulakos, K. N., Rusinkiewicz, S. & Heide, F. Learned feature embeddings for non-line-of-sight imaging and recognition. ACM Trans. Graph. 39, 1–18 (2020).
Gariepy, G., Tonolini, F., Henderson, R., Leach, J. & Faccio, D. Detection and tracking of moving objects hidden from view. Nat. Photon. 10, 23–26 (2016).
Chan, S., Warburton, R. E., Gariepy, G., Leach, J. & Faccio, D. Non-line-of-sight tracking of people at long range. Opt. Express 25, 10109–10117 (2017).
Wu, C. et al. Non-line-of-sight imaging over 1.43 km. Proc. Natl Acad. Sci. USA 118, e2024468118 (2021).
Liu, X. et al. Non-line-of-sight reconstruction with signal–object collaborative regularization. Light Sci. Appl. 10, 198 (2021).
Wang, B. et al. Non-line-of-sight imaging with picosecond temporal resolution. Phys. Rev. Lett. 127, 053602 (2021).
Zhu, S., Sua, Y. M., Bu, T. & Huang, Y.-P. Compressive non-line-of-sight imaging with deep learning. Phys. Rev. Appl. 19, 034090 (2023).
Li, Y. et al. NLOST: non-line-of-sight imaging with transformer. In Proc. IEEE/CVF Conference on Computer Vision and Pattern Recognition 13313–13322 (IEEE, 2023).
Bronzi, D., Villa, F., Tisa, S., Tosi, A. & Zappa, F. SPAD figures of merit for photon-counting, photon-timing, and imaging applications: a review. IEEE Sens. J. 16, 3–12 (2015).
Henderson, R. K. et al. A 192 × 128 time-correlated SPAD image sensor in 40-nm CMOS technology. IEEE J. Solid State Circuits 54, 1907–1916 (2019).
Morimoto, K. et al. Megapixel time-gated SPAD image sensor for 2D and 3D imaging applications. Optica 7, 346–354 (2020).
Liu, X., Bauer, S. & Velten, A. Phasor field diffraction based reconstruction for fast non-line-of-sight imaging systems. Nat. Commun. 11, 1645 (2020).
Pei, C. et al. Dynamic non-line-of-sight imaging system based on the optimization of point spread functions. Opt. Express 29, 32349–32364 (2021).
Riccardo, S., Conca, E., Sesta, V., Velten, A. & Tosi, A. Fast-gated 16 × 16 SPAD array with 16 on-chip 6 ps time-to-digital converters for non-line-of-sight imaging. IEEE Sens. J. 22, 16874–16885 (2022).
Zhao, J. et al. A gradient-gated SPAD array for non-line-of-sight imaging. IEEE J. Sel. Top. Quantum Electron. 30, 8000110 (2023).
Ye, J.-T., Huang, X., Li, Z.-P. & Xu, F. Compressed sensing for active non-line-of-sight imaging. Opt. Express 29, 1749–1763 (2021).
Liu, X. et al. Non-line-of-sight imaging with arbitrary illumination and detection pattern. Nat. Commun. 14, 3230 (2023).
Li, Y., Zhang, Y., Ye, J., Xu, F. & Xiong, Z. Deep non-line-of-sight imaging from under-scanning measurements. In Proc. 37th International Conference on Neural Information Processing Systems 59095–59106 (Curran Associates Inc., 2024).
Wang, J. et al. Non-line-of-sight imaging with signal superresolution network. In Proc. IEEE/CVF Conference on Computer Vision and Pattern Recognition 17420–17429 (IEEE, 2023).
Ozkan, M. K., Sezan, M. I. & Tekalp, A. M. Adaptive motion-compensated filtering of noisy image sequences. IEEE Trans. Circuits Syst. Video Technol. 3, 277–290 (1993).
Schäffter, T., Rasche, V. & Carlsen, I. C. Motion compensated projection reconstruction. Magn. Reson. Med. 41, 954–963 (1999).
Yu, Y. et al. Enhancing non-line-of-sight imaging via learnable inverse kernel and attention mechanisms. In Proc. IEEE/CVF International Conference on Computer Vision 10563–10573 (IEEE, 2023).
Boyce, J. M. Noise reduction of image sequences using adaptive motion compensated frame averaging. In IEEE International Conference on Acoustics, Speech, and Signal Processing Vol. 3, 461–464 (IEEE, 1992).
Hasinoff, S. W. et al. Burst photography for high dynamic range and low-light imaging on mobile cameras. ACM Trans. Graph. 35, 1–12 (2016).
Code for the paper "Real-time non-line-of-sight computational imaging using spectrum filtering and motion compensation". Code Ocean https://doi.org/10.24433/CO.2487919.v2 (2024).
Acknowledgements
This work was supported by the Innovation Program for Quantum Science and Technology (2021ZD0300300), the National Natural Science Foundation of China (grant number 62031024), the Shanghai Municipal Science and Technology Major Project (2019SHZDZX01), the Shanghai Science and Technology Development Funds (22JC1402900), the Shanghai Academic/Technology Research Leader (21XD1403800), the Anhui Initiative in Quantum Information Technologies, Chinese Academy of Sciences, and the New Cornerstone Science Foundation through the Xplorer Prize.
Author information
Authors and Affiliations
Contributions
F.X., X.D. and J.-W.P. conceived the research. J.-T.Y., Y.S. and F.X. performed the experiments and data processing. J.-T.Y., Y.S. and W.L. implemented the reconstruction algorithms and analyzed the data, with input from all authors. J.-T.Y., Y.S., J.-W.Z. and Z.-P.L. calibrated the imaging system. J.-T.Y., F.X. and J.-W.P. wrote the paper, with input from all authors. All authors contributed materials and analysis tools.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Computational Science thanks Christopher Metzler, Andreas Velten and Jingyi Yu for their contribution to the peer review of this work. Primary Handling Editor: Jie Pan, in collaboration with the Nature Computational Science team.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Information
Supplementary Figs. 1–18, Notes 1–12 and Tables 1–7.
Supplementary Video 1
Introduction video.
Supplementary Video 2
Imaging results (Fig. 4).
Supplementary Video 3
Imaging results (Fig. 5).
Supplementary Video 4
Imaging results (Fig. 6).
Supplementary Video 5
Imaging results (Supplementary Fig. 11).
Source data
Source Data Fig. 2
Reconstruction results of the algorithm.
Source Data Fig. 3
Reconstruction results of the algorithm.
Source Data Fig. 4
Reconstruction results of the algorithm.
Source Data Fig. 5
Reconstruction results of the algorithm.
Source Data Fig. 6
Reconstruction results of the algorithm.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Ye, JT., Sun, Y., Li, W. et al. Real-time non-line-of-sight computational imaging using spectrum filtering and motion compensation. Nat Comput Sci 4, 920–927 (2024). https://doi.org/10.1038/s43588-024-00722-4