Abstract
Probabilistic computing excels at approximating solutions to combinatorial problems and at modeling uncertainty. However, implementing probabilistic models on conventional deterministic hardware is challenging: (pseudo)random number generation introduces computational overhead and additional data shuffling. There is therefore a pressing need for dedicated probabilistic computing architectures that achieve low latency at reasonable energy consumption. Physical computing offers a promising solution: such systems do not rely on an abstract deterministic representation of data but encode information directly in physical quantities, enabling inherently probabilistic architectures that exploit physical entropy sources. Photonic computing is a prominent variant of physical computing owing to its large available bandwidth, several orthogonal degrees of freedom for data encoding, and favorable properties for in-memory computing and parallel data transfer. Here, we highlight key developments in physical photonic computing and photonic random number generation. We further provide insights into the realization of probabilistic photonic processors, their impact on artificial intelligence systems, and future challenges.
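To make the sampling overhead concrete, the minimal sketch below (not from the article; the function name bayesian_linear_mc, the Gaussian weight posterior, and the use of NumPy are illustrative assumptions) shows Monte Carlo inference through a single Bayesian linear layer on deterministic hardware: every forward pass draws a fresh pseudorandom weight matrix, so the random number generator and the extra weight data it produces and moves account for much of the per-sample cost.

```python
# Illustrative sketch only: Monte Carlo inference through a Bayesian linear
# layer on deterministic hardware. Each forward pass requires a fresh
# pseudorandom weight draw, which is the overhead a physical entropy source
# would remove.
import numpy as np

rng = np.random.default_rng(seed=0)

def bayesian_linear_mc(x, w_mean, w_std, n_samples=64):
    """Average n_samples stochastic forward passes y = W x.

    x       : (d_in,)        input vector
    w_mean  : (d_out, d_in)  weight means
    w_std   : (d_out, d_in)  weight standard deviations
    returns : predictive mean and standard deviation of y, each (d_out,)
    """
    outputs = []
    for _ in range(n_samples):
        # Pseudorandom sampling: one full weight matrix generated and
        # moved through memory per pass.
        w = rng.normal(loc=w_mean, scale=w_std)
        outputs.append(w @ x)
    outputs = np.stack(outputs)
    return outputs.mean(axis=0), outputs.std(axis=0)

# Toy usage: 4 inputs, 2 outputs, 64 Monte Carlo samples.
x = np.array([0.2, -1.0, 0.5, 0.3])
w_mean = rng.normal(size=(2, 4))
w_std = 0.1 * np.ones((2, 4))
y_mean, y_uncertainty = bayesian_linear_mc(x, w_mean, w_std)
print(y_mean, y_uncertainty)
```

In a probabilistic photonic processor, the per-sample weight draws inside the loop would instead be supplied by physical noise generated in the optical domain, removing the pseudorandom number generation and the associated data movement.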
Acknowledgements
We thank J. Stuhrmann, from Illustrato, for his assistance with the illustrations. This research was supported by the European Union’s Horizon 2020 research and innovation program (grant no. 101017237, PHOENICS project) and the European Union’s Innovation Council Pathfinder program (grant no. 101046878, HYBRAIN project). We acknowledge funding support by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy EXC 2181/1—390900948 (the Heidelberg STRUCTURES Excellence Cluster), the Excellence Cluster 3D Matter Made to Order (EXC-2082/1—390761711) and CRC 1459 ‘Intelligent Matter’.
Author information
Authors and Affiliations
Contributions
Conceptualization: F.B.-P., W.P. Methodology: F.B.-P., A.P.O., A.V., H. Borras, B.K., G.S.S. Investigation: F.B.-P., A.P.O., A.V., H. Borras, B.K., G.S.S. Visualization: F.B.-P., A.P.O., A.V. Funding acquisition: W.P., H.F., C.D.W., H. Bhaskaran, A.S. Project administration: W.P., H.F. Supervision: W.P., H.F., C.D.W., H. Bhaskaran, G.S.S., A.S. Writing—original draft: F.B.-P., A.P.O., A.V. Writing—review & editing: all authors.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Computational Science thanks the anonymous reviewer(s) for their contribution to the peer review of this work. Primary Handling Editor: Jie Pan, in collaboration with the Nature Computational Science team.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Brückerhoff-Plückelmann, F., Ovvyan, A.P., Varri, A. et al. Probabilistic photonic computing for AI. Nat Comput Sci 5, 377–387 (2025). https://doi.org/10.1038/s43588-025-00800-1