

“Intelligence relies on understanding and acting in an imperfectly sensed and uncertain world” – Prof. Zoubin Ghahramani




Prof. Zoubin Ghahramani noted in his Nature paper [1] that “intelligence relies on understanding and acting in an imperfectly sensed and uncertain world”. The ability to handle uncertainty is especially important for safety-critical tasks such as autonomous driving and medical diagnosis. Unfortunately, existing neural networks are weak in this respect. The topic of this session, Bayesian neural networks, combines the strengths of two fields: neural networks, which excel at complex function approximation and hidden representation learning, and Bayesian methods, which rest on a solid theoretical foundation for uncertainty modelling. It is a newly emerging topic for neural networks. Compared to vanilla neural networks, Bayesian neural networks have distinctive advantages: 1) they represent, manipulate, and mitigate uncertainty based on the solid theoretical foundations of probability; 2) they encode prior knowledge about a problem; and 3) they offer good interpretability thanks to their clear and meaningful probabilistic structure.
This area began roughly in the 1990s, when Radford Neal [2], David MacKay [3], and Dayan et al. [4] first applied Bayesian techniques to neural networks. However, relatively few works followed them at the time. With the rapid development of both neural networks and Bayesian learning over the past few years, the area has again attracted great interest from the community, and many seminal works have emerged to lay its theoretical foundations and achieve state-of-the-art performance, such as dropout as a Bayesian approximation [5], the connection between Gaussian processes and neural networks [6], and Bayesian convolutional neural networks [7]. In addition, driven by the need to quantify uncertainty, the area has also attracted interest from wider communities, including computer vision, natural language processing, and medical applications [8]. To sustain and build on this success, this special session will study new theories, models, inference algorithms, and applications in this area, and will serve as a platform for the recent flourishing of ideas that use Bayesian approaches in neural networks and neural networks in Bayesian modelling.
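To make the idea of dropout as a Bayesian approximation [5] concrete, the sketch below keeps dropout active at test time and treats repeated stochastic forward passes as approximate posterior samples, so the spread of the predictions acts as an uncertainty estimate. The two-layer network and its weights are hypothetical placeholders (not a trained model); only the Monte Carlo averaging pattern is the point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network; the weights are illustrative placeholders,
# standing in for a network already trained with dropout.
W1 = rng.standard_normal((1, 32))
W2 = rng.standard_normal((32, 1))

def forward(x, p=0.5):
    """One stochastic forward pass with dropout kept on at test time."""
    h = np.maximum(x @ W1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p     # fresh dropout mask each pass
    h = h * mask / (1.0 - p)           # inverted-dropout scaling
    return h @ W2

x = np.array([[0.3]])
# T stochastic forward passes = T approximate posterior samples.
samples = np.stack([forward(x) for _ in range(200)])
mean = samples.mean(axis=0)            # predictive mean
std = samples.std(axis=0)              # predictive uncertainty
```

Averaging the passes recovers the usual deterministic prediction, while the sample standard deviation gives a per-input uncertainty estimate at essentially no extra training cost.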




All aspects of using Bayesian approaches in neural networks and using neural networks in Bayesian modelling are welcome, including but not limited to:

  • Theoretical connections between stochastic processes and neural networks;

  • Prior design for different types of neural networks;

  • Fusion of probabilistic graphical models with neural networks (such as: neural topic models, neural stochastic processes, HMM with neural network likelihoods);

  • (Conditional) Variational autoencoders;

  • Bayesian deep learning (such as deep kernel learning);

  • Variational Bayesian neural networks;

  • Approximate inference for Bayesian neural networks (such as variational Bayes, expectation propagation, etc.);

  • Stochastic gradient Markov chain Monte Carlo (MCMC);

  • Scalable inference in Bayesian neural networks for big data;

  • Bayesian neural networks for classification and regression with uncertainty estimation;

  • Bayesian neural networks for computer vision (object detection, semantic segmentation and scene understanding, motion prediction);

  • Bayesian neural networks for transfer learning;

  • Bayesian neural networks for reinforcement learning;

  • Bayesian neural networks for causal inference;

  • Bayesian neural networks for temporal and spatial data mining;

  • Bayesian neural networks for text and web mining;

  • New baselines for Bayesian uncertainty in neural networks.
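Several of the topics above, stochastic gradient MCMC in particular, fit in a few lines of code. The sketch below runs stochastic gradient Langevin dynamics (SGLD) on a toy one-parameter Bayesian linear regression; the data, step size, and batch size are hypothetical choices made for illustration, not recommendations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D regression data: y = 2x + noise (toy setup).
N = 200
x = rng.uniform(-1, 1, size=N)
y = 2.0 * x + 0.1 * rng.standard_normal(N)

def grad_log_post(w, xb, yb, sigma2=0.01, tau2=1.0):
    """Gradient of the log posterior (Gaussian likelihood, Gaussian prior),
    with the minibatch likelihood term rescaled by N / batch size."""
    scale = N / len(xb)
    lik = scale * np.sum((yb - w * xb) * xb) / sigma2
    prior = -w / tau2
    return lik + prior

eps = 1e-4                 # step size
w = 0.0
samples = []
for t in range(2000):
    idx = rng.choice(N, size=32, replace=False)   # minibatch
    g = grad_log_post(w, x[idx], y[idx])
    # SGLD update: half a gradient step plus sqrt(eps)-scaled Gaussian noise.
    w = w + 0.5 * eps * g + np.sqrt(eps) * rng.standard_normal()
    if t >= 1000:          # discard burn-in
        samples.append(w)

post_mean = np.mean(samples)   # should land near the true slope 2.0
post_std = np.std(samples)
```

Unlike a plain SGD run, the injected noise makes the iterates approximate samples from the posterior over the weight, so the retained trajectory gives both a point estimate and an uncertainty estimate.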

WCCI-IJCNN is the premier international meeting for researchers and other professionals in neural networks and related areas. 



Guidelines for Paper Submission

All papers should be prepared according to the WCCI 2022 submission policy and submitted through the conference website.

To submit your paper to this special session, you will need to choose our special session "Bayesian Neural Networks: The Interplay between Bayes' Theorem and Neural Networks".

Submission Link:



Get to Know Us



Peter Faber Business School
Australian Catholic University, Australia



Senior Lecturer
School of Computer Science
University of Technology Sydney, Australia



Associate Professor

Institute of Natural Sciences and School of Mathematical Sciences, Shanghai Jiao Tong University



  • 31 Jan 2022: Paper submission deadline

  • 26 Apr 2022: Notification of acceptance or rejection

  • 23 May 2022: Final paper submission



  1. Zoubin Ghahramani, Probabilistic machine learning and artificial intelligence, Nature, 521(7553), 452, 2015.

  2. Radford Neal, Bayesian learning for neural networks, PhD Thesis, 1995.

  3. David MacKay, A practical Bayesian framework for backpropagation networks. Neural Computation, 4(3), 448-472, 1992.

  4. Peter Dayan, Geoffrey Hinton, Radford Neal, and Richard Zemel. The Helmholtz machine, Neural Computation 7(5), 889-904, 1995.

  5. Yarin Gal and Zoubin Ghahramani, Dropout as a Bayesian approximation: Representing model uncertainty in deep learning, ICML 2016.

  6. Jaehoon Lee, et al. Deep neural networks as Gaussian processes. arXiv preprint arXiv:1711.00165, 2017.

  7. Kumar Shridhar, Felix Laumann, and Marcus Liwicki, A comprehensive guide to Bayesian convolutional neural network with variational inference, arXiv preprint arXiv:1901.02731, 2019.

  8. Moloud Abdar, Farhad Pourpanah, Sadiq Hussain, Dana Rezazadegan, Li Liu, Mohammad Ghavamzadeh, Paul Fieguth, Xiaochun Cao, Abbas Khosravi, U Rajendra Acharya, Vladimir Makarenkov, and Saeid Nahavandi, A review of uncertainty quantification in deep learning: Techniques, applications and challenges, arXiv preprint arXiv:2011.06225, 2020.




