Publications

[1]

Voldoire, T., Chopin, N., Rateau, G., and Ryder, R. J. Saddlepoint Monte Carlo and its application to exact ecological inference. arXiv preprint 2410.18243 (October 2024). [ bib | arXiv ]

[2]

Andrieu, C., Chopin, N., Fincato, E., and Gerber, M. Gradient-free optimization via integration. arXiv preprint 2408.00888 (August 2024). [ bib | arXiv ]

[3]

Sakhi, O., Aouali, I., Alquier, P., and Chopin, N. Logarithmic smoothing for pessimistic off-policy evaluation, selection and learning. Accepted at NeurIPS 2024 (2024). [ bib | arXiv ]

[4]

Chopin, N., Crucinio, F. R., and Korba, A. A connection between tempering and entropic mirror descent. ICML (July 2024). [ bib | arXiv ]

[5]

Chopin, N., Crucinio, F. R., and Singh, S. S. Towards a turnkey approach to unbiased Monte Carlo estimation of smooth functions of expectations. arXiv preprint 2403.20313 (March 2024). [ bib | arXiv ]

[6]

Chopin, N., and Gerber, M. Higher-order Monte Carlo through cubic stratification. SIAM J. Numer. Anal. 62, 1 (2024), 229–247. [ bib | DOI | arXiv ]

[7]

Youssfi, Y., and Chopin, N. Scalable Bayesian bi-level variable selection in generalized linear models. Foundations of Data Science (2024). [ bib | DOI | arXiv | http ]

[8]

Jin, R., Singh, S. S., and Chopin, N. De-biasing particle filtering for a continuous time hidden Markov model with a Cox process observation model. Statist. Sinica 34 (2024), 1215–1239. [ bib | DOI | arXiv ]

[9]

Sakhi, O., Rohde, D., and Chopin, N. Fast slate policy optimization: Going beyond Plackett-Luce. Transactions on Machine Learning Research (2023). [ bib | arXiv | http ]

[10]

Corenflos, A., Sutton, M., and Chopin, N. Debiasing piecewise deterministic Markov process samplers using couplings. arXiv preprint 2306.15422 (June 2023). [ bib | arXiv ]

[11]

Sakhi, O., Alquier, P., and Chopin, N. PAC-Bayesian offline contextual bandits with guarantees. In Proceedings of the 40th International Conference on Machine Learning (23–29 Jul 2023), A. Krause, E. Brunskill, K. Cho, B. Engelhardt, S. Sabato, and J. Scarlett, Eds., vol. 202 of Proceedings of Machine Learning Research, PMLR, pp. 29777–29799. [ bib | arXiv | .html | .pdf ]

[12]

Dau, H.-D., and Chopin, N. On backward smoothing algorithms. Ann. Statist. 51, 5 (2023), 2145–2169. [ bib | DOI | arXiv ]

[13]

Chopin, N., Fulop, A., Heng, J., and Thiery, A. H. Computational Doob h-transforms for online filtering of discretely observed diffusions. In Proceedings of the 40th International Conference on Machine Learning (23–29 Jul 2023), A. Krause, E. Brunskill, K. Cho, B. Engelhardt, S. Sabato, and J. Scarlett, Eds., vol. 202 of Proceedings of Machine Learning Research, PMLR, pp. 5904–5923. [ bib | arXiv | .html | .pdf ]

[14]

Chopin, N., Singh, S. S., Soto, T., and Vihola, M. On resampling schemes for particle filters with weakly informative observations. Ann. Statist. 50, 6 (2022), 3197–3222. [ bib | DOI | arXiv ]

[15]

Corenflos, A., Chopin, N., and Särkkä, S. De-Sequentialized Monte Carlo: a parallel-in-time particle smoother. Journal of Machine Learning Research 23, 283 (2022), 1–39. [ bib | arXiv | .html ]

[16]

Ducrocq, G., Chopin, N., Errard, J., and Stompor, R. Improved Gibbs samplers for cosmic microwave background power spectrum estimation. Phys. Rev. D 105 (May 2022), 103501. [ bib | DOI | arXiv ]

[17]

Dau, H.-D., and Chopin, N. Waste-free sequential Monte Carlo. J. R. Stat. Soc. Ser. B. Stat. Methodol. 84, 1 (2022), 114–148. [ bib | DOI | arXiv ]

[18]

Chopin, N., and Ducrocq, G. Fast compression of MCMC output. Entropy 23, 8 (2021), Paper No. 1017, 16. [ bib | DOI | arXiv ]

[19]

Buchholz, A., Chopin, N., and Jacob, P. E. Adaptive tuning of Hamiltonian Monte Carlo within sequential Monte Carlo. Bayesian Anal. 16, 3 (2021), 745–771. [ bib | DOI | arXiv | http ]

[20]

Alvares, D., Armero, C., Forte, A., and Chopin, N. Sequential Monte Carlo methods in Bayesian joint models for longitudinal and time-to-event data. Stat. Model. 21, 1-2 (2021), 161–181. [ bib | DOI ]

[21]

Findling, C., Chopin, N., and Koechlin, E. Imprecise neural computations as a source of adaptive behaviour in volatile environments. Nature Human Behaviour (2020), 1–14. [ bib | DOI | arXiv ]

[22]

Gerber, M., Chopin, N., and Whiteley, N. Negative association, ordering and convergence of resampling methods. Ann. Statist. 47, 4 (2019), 2236–2260. [ bib | DOI | arXiv ]

[23]

Buchholz, A., and Chopin, N. Improving Approximate Bayesian Computation via Quasi-Monte Carlo. J. Comput. Graph. Statist. 28, 1 (2019), 205–219. [ bib | DOI | arXiv ]

[24]

Andrieu, C., Doucet, A., Yildirim, S., and Chopin, N. On the utility of Metropolis-Hastings with asymmetric acceptance ratio. arXiv e-prints (Mar 2018). [ bib | arXiv ]

[25]

Riou-Durand, L., and Chopin, N. Noise contrastive estimation: asymptotic properties, formal comparison with MC-MLE. Electron. J. Stat. 12, 2 (2018), 3473–3518. [ bib | DOI | arXiv ]

[26]

Chopin, N., and Gerber, M. Sequential quasi-Monte Carlo: introduction for non-experts, dimension reduction, application to partly observed diffusion processes. In Monte Carlo and quasi-Monte Carlo methods, vol. 241 of Springer Proc. Math. Stat. Springer, Cham, 2018, pp. 99–121. [ bib | arXiv ]

[27]

Alvares, D., Armero, C., Forte, A., and Chopin, N. Sequential Monte Carlo methods in random intercept models for longitudinal data. In Bayesian statistics in action, vol. 194 of Springer Proc. Math. Stat. Springer, Cham, 2017, pp. 3–9. [ bib | DOI ]

[28]

Vasishth, S., Chopin, N., Ryder, R., and Nicenboim, B. Modelling dependency completion in sentence comprehension as a Bayesian hierarchical mixture process: A case study involving Chinese relative clauses. In Proceedings of the 39th Annual Meeting of the Cognitive Science Society (July 2017). [ bib | arXiv ]

[29]

Vasishth, S., Nicenboim, B., Chopin, N., and Ryder, R. Bayesian hierarchical finite mixture models of reading times: A case study. PsyArXiv (July 2017). [ bib | DOI ]

[30]

Oates, C. J., Girolami, M., and Chopin, N. Control functionals for Monte Carlo integration. J. R. Stat. Soc. Ser. B. Stat. Methodol. 79, 3 (2017), 695–718. [ bib | DOI | arXiv ]

[31]

Gerber, M., and Chopin, N. Convergence of sequential quasi-Monte Carlo smoothing algorithms. Bernoulli 23, 4B (2017), 2951–2987. [ bib | DOI | arXiv ]

[32]

Chopin, N., and Ridgway, J. Leave Pima Indians alone: binary regression as a benchmark for Bayesian computation. Statist. Sci. 32, 1 (2017), 64–87. [ bib | DOI | arXiv ]

[33]

Schretter, C., He, Z., Gerber, M., Chopin, N., and Niederreiter, H. Van der Corput and golden ratio sequences along the Hilbert space-filling curve. In Monte Carlo and quasi-Monte Carlo methods, vol. 163 of Springer Proc. Math. Stat. Springer, Cham, 2016, pp. 531–544. [ bib | DOI ]

[34]

Alquier, P., Ridgway, J., and Chopin, N. On the properties of variational approximations of Gibbs posteriors. J. Mach. Learn. Res. 17, 239 (2016), Paper No. 239, 41. [ bib | arXiv ]

[35]

Barthelmé, S., Chopin, N., and Cottet, V. Divide and conquer in ABC: expectation-propagation algorithms for likelihood-free inference. In Handbook of approximate Bayesian computation, Chapman & Hall/CRC Handb. Mod. Stat. Methods. CRC Press, Boca Raton, FL, 2019, pp. 415–434. [ bib ]

[36]

Chopin, N., and Gerber, M. Application of sequential Quasi-Monte Carlo to autonomous positioning. In 2015 23rd European Signal Processing Conference (EUSIPCO) (Aug 2015), pp. 489–493. [ bib | DOI | arXiv ]

[37]

Chopin, N., Ridgway, J., Gerber, M., and Papaspiliopoulos, O. Towards automatic calibration of the number of state particles within the SMC2 algorithm. arXiv preprint 1506.00570 (Jun 2015). [ bib | arXiv ]

[38]

Chopin, N., Gadat, S., Guedj, B., Guyader, A., and Vernet, E. On some recent advances on high dimensional Bayesian statistics. In Modélisation Aléatoire et Statistique—Journées MAS 2014, vol. 51 of ESAIM Proc. Surveys. EDP Sci., Les Ulis, 2015, pp. 293–319. [ bib | DOI ]

[39]

Gerber, M., and Chopin, N. Sequential quasi-Monte Carlo. J. R. Stat. Soc. Ser. B. Stat. Methodol. 77, 3 (2015), 509–579. [ bib | DOI | arXiv ]

[40]

Barthelmé, S., and Chopin, N. The Poisson transform for unnormalised statistical models. Stat. Comput. 25, 4 (2015), 767–780. [ bib | DOI | arXiv ]

[41]

Chopin, N., and Singh, S. S. On particle Gibbs sampling. Bernoulli 21, 3 (2015), 1855–1883. [ bib | DOI | arXiv ]

[42]

Kantas, N., Doucet, A., Singh, S. S., Maciejowski, J., and Chopin, N. On particle methods for parameter estimation in state-space models. Statist. Sci. 30, 3 (2015), 328–351. [ bib | DOI | arXiv ]

[43]

Gelman, A., Vehtari, A., Jylänki, P., Robert, C., Chopin, N., and Cunningham, J. P. Expectation propagation as a way of life. arXiv e-prints (Dec 2014). [ bib | arXiv ]

[44]

Ridgway, J., Alquier, P., Chopin, N., and Liang, F. PAC-Bayesian AUC classification and scoring. In Advances in Neural Information Processing Systems 27, Z. Ghahramani, M. Welling, C. Cortes, N. Lawrence, and K. Weinberger, Eds. Curran Associates, Inc., 2014, pp. 658–666. [ bib | arXiv ]

[45]

Alquier, P., Cottet, V., Chopin, N., and Rousseau, J. Bayesian matrix completion: prior specification and consistency. arXiv preprint 1406.1440 (2014). [ bib | arXiv ]

[46]

Barthelmé, S., and Chopin, N. Expectation propagation for likelihood-free inference. J. Amer. Statist. Assoc. 109, 505 (2014), 315–333. [ bib | DOI | arXiv ]

[47]

Andrieu, C., Chopin, N., Doucet, A., and Rubenthaler, S. Perfect simulation for the Feynman-Kac law on the path space. arXiv preprint 1210.0376 (Mar 2013). [ bib | arXiv ]

[48]

Chopin, N., Jacob, P. E., and Papaspiliopoulos, O. SMC2: an efficient algorithm for sequential analysis of state space models. J. R. Stat. Soc. Ser. B. Stat. Methodol. 75, 3 (2013), 397–426. [ bib | DOI | arXiv ]

[49]

Chopin, N., Rousseau, J., and Liseo, B. Computational aspects of Bayesian spectral density estimation. J. Comput. Graph. Statist. 22, 3 (2013), 533–557. [ bib | DOI | arXiv ]

[50]

Singh, S. S., Chopin, N., and Whiteley, N. Bayesian learning of noisy Markov decision processes. ACM Trans. Model. Comput. Simul. 23, 1 (2013), Art. 4, 25. [ bib | DOI | arXiv ]

[51]

Schäfer, C., and Chopin, N. Sequential Monte Carlo on large binary sampling spaces. Stat. Comput. 23, 2 (2013), 163–184. [ bib | DOI | arXiv ]

[52]

Chopin, N., Gelman, A., Mengersen, K. L., and Robert, C. P. In praise of the referee. arXiv preprint 1205.4304 (May 2012). [ bib | arXiv ]

[53]

Andrieu, C., Barthelmé, S., Chopin, N., Cornebise, J., Doucet, A., Girolami, M., Kosmidis, I., Jasra, A., Lee, A., Marin, J.-M., Pudlo, P., Robert, C. P., Sedki, M., and Singh, S. S. Some discussions of D. Fearnhead and D. Prangle’s read paper “Constructing summary statistics for approximate Bayesian computation: semi-automatic approximate Bayesian computation”. arXiv preprint 1201.1314 (Jan 2012). [ bib | arXiv ]

[54]

Chopin, N., and Robert, C. Discussion of “Catching up faster by switching sooner: a predictive approach to adaptive estimation with an application to the AIC–BIC dilemma” by van Erven, T., Grünwald, P., and de Rooij, S. Journal of the Royal Statistical Society (series B) 74, 3 (2012), 361–417. [ bib ]

[55]

Rousseau, J., Chopin, N., and Liseo, B. Bayesian nonparametric estimation of the spectral density of a long or intermediate memory Gaussian process. Ann. Statist. 40, 2 (2012), 964–995. [ bib | DOI | arXiv ]

[56]

Chopin, N., Lelièvre, T., and Stoltz, G. Free energy methods for Bayesian inference: efficient exploration of univariate Gaussian mixture posteriors. Stat. Comput. 22, 4 (2012), 897–916. [ bib | DOI | arXiv ]

[57]

Barthelmé, S., Beffy, M., Chopin, N., Doucet, A., Jacob, P., Johansen, A., Marin, J., and Robert, C. P. Discussions on “Riemann manifold Langevin and Hamiltonian Monte Carlo methods” by M. Girolami and B. Calderhead. Journal of the Royal Statistical Society (series B) 73, 2 (2011), 123–214. [ bib | DOI | arXiv ]

[58]

Chopin, N., and Robert, C. Comments on “Using TPA for Bayesian inference” by Huber, M. and Schott, S. In Bayesian Statistics 9 (2011), J. M. Bernardo, M. J. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman, A. F. M. Smith, and M. West, Eds., Oxford University Press, pp. 257–282. [ bib ]

[59]

Chopin, N., and Papaspiliopoulos, O. Comments on “Bayesian variable selection for random intercept modeling of Gaussian and non-Gaussian data” by Frühwirth-Schnatter, S. and Wagner, H. In Bayesian Statistics 9 (2011), J. M. Bernardo, M. J. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman, A. F. M. Smith, and M. West, Eds., Oxford University Press, pp. 165–200. [ bib ]

[60]

Chopin, N. Fast simulation of truncated Gaussian distributions. Stat. Comput. 21, 2 (2011), 275–288. [ bib | DOI | arXiv ]

[61]

Chopin, N., Del Moral, P., and Rubenthaler, S. Stability of Feynman-Kac formulae with path-dependent potentials. Stochastic Process. Appl. 121, 1 (2011), 38–60. [ bib | DOI | arXiv ]

[62]

Chopin, N., Iacobucci, A., Marin, J., Mengersen, K., Robert, C., Ryder, R., and Schäfer, C. On particle learning; comments on “Particle learning for sequential Bayesian computation” by Lopes, Carvalho, Johannes, and Polson. In Bayesian Statistics 9 (2011), J. M. Bernardo, M. J. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman, A. F. M. Smith, and M. West, Eds., Oxford University Press, pp. 317–360. [ bib | arXiv ]

[63]

Chopin, N., and Jacob, P. Free energy sequential Monte Carlo, application to mixture modelling. In Bayesian Statistics 9, J. M. Bernardo, M. J. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman, A. F. M. Smith, and M. West, Eds. Oxford Univ. Press, Oxford, 2011, pp. 91–118. With discussions by Peter J. Green and Benjamin M. Taylor. [ bib | DOI | arXiv ]

[64]

Barthelmé, S., and Chopin, N. ABC-EP: expectation propagation for likelihood-free Bayesian computation. In Proceedings of the 28th International Conference on Machine Learning (ICML-11) (New York, NY, USA, June 2011), L. Getoor and T. Scheffer, Eds., ICML ’11, ACM, pp. 289–296. [ bib ]

[65]

Robert, C. P., Chopin, N., and Rousseau, J. Rejoinder: Harold Jeffreys’s Theory of probability revisited. arXiv preprint 0909.1008 (Jan 2010). [ bib | arXiv ]

[66]

Chopin, N., and Robert, C. P. Properties of nested sampling. Biometrika 97, 3 (2010), 741–755. [ bib | DOI | arXiv ]

[67]

Jacob, P., Chopin, N., Robert, C. P., and Rue, H. Comments on “Particle Markov chain Monte Carlo” by C. Andrieu, A. Doucet, and R. Holenstein. arXiv preprint 0911.0985 (Nov 2009). [ bib | DOI | arXiv ]

[68]

Rue, H., Martino, S., and Chopin, N. Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations. J. R. Stat. Soc. Ser. B Stat. Methodol. 71, 2 (2009), 319–392. [ bib | DOI ]

[69]

Robert, C. P., Chopin, N., and Rousseau, J. Harold Jeffreys’s theory of probability revisited. Statist. Sci. 24, 2 (2009), 141–172. [ bib | DOI | arXiv ]

[70]

Chopin, N. Jim Albert: Bayesian computation with R. Statistics and Computing 19 (2009), 111–112. [ bib | DOI ]

[71]

Chopin, N. On the equivalence between standard and sequentially ordered hidden Markov models. Statist. Probab. Lett. 78, 14 (2008), 2171–2174. [ bib | DOI | arXiv ]

[72]

Chopin, N., and Robert, C. Comment on ‘Nested sampling’ by Skilling. In Bayesian Statistics 8 (2007), J. M. Bernardo et al., Eds., Oxford University Press, pp. 491–524. [ bib ]

[73]

Chopin, N., and Robert, C. Comment on ‘Estimating the integrated likelihood via posterior simulation using the harmonic mean equality’ by Raftery et al. In Bayesian Statistics 8 (2007), J. M. Bernardo et al., Eds., Oxford University Press, pp. 371–416. [ bib ]

[74]

Chopin, N., and Fearnhead, P. Comment on ‘Objective Bayesian analysis of multiple changepoints for linear models’ by Giron et al. In Bayesian Statistics 8 (2007), J. M. Bernardo et al., Eds., Oxford University Press, pp. 227–252. [ bib ]

[75]

Chopin, N. Comment on ‘Sequential Monte Carlo for Bayesian computation’ by Del Moral et al. In Bayesian Statistics 8 (2007), J. M. Bernardo et al., Eds., Oxford University Press, pp. 115–148. [ bib ]

[76]

Rue, H., Martino, S., and Chopin, N. Discussion on ‘Modern statistics for spatial point processes’ by Møller and Waagepetersen. Scandinavian Journal of Statistics 34, 4 (2007), 685–711. [ bib ]

[77]

Chopin, N. Dynamic detection of change points in long time series. Ann. Inst. Statist. Math. 59, 2 (2007), 349–366. [ bib | DOI ]

[78]

Chopin, N. Inference and model choice for sequentially ordered hidden Markov models. J. R. Stat. Soc. Ser. B Stat. Methodol. 69, 2 (2007), 269–284. [ bib | DOI ]

[79]

Chopin, N., and Varini, E. Particle filtering for continuous-time hidden Markov models. In Conference Oxford sur les méthodes de Monte Carlo séquentielles, vol. 19 of ESAIM Proc. EDP Sci., Les Ulis, 2007, pp. 12–17. [ bib | DOI ]

[80]

Chopin, N. Discussion of ‘Exact and efficient likelihood-based estimation for discretely observed diffusion processes’ by Beskos et al. Journal of the Royal Statistical Society (series B) 68 (2006), 333–382. [ bib ]

[81]

Chopin, N. Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. Ann. Statist. 32, 6 (2004), 2385–2411. [ bib | DOI | arXiv ]

[82]

Chopin, N., and Pelgrin, F. Bayesian inference and state number determination for hidden Markov models: an application to the information content of the yield curve about inflation. J. Econometrics 123, 2 (2004), 327–344. [ bib | DOI ]

[83]

Chopin, N. Comment on ‘IID sampling with self-avoiding particle filters: the pinball sampler’ by Mengersen and Robert. In Bayesian Statistics 7 (2003), J. M. Bernardo et al., Eds., Oxford University Press, pp. 277–292. [ bib ]

[84]

Chopin, N. A sequential particle filter method for static models. Biometrika 89, 3 (2002), 539–551. [ bib | DOI ]

