Bayesian optimization for sequential decision-making with multi-armed bandits | R-bloggers


[This article was first published on R-posts.com, and kindly contributed to R-bloggers.]



Join our workshop Bayesian optimization for sequential decision-making with multi-armed bandits, which is part of our Workshops for Ukraine series!

Here is some more info:

Title: Bayesian optimization for sequential decision-making with multi-armed bandits

Date: Thursday, October 23, 18:00 – 20:00 CEST (Rome/Berlin/Paris time zone)

Speaker: Jordan Nafa is a data scientist and Bayesian statistician who previously worked for Game Data Pros, where he designed and built production systems for Bayesian optimization and experimentation in large mobile and console games. Before that, he was a PhD student in political science at the University of North Texas, where he taught courses in causal inference, statistics, and American political behavior.

Description: This workshop introduces Bayesian optimization and multi-armed bandits. It covers sequential decision problems, Thompson sampling, and multi-armed bandit algorithms. The workshop is designed for people with a basic knowledge of traditional experimental design and analysis, as commonly applied in A/B testing, who want to learn about Bayesian methods for sequential decision-making. It includes practical examples of implementing multi-armed bandit algorithms and their generalizations in R and Python using Stan and PyMC.
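To give a flavor of the kind of algorithm the workshop covers, here is a minimal sketch of Thompson sampling for a Bernoulli multi-armed bandit in plain Python (the workshop itself uses Stan and PyMC; this standalone version, including the arm names and conversion rates, is purely illustrative). Each arm keeps a Beta posterior over its unknown success rate; every round we draw one sample per arm and pull the arm with the highest draw, so exploration falls away naturally as the posteriors concentrate.

```python
import random

def thompson_sampling(true_rates, n_rounds=5000, seed=42):
    """Thompson sampling for a Bernoulli multi-armed bandit.

    true_rates: hypothetical true conversion rates, one per arm
    (unknown to the algorithm; used only to simulate rewards).
    Each arm's posterior is Beta(successes + 1, failures + 1),
    i.e. a uniform Beta(1, 1) prior updated by observed rewards.
    """
    rng = random.Random(seed)
    k = len(true_rates)
    successes = [0] * k
    failures = [0] * k
    for _ in range(n_rounds):
        # Draw one plausible rate per arm from its current posterior...
        draws = [rng.betavariate(successes[i] + 1, failures[i] + 1)
                 for i in range(k)]
        # ...and pull the arm whose draw is highest.
        arm = max(range(k), key=lambda i: draws[i])
        # Simulate a Bernoulli reward from the chosen arm.
        if rng.random() < true_rates[arm]:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures

if __name__ == "__main__":
    # Three hypothetical A/B/C variants with different conversion rates.
    succ, fail = thompson_sampling([0.02, 0.05, 0.12])
    pulls = [s + f for s, f in zip(succ, fail)]
    print("pulls per arm:", pulls)
```

After a few thousand rounds, nearly all pulls should go to the best arm (index 2), which is the practical appeal over a fixed-split A/B test: traffic shifts toward the winner while the experiment is still running.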

Minimum registration fee: 20 EUR (or 20 USD or 800 UAH)

Please note that the registration confirmation is sent to all registered participants one day before the workshop, rather than immediately after registration.

How can I register?

  • Save your donation receipt (after the donation is processed, the website gives you the option to enter your email address, to which the receipt will be sent).

  • Fill in the registration form, adding a screenshot of your donation receipt (attach the receipt that was emailed to you, not the page you see right after donating).

If you are not interested in attending yourself, you can also contribute by sponsoring a student's participation, so that they can attend for free. If you choose to sponsor a student, all proceeds still go directly to organizations working in Ukraine. You can sponsor a specific student, or leave the choice to us and we will assign the sponsored place to a student on the waiting list.

How can I sponsor a student?

  • Save your donation receipt (after the donation is processed, the website gives you the option to enter your email address, to which the receipt will be sent).

  • Fill in the sponsor form, adding a screenshot of your donation receipt (attach the receipt that was emailed to you, not the page you see right after donating). You can indicate whether you want to sponsor a specific student or let us assign the place to a student on the waiting list. You can also indicate whether you would prefer to prioritize students from developing countries when assigning the place(s) you sponsor.

If you are a university student and cannot afford the registration fee, you can also register for the waiting list here. (Note that registering for the waiting list does not guarantee participation.)

You can also find more information about this workshop series, a schedule of our upcoming workshops, and a list of our past workshops, whose recordings and materials you can access, here.

We look forward to seeing you at the workshop!


Bayesian optimization for sequential decision-making with multi-armed bandits was first posted on September 23, 2025, at 6:10 pm.


