Machine learning for particle physics using R, Budapest BI Forum, October 2015
An artist's depiction of the collision of subatomic particles.

Search strategies for new subatomic particles often depend on being able to efficiently discriminate between signal and background processes. Particle physics experiments are expensive, the competition between rival experiments is intense, and the stakes are high. This has led to increased interest in advanced statistical methods to extend the discovery reach of experiments. This talk will present a walk-through of the development of a prototype machine learning classifier for differentiating between decays of quarks and gluons at experiments like those at the Large Hadron Collider at CERN. The power to discriminate between these two types of particle would have a huge impact on many searches for new physics at CERN and beyond. I will discuss why I chose to perform this analysis in R, how switching to R has helped my work, and how I have overcome the problems that I encountered when working with large datasets in R.

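The core of the talk was a walk-through of building such a classifier in R. As a rough illustration of the kind of workflow involved, the sketch below trains a classifier on simulated jet observables. The feature names, distributions, and the choice of a random forest are hypothetical placeholders for illustration only, not the actual variables or model from the analysis.

```r
# Minimal sketch of a quark/gluon jet classifier in R (illustration only).
# The features and data below are simulated placeholders, not the variables
# used in the actual analysis.
library(randomForest)

set.seed(42)
n <- 5000

# Simulate two jet observables that tend to differ between quark and gluon
# jets: gluon jets are typically wider and have higher particle multiplicity.
jets <- data.frame(
  n_charged = c(rpois(n, lambda = 10), rpois(n, lambda = 15)),
  jet_width = c(rbeta(n, 2, 8),        rbeta(n, 3, 6)),
  label     = factor(rep(c("quark", "gluon"), each = n))
)

# Split into training and test sets.
train_idx <- sample(nrow(jets), floor(0.7 * nrow(jets)))
train <- jets[train_idx, ]
test  <- jets[-train_idx, ]

# Fit a random forest and evaluate classification accuracy on held-out jets.
fit  <- randomForest(label ~ n_charged + jet_width, data = train, ntree = 200)
pred <- predict(fit, newdata = test)
mean(pred == test$label)
```
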
This is a talk that I gave at the Budapest BI Forum in 2015. It's an early version of similar talks that I gave elsewhere. The analysis was still evolving at the time: some ideas presented here didn't turn out to be successful, and further work resulted in significant improvements in model performance.

The slides for this talk can be found here.


Photos taken at the event:

A photo of the author presenting at the Budapest BI Forum in 2015.
Presenting at the Budapest BI Forum in 2015. The slide shows the subset of subatomic particles in the Standard Model that are the subject of the talk.
A photo of the author presenting at the Budapest BI Forum in 2015.
Presenting at the Budapest BI Forum in 2015. The slide shows how a large correlation matrix can be visualised as a force-directed network.

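As an aside on that second slide: a correlation matrix over many input features can be turned into a graph and drawn with a force-directed layout, which makes clusters of strongly correlated features easy to spot. The sketch below shows one way to do this with the igraph package on simulated data; the correlation threshold and the feature construction are illustrative assumptions, not the configuration used in the talk.

```r
# Minimal sketch: visualising a correlation matrix as a force-directed network.
# The data below are simulated placeholders, not the features from the talk.
library(igraph)

set.seed(1)

# Simulate 20 features driven by two hidden factors, so that two blocks of
# strongly correlated features appear in the correlation matrix.
z1 <- rnorm(500)
z2 <- rnorm(500)
x  <- cbind(
  sapply(1:10, function(i) z1 + rnorm(500, sd = 0.5)),
  sapply(1:10, function(i) z2 + rnorm(500, sd = 0.5))
)
colnames(x) <- paste0("feature_", seq_len(ncol(x)))

# Keep only strong correlations as edges (0/1 adjacency matrix).
corr <- cor(x)
adj  <- (abs(corr) > 0.5) * 1
diag(adj) <- 0

# Build an undirected graph and draw it with a force-directed
# (Fruchterman-Reingold) layout.
g <- graph_from_adjacency_matrix(adj, mode = "undirected")
plot(g, layout = layout_with_fr(g), vertex.size = 6, vertex.label.cex = 0.7)
```
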