hot-qcd-whitepaper-l - [Hot-qcd-whitepaper-l] some comments for the WP

  • From: "Steffen A. Bass" <bass AT phy.duke.edu>
  • To: hot-qcd-whitepaper-l AT lists.bnl.gov
  • Subject: [Hot-qcd-whitepaper-l] some comments for the WP
  • Date: Mon, 15 Sep 2014 09:07:26 -0400

Dear WP writing committee,

since I won’t be at the Town Hall meeting this afternoon, I’d just like to
share my personal thoughts about some important items for our WP with you:

Generally I am very confident that we (as a community) and you (as writers)
have a well-developed plan for the exciting science we wish to do over the
next decade; for me a useful starting point in formulating this vision would
be our community WP from two years ago that can be found at the website for
this meeting (and in case you are looking for the latex source, let me know
and I’ll be happy to send it to you).

There are two items that were not touched upon in detail in that WP, but that
I strongly feel should be strengthened in this forthcoming WP:

* the significant progress we’ve made in developing tools for quantitative
model-to-data comparisons, and how these new tools/techniques will enable us
to rigorously quantify the properties of QCD matter that we are promising as
deliverables for the next phase of RHIC operations

* the role that the synergy between experiment and theory plays in this
endeavor. Essentially we now have three pillars: experiment, lattice/first-principles
QCD, and dynamical modeling with effective models of QCD. The synergy is
important - missing any of the three pillars puts the success of our field at
risk. However, each pillar comes with its own resource needs, and in
particular the third pillar has often played 2nd (or rather 3rd) fiddle in
that respect (quite often when we talk about computational needs we only
think Lattice, but that’s absolutely not true anymore). I’m attaching to this
email a WP regarding the computational needs for RHIC phenomenology that
outlines the developments and needs in that domain in significant detail (the
WP was drafted by U. Heinz, B. Schenke and myself and also has Raju’s
blessing - it was originally written as input for the Computational Nuclear
Physics Town Hall Meeting).

I’ve also tried to synthesize three paragraphs from the model-to-data
comparison WP and the computational RHIC phenomenology WP, which I’m
attaching as well, as a suggestion for the language/content to include in our
community WP.

Thank you very much for doing this job - having been involved in the last
such exercise two years ago, I can fully appreciate the amount of time and
work that you are investing for the future of our field (and the little
amount of thanks/recognition you’re going to get…). Please let me know if
there is anything else I can do for you.

all the best,

Steffen



The physics goal for the next decade is to characterize the properties of the
quark-gluon plasma liquid by quantitative extraction of important medium
parameters from precision measurements of sensitive observables, including
hadron spectra, angular distributions and correlations, jet observables, and
electromagnetic probes. To achieve this goal, detailed comparisons of
theoretical calculations with a variety of experimental observables are
necessary.

These theoretical calculations all require significant computational
resources and are absolutely essential for the success of the overall RHIC
program. To perform meaningful comparisons, event-by-event calculations are
required for many observables. For example, the azimuthal
anisotropy in the produced particle distributions is highly sensitive to
event-by-event fluctuations. Furthermore, the details of fluctuations need to
be under control when studying observables sensitive to the details of the
phase transformation and the presence of a critical point.
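
To illustrate (this formula is standard and added here only as a reminder, it
is not taken from any of the attached WPs): the azimuthal anisotropy is
commonly quantified through the Fourier coefficients v_n of the
single-particle azimuthal distribution,

    dN/d\phi \propto 1 + 2 \sum_n v_n \cos[ n(\phi - \Psi_n) ],
    v_n = \langle \cos[ n(\phi - \Psi_n) ] \rangle,

where both the coefficients v_n and the event-plane angles \Psi_n fluctuate
from collision to collision, which is why event-by-event calculations are
needed to describe them.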

Experimental data sets for a single beam energy and projectile combination
tend to surpass several petabytes. The modeling community must not only
reproduce results extracted from these data sets for numerous classes of
observables covering numerous collisions, but must also investigate a
high-dimensional (of order two dozen) parameter space. In the last few years
modelers have begun to tackle the effects of fluctuations of the initial
state, which requires analyzing hundreds of initial-state configurations for
a single impact parameter. Additionally, as the field begins to analyze and interpret
data from the beam energy scan, modelers must forego the two-dimensional
descriptions applied at the highest energies and consider approaches that
model the dynamical evolution in three spatial dimensions, without the
simplification of boost-invariance along the beam direction. Combined with
the increased number and size of the experimental data sets resulting
from measurements at many different beam energies and from an increased
number of collision systems, the numerical demands facing the modeling
community will grow by 3 to 4 orders of magnitude.

During the last decade strategies have been developed expressly for comparing
complex compute-intensive theoretical models to large heterogeneous data sets.
These strategies utilize a combination of state-of-the-art methodologies in
the statistical sciences and high-throughput computing technology that are
just now becoming part of the scientific toolset in theoretical nuclear
physics. They integrate Bayesian inference and model surrogate algorithms
with the modeling environment to constrain the multi-parameter model space by
comparison with the high-dimensional data from RHIC and LHC. High-throughput
computing techniques, combined with Gaussian process surrogate models that
rapidly explore a model’s parameter space, will allow the computational
model-to-data comparison to be carried out within a feasible amount of time
and provide the means for multiple iterations of this process. However,
obtaining quantitative conclusions requires one to faithfully calculate what
has been measured and to assign uncertainties to the comparison with
experimental observables,
thus requiring a sustained effort in the development of realistic
computational models of heavy-ion collisions.
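
As a purely illustrative sketch of the workflow described in the preceding
paragraph (the choice of libraries, the toy two-parameter "model", and all
numbers below are my own assumptions, not taken from any of the attached
WPs): one runs the full model at a set of design points, trains
Gaussian-process emulators on those runs, and then performs the Bayesian
comparison to data on the cheap emulator rather than on the expensive model,
e.g. in Python:

# Minimal sketch, assuming scikit-learn as the GP library and a toy
# two-parameter "model"; in a real analysis run_model would be the expensive
# event-by-event simulation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def run_model(theta):
    # placeholder for the full (expensive) model: parameters -> observables
    a, b = theta
    return np.array([a + 0.5 * b, a * b, np.sin(a) + b])

# 1) run the full model at a modest number of design points in parameter space
design = rng.uniform(low=[0.0, 0.0], high=[2.0, 2.0], size=(50, 2))
train_obs = np.array([run_model(t) for t in design])

# 2) train one Gaussian-process emulator per observable (the fast surrogate)
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.5, 0.5])
emulators = [GaussianProcessRegressor(kernel=kernel, normalize_y=True)
             .fit(design, train_obs[:, i]) for i in range(train_obs.shape[1])]

def emulate(theta):
    return np.array([gp.predict(np.atleast_2d(theta))[0] for gp in emulators])

# 3) Bayesian comparison to (here: synthetic) data with a Gaussian likelihood
data = run_model([1.2, 0.7]) + rng.normal(scale=0.05, size=3)
sigma = 0.05

def log_posterior(theta):
    if np.any(theta < 0.0) or np.any(theta > 2.0):  # flat prior on design box
        return -np.inf
    resid = emulate(theta) - data
    return -0.5 * np.sum((resid / sigma) ** 2)

# 4) random-walk Metropolis over the emulator (cheap), not the full model
theta = np.array([1.0, 1.0])
logp = log_posterior(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.05, size=2)
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        theta, logp = prop, logp_prop
    chain.append(theta)
posterior = np.array(chain[1000:])
print("posterior mean:", posterior.mean(axis=0), " true:", [1.2, 0.7])

In a realistic analysis the likelihood would include the full experimental
and model uncertainties, and a production-grade sampler would replace the
simple Metropolis walk used here.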

Attachment: RHIC_computing_2014.pdf
Description: Adobe PDF document





