star-fcv-l - Re: [Star-fcv-l] FCV PWG meeting, 18/Nov/2020 (Wed) 9:30am (New York time zone)

  • From: jagbir <jagbir AT rcf.rhic.bnl.gov>
  • To: "Wang, Fuqiang" <fqwang AT purdue.edu>
  • Cc: star-cme-focusgroup-l AT lists.bnl.gov, "STAR Flow, Chirality and Vorticity PWG" <star-fcv-l AT lists.bnl.gov>, aggarwal AT pu.ac.in
  • Subject: Re: [Star-fcv-l] FCV PWG meeting, 18/Nov/2020 (Wed) 9:30am (New York time zone)
  • Date: Tue, 15 Dec 2020 19:25:23 +0530

Dear Fuqiang,

Please find my replies below:

------------------------------

1. because your selection is predominated by statistical fluctuations yet
you're applying a cut on those statistical fluctuations.

Our selection is not based on statistical fluctuations but on the fractional
dumbbell charge separation in the data. However, a similar charge separation
can also arise from statistical fluctuations; to account for this we use the
charge reshuffle. We now have about 160M events. The observed delta gamma in
the data is larger than that of the charge reshuffle, beyond statistical
fluctuations, for the top 0-20% Db+-max bins. The plot you asked for is
attached to this email.
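
(Illustration only, not the actual STAR analysis code: a minimal Python sketch
with assumed names of the comparison being made, i.e. whether the difference
between the data and the charge-reshuffle delta gamma in a Db+-max bin exceeds
the combined statistical uncertainty.)

    import math

    def excess_significance(dg_data, err_data, dg_shuffle, err_shuffle):
        # (data - reshuffle) difference of delta gamma in one Db+-max bin,
        # divided by the combined statistical uncertainty.
        diff = dg_data - dg_shuffle
        err = math.sqrt(err_data**2 + err_shuffle**2)
        return diff / err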

------------------------------

2. I mean the max Dbmax_shuffle bin is a random collection of events from
this centrality bin.

The max Dbmax_shuffle bin is not a random collection of events from this
centrality bin. As explained earlier, we generate charge-reshuffle events by
reshuffling the charges of the particles in the real data in a given collision
centrality. The charge-reshuffle events are therefore a sample independent of
the real data sample in the given centrality, although the numbers of positive
and negative charged particles in each reshuffle event are kept the same as in
the corresponding real data event. The Dbmax_shuffle bins are then made
according to the fractional dumbbell charge separation in the charge-reshuffle
event sample for a given collision centrality.
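
(Illustration only, not the actual analysis code: a minimal Python sketch, with
assumed names, of how one charge-reshuffle event can be built from a real event
while keeping the track angles and the positive/negative multiplicities fixed.)

    import random

    def reshuffle_charges(charges, angles):
        # charges : list of +1/-1 labels for the tracks of one real event
        # angles  : list of (theta, phi) for the same tracks
        # The angles are left untouched; only the charge labels are randomly
        # permuted among the tracks, so the numbers of positive and negative
        # particles stay the same as in the real event.
        shuffled = charges[:]
        random.shuffle(shuffled)
        return list(zip(shuffled, angles))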

------------------------------

Thank you,
Jagbir Singh


On 2020-12-14 22:46, Wang, Fuqiang wrote:
Jagbir,

Please see my replies below.

Best regards,
Fuqiang



-----Original Message-----
From: jagbir <jagbir AT rcf.rhic.bnl.gov>
Sent: Monday, December 14, 2020 11:18 AM
To: Wang, Fuqiang <fqwang AT purdue.edu>
Cc: STAR Flow, Chirality and Vorticity PWG <star-fcv-l AT lists.bnl.gov>; star-cme-focusgroup-l AT lists.bnl.gov; aggarwal AT pu.ac.in
Subject: Re: [Star-fcv-l] FCV PWG meeting, 18/Nov/2020 (Wed) 9:30am (New York time zone)

Dear Fuqiang,

Please find my replies below:

1. I understand your motivation doing that but I don't agree this is
the right approach
(I think it causes biases).

Please let me know why this is not the right approach and what kind of biases
you meant.
[Fuqiang Wang] because your selection is predominated by statistical
fluctuations yet you're applying a cut on those statistical
fluctuations.

Can you plot (data-chrgR. Bkg) and (Correlated bkg) vs (Dbmax bin) on
slide 25 so we can see the details better?



2. you have only one point left. Your f_cme is basically the
(Delta gamma of those events in that Dbmax bin)
- (Delta gamma of a random collection of events in the same centrality bin which happen to have the same Dbmax_shuffle bin)
- (Delta gamma of those same random events calculated after the charges are shuffled)
Do I understand it correctly?

If only one point is left as you wrote, please see the explanation
below:

Here, there is nothing like a random collection of events in the same centrality bin.
[Fuqiang Wang] I mean the max Dbmax_shuffle bin is a random collection
of events from this centrality bin.

We select events according to Db+-max, i.e., according to the back-to-back
charge separation f_DbCS = Db+-max - 1. If only one point is left, as you
wrote, it is the top 10% Db+-max bin, corresponding to the events with the
maximum back-to-back charge separation. This is similar to selecting events in
a particular collision centrality according to the impact parameter or the
event multiplicity. Just as we select the top 10% Db+-max events in the data
(maximum back-to-back charge separation), we select in the same collision
centrality the top 10% Db+-max events of the charge reshuffle (here Db+-max of
the charge reshuffle), again corresponding to the maximum back-to-back
separation. We then obtain delta_gamma_data from the real data events in the
top 10% Db+-max of the data, and delta_gamma_sta from the charge-reshuffle
events in the top 10% Db+-max of the charge reshuffle for the given collision
centrality, which gives us the delta_gamma due to statistical fluctuations.
For the correlated background, we take the real data events corresponding to
the top 10% Db+-max of the charge reshuffle in the given centrality and obtain
delta_gamma_cor from those real data events.
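
(Illustration only, not the actual STAR analysis code: a minimal Python sketch,
with assumed names, of how the top 10% Db+-max events could be selected from
the data and from the charge-reshuffle sample in one centrality.)

    import numpy as np

    def top_fraction_indices(dbmax_values, fraction=0.10):
        # Indices of the events whose Db+-max falls in the top `fraction`
        # (e.g. the top 10% Db+-max bin) of the distribution.
        dbmax_values = np.asarray(dbmax_values)
        threshold = np.quantile(dbmax_values, 1.0 - fraction)
        return np.where(dbmax_values >= threshold)[0]

    # top 10% of Db+-max (data)      -> real-data events used for delta_gamma_data
    # top 10% of Db+-max (reshuffle) -> reshuffle events used for delta_gamma_sta;
    #                                   the corresponding real-data events give delta_gamma_cor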

Now the f_CME is obtained as

f_CME = N1 * (delta_gamma_data - delta_gamma_sta - delta_gamma_cor) / (delta_gamma_data * N)

where N1 is the number of events in the top 10% Db+-max bin and
N is the total number of events in the given collision centrality.
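
(Again purely as an illustration, with hypothetical names: the same combination
written as a small Python function.)

    def f_cme(dg_data, dg_sta, dg_cor, n1, n_total):
        # dg_data : delta gamma of real-data events in the top 10% Db+-max (data) bin
        # dg_sta  : delta gamma of reshuffle events in the top 10% Db+-max (reshuffle) bin
        # dg_cor  : delta gamma of the real-data events matching the top 10%
        #           Db+-max (reshuffle) bin, i.e. the correlated background
        # n1      : number of events in the top 10% Db+-max bin (N1)
        # n_total : total number of events in the given centrality (N)
        return n1 * (dg_data - dg_sta - dg_cor) / (dg_data * n_total)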

Thank you,
With regards,
Jagbir Singh

On 2020-12-14 10:32, Wang, Fuqiang wrote:
> Jagbir,
>
> Thanks for your answers.
>
> To your questions:
>> Please clarify the following
>> Of course you've also removed the large charge-shuffle background which is
>> basically an autocorrelation effect (so to speak) due to the Dbmax (and
>> Dbmax_shuffle) selection bias.
> I was just saying it in passing, referring to the fact that you're
> largely selecting on statistical fluctuations and trying to remove the
> auto-correlation effect by shuffling. It wasn't a question.
>
>> Please explain the following:
>> So is your finite signal really due to the difference between the average
>> of ratios and the ratio of averages (or perhaps also due to residual
>> effect from shuffling)?
> I was referring to the fact that if you had a single Dbmax bin (i.e.
> taking average first and then ratio) then you'd get zero signal by
> definition. You now have 10 bins and take ratios first in each bin and
> then take average of the ratios, and get a positive signal. During the
> focus meeting discussion, it was made clear that your analysis
> required multiple Dbmax bins, not taking average of all bins, but only
> those with Delta gamma > 0. So now I think I understand technically
> how you did it. I understand your motivation doing that but I don't
> agree this is the right approach (I think it causes biases).
> So let me try to understand better:
> On slide 9 of your focus meeting presentation
> https://drupal.star.bnl.gov/STAR/system/files/CME_FOCUS.pdf, you
> state:
> (1) If Delta gamma_bkg. is negative then it is taken as zero.
> (2) If gamma_SS is not negative and gamma_OS is not positive then
> delta gamma = 0.
> Now to slide 25, let's take one centrality say 40-50%, you have the
> blue points (signal) and red+green points (bkg). The 8 points to the
> right of this centrality: all of them have negative bkg and negative
> Delta gamma, so they are not counted in your calculation of CME
> fraction. Now you're left with the two leftmost points. Do both points
> satisfy (2) above? I know both points seem to have bkg>0 & Delta
> gamma>0 but it's unclear if they satisfy SS<0 & OS>0. Assume they do,
> then you're taking average of these two data points. For the sake of
> simplicity, let me say you have only one point left. Your f_cme is
> basically the (Delta gamma of those events in that Dbmax bin)
> - (Delta gamma of a random collection of events in the same centrality
> bin which happen to have the same Dbmax_shuffle bin)
> - (Delta gamma of those same random events calculated after the
> charges are shuffled) Do I understand it correctly?
>
> Best regards,
> Fuqiang
>
>
>
>> -----Original Message-----
>> From: jagbir <jagbir AT rcf.rhic.bnl.gov>
>> Sent: Sunday, December 13, 2020 10:12 AM
>> To: Wang, Fuqiang <fqwang AT purdue.edu>
>> Cc: STAR Flow, Chirality and Vorticity PWG <star-fcv-l AT lists.bnl.gov>;
>> star-cme-focusgroup-l AT lists.bnl.gov; aggarwal AT pu.ac.in
>> Subject: Re: [Star-fcv-l] FCV PWG meeting, 18/Nov/2020 (Wed) 9:30am
>> (New York time zone)
>>
>> Dear Fuqiang, all,
>>
>> Sorry for not answering your email. In fact, I did not see this email.
>> Please go through my replies below:
>> ------------------------
>>
>> 1. A few events do not satisfy this cut, so they are not included in the
>>    Db+-max binning, but all events are included in the overall calculations.
>>
>> 2. Yes
>>
>> 3. We reshuffle charges in each event. We do not randomize charges
>>    according to the positive/negative charge ratio of the given event.
>>    In fact, we pick one event and reshuffle the positive/negative charges,
>>    keeping theta, phi, the number of positive charges and the number of
>>    negative charges as they are. After this we calculate the gamma
>>    correlator. This procedure is repeated for each event. The Db+-max of
>>    the reshuffle is a bit wider than the real distribution, which may be
>>    due to some correlations in the real data, whereas the reshuffle is
>>    purely random. The Db+-max binning is done on the basis of the same
>>    fractions.
>>
>> 4. Let me explain this point.
>>
>>    We pick up a real data event and calculate the following:
>>    i)   Db+-max of the real data event
>>    ii)  reshuffle the charges in the event
>>    iii) calculate Db+-max again and call it the Db+-max of the charge reshuffle
>>    iv)  calculate gamma of the real data event
>>    v)   calculate gamma of the reshuffle event
>>
>>    Now, for a given centrality, steps i) to v) are repeated for each event.
>>    Db+-max (data) and Db+-max (reshuffle) are sliced into ten percentile
>>    bins. The average gamma is then found in every sliced Db+-max (data) and
>>    Db+-max (reshuffle) bin from the respective event samples. It should be
>>    noted that the events in, say, the top 10% Db+-max (data) are not the
>>    same as those in the top 10% Db+-max (reshuffle), i.e., the real events
>>    in the top 10% Db+-max (data) are different from those in the top 10%
>>    Db+-max (reshuffle). The correlated background is then calculated from
>>    the real events corresponding to the top 10% Db+-max (reshuffle) events.
>>
>> Please clarify the following
>>
>> Of course you've also removed the large charge-shuffle background which is
>> basically an autocorrelation effect (so to speak) due to the Dbmax (and
>> Dbmax_shuffle) selection bias.
>>
>> 5. The Db+-max distribution is sliced into ten percentile bins, which
>>    represent different amounts of charge separation in each sliced Db+-max
>>    bin. Let us say we have Db+-max = 2; in this case the fractional
>>    dumbbell charge separation is f_DbCS = Db+-max - 1 = 1, i.e., 100%
>>    back-to-back charge separation, i.e., positively charged particles on
>>    one side of the dumbbell and negatively charged particles on the other
>>    side. So, computing gamma in different Db+-max bins and calculating
>>    things there is different from just making a single wide bin as you
>>    mentioned. This method is designed to obtain a CME-like enriched sample
>>    in a given collision centrality, just as one divides all events into
>>    different collision centralities depending on either the impact
>>    parameter or the event multiplicity, rather than studying all events
>>    taken together without making different collision centrality classes.
>>    However, for a single wide Db+-max bin, as you wrote, we will get zero
>>    signal.
>>
>> Please explain the following:
>>
>> So is your finite signal really due to the difference between the average
>> of ratios and the ratio of averages (or perhaps also due to residual
>> effect from shuffling)?
>>
>>
>> Thank you,
>>
>> with regards,
>> Jagbir Singh
>>
>>
>>
>> On 2020-11-18 23:18, Wang, Fuqiang wrote:
>> > Hi Jagbir,
>> >
>> > Your results are quite interesting. I have a few further questions
>> > about the details of your analysis:
>> > 1. For each event you have Dbmax with the condition of |Dbasy|<0.25.
>> > You bin events of each centrality in Dbmax. You use all events in
>> > your analysis (i.e. you're not throwing away events based on Dbmax
>> > or Dbasy), right?
>> > 2. In your calculation of gamma=<...>/v2c for a particular Dbmax
>> > bin of a given centrality, the v2c is calculated using those events
>> > only, right?
>> > 3. For the charge reshuffle, you reshuffle the charges of all
>> > events, and repeat your analysis from step 1 (i.e. you treat this
>> > as a completely separate "new" data sample), right? Did you
>> > "randomize" the charges according to the positive/negative charge
>> > ratio of the given event? On s11, the Dbmax_shuffle distribution is
>> > a bit wider than the real distribution, do you understand why? How
>> > do you bin the Dbmax and Dbmax_shuffle into 10 bins, respectively
>> > (same bin edges or same fractions)?
>> > 4. Your correlated background gamma is calculated for the Dbmax bin
>> > where Dbmax is from the charge-shuffled events, but using restored
>> > charges, right? If so, then you're effectively taking gamma
>> > difference between Dbmax_i events and Dbmax_shuffle_i events (which
>> > are different events), right? Of course you've also removed the
>> > large charge-shuffle background which is basically an
>> > autocorrelation effect (sort to
>> > speak) due to the Dbmax (and Dbmax_shuffle) selection bias.
>> > 5. You divide Dbmax (and Dbmax_shuffle) into 10 bins and do your
>> > analysis in each bin separately, and then take the weighted average
>> > for your f_cme result. You could just use a single wide Dbmax (and
>> > Dbmax_shuffle) bin, then in principle you should get zero signal
>> > because the correlated "background" is your real signal since they
>> > are now identical event sample. So is your finite signal really due
>> > to the difference between the average of ratios and the ratio of
>> > averages (or perhaps also due to residual effect from shuffling)?
>> >
>> > This is a complicated analysis. It would be really good to have more
>> > discussions so the details can be fleshed out better.
>> >
>> > Thanks,
>> > Fuqiang
>> >
>> >
>> >
>> >> -----Original Message-----
>> >> From: Star-fcv-l <star-fcv-l-bounces AT lists.bnl.gov> On Behalf Of
>> >> jagbir via Star-fcv-l
>> >> Sent: Tuesday, November 17, 2020 10:49 AM
>> >> To: ShinIchi Esumi <esumi.shinichi.gn AT u.tsukuba.ac.jp>; STAR Flow,
>> >> Chirality and Vorticity PWG <star-fcv-l AT lists.bnl.gov>
>> >> Subject: Re: [Star-fcv-l] FCV PWG meeting, 18/Nov/2020 (Wed)
>> >> 9:30am (New York time zone)
>> >>
>> >> Dear ShinIchi, Prithwish and Jiangyong,
>> >>
>> >> I would like to give "Update on event by event charge separation
>> >> in
>> >> Au+Au collisions at 200GeV with STAR detector"
>> >>
>> >> Please add me to agenda.
>> >> I will post my slides later.
>> >>
>> >> Thank you,
>> >> Jagbir Singh
>> >>
>> >> On 2020-11-16 15:57, ShinIchi Esumi via Star-fcv-l wrote:
>> >> > Dear FCV PWG colleagues
>> >> > We will have our weekly FCV PWG meeting on coming Wednesday
>> >> > 18/Nov/2020
>> >> > 9:30AM (in BNL) at our usual time and place. So if you have
>> >> > anything to present, please let us know and please post your
>> >> > slide by Tuesday.
>> >> > We'll talk about the "HLT express productions" in the beginning
>> >> > of the meeting as you see in the agenda page. Jiangyong, please
>> >> > send a link to your slide from last week.
>> >> >
>> >> > The zoom room link, ID and password are in our usual drupal
>> >> > agenda page below.
>> >> > Please also keep in mind that all the preliminary plots should
>> >> > have already been there in the summary area below.
>> >> > Best regards, Jiangyong, Prithwish and ShinIchi
>> >> >
>> >> > Meeting agenda page with zoom link :
>> >> > https://drupal.star.bnl.gov/STAR/blog/jjiastar/bulkcorr
>> >> >
>> >> > Preliminary page :
>> >> > https://drupal.star.bnl.gov/STAR/pwg/bulk-correlations/bulkcorr-preliminary-summary

Attachment: Reply_to_comments.pdf
Description: Adobe PDF document



