  • From: Jamie Nagle <jamie.nagle AT colorado.edu>
  • To: sphenix-run-l AT lists.bnl.gov
  • Subject: [Sphenix-run-l] sPHENIX Shift Change Notes (Monday, July 3, 2023)
  • Date: Mon, 3 Jul 2023 22:48:44 -0400

Monday, July 3, 2023

General (Stefan/Kin)


Current Priority Items (PC)


Work Control Coordinators (Chris/Joel)


Plan of the Day (Stefan/PC/all; to be revisited at end of meeting)

  • Exercising running of BigPartition (EMCal, HCal, MBD, ZDC, INTT, TPOT, LL1, GL1)

  • Work on adding remaining EMCal ADC readout

  • INTT work on event synchronization

  • Training new shift crew in running DAQ/scripts


  • Request from TPC group for a potential “peripheral” AuAu trigger

  • Status of INTT reconstruction and z-vertex position

  • Status of MBD now that RevTick working for feedback on z-vertex to C-AD

  • Plan to walk through what to expect in OnlineMonitoring this week (HCal today)

  • Major activity on Wednesday:
    → 5 am: start turning the magnet off (takes two hours)
    → 7 am: access starts
    → see detailed activities, and note the ordering in terms of the sEPD install (10 am)
    → after the bore is closed, 5 hours for TPC with no field
    → 2 hours to ramp the field back on (does not prevent RHIC from filling; coordinate)

========================================================================

Evening (Brett Fadem [SL], Aditya Prasad Dash [DAQ Op], Lameck Mwibanda [Data Mon], Zhongling Ji [Det Op])

  • Began shift, inherited fill from day shift.

  • Dan worked to integrate the EMCAL. 

  • We were told there would be a beam dump at 7:10 pm, but we asked to push it back by 2 hours, to 9:10 pm.

  • After getting a new fill, we successfully completed one run.

  • On the subsequent run, SEBs 00 to 06 and 14 and 15 weren’t incrementing. 

  • Shift ended with John Haggerty working to get the DAQ working again.


Night (YA, H. Enyo, Abdullah)

  • Inherited beam. Fill #33932, 107x107 bunches, scheduled to dump at 6:46.

  • ~0:00 DAQ expert (John Haggerty) working to fix a problem in which many SEBs were not running (seb0-seb6, etc.)

  • ~0:10 Data taking started. John fixed the problem and started run20514.

  • ~0:30 Ended the run to practice starting a run. 0.54M events.

  • ~0:46 Started run20516, but soon only TPOT was producing packets. Stopped the run.

  • Several attempts to start a run failed.

  • ~2:00 Called the TPOT expert (Hugo). He fixed the problem.

  • 2:25 run20522, 1 hour, 1.1M events

  • 3:35 run20523, 1 hour, 1.2M events; stopped since the VNC viewer response was very slow and hard to control. Opened a new VNC viewer and started a new run.

  • 4:50 run20524, 1 hour, 1.2M events

  • A few runs failed to start.

  • 6:20 run20527. 

  • 6:45 RHIC preparing for dump. Turned detectors to safe mode.

  • Ready to dump → beam dumped at about 6:52

  • 7:45 injection started. Ramping up.


Day (Athira, Maya, Stacyann, Bill)

  • Our first fill was lost unexpectedly (Blue quench) at ~8:50.

  • MCR noted that beam was not expected to be back for 4-6 hours.

  • MCR put us in Controlled Access at ~11:20 (for "at least 3hrs"). 

  • Tim worked on EMCal voltages (black block for sector 17 bias fixed).

  • Charles did some TPC work in local mode.

  • Rachid and Maya tested INTT LV GUI (works well).

  • Rachid and team made an RA for INTT 2W1 work (filter board 2S fixed).

  • Takao powercycled many TPC FEEs.


========================================================================

Magnet (Kin)

  • Nothing new to report.  Will turn off the magnet Wed. morning ~ 5 am.

MBD (Mickey, Lameck)

Trigger (Dan)

  • New LUT to be included tonight to add a trigger for peripheral collisions (a minimal sketch of the bit logic follows this list):

  • 8 ->  MBD N

  • 9 -> MBD S

  • 10 -> MBD NS >= 2 hits

  • 11 -> MBD NS >= 2 hits & <= 10 hits

  • 12 -> MBD NS >= 2 hits w/ timing

  • 13 -> MBD NS >= 2 hits & <= 10 hits w/ timing

  • Note: The label on the LL1TriggerControl.py GUI will not change right away unless I am sure of the stability of the change.
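
For reference, a minimal sketch in Python of how these bit assignments and the coincidence logic could be represented. All names here (TRIGGER_LUT, mbd_trigger_bits) are illustrative, and "NS >= 2 hits" is interpreted as a north-south coincidence with at least two hits in total; the actual definitions live in LL1TriggerControl.py and the LL1 firmware.

    # Hypothetical sketch of the new MBD trigger-bit assignments; names and
    # the exact hit-counting convention are assumptions, not the real code.
    TRIGGER_LUT = {
        8:  "MBD N",
        9:  "MBD S",
        10: "MBD NS >= 2 hits",
        11: "MBD NS >= 2 hits & <= 10 hits",
        12: "MBD NS >= 2 hits w/ timing",
        13: "MBD NS >= 2 hits & <= 10 hits w/ timing",
    }

    def mbd_trigger_bits(hits_north, hits_south, timing_ok):
        """Return the set of trigger bits an event would satisfy."""
        bits = set()
        if hits_north >= 1:
            bits.add(8)
        if hits_south >= 1:
            bits.add(9)
        total = hits_north + hits_south
        # Coincidence: at least one hit in each arm, two in total.
        if hits_north >= 1 and hits_south >= 1 and total >= 2:
            bits.add(10)
            if total <= 10:
                bits.add(11)
            if timing_ok:
                bits.add(12)
                if total <= 10:
                    bits.add(13)
        return bits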


  • Need a timing cut study before using the peripheral trigger reliably here, so running with the timing cut (trigger 12) beforehand will be helpful.

  • HCAL Basic LEMO cosmic trigger ready to test on detector. Wednesday 7/5 during access?

  • Will remove the MBD LEMO input and insert the HCAL cosmic LEMO input.

  • Measure the latency for HCAL, then would like to align ZDC/HCAL/MBD triggers.

  • HCAL should be around the same as MBD at this point.

GTM/GL1 (Martin/Dan/John K)


DAQ (Martin/John H)


MVTX (Zhaozhong)

  • Discussion about the possibility of installing a few small scintillating-fiber pads after the MVTX, near the beam pipe, to monitor beam conditions

  • Might ship some scintillating pads from LANL to BNL

  • Bob Azmoun might have some extra scintillators 

  • Report the studies to C-AD for better beam collimation 

TPC (Tom Hemmick, Jin, Takao, Evgeny, Charles, Tamas, David, Nick, Adi, Thomas Marshall, Christof)

  • 1st - Bob + John K. restore 5 diffuse laser heads + trigger board (access)

  • 2nd - Immediately after bore closure, test spark monitor installed yesterday

  • 3rd - After spark monitor test, start diffuse laser test

  • Require 5-6 hours with NO MAGNET + NO BEAM after bore closure

Today 07/03: FEE Maintenance on TPC 

  • Power cycling to recover stuck FEEs

  • Monitoring Runs in local mode

  • Thanks to Jamie and Bill for the heads-up

  • Thanks to Takao for helping me with Sector 8

  • Kin has some good news about CF4 supply, see his update on Gas and Cooling

  • Question about triggering on low multiplicity events and z-vertex position

  • Previously, during low-rate collisions, the TPC has taken MBD triggers in global mode (I believe)

  • Is it possible to trigger only on low multiplicity/low centrality?

  • Concern from our tracking group:

  • “More importantly we need to select on collisions, but reject central events. In central events the combinatorics is prohibitive to run tracking with wide open cuts.”

  • Secondarily: “The question is if we can assemble a trigger that selects on the Z position to be close to 0, +- 10 cm is fine.”

  • On the TPC side, we need to make sure that if we had semi-central/peripheral MBD triggers, we could actually see them (see the sketch below)
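
A back-of-the-envelope sketch of the selection being asked for, assuming the usual MBD/BBC-style vertex estimate z = c (t_S - t_N) / 2; the function names and thresholds are illustrative, not the actual trigger firmware.

    # Illustrative offline-style version of the requested trigger logic:
    # require an MBD coincidence, reject central (high-multiplicity) events,
    # and keep vertices within +-10 cm of z = 0.
    C_CM_PER_NS = 29.9792458  # speed of light in cm/ns

    def mbd_z_vertex_cm(t_south_ns, t_north_ns):
        # The vertex sits toward the arm that fires earlier, shifted by
        # half the arrival-time difference times c.
        return 0.5 * C_CM_PER_NS * (t_south_ns - t_north_ns)

    def select_peripheral(hits_north, hits_south, t_south_ns, t_north_ns,
                          max_hits=10, z_cut_cm=10.0):
        total = hits_north + hits_south
        if not (hits_north >= 1 and hits_south >= 1 and total >= 2):
            return False      # require a north-south coincidence
        if total > max_hits:
            return False      # reject central events (tracking combinatorics)
        return abs(mbd_z_vertex_cm(t_south_ns, t_north_ns)) <= z_cut_cm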


HCal (Shuhang)

  • Hot towers possibly caused by the ADC board still exist.

  • Uploaded the absolute calibration based on cosmics to the CDB

EMCal ()

  • Still planning to go in for access Wednesday morning to replace humidity probes

  • Anthony working on online monitoring plot/display

  • JH helped recover SiPM bias voltage channels

TPOT (Hugo)

  • Called tonight to fix a stuck-FEE problem (power-cycle, re-initialize, reset). Considering writing a script to do that “automatically” and passing it to the shift crew; a hypothetical sketch follows below.
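
A minimal sketch of what such a shift-crew script might look like; the command names (tpot_fee_power, tpot_fee_init, tpot_fee_reset) are placeholders standing in for the real expert tools, not actual TPOT commands.

    #!/usr/bin/env python3
    # Hypothetical stuck-FEE recovery sequence: power-cycle, re-initialize,
    # reset. Command names are placeholders for the real expert tools.
    import subprocess
    import sys
    import time

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def recover_fee(fee_id):
        run(["tpot_fee_power", "off", fee_id])  # placeholder command
        time.sleep(5)                           # let the FEE power down fully
        run(["tpot_fee_power", "on", fee_id])
        run(["tpot_fee_init", fee_id])          # re-initialize registers
        run(["tpot_fee_reset", fee_id])         # final reset before running

    if __name__ == "__main__":
        for fee in sys.argv[1:]:                # e.g. ./recover_fee.py 3 7
            recover_fee(fee)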

  • Working on online monitoring

  • Nothing to report on the detector side 


INTT (Rachid/Maya)

  • We had access to the IR and fixed the current of two ladders from ROC 2S; we need to confirm this success with beam data.

  • In the next store, we would like to have a short run (15 min) to check the two ladders by analyzing the data quickly.

  • Synchronization among the FELIXes: good progress on the reset of the FPHX beam-crossing counters (see the sketch below).
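
A hedged sketch of the kind of cross-check involved: after the FPHX beam-crossing (BCO) counter reset, hits read out by the different FELIX servers for the same collision should carry consistent BCO values. The event layout below is an assumption for illustration, not the actual INTT code.

    # Illustrative BCO consistency check across FELIX servers (assumed
    # layout: dict mapping FELIX server id -> BCO counter value per event).
    def check_bco_sync(event):
        bcos = set(event.values())
        if len(bcos) > 1:
            print("BCO mismatch across FELIXes:", sorted(event.items()))
            return False
        return True

    # Example: felix 3 lagging by one crossing would be flagged.
    check_bco_sync({0: 1042, 1: 1042, 2: 1042, 3: 1041})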



sEPD(Rosi)

  • sEPD install July 5th - no further update

Gas/Cooling (Kin)

  • ~10 bottles of CF4 delivered today (~noon) and Jeff/Aaron will swap the bottles.

  • Consumption seemed higher than the flow rate indicated. Rob/Aaron/Jeff haven't found leaks.


ZDC ()


Background Counters ()


Online Monitoring (Chris)




||------------------------------------------------------------------------------------------
|| James L. Nagle   
|| Professor of Physics, University of Colorado Boulder
|| EMAIL:   jamie.nagle AT colorado.edu
|| SKYPE:  jamie-nagle        
|| WEB:      http://spot.colorado.edu/~naglej 
||------------------------------------------------------------------------------------------

