High energy physics focuses on rare processes. Higgs boson production, for example, occurs at roughly 0.1 Hz, while around 40 million collisions happen per second.
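A quick back-of-the-envelope check, using the rates quoted above, of just how rare Higgs production is relative to the collision rate:

```python
collision_rate_hz = 40e6   # ~40 million collisions per second
higgs_rate_hz = 0.1        # rough Higgs production rate quoted above

# one Higgs event per ~400 million collisions
collisions_per_higgs = collision_rate_hz / higgs_rate_hz
print(f"about 1 Higgs per {collisions_per_higgs:.0e} collisions")
```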

LHC experiments have millions of sensors which accurately measure the momentum of particles (by bending them in a magnetic field) and their energy (with calorimeters). Muons survive the calorimeters whereas other particles don't, so they are easy to identify.

Theoretical output: 40 MHz × 1 MB = 40 TB/s. Saving all the data is impossible! Real-time data selection is known as triggering.
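The throughput figure follows directly from the numbers above, assuming ~1 MB of raw data per event:

```python
event_rate_hz = 40e6      # 40 MHz collision rate
event_size_bytes = 1e6    # assumed ~1 MB raw data per event

throughput_tb_per_s = event_rate_hz * event_size_bytes / 1e12
print(throughput_tb_per_s, "TB/s")   # far too much to store
```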

Data selection

Stage 1 (low level), hardware, 40 MHz → 100-1000 kHz

  • this is online computing, so data lost here is lost forever
  • FPGAs, anolog sum
  • expensive and difficult to change.
  • maintenance by experts only

Stage 2 (high level), software, down to 1-10 kHz
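The two stages can be summarised as a rate-reduction cascade. A sketch using the rates above (taking the upper ends of the quoted ranges as assumed values):

```python
# Two-stage trigger cascade (rates assumed from the notes, upper ends)
input_rate_hz = 40_000_000   # 40 MHz into the hardware stage
stage1_out_hz = 1_000_000    # hardware output: 100-1000 kHz
stage2_out_hz = 10_000       # software output: 1-10 kHz

stage1_rejection = 1 - stage1_out_hz / input_rate_hz
overall_reduction = input_rate_hz / stage2_out_hz
print(f"Stage 1 rejects {stage1_rejection:.1%} of events")
print(f"overall rate reduction: x{overall_reduction:.0f}")
```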

They shut the accelerator down in winter because of the energy crisis.

Impracticalities of reading out 40 MHz (>40 TB/s) of data

  • small transistors are more sensitive to radiation
  • radiation-hard optical links are slow

Fast local algorithms (Stage 1)

  • On calorimeters: cluster finding, i.e. identifying high-energy spots (coarse)
  • For muons: track finding and momentum evaluation
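A toy sketch of coarse cluster finding, assuming a calorimeter modelled as a 2D grid of cell energies; a cell counts as a cluster seed if it is a local maximum above some threshold (the real Stage-1 firmware uses fixed-latency hardware logic, so this is illustrative only):

```python
import numpy as np

def find_clusters(energies, threshold=5.0):
    """Toy coarse cluster finder: a cell is a cluster seed if it is
    above threshold and the maximum of its 3x3 neighbourhood."""
    seeds = []
    rows, cols = energies.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = energies[r-1:r+2, c-1:c+2]
            if energies[r, c] >= threshold and energies[r, c] == window.max():
                # cluster energy = coarse 3x3 sum around the seed
                seeds.append((r, c, window.sum()))
    return seeds

grid = np.zeros((8, 8))
grid[3, 4] = 10.0   # one high-energy deposit
grid[3, 5] = 2.0    # leakage into a neighbouring cell
print(find_clusters(grid))   # one seed at (3, 4), energy 12.0
```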

Stage 2: High Level Trigger (full reconstruction of the collisions)

Data decoding

  • sensor information to coordinates and energy
  • this requires alignment and calibration

Track finding

  • complicated
  • the problem scales exponentially in the number of particles and layers

Vertex finding
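To see why track finding blows up, consider a toy model (an assumption, not the actual reconstruction algorithm): with n hits per detector layer and L layers, a naive combinatorial search considers every way of picking one hit per layer, i.e. n**L candidates:

```python
# Toy model: naive combinatorial track finding over L layers,
# n hits per layer -> n**L candidate hit combinations.
def n_combinations(hits_per_layer, layers):
    return hits_per_layer ** layers

for n in (10, 100, 1000):
    print(f"{n} hits/layer, 10 layers -> {n_combinations(n, 10):.1e} combinations")
```

Even modest occupancy per layer makes exhaustive search hopeless, which is why real trackers rely on seeding and pruning heuristics.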

How do we get data out of the detector? The danger is catastrophic throughput collapse. Solutions:

  • money
  • traffic shaping
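One common traffic-shaping idea (sketched here as an assumption about what the notes refer to) is a round-robin "barrel shifter" schedule: in each time slot, every readout source sends to a different event-builder node, so no single destination is swamped by everyone at once:

```python
def barrel_shift_schedule(n_sources, n_slots):
    """Round-robin (barrel-shifter) traffic shaping: in time slot t,
    source s sends to destination (s + t) % n_sources, so each
    destination receives from exactly one source per slot."""
    return [[(s + t) % n_sources for s in range(n_sources)]
            for t in range(n_slots)]

for slot in barrel_shift_schedule(4, 3):
    print(slot)   # each slot is a permutation of destinations 0..3
```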