Abstract:
We examine 288 gamma-ray bursts (GRBs) detected by the Fermi Gamma-ray Space Telescope's Gamma-ray Burst Monitor (GBM) that fell within the field of view of Fermi's Large Area Telescope (LAT) during the first 2.5 years of observations but showed no evidence for emission above 100 MeV. We report photon flux upper limits in the 0.1-10 GeV range during the prompt emission phase, as well as for fixed 30 s and 100 s integrations starting from the trigger time of each burst. We compare these limits with the fluxes expected from extrapolations of the spectral fits presented in the first GBM spectral catalog and infer that roughly half of the GBM-detected bursts either require spectral breaks between the GBM and LAT energy bands or have intrinsically steeper spectra above the peak of their νF_ν spectra (E_pk). To distinguish between these two scenarios, we perform joint GBM and LAT spectral fits to the 30 brightest GBM-detected bursts and find that a majority of these bursts are indeed softer above E_pk than would be inferred from fitting the GBM data alone. Approximately 20% of this spectroscopic subsample shows statistically significant evidence for a cutoff in the high-energy spectrum, which, if attributed to γγ attenuation, places a limit on the maximum Lorentz factor of the relativistic outflow producing this emission. All of these latter bursts have maximum Lorentz factor estimates well below the minimum Lorentz factors calculated for LAT-detected GRBs, revealing a wide distribution in the bulk Lorentz factors of GRB outflows and indicating that LAT-detected bursts may represent the high end of this distribution.
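The comparison described above amounts to extrapolating a GBM-band spectral model (commonly the Band function) into the 0.1-10 GeV range and checking the predicted photon flux against the LAT upper limit. The following sketch illustrates that test; the Band functional form is standard, but the spectral parameters and the upper-limit value used here are hypothetical and are not taken from the catalogs discussed in this work.

```python
import numpy as np
from scipy.integrate import quad

def band(E, A, alpha, beta, Epk):
    """Band photon spectrum N(E) in ph cm^-2 s^-1 keV^-1; E in keV."""
    E0 = Epk / (2.0 + alpha)          # e-folding energy of the low-energy segment
    Eb = (alpha - beta) * E0          # break energy between the two segments
    if E < Eb:
        return A * (E / 100.0) ** alpha * np.exp(-E / E0)
    return (A * ((alpha - beta) * E0 / 100.0) ** (alpha - beta)
              * np.exp(beta - alpha) * (E / 100.0) ** beta)

# Hypothetical GBM-band fit parameters for a bright burst (illustrative only).
A, alpha, beta, Epk = 0.02, -0.8, -2.2, 300.0   # A in ph cm^-2 s^-1 keV^-1 at 100 keV

# Expected 0.1-10 GeV photon flux from the unbroken extrapolation (1e5-1e7 keV).
flux_lat, _ = quad(band, 1.0e5, 1.0e7, args=(A, alpha, beta, Epk), limit=200)

# Hypothetical LAT upper limit over the same interval (ph cm^-2 s^-1).
ul_lat = 1.0e-5
print(f"extrapolated flux = {flux_lat:.2e}, LAT upper limit = {ul_lat:.1e}")
if flux_lat > ul_lat:
    print("extrapolation exceeds the LAT limit -> a spectral break or softer beta is required")
```

When the extrapolated flux exceeds the measured upper limit, as in this illustrative case, the burst must either break between the GBM and LAT bands or have a steeper intrinsic spectrum above E_pk, which is the dichotomy the joint GBM-LAT fits are used to resolve.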