This paper presents a study of the errors observed when an optical Gigabit Ethernet link is subjected to attenuation. Using a set of purpose-built tools, we examine the errors observed on a per-octet basis. We find that some octets suffer a far higher probability of error than others, and that the distribution of errors varies with the type of packet transmitted.