In optical communication systems, many different impairments degrade the quality of the signal, and improving the bit error rate (BER) of optical transmission systems is a crucial and challenging problem. As the distance travelled by the pulses increases, so does the difficulty of classifying each received symbol correctly. The phase is normally measured at the midpoint of the pulse, because that point represents the highest power level. Several studies have demonstrated that linear Support Vector Machines (SVMs) outperform other trainable classifiers, such as neural networks, for error correction in optical data transmission. In this work, we use a linear SVM to classify received signals and demonstrate a BER improvement over the traditional threshold-based method. We also investigate which samples are most significant for training the linear SVM classifier, so as to reduce the number of bit errors during classification. In particular, we take the neighbouring information of each symbol into account.
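The idea of classifying each symbol together with its neighbouring information can be sketched as follows. This is a minimal illustration, not the authors' actual experimental setup: the synthetic on-off-keyed signal, the noise and inter-symbol interference (ISI) parameters, the 0.8 decision threshold, and the use of scikit-learn's `LinearSVC` are all assumptions made for the sake of the example.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical on-off-keyed symbol stream: bit 0 -> low power, bit 1 -> high
# power, corrupted by Gaussian noise and a crude inter-symbol interference
# model in which each sample leaks power from its two neighbours.
n = 5000
bits = rng.integers(0, 2, n)
power = bits.astype(float)
received = power + 0.3 * np.roll(power, 1) + 0.3 * np.roll(power, -1)
received += rng.normal(0.0, 0.25, n)

# Traditional method: compare the mid-symbol sample against a fixed threshold.
threshold_bits = (received > 0.8).astype(int)

# SVM method: the feature vector for each symbol is its own sample plus the
# samples of its two neighbours, so a linear classifier can learn weights
# that partially cancel the ISI contribution.
X = np.column_stack([np.roll(received, 1), received, np.roll(received, -1)])
split = n // 2
clf = LinearSVC(C=1.0).fit(X[:split], bits[:split])
svm_bits = clf.predict(X[split:])

ber_threshold = np.mean(threshold_bits[split:] != bits[split:])
ber_svm = np.mean(svm_bits != bits[split:])
print(f"threshold BER: {ber_threshold:.4f}  SVM BER: {ber_svm:.4f}")
```

Because the neighbouring samples carry information about how much power leaked into the current symbol, the linear SVM can weight them negatively and recover a cleaner decision than a single fixed threshold.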