The following sections provide detailed analysis of common problems in fiber testing.
(1) Why does the fiber test pass, yet packets are still lost in live network operation?
When selecting a test standard, many users make basic mistakes, such as paying little attention to whether the fiber under test has a 50 μm or 62.5 μm core.
The maximum permitted loss differs considerably between the two core diameters, so selecting the wrong test standard directly changes the pass/fail threshold. For example, suppose the link under test is actually 50 μm fiber, the selected test standard is for 62.5 μm, and the application is 100BASE-FX. If the measured loss is 10 dB, the tester reports PASS, but the link is in fact unqualified because it exceeds the 6.3 dB threshold for 50 μm fiber.
This answers the question above: the test passes, yet the network still loses packets.
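The effect of picking the wrong standard can be sketched as a simple threshold lookup. This is an illustrative sketch only: the 6.3 dB limit for 50 μm fiber and the 10 dB measurement come from the example above, while the 11 dB limit for 62.5 μm fiber is an assumed value chosen so the wrong standard yields PASS.

```python
# Sketch: how selecting the wrong fiber standard flips a FAIL into a PASS.
# The 6.3 dB limit for 50 um fiber and the 10 dB reading come from the
# text's example; the 11 dB limit for 62.5 um is assumed for illustration.
LOSS_LIMITS_DB = {
    ("100BASE-FX", 50.0): 6.3,   # from the text's example
    ("100BASE-FX", 62.5): 11.0,  # assumed limit for illustration
}

def judge(application: str, core_um: float, measured_db: float) -> str:
    """Return PASS/FAIL for a measured link loss under the selected standard."""
    limit = LOSS_LIMITS_DB[(application, core_um)]
    return "PASS" if measured_db <= limit else "FAIL"

measured = 10.0  # dB, measured loss of the 50 um link under test
print(judge("100BASE-FX", 62.5, measured))  # wrong standard selected -> PASS
print(judge("100BASE-FX", 50.0, measured))  # correct 50 um standard  -> FAIL
```

The same 10 dB reading passes or fails depending only on which standard the tester was told to apply, which is exactly why the misconfigured test hides a real problem.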
(2) Why is the 10 Gigabit rate still not supported even though the link passes the 10 Gigabit test standard?
Some users upgrading their network backbone replace the modules in the switches and servers and, quite properly, also test the loss of the fiber in the network. The method seems sound: the fiber has been tested to meet the requirements of the 10 Gigabit network, and the loss is below the standard limit, yet the actual performance is still poor.
Analysis shows the main cause is that the modal bandwidth of the fiber optic cable was not considered. The modal bandwidth of a fiber determines the maximum bandwidth it can provide over a given distance: the larger the modal bandwidth, the higher the transmission rate achievable at that distance. Fibers deployed in earlier years generally have low modal bandwidth, under 160 MHz·km, so the rate cannot be raised once the link grows long, even though the loss is still acceptable.
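A rough way to see the distance limit is to divide the modal bandwidth (in MHz·km) by the link length (in km) to get the bandwidth available on that link. This is a simplified sketch, and the fiber figures are illustrative assumptions (160 MHz·km for an older 62.5 μm fiber, 2000 MHz·km for a laser-optimised OM3-class fiber); real reach limits for 10 Gigabit applications are defined by the IEEE 802.3 standard, not by this formula.

```python
# Rough sketch of the modal-bandwidth limit: a multimode fiber with
# modal bandwidth B (MHz*km) offers roughly B / L of bandwidth over a
# link of length L (km).  Fiber values below are illustrative only.

def available_bandwidth_mhz(modal_bw_mhz_km: float, length_km: float) -> float:
    """Approximate bandwidth (MHz) available over the given link length."""
    return modal_bw_mhz_km / length_km

old_fiber = 160.0   # MHz*km, typical of older 62.5 um fiber (assumed)
new_fiber = 2000.0  # MHz*km, assumed laser-optimised OM3-class fiber
length = 0.3        # km, a 300 m backbone link

print(available_bandwidth_mhz(old_fiber, length))  # ~533 MHz: far too low for 10G
print(available_bandwidth_mhz(new_fiber, length))  # ~6667 MHz
```

Both fibers could show acceptable loss over 300 m, but only the high-modal-bandwidth one leaves enough bandwidth for the higher rate.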
(3) The measured loss meets the standard and the modal bandwidth is fine. Why are there still problems in actual operation?
There is still a common misconception in testing: as long as the loss passes, the fiber is considered fine. That is not the case. Suppose the design standard allows a total link loss of 2.6 dB. One adapter contributes more than 0.75 dB, yet the total link loss is still below 2.6 dB. If you test only the total loss, you may never discover the adapter problem, but in real network use that faulty adapter will greatly increase the transmission bit error rate.
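The scenario above can be sketched as two separate checks: one against the total link budget and one against a per-connection limit. The 2.6 dB budget and 0.75 dB per-connection limit come from the text; the individual event losses below are made-up values chosen so the total passes while one adapter fails.

```python
# Sketch: total link loss can pass while one adapter is individually bad.
# 2.6 dB budget and 0.75 dB per-connection limit are from the text; the
# per-event losses below are hypothetical values for illustration.
LINK_BUDGET_DB = 2.6
PER_CONNECTION_LIMIT_DB = 0.75

connection_losses_db = [0.9, 0.3, 0.2]  # hypothetical per-adapter losses
fiber_loss_db = 0.5                     # hypothetical cable attenuation

total = fiber_loss_db + sum(connection_losses_db)
total_ok = total <= LINK_BUDGET_DB
each_ok = all(loss <= PER_CONNECTION_LIMIT_DB for loss in connection_losses_db)

print(f"total loss {total:.2f} dB, within budget: {total_ok}")  # 1.90 dB, True
print(f"every connection within limit: {each_ok}")              # False: the 0.9 dB adapter
```

A loss-only test reports the first line and stops; an event-by-event measurement (e.g. with an OTDR) is what surfaces the second.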