What is the purpose of the jam signal in CSMA/CD?
First, beware that modern switched Ethernet LANs don't use CSMA/CD anymore. CSMA/CD was a technique that applied to 10 Mbit/s and 100 Mbit/s Ethernets that used hubs, not switches. And honestly, there were never many 100BASE-TX hubs around; everyone moved to switches around that time. Gigabit Ethernet (1000BASE-T) is switched in practice; GigE hubs were never built.
On modern switched Ethernets, you don't have a shared medium anymore. When you're plugged into a switch, the "collision domain" is only between you and your switch port. And if you're in full duplex mode, which is almost always true with switches, then you don't have any possibility of collision at all. If you can't have a collision, you'll never detect a collision, so you'll never have reason to transmit a jam signal.
So back in the days of hubs (and shared cables like thinnet/cheapernet/10BASE-2 coax and thicknet 10BASE-5), here's how it worked:
Imagine you have a large 10BASE-T LAN with lots of hubs and maximum-length cable runs, maxing out the "repeater rule" of at most 4 repeaters (hubs) between any two devices on the LAN. Due to signal propagation delays across the cables and hubs, it can take the IEEE-spec maximum of 232 bit-times for a signal transmitted from Host A to reach the farthest host on the network (Host B).
Now imagine Host A begins transmitting a frame, and by bad luck Host B, way at the other end of the network, 232 bit-times away, begins transmitting a frame just 231 bit-times after Host A started. Because of the propagation delay, Host B had no way to know that Host A was already 231 bits into its transmission when Host B sent the first bit of its preamble.

Host B will detect this collision within the first, say, 32 bits of its own transmission, which could be about 232 to 264 bit-times sooner than Host A will detect it, because the evidence of the collision still has to propagate all the way back to Host A.

If Host B happened to detect the collision on its very first bit and simply stopped transmitting at that instant, it might not have stayed on the medium long enough for Host A to also detect it (you can't guarantee that a station will detect a collision on the very first overlapping bit). Host A would then never know to run its collision-handling procedure. So rather than just going silent, Host B transmits the jam signal, staying on the medium long enough to make sure Host A realizes that a collision occurred.
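If it helps to see the arithmetic, here's a toy timeline in Python using the numbers above. The 32-bit detection window is an assumption for illustration; this is a sketch of the reasoning, not the IEEE 802.3 timing budget.

```python
# Toy timeline (in bit-times) of the worst-case collision described above.
# The 232/231/32 figures are the ones quoted in the text.

PROP = 232          # one-way propagation delay, Host A to Host B
B_START = 231       # Host B starts transmitting this long after Host A
DETECT = 32         # assume a station needs up to 32 bit-times to notice

# Host B is already receiving A's frame when it starts sending, so it
# detects the collision within its first DETECT bits of transmission.
b_detects_by = B_START + DETECT                  # 263

# Host A only finds out once B's signal (or jam) propagates all the way back.
a_detects_from = B_START + PROP                  # 463
a_detects_by = B_START + PROP + DETECT           # 495

print(f"Host B detects by bit-time {b_detects_by}")
print(f"Host A detects between bit-times {a_detects_from} and {a_detects_by}")
```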
This is also the reason that Ethernet frames have a minimum length of 64 bytes. The minimum length guarantees that Host A stays on the medium long enough for a collision to be detected all the way at the other end of the network, and for the jam signal from Host B to make its way back across the network, so that Host A is still transmitting when the jam signal arrives and can recognize that something collided with its frame.
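As a rough sanity check, here's that claim in numbers, using the 232 bit-time figure above and the 802.3 jam length of 32 bits; a sketch, not the spec's full delay-budget calculation.

```python
# Check that a 512-bit (64-byte) minimum frame covers the worst case:
# one trip across the network, plus the jam signal, plus the trip back.

MIN_FRAME_BITS = 64 * 8   # 512 bit-times of transmission
ONE_WAY = 232             # worst-case one-way delay in bit-times (from above)
JAM_BITS = 32             # length of the 802.3 jam signal

worst_case = ONE_WAY + JAM_BITS + ONE_WAY        # 496 bit-times
print(worst_case, MIN_FRAME_BITS >= worst_case)  # 496 True

# At 10 Mb/s a bit-time is 0.1 microseconds, so 512 bit-times = 51.2 us,
# which is the 802.3 "slot time".
```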
The sender must be able to detect a collision before it finishes sending a frame. So the minimum frame length must be chosen such that any collision is detected before the frame has completely left the sender.

The worst case for collision detection is when the start of the frame is just about to reach the receiver at the moment the receiver starts sending. A collision happens, a jam signal is produced, and that signal must travel back to the sender. The total time is therefore: the time for the start of the frame to reach the receiver, plus the transmission time for the jam signal, plus the time for the jam signal to travel back to the sender.
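In symbols (using $T_p$ for the one-way propagation delay and $T_j$ for the jam-signal transmission time; the notation is mine, not from the spec), the minimum frame transmission time must satisfy:

$$
T_{\text{frame}} \;\ge\; T_p + T_j + T_p \;=\; 2T_p + T_j
$$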
Another reference for requiring the jam-signal bits to be included in the minimum frame size.