11-20-2002 11:26 AM - edited 03-02-2019 03:03 AM
Is there a mathematical reason why Ethernet can only span 100 meters?
11-20-2002 01:43 PM
I am unsure of the formula, but the attenuation of the signal at more than 100 meters (328 ft) is the reason.
11-20-2002 01:51 PM
It's due to cable delay that the distance is limited to 100 meters. The maximum delay specification for Ethernet shouldn't exceed 1000 nsec, that is 1 microsec. Working through the math, the maximum distance limitation comes out to no more than 100 meters. So the ultimate reason is cable delay.
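As a rough sanity check on the delay claim above, here is a small sketch that estimates the one-way propagation delay over 100 m of twisted pair. The velocity factor (NVP) of ~0.65c is an assumption for Cat5-class cable, not a figure from this thread.

```python
# Rough propagation-delay check for 100 m of UTP.
# Assumption (not from the post): nominal velocity of
# propagation (NVP) of about 0.65c for Cat5-class cable.
C = 299_792_458          # speed of light in vacuum, m/s
NVP = 0.65               # assumed velocity factor for UTP

def one_way_delay_ns(length_m: float) -> float:
    """One-way signal delay over length_m meters of cable, in ns."""
    return length_m / (NVP * C) * 1e9

print(round(one_way_delay_ns(100)))   # ~513 ns one way
```

With those assumed numbers, 100 m costs roughly 500 ns each way, i.e. about 1 microsecond round trip, which is in the same ballpark as the 1000 nsec figure quoted above.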
11-20-2002 02:46 PM
No.
The original 802.3 10Base5 spec was for 500 meters.
The little-used 10Broad36 supported 3.6 km over a broadband system.
The 10 Mb and 100 Mb fiber specifications are typically for 2 km.
In order to keep costs low, the UTP specifications have been 100 meters.
10BaseT, 100BaseT, 100BaseT4, and 1000BaseT.
The distance limitations on half duplex are either attenuation or timing.
The attenuation factor is how far away the receiver can be before the data can no longer be reliably recovered from the noise.
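The attenuation factor can be sketched as a simple loss budget. The numbers below are illustrative assumptions (roughly Cat5-class insertion loss at 100 MHz and a notional receiver budget), not figures from this thread.

```python
# Illustrative attenuation budget. Both constants are
# assumptions for the sake of example, not spec values
# taken from this thread.
LOSS_DB_PER_100M = 22.0   # assumed Cat5 insertion loss @ 100 MHz
RX_BUDGET_DB = 24.0       # assumed receiver loss budget

def max_reach_m(loss_db_per_100m: float, budget_db: float) -> float:
    """Distance at which cumulative cable loss exhausts the budget."""
    return budget_db / loss_db_per_100m * 100

print(round(max_reach_m(LOSS_DB_PER_100M, RX_BUDGET_DB)))  # ~109 m
```

The point is only that once per-meter loss and the receiver's tolerance are fixed, the maximum reach falls out as a simple ratio; with plausible UTP numbers it lands near 100 m.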
The timing factor is how long the cable can be before one station transmits 64 bytes without the first byte having reached the receiving station. This was used to sense collisions on half-duplex Ethernet.
There are some excellent white papers on the web.
Regards
Doug
11-20-2002 03:15 PM
It has to do with the time delay per meter of cable.
The following link may help...