Thread: The Anternet
Old 09-04-2012, 05:15 AM   #11
jeepgrandch

Senior Member
Join Date: Oct 2005
Posts: 382
To include the actual points of scientific interest from the link (a rough sketch of the feedback loop follows the quoted text): /*

Transmission Control Protocol, or TCP, is an algorithm that manages data congestion on the Internet, and as such it was integral in allowing the early web to scale up from a few dozen nodes to the billions in use today. Here's how it works: as a source, A, transfers a file to a destination, B, the file is broken into numbered packets. When B receives each packet, it sends an acknowledgment, or ack, back to A confirming that the packet arrived.

This feedback loop allows TCP to run congestion avoidance: if acks return more slowly than the data was sent out, that indicates little bandwidth is available, and the source throttles its transmission down accordingly; if acks return quickly, the source boosts its transmission speed. In this way the source continuously discovers how much bandwidth is available and matches its sending rate to it.

*/
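A rough sketch of that feedback loop (not real TCP and not the article's algorithm, just an additive-increase / multiplicative-decrease toy with made-up capacity and back-off constants), in Python:

Code:
import random

# Toy model: the source starts slow, raises its rate while acks come back
# promptly, and halves it when acks are delayed, as the quoted passage describes.
def simulate(rounds=20, capacity=10.0, seed=1):
    random.seed(seed)
    rate = 1.0  # packets per round the source currently sends
    for step in range(rounds):
        sent = rate
        # Acks come back on time only for traffic the link can actually carry;
        # the 0.9-1.0 jitter and capacity of 10 are arbitrary illustration values.
        acked_quickly = min(sent, capacity * random.uniform(0.9, 1.0))
        if acked_quickly >= sent:
            rate += 1.0                 # acks on time: probe for more bandwidth
        else:
            rate = max(1.0, rate / 2)   # acks late: back off sharply
        print(f"round {step:2d}: sent {sent:4.1f}, next rate {rate:4.1f}")

if __name__ == "__main__":
    simulate()

The notable part is how little state is needed: the source only watches ack timing, which is presumably why the ant analogy drew attention in the first place.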


That evolution would arrive at a simple, robust algorithm seems unsurprising, even if the result is elegant precisely because it is simple and robust.
Your points appear to be interesting in an engineering sense, but I can't see how a forced analogy is relevant scientifically. Would you please expand on the quoted post?


 
