Reliable Wireless and Link Layers

It is often the case that a “different” puzzle presents an opportunity for a young scientist to say “Oh, I can solve this problem because it’s different”. Well, sure… but is it simpler, or simply different?

The question arises in discussions of using reliable link layers in wireless networks to solve the problem of retransmissions and poor QoS. The problem is that the artifact of retransmission distorts the use of the medium: too many retransmits and congestion events occur, biasing the statistics the endpoints depend on and making the outcome unfair. The proposed wireless "solution" takes the very mechanism that produces the noise artifacts, retransmission, and builds it into the architecture.
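To make the distortion concrete, here is a minimal Monte Carlo sketch. All the numbers (the frame loss rate, the link retransmit turnaround, the base RTT) are assumed for illustration only. It shows what a reliable link layer does to the statistics the endpoints see: the losses disappear, and reappear as an inflated, high-variance round-trip time that the transport layer's timers and fairness logic then misread.

```python
# Sketch: a reliable link layer retransmits each frame until it gets
# through, so end-to-end loss becomes end-to-end delay variance.
# All parameters below are hypothetical, chosen only to illustrate.
import random

LOSS_RATE = 0.2      # assumed per-frame wireless loss probability
LINK_RTT = 0.005     # assumed one-hop retransmit turnaround, seconds
BASE_RTT = 0.050     # assumed end-to-end RTT with no retransmits, seconds
N_PACKETS = 100_000

random.seed(1)

def delivery_delay():
    """End-to-end delay when the link layer retransmits until success."""
    attempts = 1
    while random.random() < LOSS_RATE:
        attempts += 1
    return BASE_RTT + (attempts - 1) * LINK_RTT

samples = [delivery_delay() for _ in range(N_PACKETS)]
mean = sum(samples) / N_PACKETS
var = sum((s - mean) ** 2 for s in samples) / N_PACKETS

# The transport layer sees no losses at all, only this noisy RTT
# distribution, which biases its estimates of the path.
print(f"mean RTT {mean * 1000:.2f} ms, stddev {var ** 0.5 * 1000:.2f} ms")
```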

In the wired case, the proposed solution is smaller packet sizes, so that when congestion occurs, congestion recovery impacts fewer bytes per event. But the increase in the number of packets then distorts the statistics of the reliable link layer, so you get the same problem, only less obviously, which is why I mention the wireless case first.
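The wired fix in back-of-the-envelope numbers (the per-packet drop probability and the packet sizes here are hypothetical): shrinking the packets cuts the cost of each recovery event, but the same transfer now generates more packets, and so more events, for the link layer's statistics to absorb.

```python
# Sketch: for a fixed transfer, smaller packets mean each congestion
# event wastes fewer bytes, but the count of events goes up.
# The drop probability and sizes are assumed, not measured.
DROP_PROB = 0.01            # assumed per-packet drop probability under congestion
TRANSFER_BYTES = 1_000_000  # one megabyte to move

for pkt_bytes in (1500, 576, 128):
    n_packets = TRANSFER_BYTES // pkt_bytes
    expected_events = n_packets * DROP_PROB   # expected drop/recovery events
    print(f"{pkt_bytes:5d} B packets: {n_packets:6d} packets, "
          f"~{expected_events:.0f} recovery events, "
          f"{pkt_bytes} B impacted per event")
```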

In the wireless case, it's a first-order effect. In the wired case, it's a second-order effect dominating the first. But in neither case is it a "different" effect that can be solved independently of the other. And that's where the delusion lies.

An old physics trick: look at the really lossy case first, and once you've figured out what the problem is, ask whether a similar problem hides in the less lossy case. Given the scope of the Internet, little problems become big fast.

Of course, I got told last year that this wasn’t a problem in wireless. Oh, and we’ll eventually find weapons of mass destruction, too. Are you holding your breath?
