Chuck Darwin<p>Arıkan devoted the next year to learning about networks, but he never gave up on his passion for information theory. </p><p>What gripped him most was solving a challenge that Shannon himself had spelled out in his 1948 paper: <br>how to transport accurate information at high speed while defeating the inevitable “noise”<br>—undesirable alterations of the message<br>—introduced in the process of moving all those bits. </p><p>The problem was known as <a href="https://c.im/tags/channel" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>channel</span></a> <a href="https://c.im/tags/capacity" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>capacity</span></a>. </p><p>According to Shannon, every communications channel had a kind of speed limit for transmitting information reliably. </p><p>This as-yet-unattained theoretical boundary was referred to as the <a href="https://c.im/tags/Shannon" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Shannon</span></a> <a href="https://c.im/tags/limit" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>limit</span></a>.</p><p>Gallager had wrestled with the Shannon limit early in his career, and he got close. His much-celebrated theoretical approach was something he called low-density parity-check codes, or LDPC, which offered, in simplest terms, a high-speed method of <a href="https://c.im/tags/correcting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>correcting</span></a> <a href="https://c.im/tags/errors" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>errors</span></a> on the fly. </p><p>While the mathematics of LDPC was innovative, Gallager understood at the time that the codes weren't commercially viable. </p><p>“It was just too complicated for the cost of the logical operations that were needed,” Gallager says now. 
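</p><p>As a concrete illustration of Shannon's speed limit (a hypothetical example, not from the article): for the textbook case of a binary symmetric channel that flips each transmitted bit with probability p, the limit works out to C = 1 - H(p) message bits per channel bit, where H is the binary entropy function. A minimal Python sketch:</p>

```python
import math

def binary_entropy(p: float) -> float:
    """H(p): entropy, in bits, of a biased coin that comes up 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Shannon limit of a binary symmetric channel with crossover probability p:
    C = 1 - H(p), in message bits per transmitted bit."""
    return 1.0 - binary_entropy(p)

# A channel that flips about 11% of its bits can still carry information
# reliably at any rate below roughly 0.5 message bits per channel bit --
# but, per Shannon, no error-correcting code can do better than that.
print(round(bsc_capacity(0.11), 3))  # → 0.5
```

<p>Shannon proved such a limit exists for every noisy channel; what he did not supply, and what the codes of Gallager's era could not deliver at practical cost, was a scheme that actually operates near it. 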
</p><p>Gallager and others at MIT figured that they had gotten as close to the Shannon limit as one could get, and he moved on. </p><p>At MIT in the 1980s, the excitement about information theory had waned.<br>But not for Arıkan. </p><p>He wanted to solve the problem that stood in the way of reaching the Shannon limit. </p><p>Even as he pursued his thesis on the networking problem that Gallager had pointed him to, he seized on a piece of it that involved error correction. </p><p>“When you do error-correction coding, you are in Shannon theory,” he says.</p><p>Arıkan finished his doctoral thesis in 1986, and after a brief stint at the University of Illinois he returned to Turkey to join the country's first private, nonprofit research institution, <a href="https://c.im/tags/Bilkent" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Bilkent</span></a> <a href="https://c.im/tags/University" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>University</span></a>, located on the outskirts of Ankara. </p><p>Arıkan helped establish its engineering school. He taught classes. He published papers. </p><p>But Bilkent also allowed him to pursue his potentially fruitless battle with the Shannon limit. </p><p>“The best people are in the US, but why aren't they working for 10 years, 20 years on the same problem?” he said. <br>“Because they wouldn't be able to get tenure; they wouldn't be able to get research funding.” </p><p>Rather than advancing his field in tiny increments, he went on a monumental quest. It would be his work for the next 20 years.</p><p>In December 2005 he had a kind of <a href="https://c.im/tags/eureka" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eureka</span></a> moment. <br>Spurred by a question posed in a three-page dispatch written in 1965 by a Russian information scientist, Arıkan reframed the problem for himself. 
</p><p>“The key to discoveries is to look at those places where there is still a paradox,” Arıkan says. </p><p>“It's like the tip of an iceberg. If there is a point of dissatisfaction, take a closer look at it. You are likely to find a treasure trove underneath.”</p><p>Arıkan's goal was to transmit messages accurately over a noisy channel at the fastest possible speed. </p><p>The key word is <a href="https://c.im/tags/accurately" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>accurately</span></a>. If you don't care about accuracy, you can send messages unfettered. </p><p>But if you want the recipient to get the same data that you sent, you have to insert some <a href="https://c.im/tags/redundancy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>redundancy</span></a> into the message. <br>That gives the recipient a way to cross-check the message to make sure it's what you sent. </p><p>Inevitably, that extra cross-checking slows things down. <br>This is known as the <a href="https://c.im/tags/channel" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>channel</span></a> <a href="https://c.im/tags/coding" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>coding</span></a> <a href="https://c.im/tags/problem" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>problem</span></a>. </p><p>The greater the amount of noise, the more added redundancy is needed to protect the message. </p><p>And the more redundancy you add, the slower the rate of transmission becomes. </p><p>The coding problem tries to defeat that trade-off and find ways to achieve reliable transmission of information at the fastest possible rate. </p><p>The optimum rate would be the Shannon limit: channel coding nirvana.</p>
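<p>The redundancy-versus-rate trade-off described above can be sketched with the simplest error-correcting code there is: a repetition code (an illustrative toy, far weaker than LDPC or any capacity-approaching code):</p>

```python
import random

def encode_repetition(bits, n=3):
    """Add redundancy: send n copies of every message bit (code rate = 1/n)."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=3):
    """Cross-check: majority vote over each group of n received copies."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

def noisy_channel(bits, flip_prob, rng):
    """Model channel noise: each transmitted bit flips with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(1000)]
codeword = encode_repetition(message)                  # 3,000 bits on the wire
received = noisy_channel(codeword, flip_prob=0.05, rng=rng)
decoded = decode_repetition(received)

residual = sum(m != d for m, d in zip(message, decoded))
# Noise flips roughly 150 of the 3,000 transmitted bits, but majority voting
# cuts the errors in the recovered 1,000-bit message to only a handful --
# at the cost of spending three channel bits on every message bit.
```

<p>Repeating each bit more times drives errors down further, but drives the rate toward zero. The art of channel coding is inserting redundancy far more cleverly than this, so that errors become negligible without giving up nearly that much speed.</p>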