Monday, May 30, 2016

Caltech sues Apple!

See the post at AppleInsider: Caltech sues Apple & Broadcom over alleged Wi-Fi patent infringements

link: http://appleinsider.com/articles/16/05/30/caltech-sues-apple-broadcom-over-alleged-wi-fi-patent-infringements

There is a New Jersey connection. The first named inventor [Hui Jin] of Caltech's U.S. Patent 7,116,710 has a listed address in Glen Gardner, NJ.

Of the '710 patent, note: The U.S. Government has a paid-up license in this invention and the right in limited circumstances to require the patent owner to license others on reasonable terms as provided for by the terms of Grant No. CCR-9804793 awarded by the National Science Foundation.

As to related applications: This application claims priority to U.S. Provisional Application Ser. No. 60/205,095, filed on May 18, 2000, and to U.S. application Ser. No. 09/922,852, filed on Aug. 18, 2000 and entitled "Interleaved Serial Concatenation Forming Turbo-Like Codes."

The "background" of '710 states:


Properties of a channel affect the amount of data that can be handled by the channel. The so-called "Shannon limit" defines the theoretical limit of the amount of data that a channel can carry.
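
As a quick aside, the Shannon-Hartley theorem puts a number on that limit for a bandlimited AWGN channel: C = B log2(1 + S/N). A back-of-the-envelope check in Python (the 20 MHz bandwidth and 20 dB SNR below are merely illustrative, Wi-Fi-ish figures, not numbers from the patent):

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity in bits/second for an AWGN channel
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 20 MHz channel at 20 dB SNR (a linear factor of 100)
print(shannon_capacity(20e6, 100))  # roughly 133 Mbit/s theoretical ceiling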

Different techniques have been used to increase the data rate that can be handled by a channel. "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes," by Berrou et al., ICC, pp. 1064-1070 (1993), described a new "turbo code" technique that has revolutionized the field of error correcting codes. Turbo codes have sufficient randomness to allow reliable communication over the channel at a high data rate near capacity. However, they still retain sufficient structure to allow practical encoding and decoding algorithms. Still, the technique for encoding and decoding turbo codes can be relatively complex.

A standard turbo coder 100 is shown in FIG. 1. A block of k information bits is input directly to a first coder 102. A k bit interleaver 106 also receives the k bits and interleaves them prior to applying them to a second coder 104. The second coder produces an output that has more bits than its input, that is, it is a coder with rate that is less than 1. The coders 102, 104 are typically recursive convolutional coders.
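
The FIG. 1 structure is easy to sketch in code. The snippet below is only an illustration, assuming a rate-1/3 layout and (1, 5/7) octal recursive systematic convolutional generators; those particular choices are mine, not something the patent's background specifies:

import random

def rsc_encode(bits):
    # Recursive systematic convolutional encoder, generators (1, 5/7) octal.
    # Returns only the parity stream; the systematic bits are sent separately.
    s1 = s2 = 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2          # feedback polynomial 1 + D + D^2
        parity.append(a ^ s2)    # feedforward polynomial 1 + D^2
        s1, s2 = a, s1
    return parity

def turbo_encode(bits, perm):
    # Rate-1/3 turbo encoding as described above: systematic bits,
    # parity from coder 1, and parity from coder 2 fed the interleaved bits.
    interleaved = [bits[i] for i in perm]
    return bits, rsc_encode(bits), rsc_encode(interleaved)

k = 16
info = [random.randint(0, 1) for _ in range(k)]
perm = random.sample(range(k), k)   # the k-bit interleaver of FIG. 1
systematic, parity1, parity2 = turbo_encode(info, perm)

The second coder sees the same k bits, just in the order produced by the interleaver, which is what gives the overall code its random-looking structure while each constituent coder stays simple.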

Three different items are sent over the channel 150: the original k bits, first encoded bits 110, and second encoded bits 112. At the decoding end, two decoders are used: a first constituent decoder 160 and a second constituent decoder 162. Each receives both the original k bits, and one of the encoded portions 110, 112. Each decoder sends likelihood estimates of the decoded bits to the other decoders. The estimates are used to decode the uncoded information bits as corrupted by the noisy channel.
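
What makes the scheme practical is that iterative exchange of extrinsic information between the two constituent decoders. The sketch below shows only that message-passing skeleton; the siso_decode argument is a hypothetical placeholder for a real soft-in/soft-out (e.g., BCJR/MAP) constituent decoder, which is where most of the actual complexity lives:

def turbo_decode(sys_llr, par1_llr, par2_llr, perm, siso_decode, iterations=8):
    # Structural sketch only: two soft-in/soft-out constituent decoders pass
    # extrinsic log-likelihood ratios (LLRs) back and forth through the interleaver.
    # siso_decode(systematic_llrs, parity_llrs, a_priori_llrs) -> extrinsic_llrs
    # is a hypothetical placeholder, not an implementation of BCJR/MAP.
    k = len(sys_llr)
    inverse = [0] * k
    for i, p in enumerate(perm):
        inverse[p] = i                      # invert the interleaver permutation
    a_priori = [0.0] * k                    # decoder 1 starts with no prior knowledge
    for _ in range(iterations):
        # Decoder 1 works on the original bit order.
        ext1 = siso_decode(sys_llr, par1_llr, a_priori)
        # Decoder 2 works on the interleaved order, using decoder 1's output as its prior.
        ext2 = siso_decode([sys_llr[i] for i in perm], par2_llr,
                           [ext1[i] for i in perm])
        a_priori = [ext2[inverse[i]] for i in range(k)]   # de-interleave for the next pass
    # Hard decision on systematic plus both extrinsic contributions
    # (LLR convention: positive means bit 0 is more likely).
    return [0 if sys_llr[i] + ext1[i] + a_priori[i] > 0 else 1 for i in range(k)]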
