The algorithm exception and the patent-eligibility trilogy
The exception to patenting algorithms arose out of three Supreme Court cases commonly referred to as the "Supreme Court Trilogy" or "patent-eligibility trilogy". This is a designation for three Supreme Court cases decided within a decade on whether, and in what circumstances, a claimed invention was within the scope of the US patent system (that is, was eligible to be considered for a patent grant). The three cases of the trilogy can be harmonized on the basis of the principle that when a claimed implementation of an idea or principle is old, or departs from the prior art in only a facially trivial way, the claim is patent-ineligible (as Neilson and Morse said, and Flook reaffirmed, the idea or principle must be treated as if it were in the prior art).

Gottschalk v. Benson
The invention in this case was a method of programming a general-purpose digital computer using an algorithm to convert binary-coded decimal numbers into pure binary numbers. The Supreme Court noted that phenomena of nature, mental processes and abstract intellectual concepts were not patentable, since they were the basic tools of scientific and technological work. However, new and useful inventions derived from such discoveries are patentable. The Court found that the discovery in Benson was unpatentable since the invention, an algorithm, was no more than abstract mathematics. Despite this holding, the Court emphasized that its decision did not preclude computer software from being patented, but rather precluded the patentability of software where the only useful characteristic was an algorithm. The Court further noted that validating this type of patent would foreclose all future use of the algorithm in question. Therefore, like the traditional exceptions to patentable subject matter, the purpose of the algorithm exception was to encourage development of new technologies by not granting patents that would preclude others from using abstract mathematical principles.
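The conversion at issue in Benson can be illustrated with a short sketch. This is not the specific shift-and-add procedure recited in Benson's claims, just a minimal Python example (the function name and digit-list representation are illustrative assumptions) of turning binary-coded decimal digits into a pure binary number:

```python
# Illustrative sketch only: a straightforward BCD-to-binary conversion,
# not the particular shift-and-add method claimed in Gottschalk v. Benson.

def bcd_to_binary(bcd_digits):
    """Convert a list of binary-coded decimal digits (each 0-9,
    most significant first) into a plain binary integer."""
    value = 0
    for digit in bcd_digits:
        if not 0 <= digit <= 9:
            raise ValueError("each BCD digit must be 0-9")
        value = value * 10 + digit  # accumulate decimal place value
    return value  # Python integers are stored in pure binary form

# The decimal number 53, stored as the BCD digits [5, 3]:
print(bin(bcd_to_binary([5, 3])))  # -> 0b110101
```

The point of the example is only that the claimed subject matter was, at bottom, arithmetic of this kind, which the Court treated as abstract mathematics.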

Parker v. Flook
The invention in this case was a method of calculating alarm limits by using a "smoothing algorithm" to make the system responsive to trends but not momentary fluctuations in process variables (such as temperature). Because it was conceded that the implementation of the algorithm was conventional, the Court found that the inventor did not even purport to have invented anything on which a patent could be granted. It did so on the basis of the principle that the nonstatutory subject matter (the algorithm) must be regarded as already in the prior art; therefore, there was nothing left on which a patent could issue. When a patent is sought on an implementation of a principle (here, the algorithm), the implementation itself must be inventive for a patent to issue. Since that was not so, the Court held that the patent office had properly rejected Flook's claim to a patent. The Court relied on the decision in Neilson v. Harford, an English case that the Supreme Court had relied upon in O'Reilly v. Morse, for the proposition that an idea or principle must be treated as if it were already in the prior art, irrespective of whether it was actually new or old. This approach is something like that of analytic dissection in computer-software copyright law, although its use in patent law preceded its use in copyright law by a century or more.
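The smoothing step can be sketched briefly. Flook's claim recited an update of the form B1 = B0(1.0 − F) + PVL(F), where B0 is the current alarm base, PVL the present value of the process variable, and F a weighting factor; the surrounding usage below (the particular readings and the choice F = 0.2) is purely illustrative:

```python
# Sketch of the exponential-smoothing update recited in Parker v. Flook:
#   B1 = B0 * (1.0 - F) + PVL * F
# A small F damps momentary fluctuations; a larger F tracks trends quickly.

def updated_alarm_limit(b0, pvl, f):
    """Blend the old alarm base b0 with the present process-variable
    value pvl, weighted by the factor f (0 < f < 1)."""
    return b0 * (1.0 - f) + pvl * f

# Illustrative readings with one momentary spike (140.0):
limit = 100.0
for reading in [100.0, 140.0, 101.0, 102.0]:
    limit = updated_alarm_limit(limit, reading, 0.2)
print(round(limit, 2))  # the brief spike moves the limit only modestly
```

As the opinion emphasized, nothing in this computation was inventive apart from the formula itself, which had to be treated as if already in the prior art.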

Diamond v. Diehr
In this case the Court backed away from the analytic dissection approach, and insisted that patent-eligibility must be decided on the basis of the claim (or invention) considered as a whole. That requirement is found in the statute, but only for section 103 (governing obviousness or inventive step) and not for section 101 (governing patent-eligibility). Despite this difference in emphasis, however, Diehr can be harmonized with Flook and Benson, and the Diehr Court studiously avoided stating that Flook and Benson were overruled or limited.