This is a companion episode and article to Episode 1, which touched on whether there is potential friction between the U.S. Patent and Trademark Office’s (USPTO’s) AI subject matter eligibility guidance (the “AI Guidance”) and the 2019 Revised Patent Subject Matter Eligibility Guidance (“2019 PEG”). Practitioners have noted that Example 39 of the 2019 PEG seems less effective in overcoming Section 101 rejections since the AI Guidance issued, particularly in light of Example 47, which dealt with training limitations, while Example 39 was itself a training claim. I’m here to say that there is no conflict between the examples, or between the two guidance documents overall, and to explain why you should come to this conclusion too. This message is for practitioners, not lobbyists.
No Conflict
If there is a conflict between these sets of examples, then they can’t coexist in the same universe at the same time. In that case, the new one applies and the former guidance is effectively overturned, which we don’t want because we like the old one better. The AI Guidance even states, “This guidance is meant to be consistent with existing USPTO guidance. However, if any earlier guidance from the USPTO, including any section of the current MPEP, is inconsistent with the guidance set forth in this notice, USPTO personnel are to follow this guidance.” So, in the Office’s mind, there is no conflict; the AI Guidance is flawless. But if we, the people, are going to say there is a conflict, the Office will hold us to our own words and apply the new guidance exclusively.
Takeaways
Therefore, be careful not to frame your arguments as if there could be a conflict between the two guidance documents. That is, don’t give the impression that you’re asking the USPTO to apply one guidance document or one set of examples over another; that sounds like there is a conflict. Instead, do the following:
1) Argue distinctions, not conflicts.
Distinguish the new examples from the former examples. These examples are all fact specific, so pay attention to the fact pattern of each one and argue differences and similarities where appropriate.
2) Neutralize the unfavorable examples by making them fact specific.
Below, we talk specifically about ways to confine Example 47 to its facts, since the examples accompanying the AI Guidance have more complicated fact patterns than those accompanying the former guidance.
3) Argue broader relevance of the former examples.
Because the examples of the new AI Guidance have more facts and much longer analyses than the examples of the 2019 PEG, they are actually much narrower in scope and applicability. The examples accompanying the 2019 PEG should therefore have far more relevance and applicability.
Overview of Mathematical Concept as a Judicial Exception
With machine learning claims, eligibility often comes down to whether there is recitation of a mathematical concept. According to the 2019 PEG, there are three ways to recite a mathematical concept: by reciting mathematical relationships, by reciting formulas or equations, or by reciting mathematical calculations.
Mathematical calculation is the most encompassing of the three mathematical concept categories based on the way it is defined. A claim recites a mathematical calculation by reciting a mathematical operation or an act of calculating using mathematical methods to determine a variable. A lot of computer activity, particularly machine learning activity, comes down to determining a variable. Method claims are at particular risk because virtually every method claim will, at some point, recite a step of determining a variable.
Example 47
Under the AI Guidance, Example 47 was deemed to recite a mathematical calculation in steps (b) and (c), specifically the limitation of training based on a backpropagation algorithm and a gradient descent algorithm. The Example states:
When given their broadest reasonable interpretation in light of the background, the backpropagation algorithm and gradient descent algorithm are mathematical calculations. The plain meaning of these terms is that they are optimization algorithms, which compute neural network parameters using a series of mathematical calculations. The fourth paragraph of the background supports the plain meaning by stating that “gradient descent begins by initializing the values of parameters and then applying a gradient descent calculation, which uses mathematical calculations to iteratively adjust the values so they minimize a loss function.” The background also states that “backpropagation is a mathematical calculation for supervised learning of ANNs using gradient descent.”
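For practitioners less familiar with the technology, here is a minimal, hypothetical sketch in Python of what that background paragraph is describing. The function and variable names, the learning rate, and the toy usage are my own illustrative assumptions; none of this comes from Example 47.

# Gradient descent: initialize parameter values, then iteratively adjust
# them with mathematical calculations so that they minimize a loss function.
def train_by_gradient_descent(params, grad_of_loss, learning_rate=0.01, steps=1000):
    for _ in range(steps):
        gradients = grad_of_loss(params)   # backpropagation would supply these gradients
        params = [p - learning_rate * g    # adjust each parameter value
                  for p, g in zip(params, gradients)]
    return params

# Toy usage: minimize (w - 3)^2, whose gradient is 2 * (w - 3); w converges toward 3.0.
trained = train_by_gradient_descent([0.0], lambda p: [2 * (p[0] - 3)])

The sketch makes the Office’s point hard to escape: what the background describes is, at bottom, an iterative numerical adjustment, so a plain-meaning reading of the claim terms leads straight to “mathematical calculation.”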
Practice tip: The claim terms “backpropagation algorithm” and “gradient descent algorithm” were afforded a plain-meaning definition stating that they are mathematical calculations, and the hypothetical background further reaffirmed that they are mathematical calculations. A mathematical-calculation designation is inevitable given this fact pattern. Distinguish your claim terms from Example 47 accordingly, and remind the examiner that Example 47 should be narrowly construed and applied given these facts. Do not describe your algorithms as mathematical calculations. If there is no workaround and the term must be in your claim, argue that the claim integrates the exception into a practical application.
Example 39
Example 39 is an actual method of training claim, whereas Example 47 is a method of using an ANN with training steps. Example 39 is a case in which the claim is eligible based on Step 2A, Prong 1; there is no judicial exception recited in this claim. The relevant step is this:
“applying one or more transformations to each digital facial image including mirroring, rotating, smoothing, or contrast reduction to create a modified set of digital facial images…”
The background of Example 39 states: “This expanded training set is developed by applying mathematical transformation functions on an acquired set of facial images. These transformations can include affine transformations, for example, rotating, shifting or mirroring, or filtering transformations, for example, smoothing or contrast reduction.” Therefore, while the claim does not recite a mathematical transformation per se, it does recite steps that the background describes as mathematical transformations.
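To make the distinction concrete, here is a hypothetical sketch (not drawn from Example 39) of how such an augmentation step is commonly implemented, using the Pillow imaging library; the function name, parameter values, and file name are my own illustrative assumptions.

from PIL import Image, ImageEnhance, ImageFilter, ImageOps

def expand_training_set(image):
    # Create a modified set of facial images by applying one or more transformations.
    return [
        ImageOps.mirror(image),                      # mirroring (affine transformation)
        image.rotate(15),                            # rotating (affine transformation)
        image.filter(ImageFilter.SMOOTH),            # smoothing (filtering transformation)
        ImageEnhance.Contrast(image).enhance(0.7),   # contrast reduction (filtering transformation)
    ]

# Hypothetical usage: augmented = expand_training_set(Image.open("face.jpg"))

Note that the code, like the claim, recites the image operations themselves – mirroring, rotating, smoothing, contrast reduction – not the affine matrices or convolution kernels underlying them.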
Isn’t this mathematical concept? No.
The explanation accompanying Example 39 states, “the claim does not recite any mathematical relationship, formula or calculation. While some limitations may be based on mathematical concepts, mathematical concepts are not recited in the claims.”
Interesting.
When the USPTO defines the mathematical concept judicial exception with a list of specific categories – relationships, formulas or equations, and calculations – the Office is communicating that the mathematical concept category does not encompass math as a general principle. That is because math influences everything – it’s physics, it’s science. You can’t exclude math in every shape and form from patent claims. Unless, of course, you’re the American Axle panel, which treated math as also being a law of nature. But the point is that Example 39 wasn’t mathless. There was math in the claim, but the claim did not recite a mathematical concept in a way that renders it a judicial exception, because it did not fall under one of the enumerated categories. Example 47 did not negate this principle. Example 47 was deemed to recite a mathematical calculation because its claim terms and written description said, over and over again, that these are mathematical calculations. Also, compared to the explanation of Example 47, the explanation of Example 39 is far broader and more general in applicability.
Practice tip: The explanation for Example 39 is broadly worded, much broader than any of the examples in the AI Guidance, making it more relevant in scope and application. Example 39 shows that a claim limitation can still have something to do with math without reciting a mathematical relationship, formula or calculation. And this was not undone by Example 47, which dealt with steps described explicitly as mathematical calculations, both by plain meaning and by the written description. Use these points to argue the broader relevance and applicability of Example 39 to your claims.
But Wen, I really want to recite backpropagation and gradient descent in my claims. How do you get around Example 47?
First off, replace “algorithm” with a more generic term like “method.” The terms “backpropagation algorithm” and “gradient descent algorithm” were afforded plain meaning in accordance with their written description. You don’t want to use “algorithm” in this case, because “backpropagation algorithm” and “gradient descent algorithm” are known lexicon and will carry a plain meaning, a plain meaning we don’t like. By using a more generic descriptor like “method,” you can then afford the terms a special meaning. That last step will be the hard part, I’m not going to lie. I leave it up to you. Godspeed.