Emotional Perception AI: Where have we been, and where are we now?

By Olivia Crawford, Caroline Day and Gemma Robin, Partners

The Emotional Perception AI case has been closely watched for its potential to shake up the patentability of Artificial Neural Networks in the UK. Following the Appeal hearing in May 2024, members of HLK’s AI team summarise the evolution of the arguments in this case, and discuss possible outcomes.

This article provides an overview of material presented in HLK’s webinar of the same name. For a more thorough treatment of the issues, a recording of the webinar can be requested here.

The application

The application relates generally to the field of artificial neural networks (ANNs). The invention is concerned with training an ANN to recommend a file, with the example of audio files being used throughout the various proceedings.

The process of training the ANN involves making pairwise comparisons between music tracks in the training dataset. For a given pair of tracks, text descriptions of each track are analysed by natural language processing (NLP) to provide a vector in semantic space. These semantic vectors capture a subjective, emotional response to the music by a human listener. Semantically similar tracks will have semantic vectors that are closer together in semantic space than semantically dissimilar tracks. Each track of the pair is also analysed in terms of its measurable physical properties, such as rhythm, tonality, timbre and/or musical texture. This produces a property vector in property space.

During backpropagation, the weights and biases of the ANN are adjusted so that a pair of tracks which are close together in semantic space become close together in property space. This results in a system which can identify the semantic similarity between a given pair of music tracks based on an analysis of their measurable physical properties. It ultimately enables a semantically similar music track to be recommended to a user.
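
For readers who would like to see what this kind of pairwise training might look like in practice, the sketch below is a minimal, purely illustrative example and is not taken from the patent application or claims. It assumes a hypothetical PyTorch model and feature dimensions, and a simple squared-error objective that nudges the distance between property vectors towards the distance between the corresponding NLP-derived semantic vectors.

```python
# Purely illustrative sketch of the pairwise training idea described above,
# not the claimed method: an ANN maps measurable physical properties of a
# track to a "property vector", and backpropagation adjusts the weights and
# biases so that tracks which are close in semantic space end up close in
# property space. All names, dimensions and the loss function are assumptions.

import torch
import torch.nn as nn


class PropertyNet(nn.Module):
    """Maps physical features (rhythm, tonality, timbre, texture...) to a property vector."""

    def __init__(self, n_features: int = 32, dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


model = PropertyNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

# A dummy training pair: physical features of two tracks, plus the distance
# between their (precomputed) NLP-derived semantic vectors.
track_a = torch.randn(1, 32)
track_b = torch.randn(1, 32)
semantic_distance = torch.tensor(0.2)  # small value => semantically similar pair

# One training step: the loss penalises any mismatch between the distance in
# property space and the distance in semantic space, so backpropagation pulls
# semantically similar pairs together in property space.
prop_a, prop_b = model(track_a), model(track_b)
property_distance = torch.norm(prop_a - prop_b)
loss = (property_distance - semantic_distance) ** 2
optimiser.zero_grad()
loss.backward()
optimiser.step()
```

Once trained in this way, distances between property vectors can stand in for semantic similarity, which is how the system can recommend a semantically similar track from its measurable physical properties alone.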

The UKIPO proceedings

The UKIPO examiner maintained throughout prosecution that the invention relates to a mathematical method and a program for a computer as such, and that consequently the invention is not eligible for patent protection. At the UKIPO hearing, the question of excluded subject matter was addressed at length, following the well-established “Aerotel” approach:

  1. Properly construe the claim;
  2. Identify the actual contribution;
  3. Ask whether it falls solely within the excluded subject matter;
  4. Check whether the actual or alleged contribution is actually technical in nature.

The majority of the arguments focussed on claim 4, which is computer-implemented and could be implemented in hardware or using a software emulator. In applying steps 1 and 2 of the Aerotel test, the UKIPO Hearing Officer construed the claim as providing a tool for recommending semantically similar files, and found that its contribution “does not reside in NLP or the extraction of measurable properties from files per se”, but rather that “the system is better at identifying and recommending files to the user based on their semantic similarity” (Paragraphs 51 and 53). The Hearing Officer then addressed steps 3 and 4 together.

The Hearing Officer first considered mathematical methods, ultimately concluding that the claims did not relate to a mathematical method as such:

“although an ANN and a method of training an ANN per se is no more than an abstract mathematical algorithm, its specific application here as part of a file recommendation engine is, in my opinion, enough to dispense with the mathematical method as such objection.”

Moving on to computer programs, the Applicant had argued that ANNs don’t run a computer program, as a program is a series of ‘if-then’ type logical statements defined by a human. The Applicant also argued that “if the hardware ANNs are patentable, then the related software emulations will also be patentable”.

The Hearing Officer started by considering what happens external to the computer:

“provision to a user of an improved file recommendation transmission process is achieved in a standard fashion…within the conventional computer network. It is external to the computer in the sense that there is a beneficial effect on the end user in being provided with a better recommendation, such as a song they are likely to enjoy. However, such a beneficial effect is of a subjective and cognitive nature and does not suggest there is any technical effect over and above the running of a program on a computer.” (Paragraph 69)

Comparison with existing EPO and UK case law didn’t help the Applicant, with the Hearing Officer focussing on the idea that the recommended file is characterised by the content of its information, rather than the manner in which it is provided to the user.

UK case law provides a series of signposts (the AT&T signposts) to help assess step 3 of the Aerotel test: whether the contribution falls solely within the excluded subject matter. Two of these signposts were deemed relevant, but neither was considered by the Hearing Officer to support the assertion that the contribution did not fall solely within excluded subject matter.

The UKIPO proceedings concluded with the Hearing Officer deciding that the invention was not a mathematical method, but was excluded from patentability as a computer program. This decision was largely in line with what many in the profession would have expected, although we would suggest that it might have been slightly shortsighted on the part of the UKIPO to dismiss the mathematical method exclusion entirely. It is worth remembering that combinations of exclusions are allowed, and it would seem they could have been used to good effect in this case to support the UKIPO’s position.

The High Court proceedings

Emotional Perception AI (EP AI) challenged the UKIPO Decision on two main bases:

  1. The computer program exclusion is not engaged at all; in fact, one does not get as far as finding any relevant computer program.
  2. The reasoning of the Hearing Officer fails to acknowledge a line of cases which they referred to as the “patentable ignoring a computer program” line of cases. Following these cases should mean, according to EP AI, that even if there is a computer program and the exclusion is prima facie engaged, it does not apply because the claim reveals a technical contribution and the claim is not to a program for a computer “as such”.

Sir Anthony Mann (the judge who heard the case in the High Court) invited both EP AI and the UKIPO to provide further submissions on what he considered to be the relevant question: “How, if at all, is the exclusion engaged – where is the computer and where is the program?” The submissions from the parties are summarised in the graphic below. For reference, a “hardware ANN” was referred to in the High Court judgement as “a physical box with electronics in it”, and an emulated ANN as an arrangement in which a conventional computer runs a piece of software which enables the computer to emulate the hardware ANN.

Considering first the question of a computer, Sir Anthony Mann started with the Oxford Dictionary definition: “An electronic device (or system of devices) which is used to store, manipulate, and communicate information, perform complex calculations, or control or regulate other devices or machines, and is capable of receiving information (data) and of processing it in accordance with variable procedural instructions (programs or software)…” Sir Anthony Mann stated that the “variable procedural instructions” are, while it is learning, the elements by which the ANN is able to learn and backpropagate (so any loss functions involved in the backpropagation and any constraints applied), and also indicated that the frozen state contains biases, weighting and so on which it has learnt for itself and which “one might call instructions”. Sir Anthony Mann then indicated that an emulated ANN would be regarded as a computer, and ought to be treated as one within the exclusion. A computer is therefore present in both a hardware and an emulated ANN. However, Sir Anthony Mann was clear that a computer “is not defined by the fact that it runs things called programs.” In other words, a computer isn’t or shouldn’t be defined by a need to have a program (highly relevant for the next question).

Moving on to the presence or otherwise of a computer program, there was agreement from all parties that, for a hardware ANN, a “program for a computer” is not present. When considering the implications of this for an emulated ANN, Sir Anthony Mann made a distinction between the implementation of instructions input by a human (such as a programmer) and a trained ANN, stating in paragraphs 54 to 58:

“[for a hardware ANN] the hardware is not implementing a series of instructions pre-ordained by a human. It is operating according to something that it has learned itself … I do not see why the same should not apply to the emulated ANN. It is not implementing code given to it by a human… I therefore consider that the “decoupling” can be achieved and is correct and the emulated ANN is not a program for a computer for these purposes.”

While a hardware ANN did not involve a program for a computer, and an emulated ANN was also not a program for a computer, Sir Anthony Mann considered, with agreement from both parties, that programming activity was involved in the training phase, and so “the only remaining candidate computer program is therefore the program which achieves, or initiates, the training” (paragraph 59). However, when considering the extent to which the training claim actually claimed the computer program that achieves or initiates the training, the judge concluded that what is special about the training process is not contained within that computer program:

“What is said to be special is the idea of using pairs of files for training, and setting the training objective and parameters accordingly. If that is right, and I consider it is, then the actual program is a subsidiary part of the claim and is not what is claimed. The claims go beyond that.” (Paragraph 61).

Technical contribution

The above conclusion would have been sufficient to complete the Aerotel test, but Sir Anthony Mann nevertheless helpfully considered whether, if he were wrong on the above and the ANN were in essence a computer program, there would be a technical contribution.

One possible interpretation of technical contribution turned on the effect achieved by transmission of the recommended file. To quote Paragraphs 76 to 78 of the judgement:

“The Hearing Officer seemed to consider that a subjective appreciation of the output of the system was just that, subjective and in the user, and therefore not a technical effect. I do not consider that to be the correct analysis… The correct view of what happened, for these purposes, is that a file has been identified, and then moved, because it fulfilled certain criteria… So the output is of a file that would not otherwise be selected. That seems to me to be a technical effect outside the computer for these purposes, and when coupled with the purpose and method of selection it fulfils the requirement of technical effect in order to escape the exclusion. I do not see why the possible subjective effect within a user’s own non-artificial neural network should disqualify it for these purposes.”

The above reasoning looks to the end result as being something which helps to take the case away from being a case of a computer program “as such”. Turning to the possibility that the computer program (contrary to the judge’s conclusion) was taken to be either the training program or the overall training activity:

“[I]f one is assuming for these purposes that the computer program is either the training program or the overall training activity… the resulting ANN, and particularly a trained hardware ANN, can be regarded as a technical effect which prevents the exclusion applying… I therefore consider that, insofar as necessary, the trained hardware ANN is capable of being an external technical effect which prevents the exclusion applying to any prior computer program. There ought to be no difference between a hardware ANN and an emulated ANN for these purposes.”

Consequently, Sir Anthony Mann found that, even if he were to be wrong about the ANN or training being a computer program, the invention would escape the exclusion by virtue of a technical contribution.

Unfortunately, the possibility of other exclusions, and specifically the exclusion of mathematical methods, was not addressed in the High Court proceedings. The mathematical method exclusion did not form an alternative basis of the Hearing Officer’s decision during the UKIPO proceedings, and the UKIPO did not file a respondent’s notice to resurrect it. That argument could not therefore be run as an alternative and was not addressed by the judge.

The Appeal proceedings

The UKIPO appealed the judgement from the High Court with four grounds of appeal:

  1. The Judge erred in holding that the “program for a computer” exclusion did not apply.
  2. The Judge was wrong to rely on the Appellant’s concession regarding hardware ANNs (i.e. that they do not involve a computer program) as a basis for holding that the software ANN is not operating a computer program.
  3. The Judge was wrong to exclude the consideration of the mathematical method exclusion.
  4. The Judge was wrong to hold that the claimed invention involved a substantive technical contribution.

Grounds 1 and 2 essentially come down, once again, to the meaning of the terms “computer” and “program”. Here the UKIPO’s arguments shifted slightly. In previous proceedings, the UKIPO had argued that that which implements the ANN is the “computer”, and that both training and operating the ANN constitute the computer program. During the appeal proceedings, the UKIPO argued that a trained ANN is a generic ANN (i.e. a specific structure of nodes and edges arranged in layers) plus the set of weights and biases acquired by training. The argument continues that the trained ANN is still a valid form of computer (regardless of implementation) by virtue of its function: it takes input data, processes it via some mathematical manipulation, and outputs the calculated result. The weights and biases of the trained ANN comprise “a set of instructions that make a generic computer perform a particular task”, and consequently form a computer program.
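
For readers less familiar with machine-learning tooling, the conceptual split the UKIPO describes (a generic structure plus a separable set of learned weights and biases) can be pictured with the short sketch below. It is a purely practical illustration of how that decomposition looks in a common framework, under assumed names and layer sizes, and says nothing about whether the law should treat the weights as “instructions”.

```python
# Illustrative only: the decomposition described above, i.e. a "generic ANN"
# (a fixed structure of nodes and edges arranged in layers) plus the set of
# weights and biases acquired by training, held as plain numbers that can be
# stored and reloaded separately from the structure. Hypothetical example.

import torch
import torch.nn as nn

# The "generic ANN": a specific arrangement of layers, nodes and edges.
generic_ann = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))

# The product of training: nothing but numbers (weights and biases), which can
# be extracted from the structure and saved on their own.
learned_parameters = generic_ann.state_dict()
torch.save(learned_parameters, "trained_weights.pt")

# Recreating "the trained ANN" = the same generic structure + the learned
# parameters loaded back into it.
fresh_copy = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))
fresh_copy.load_state_dict(torch.load("trained_weights.pt"))
```

Whether such a set of stored numbers amounts to “a set of instructions that make a generic computer perform a particular task” is, of course, precisely the point in dispute.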

As support for their position, the UKIPO pointed to the wide range of different types and architectures of computer that exist (digital or analogue, 8-bit or 32-bit, classical or quantum), noting that the word “computer” has always been a broad term, and therefore rejecting the argument that an ANN could not be a computer merely because it does not use a conventional sequential imperative programming language taking the form of serial, logical, “if-then” type statements written by a human programmer and defining exactly what the computer should do.

EP AI argued that the above definition of “computer” was much too broad, that it ignored the meaning of the term as used in common parlance, and that it encompassed entities which would never have been intended to be excluded by the UK Patents Act (such as sextants, mechanical adding machines, hardware filters for mobile phones, slide rules, and even human beings). Even if an ANN could be considered a computer, they argued, then the weights and biases could not be construed as a “program”, since definitions for “computer program” all referenced “instructions” that make a computer do something. EP AI argued that “instructions” implied an element of imperative or command to the computer that is lacking in an ANN. An “instruction”, they said, is not the same as a component which just happens to have a downstream effect on something else. EP AI also argued against the conceptual division of an ANN into its structure and its “program” of weights and biases, which they said was artificial, since a real ANN is a complex electronic device with an intermingled overall structure that operates as a composite whole.

Considering Ground 3, the UKIPO argued that if it is possible to “decouple” the trained ANN from the wider system, then the result is entirely a mathematical method. The recent EPO decision in Mitsubishi was quoted, in which an ANN was found to be a mathematical method as the entire thing could effectively be recast as an (albeit horrifically complicated and convoluted) function. EP AI argued that a trained ANN is not a mathematical method despite being able to be described mathematically, and that, in contrast to Mitsubishi, the claimed ANN has a practical application and is therefore not a mathematical method as such.

In arguing Ground 4, the UKIPO returned essentially to the logic of the original Hearing Officer’s decision: that the recommendation is only better in an aesthetic sense, and that this is not enough to take the claim outside the exclusion. EP AI asserted that the recommendation is based on technical analysis of the physical characteristics of the content; the analysis and selection are carried out in a technical way, and there is therefore a technical contribution in the recommendation.

While we await the outcome of the Appeal hearing, the graphic below provides an overview of what has happened for the main issues across the course of this case.

What could happen?

The judgement in the Appeal hearing could arrive at any time, but for now we have indulged in some crystal ball gazing…

With respect to Grounds 1 and 2, a definition of the terms “computer” and “computer program” would add some welcome clarity to the patentability of AI systems in the UK, particularly outside the relatively narrow confines of ANNs. Such a definition would also bring a whole host of challenges, so it seems likely that the judgement will avoid this, and will focus rather on whether or not a trained ANN in particular is a computer or computer program. We can see a line of argument in which the Court of Appeal decides that an ANN is an evolution of the original concept of a computer program as envisaged at the time the Patents Act was drafted, and so on this basis shouldn’t escape the exclusion. However, this case would seem to represent an opportunity to revisit the original premise of the computer program exclusion: the idea that computer programs would be better protected by copyright. At the very least for AI systems, with their enormous capabilities, this is clearly not the case, so will we see an opening towards what we would argue might be a more realistic approach to the question of whether an ANN is, or should be, caught by the computer program exclusion? This may in the end come down to a policy position on whether or not computer programs are the sort of innovation that the patent system should be protecting. Perhaps the UK government’s stated wish to welcome AI investment and innovation in the UK could be helpful here?

It seems very possible that we may be left frustrated on the question of Ground 3, as the Court of Appeal could, and possibly should, avoid this question entirely, leaving the possibility for a combination of computer program and mathematical method exclusions to continue to cause headaches for applicants.  We do have the Hearing Officer’s decision stating, in respect of this case, that: “although an ANN and a method of training an ANN per se is no more than an abstract mathematical algorithm, its specific application here as part of a file recommendation engine is, in my opinion, enough to dispense with the mathematical method as such objection.” While many of us may be gearing up to use this in support of AI claims in the UK, past experience suggests that the application of this new bit of case law by the UKIPO may not be as helpful to applicants as we would wish.

When it comes to Ground 4 and the question of technical contribution, we are holding on to hope that the Court of Appeal will take the opportunity to recognise a level of technicality in the recommendation provided by the invention. This could be in the acknowledgement that, while the recommended file is subjectively “better”, it is nonetheless a different file to that which would have been presented without the ANN, and it has been identified on the basis of physical properties of the file. Another approach would be to recognise a level of technical insight in the bridging of the gap between semantic space and property space, which is achieved by the invention.

When asked, a majority of our webinar audience considered the EP AI claims to be patentable, and while we wait for the Appeal judgement to issue, we are taking that as a promising sign!

We will of course be back with more analysis once the Appeal judgement issues, and regardless of how the Court of Appeal rules, we will still have to wait to see how the UKIPO updates its practice to reflect the judgement. In the meantime, for a detailed walk through the issues covered above, and lively discussion of some very interesting audience questions at the end, the recording of our webinar on the Emotional Perception AI proceedings so far can be requested here.

This is for general information only and does not constitute legal advice. Should you require advice on this or any other topic then please contact hlk@hlk-ip.com or your usual HLK advisor.
