Patentability of AI – A Legal Perspective on Emotional Perception
AI has the power to revolutionise almost any industry and is being embraced far and wide. The pressure is on to develop ever more sophisticated AI systems. In other areas of technology, the patent system plays a pivotal role in protecting rights to ensure a return on investment in research and development, but is it fit for purpose when it comes to protecting AI-related innovation?
Patent law in the UK as we know it today dates back to the 1970s, when the idea of AI was the stuff of science fiction. As the legal framework for protecting innovation was established without AI in mind, it is inevitable that the law is having to play catch-up with the commercial reality of using AI to assist in innovation and with the development of AI technology itself. The increasing prevalence of AI is demanding a reassessment of fundamental patent law concepts and their application to new technology.
A recent decision from the UK High Court (Emotional Perception AI Ltd v Comptroller-General of Patents [2023] EWHC 2948 (Ch)) has gone some way towards clarifying the issues. This was the first time the High Court had considered whether the use of an aspect of Artificial Intelligence, namely an Artificial Neural Network (ANN), engaged the statutory exclusion from patentability of “a program for a computer … as such” on the facts before it. The decision may pave the way to making it easier to patent AI inventions in the UK.
Patentability issues for AI inventions – excluded subject matter and technical contribution
One of the main issues faced when trying to patent AI inventions is the law relating to exclusions from patentability. UK legislation1 declares that anything that consists of certain categories of excluded matter, such as a program for a computer “as such” or a mathematical method, is not an invention and therefore cannot be protected by a patent. AI by its nature is based on computational models and mathematical algorithms – does this mean that it is not possible to patent the results of endeavours to improve AI and to create new AI systems?
The answer is that the claimed invention needs to amount to more than a computer program (or a mathematical method) – it must provide a “technical effect”. Guidance from the UK Intellectual Property Office (UKIPO) Manual of Patent Practice provides that a computer-related invention will not be excluded from patentability if it is directed to a specific technical process outside of a computer and contributes to a solution of a technical problem lying outside of the computer. Conversely, a task or process is unlikely to make the required technical contribution if it relates solely to excluded subject matter or to processing or manipulating information or data, or if it has the effect of just being a better or well-written program for a conventional computer. In short, the computer program needs to have an effect on a real-world external process or apparatus (which could include a better functioning computer).
The “technical contribution” is assessed in accordance with a 4-step framework developed through case law (the Aerotel test2), in which the claim is construed, the actual contribution is identified, and the question is then asked whether the contribution falls within the excluded subject matter. Finally, there is consideration of whether the contribution is actually technical in nature. In practice, the test involves consideration of the problem to be solved by the alleged invention, how the invention works, what the advantages are and what the inventor has added to human knowledge.
The requirement for a technical effect means that core AI may be difficult to patent – improving an abstract AI algorithm is unlikely to make the necessary technical contribution. Instead, the focus should be on the technical implementation of the AI.
The Emotional Perception case
The recent Emotional Perception case exemplifies the application of these principles. This was an appeal to the High Court challenging a decision by the UKIPO to refuse to grant Emotional Perception’s patent. The UKIPO found that the patent application claimed a computer program as such, which was therefore excluded from patentability, despite acknowledging that the invention was a significant improvement over the prior art. On appeal to the High Court, Mann J considered whether the use of an ANN, in the way claimed in Emotional Perception’s patent application, engaged the computer program exclusion from patentability.
The technology
The patent application claimed an improved system for providing media file recommendations to an end user, including sending a file and message in accordance with the recommendation. This might be used, for example, by a music website where a user may be interested in receiving music similar to another track. In contrast to existing systems where similar tracks are suggested according to a category derived from human classification (rock, heavy metal, folk, classical etc) or human-compiled playlists, the claimed advantage of the invention is that the AI system can offer suggestions of similar music in terms of human perception and emotion irrespective of the genre of music and the apparently similar tastes of other humans.
The technology underlying the invention claimed in the patent application involves the use of ANNs. An ANN can be trained in how to process an input, learn from that training process, hold that learning within itself, and then process future inputs in a way derived from that training and learning.
According to the claimed invention, one ANN is trained to characterise certain music tracks based on a description of how they are perceived by a human (e.g. happy, sad, relaxing, though the descriptions would be more complicated and wordy than that) and to produce co-ordinates in a notional “semantic space” for each track in a pair of music files. Two tracks of music which are semantically similar will have co-ordinates closer together.
A second ANN analyses the physical and measurable properties of the same two tracks - tone, timbre, speed, loudness etc - and produces co-ordinates in a notional “property space”. Again, differences or similarities are reflected in the proximity of the co-ordinates. The second ANN is then trained to make the distances between pairs of the property co-ordinates converge or diverge in alignment with the distancing between them in the semantic space. So, if the property space co-ordinates are farther apart (or closer together) than those in the semantic space, they are moved closer together (or further apart). This training is achieved by a process called back-propagation in which the “error” in the property space is corrected using an algorithm provided by a human. Correction is achieved by the ANN adjusting its own internal workings, such as weights and biases. The ANN learns from the experience without being told how to do it by a human being.
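To make the distance-alignment idea concrete, the following is a minimal illustrative sketch of one back-propagation step of the kind described above; it is not the patented system, and the network architecture, feature dimensions, loss function and variable names (such as PropertyNet) are all assumptions made for illustration.

```python
# Illustrative sketch only (not the claimed system): a network maps measurable
# audio properties to "property space" co-ordinates, and back-propagation nudges
# pairwise distances in property space towards the corresponding distances
# already fixed in semantic space by the first ANN. All names and dimensions
# are assumptions.
import torch
import torch.nn as nn

class PropertyNet(nn.Module):
    """Maps measurable audio properties (tone, timbre, tempo, loudness, ...)
    to a point in a notional property space."""
    def __init__(self, n_features: int = 64, embed_dim: int = 16):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)

def alignment_loss(prop_a, prop_b, semantic_distance):
    """Penalise the gap between a pair's distance in property space and
    its (fixed) distance in semantic space."""
    property_distance = torch.norm(prop_a - prop_b, dim=-1)
    return ((property_distance - semantic_distance) ** 2).mean()

# One training step on a batch of track pairs.
net = PropertyNet()
optimiser = torch.optim.Adam(net.parameters(), lr=1e-3)

features_a = torch.randn(32, 64)    # measurable properties of track A (placeholder data)
features_b = torch.randn(32, 64)    # measurable properties of track B (placeholder data)
semantic_distance = torch.rand(32)  # target distances taken from the first (semantic) ANN

loss = alignment_loss(net(features_a), net(features_b), semantic_distance)
loss.backward()        # back-propagation: the "error" flows back through the network
optimiser.step()       # the ANN adjusts its own weights and biases
optimiser.zero_grad()
```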
The training process is repeated many times with many pairs of tracks, so the ANN learns, by repetitive correction, how to produce property vectors whose relative distances reflect semantic similarity or dissimilarity between two tracks. In other words, the ANN learns how to discern semantic similarity (or dissimilarity) from the physical characteristics of a music track. It can then take a track, analyse its physical properties, and then recommend semantically similar music files (i.e. music which will, for example, generate a similar emotional response in humans) from a database. The system sends a recommendation message and a music file.
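Once trained in this way, producing a recommendation amounts to a nearest-neighbour search over property-space co-ordinates. The sketch below assumes the hypothetical PropertyNet from the previous example; the catalogue, the feature extraction and the recommendation message format are likewise assumptions, not details taken from the patent application.

```python
# Illustrative sketch only: after training, property-space co-ordinates stand in
# for semantic similarity, so recommending a track reduces to finding the
# catalogue tracks whose co-ordinates lie closest to the query track's.
import torch

@torch.no_grad()
def recommend(net, query_features: torch.Tensor,
              catalogue_features: torch.Tensor, k: int = 5):
    """Return indices of the k catalogue tracks closest to the query track
    in the learned property space."""
    query_vec = net(query_features.unsqueeze(0))   # (1, embed_dim)
    catalogue_vecs = net(catalogue_features)       # (N, embed_dim)
    distances = torch.norm(catalogue_vecs - query_vec, dim=-1)
    return torch.topk(distances, k, largest=False).indices

# e.g. indices = recommend(net, some_track_features, library_features)
# The system would then send the recommendation message and the chosen file(s).
```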
The decision
The High Court considered two questions, as follows.
First, was there excluded subject matter – did the patent application claim a computer program?
The court found that the ANN itself is not a computer program. The data is processed by the nodes of the ANN, and the way the nodes operate and pass on the data is determined by the ANN itself, which learns via the training process rather than being programmed by implementing code given to it by a human.
However, the parties agreed that a computer program achieves, or initiates, the training, so there was some computer programming activity involved in the invention. The Court therefore considered whether the internal training and the subsequent operation of the trained ANN amount to a computer program for the purposes of the exclusion.
The Court held that it was not correct to view the whole thing as some sort of overall programming activity for the purposes of the exclusion (which it suspected the UKIPO had done) – one should be a bit more analytical than that. The invention is the idea of using pairs of files for training, and setting the training objective and parameters accordingly. Although computer programming was involved in setting the training objectives, the patent does not claim that program and the program is subsidiary to the invention, so the exclusion is not invoked.
Second, was there a technical contribution sufficient to avoid the computer program exclusion?
The Court went on to consider the position if it was wrong about the claim not being a claim to a computer program. It referred to authorities3 which held that the mere involvement of a computer or a computer program does not by itself invoke the statutory exclusion.
The Court recognised that, were the claim to involve a claim to a computer program, it would be necessary to consider steps 3 and 4 of the Aerotel test in the light of the identified contribution (i.e. whether it falls solely within the excluded subject matter; and whether the actual or alleged contribution is actually technical in nature), and in particular whether the computer program made a technical contribution outside itself.
There was no dispute as to the law – if there is a technical contribution which lies outside the excluded subject matter, then the invention is unlikely to fall foul of the computer program exclusion because it is not a claim to a program “as such”. Emotional Perception argued that the contribution is the provision of improved file recommendations via a sophisticated learning process and operation of the ANN, and the technical effect is the sending of an improved recommendation message. All of this is external to the computer.
The Court considered Gemstar4, which Emotional Perception submitted supported its case. Gemstar concerned three patents which involved computer systems. Two of them did no more than produce displays, and that was held to be an insufficient technical effect to prevent the inventions falling foul of the exclusion. The third facilitated the moving of a file from one apparatus to another, and that was held to be a “real world” or external effect that was sufficiently technical to enable the patent to escape the exclusion.
Emotional Perception also submitted that its case was effectively on all fours with the Protecting Kids5 case. That case involved a “data communication analysis engine” which was capable of detecting the undesirable use of computers by children (or others) by “packet sniffing”. Contents of the packets were assessed by the computer for an alert level, and if that level was reached an alert was sent digitally and electronically to an appropriate adult so that an appropriate response could be sent back to the computer. The Court held that the inventive concept, although relating to the content of electronic communications, was undoubtedly a physical one rather than an abstract one – more akin to the third of the three patents considered in Gemstar. The contribution was improved monitoring of the content of electronic communications, which was held to have the necessary characteristics of a technical contribution outside the computer itself.
Emotional Perception argued that, similarly, data was moved outside the computer system in the form of the music file that is transferred, which provides an external effect in the outside world. The Court agreed.
Significantly, the UKIPO had found that the sending of the file to the end user is a matter external to the computer but that the “… beneficial effect is of a subjective and cognitive nature and does not suggest there is any technical effect over and above the running of a program on a computer”. The High Court disagreed with this conclusion. It held that a file has been identified, and then moved, because it fulfilled certain criteria and although those criteria cannot be described in purely technical terms, the ANN has gone about its analysis and selection in a technical way. That is a technical effect outside the computer and the possible subjective effect should not disqualify it.
The Court held that if the computer program was either the training program or the overall training activity (which the UKIPO appeared to have considered to be the case), the resulting trained ANN may itself constitute a technical effect which prevents the exclusion applying.
Discussion
This decision marks an exciting development and some much needed clarification for those looking to protect AI inventions. But does it indicate that there will be a new approach to examination of computer-related inventions at the UKIPO and mark the dawn of an increasingly AI-friendly era in the patent world, in the UK at least?
The answer would seem to be “yes”, to some extent. In response to the decision, the UKIPO issued statutory guidance stating “the office is making an immediate change to practice for the examination of ANNs for excluded subject matter. Patent Examiners should not object to inventions involving an ANN under the ‘program for a computer’ exclusion of section 1(2)(c)”. However, the UKIPO’s guidelines for examining patent applications relating to AI have been temporarily suspended, so we can expect further developments once the UKIPO has given due consideration to the practical implications of the Emotional Perception case.
The computer program exclusion is not the only statutory exclusion that might be levelled at an AI invention. The reliance of AI on mathematical algorithms might invoke the exclusion for mathematical methods. An attempt to raise this before the High Court in the Emotional Perception case failed for procedural reasons, but one would hope that the UKIPO will consider all possibly relevant exclusions from patentability in re-assessing its guidelines for examination.
It should be noted that the approach of the European Patent Office (EPO) to the computer program (and mathematical method) exclusion is somewhat different to that of the UKIPO. Before the EPO, the bar for escaping the exclusion itself is much lower (an approach the UK courts refer to as the “any hardware” approach), with detailed consideration of the technical character of the invention taking place as part of the assessment of inventive step, using the ‘problem-solution’ approach. The question becomes one of whether the invention provides a technical solution to a technical problem.
It will be interesting to see whether the EPO is influenced by the UK Court’s approach in Emotional Perception. The EPO Guidelines for Examination are amended regularly, with a recent amendment including relatively generous provisions for AI-related applications to be considered to have the necessary technical character. However, as things stand, prosecution of AI-related patent applications through the UKIPO might be considered to be an attractive option for trying to secure protection for the UK (although, given the potential for a fast pace of change, it might be prudent to submit national and EP applications in parallel).
Many will welcome the pragmatism shown by the High Court and may feel that this is a sound application of the existing law. But in an area as fast moving and complex as AI, it is unlikely to be a complete answer and it is conceivable that an overhaul of the legislation will be needed before too long. After all, the exclusions to patentability were originally introduced, in part, to deal with the fact that patent offices were not equipped to search the prior art concerning computer programs and the like. But times have moved on and one has to ask whether, to the extent that remains the case, it is sufficient justification for exclusions that potentially bite on so many inventions in the modern world.
1 Patents Act 1977, section 1(2)
2 After Aerotel Ltd v Telco Holdings Ltd and Macrossan’s Application [2006] EWCA Civ 1371
3 For example, Protecting Kids all over the World (PKTWO) Ltd’s Application [2012] RPC 13 and Halliburton Energy Services Inc’s Patent Application [2012] RPC 12.
4 Gemstar-TV Guide International Inc v Virgin Media Inc [2010] RPC 10
5 Protecting Kids all over the World (PKTWO) Ltd’s Application [2012] RPC 13