SDS 632: Liquid Neural Networks

Podcast Guest: Adrian Kosowski

December 2, 2022

Liquid neural networks are a type of bio-inspired machine learning set to make a huge impact in the field of data analytics. On this week’s Five-Minute Friday, Jon Krohn speaks with Pathway.com Co-Founder Dr. Adrian Kosowski about the development of this new type of network and what this means for the future of data.

About Adrian Kosowski
Adrian obtained his PhD in algorithms at the age of 20. He specializes in network science and modeling processes involving graphs, time, and all things random. He is currently a co-founder of Pathway (pathway.com) – a programming framework which takes care of data updates in data streams.
Adrian spent a decade in academia, at Inria and Ecole Polytechnique in France. He took a strong interest in bio-inspired distributed systems, working on topics ranging from DNA computing to modeling ant behavior. His publication record includes two Best Paper talks at major ACM conferences.
Adrian is also a co-founder of the competitive programming website spoj.com, which has been used by a million people to boost their programming skills.
Overview
The worm C. elegans (Caenorhabditis elegans), with its simple brain structure, became the model organism for a team at MIT looking to develop a bio-inspired neural network. By taking inspiration from the worm's few neurons, which interact more simply than neurons in the human brain yet produce surprisingly complex results, the team managed to build a model that simplifies, and therefore facilitates, the learning process. Adrian explains that this network does so precisely by following the worm's biology, where neurons interact with each other in a manner he describes as "hydraulic", pushing on cells just as water in motion might direct movement. This helps the neural network adapt to changing data continuously, rather than only through a training data set.
These developments could improve the way that data scientists look at time-series data. Nevertheless, Jon and Adrian recognize that this is the very beginning of bio-inspired machine learning, and limitations to these approaches remain. The most significant hurdle will be to truly mimic biology – in this case, the way that the worm learns from time-series data over time – to create an expressive network inspired by nature.
Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information. 

Podcast Transcript

Jon Krohn: 00:06

This is Five-Minute Friday on Liquid Neural Networks.
00:09
As in the preceding two weeks of Five-Minute Fridays, today I'm having a short, five-minute-ish conversation with a preeminent data science speaker whom I met in person at ODSC West in San Francisco. Our guest today is Dr. Adrian Kosowski, who introduces the concept of liquid neural networks. All right. We're here at Open Data Science Conference West, ODSC West, in San Francisco, 2022. I'm here live, filming with Adrian Kosowski. He's co-founder and chief product officer at Pathway.com, which is a programming framework that handles streaming data updates.
 
01:02
Adrian holds a PhD in computer science from Gdansk University in Poland, which he completed at 20 years old. He then went on to a research career in Paris, at the prestigious Ecole Polytechnique and at Inria, the computer science institute behind many key innovations, including the ubiquitous scikit-learn machine learning library in Python. So Adrian, I am fascinated by bio-inspired machine learning. It's something that I talk about as often as I can in my book, Deep Learning Illustrated. So I like to talk about and learn about connections between biology and machine learning, and that happens to be an area of expertise for you. There's a particular term that I'm fascinated by, and I know that you're fascinated by it too: liquid neural networks. So Adrian, what are liquid neural networks, and how could they make a real-world impact?
Adrian Kosowski: 01:56
Jon, it's a pleasure to be here. Liquid neural networks are a new concept concerning a certain bio-inspired extension of recurrent neural networks.
Jon Krohn: 02:10
Cool. 
Adrian Kosowski: 02:11
The team at MIT behind it drew its inspiration from the brain of a very simple worm called C. elegans.
Jon Krohn: 02:22
A very common biological prototype for studying the brain.
Adrian Kosowski: 02:26
Indeed, it's a model organism for many good reasons. One of them is that it has a super simple brain structure. It's simple in that it has very few neurons, only about 300, but these neurons are also very, very simple, and act differently than neurons in the human brain.
Jon Krohn: 02:45
Right. So only about 300 neurons in C. elegans, whereas the human brain has on the order of 90 billion neurons. So simpler in that sense, but also simpler in terms of structure.
Adrian Kosowski: 03:04
In terms of structure, indeed. There are some nice studies comparing a single human neuron, in terms of computational capacity, to a pretty large structure in an artificial neural network: you would need something like several thousand artificial neurons to do the same work as a single human neuron does. By contrast, for C. elegans the neuron is very, very simple. Some like to say it's really hydraulic, in the sense that it pushes on the other neurons that it's connected to, rather as water would push on another cell. So the structure of a neuron, the behavior of a neuron, can be described by a simple set of differential equations, which are known and easy to write down, and it's tempting to try to build an artificial neural network that implements similar dynamics.
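To make Adrian's description concrete, here is a minimal Python sketch of a liquid time-constant style neuron, in the spirit of the MIT work. The weight shapes, constants, and the simple forward-Euler integration are illustrative assumptions, not the exact published formulation.

import numpy as np

def ltc_step(x, I, W, tau, A, dt=0.01):
    # One forward-Euler step of an LTC-style ODE:
    #   dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
    # where f is a saturating synaptic nonlinearity.
    f = np.tanh(W @ np.concatenate([x, I]))  # bounded synaptic gate
    dxdt = -(1.0 / tau + f) * x + f * A      # "hydraulic" push toward potential A
    return x + dt * dxdt

# Toy usage: four hidden neurons driven by a two-dimensional input.
rng = np.random.default_rng(0)
x = np.zeros(4)
W = rng.normal(scale=0.5, size=(4, 6))   # weights over [state, input]
tau, A = 1.0, 1.0                        # time constant and resting potential
for t in range(100):
    I = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    x = ltc_step(x, I, W, tau, A)

Note how the input modulates the state's effective decay rate rather than entering as a plain additive term; that coupling is part of what gives the dynamics their "liquid" character.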
Jon Krohn: 04:08
Cool. So this liquid neural network idea is related to the biological inspiration of this C. elegans worm and its hydraulic mechanism for conveying information between brain cells, between neurons, but it’s also a bit of a pun. Explain why it’s also a pun. 
Adrian Kosowski: 04:30
So indeed, there are two senses in which it's liquid. It's liquid in the sense you described, in that it behaves a bit like liquid pushing, but it's also liquid in the sense that, in the implementation of the network's learning process, time is treated as continuous. For most engineers, the usual way of looking at time is in terms of discrete steps, where a certain transformation of weights is applied to the network at each step. In this case, we look at the neural network in a differential, time-continuous way, and apply a special type of learning process based on backpropagation. And to emphasize: C. elegans does not directly apply backpropagation. Backpropagation is applied in the artificial simulation, which is what liquid neural networks describe. When trying to design mechanisms like this, researchers are not exactly trying to model biology.
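A small sketch may help contrast the two views of time Adrian describes: the familiar discrete-step update versus a continuous-time state that an ODE solver integrates between observations. The weights and solver settings below are illustrative assumptions.

import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(1)
W = rng.normal(scale=0.3, size=(3, 3))   # recurrent weights
U = rng.normal(scale=0.3, size=(3, 2))   # input weights

def discrete_step(h, x):
    # Classic RNN: one weight transformation per discrete time step.
    return np.tanh(W @ h + U @ x)

def continuous_rhs(t, h, x):
    # Continuous-time view: the same transformation defines dh/dt,
    # so the hidden state flows smoothly between observations.
    return -h + np.tanh(W @ h + U @ x)

x = np.array([0.5, -0.2])
h_discrete = discrete_step(np.zeros(3), x)
sol = solve_ivp(continuous_rhs, t_span=(0.0, 1.0), y0=np.zeros(3), args=(x,))
h_continuous = sol.y[:, -1]   # state after one unit of continuous time

In the continuous formulation, training backpropagates through the solver's steps, which is the sense in which backpropagation appears in the artificial simulation but not in the worm itself.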
Jon Krohn: 05:43
Yeah. And it's impossible, right? Because in biology, the worm is learning over time, whereas when we apply techniques like backpropagation to learn with machine learning, we're using data that were collected over time, and then moving backwards over those recorded data points even as training moves forward. It can't work the same way as it does in C. elegans, because C. elegans is actually learning as time unfolds, whereas we're only retrospectively looking at data points that have already been recorded.
Adrian Kosowski: 06:31
That's the thing, actually. And it's something to realize in a much, much broader context: the capabilities of machine learning that has to work in real time, that is forced to work in a real-time context, are somehow limited, constrained. Such models don't exactly cover the same ground as those we're used to. In particular, a lot of us work with time series data; we do at Pathway, not with bio-inspired models of course, but with time series data. Sometimes the time series arrives in a way that allows us to look back at it from the beginning during the learning process, and sometimes decisions have to be taken immediately. So there are two different settings, with different interpretations.
Jon Krohn: 07:33
Cool, that sounds fascinating. I love that these liquid neural networks are biologically inspired, even if we can't capture all elements of how biological systems learn. But how could liquid neural networks make a real difference in the world? How could this revolutionize parts of machine learning?
Adrian Kosowski: 07:53
So I guess it's always safest to take small steps. The way things look now, certain inspirations and improvements that have been achieved are influencing new network designs and best practices: seemingly little mathematical tricks that help to shave off the complexity of transformations and allow for a smoother learning process. Of course, one of the major challenges in machine learning relates, for example, to how gradients propagate in the network, how change propagates through the network. And every little trick, every mathematical optimization that allows for a smoother realization of this process, so that gradients don't zero out, et cetera, is helpful. So there are many different contexts where we can draw inspiration.
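The gradient issue Adrian mentions can be seen numerically: in a plain recurrent network, the backpropagated gradient is multiplied by the recurrent Jacobian at every step and can shrink toward zero. A tiny illustration, with arbitrary assumed sizes and scales:

import numpy as np

rng = np.random.default_rng(2)
n = 16
W = rng.normal(scale=0.1, size=(n, n))        # small recurrent weights
grad = np.ones(n)
for step in range(50):
    h = rng.normal(size=n)                    # stand-in hidden state
    jac_T = W.T * (1.0 - np.tanh(W @ h) ** 2) # transposed Jacobian of h -> tanh(W h)
    grad = jac_T @ grad
    if step % 10 == 0:
        print(step, np.linalg.norm(grad))     # the norm decays step by step

Tricks that keep such products well conditioned, whether gating, careful time constants, or bounded dynamics, are exactly the kind of little mathematical optimizations a formulation like liquid neural networks can contribute.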
Jon Krohn: 08:54
Cool. So we go step by step, take small steps. But are there any specific practical applications of this so far? 
Adrian Kosowski: 09:06
So again, it's about looking at time series data. There's also a big area that might show promise for general-purpose models, related to reservoir computing: using some type of bio-inspired neural network as a pre-processing step for other neural networks, performing a dimensionality increase of the input data.
Jon Krohn: 09:43
A dimensionality increase of the data. So you could have relatively simplistic inputs and pass them through a liquid neural network that would increase the number of features going into a downstream machine learning model.
Adrian Kosowski: 09:55
This could be the future of where these models are going. It's definitely not where we are yet, specifically with liquid neural networks. There are parallel approaches that take similar biological inspirations, and which are perhaps even closer to this goal.
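For readers who want to see the reservoir idea in code, here is a minimal echo-state-style sketch of the dimensionality increase Jon summarizes: a fixed, untrained recurrent reservoir lifts a two-dimensional input into a 200-dimensional feature vector for a downstream model. The sizes and scalings are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
n_in, n_res = 2, 200                     # 2 input features -> 200 reservoir features
W_in = rng.normal(scale=0.5, size=(n_res, n_in))
W_res = rng.normal(size=(n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # spectral radius below 1

def reservoir_features(inputs, leak=0.3):
    # Run the sequence through the fixed reservoir and return the
    # high-dimensional states, one feature vector per time step.
    h = np.zeros(n_res)
    states = []
    for x in inputs:
        h = (1 - leak) * h + leak * np.tanh(W_in @ x + W_res @ h)
        states.append(h.copy())
    return np.array(states)

seq = rng.normal(size=(50, n_in))        # toy two-dimensional time series
features = reservoir_features(seq)       # shape (50, 200): dimensionality increased

Only a simple readout on top of these features would be trained; the reservoir itself stays fixed, which is part of what makes it attractive as a pre-processing step.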
Jon Krohn: 10:10
Yeah. Cool. All right, well, something for all of us to keep an eye on. Fascinating to learn about. Thank you so much, Adrian, for taking the time to fill us in on this Five-Minute Friday episode of the SuperDataScience Podcast on Liquid Neural Networks.
Adrian Kosowski: 10:20
My pleasure, Jon. Thank you. 
Jon Krohn: 10:23
Okay, that's it for this special guest episode of Five-Minute Friday, filmed onsite at ODSC West. We'll be back with another one of these soon. Until next time, keep on rocking it out there, folks, and I'm looking forward to enjoying another round of the SuperDataScience Podcast with you very soon.