Deep learning: new neural nets could model continuous processes

In deep learning, neural nets pass data through a fixed stack of hidden layers to produce defined results. AI researcher David Duvenaud is questioning that assumption with ODE nets.

Deep learning is incredible: truly, it is. Being able to map human-like brain power onto a computer, so that it learns as we do, should never be taken for granted. It is one of the most astonishing scientific breakthroughs in the history of our species. However, deep learning is not beyond improvement.

At the heart of a deep learning model lies a neural net. This is the brain, if you like: stacked layers of simple nodes that work together to find patterns in data. The net assigns values to the data it processes, filtering that data through successive layers to reach a final conclusion.

Now, scientists are questioning how the values are assigned to data and whether there’s a more efficient way to run deep learning algorithms.

David Duvenaud, an AI researcher at the University of Toronto, set out to build a medical deep learning model that would predict a patient’s health over a period of time. Traditional neural networks thrive when they learn from data that arrives at defined observation stages, matching the discrete hidden layers of a deep learning model. Healthcare data is difficult to fit into that mould.

Health is continuous: a patient’s condition evolves over time rather than in discrete steps, and measurements arrive at irregular intervals across many variables. So how can a neural net pick up on continuous data?

Can neural nets be improved?

Think of a deep learning model as being similar to the classic board game Guess Who?. In the game, each player has a selection of characters in front of them, all with a different appearance: some have facial hair or glasses, some blue eyes or brown, and each of them is unique.

One player asks the other binary questions to discount characters from their investigation until, through this process of elimination, they are left with the final chosen character: this is the output layer.

This is similar to how a neural network works. It processes its data through different stages, eliminating more and more of the possibilities until it is left with the correct answer. This is the technology used in face recognition software, for example.

Image: a basic neural network model

David Duvenaud saw an opportunity. He sought to break from the binary for a more fluid form of deep learning.

Traditionally, the answer is simply to add more layers to a neural net to reach a more accurate endpoint. This is not always sensible, though. Why, for example, should you have to define the number of layers within a neural network, train the model and only then see how accurate it is? Duvenaud’s neural net lets you specify the accuracy first, then it finds the most efficient way to train itself within that margin of error.
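The accuracy-first idea can be sketched with a generic adaptive ODE solver: instead of fixing the number of steps up front, you hand the solver an error tolerance and it decides how many function evaluations it needs. (SciPy’s `solve_ivp` is used here purely as an illustration; it is not the solver from Duvenaud’s work, and `dynamics` is a stand-in for a learned function.)

```python
import numpy as np
from scipy.integrate import solve_ivp

def dynamics(t, z):
    # Stand-in for a learned function f(z, t): simple nonlinear dynamics.
    return np.tanh(z)

z0 = np.array([1.0])

# Loose tolerance: the solver gets away with fewer evaluation steps.
loose = solve_ivp(dynamics, (0.0, 5.0), z0, rtol=1e-2, atol=1e-2)
# Tight tolerance: the solver spends more function evaluations.
tight = solve_ivp(dynamics, (0.0, 5.0), z0, rtol=1e-8, atol=1e-8)

print(loose.nfev, tight.nfev)  # tight.nfev > loose.nfev
```

The number of "layers" (evaluation steps) is no longer a design decision; it falls out of the accuracy you ask for.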

Researchers describe the result as an “ODE net”: a neural network based on ordinary differential equations (ODEs).

How can an ODE be solved?

An ODE can be solved numerically by integrating it step by step. This is a computationally intensive task, and methods have been suggested in the past to reduce the number of hidden stages within deep learning models.
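A minimal sketch of what step-by-step numerical integration means, using Euler’s method: each small step plays the role of one "hidden stage" in the continuous model.

```python
def euler_solve(f, z0, t0, t1, n_steps):
    """Integrate dz/dt = f(z, t) from t0 to t1 with n_steps Euler steps."""
    dt = (t1 - t0) / n_steps
    z, t = z0, t0
    for _ in range(n_steps):
        z = z + dt * f(z, t)  # one small step along the dynamics
        t = t + dt
    return z

# dz/dt = z has the exact solution z(t) = e^t, so z(1) should be about 2.718.
approx = euler_solve(lambda z, t: z, 1.0, 0.0, 1.0, 100_000)
print(approx)
```

More steps means more accuracy but more computation, which is exactly the trade-off an adaptive solver manages automatically.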

Duvenaud worked with a number of researchers on a paper that proposed a memory-efficient method for training such a model. The method computes gradients by solving a second, augmented ODE backwards in time, avoiding the need to store intermediate states. The gradient computation algorithm treats the solver itself as an operation, “ODESolve”, introduced as an operator later in the process.


This operator relies on the initial state, the function, the initial time, the end time and the parameters being learned. The paper provided Python code to easily compute the derivatives of the ODE solver.
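A toy sketch of the backward pass, using SciPy’s generic solver rather than the paper’s code: for simple linear dynamics dz/dt = Az, integrating the adjoint ODE da/dt = -Aᵀa backwards from the loss gradient at the end time recovers the gradient with respect to the initial state, which we check against finite differences. (The matrix, loss and tolerances are illustrative choices, not values from the paper.)

```python
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0], [-1.0, -0.1]])   # damped-oscillator dynamics
z0 = np.array([1.0, 0.0])
T = 2.0

# Forward pass: z(T) from the initial state, function and time interval.
zT = solve_ivp(lambda t, z: A @ z, (0.0, T), z0,
               rtol=1e-10, atol=1e-10).y[:, -1]

# Loss L = 0.5 * ||z(T)||^2, so the gradient at the end time is z(T) itself.
a_T = zT

# Backward pass: integrate the adjoint ODE from T down to 0.
a0 = solve_ivp(lambda t, a: -A.T @ a, (T, 0.0), a_T,
               rtol=1e-10, atol=1e-10).y[:, -1]

# Check against finite differences on the initial state.
eps = 1e-6
grad_fd = np.zeros(2)
for i in range(2):
    zp = z0.copy()
    zp[i] += eps
    zTp = solve_ivp(lambda t, z: A @ z, (0.0, T), zp,
                    rtol=1e-10, atol=1e-10).y[:, -1]
    grad_fd[i] = (0.5 * zTp @ zTp - 0.5 * zT @ zT) / eps

print(a0, grad_fd)  # the two gradients agree
```

The key point is that the backward solve needs only the end state and the dynamics, not a stored trace of every intermediate step.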

The paper suggested that supervised learning – particularly MNIST handwritten digit classification – was one application in which the ODESolve method performed comparably to a residual network while using far fewer parameters.

Will ODEs revolutionise deep learning?

ODE nets are not the only way to run a deep learning model, and there could be any number of reasons a scientist would want to define the number of stages in advance. Either way, “it’s not ready for prime time yet,” Duvenaud claims.

However, ODE nets pose interesting questions for deep learning moving forward, about how we build neural nets and what the most efficient methods of deep learning truly are. The underlying idea is not particularly new, but this is a breakthrough of sorts. Whether the approach works for a range of models remains to be seen.

Luke Conrad

Technology & Marketing Enthusiast
