Algorithms need to be able to say ‘I don’t know’

3 minute read


If oncologists are ever going to trust AI to assist with diagnoses, they need algorithms that can process uncertainty.


Locating the original source of tumours in patients with cancer of the unknown primary (CUP) is clinically important because it allows oncologists to treat the root cause of the disease. 

CUP affects only a few thousand patients a year in Australia, but it's a distressing condition, as Medicare often blocks funding for cancer patients whose primary cause is unknown.

An artificial intelligence company based in Brisbane, called Max Kelsen, is working to crack this problem by looking for genetic clues in the tumour biopsy that indicate which type of tissue the cancer originated in. 

There’s a lot of noise in the data and it’s a complex problem to solve, but there are genetic traces of the original source of the tumour. 

For instance, a brain tumour that started as melanoma will genetically look different to a brain tumour that started as lung cancer. 

“Using these deep learning techniques, we can elucidate that signal,” says Max Kelsen CEO and founder Nick Therkelsen-Terry, speaking on the Oncology Republic podcast. 

“It has been shown through studies that if you do match patients with an unknown primary to a primary, you can increase the positive outcomes by 25%, which I think is a massive improvement,” said Maciej Trzaskowski (PhD), the company’s research lead. 

Max Kelsen is a six-year-old commercial and research company with around 50 staff. It’s focused on getting AI applications into real-life healthcare settings, but there are a few hurdles along the way. 

Not least is the tendency for AI clinical tests to return a result regardless of the quality of the data inputs.

“[Algorithm-based diagnostic tests] don’t have an ‘I don’t know’ column,” says Mr Therkelsen-Terry. 

“They only have the 33 cancer type columns that you’ve given it.”

Computer scientists tend to prefer algorithms that always spit out an answer. But in medicine, knowing that the uncertainty is too great to provide a clear answer is in itself clinically useful information.  

It is better to create an AI system that can answer ‘I don’t know’ than one that gives an incorrect answer and might lead the clinician to initiate the wrong treatment, says Mr Therkelsen-Terry.
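
As a rough illustration of the idea (not Max Kelsen’s actual model), a classifier over 33 cancer types can be made to abstain by withholding a prediction whenever its top probability falls below a confidence threshold. The class labels and threshold below are invented for the sketch.

```python
import numpy as np

# Illustrative only: 33 placeholder cancer-type classes and a made-up
# confidence threshold below which the model declines to answer.
CANCER_TYPES = [f"cancer_type_{i}" for i in range(33)]
CONFIDENCE_THRESHOLD = 0.80

def predict_with_abstention(class_probabilities: np.ndarray) -> str:
    """Return the predicted tissue of origin, or 'I don't know' when the
    model's most confident class is too uncertain to act on clinically."""
    top_index = int(np.argmax(class_probabilities))
    if class_probabilities[top_index] < CONFIDENCE_THRESHOLD:
        return "I don't know"
    return CANCER_TYPES[top_index]

# Probability spread thinly across all 33 classes -> abstain.
diffuse = np.full(33, 1 / 33)
print(predict_with_abstention(diffuse))    # "I don't know"

# One class dominating -> report it.
confident = np.full(33, 0.005)
confident[4] = 0.84
print(predict_with_abstention(confident))  # "cancer_type_4"
```

In practice the abstention rule could be more sophisticated (calibrated probabilities, ensembles, or Bayesian uncertainty estimates), but the clinical point is the same: the model needs a legitimate way to say it doesn’t know.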

“Uncertainty is everywhere in medicine,” he says.

For more on this topic, you can listen and subscribe by searching for “Oncology Republic” on Spotify, iTunes or your favourite podcast player.

