
Artificial intelligence 'did not miss a single urgent case' • BBC News
Fergus Walsh:
A team at DeepMind, based in London, created an algorithm, or mathematical set of rules, to enable a computer to analyse optical coherence tomography (OCT), a high resolution 3D scan of the back of the eye.

Thousands of scans were used to train the machine how to read the scans. Then, artificial intelligence was pitted against humans. The computer was asked to give a diagnosis in the cases of 1,000 patients whose clinical outcomes were already known.

The same scans were shown to eight clinicians - four leading ophthalmologists and four optometrists. Each was asked to make one of four referrals: urgent, semi-urgent, routine and observation only.

Artificial intelligence performed as well as two of the world's leading retina specialists, with an error rate of only 5.5%. Crucially, the algorithm did not miss a single urgent case.

The results, published in the journal Nature Medicine, were described as "jaw-dropping" by Dr Pearse Keane, consultant ophthalmologist, who is leading the research at Moorfields Eye Hospital.

He told the BBC: "I think this will make most eye specialists gasp because we have shown this algorithm is as good as the world's leading experts in interpreting these scans."

Artificial intelligence was able to identify serious conditions such as wet age-related macular degeneration (AMD), which can lead to blindness unless treated quickly. Dr Keane said the huge number of patients awaiting assessment was a "massive problem".


Contrast this with IBM's Watson, which has been trying to solve cancer and doing badly. Eye scans offer a better data set, clearer pathways from image to disease, and a problem that is better understood generally. Part of doing well with AI is choosing the correct limits to work within.

And this won't replace the doctors; it will just be a pre-screen.
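For concreteness, here is a minimal Python sketch of the scoring the article describes; it is not the DeepMind/Moorfields code, whose internals aren't given in the excerpt. Each scan gets one of the four referral decisions, and the system is judged on its overall error rate and on whether any gold-standard urgent case was given a less urgent referral. The function name and the toy data are illustrative assumptions.

from typing import List

# The four referral categories named in the article.
REFERRALS = ["urgent", "semi-urgent", "routine", "observation only"]

def score_referrals(predicted: List[str], gold: List[str]) -> dict:
    """Compare predicted referral decisions against gold-standard labels."""
    assert len(predicted) == len(gold)
    errors = sum(p != g for p, g in zip(predicted, gold))
    missed_urgent = sum(1 for p, g in zip(predicted, gold)
                        if g == "urgent" and p != "urgent")
    return {
        "error_rate": errors / len(gold),      # the article reports 5.5%
        "missed_urgent_cases": missed_urgent,  # the article reports zero
    }

# Toy example, not the study's data:
pred = ["urgent", "routine", "semi-urgent", "observation only"]
gold = ["urgent", "routine", "routine", "observation only"]
print(score_referrals(pred, gold))
# -> {'error_rate': 0.25, 'missed_urgent_cases': 0}

The "did not miss a single urgent case" claim corresponds to missed_urgent_cases being zero over all 1,000 scans, which matters more clinically than the headline error rate.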
moorfields  eye  deepmind  ai 
9 weeks ago by charlesarthur
iPhone eye test spots vision problems cheaply >> New Scientist
Smart Vision Labs (http://smartvisionlabs.com/), a start-up in New York City, wants to make it easier to diagnose vision problems in developing countries with an iPhone camera add-on.

The World Health Organization estimates that 246 million people have poor vision. Of these, about 90% live in low-income areas without good access to healthcare or expensive diagnostic machines.

To solve this problem Smart Vision Labs has combined two tools often used for eye tests into a single inexpensive and portable device. The first tool, an autorefractor, calculates whether someone is short-sighted or long-sighted, and to what extent, by measuring the size and shape of their eye. The second, an aberrometer, looks for distortions in how light reflects off the eye, which could indicate rarer problems such as double vision.


It would be a lot cheaper to do with Android phones.
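As a rough illustration of what the autorefractor's output means (independent of Smart Vision Labs' actual software, which isn't described in the excerpt): refractive error is reported in dioptres, with negative values indicating short sight (myopia) and positive values long sight (hyperopia). The ±0.5 D cut-off below is a commonly used threshold, assumed here for the example.

def classify_refraction(spherical_equivalent_dioptres: float) -> str:
    """Label a measured refractive error as myopic, hyperopic, or normal."""
    if spherical_equivalent_dioptres <= -0.5:
        return "short-sighted (myopic)"
    if spherical_equivalent_dioptres >= 0.5:
        return "long-sighted (hyperopic)"
    return "within normal range (emmetropic)"

for measurement in (-3.25, 0.0, 1.75):
    print(measurement, "D ->", classify_refraction(measurement))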
iphone  eye  remote 
november 2014 by charlesarthur
