Some further thoughts on the Google Maps episode I described yesterday.
It strikes me that AI in general will have problems with "black swan" events. That's true almost by definition--if an AI is trained on a big database of past information, it can't be trained to handle events which aren't reflected in that database. In some cases, like autonomous cars, the algorithms can be set to do nothing when they hit something unrecognizable. It would be similar to the "dead man's switch" found on locomotives--if there's no longer an intelligence at the controls which can react properly to events, you stop the engine. But in other domains it may be harder to define such a safe default.
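To make the idea concrete, here is a toy sketch of what that kind of fallback might look like. Everything in it is made up for illustration (the function names, the threshold, the "stop" action); it just shows the pattern of falling back to a do-nothing action when the model's confidence suggests the input wasn't in its training data:

```python
# A minimal sketch of the "dead man's switch" idea for an AI system:
# when the model's confidence in the current input falls below a
# threshold (a rough proxy for "this wasn't in the training data"),
# fall back to a safe default instead of acting on the prediction.
# All names here are hypothetical, not any real system's API.

CONFIDENCE_THRESHOLD = 0.9
SAFE_STOP = "stop"  # the do-nothing / halt-the-engine action


def predict_with_confidence(observation):
    """Stand-in for a trained model; returns (action, confidence)."""
    # In a real system this would be a model inference call.
    return "proceed", 0.95


def choose_action(observation):
    action, confidence = predict_with_confidence(observation)
    if confidence < CONFIDENCE_THRESHOLD:
        # The model is effectively no longer "at the controls":
        # treat the event as unseen and stop rather than guess.
        return SAFE_STOP
    return action


print(choose_action({"sensor": "unfamiliar reading"}))
```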
In cases like the one I encountered yesterday, there may be salvation in Moore's Law. Presumably Google samples current traffic volumes using some priority rules--sampling the most-traveled routes most often. But priorities are needed only when resources are scarce; as technology becomes cheaper it would be possible to sample everything all the time (which is close to what the human sensory system does).
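Again, a toy sketch of what I'm guessing at--I have no idea how Google actually does it, and the route names and numbers here are invented--but priority sampling under a budget might look something like this:

```python
# A toy sketch of priority-based sampling under a fixed budget:
# routes with heavier traffic get polled more often. The routes,
# volumes, and budget are made up for illustration.

route_volumes = {
    "I-95": 10000,
    "US-1": 3000,
    "county road": 200,
}

SAMPLE_BUDGET = 100  # total traffic probes available per interval


def allocate_samples(volumes, budget):
    """Split the sampling budget in proportion to traffic volume."""
    total = sum(volumes.values())
    return {route: round(budget * v / total) for route, v in volumes.items()}


print(allocate_samples(route_volumes, SAMPLE_BUDGET))
# {'I-95': 76, 'US-1': 23, 'county road': 2}
```

The point of the sketch is the last step: as the budget grows large enough (cheap sensors, cheap computation), every route gets sampled every interval and the priority rule becomes moot--the low-traffic back road where a surprise is brewing gets watched just as closely as the interstate.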