In 1997, IBM’s supercomputer Deep Blue became the first machine to beat a reigning world chess champion in a match, defeating Garry Kasparov 3.5–2.5. Kasparov had won their first match in 1996, but the rematch was a significant milestone.
These days a grandmaster can only occasionally beat a chess engine like Komodo or Stockfish running on an ordinary laptop. The gap between such engines and human players has been likened to the difference between Magnus Carlsen, the reigning world champion, and a good club player.
In the more complex game of Go, the significance of some moves played by the self-taught AI AlphaGo Zero remains opaque to analysts until the game has played out.
Though such games are played in small, well-defined domains (e.g. a chequered board), they seem to be the playgrounds where AI is growing up. Aged 19, Demis Hassabis, the founder of DeepMind (the company that developed AlphaZero), led development of Theme Park, an early simulation game.
“I do not see why [the computer] should not enter any one of the fields normally covered by the human intellect, and eventually compete on equal terms. I do not think you can even draw the line about sonnets, though the comparison is perhaps a little bit unfair because a sonnet written by a machine will be better appreciated by another machine.” — Alan Turing, London Times, 1949
AI is here to stay. Google has invested heavily in the technology and sits on vast amounts of data (estimated at 10–15 exabytes). Its Assistant, included with Android phones, tablets and laptops, connects to cloud-based AI for “federated learning” and is surely already better at predicting online behaviour than humans are. Computerised diagnostic health systems already outperform skilled clinicians in evaluating retinal scans (for diabetic retinopathy) and mammograms.
Such technology looks set to revolutionise everything, including web development.
Traditionally, testing a page’s compliance with accessibility standards has meant submitting its URL to an online validation tool. Now, with the freemium version of Google Analytics established as the first choice for measuring the performance of small and medium-sized websites, sophisticated algorithms combine with big data to detect and diagnose usability issues: which text on a page was overlooked, for example, or the behaviour that preceded online orders being submitted or abandoned.
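To give a flavour of what an automated accessibility check actually does, here is a toy sketch in Python using only the standard library. It implements one hypothetical rule of the kind a real validator would apply (among hundreds): flagging images that lack alternative text, a common accessibility failure. It is an illustration of the idea, not the logic of any particular validation tool.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> tag missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # An absent or empty alt attribute fails this (simplified) rule.
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Example page fragment: the first image passes, the second fails.
page = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing_alt)  # → ['chart.png']
```

A production tool would fetch the page by URL, apply many such rules from the relevant standard, and report each failure with its location; the pattern of parsing the markup and testing each element against a rule set is the same.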
Statistics show what is happening, but not necessarily why. Traditionally, the “why” has been found by asking and observing people, often in smaller studies (qualitative research). But perhaps AI will shortly provide both qualitative and quantitative information by automatically analysing billions of interactions, benchmarking and comparing similar sites, discovering behavioural patterns, and tracking both positive and negative outcomes as a matter of course.