My favourite analogy about artificial intelligence is this: Alfred Nobel invented dynamite, which let us improve mining, build roads and tunnel through mountains. Dynamite improved our lives. But dynamite was also used as a weapon. That’s why we must be responsible when it comes to AI. Artificial intelligence will allow us to have better lives, but it does require some vigilance.
So what is artificial intelligence? Generally, it’s about making computers perform actions that would be considered intelligent if a person carried them out. The term “artificial intelligence” was coined in 1956, at a summer workshop at Dartmouth College in New Hampshire, and we have clearly come a long way since then, especially in recent years.
At its core, AI is a combination of statistics (data analysis), computer science, software engineering and operations research – and it is data that is driving the recent growth of artificial intelligence.
While companies and researchers have always collected a lot of information, the concept of “Big Data” is relatively new. With the advent of companies like Facebook, Google, Amazon and others, huge amounts of data are being collected. Companies in the fintech sector (financial technologies) were among the first to successfully use AI, since they had access to tremendous amounts of data: transactions, loyalty programs and consumer habits.
Those vast amounts of data allow us, in turn, to train our algorithms and obtain accurate models. The more data we have access to, the more accurate our AI models become.
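To see the idea in miniature: the sketch below (illustrative only, not from the article) fits a simple linear model to noisy samples and watches the estimation error shrink as the sample size grows. The true slope, noise level and sample sizes are arbitrary choices for the demonstration.

```python
import numpy as np

# Estimate a linear relationship y = 3x + noise from samples of
# increasing size; more data generally means a more accurate estimate.
rng = np.random.default_rng(42)
true_slope = 3.0

errors = []
for n in [10, 100, 1000, 10000]:
    x = rng.uniform(0, 1, size=n)
    y = true_slope * x + rng.normal(0, 1, size=n)
    est = (x @ y) / (x @ x)          # least-squares slope estimate
    errors.append(abs(est - true_slope))

print(errors)  # the error generally shrinks as n grows
```

The same principle scales up: the models behind modern AI have vastly more parameters, so they need vastly more data to pin those parameters down.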
Most of these algorithms use artificial neural networks, which are loosely inspired by the human brain. Training these networks on data – a form of machine learning – lets us explore increasingly complex models and problems. The bigger and more layered the neural network – an approach called deep learning – the more complex the problems we can address, such as speech recognition and natural language understanding.
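As a minimal sketch of what “training a neural network” means, the example below builds a tiny network with one hidden layer and trains it on the classic XOR problem. The layer sizes, learning rate and iteration count are illustrative choices, not values from the article; real deep-learning systems have millions or billions of parameters and use specialised libraries, but the mechanics are the same: a forward pass, an error, and gradient updates to the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: output is 1 exactly when the two inputs differ.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(2000):
    # Forward pass: compute the network's predictions.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the squared error, layer by layer.
    d_p = (p - y) * p * (1 - p)
    d_h = (d_p @ W2.T) * h * (1 - h)
    # Update the weights a small step against the gradient.
    W2 -= lr * h.T @ d_p
    b2 -= lr * d_p.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(p.ravel(), 2))  # predictions should approach 0, 1, 1, 0
```

“Deeper” networks simply stack more hidden layers like `h`, which is what allows them to model richer patterns than this toy example.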
But with the amount of data we use in AI comes an ethical component. All that data is yours. And mine, and everyone’s. We authorize many applications to access our information: our address books, our finances, our jobs – virtually every aspect of our lives. But we also need to be careful with the data we share. The trade-off is that if we want more options and better services, we need to share our data; how that data is then used requires some caution on our part. Both industry and governments are taking these ethical issues seriously and formulating strategies to address them.