Quantum computing, AI and machine learning are dramatically changing the face of computing, and bring with them the risk of a range of unintended consequences.
This is according to Cliff de Wit, former chief innovation officer at Microsoft and now CTO and co-founder of Metrofile group’s Dexterity Digital, who was speaking at the opening of the annual DevConf developers’ conference in Johannesburg recently.
De Wit said developers had moved from low-level building blocks to composition tools and high-end platforms, that quantum computing was fundamentally changing computing, and that AI and machine learning were reversing traditional programming models.
“Machine learning starts with a model, you feed in a sample output and then you train the programme to deliver the output – so you never wrote the programme, you trained it,” he said.
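The reversal De Wit describes can be sketched in a few lines of Python. This is a toy illustration (not anything De Wit presented): the rule y = 2x + 1 is never written into the program; only example inputs and outputs are supplied, and gradient descent adjusts the parameters until the model reproduces them.

```python
# Toy sketch of "training instead of writing a programme": the rule
# y = 2*x + 1 appears only in the training data, never in the code.
examples = [(x, 2 * x + 1) for x in range(10)]  # sample inputs and outputs

w, b = 0.0, 0.0          # model parameters, initially arbitrary
lr = 0.01                # learning rate
for _ in range(5000):    # training loop
    for x, y in examples:
        err = (w * x + b) - y   # how wrong the current model is
        w -= lr * err * x       # nudge parameters toward the examples
        b -= lr * err

# After training, w and b have converged close to 2 and 1:
# the "programme" was learnt from data, not written.
print(round(w, 2), round(b, 2))
```

As the quote suggests, the developer specifies the desired outputs and the training procedure, but never the rule itself, which is exactly why the learnt behaviour needs ongoing monitoring.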
This could have unintended consequences, as has been illustrated by AI tools that developed racial or gender bias of their own accord.
“In AI and machine learning, the programme is learnt and taught, so it changes over time and this demands continual monitoring,” he added.
De Wit cited cases where developers themselves did not know exactly how their AI and machine learning systems had evolved, and on what they based their decision making.
In one instance, an image recognition algorithm fed images of dogs and wolves persisted in identifying huskies as wolves.
On investigation, it was found that the system had determined that pictures of wolves were always in snowy landscapes, therefore a husky pictured in a snowy landscape had to be a wolf.
“These unintended consequences seem to be particularly prevalent in machine learning, and the responsibility lies with us to focus on the ethics in computer science. Never before has there been so much computing power, or have the people behind it been in such a powerful position. We just got a big loaded gun, and we need to think about what we do with it. We need to think about the bigger picture, ask the why questions and consider whether there could be unintended consequences,” he added.
De Wit noted that software powers systems that determine prison sentencing, access to funding, or who qualifies for university access or home loans, and that the developers behind the software need to understand the impact their software can have.
Formalised AI and machine learning practices and guidelines will take some time to emerge, and in the meantime commercial vendors are driving technology innovation in these spaces.
“At the moment, ethical software development is an industry responsibility, and developers themselves need to take the time to understand the consequences of what they are doing,” he added.
DevConf has grown year on year and this year was the biggest event yet, attracting over 1,100 delegates in Cape Town and Johannesburg.
Experts across five tracks covered the latest trends, tools and challenges in the world of software development, and the event introduced new features such as an ‘unpanel’ area, where developers could discuss trends and pain points in a relaxed, safe space.
Credit: Digital Street SA