Ganduje: the video, the science and the law
By Ibraheem Dooba, Ph.D.
We've heard and read truckloads of debate on the authenticity of the series of videos published by the Daily Nigerian showing Governor Abdullahi Ganduje of Kano State receiving money. Many have argued that the videos are fake; others contend they are real. Even foreign organisations have been cited on the question of their authenticity.
However, beyond the torrents of opinion, no discussion that I know of has delved into the science of how the videos could be real or fake. If something is real, it invites no further discussion. If it is fake, however, it raises the question of how the fakery was done.
That is the gap I want to fill. Since I hold a PhD in information technology and have been a student of machine learning, I intend to use this controversy as a teachable moment on what is possible, and to warn the nation that it could be worse.
Yet it is not the duty of this column to say whether the videos are fake or real. I don't know!
This is also not a comment on Jaafar Jaafar's plucky journalism, which I believe is a breath of fresh air.
So let's go straight to the meat of the matter. There is a new technology called deepfake, which used to be in the hands of only highly skilled experts and university researchers but is now available to the general population.
Deepfake is associated with fancy terms like neural networks, deep learning and artificial intelligence. But make no mistake: deepfake is not a joke.
What deepfake software does is take a handful of videos and pictures of an individual and feed them to a computer. The computer is then instructed to learn everything about the person: the sound of their voice, the look of their face, any particular tics, the colour of their skin, the texture of their skin, and so on.
After that, the person can be made to say anything and do anything, from dancing to performing oral sex.
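For the technically curious, the trick at the heart of those early deepfake tools can be sketched in a few dozen lines. Below is a toy illustration in Python, assuming the PyTorch library: one shared encoder learns what faces look like in general, while two separate decoders each learn to redraw one specific person's face. The random tensors merely stand in for the thousands of real photographs a genuine system would be fed; nothing here refers to any actual tool or video.

```python
# A minimal sketch of the classic deepfake idea (shared encoder,
# one decoder per person). Toy data; real systems train for days
# on thousands of aligned face crops using convolutional networks.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # a flattened 64x64 RGB face crop

def mlp(sizes):
    # small fully connected network standing in for a real conv net
    layers = []
    for a, b in zip(sizes, sizes[1:]):
        layers += [nn.Linear(a, b), nn.ReLU()]
    return nn.Sequential(*layers[:-1])  # no ReLU after the final layer

encoder   = mlp([IMG, 512, 128])   # shared: learns faces in general
decoder_a = mlp([128, 512, IMG])   # learns to redraw person A's face
decoder_b = mlp([128, 512, IMG])   # learns to redraw person B's face

params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

faces_a = torch.rand(32, IMG)      # stand-in for person A's photos
faces_b = torch.rand(32, IMG)      # stand-in for person B's photos

for step in range(1000):           # real training runs far longer
    opt.zero_grad()
    # each decoder learns to reconstruct only its own person
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# the swap: encode person A's frame, decode with B's decoder,
# producing person B's face wearing person A's pose and expression
fake = decoder_b(encoder(faces_a[:1]))
```

The last line is the whole swap: the computer reads person A's pose and expression but repaints them with person B's face, which is how someone's face ends up in a video they never appeared in.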
In April this year, Supasorn Suwajanakorn started his TED Talk with this question: "Look at these images. Now tell me which Obama here is real."
He then showed four videos of President Obama all saying the same thing simultaneously. Seconds later, Suwajanakorn surprised the audience by declaring: "The answer is none of them."
A layman could have sworn that all of them were real. I could have sworn that all of them were real.
Last month, in its Moving Upstream show, the Wall Street Journal did a deep dive into deepfakes and how the technology is being used for good and for ill.
The anchor of the show, Jason Bellini, interviewed experts including Hany Farid, a professor of computer science at Dartmouth College who has been investigating doctored images for two decades. Farid said: "You can literally put into a person's mouth anything you want. If you can change visual images, you can change history. There's something very powerful still about seeing something, about changing people's beliefs."
Now almost anyone can use deepfake tools. "We've democratised access to a very powerful software," Professor Farid said. "That's a game changer."
The technology can be used for good, such as in movies. Something similar has been done before: in Forrest Gump, Tom Hanks was shown shaking hands with President John F. Kennedy. Kennedy was never part of that movie; by the time of production, he was long dead.
In education, we could use it to cast a great teacher who is no longer available to teach a subject in any language. For instance, besides being a Nobel laureate, Richard Feynman was regarded as the best teacher in America. Now that he is dead, you could cast him to teach physics to any audience.
We could get the Sardauna to talk to our contemporary leaders in the north about good leadership. Or we could get Chinua Achebe to teach literature.
You could use it to listen "in person" to the soothing words of advice of long-gone grandparents.
It can also be used for ill. In fact, it is already being used for pornography. Noelle Martin from Australia was 18 years old when she fell victim to what is called morph porn, in which the face of an innocent person is put onto the body of a porn star.
"I just can't explain the level of violation and shame that I felt." Martin said.
In the six years since, she has continued to see fake videos of herself in various pornographic poses, though she does not know who is responsible. "Now I don't even want to look," she told the Wall Street Journal, "because I know there are new ones."
Celebrities were the main targets at first; Angelina Jolie and Emma Watson (of Harry Potter) have both fallen victim. But now, it appears, anyone can be a target.
Can it be stopped?
Hany Farid is trying to do just that. Farid understands that the world urgently needs a way to tell deepfakes from real videos. He gave a scary scenario: someone produces a video of President Trump saying he has launched a nuclear attack against North Korea. North Korea sees it and, before anyone realises it is fake, retaliates.
That is why Supasorn Suwajanakorn told the TED conference that he and his colleagues were working on software you could plug into your browser to automatically flag deepfakes. If only we had that software now, we could determine whether the person in the video was actually Governor Ganduje.
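What might such a detector look like inside? Here is a minimal sketch, again in Python with PyTorch, and again only an assumption about the general approach rather than Suwajanakorn's actual plug-in, which is not public. A classifier, trained in reality on large collections of labelled real and fake faces, scores each frame of a clip; the untrained toy network below proves nothing about any particular video, Governor Ganduje's included.

```python
# A hedged sketch of frame-level deepfake detection: score every
# face crop in a clip with a binary classifier, then average.
import torch
import torch.nn as nn

# toy stand-in for a classifier trained on labelled real/fake faces
detector = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1), nn.Sigmoid(),    # P(this frame is fake)
)

def clip_fake_score(frames: torch.Tensor) -> float:
    """Average per-frame fakeness over face crops of shape (N, 3, H, W)."""
    with torch.no_grad():
        scores = detector(frames).squeeze(1)
    # a clip is suspicious if many of its frames score high
    return scores.mean().item()

frames = torch.rand(30, 3, 64, 64)   # stand-in for 30 extracted face crops
print(f"fakeness score: {clip_fake_score(frames):.2f}")
```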
If you asked for my personal opinion, I would say the Ganduje video appears to be real. But then, in the light of what I have just written, it is risky to make that call.
For now, though, the only thing the governor needs to get the charge thrown out of court, if he ever finds himself in court, is this article.
The courts operate on the principle of "beyond reasonable doubt" to convict.
This is the reasonable doubt.