DEEP FAKE VIDEO: THE NEW DANGER

Prime Minister Narendra Modi on Friday expressed serious concern over the emerging threat from "deep fake" videos created through artificial intelligence-powered technology. Addressing a Diwali Milan program with journalists in Delhi, Modi said, "a new crisis is emerging due to deep fakes produced through artificial intelligence. We have a big section of people who do not have the tools to verify their authenticity, and ultimately people end up believing the videos to be genuine. This is going to become a big challenge."

Modi mentioned how he was targeted in a deep fake video that showed him doing the 'garba' dance at a Navratri festival. "They did it very well, but the fact is I have not played garba for ages. I used to play garba when I was a child, and I stopped playing after my school days. Because of this fake video made through artificial intelligence, my followers are forwarding it," the Prime Minister said.

Modi said he had raised the issue with stakeholders in the artificial intelligence industry. "I suggested to them that they should consider tagging AI-generated content which is vulnerable to misuse," he said.

Modi said that while new AI technology is making life easier, it can be dangerous too, as the making of 'deep fake' videos shows. Such videos can ruin lives and damage the social fabric; they can create tension in society, and everyone needs to be on guard, he said. With most Indians using cell phones frequently, any 'deep fake' video can be circulated through social media to millions of people within seconds, and it can cause great harm to society. The issue may appear minor, he said, but the danger is big.

In a 'deep fake' video, a person's face or other parts of the body are superimposed on the face or body of another person, and the result is passed off as genuine. Katrina Kaif, Kajol, Rashmika Mandanna and several other celebrities have been targeted recently through 'deep fake' videos.

In the Madhya Pradesh assembly elections, fake videos were used by political parties, and both Congress and BJP lodged complaints with the Election Commission. On Friday, after voting was over in MP, Chief Minister Shivraj Singh Chouhan alleged that voters were misled through the circulation of fake videos. BJP has lodged at least two dozen complaints with the EC, most of them relating to fake videos. In one such video, Chouhan is shown telling ministers and bureaucrats that "people are angry..BJP can lose by big margins..go to every village and booth….go and manage". The footage was from the last cabinet meeting presided over by the CM, but the superimposed voice was not Chouhan's; it was a voice that nearly matched his. Police had to ask social media platforms to remove the fake video.

Even some popular TV show clips were morphed to mislead voters. In one 'Kaun Banega Crorepati' clip, the entire content was changed by superimposing the voices of the host and the contestant. The purpose was to convey to viewers that Shivraj Singh Chouhan is a 'ghoshna mukhyamantri' (a chief minister of mere announcements). Police have been unable to trace the origin of this video, but it was widely circulated by Congress supporters. Last month, Sony TV had to issue a clarification describing it as fabricated. It said, "we have been alerted to the circulation of an unauthorized video from our show 'Kaun Banega Crorepati'.
This video misleadingly overlays a fabricated voice-over of our host and presents distorted content….we are actively addressing this matter with the cyber crime cell. We strongly condemn such misinformation and urge our audience to be vigilant and refrain from sharing unverified content."

Bollywood actor Kartik Aaryan was also targeted in a morphed video posted by a Congress supporter sporting a blue-tick mark on X. In the fake video, Aaryan was shown endorsing the Congress party in the MP elections. The original video was a promotional ad for Disney+ Hotstar; this was morphed into a Congress election campaign ad, complete with the Congress 'hand' election symbol. Kartik Aaryan had to clarify on X, saying, "This is the REAL AD @DisneyPlusHS Rest all is Fake."

The fake and 'deep fake' video problem has now become serious, and artificial intelligence tools have multiplied the risks. The popularity of a megastar like Amitabh Bachchan was misused through a fake video to defame Chief Minister Chouhan, and the credibility of a big show like KBC was exploited. This is a dangerous trend. By the time the complaints reached the Election Commission and FIRs were lodged, much damage had already been done.

These are only a few examples. Most of us get fake and morphed videos on our cell phones almost daily. Most people do not take them seriously, but those who do have no tools to test their credibility. Personally, I get five to six videos daily from people asking whether they are fake or genuine. India TV has a team which verifies such videos, but the internet is an ocean with its net spread far and wide; it is next to impossible to verify every video.

Moreover, until two or three years ago, it took two or three days to make a fake video, because it required high-definition footage. Now, software is available that can prepare a fake video in four to five hours, and it has become easier to superimpose a fake voice that closely matches the lip sync. A fake video can damage a person's image in a matter of hours. It can incite violence and tension in a community. We should understand the danger involved.

There is one more danger. If a leader is caught taking a bribe or violating laws in a genuine video, he or she can easily claim that the video is fake and challenge people to conduct a forensic test. It takes months for a forensic report to arrive, and even when it comes, questions are raised. One has to understand the risks involved.

Creating awareness among people with the help of cyber experts is essential. We should all be vigilant and refrain from forwarding doubtful videos without verifying them. As Prime Minister Modi suggested, tagging such content with an 'AI-generated' mark should be the first step. As information technology advances, we must become more alert. The responsibility of the media is greater: we should keep people informed, and instead of lending credibility to AI tools such as ChatGPT, we should call out such videos as fake. It is our collective responsibility not to allow the misuse of technology. We must ensure that no person's image is tarnished by morphed or 'deep fake' videos.

I would also like to caution viewers. I have received complaints that some people are selling medicines using my picture, some are promising employment in media using my images, and somebody even sent fake phone messages in my name. My office has lodged complaints with the police in all such matters.
I would ask all of you to remain alert and vigilant. Do not trust fake ads, and inform India TV, if required, whenever you see them. Verify all such videos and ads. Trust only those messages, posts and videos that are posted on my handle or on India TV's official handle; they are verified and you can trust them.