After NSE, BSE cautions against deepfake videos of its chief recommending stocks

NEW DELHI, Apr 18 : After the National Stock Exchange (NSE), rival bourse BSE on Thursday cautioned investors against deepfake videos of its MD and CEO Sundararaman Ramamurthy giving stock recommendations.
In a statement, the exchange said it has noticed that fake, unauthorised and fraudulent videos and audios, created using sophisticated technology to impersonate BSE's top executive, are being circulated on social media recommending certain investments and advisory services in stocks.
BSE clarified that its managing director and chief executive officer does not initiate or endorse any such communication through Facebook or any other social media platform.
The exchange also asked investors not to trust such videos and audios, and not to act on fake recommendations or unsolicited communication circulated through deceptive means impersonating Ramamurthy.
Further, the exchange said it will initiate all possible steps to prevent misrepresentation by unknown elements.
“In the meantime, investors/public are urged not to join any group on social media platforms impersonating BSE or its officials and also not rely on any stock/share recommendation. BSE also advises investors/public to exercise caution and not to engage or re-circulate such fraudulent messages and not to share any personal and/or confidential information, financial or otherwise,” the exchange said.
According to BSE, any official communication is made only through its official website and the exchange's social media handles. Further, the exchange has asked investors to verify the source of any communication before making their decisions.
It may be noted that BSE’s employees are not authorised to recommend any stock or deal in those stocks.
On April 10, NSE issued a cautionary statement against deepfake videos of its MD and CEO Ashishkumar Chauhan giving stock recommendations.
Deepfakes are manipulated videos or other digital representations that use artificial intelligence to create convincing videos or audio of individuals doing or saying things they never did, posing a risk of spreading misinformation and damaging their reputation. (PTI)