User privacy not something for companies to speculate on
By ZHANG ZHOUXIANG | China Daily | Updated: 2023-11-23 08:14
WPS Office, a Chinese office software suite, incurred users' wrath after it introduced an artificial intelligence module last week, announcing that it would use users' documents for AI training.
After users complained about the breach of privacy, Kingsoft, the developer of the software, issued an apology through its official Sina Weibo account, saying its wording had been vague and that users' documents would not be used for AI training or any other purpose without their consent.
The company must keep its word or face the consequences of violating the law on personal information protection. The case also exposes the challenge of protecting users' information from AI training.
The fact is that companies' ever-larger large language models have been absorbing data so that they learn to behave more like humans. LLMs will drive innovation in the AI tools we use, but this should not come at the cost of users, who have the right to keep their documents private. Unless a user clearly authorizes the software to use his or her documents for AI training, those documents should be considered private and never be used for that purpose.
Embedding the authorization in the long "agreement" that often pops up on computers or smartphone apps, seeking permission from users before proceeding, is not an effective defense for any company, because most users click the "accept" button without actually reading through the lengthy text. Some companies have lost lawsuits in this regard.
A temporary regulation on generative AI services that came into effect in July made it clear that AI developers must obtain approval from users before using any personal information.
Given this, software developers cannot take users for granted. The case of WPS Office serves as a lesson to all similar companies: AI is there to serve people, not exploit them. Training AI can never be an excuse for violating users' privacy or rights.