AI minister 'disappointed' by OpenAI's response on safety after Tumbler Ridge shooting
OTTAWA — Canada’s Artificial Intelligence Minister Evan Solomon says that “of course a failure” occurred when OpenAI didn’t alert police about information flagged internally about the Tumbler Ridge shooter’s activities on its chatbot.
Solomon made the statement after meeting with members of the company’s safety team on Tuesday evening.
Speaking to reporters on Wednesday morning, Solomon said he would not comment on details of the case, but said he left disappointed that OpenAI did not present any proposals to enhance safety, and that he expects the company to return with more solutions.
“Of course a failure occurred here … we want to ensure that this does not happen again,” he said on his way into the Liberals’ weekly caucus meeting.
“We were really disturbed by the reports that there might have been an opportunity to escalate this to law enforcement further, and we want to make sure if any company has that opportunity, they would escalate.”
The meeting between Solomon, his federal colleagues responsible for justice and public safety, and company representatives followed a report from the Wall Street Journal that a ChatGPT account linked to Jesse Van Rootselaar had been flagged last June for activities that violated the company’s policies.
While details of what Van Rootselaar exchanged with the chatbot have not been divulged, the company confirmed in an email last week that it had considered alerting Canadian police to what its internal detection systems had flagged, but ultimately did not after determining the activity did not meet its internal threshold to warrant a warning.
The company confirmed that it had informed the RCMP about Van Rootselaar’s activities after the shooting.
Mounties in B.C. say Van Rootselaar, who died from a self-inflicted injury, entered Tumbler Ridge Secondary School on Feb. 10, killing eight people, most of them children, and injuring others. Among the dead were the shooter’s mother and half-brother, who were found in the family’s home.
The tragedy is one of the worst mass shootings in Canadian history.
Prime Minister Mark Carney said on Wednesday that he had not yet been briefed by the ministers who attended the recent meeting with OpenAI.
“Obviously, anything that anyone could have done to prevent that tragedy or future tragedies, must be done,” Carney said.
Earlier in the week, Solomon said he had requested the meeting with OpenAI to discuss the company’s safety policies and its thresholds for escalating matters.
“We expected them when they came to not only give us details about their escalation thresholds and their safety protocols, but we expected them to come with some concrete solutions so Canadians can feel comfortable that this kind of tragedy may be avoided,” he said on Wednesday.
“We are disappointed that they did not provide any concrete proposals.”
OpenAI has not yet responded to a request for comment about the company’s meeting with the ministers.
Justice Minister Sean Fraser said that “trust is going to be earned” by companies like OpenAI and that it depends on what changes the company adopts.
“The message that we delivered, in no uncertain terms, was that we have an expectation that there are going to be changes implemented, and if they’re not forthcoming very quickly, the government’s going to be making changes.”
Solomon has said the government was open to looking at all options.
The incident with OpenAI comes as the Carney government considers how to introduce measures to better protect Canadians, and in particular children, when it comes to online safety.
That effort is expected to be handled by Canadian Heritage Minister Marc Miller, who also attended Tuesday’s meeting with OpenAI.
National Post