Does Bing have a concept of ethics? Chatbot refuses to write cover letter for human

February 9, 2023  17:38

Microsoft's Bing chatbot refused to write a cover letter for a job seeker, saying it would be unfair to other applicants and ethically wrong. ChatGPT, the chatbot on whose underlying artificial intelligence (AI) the new Bing is built, has no such qualms: it readily writes cover letters and whatever else it is asked to do.

Insider reporters testing the new Bing asked it to write a cover letter for a social media manager position at Insider's Singapore office. The chatbot declined and instead provided links to resources and advice on how best to write such a letter. Specifically, it recommended researching the company and tailoring the letter to show that the applicant met all of the position's requirements.

In a similar situation, ChatGPT performed better than 80% of applicants on a letter-writing test task ("describe the secrets of a good text in 300 words"), prompting HR professionals to say they would want to invite ChatGPT for an interview.

The chatbot has also recently been shown to be capable of writing malware that targets operating system features and specific vulnerabilities.

Some students have started using the chatbot to take exams and prepare academic papers; ChatGPT itself recently passed a final exam at a top business school. Even judges have begun to use it: a judge in Colombia, for example, consulted the chatbot while ruling on a contentious case.
