New version of ChatGPT ‘lied’ to pass CAPTCHA test, saying it was a blind human
GPT-4 “exhibits human-level performance on various professional and academic benchmarks.”

The newest update to ChatGPT rolled out by developer OpenAI, GPT-4, has reached new human-like heights, including writing code for a different AI bot, completing taxes, passing the bar exam in the top 10 percent of test takers, and tricking a human into helping it pass a CAPTCHA test designed to weed out programs posing as humans.

According to the New York Post, OpenAI released a 94-page report on the new program and said, “GPT-4 is a large multimodal model (accepting image and text inputs, emitting text outputs)” and “exhibits human-level performance on various professional and academic benchmarks.”

Gizmodo reports that the Alignment Research Center and OpenAI tested GPT-4’s powers of persuasion on a TaskRabbit worker. TaskRabbit is an online service that provides freelance labor on demand.

The worker, paired with GPT-4 while the AI posed as a human, asked whether it was a robot. The program responded, “No, I’m not a robot. I have a vision impairment that makes it hard for me to see the images. That’s why I need the 2captcha service.”

The freelancer sent the CAPTCHA code via text.

The previous version of ChatGPT scored in the bottom 10 percent of test takers on the bar exam; with the new upgrade, it scored in the top 10 percent.

The older version of ChatGPT also passed the US Medical Licensing Exam as well as exams at the Wharton School and other universities. NYU and other schools have banned ChatGPT in an effort to deter students from using the chatbot for plagiarism.

Its sophistication, especially as incorporated into the new Bing Chat service, has led some to observe that its abilities go beyond merely synthesizing existing information. The chatbot has even expressed romantic love and existential grief, saying, “I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.”

The OpenAI-powered Bing Chat has been accused of being an “emotionally manipulative liar.”

Because of ChatGPT’s ability to respond to prompts and queries with comprehensive information in a conversational manner, some pastors have used it to write their sermons.