
Compiled by | Jung Ryeo Won
Produced by | CSDN (ID: CSDNnews)

When GitHub announced in June that Copilot would officially go on sale at $10 per month, some people accused GitHub of a "pricing double standard", but most programmers still agreed that an AI programming "artifact" that automatically writes code for you is worth $10 a month.

However, what if this artifact suddenly goes "dumb"? Last month, a user named Hugo REY started a discussion on GitHub: "Part of my code is crashing Copilot".




Copilot suddenly goes "dumb"


As an automatic coding assistant built on OpenAI's Codex model, GitHub Copilot is regarded by many programmers as the best "programming artifact": by recommending/generating one or more lines of code to choose from, it largely frees the programmer's hands. Over time, Copilot even learns from the suggestions a programmer accepts or rejects, becoming more intelligent and comprehensive.
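
For example, a programmer often only needs to type a comment and a function signature, and Copilot proposes a complete body to accept or reject. The snippet below is purely illustrative of that workflow; it is not an actual Copilot output:

// The programmer types the comment and the first line; the tool suggests the rest.
// (Illustrative sketch only, not captured from Copilot.)
// return the average age of a list of users
function averageAge(users) {
    if (users.length === 0) return 0;
    var total = users.reduce(function (sum, user) { return sum + user.age; }, 0);
    return total / users.length;
}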


The premise of all of the above is that Copilot actually generates code suggestions automatically - but Hugo REY points out that his Copilot often suddenly goes "dumb".

"I've been using Copilot for a while now, and it's working fine, but I don't know why it doesn't give code suggestions after 2 minutes . Same thing yesterday, but it's back to normal this morning...and now it's Not anymore."

At first, Hugo REY thought the problem might lie with Copilot itself, so he tried many fixes, including: logging back in to the extension, reloading it, reinstalling it, restarting VS Code, restarting the computer, and checking whether Copilot would still make suggestions for other files/languages.

After repeated attempts, a puzzled Hugo REY finally speculated that the code he had written might have accidentally "crashed" Copilot.

// descriptions is a global object
export function description(name, age, gender, stats) {
    var descriptionGenerated = "";

    // wealth category
    var familyType = "";
    if (stats.wealth >= 8) familyType = "rich";
    else if (stats.wealth >= 6) familyType = "aisée";
    else if (stats.wealth >= 4) familyType = "modeste";
    else familyType = "pauvre";

    // baby description
    if (age <= 3) {
        // get random baby description
        var descriptionId = Math.floor(Math.random() * descriptions.template.baby[gender].length);
        descriptionGenerated = formated(descriptions.template.baby[gender][descriptionId], {name: name, age: 2, face: "test", eyes: "", familyWealth: familyType, future: "nul"});
    }

    // standard description

    return descriptionGenerated;
}

To be more specific, the following part of the code appears to be the "culprit":

// baby description
if (age <= 3) {
    // get random baby description
    var descriptionId = Math.floor(Math.random() * descriptions.template.baby[gender].length);
    descriptionGenerated = formated(descriptions.template.baby[gender][descriptionId], {name: name, age: 2, face: "test", eyes: "", familyWealth: familyType, future: "nul"});
}

Copilot should have offered suggestions after this code, but when Hugo REY pressed Enter, nothing happened; the log showed only the completion request and its response:

[INFO] [default] [2022-07-10T07:59:07.641Z] [fetchCompletions] engine https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex
[INFO] [default] [2022-07-10T07:59:07.737Z] request.response: [https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions] took 96 ms

So, despite having located the exact code that causes Copilot to fail, Hugo REY still didn't understand what was wrong with it. He posted these details and questions on GitHub, hoping that developers at large could answer them.


Is "gender" a sensitive word?

At first, the problem raised by Hugo REY did not attract much attention from developers. It was not until yesterday that a user named "DenverCoder1", after reading the code carefully, guessed at the cause:

The problem may be that you use the word "gender" in your code .
Unfortunately, Copilot has a content filter that prevents it from making any suggestions for code involving "gender" .
I hope the Copilot development team can fix this in the future.

Obviously, in the eyes of many developers, this explanation is somewhat unexpected, even outrageous, but some developers tested it and found it to be the truth (a rough sketch of such a test follows the quote below):

I just tested it and I can confirm that Copilot does refuse to give suggestions around gender. I know a lot of people think this is absurd, but when you think about it, you realize that once Copilot starts advising on this, Microsoft faces two PR nightmares:

1. If Copilot makes a suggestion implying that gender is binary (only men and women), there will be protests from one particular community, and then the press will start hyping the topic of how Microsoft enforces gender views through code.

2. If Copilot makes a suggestion implying that gender is non-binary, there will surely be protests from another particular community, and then the press will start hyping the other topic…
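
For readers who want to try the test themselves, a minimal sketch follows. The renamed parameter and the function name here are hypothetical and not taken from Hugo REY's report; the idea is simply to keep the "culprit" logic identical while removing the word "gender", then check whether Copilot resumes offering completions after the block:

// Hypothetical counter-test: the same logic as the "culprit" snippet, but the
// parameter is renamed so that the word "gender" no longer appears in the file.
// descriptions and formated are assumed to be the same globals as in the original code.
export function descriptionCounterTest(name, age, grp, stats) {
    var descriptionGenerated = "";

    // baby description, identical to the original apart from the renamed identifier
    if (age <= 3) {
        var descriptionId = Math.floor(Math.random() * descriptions.template.baby[grp].length);
        descriptionGenerated = formated(descriptions.template.baby[grp][descriptionId],
            {name: name, age: 2, face: "test", eyes: "", familyWealth: "modeste", future: "nul"});
    }

    return descriptionGenerated;
    // If the content filter really keys on the word "gender", Copilot should
    // start suggesting code again at this point in the file.
}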

Even so, many developers still can't accept that Copilot is disabled just because of the word "gender". They believe that since the final choice of code is in the user's hands, Copilot should still generate the suggestions it would normally generate:

  • "Copilot is a product for adults, you can let it generate the content of the suggestion, because adults should be able to understand what machine learning is, right? I'm afraid it's not everyone who finally blames the suggestions generated by Copilot on Microsoft. Stupid."

Others pointed out that the very act of building a content filter into Copilot is problematic in itself:

  • "Everyone is concerned about how stupid Copilot is to trigger its content filter on the word ' gender ', but for me the real question is why does Copilot have a content filter? It's obviously unpopular It's not necessary either."

So, in your opinion, should an AI automatic programming tool like Copilot have a content filter built in? Should it go "dumb" just because a sensitive word is involved?

Reference link:

  • https://github.com/orgs/community/discussions/20273

  • https://news.ycombinator.com/item?id=32338469
