No ChatGPT In The FAA


The FAA has issued a statement saying it does not use ChatGPT to help write computer code in its Air Traffic Organization (ATO) or in any other systems. The agency hastily released the statement after the Department of Transportation published a list of AI use throughout its organization. The list is required for compliance with an executive order mandating that agencies enumerate any and all AI use in government reports. The inclusion of three uses of ChatGPT within the ATO was reported by FedScoop and resulted in a quick response from the agency.

“The FAA does NOT use ChatGPT in any of its systems, including air traffic systems,” the FAA told FedScoop. “The entry was made in error and has been updated.” The DOT list said the FAA used ChatGPT for something called “Automated Delay Detection using voice processing” and to classify incident reports. But perhaps the most alarming reference was the DOT’s assertion that the FAA used ChatGPT to help write computer code for the ATO. FedScoop says ChatGPT is notoriously bad at writing code, quoting an Australian expert on the subject as saying it often produces “buggy” and insecure code. The DOT has since removed the references from its list of AI applications.

Russ Niles
Russ Niles is Editor-in-Chief of AVweb. He has been a pilot for 30 years and joined AVweb 22 years ago. He and his wife Marni live in southern British Columbia where they also operate a small winery.


2 COMMENTS

  1. ChatGPT is NOT an AI. It doesn’t understand anything. It’s incapable of reasoning. All it does is string together the most likely words in response to a prompt. It has no problem making up false information, because it has no mechanism for determining what is true. It’s literally a “plausible-sounding BS generator.”

    Of course, that won’t stop people from using it anyway. Should I spend all day writing a government report that no one will read, or just cheat, use the AI, and take the rest of the day off?
