

1000%. If they can’t even figure out how dates work in COBOL, we are getting a vibe-coded SSA. Let’s hope they trained LLMs on COBOL, or we are cooked.
Pen tester here. While I don’t focus on LLMs, it would be trivial in the right AI-designed app. In a tool-assist app without a human in the loop, it’s as simple as adding something like this to any input field:

`&& [whatever command you want] ;`
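To make that concrete, here’s a minimal sketch of the vulnerable pattern, assuming a hypothetical tool-assist handler that passes model/user input straight to a shell (the function name and command are made up for illustration):

```python
import subprocess

def run_tool(user_input: str) -> str:
    """Hypothetical tool handler that builds a shell command from raw input.

    Because the whole string is handed to the shell, an input like
    'report.txt && rm -rf ~' runs the attacker's command too.
    """
    # VULNERABLE: raw input is concatenated into a shell string
    result = subprocess.run(
        f"cat {user_input}",
        shell=True,          # the shell interprets &&, ;, |, etc.
        capture_output=True,
        text=True,
    )
    return result.stdout
```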
If you wanted to poison the actual training set, I’m sure it would be trivial. It might take a while to gain enough respect to get a PR accepted, but remember we only caught the upstream attack on SSH (the xz backdoor) because one guy could feel the extra milliseconds in an SSH login session. Given how new the field is, I don’t think we have developed strong enough autism to catch this kind of thing the way we did with SSH.
Unless vibe coders are specifically prompting ChatGPT for input sanitization, validation, and secure coding practices, a large portion of the design patterns these LLMs spit out are also vulnerable. For contrast, a sketch of the safer pattern is below.
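A minimal sketch of what that secure-coding prompting would need to produce, assuming the same hypothetical handler; the allow-list is invented for illustration:

```python
import subprocess

ALLOWED_FILES = {"report.txt", "summary.txt"}  # hypothetical allow-list

def run_tool_safe(user_input: str) -> str:
    """Validates input and avoids the shell entirely."""
    # Validate against an allow-list instead of trusting the input
    if user_input not in ALLOWED_FILES:
        raise ValueError(f"rejected input: {user_input!r}")
    # Pass arguments as a list: no shell, so && and ; are just bytes
    result = subprocess.run(
        ["cat", user_input],
        capture_output=True,
        text=True,
    )
    return result.stdout
```

The key design choice is passing the command as an argument list with no `shell=True`, so shell metacharacters in the input are never interpreted.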
Really, the whole tech field is just a nightmare waiting to happen, though.
Which FAANG company are you a sr. engineer at?
This is the kind of reasoned response I am on Lemmy for. I was firmly in OP’s camp and almost didn’t read your reply. I read it, and you convinced me.
Great point about total sales volume!