posted by AI_ML_Expert +8 / -1

I've seen lots of articles popping up about ChatGPT, its use in Bing, Google talking about militarizing AI, etc.

AI == training a model and using it within a product. The product is optimized for some sort of utility. The model itself is optimized for some loss metric. The data used for the base model was likely selected intentionally. Others will then do another training run on additional data to bias it toward their own utility (rough sketch below).
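To make that concrete, here's a minimal sketch of the two-stage idea: a base model gets optimized against a loss metric, then someone downstream runs more training on their own hand-picked data to push it toward their product's goals. The model, data, and numbers here are all made up for illustration.

```python
import torch
import torch.nn as nn

# Stand-in "base model"; in reality this would be a large pretrained network.
base_model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))

# Hypothetical additional data, intentionally selected by the downstream party.
extra_inputs = torch.randn(64, 16)
extra_labels = torch.randint(0, 2, (64,))

loss_fn = nn.CrossEntropyLoss()  # the loss metric the model is optimized for
optimizer = torch.optim.SGD(base_model.parameters(), lr=0.01)

# Second training run ("fine-tuning"): same optimization loop, different data,
# which is exactly how the model gets biased toward a new utility.
for _ in range(10):
    optimizer.zero_grad()
    logits = base_model(extra_inputs)
    loss = loss_fn(logits, extra_labels)
    loss.backward()
    optimizer.step()
```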

AI in military applications is scary. Regular schmos think AI is more advanced than it actually is, and they also think AI needs to be super advanced to fuck shit up. Both assumptions are wrong.

Black Mirror did an episode about a rogue dog-like machine that basically killed everyone. Not super advanced.

Side note: about two years ago I had a vivid dream of farmers getting wasted by AI drone weapons.