When I see this sort of thing, I immediately remember something I learned from discourse analysis: look at what is said and what is not said.

OpenAI knows that military and warfare work is profitable and unpopular. So how do you profit from it without getting the associated bad reputation ("OpenAI has blood on its hands!")? Do it as quietly as possible, and cover it with an explanation that the change makes things "clearer" for you.