OpenAI Raises Eyebrows After Allowing Its Technology To Be Used For Military And Warfare Applications
Tech giant OpenAI has shown that it can make unexpected changes without announcing them beforehand, and its recent decision to allow military applications of its technology is a perfect example.
ChatGPT’s parent firm just confirmed the news, which is surprising given that its previous usage policy explicitly banned such use. The company had long shunned the use of its technology for anything related to military and warfare, so this amounts to a complete 180-degree flip.
The change (as reported by The Intercept) is said to have gone live on January 10, but it is now causing a commotion online because few people saw it coming. While tech giants often make sudden changes to policies that have been in place for years, this one is certainly controversial.
Changes to a policy’s wording happen quite frequently in the world of tech. Products alter and evolve over time, so change is inevitable. Take the company’s own launch of GPT models that could be customized with a monetization strategy in place: that clearly required major revisions to the original policy, and those were made accordingly.
The change to this no-military policy, however, is hardly going to be taken lightly. After all, a statement from OpenAI regarding the update makes clear that this is a new policy, not a revision of the old one.
The entire policy can be seen on the company’s website, and it is indeed a newly published document rather than a rewrite of the previous version. That is what is causing concern among critics and tech experts.
OpenAI knew this change would be major, so its representative offered an explanation: a blanket prohibition on using its technology to develop or deploy weapons remains in place. That prohibition was written separately from the now-removed “military and warfare” category, which makes sense, because the military does far more than build weapons, and the military is not the only party that builds them.
On the brighter side, the technology could be put to beneficial uses, such as summarizing years of documents about a region’s infrastructure to support better planning. But again, the question arises: how much is too much, and where should the line be drawn?
That being said, the complete removal of both “military” and “warfare” from the company’s list of prohibited uses lends weight to the claim that the firm is now open to offering its services to the armed forces. Notably, OpenAI did not dispute that it is open to military applications and military clients.