bartturner t1_jcbhi3u wrote
Reply to comment by wywywywy in [D] What do people think about OpenAI not releasing its research but benefiting from others’ research? Should google meta enforce its patents against them? by [deleted]
Exactly. That is why there should be pushback on OpenAI's behavior.
twilight-actual t1_jcd0wcs wrote
What exactly would that pushback be? Boycott? Post mean things?
About the only thing that could potentially prevent this is if the algorithms that we put into the public domain are protected by a license like the GPL, or something similar.
I haven't been following code releases, so I don't know if that's being done. And to be honest, I doubt most of the information flow is going by code. Rather, it's in the papers.
Is there a way to protect papers with a "GPL"? I honestly doubt it, because at that level we're dealing strictly with ideas. And the only way to protect an idea is to patent it.
Perhaps the community, as a whole, should start patenting all their ideas, and then assigning the patents to a public trust that ensures that any derivative technology is published freely, too, under the same patent type.
VelveteenAmbush t1_jcd6opg wrote
You could patent your algorithm and offer some sort of GPL-like patent license, but no one respects software patents anyway (for good reason IMO) and you'd be viewed as a patent troll if you tried to sue to enforce it.
GPL itself is a copyright license and does you no good if OpenAI is using your ideas but not your code. (Plus you'd actually want AGPL to force code release for an API-gated service, but that's a separate issue.)
Smallpaul t1_jcdnffe wrote
Software patents assigned to a public trust are a different idea than randomly suing people.
It might be set up to only sue companies that are not open.
VelveteenAmbush t1_jcdxc8v wrote
Maybe you're onto something.
I guess the trick is coming up with foundational patents that can't be traced back to a large tech company that would worry about being countersued. Like if you make these inventions at Google and then Google contributes them to the GPL-esque patent enforcer entity, and then that entity starts suing other tech co's, you can bet that those tech co's will start asserting their patents against Google, and Google (anticipating that) likely wouldn't be willing to contribute the patents in the first place.
Also patent litigation is really expensive, and you have to prove damages.
But maybe I'm just reaching to find problems at this point. It's not a crazy idea.
twilight-actual t1_jce1tou wrote
The cat's kinda out of the bag at this point. But a non-profit public trust that acted as a patent store to enforce the public dissemination of any derivative works based on the ideas it holds could make a huge difference ten, twenty years down the road. It would need an initial endowment to get started and would retain a lawyer or two to manage it.
And then, publicize the hell out of it, evangelizing the foundation at every college campus with a CS department. When students have established a new state of the art with ML, they can toss the design to the foundation in addition to arXiv and wherever else they might publish.
Smallpaul t1_jce114b wrote
Just to be clear, I was just elaborating on /u/twilight-actual’s idea.
twilight-actual t1_jcdqi1j wrote
You got it.
bartturner t1_jcetimy wrote
> Post mean things?
Not the terminology I would choose. But yes, publicly point out that they should not be doing this. Public opinion is a very, very powerful tool for getting people to behave.
1998marcom t1_jcf3er8 wrote
Detail note: to the best of my knowledge, given what OpenAI is doing right now with their software, they could very well be using GPL code in their stack without violating any of the GPL's clauses. A stricter license such as the AGPL would, I guess, be needed, since it covers not only shipping software to the customer but also merely running it as a network service.
Single_Ad_2188 t1_jcer0ba wrote
>It seems like the days for open research in AI are gone.
If Google had not released the paper "Attention Is All You Need", then GPT would not have been possible to create.
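For anyone who hasn't read the paper: the core mechanism it introduced, scaled dot-product attention, is small enough to sketch in a few lines. This is just a rough NumPy illustration of the published equation softmax(QK^T / sqrt(d_k))V, not anyone's production code; the names and toy shapes are my own.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core operation from "Attention Is All You Need":
    softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                            # weighted sum of the values

# Toy example: 2 queries attending over 3 key/value pairs
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

Stacks of exactly this operation (multi-headed, plus feed-forward layers) are what GPT-style models are built from.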
bartturner t1_jcet1m2 wrote
Exactly. I do not like that OpenAI looks to be changing the culture of sharing.
existential_one t1_jcgtu4h wrote
I think the culture of publishing has been dying and people will think OpenAI was the one to trigger it, but in reality other companies already started restricting publications. Deepmind being the biggest one.
bartturner t1_jcgu47z wrote
Love how much DeepMind shares with the papers. Same with Google Brain.
To me the issue is OpenAI. What makes it worse is they use breakthroughs from DeepMind, Google Brain and others and then do not share.
We call them filchers.
existential_one t1_jcgur9j wrote
I agree, but what I'm saying is that Deepmind is gonna stop publishing their good stuff. And it's not because of OpenAI.
IMO, ML research papers weren't profitable before; companies benefited from the collective effort, and publishing helped retain talent. But now we're seeing ML models have a huge impact on companies, and single incremental papers can actually improve the bottom line, so all companies are gonna start closing their doors.
bartturner t1_jcgvl8h wrote
> I agree, but what I'm saying is that Deepmind is gonna stop publishing their good stuff. And it's not because of OpenAI.
I do not believe that will happen. But the behavior of OpenAI does not help.
But Google has been more of a leader than a follower so hopefully the crappy behavior by OpenAI does not change anything.
I think the sharing of the research papers was done for a variety of reasons.
First, I fully agree about keeping and retaining talent, which Google understood before others was going to be critical. That is why they were able to get DeepMind for $500 million; it would easily be 20x that today.
But the other reason is data. Nobody has more data than Google and also access to more data.
Google has the most popular website in history, plus the second most popular, and they also have the most popular operating system in history.
So if everyone had access to the same models it still keeps Google in a better position.
But the other reason is Google touches more people than any other company by a wide margin. Google now has 10 different services with over a billion daily active users.
Then the last reason is the hope that no one else gets something Google cannot get. I believe Google's goal from day 1 has always been AGI. That is what search has been about since pretty much day 1.
They worry that someone will figure it out in some basement somewhere. Very unlikely. But possible. If they can help drive a culture of sharing then it is far less likely to happen.