Except they already drew that line long ago, when they started out open-sourcing their papers, models and code.
As soon as they took VC capital, it was hardly 'Open' anymore, was it? Especially now that they are giving excuses for closing off their research.
From the technical paper [0]:
>> Given both the competitive landscape and the safety implications of large-scale models like GPT-4, this report contains no further details about the architecture (including model size), hardware, training compute, dataset construction, training method, or similar.
At this point, they are no better than DeepMind.
[0] https://cdn.openai.com/papers/gpt-4.pdf