>Given both the competitive landscape and the safety implications of large-scale models like GPT-4, this report contains no further details about the architecture (including model size), hardware, training compute, dataset construction, training method, or similar.
And if that's the tone from them, who else will follow suit? Is the era of relatively open collaboration coming to a close in the name of competition? :(
As YouTuber CGP Grey says, "shenanigans beget shenanigans."
At that point, why bother putting out a paper?