No, my argument is that "ensuring connectivity" is a much simpler, cheaper, and easier solution than "building all tools so they can handle working offline".
The only problem with this is that ensuring connectivity is impossible. You can have multiple redundant backups with different technologies, and there's still a non-zero chance that all of them fail at once. This is compounded by connectivity on the other end - how do you ensure connectivity when someone is working from a hotel at a conference, or from home, or from their yacht? A centralized repo also means requiring people to work from the office, unless you plan to control their connectivity as well.
When it comes to source control, something that lets developers carry on working when they don't have access to a central repo is massively better than everything else.
Now, how do you read the framework/API/lib documentation?
It's in the repo, so ... just read it like normal because you have a local copy?
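To make the "carry on working offline" point concrete, here's a minimal sketch using git as the representative DVCS (the temp-dir repo, file names, and identity settings are purely illustrative). Every clone carries the full history, so nothing below touches the network:

```shell
# Create a throwaway repo to stand in for a local clone.
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.email "dev@example.com"   # local identity; no server involved
git config user.name  "Offline Dev"

# Commit work and browse history with zero connectivity.
echo "framework notes" > docs.txt
git add docs.txt
git commit -q -m "committed with no connectivity"
git log --oneline    # history is read straight from disk
cat docs.txt         # and so is the documentation
```

When the connection comes back, a single push syncs everything to the central repo; nothing in the session above had to wait for it.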