top | item 6779883

How do game companies share massive files?

34 points | Libertatea | 12 years ago | bbc.co.uk | reply

44 comments

[+] boothead|12 years ago|reply
As someone said on twitter the other day:

  If "tech experts" writing news articles aren't really experts, what does that say about all the other experts..?
[+] simias|12 years ago|reply
So basically rsync?

Honestly, synchronizing 50GB of content across the globe doesn't sound terribly novel or difficult to me, or else the article does a pretty bad job of explaining the difficulty.
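
The delta-sync idea being discussed can be sketched in a few lines. This is a toy illustration, not rsync itself: it compares fixed-offset blocks by hash and ships only the blocks that changed, whereas real rsync uses a rolling weak checksum so it also copes with insertions and deletions that shift block offsets. All names here are made up for the example.

```python
import hashlib

def block_hashes(data: bytes, block: int) -> list:
    """SHA-256 of each fixed-size block of `data`."""
    return [hashlib.sha256(data[i:i + block]).digest()
            for i in range(0, len(data), block)]

def delta(old: bytes, new: bytes, block: int) -> list:
    """(index, bytes) pairs for blocks of `new` that differ from `old`."""
    old_h = block_hashes(old, block)
    changes = []
    for j in range(-(-len(new) // block)):  # ceil-divide: number of blocks in `new`
        blk = new[j * block:(j + 1) * block]
        if j >= len(old_h) or hashlib.sha256(blk).digest() != old_h[j]:
            changes.append((j, blk))
    return changes

def apply_delta(old: bytes, changes: list, block: int, new_len: int) -> bytes:
    """Rebuild the new file from the old file plus the changed blocks."""
    blocks = [old[i:i + block] for i in range(0, len(old), block)]
    for j, blk in changes:
        while len(blocks) <= j:       # new file grew past the old one
            blocks.append(b"")
        blocks[j] = blk
    return b"".join(blocks)[:new_len]  # new_len trims a shrunken file
```

With a 4-byte block size, changing the middle of `b"aaaabbbbcccc"` to `b"aaaaXXXXcccc"` transfers only the one 4-byte block that differs, which is the whole point when the unchanged portion is tens of gigabytes.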

[+] mryan|12 years ago|reply
> That's because some of the complete game files were as large as 50GB

50GB is not the total size of the files needing to be transferred - some of the files were 50GB. Knowing the total size would have been interesting for the sake of discussion.

[+] RBerenguel|12 years ago|reply
I thought the same. Sending only the delta of the content has been known for a long while...
[+] notduncansmith|12 years ago|reply
Not gonna lie, when I saw "delta" I immediately assumed Git or some other source control solution. What boggles the mind is how these professional game developers only recently discovered this technology. I didn't think Git was so exclusively the preserve of the web development community.
[+] NickPollard|12 years ago|reply
Git solves a different problem; sending large binary blobs around is not it. As another comment said, this is essentially RSync, not git.

I've been out of game development for a year, but unfortunately git is not being taken up much by large game developers. The main reason cited is git's handling of binary files, which leads to every checkout containing many duplicates of huge files. However, this is possible to work around, and the benefits of git still make it worthwhile.

[+] AndrewDucker|12 years ago|reply
You wouldn't use git for massive binary files that change on a regular basis. Particularly when you only need your testers to have the latest version of them, not the complete history (which is what git would give them).
[+] unwind|12 years ago|reply
You are fantastically mistaken if you think game devs don't use source control. Heh.

Of course they do, and I'm ... let's say pretty sure that much of EA runs on http://www.perforce.com/.

Also weird that the article starts with a BF4 image, but then claims that BF4 was developed in California. I guess that hurts my national pride (go DICE!).

[+] ethanhunt_|12 years ago|reply
I think a lot of game developers actually use Perforce because it handles large files better. That's at least what I've gathered from eavesdropping on some of them (e.g., Jon Blow) on Twitter.
[+] 7952|12 years ago|reply
When you look at how non-programming projects are managed, you often see a lot of the same issues that source control addresses. Files (and information in people's brains) become out of sync, or need to be locked.

Programmers always focus on the technology of source control, but the difference it makes is also social. It forces a particular workflow that allows tasks to be carried out concurrently. This workflow could be adopted without necessarily needing git.

[+] unsigner|12 years ago|reply
At some point Microsoft used a third-party solution called Aspera FASP to accept uploads of final game disc images. It was crazy fast - when you launched it, it reached our office's net connection speed within 10-15 seconds, and the other Internet connections at the office dropped.
[+] icelancer|12 years ago|reply
I remember that! I worked at MS in one of the gaming capacities (no need to be specific) and it was unreal how close to maximum saturation FASP would get. Such an awesome tool.
[+] t0|12 years ago|reply
Sneakernets are still the fastest way to transfer data. http://en.wikipedia.org/wiki/Sneakernet
[+] Piskvorrr|12 years ago|reply
Not necessarily the cheapest, though - express shipping a bunch of disks is an uncertain venture ("oops, we accidentally dropped it a few times, is that bad?") and "I need a return cross-Atlantic flight ticket NOW" is sort of expensive.

That said, I have heard of companies actually couriering semi-large volumes of sensitive data this way.

[+] Zolomon|12 years ago|reply
I think the article should mention that it is a Sweden-based game developer and a California-based publisher for the game.

Sounds fairer that way.

[+] kayoone|12 years ago|reply
"That's because some of the complete game files were as large as 50GB, and future games with more advanced graphics for the new Xbox One console and Sony's PlayStation 4 are likely to be even bigger."

Ahm no, Battlefield 4 is already a PC game and is scaled down even on the next-gen consoles, so console processing power isn't the limiting factor here.

[+] rurounijones|12 years ago|reply
I have always wanted to know more about Valve's Steam architecture.

Why aren't they presenting at more conferences damnit!

[+] moocowduckquack|12 years ago|reply
Above a certain point, it becomes far quicker to send a hard drive by courier.
[+] hengheng|12 years ago|reply
That used to be true, but considering that hard drive transfer rates cap out around 200 MByte/s, you'd have to send a large bunch of drives that must be read and written in parallel to be faster than a network connection.
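
The back-of-envelope arithmetic behind this comment, assuming a 10 Gbit/s link (the link speed is my assumption, not a figure from the thread):

```python
# Rough drives-vs-network comparison.
drive_mb_s = 200                  # ~sequential throughput of one HDD, MB/s
link_mb_s = 10 * 1000 / 8         # 10 Gbit/s link = 1250 MB/s

# Drives that must be read in parallel just to keep up with the link.
drives_needed = link_mb_s / drive_mb_s   # 6.25

# Time to push a 50 GB build over that link, in seconds.
file_mb = 50 * 1000
transfer_s = file_mb / link_mb_s          # 40.0
```

At those numbers a 50GB build moves in about 40 seconds on the wire, while any courier takes hours at best, so the crossover point where sneakernet wins keeps shifting upward as links get faster.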
[+] planetjones|12 years ago|reply
Of all the technical challenges facing us in the 21st century, sending a 50GB file quickly across some wire doesn't seem like the biggest!

Also, the feedback I see about Battlefield's stability and quality (on PS4) has been very poor, so I had to laugh when I saw the "to locate defects and improve quality" quote :)

[+] golergka|12 years ago|reply
Sounds a lot like git or some other typical solution.
[+] samnardoni|12 years ago|reply
git would be pretty bad in this situation. It sounds more like rsync.
[+] _sabe_|12 years ago|reply
Sponsored article?
[+] JonnieCache|12 years ago|reply
Innit. Someone at EA/Panzura's PR firm just got a raise.