Re: Optimizing basic operations
- Posted by James_at_Euphoria Aug 17, 2013
- 4543 views
The talk has mostly been about the speed of processing the data.
You need to consider first of all the upload time. Given 10^12 records with a evaluation integer and 50 numbers you have a very large amount to load. say 200 Tb.
According to Tom's Hardware, the fastest affordable off-the-shelf SSDs have data transfer rates of approximately 0.5 GB per second. That is to say, you would be looking at an upload time of around 400,000 seconds.
That is about 111 hours, or 4.6 days.
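The arithmetic above can be sketched in a few lines. The 51-values-per-record and 4-bytes-per-value figures are assumptions used to reach the post's round 200 TB estimate:

```python
# Back-of-envelope transfer-time estimate.
# Assumptions (not stated exactly in the post): 1 evaluation integer
# plus 50 numbers per record, 4 bytes per value.
records = 10**12
bytes_per_record = 51 * 4                 # 204 bytes per record
total_bytes = records * bytes_per_record  # ~204 TB (post rounds to 200 TB)

ssd_rate = 0.5e9                          # 0.5 GB/s sequential read
seconds = total_bytes / ssd_rate          # ~408,000 s (post rounds to 400,000)

print(f"{seconds / 3600:.0f} hours")      # ~113 hours
print(f"{seconds / 86400:.1f} days")      # ~4.7 days
```

The exact figure comes out slightly above the rounded numbers in the post, but the conclusion is the same: loading the data alone takes days at SSD speeds.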
The records are generated internally, not loaded from an outside source. Think of a chess game being evaluated n levels deep: the (geometric) combinatorial explosion of possibilities is what creates so many records. My project isn't a chess game, but the branching at each level is similar.
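To see how quickly a game-tree-style search reaches 10^12 records, here is a minimal sketch; the branching factor and depth are hypothetical illustrations (the post gives neither), with 35 chosen only because it is a commonly cited average branching factor for chess:

```python
# Count the nodes generated by expanding a tree with a fixed
# branching factor b down to depth d (root excluded):
# b + b^2 + ... + b^d.
def positions(branching: int, depth: int) -> int:
    return sum(branching**level for level in range(1, depth + 1))

# A chess-like branching factor of ~35 exceeds 10^12 nodes
# by depth 8 — no external data source needed.
print(f"{positions(35, 8):.2e}")
```

The point is that even modest depths produce record counts where the bottleneck shifts from disk transfer to raw generation and evaluation throughput.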