Re: What difference (object and sequence)
- Posted by Robert Craig <rds at CLASSIC.MSN.COM> Aug 26, 1997
A few days ago Doug Edmunds asked:

> Is there any difference (size, performance, etc.)
> between declaring a variable as an object
> and declaring it as a sequence,
> other than type checking?

Using *very* rough numbers: 90% of the time you will see a slight (1% to 5%) improvement in speed if you declare a heavily-used variable as a sequence, rather than an object. 10% of the time you may see a very slight loss (1% to 2%). It's not something that most people should worry about.

As a rule of thumb, you should declare variables using the most restrictive of the pre-defined types that you can. This will catch more bugs, make your code easier to understand, and probably make it run a few percent faster. For instance, it's usually faster to declare something as an integer rather than an atom.

As for memory consumption, only the current *value* of a variable matters, not how it is declared.

If you are serious about performance, you need to make measurements. As we saw from some postings a while back, it isn't always obvious which way of coding something is going to be faster. Even though I know how things are implemented, I am often surprised. Put small code fragments into for-loops up to a million and time them. Profile your program. Don't assume that you know which statement is the "bottleneck".

For instance, most people assume that it must be faster to perform arithmetic on a whole sequence in one statement, rather than looping. This is not a good assumption. You have to measure it. There's a lot of invisible, but significant stuff going on when your program executes. Storage allocation and deallocation are important, but there are subtle hardware effects such as caching that can be very significant and hard to predict.

Regards,
Rob Craig
Rapid Deployment Software
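[As a sketch of the "most restrictive type" rule of thumb above, here is one hypothetical set of declarations; the variable names are illustrative only, and the exact speed effect of each choice depends on your interpreter version:]

```
-- from most to least restrictive of the pre-defined Euphoria types:
integer count    -- Euphoria integers only: fastest, catches the most bugs
atom price       -- any number, integer or floating-point
sequence name    -- any sequence, but never an atom
object anything  -- any Euphoria value at all: the loosest type check
```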
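[The "put small code fragments into for-loops and time them" advice can be sketched with Euphoria's built-in time() function. This compares whole-sequence arithmetic against an explicit loop; the fragment and the loop counts are illustrative, not a definitive benchmark, so measure on your own machine:]

```
-- time two ways of adding 1 to every element of a sequence
sequence s
atom t
s = repeat(0, 100)

t = time()
for i = 1 to 100000 do
    s = s + 1              -- whole-sequence arithmetic in one statement
end for
printf(1, "sequence op:   %f seconds\n", time() - t)

t = time()
for i = 1 to 100000 do
    for j = 1 to length(s) do
        s[j] = s[j] + 1    -- explicit element-by-element loop
    end for
end for
printf(1, "explicit loop: %f seconds\n", time() - t)
```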