Zsh Mailing List Archive
Messages sorted by: Reverse Date, Date, Thread, Author

Re: Why large arrays are extremely slow to handle?



On 25 March 2011 01:37,  <nix@xxxxxxxxxxxxxxxx> wrote:
> Tested on AMD Phenom(tm) II X6 1090T Processor 3.6GHz using one core.
>
> I think there is a big flaw somewhere that causes the following:
>
> #!/bin/zsh
>
> emulate zsh
>
> TEST=()
> for i in {1..10000} ; do
>     TEST+="$i" # append (push) to an array
> done
>
> --- 10K
> time ./bench
> real    0m3.944s
>
> --- 50K BOOOM! WTF?
>
> time ./bench
> real    1m53.321s
>
> Does not make much sense to me. I'm also a PHP developer. Just for
> comparison, let's do the same in PHP.
>
> <?php
>
> $test = array();
> for ($i = 1; $i < 50000; $i++) {
>     $test[] = $i;
> }
> print_r($test);
>
> ?>
>
> --- 10K
>
> time php TEST_PHP
> real    0m0.011s
>
> --- 50K
>
> time php TEST_PHP
> real    0m0.025s
>
>
> Any ideas why it's extremely slow? I need to use very large arrays
> (even over one million elements in a single array), but it's currently
> impossible due to the above.

The problem is not the array itself, but that you are handing 50000
arguments to the for loop: the brace range expands to the full word
list before the loop even starts. With this optimization it "only"
takes 5 seconds ;)

for (( i = 0; i < 10000; i++ )) { arr+=$i }

That said, you generally don't want to use large arrays in zsh; it
will be slow.
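[As a rough illustration of the arithmetic-loop approach suggested
above (a minimal sketch; the array name `arr` and the 10000 bound are
arbitrary, and the `for (( ))` syntax shown works in both zsh and
bash):]

```shell
#!/usr/bin/env bash
# Build the array one element at a time with an arithmetic for loop,
# instead of expanding a brace range like {1..10000} into one huge
# argument list before the loop begins.
arr=()
for (( i = 0; i < 10000; i++ )); do
    arr+=("$i")   # append a single element per iteration
done
# Show the element count and the first and last elements.
echo "${#arr[@]} ${arr[0]} ${arr[9999]}"
```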

-- 
Mikael Magnusson
