Zsh Mailing List Archive

Re: Large COLUMNS crashes zsh



[On 05 May, @16:32, Peter Stephenson wrote in "Re: Large COLUMNS crashes zsh ..."]
> >export COLUMNS=10000000000000000
> >
> >When doing this zsh crashes. I know it seems stupid, but wouldn't it be
> >sane to ignore such a large number, or simply set it to the largest
> >possible?
> 
> Probably, but the trouble is there's no single "largest possible".
> We've just run into a similar problem with the maximum size of arrays.
> If it doesn't crash the shell, it's potentially useful, but we don't
> know a priori how large that is.  Some checks on malloc might help, but
> that's a big can of worms, too:  once you need it in one place, you need
> it all over.  Furthermore, I've got a feeling that on many virtual
> memory systems the malloc might succeed but cause havoc later.
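(Not zsh's actual source, just a rough C sketch of the clamping idea above:
parse COLUMNS, and if it is missing, garbage, or absurdly large, fall back
to a default or cap it before anything gets allocated based on it. The
4096 cap here is an arbitrary number picked for the example.)

  #include <errno.h>
  #include <stdio.h>
  #include <stdlib.h>

  /* Hypothetical helper, not zsh's real code: read COLUMNS from the
   * environment and clamp it to a sane range before any buffer is
   * sized from it. */
  static int sane_columns(void)
  {
      const char *s = getenv("COLUMNS");
      char *end;
      long v;

      if (!s || !*s)
          return 80;              /* no value: fall back to a default */
      errno = 0;
      v = strtol(s, &end, 10);
      if (errno != 0 || *end != '\0' || v < 1)
          return 80;              /* overflow, junk or nonsense: fall back */
      if (v > 4096)
          return 4096;            /* cap absurd widths instead of crashing */
      return (int)v;
  }

  int main(void)
  {
      printf("using %d columns\n", sane_columns());
      return 0;
  }

Picking the cap is of course the "no single largest possible" problem
mentioned above, but any fixed cap at least keeps the shell alive.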

bash seems to handle this just fine, and it resets the value to something
more sane (I don't know if that is a bug or a feature):

$ export COLUMNS=10000000000000000
$ echo $COLUMNS
10000000000000000
$ ls
...bunch of files...
$ echo $COLUMNS
100
$

This is with bash 3. Zsh, on the other hand, is trying to trash my
machine :)

--
grtz,
  - Miek

  http://www.miek.nl              http://www.nlnetlabs.nl
  PGP: 6A3C F450 6D4E 7C6B C23C  F982 258B 85CF 3880 D0F6



