"Argument list too long" with only 2 arguments?
I'm debugging someone else's code, and I've run into a situation I wouldn't know how to produce if I tried to code it deliberately. It's coming from a very large Bash script, being run by Bash 4.1.2 on a CentOS 6 box. While the overall program is gigantic, the error consistently occurs in the following function:
get_las() {
    echo "Getting LAS..."
    pushd ${ferret_workdir} >& /dev/null
    #Download:
    if [ ! -e ${las_dist_file} ] || ((force_install)) ; then
        echo "Don't see LAS tar file ${las_dist_file}"
        echo "Downloading LAS from ${las_dist_file} -to-> $(pwd)/${las_dist_file}"
        echo "wget -O '${las_dist_file}' '${las_tar_url}'"
        wget -O "${las_dist_file}" "${las_tar_url}"
        [ $? != 0 ] && echo " ERROR: Could not download LAS:${las_dist_file}" && popd >/dev/null && checked_done 1
    fi
    popd >& /dev/null
    return 0
}
If I allow the script to run from scratch in a pristine environment, when this section is reached it will spit out the following error and die:
Don't see LAS tar file las-esg-v7.3.9.tar.gz
Downloading LAS from las-esg-v7.3.9.tar.gz -to-> /usr/local/src/esgf/workbench/esg/ferret/7.3.9/las-esg-v7.3.9.tar.gz
wget -O 'las-esg-v7.3.9.tar.gz' 'ftp://ftp.pmel.noaa.gov/pub/las/las-esg-v7.3.9.tar.gz'
/usr/local/bin/esg-product-server: line 428: /usr/bin/wget: Argument list too long
ERROR: Could not download LAS:las-esg-v7.3.9.tar.gz
Note that I even have a debug echo in there to prove that the arguments are only two small strings.
If I let the program error out at the point above and then immediately re-run it from the same expect script, with the only change being that it has already completed all the stages prior to this one and is detecting that and skipping them, this section will execute normally with no error. This behavior is 100% reproducible on my test box -- if I purge all traces that running the code leaves, the first run thereafter bombs out at this point, and subsequent runs will be fine.
The only thing I can think is that I've run into some obscure bug in Bash itself that is somehow causing it to leak MAX_ARG_PAGES memory invisibly, but I can't think of even any theoretical ways to make this happen, so I'm asking here.
What the heck is going on and how do I make it stop (without extreme measures like recompiling the kernel to just throw more memory at it)?
Update: To answer a question in the comments, line 428 is
wget -O "${las_dist_file}" "${las_tar_url}"
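(For context, the limit involved here can be queried directly rather than guessed at. `getconf ARG_MAX` is POSIX; `xargs --show-limits` is specific to GNU findutils and may be missing or differ on older systems, so treat this as a quick sanity check, not gospel:)

```shell
# Print the kernel's nominal byte limit for argv + environment
# handed to execve():
getconf ARG_MAX

# GNU xargs can also report the headroom remaining after the
# current environment is subtracted (GNU findutils only):
xargs --show-limits < /dev/null
```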
1 Answer
The E2BIG error refers to the sum of the bytes in the environment and the argv list. Has the script exported a huge number (or a huge size) of variables? Run printenv just before the wget to see what is going on.
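(To see this failure mode concretely: on Linux, a single environment string larger than roughly 128 KiB (MAX_ARG_STRLEN, 32 pages) is by itself enough to make every execve() fail with E2BIG, no matter how short the command line is. A minimal sketch, with an illustrative variable name:)

```shell
# Export roughly 1 MB into a single variable; bash itself keeps
# running, but any *external* command now fails at execve() with
# "Argument list too long".
big=$(head -c 1000000 /dev/zero | tr '\0' 'x')
export big
if ! env >/dev/null 2>&1; then
    echo "external commands fail: Argument list too long"
fi
unset big   # exec works again once the environment shrinks
```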
Answered Aug 25 '12 at 00:08
Interesting. I was unaware that it was a global limit, rather than a command-line specific limit. The script does, indeed, heavily abuse the environment. I'm running a fresh build now to get a dump at that point just before the error, but it'll take an hour or so to get there. - Zed
Holy cow! I can't even run printenv. The printenv command itself, with no arguments, returns "Argument list too long"! I think I can safely say that I've got an environment problem, now I just have to figure out what to do about it. - Zed
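(A follow-up for anyone who hits the same wall: printenv is an external binary, so it dies on the very execve() limit it was meant to diagnose. Shell builtins never fork, so they still work. A sketch using `compgen -e` (bash-specific) and indirect expansion to size each exported variable:)

```shell
# compgen -e lists exported variable names; ${!name} reads each value.
# Nothing here forks, so it runs even when external commands get E2BIG.
# (Piping into sort(1) would fork and fail too, hence unsorted output.)
for name in $(compgen -e); do
    value=${!name}
    printf '%10d  %s\n' "${#value}" "$name"
done
```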
It's the wget line itself. I made a special note of it in an edit now. - Zed
FYI, echo "wget -O '${las_dist_file}' '${las_tar_url}'" gives you a worse representation than you would get from either set -x or printf '%q ' wget -O "$las_dist_file" "$las_tar_url"; in the former case, you're assuming that neither variable includes single quotes in its contents. - Charles Duffy