Hi all,
I am currently trying to set up a 0.025° sub-domain within a full Antarctic domain that was simulated at 0.22°.
The int2lm routine for this subdomain halts with the error:
ERROR CODE is 5051
Unfortunately I cannot figure out what might be missing.
Attached are the contents of the driving files, the int2lm namelist, and the log from int2lm.
Please note that, in contrast to the 0.22° run, this smaller domain does not cross the South Pole.
Thanks for the help.
Matthias
The vertical wind W is missing in the initial field.
Ha, indeed. For some reason W was saved in out02.
I have now appended this field into out01.
Int2lm now continues, but gets stuck at the following:
interpol_coarse_special_l: NEW soil moisture interpolation with SMI
interpol_coarse_special_l: FOR lmulti_layer_in WITH ke_soil_lm= 9
interpol_coarse_special_l: IF: l_smi = F is .TRUE. THEN w_so_lm >= pwpb
I have added l_smi = .FALSE., to no avail.
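For reference, the relevant part of my namelist now looks roughly like this (I am assuming here that l_smi belongs in the &CONTRL group; the rest of my settings are unchanged):

```fortran
 &CONTRL
   ! ... other control switches as before ...
   l_smi = .FALSE.,   ! disable the SMI-based soil moisture interpolation
 /
```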
Might this be related to the fact that W_SO is zero for this region?
This is probably a question for the TERRA experts; I have no clue about this.
Which landcover and soil data sets did you use when creating the external parameters? Some of the raw data is not available over Antarctica.
Please find the .def files from WEBPEP attached.
For the full 0.22° Antarctic run I did not have any problems; all went fine.
It is the nest driven by the output of this run (i.e. the 0.025° resolution) that is causing problems.
I used the same underlying datasets for both.
Hello Matthias,
I am not an expert on TERRA, but I can comment on the zero soil moisture. If you have l_smi=.FALSE. in INT2LM, the soil moisture is set to 0 over ice points, of which there are surely many in the Antarctic region. But this should not cause INT2LM to get stuck. Is the output you are citing ("NEW soil moisture interpolation…") the last you get from the INT2LM run?
By the way: with l_smi=.TRUE. the soil moisture is not explicitly set to 0 for ice points, but no value is computed at all, which could result in a crash later on (depending on the compiler).
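As a crude sketch of the pattern I mean (this is illustrative only, not the actual INT2LM/TERRA code):

```fortran
program smi_sketch
  ! Sketch: why a never-assigned field behaves compiler-dependently.
  implicit none
  logical, parameter :: l_smi = .true.
  real :: w_so(4)        ! stands in for the soil moisture at ice points
  if (.not. l_smi) then
     w_so = 0.0          ! l_smi=.FALSE.: explicitly zeroed over ice
  end if
  ! With l_smi=.TRUE. nothing was ever assigned, so this prints whatever
  ! happens to be in memory: zeros, garbage, or NaN, depending on the
  ! compiler and its initialization flags.
  print *, w_so
end program smi_sketch
```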
Hi Ulrich,
Yes, that is the last output from int2lm, after it reads the first time step of the meteo boundaries produced by the 0.22° run.
For the full log, please see the attachment.
And if I understand it well, it is then best to set l_smi = .FALSE.?
Thanks.
Hello Matthias,
yes: setting l_smi=.FALSE. should at least give some reasonable results. But from your output I cannot see what is going on there.
If you could provide me with the necessary data, I can offer to test it on our computer. Perhaps then I can see what is going wrong.
Hi!
Thanks for this offer.
I’ll send you my domain file, int2lm namelist as well as some forcing files from the first CCLM simulation via email.
Please let me know if you need more info.
Cheers,
Matthias
Hello Matthias,
I tested with our versions of INT2LM, 2.0 and also the latest one, 2.02, using your data, and I had no problem running INT2LM.
Which version are you using? And which compiler? Could you also test on a different computer?
Ciao
Uli
Ha, that is interesting.
So the test that crashed was done with int2lm_131101_2.00_clm2, compiled with the Intel compiler.
I just tried with int2lm_131101_2.00_clm3, compiled with gcc, and that works well.
The reason I switched to our Intel version was that, for the 0.22° preprocessing, the gcc-compiled version appeared to be sensitive to boundary violations (see the int2lm bug report). This was not a problem for our Intel-compiled version.
And now there seems to be an inverse sensitivity? Or could it be related to differences between the two versions?
I have no time to test this now, but if it would be relevant, I can test it later on.
You mean the bug in INT2LM that Burkhardt recently posted? That is also something I do not understand yet; at least we have never seen anything similar.
I do not know the differences between the subversions clm2 and clm3 (they are not available on the web page, are they?), so perhaps this really should be tested.
You said that version clm3 with the gcc compiler was sensitive to the boundary violations when producing the data for the 0.22° domain? Could you perhaps also provide me with all the necessary data for that run? I would be curious to see whether I can reproduce these problems with our version.
Ciao
Hi Uli,
I just tried to produce the boundary files for the 0.22° and 0.0625° runs, using gcc- and Intel-compiled versions of int2lm_131101_2.00_clm2 and int2lm_131101_2.00_clm3.
First, with respect to your question on the differences between _clm2 and _clm3: see the README_changes file attached.
For the results on the little test:
1) with _clm2
——————-
This is the error:
interpol_coarse_special_l: NEW soil moisture interpolation with SMI
interpol_coarse_special_l: FOR lmulti_layer_in WITH ke_soil_lm= 9
interpol_coarse_special_l: IF: l_smi = F is .TRUE. THEN w_so_lm >= pwpb
2) with _clm3
——————-
This is the error:
At line 1032 of file /home/rcs/software/gcc/int2lm_131101_2.00_clm3/src/external_data.f90
Fortran runtime error: Index '0' of dimension 1 of array 'lolp_in' below lower bound of 1
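(That run-time message presumably comes from gfortran's bounds checking being enabled in that build; I assume something like the following flags are involved, and that the Intel build would need its own switch to catch the same access:)

```shell
# Assumed flags, not taken from the actual build scripts:
gfortran -fcheck=bounds -g ...   # gcc build: aborts with the message above
ifort -check bounds -g ...       # intel build: would catch the same access
```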
So it seems that all works fine (for both resolutions) either with _clm2+gcc or _clm3+intel.
I am not sure what could be causing the other errors.
I'll send you some boundary files for the 0.22° domain; perhaps you could test this on your system as well (although this one did work with Intel).