Hello all,
I am running a big model (about 1 GB) with some sliding contacts. The simulation crashed with this error:
Segmentation fault (core dumped) /var/sw/calculix/2.22/bin/ccx_2.22_MT $@
This looks like an error in CalculiX itself (a model that pushes ccx beyond its limits).
Have you ever seen this error?
I’ve seen ccx run out of memory when using SPOOLES.
Most of the segmentation faults I’ve seen occurred when CalculiX was built against misconfigured libraries, for example OpenBLAS built without locking.
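If you suspect the BLAS build, two quick checks are to look at what the binary is linked against (only informative if it is dynamically linked) and to force the BLAS to run single-threaded for one test run. The path below is the one from your error message; jobname is a placeholder for your input deck:

ldd /var/sw/calculix/2.22/bin/ccx_2.22_MT | grep -i -E 'blas|lapack'   # which BLAS/LAPACK is linked?
export OPENBLAS_NUM_THREADS=1   # force OpenBLAS to a single thread for this test
/var/sw/calculix/2.22/bin/ccx_2.22_MT -i jobname

If the crash disappears with a single-threaded BLAS, the library build is the likely culprit.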
From the manual:
With 32GB of RAM you can solve up to 1,000,000 equations.
and:
Starting with version 2.8 the environment variable CCX_LOG_ALLOC has been introduced. If set to 1 (default is zero) one gets detailed information on all allocated, reallocated and deallocated fields during the execution of CalculiX. This may be particularly important during debugging of segmentation faults.
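So one way to narrow down where the crash happens is to rerun the job with that variable set, roughly like this (jobname again stands for your actual input deck):

export CCX_LOG_ALLOC=1
/var/sw/calculix/2.22/bin/ccx_2.22_MT -i jobname > jobname.log 2>&1

The last allocation entries in jobname.log before the crash then point to the field that was being handled when the segmentation fault occurred.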
I am using PARDISO on Linux. Available RAM is 500 GB with 48 CPUs.
I see a lot of strange behavior with two models.
Model 1) A very big model with 20 sliding contacts. It works as long as the slave/master elements are C3D10 vs C3D4. If I use C3D10 for both surfaces, I get that error.
Model 2) Not a big model, but 40 contacts: 20 sliding and 20 tied. The sliding contacts work, but as soon as I activate the *TIE contacts I get the error above.
Very strange.
What do you advise?
On Windows, I think this error is the equivalent of the u_calloc memory allocation error.
UNIX-like systems such as Linux can limit the amount of memory a process is allowed to use. Check the output of ulimit -a.
Where can I check it? Do you have other options to overcome this limit?
The ulimit command runs in a terminal, like most Linux commands. It can both show and set the limits. Note that “hard” limits can only be set by the user root.
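For example, in the same terminal session in which you start the solver, something along these lines shows and (for that session only) raises the soft limits:

ulimit -a                 # show all current limits
ulimit -s unlimited       # stack size; large models often need a big stack
ulimit -v unlimited       # virtual memory

Hard limits can only be raised by root, typically via /etc/security/limits.conf.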
See this link for example.