Hello everybody.
I have a question about the NewtonLineSearch algorithm.
I am running a transient analysis (IDA) with the NewtonLineSearch algorithm and the Newton algorithm, respectively.
When I use the Newton algorithm, I have no memory problems, since the demand remains constant during the analysis.
When I use the NewtonLineSearch algorithm, the memory demand increases continuously. Sooner or later this leads to a lack of memory and OpenSees quits.
Why does the NewtonLineSearch algorithm need so much memory?
I wanted to use this algorithm because it converges much better than the original Newton.
Thanks for answer,
Clemens
NewtonLineSearch
Re: NewtonLineSearch
How big is your model?
Theoretically, the choice of algorithm should not make that big a difference in terms of memory demand. It may be that you have a "memory leak" in your code to which the NewtonLineSearch algorithm is more sensitive.
Re: NewtonLineSearch
Hi Vesna,
The model isn't big at all - just an 18-story, 1-bay frame with plastic hinges at the ends of the beam/column elements.
I checked whether I am creating lists that I don't clear after usage - there are none.
When I watch my main memory while using NewtonLineSearch, the demand increases continuously. When I use Newton, it does not - the main-memory demand remains constant.
Even when I only switch the algorithm to NewtonLineSearch because of convergence problems, the demand increases too. If I change the algorithm back at the next time step, the demand remains constant again.
Maybe there is a difference in the usage of the commands
wipeAnalysis, wipe and reset,
which I use differently.
Can you please explain briefly what the difference between these commands is?
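For reference, the switching is roughly this pattern (a minimal sketch in OpenSees Tcl; the tolerance, step size dT and step count are placeholders):

```tcl
# Sketch: fall back to NewtonLineSearch only for steps where Newton fails,
# then switch back. dT, nSteps and the test settings are placeholders.
test NormDispIncr 1.0e-8 10
algorithm Newton
for {set step 1} {$step <= $nSteps} {incr step} {
    set ok [analyze 1 $dT]
    if {$ok != 0} {
        algorithm NewtonLineSearch 0.8   ;# retry this step with line search
        set ok [analyze 1 $dT]
        algorithm Newton                 ;# back to Newton for the next step
    }
}
```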
Thanks for replying,
Clemens
Re: NewtonLineSearch
"wipeAnalysis" clears previously defined analysis objects.
"wipe" clears everything. it causes all elements, nodes, constraints, loads to be removed from the domain. In addition it deletes all recorders, analysis objects and all material objects created by the model builder.
"reset" is used to set the state of the domain to its original state.
"wipe" clears everything. it causes all elements, nodes, constraints, loads to be removed from the domain. In addition it deletes all recorders, analysis objects and all material objects created by the model builder.
"reset" is used to set the state of the domain to its original state.
Re: NewtonLineSearch
Dear Mr. Fmckenna and All OpenSees Users,
I'm trying to import a text file into the OpenSees domain: a matrix with 200 rows and 40000 columns that was built by Matlab. The file holds the particle wave kinematics for a 400-second random irregular wave time history. I've written the two codes below for this purpose.
While importing this big file (about 100 MB on disk), the memory usage of OpenSees progressively increases up to 1.6 GB. Since I have to import four of these matrices, OpenSees stops working once the memory usage exceeds 1.8 GB. It seems this code has a memory leak, because when I unset (delete) the variables of the first matrix (e.g. Pvh($i,$j); the other variables are deleted automatically because they are local to the proc), the memory usage decreases by only 20 MB. I've realized that this problem only occurs when importing the file into a list variable, and that whenever I access the elements of this list, additional memory is allocated in the RAM of the PC. For instance, in the second code this large amount of memory is consumed just by the line: array set Pvh3 $Pvh2.
How can I solve this problem? Is there a way to import the data file directly into an array variable instead of a list variable? Or is there a way in Tcl to release this unused memory after importing the file?
Firstly :
proc WaveKinematicPvh {} {
    global Pvh
    global AnlStps
    set j 1
    set z 0
    if {[catch {open WavePtclVelHz.txt r} inFileID]} {   ;# open the input file and check for error
        puts stderr "Cannot open WavePtclVelHz.txt for reading"   ;# report the error
    } else {
        foreach line [split [read $inFileID] \n] {   ;# look at each line in the file
            if {[llength $line] == 0} {   ;# blank line -> do nothing
                continue
            } else {
                for {set i 1} {$i <= [llength $line]} {incr i} {
                    set Pvh($j,$i) [lindex $line [expr {$i - 1}]]   ;# store the value read
                }
                incr j
                if {$z == 0} {
                    set AnlStps [expr {$i - 1}]   ;# number of columns, taken from the first line
                    set z 1
                }
            }
        }
        close $inFileID   ;# close the input file
    }
}
Secondly:
set Pvh1 [open WavePtclVelHz.txt r]
set Pvh2 [read -nonewline $Pvh1]
close $Pvh1
array set Pvh3 $Pvh2   ;# the file must contain alternating index/value pairs
set columnN [expr {[array size Pvh3] / 200}]
for {set i 1} {$i <= 200} {incr i} {
    for {set j 1} {$j <= $columnN} {incr j} {
        set Pvh($i,$j) $Pvh3([expr {($i - 1)*$columnN + $j}])
    }
}
The style of the input files for these two codes differs.
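One way to keep the peak memory down is to avoid reading the whole file at once and instead process it line by line with gets. A sketch, assuming the same whitespace-separated format and Pvh array layout as the first code:

```tcl
# Sketch: stream the file line by line instead of reading it whole,
# so only one line is held in memory at a time.
set inFileID [open WavePtclVelHz.txt r]
set j 1
while {[gets $inFileID line] >= 0} {
    if {[llength $line] == 0} { continue }   ;# skip blank lines
    set i 1
    foreach value $line {
        set Pvh($j,$i) $value
        incr i
    }
    incr j
}
close $inFileID
```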
Thanks for your kind help.
Mohamad Zarrin from KNTU University of Technology in Tehran, Iran