RuntimeWarning: overflow encountered in double_scalars

Asked by Mithushan Soundaranathan

Hi,

I am running a sphere compaction simulation and I am getting the following error messages:

1. RuntimeWarning: overflow encountered in double_scalars
2. RuntimeWarning: overflow encountered in multiply: steps = self._extended_steps * scale
3. WARNING:matplotlib.text:posx and posy should be finite values

How should I solve this?

Best,
Mithu

Here is my code:
#!/usr/bin/env python
#encoding: ascii

# Testing of the Deformation Engine with the Luding contact law
# Modified Oedometric Test
# The reference paper [Haustein2017]

from __future__ import print_function
import math
from yade import utils, plot, timing
from yade import pack, export
import pandas as pd
import numpy as np

o = Omega()

# Physical parameters
fr = 0.54
rho = 1050
Diameter = 0.0006
D=Diameter
r1 = Diameter/2
#r2 = Diameter/2
k1 = 100000
kp = 12.0*k1
kc = k1 * 0.1
ks = k1 * 0.1
DeltaPMax = Diameter/3.0
Chi1 = 0.34

o.dt = 1.0e-7
particleMass = 4.0/3.0*math.pi*r1*r1*r1*rho

Vi1 = math.sqrt(k1/particleMass)*DeltaPMax*Chi1
PhiF1=0.999
#PhiF1 = DeltaPMax*(kp-k1)*(r1+r2)/(kp*2*r1*r2)

Tab_rad=0.005
Cyl_height=0.065
cross_area=math.pi*(Tab_rad**2)

Comp_press= 1.4e8
Comp_force=Comp_press*cross_area

compression_data_save=[]

#*************************************

# Add material
mat1 = O.materials.append(LudingMat(frictionAngle=fr, density=rho, k1=k1, kp=kp, ks=ks, kc=kc, PhiF=PhiF1, G0 = 0.0))

# Spheres for compression and walls
sp=pack.SpherePack()
sp.makeCloud((-4.5*Diameter,-4.5*Diameter,-50*Diameter),(4.5*Diameter,4.5*Diameter,40.0*Diameter), rMean=Diameter/2.0,rRelFuzz=0.18,num=4201)
sp.toSimulation(material=mat1)
walls=O.bodies.append(yade.geom.facetCylinder((0,0,0),radius=Tab_rad,height=Cyl_height,segmentsNumber=20,wallMask=6,material=mat1))

# Add engines
o.engines = [
  ForceResetter(),
  InsertionSortCollider([Bo1_Sphere_Aabb(aabbEnlargeFactor=1.05),
                         Bo1_Wall_Aabb(),
                         Bo1_Facet_Aabb()
                         ]),
  InteractionLoop(
    [Ig2_Sphere_Sphere_ScGeom(interactionDetectionFactor=1.05),
     Ig2_Facet_Sphere_ScGeom(),
     Ig2_Wall_Sphere_ScGeom()],
    [Ip2_LudingMat_LudingMat_LudingPhys()],
    [Law2_ScGeom_LudingPhys_Basic()]
  ),
  NewtonIntegrator(damping=0.1, gravity=[0, 0, -9.81]),
  PyRunner(command='checkForce()', realPeriod=1, label="fCheck"),
  #DeformControl(label="DefControl")
]

def checkForce():
    # at the very start, the unbalanced force can be low because there are
    # only a few contacts, but that does not mean the packing is stable
    if O.iter < 1200000:
        return
    # the rest is run only once the unbalanced force is < 0.2 (stabilized packing)
    timing.reset()
    if unbalancedForce() > 0.2:
        return
    # add plate at upper box side

    highSphere = 0.0
    for b in O.bodies:
        if isinstance(b.shape, Sphere) and b.state.pos[2] > highSphere:
            highSphere = b.state.pos[2]

    O.bodies.append(wall(highSphere+0.5*Diameter, axis=2, sense=-1, material=mat1))
    # without this line, the plate variable would only exist inside this
    # function
    global plate
    plate = O.bodies[-1]  # the last body added is the plate
    # Wall objects are "fixed" by default, i.e. not subject to forces
    # prescribing a velocity will therefore make it move at constant velocity
    # (downwards)
    plate.state.vel = (0, 0, -1)
    # start plotting the data now, it was not interesting before
    O.engines = O.engines + [VTKRecorder(fileName='vtk-',recorders=['all'],iterPeriod=5000),
                            PyRunner(command='storeData()', iterPeriod=4000),]
    # next time, do not call this function anymore, but the next one
    # (unloadPlate) instead
    fCheck.command = 'unloadPlate()'

def unloadPlate():
    # if the force on plate exceeds maximum load, start unloading
    # if abs(O.forces.f(plate.id)[2]) > 5e-2:
    if abs(O.forces.f(plate.id)[2]) > Comp_force:
        plate.state.vel *= -1
        # next time, do not call this function anymore, but the next one
        # (stopUnloading) instead
        fCheck.command = 'stopUnloading()'

def stopUnloading():
    if abs(O.forces.f(plate.id)[2]) == 0:
        # O.tags can be used to retrieve unique identifiers of the simulation
        # if running in batch, subsequent simulation would overwrite each other's output files otherwise
        # d (or description) is simulation description (composed of parameter values)
        # while the id is composed of time and process number
        # plot.saveDataTxt(O.tags['d.id'] + '.txt')
        #plot.saveDataTxt('data'+ O.tags['id'] +'.txt')
        #print(timing.stats())
        O.pause()
        compression_data=pd.DataFrame(compression_data_save, columns=['time(s)','compression_pressure(MPa)','iteration'])
        compression_data.to_csv(r'compression_data_PH101_rp_0.0006.csv')
        export.text('test.txt')

def storeData():
    time=O.iter*o.dt
    compression_pressure=(abs(O.forces.f(plate.id)[2])/cross_area)*1e-6
    iteration=O.iter
    data_to_save=[time,compression_pressure,iteration]
    compression_data_save.append(data_to_save)

Question information

Language: English
Status: Expired
For: Yade
Assignee: No assignee
Jan Stránský (honzik) said (#1):

Hello,

> I am getting following error message

please post the complete error message

Cheers
Jan

Mithushan Soundaranathan (mithushan93) said (#2):

Hi Jan,
Here is the complete error message:

/usr/lib/python3/dist-packages/matplotlib/ticker.py:2039: RuntimeWarning: overflow encountered in multiply
  steps = self._extended_steps * scale
/usr/lib/python3/dist-packages/matplotlib/ticker.py:2071: RuntimeWarning: overflow encountered in double_scalars
  high = edge.ge(_vmax - best_vmin)
/usr/lib/python3/dist-packages/matplotlib/ticker.py:1879: RuntimeWarning: invalid value encountered in double_scalars
  d, m = divmod(x, self.step)
/usr/lib/python3/dist-packages/matplotlib/transforms.py:2832: RuntimeWarning: overflow encountered in double_scalars
  elif vmax - vmin <= maxabsvalue * tiny:
/usr/lib/python3/dist-packages/matplotlib/transforms.py:417: RuntimeWarning: overflow encountered in double_scalars
  return (x0, y0, x1 - x0, y1 - y0)
/usr/lib/python3/dist-packages/matplotlib/ticker.py:1832: RuntimeWarning: overflow encountered in double_scalars
  dv = abs(vmax - vmin) # > 0 as nonsingular is called before.
/usr/lib/python3/dist-packages/matplotlib/ticker.py:2038: RuntimeWarning: overflow encountered in double_scalars
  raw_step = (_vmax - _vmin) / nbins
/usr/lib/python3/dist-packages/matplotlib/ticker.py:2045: RuntimeWarning: invalid value encountered in greater_equal
  istep = np.nonzero(steps >= raw_step)[0][0]
In [1]: WARNING:matplotlib.text:posx and posy should be finite values
/usr/lib/python3/dist-packages/matplotlib/axes/_base.py:2477: RuntimeWarning: overflow encountered in double_scalars
  delta = (x1t - x0t) * margin
In [1]: WARNING:matplotlib.text:posx and posy should be finite values
WARNING:matplotlib.text:posx and posy should be finite values
WARNING:matplotlib.text:posx and posy should be finite values
python3.8: /build/vtk6-9ueea8/vtk6-6.3.0+dfsg2/IO/XML/vtkXMLOffsetsManager.h:142: void OffsetsManagerGroup::Allocate(int, int): Assertion `numElements > 0' failed.
Aborted (core dumped)

Jan Stránský (honzik) said (#3):

The first bunch of problems is related to matplotlib.
Do you use plot.plot()?
Or do you have any other hint as to where matplotlib could be called from?

On the other hand, the matplotlib stuff is only warnings.
The last message, concerning VTK, could be more problematic.

Cheers
Jan
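
As an illustration of what a plot-related cause could look like (an assumption, not something confirmed in this thread): matplotlib emits exactly this kind of overflow and "posx and posy should be finite values" warning when non-finite numbers reach its autoscaling and tick machinery, for example via yade.plot. A minimal standalone sketch:

# Minimal sketch (assumption): one inf/NaN value in the plotted data can make
# the axis limits non-finite, which may then trigger RuntimeWarnings inside
# matplotlib's ticker/transforms code similar to the ones pasted above.
import numpy as np
import matplotlib.pyplot as plt

y = np.array([1.0, 2.0, np.inf])   # a diverging quantity, e.g. a force that overflowed
plt.plot([0, 1, 2], y)
plt.savefig('check.png')           # autoscaling over inf can emit such warnings

If that is what happens here, the warnings are only a symptom; the real problem would be a simulation quantity that has overflowed to inf before it ever reaches matplotlib.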

Mithushan Soundaranathan (mithushan93) said (#4):

Hi,

I have removed the plot now; it is not needed.

I use the same code to compact larger particles (double the size), and then I do not get the VTK error message. The message only pops up when the particle size is decreased and the number of particles is increased.

Best,
Mithu
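
A possible direction for the VTK assertion (an assumption, not a confirmed fix): with recorders=['all'] every VTK writer is activated, and a writer that ends up with zero elements might hit the "numElements > 0" assertion. A sketch that records only the datasets actually present in this scene, as a replacement for the VTKRecorder line added in checkForce():

    # Sketch (assumption): restrict VTKRecorder to the data that exists here
    # (spheres and facets) instead of asking for recorders=['all'].
    O.engines = O.engines + [VTKRecorder(fileName='vtk-', recorders=['spheres', 'facets'], iterPeriod=5000),
                             PyRunner(command='storeData()', iterPeriod=4000),]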

Launchpad Janitor (janitor) said (#5):

This question was expired because it remained in the 'Open' state without activity for the last 15 days.