Max number of connections

Asked by Quasar on 2010-07-20


I've made a little test for PySQLPool, using Pool from multiprocessing:

#!/usr/bin/env python

import PySQLPool
from multiprocessing import Pool

N = 20
c = PySQLPool.getNewConnection(username='me', password='mypassword', host='myhost', db='mydatabase')

def f(x):
    q = PySQLPool.getNewQuery(c)
    counter = 0
    for i in xrange(100):
        counter = counter + q.Query('SELECT * FROM forms;')
    return counter

if __name__ == '__main__':
    p = Pool(N)
    print p.map(f, xrange(N))

This code works for N (the number of processes) up to about 160. If N is bigger than about 160, an exception is raised:
_mysql_exceptions.OperationalError: (1040, 'Too many connections')

I found "maxActiveConnections = 10" in the "" file, so I expected this exception to be raised for N > 10.
Why was the 'Too many connections' exception raised only for 160+ parallel processes and not for 10+?

The answer to the first question is probably: only 10 or fewer processes were active at the same time, so only 10 or fewer connections were in use. OK, so I've tried to increase "maxActiveConnections" in two ways:
1. PySQLPool.PySQLPool.PySQLPool.maxActiveConnections = 128
2. PySQLPool.getNewPool().maxActiveConnections = 128
With both approaches, PySQLPool.getNewPool().maxActiveConnections returns 128.
But I was surprised that the exception was still raised at the same value of N (160+) as with the default maxActiveConnections (I expected that N could be much larger after increasing maxActiveConnections).
So the question is: why did changing maxActiveConnections have no effect?

Question information

Language: English
For: PySQLPool
Assignee: Nick Verbeck
Nick Verbeck (nerdynick) said : #1

The error you are seeing is actually reported by the underlying MySQLdb client. In this case it means your DB server's max connections limit is maxing out, not PySQLPool's.

As for the maxActiveConnections config: you are setting it correctly. However, because you are using multiprocessing instead of threading, each process spawns its own instance of PySQLPool. maxActiveConnections therefore only controls the maximum number of connections PySQLPool will open per connection type (unique on schema/host/port/username) within a single process. So every time you increased N you opened up N more connections, because PySQLPool can't, at this time, keep track of the number of connections across processes.

Multi-process support is something I would like to add in the future, but I am still looking into ways to achieve it. However, PySQLPool is tested and built against heavily threaded systems as well as single process/thread systems, and will work wonders in those kinds of systems.
