Friday 26 April 2013

More complicated data profiling query

This is the extended version of basic-data-profiling; it is slower but gives you much more interesting information.

When you do data profiling you need to give examples for each data grouping; the MySQL GROUP_CONCAT function is there for exactly that.

SELECT COUNT(*) AS nb, tablename.title AS datas, 
 SUBSTRING_INDEX(GROUP_CONCAT(tablename.emp_no ORDER BY RAND(42)), ',', 10) AS Examples
FROM titles AS tablename
GROUP BY datas
ORDER BY nb DESC;

Result:
nb      datas               Examples
115003  Engineer            222309,64167,287849,222887,256836,237309,262823,242428,222680,225055
107391  Staff               427715,209965,36589,74250,407627,251530,409579,254748,456807,204250
97750   Senior Engineer     238256,468681,477369,205608,224208,263874,251767,82661,245168,213359
92853   Senior Staff        83208,45313,211968,264723,36232,263944,46428,471920,66956,442742
15159   Technique Leader    260169,436055,295995,251618,434080,492772,54096,18718,97636,496663
15128   Assistant Engineer  448269,461300,443947,417358,21784,437352,412830,94900,262281,98231
24      Manager             110511,110800,110022,110303,111133,110420,110567,110854,111784,111692


Explanations:
COUNT(*) counts the number of records for each value grouped by the GROUP BY clause.
tablename.title AS datas shows the name of the current group in the column "datas".
GROUP_CONCAT joins the string values of its first argument with commas between them; the maximum size of the result is set by the group_concat_max_len variable (1024 characters by default).
ORDER BY RAND(42) is the internal order of the GROUP_CONCAT; here we order randomly with the seed 42, so if you rerun the query with the same seed you get exactly the same random order. If you want different examples each time, just change the seed or use ORDER BY RAND() instead.
SUBSTRING_INDEX looks for the comma separator and keeps only everything left of the 10th comma.
ORDER BY nb DESC orders the final result by the number of occurrences of each value, putting the most used data at the top of the table.
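The same profiling pattern can be sketched outside MySQL with nothing but Python's built-in sqlite3 module (a minimal sketch with made-up toy data: SQLite also has group_concat, but it lacks SUBSTRING_INDEX and seeded RAND(), so the truncation to 10 examples is done on the Python side):

```python
import sqlite3

# In-memory toy table standing in for the `titles` table from the post
# (rows are illustrative, not the real employees schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE titles (emp_no INTEGER, title TEXT)")
rows = [(10001, "Engineer"), (10002, "Engineer"), (10003, "Staff"),
        (10004, "Senior Engineer"), (10005, "Engineer"), (10006, "Staff")]
conn.executemany("INSERT INTO titles VALUES (?, ?)", rows)

# SQLite's group_concat plays the role of MySQL's GROUP_CONCAT; the
# example list is cut down to 10 entries in Python instead of with
# SUBSTRING_INDEX.
query = """
SELECT COUNT(*) AS nb, title AS datas, group_concat(emp_no) AS examples
FROM titles
GROUP BY datas
ORDER BY nb DESC
"""
for nb, datas, examples in conn.execute(query):
    sample = ",".join(examples.split(",")[:10])
    print(nb, datas, sample)
```

The shape of the output mirrors the result table above: one line per group, most frequent group first, with a comma-separated sample of ids.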

Thursday 25 April 2013

Create a Website in Python in 30sec

With a small one-file lib called BottlePy and nothing else but a standard Python install, you can create a website in less than 30 seconds.

  • Download the single-file bottle.py from the bottle.py page on GitHub
  • Create a Python script example.py
  • Put bottle.py in the same folder
  • Copy-paste the HelloWorld example
from bottle import route, run, template

@route('/hello/:name')
def index(name='World'):
    return template('Hello {{name}}!', name=name)

run(host='localhost', port=8080)
 


Bottle can even work with an Apache server in front of it as a WSGI app.
Why use Tk, GTK or Qt for your script's UI when you can have a lightweight multiplatform web UI in 4 lines of code?
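The reason Bottle can sit behind Apache is that a Bottle app is just a WSGI application: a callable taking an environ dict and a start_response callback. Here is a hedged, framework-free sketch of that interface (the app below is invented for illustration, not Bottle's internals):

```python
# A WSGI application is just a callable: it receives the request
# environment and a start_response callback, and returns the body
# as an iterable of bytes. Bottle's run() wraps exactly this shape.
def application(environ, start_response):
    # Take the last path segment as the name, defaulting to "World".
    name = environ.get("PATH_INFO", "/").rstrip("/").split("/")[-1] or "World"
    body = ("Hello %s!" % name).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# To serve it locally with only the standard library (blocking call,
# so left commented):
# from wsgiref.simple_server import make_server
# make_server("localhost", 8080, application).serve_forever()
```

Apache's mod_wsgi, or any WSGI server, can call such an `application` object directly.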


Tuesday 23 April 2013

How the MySQL query cache can kill your performance

query_cache_size is a tricky tricky variable to tune.

Here is how the Query cache works in 4 lines:
  • For every SELECT, MySQL looks in the cache for an entry; if one is found it is returned immediately
  • If there is no entry for a query, the text of the SELECT is stored along with the list of tables involved and the result
  • For every UPDATE/INSERT/DELETE on a table, MySQL deletes from the cache every query that uses this table
  • When a thread is looking something up in the cache, nobody else can
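Those four rules can be sketched as a toy model in Python (a simplified illustration, not MySQL's actual implementation; the single lock stands in for "nobody else can look at the same time"):

```python
import threading

class ToyQueryCache:
    """Naive model of the MySQL query cache: keyed by the exact SELECT
    string, invalidated wholesale for every table touched by a write."""
    def __init__(self):
        self.lock = threading.Lock()   # rule 4: one thread at a time
        self.entries = {}              # select text -> (tables, result)

    def select(self, sql, tables, run_query):
        with self.lock:
            if sql in self.entries:            # rule 1: hit, return at once
                return self.entries[sql][1]
        result = run_query()                   # miss: actually run the query
        with self.lock:
            self.entries[sql] = (set(tables), result)   # rule 2: store it
        return result

    def write(self, table):
        with self.lock:                        # rule 3: drop every cached
            self.entries = {q: v for q, v in self.entries.items()
                            if table not in v[0]}   # query using this table
```

With 17 writes per second on the same tables, rule 3 fires constantly, so the lock in rules 1 and 4 is taken over and over for entries that never survive long enough to be reused.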

I have seen that the cost of those cache lookups, when query_cache_size > 0, is around 15% more CPU, but it can be a LOT more than that.
In my case query_cache_size was set to 512MB and was not a problem for a month; then the system reached a critical point where very small queries were spending 20x more time waiting for their turn to look into the cache than executing the query itself, even when the query was in the cache.

The system is a SugarCRM database for a big company; between 9am and 5pm there are around 1M database operations per hour. In seconds, that is around 300 selects/s and 17 updates/s. The problem is that with 17 updates per second the cache is invalidated almost immediately from a front-end perspective.
I changed this value to 16MB yesterday, and with even more operations per hour the server is using 3 times less CPU, from 100% down to 30%!! The average response time of the front-end web pages dropped from 1500ms to 300ms; people were spending less time waiting and more time working, which is why the number of operations rose that day.
I will continue to monitor the CPU use and the average response time, and next week I will try to turn the query cache off completely to see if there are any further gains.

Conclusions:
The MySQL query cache is really fantastic for a server with long selects on a database that is not often updated, but even in that case it is usually better to do the caching at the application layer, with memcached for example.
The MySQL query cache is horrible if you have a high-concurrency system with a lot of updates; in that case, test with a small value (16MB for example) or with no cache at all.
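A hedged sketch of what "caching at the application layer" can look like, here with a plain in-process dict plus a TTL instead of an actual memcached client (the function and timings are invented for the example):

```python
import time

_cache = {}  # key -> (expires_at, value)

def cached(ttl_seconds):
    """Tiny TTL memoizer standing in for a memcached get/set pair."""
    def decorator(fn):
        def wrapper(*args):
            key = (fn.__name__, args)
            hit = _cache.get(key)
            now = time.monotonic()
            if hit and hit[0] > now:
                return hit[1]                   # fresh entry: skip the query
            value = fn(*args)
            _cache[key] = (now + ttl_seconds, value)
            return value
        return wrapper
    return decorator

@cached(ttl_seconds=60)
def count_titles(title):
    # Stand-in for an expensive SELECT COUNT(*) against the database.
    count_titles.db_hits += 1
    return len(title)  # dummy result for the illustration
count_titles.db_hits = 0
```

Unlike the MySQL query cache, entries here expire on a schedule you choose instead of being invalidated on every write, which is exactly why this approach scales better under heavy updates.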

Basic Data Profiling

As a DBA I am very often asked to provide profiling information from the database. These results need to be understandable by non-technical people.

The basic query
SELECT COUNT(*) AS Nb, the_field AS PrettyName
FROM the_table
GROUP BY the_field;
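For completeness, the basic query can be tried with nothing but Python's sqlite3 module (toy data; the table and field names mirror the placeholder names in the query above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE the_table (the_field TEXT)")
conn.executemany("INSERT INTO the_table VALUES (?)",
                 [("red",), ("blue",), ("red",), ("red",), ("blue",)])

# The basic profiling query: one row per distinct value, with its count.
basic = """
SELECT COUNT(*) AS Nb, the_field AS PrettyName
FROM the_table
GROUP BY the_field
"""
for nb, pretty in conn.execute(basic):
    print(nb, pretty)
```

Each distinct value of the_field comes back with its record count, which is exactly the kind of summary non-technical people can read at a glance.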