[Phpmyadmin-devel] Some bugs in master
rouslan at placella.com
Thu Jul 14 12:28:04 CEST 2011
On Thu, 2011-07-14 at 11:14 +0300, Tyron Madlener wrote:
> 2011/7/12 Michal Čihař <michal at cihar.com>:
> > Hi
> > You again missed reply to list, moving discussion back there :-).
> > Dne Tue, 12 Jul 2011 18:48:57 +0300
> > Tyron Madlener <tyronx at gmail.com> napsal(a):
> >> On Tue, Jul 12, 2011 at 3:49 PM, Michal Čihař <michal at cihar.com> wrote:
> >> > Hi
> >> >
> >> > Dne Tue, 12 Jul 2011 13:08:37 +0300
> >> > Tyron Madlener <tyronx at gmail.com> napsal(a):
> >> >
> >> >> And one more suggestion:
> >> >>
> >> Counting rows in big tables that use CSV as engine (such as the
> >> general_log) seems very slow. Maybe rows should not be counted
> >> automatically for CSV tables and only counted upon user request.
> >> In my test I counted 36k rows on the demo server, which takes around
> >> 250ms; now imagine the general_log running all day. You will have
> >> 1M+ rows, which then require ~6-8 seconds to count.
> >> >
> >> > There is already similar logic for InnoDB or views, so only another
> >> > engine should be added here.
> >> Do you know in which file/line this is?
> > It should be libraries/display_tbl.lib.php somewhere near usage of
> > MaxExactCount.
> The code there is seriously odd. $unlim_num_rows seems to be the
> total row count, which, from what I can see, is calculated in
> There it calls PMA_Table::countRecords($db, $table) without checking
> for views or InnoDB. In countRecords() in Table.class.php I can see it
> doesn't calculate the count for views, or only up to a limit of
> But I don't see any limit being applied when the table engine is InnoDB.
> Either way, for limiting the count on CSV engine tables, I guess that
> should be done in countRecords() in Table.class.php?
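[Editor's note: the engine check discussed above could be sketched as
follows. This is hypothetical illustration code, not the actual
phpMyAdmin implementation; the function name pma_should_count_exactly()
and the hard-coded engine list are assumptions, with MaxExactCount
standing in for the config setting of the same name.]

```php
<?php
// Hypothetical sketch: decide whether an exact COUNT(*) is worth running,
// based on the storage engine and the approximate row count reported by
// SHOW TABLE STATUS. Not the real countRecords() from Table.class.php.
function pma_should_count_exactly($engine, $approx_rows, $max_exact_count)
{
    // Engines where counting every row is expensive: InnoDB keeps no
    // cached exact count, and CSV tables must be scanned line by line.
    $slow_engines = array('InnoDB', 'CSV');

    if (in_array($engine, $slow_engines, true)) {
        // Only count exactly while the table is still reasonably small.
        return $approx_rows <= $max_exact_count;
    }

    // MyISAM and similar engines store an exact row count in their
    // metadata, so an exact count is effectively free.
    return true;
}

// A large CSV general_log would skip the exact count, while a MyISAM
// table of the same size would not need to.
var_dump(pma_should_count_exactly('CSV', 1000000, 20000));    // bool(false)
var_dump(pma_should_count_exactly('MyISAM', 1000000, 20000)); // bool(true)
```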
> >> Next suggestion :D
> >> Just saw in header_scripts.inc.php that codemirror.js and mysql.js are
> >> included globally. Isn't that a bit overkill? Not every page (like my
> >> status page) requires codemirror.
> > I don't think it hurts that much (browser should have that file in
> > cache), generally I was too lazy to figure out all the places where an
> > SQL box needing highlighting might appear.
> browser each time, adding delay. Also it still requires its own GET
> request, returning 304. Chrome seems to even skip the GET requests if
> the page is not refreshed, but Firefox 5 doesn't do that in my tests.
> delay introduced by this gets quite noticeable. So I would like to
> keep the amount of loaded js files as small as possible by removing
> what is not required:
> - codemirror.js + mysql.js should only go where it's needed (-2)
> - load chart export on demand (-3)
> - load monitor js code on demand (-2.5)
> - We could merge always included files into one: functions.js, jquery,
> jquery.ui, jquery.qtip (-3)
No way. Merging jquery with functions.js may save the users a few
milliseconds per page load, but will add hours of hacking to the
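[Editor's note: the first item on the list, loading codemirror.js only
where it is needed, could look roughly like this. This is a hypothetical
sketch, not the actual header_scripts.inc.php code; the function name
pma_scripts_for_page() and the page identifiers are made up for
illustration.]

```php
<?php
// Hypothetical per-page script selection, instead of including the
// SQL-highlighting files unconditionally for every page.
function pma_scripts_for_page($page)
{
    // Core files that genuinely every page needs.
    $scripts = array('jquery.js', 'jquery.ui.js', 'functions.js');

    // Only pages that actually render an SQL editor get the highlighter.
    // The list of page names here is illustrative.
    $needs_codemirror = array('sql', 'tbl_sql', 'db_sql');
    if (in_array($page, $needs_codemirror, true)) {
        $scripts[] = 'codemirror.js';
        $scripts[] = 'mysql.js';
    }

    return $scripts;
}

// The status page stays light: no codemirror.js, no mysql.js.
print implode(', ', pma_scripts_for_page('server_status')) . "\n";
```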
> Then the amount of loaded files would already be halved, in the case
> of the status page.
> Checking with Chrome's devtools, it takes around 4 seconds to load the
> status page. With js disabled it takes around 2.3 seconds. So the js
> accounts for almost half the loading time (and CSS sprites could
> probably save us half of that 2.3s).
> With the js files reduced, the not immediately needed js code loaded
> only on demand, and CSS sprites, I bet we could get the loading time
> down to 2 seconds. 4s is way too much imo.