I take a very aggressive approach here, using dynamic SQL as brute force:
SET group_concat_max_len = 1024 * 1024 * 100;
SELECT CONCAT('SELECT * FROM (',GROUP_CONCAT(CONCAT('SELECT ',QUOTE(tb),' Tables_in_database,
COUNT(1) "Number of Rows" FROM ',db,'.',tb) SEPARATOR ' UNION '),') A;')
INTO @sql FROM (SELECT table_schema db,table_name tb
FROM information_schema.tables WHERE table_schema = DATABASE()) A;
PREPARE s FROM @sql; EXECUTE s; DEALLOCATE PREPARE s;
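If you want to sanity-check the generated UNION statement before executing it, you can look at the @sql user variable first (an optional extra step, not part of the sequence above):
-- Optional: inspect the dynamically built query before running it
SELECT @sql\G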
Example: here is what I get in my test database:
mysql> use test
Database changed
mysql> SET group_concat_max_len = 1024 * 1024 * 100;
Query OK, 0 rows affected (0.00 sec)
mysql> SELECT CONCAT('SELECT * FROM (',GROUP_CONCAT(CONCAT('SELECT ',QUOTE(tb),' Tables_in_database,
'> COUNT(1) "Number of Rows" FROM ',db,'.',tb) SEPARATOR ' UNION '),') A;')
-> INTO @sql FROM (SELECT table_schema db,table_name tb
-> FROM information_schema.tables WHERE table_schema = DATABASE()) A;
Query OK, 1 row affected (0.00 sec)
mysql> PREPARE s FROM @sql; EXECUTE s; DEALLOCATE PREPARE s;
Query OK, 0 rows affected (0.00 sec)
Statement prepared
+--------------------+----------------+
| Tables_in_database | Number of Rows |
+--------------------+----------------+
| biblio | 3 |
| biblio_old | 7 |
| dep | 5 |
| e | 14 |
| emp | 4 |
| fruit | 12 |
| fruit_outoforder | 12 |
| nums_composite | 0 |
| nuoji | 4 |
| prod | 3 |
| prodcat | 6 |
| test2 | 9 |
| worktable | 5 |
| yoshi_scores | 24 |
+--------------------+----------------+
14 rows in set (0.00 sec)
Query OK, 0 rows affected (0.00 sec)
mysql>
GIVE IT A TRY !!!
CAVEAT: If all the tables are MyISAM, this finishes very quickly. If all the tables are InnoDB, every single table actually gets counted, which can be brutal and unrelenting for very large InnoDB tables.
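If the schema is mostly InnoDB and you only want a rough idea of how big the tables are before paying for full counts, a quick sketch like this (my addition, not part of the recipe above) can help; it relies only on what information_schema already tracks:
-- See which storage engines are in play and the row counts MySQL already keeps.
-- table_rows is exact for MyISAM but only an estimate for InnoDB,
-- so treat approx_rows as a rough guide, not a real count.
SELECT engine,
       COUNT(*)        AS tables,
       SUM(table_rows) AS approx_rows
FROM information_schema.tables
WHERE table_schema = DATABASE()
GROUP BY engine;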