Summary of the problems of Piwik at large scale
Piwik works well for up to 100k-500k daily pageviews. If your website receives more traffic than this, you might start experiencing some performance issues, listed below.
EDIT Nov 2011: issues 1) and 2) below are now dealt with, as of Piwik 1.7, by the new misc/cron/archive.php script, which replaces archive.sh
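A minimal cron entry for the new script might look like the following; the paths, user, and --url value are examples only and need to be adapted to your own install:

```shell
# Run the 1.7-era archiving script hourly instead of archive.sh.
# /var/www/piwik and the URL are placeholders for your installation.
5 * * * * www-data /usr/bin/php /var/www/piwik/misc/cron/archive.php --url=http://example.org/piwik/ > /var/log/piwik-archive.log
```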
1) archive.sh memory usage hitting the PHP limit: archiving will probably first throw the infamous "Archiving memory exhausted" error. See discussion and possible solutions in http://dev.piwik.org/trac/ticket/766
2) archive.sh execution time: it is possible that we hit MySQL limits (or other limitations or bugs) in the system, which result in very long archive.sh runs. For example, MySQL can behave badly when the log table becomes very large and its INDEXes bloated; a GROUP BY over a 15M-row set will take far more than 1s. See ticket
3) High server load while Piwik is tracking data: MySQL InnoDB (as you point out) seems relatively good about this, but you could hit some limit there, since I've never seen Piwik with this much traffic.
A solution is already planned for the next few months: write tracking requests to a queue, then bulk-import them from the queue into MySQL.
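The queue-plus-bulk-import idea can be sketched as follows. This is only an illustration of the technique, not Piwik's implementation: the post names no queue backend, so a simple in-process deque stands in for it, sqlite3 stands in for MySQL so the sketch is self-contained, and the table and column names are hypothetical.

```python
import sqlite3
from collections import deque

# Stand-in for a real queue backend (Redis, a log file, etc.);
# the post does not specify which one Piwik would use.
queue = deque()

def track(request):
    """Instead of one INSERT per tracking request, just enqueue it."""
    queue.append(request)

def flush(conn, batch_size=1000):
    """Drain the queue and bulk-insert the requests in one transaction."""
    batch = []
    while queue and len(batch) < batch_size:
        batch.append(queue.popleft())
    # One multi-row insert per batch is far cheaper than a round trip
    # and transaction per pageview. Table/column names are illustrative.
    conn.executemany(
        "INSERT INTO log_visit (visitor_id, url) VALUES (?, ?)", batch
    )
    conn.commit()
    return len(batch)

# sqlite3 stands in for MySQL here so the sketch runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log_visit (visitor_id TEXT, url TEXT)")
for i in range(5):
    track((f"visitor{i}", f"/page/{i}"))
flushed = flush(conn)
```

The design win is that tracking requests return as soon as the enqueue succeeds, while the database sees a few large batched writes instead of many tiny ones.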
4) Reaching the MySQL performance threshold for large-scale analytics
Even if we improve all the rest, at some point the "MySQL" factor will still prevent good performance for the largest Piwik servers (millions of pageviews per day). We will investigate alternatives such as InfiniDB, MongoDB, HBase, etc.