That actually depends on a large number of variables.
It's a combination of product architecture, hardware/software specs, the platform used, product configuration, the objects handled by the product, the methods used, and so on.
"Most" of the time, the limiting factor is bandwidth related, and that can be anything from system bus, memory, hard drives, network, etc..
It's not uncommon, however, to hit a product's hard limits when it is pushed far enough; the version (age) of the product and the way its methods are being used also play a part.
Bad or inefficient coding will generally bring things down much faster, whether in the program itself or in user space.
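As a minimal, hypothetical illustration of that point (the numbers and data are made up for the sketch): the exact same membership check, written two ways, can differ by orders of magnitude long before any hardware limit is reached.

```python
# Sketch: inefficient vs efficient code doing the same job.
# A membership test against a list is a linear scan; against a set
# it is a hash lookup. Same result, very different cost.
import timeit

data_list = list(range(100_000))
data_set = set(data_list)

# Look up the worst-case element 200 times each way.
slow = timeit.timeit(lambda: 99_999 in data_list, number=200)  # O(n) scan per call
fast = timeit.timeit(lambda: 99_999 in data_set, number=200)   # O(1) lookup per call

print(f"list: {slow:.4f}s  set: {fast:.4f}s")
```

On any box the set version wins by a wide margin, which is the point: the "limit" you hit first is often the code, not the platform.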
Most published limits are merely the specs of the compilers used, and theoretical at that, unless you run on the same framework on which the product was developed/tested (HCL). Even then, a minor revision of things will probably deliver different results in terms of speed and/or bandwidth consumption.
I would guess that as long as it isn't bogging down the box your DB is running on and isn't eating up memory due to the number of concurrent sessions, you should be safe; check out this link here
(Assuming this also applies to the PHP/MySQL question.)
Nope, it's what you actually do with the DB. For example, if you're serving images from a DB, it requires very little framework overhead and few queries, but lots of bandwidth in terms of network.
If, on the other hand, you take a relational approach and do lots of subqueries, it requires far more computational power and disk access than simply delivering the contents of a table/record.
So it all depends on the intended use before you can make such an assessment.
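To make the subquery-vs-plain-delivery point concrete, here is a minimal sketch using Python's built-in sqlite3 and a made-up users/orders schema (all names here are hypothetical, not from the question). Both queries return identical rows, but the first forces the engine to run a correlated subquery per user row, while the second lets it do one join pass:

```python
# Sketch: the same result via a correlated subquery vs a single JOIN.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL)")
cur.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ann"), (2, "bob")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 10.0), (2, 1, 5.0), (3, 2, 7.5)])

# Relational approach with a subquery: extra work evaluated per user row.
sub = cur.execute(
    "SELECT name, (SELECT SUM(total) FROM orders WHERE user_id = users.id) "
    "FROM users ORDER BY name").fetchall()

# Same rows with one JOIN + GROUP BY: a plan the engine can optimize as a whole.
join = cur.execute(
    "SELECT u.name, SUM(o.total) FROM users u "
    "JOIN orders o ON o.user_id = u.id "
    "GROUP BY u.name ORDER BY u.name").fetchall()

print(sub)   # [('ann', 15.0), ('bob', 7.5)]
print(join)  # identical rows, cheaper plan
```

Serving a blob straight out of one table/record sidesteps all of this and shifts the cost to network bandwidth instead.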