I remember only one thing: a desktop computer nowadays does about 2^27 operations per second. The other numbers are easy to derive from that.
When quickly estimating a runtime f(n), I tend to work with lg f(n) and check whether it exceeds 27. For your Dijkstra example: there are at most 2^25 nodes, so the lg factor from the heap is at most 25 <= 2^5, and there are at most 2^26 edges, so the runtime is at most 2^(26+5) = 2^31, which is over the budget. Of course, this is a back-of-the-envelope calculation that you can routinely do in a few seconds, not a precise estimate.
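For what it's worth, the rule of thumb can be written down as a few lines of code. This is just a sketch of the estimate above (the names and the 2^27 budget are taken from this answer, nothing more):

```python
import math

# Rule of thumb: a desktop does roughly 2^27 simple operations per second,
# so an algorithm fits in about one second when lg(f(n)) <= 27.
OPS_PER_SECOND_LOG2 = 27

def fits_in_one_second(log2_runtime: float) -> bool:
    """True if 2^log2_runtime operations fit in roughly one second."""
    return log2_runtime <= OPS_PER_SECOND_LOG2

# Dijkstra with a binary heap runs in O(E log V).
# With V <= 2^25 nodes and E <= 2^26 edges:
log2_V = 25
log2_E = 26
log2_runtime = log2_E + math.log2(log2_V)  # exact: 26 + lg 25 ~ 30.6
# The hand estimate rounds lg V = 25 up to 2^5, giving 2^(26+5) = 2^31;
# either way the result is over the 2^27 budget.
```

Note that the hand version rounds 25 up to the next power of two so the whole estimate stays a sum of small integers in your head; the precise value (about 2^30.6) leads to the same conclusion.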
Edit: Previous version had nodes/edges in the Dijkstra example swapped.