
NickPollard | 10 years ago

This is, in general, undecidable. In fact, it's one of the most famous results in computer science, known as the Halting Problem[1].

Alan Turing showed that, for programs run on a particular definition of a computer (a 'Turing machine') that is equivalent in power to modern computers, there is no general method for determining whether a given program, run with a given input, will halt or run forever. Your question about how long an algorithm will run is a harder version of the same problem: if you could compute a running time, you could also answer whether the program halts at all.
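A toy sketch of Turing's diagonal argument, in Python. The names (`make_contrarian`, `naive_decider`) are mine, purely for illustration: given any claimed halting decider, you can build a program that does the opposite of whatever the decider predicts about it, so no decider can be right about every program.

```python
def make_contrarian(halts):
    """Given a claimed decider halts(program, input) -> bool,
    build a program that defeats it."""
    def g():
        if halts(g, None):
            while True:      # decider said we halt, so loop forever
                pass
        return "halted"      # decider said we loop, so halt immediately
    return g

def naive_decider(program, arg):
    return False  # claims every program runs forever

g = make_contrarian(naive_decider)
result = g()  # g actually halts, contradicting the decider's claim
```

A decider that answered True instead would be defeated the other way: its contrarian program would loop forever. Turing's proof shows this trap exists for every possible decider, not just naive ones.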

If you confine yourself to particular instances of certain algorithms, then you can often do what you want. The field of computational complexity is focused on the performance of algorithms, though the common measures[2] (big/little O, Omega, and Theta) describe asymptotic complexity; that is, how the cost grows as inputs become arbitrarily large, rather than an exact running time for a specific input.
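For a concrete instance of that kind of analysis, here's a small Python sketch (my own example, not from the comment) counting worst-case comparisons for two search algorithms. Linear search is O(n); binary search on sorted input is O(log n), and the gap widens as n grows:

```python
def linear_search(xs, target):
    """Scan left to right; returns number of comparisons (O(n) worst case)."""
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

def binary_search(xs, target):
    """Halve a sorted range each step; returns comparisons (O(log n) worst case)."""
    steps, lo, hi = 0, 0, len(xs) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            break
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

xs = list(range(1024))
# Worst case (absent target): linear inspects all 1024 elements,
# binary needs only about log2(1024) = 10-11 probes.
print(linear_search(xs, -1), binary_search(xs, -1))
```

That factor-of-100 gap at n = 1024 is exactly what big-O notation captures: it tells you how the two curves diverge, not the millisecond cost of any single run.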

[1] https://en.wikipedia.org/wiki/Halting_problem

[2] https://en.wikipedia.org/wiki/Big_O_notation
