Time complexity gives an indication of how long an algorithm will take to complete its task. However, it is merely an indication; two algorithms with the same time complexity won't necessarily take the same amount of time to complete. For instance, comparing two primitive values is a constant-time operation. Swapping those values is also a constant-time operation; however, a swap requires more individual operations than a comparison does, so a swap will take longer even though the time complexity is exactly the same.
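A minimal sketch of that point (the function names here are just for illustration): both operations below are O(1), but the swap performs more individual steps than the comparison.

```python
def compare(a, b):
    # One comparison: a single constant-time operation.
    return a < b

def swap(values, i, j):
    # A swap reads two elements and writes two elements back:
    # more work than a single comparison, yet still O(1).
    values[i], values[j] = values[j], values[i]

nums = [3, 1]
if compare(nums[1], nums[0]):
    swap(nums, 0, 1)
print(nums)  # the pair ends up sorted: [1, 3]
```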
The time complexity of an algorithm describes how its running time grows as a function of the input. Usually, the algorithm with the best average time will be selected for a task, unless it can be shown that the inputs consistently fall into some special class, in which case an algorithm that is faster on that class of inputs may be chosen instead.
For example, given the option of a merge sort or a bubble sort, a programmer might be tempted to say that the merge sort would be faster, on average, than a bubble sort. It so happens that they are correct; the merge sort is probably the better of the two options.
However, if the data is usually "almost" sorted and rarely has out-of-place elements, then bubble sort, despite being the inferior algorithm in general, can be faster than merge sort even for large data sets: merge sort's fixed O(n log n) complexity is better than bubble sort's O(n^2) average-case complexity, but worse than bubble sort's O(n) best case.
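A sketch of that best case: bubble sort with the standard early-exit optimization stops after one extra pass once no swaps occur, so nearly sorted data is handled in close to O(n) time, while merge sort always does its full O(n log n) work.

```python
def bubble_sort(items):
    # Bubble sort with early exit: stop as soon as a full pass
    # makes no swaps, since the list must then be sorted.
    items = list(items)
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps this pass: done early
            break
    return items

# "Almost" sorted input: only one pair is out of place, so the
# sort finishes after two passes instead of n - 1.
print(bubble_sort([1, 2, 4, 3, 5]))  # -> [1, 2, 3, 4, 5]
```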
Thus, an informed programmer would look at all the possible algorithms available to solve a task, and select the one that yields the best results for the majority of test cases. This is similar to how a business manager might compare the overall cost of acquisition with the cost of ownership over time to determine the best solution.
Check out this link for the same question: http://www.daniweb.com/forums/thread13488.html It has a pretty good answer to this.
What do you mean by time and space complexity, and how do you represent these complexities?
Check out this site by TopCoder:
http://www.topcoder.com/tc?module=Static&d1=tutorials&d2=complexity1
Just identify the number of operations your algorithm performs with respect to the number of inputs. Keep in mind that the number of inputs should tend to infinity.
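A sketch of that counting approach: instrument a simple linear search and watch the operation count grow in direct proportion to the input size n (the function name is just for illustration).

```python
def linear_search_ops(items, target):
    # Count one operation per element comparison; in the worst
    # case (target absent) every element is compared, so the
    # count equals len(items) -- i.e. the algorithm is O(n).
    ops = 0
    for item in items:
        ops += 1
        if item == target:
            return ops
    return ops

for n in (10, 100, 1000):
    data = list(range(n))
    # Worst case: the target is absent, so all n comparisons run.
    print(n, linear_search_ops(data, -1))
```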
The algorithm will have both a constant time complexity and a constant space complexity: O(1)
Time complexity is 2^57, and space complexity is 2^(n+1).
Dijkstra's original algorithm (published in 1959) has a time-complexity of O(N*N), where N is the number of nodes.
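A sketch of that array-based O(N*N) formulation: each of the N iterations scans all N nodes to find the closest unvisited one (no priority queue). The graph representation is an assumption here: an adjacency matrix where INF marks "no edge".

```python
INF = float("inf")

def dijkstra(matrix, source):
    # Array-based Dijkstra: O(N) iterations, each doing an O(N)
    # scan plus an O(N) relaxation, for O(N*N) overall.
    n = len(matrix)
    dist = [INF] * n
    dist[source] = 0
    visited = [False] * n
    for _ in range(n):
        # O(N) scan for the nearest unvisited node.
        u = min((i for i in range(n) if not visited[i]),
                key=lambda i: dist[i], default=None)
        if u is None or dist[u] == INF:
            break  # remaining nodes are unreachable
        visited[u] = True
        for v in range(n):  # O(N) relaxation of u's edges
            if matrix[u][v] != INF and dist[u] + matrix[u][v] < dist[v]:
                dist[v] = dist[u] + matrix[u][v]
    return dist

g = [[0,   4,   1,   INF],
     [4,   0,   2,   5],
     [1,   2,   0,   8],
     [INF, 5,   8,   0]]
print(dijkstra(g, 0))  # -> [0, 3, 1, 8]
```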
Time complexity and space complexity.
O(nm)
Time complexity and space complexity. More specifically, how well an algorithm will scale when given larger inputs.
Time complexity is a function whose value depends on the input and on the program's algorithm, and it gives us an idea of how long the program would take to execute.
Finding the time complexity of an algorithm is better than measuring the actual running time for a few reasons:

# Time complexity is unaffected by outside factors; running time is determined as much by other running processes as by algorithm efficiency.
# Time complexity describes how an algorithm will scale; running time can only describe how one particular set of inputs causes the algorithm to perform.

Note that there are downsides to time complexity measurements:

# Users/clients do not care about how efficient your algorithm is, only how fast it seems to run.
# Time complexity is ambiguous; two different O(n^2) sort algorithms can have vastly different running times for the same data.
# Time complexity ignores any constant-time parts of an algorithm. An O(n) algorithm could, in theory, include a constant ten-second section, which isn't normally shown in Big O notation.
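A sketch of the ambiguity point above: bubble sort and selection sort are both O(n^2) comparison sorts, yet they perform very different amounts of work on the same data. Counting swaps makes the gap visible (the counting functions are just for illustration).

```python
def bubble_swaps(items):
    # Naive bubble sort, counting swaps: reverse-sorted input
    # forces the worst case of n*(n-1)/2 swaps.
    items, swaps = list(items), 0
    n = len(items)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swaps += 1
    return swaps

def selection_swaps(items):
    # Selection sort, counting swaps: at most one swap per
    # position, so never more than n - 1 swaps in total.
    items, swaps = list(items), 0
    n = len(items)
    for i in range(n - 1):
        m = min(range(i, n), key=lambda k: items[k])
        if m != i:
            items[i], items[m] = items[m], items[i]
            swaps += 1
    return swaps

data = list(range(50, 0, -1))  # worst case: reverse-sorted
print(bubble_swaps(data), selection_swaps(data))  # -> 1225 25
```

Both are O(n^2) in comparisons, but on this input bubble sort performs 1225 swaps to selection sort's 25.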
O(2^n)
The usual notation for an algorithm's time complexity is called Big O notation. If an algorithm is O(1), it is a constant-time algorithm, the best possible class for speed. As the complexity grows toward O(∞) (a.k.a. an infinite loop), the algorithm takes progressively longer to complete; an algorithm of O(∞) would never complete.
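A sketch of how those classes diverge: evaluating a few common complexity functions at increasing n shows why O(1) is best and why higher classes quickly become impractical (the helper function is just for illustration).

```python
import math

def growth(n):
    # Evaluate a few common complexity functions at input size n,
    # rounding the logarithmic ones for readability.
    return {
        "O(1)": 1,
        "O(log n)": round(math.log2(n)),
        "O(n)": n,
        "O(n log n)": round(n * math.log2(n)),
        "O(n^2)": n * n,
    }

for n in (8, 64, 1024):
    print(n, growth(n))
```

At n = 1024, O(log n) costs only 10 steps while O(n^2) already costs over a million; exponential classes like O(2^n) diverge far faster still.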
"Running Time" is essentially a synonym of "Time Complexity", although the latter is the more technical term. "Running Time" is confusing, since it sounds like it could mean "the time something takes to run", whereas Time Complexity unambiguously refers to the relationship between the time and the size of the input.