Computational Complexity
1. ## Computational Complexity

Given two implementations of an algorithm, how do I determine the relative
computational complexity of each?

Are there tools that can determine the relative performance of two
algorithms or implementations?

Joris van Lier


2. ## Re: Computational Complexity

"Joris van Lier" <whizzrd@xxxxxx> wrote in message
news:#2nq9jgyIHA.2064@xxxxxx

> Given two implementations of an algorithm how do I determine the relative
> computational complexity of each?
>
> Are there tools that can determine the relative performance of two
> algorithms or implementations?
>
> Joris van Lier
Let me explain this a bit more.

I'm working on a process where multiple copies of a piece of data may exist.
New data is supplied to the process together with a location where it should
be stored; the process stores it in the designated location and replaces all
other copies with links to the new location.

For the purposes of this example I'll assume that the storage is the
filesystem, the data is contained in files, all files are of equal length, a
location is a path, and all locations where data can be stored are known
beforehand.

With my limited math skills, and assuming the program runs as a single
thread, I tried to work this out as follows:

D = data unit (file)
L = number of locations
F = average number of files in a location
B = average number of bytes in a file

O(f(D)) = L * F * B

Is this correct?
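To show where I get the product from, here is a minimal single-threaded sketch (Python, with made-up names, purely to illustrate the counting; the real process isn't shown): visiting every byte of every file in every location gives L * F * B units of work.

```python
def traversal_steps(locations):
    """Count the work of visiting every byte of every file in every location."""
    steps = 0
    for files in locations:        # L locations
        for data in files:         # ~F files per location
            for _byte in data:     # ~B bytes per file
                steps += 1         # one unit of work per byte
    return steps

# 3 locations * 2 files * 4 bytes = 24 units of work
locations = [[b"abcd", b"abcd"] for _ in range(3)]
print(traversal_steps(locations))  # prints 24
```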

How do I account for multiple concurrent threads of execution?

--
Joris van Lier


3. ## Re: Computational Complexity

On Jun 9, 5:03 am, "Joris van Lier" <whiz...@xxxxxx> wrote:

> [snip]
>
> O(f(D)) = L * F * B
>
> is this correct?
>
> How do I account for multiple concurrent threads of execution?
It looks like you were working through the calculation of the complexity,
right? What is it that you want to do to the data? The only way to work out
how long your process will take to run is to understand what it needs to do
and how it will do it. Can you give us a better description?


4. ## Re: Computational Complexity

On Jun 9, 3:15 am, "Joris van Lier" <whiz...@xxxxxx> wrote:

> Given two implementations of an algorithm how do I determine the relative
> computational complexity of each?
You do a Big-Oh analysis of the algorithms.

>
> Are there tools that can determine the relative performance of two
> algorithms or implementations?
Well, you can use a profiler to assist in determining the runtime
differences of the algorithms. However, you must hold the problem
size constant to be able to make a reasonable comparison between the
algorithms. That comparison is only valid for the specific problem
size chosen. Given enough information you may be able to make some
general conclusions about all problem sizes, but they won't be as
reliable as doing the Big-Oh analysis.
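For example, holding the problem size constant, a quick sketch like this (Python here just for illustration; the two search functions are stand-ins for your two implementations) times both algorithms on the same fixed input:

```python
import timeit

def linear_search(items, target):      # O(n): scan every element
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(items, target):      # O(log n): requires sorted input
    lo, hi = 0, len(items)
    while lo < hi:
        mid = (lo + hi) // 2
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo if lo < len(items) and items[lo] == target else -1

n = 100_000                            # the fixed problem size
data = list(range(n))
for fn in (linear_search, binary_search):
    t = timeit.timeit(lambda: fn(data, n - 1), number=100)
    print(f"{fn.__name__}: {t:.4f}s for 100 runs at n={n}")
```

Note that whichever wins here wins only at this n; the Big-Oh analysis is what tells you how the gap changes as n grows.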


5. ## Re: Computational Complexity

"Brian Gideon" <briangideon@xxxxxx> wrote in message

Hi Brian, thanks for responding

> On Jun 9, 3:15 am, "Joris van Lier" <whiz...@xxxxxx> wrote:

>> Given two implementations of an algorithm how do I determine the relative
>> computational complexity of each?
>
> You do a Big-Oh analysis of the algorithms.
I know, but... how?

>>
>> Are there tools that can determine the relative performance of two
>> algorithms or implementations?
>
> Well, you can use a profiler to assist in determining the runtime
> differences of the algorithms. However, you must hold the problem
> size constant to be able to make a reasonable comparison between the
> algorithms. That comparison is only valid for the specific problem
> size chosen. Given enough information you may be able to make some
> general conclusions about all problem sizes, but they won't be as
> reliable as doing the Big-Oh analysis.
My first hunch was to measure, but when measuring I would have to make
really sure that the environment is stable in terms of performance and load.
Other processes running at the same time might influence the tests, either
negatively, by lowering the available resources, or positively, by causing
data used by my process to be cached, and the measurement would then not
reflect the real world in which the process will be operating.
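One mitigation I've seen suggested is to repeat each timing several times and keep the minimum, on the theory that background load can only ever slow a run down, never speed it up. A rough Python sketch, purely illustrative:

```python
import timeit

def measure(fn, repeats=5, number=1000):
    """Run fn `number` times per trial, over `repeats` trials; keep the fastest.

    Background processes can only add time, never remove it, so the
    minimum over several trials is the least-contaminated estimate.
    """
    return min(timeit.repeat(fn, repeat=repeats, number=number))

best = measure(lambda: sum(range(100)))
print(f"best of 5 trials: {best:.6f}s per 1000 calls")
```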

Can you recommend any sources for learning Big-Oh analysis for
non-mathematicians?

Joris van Lier


6. ## Re: Computational Complexity

On Jun 9, 12:52 pm, "Joris van Lier" <whiz...@xxxxxx> wrote:

> Can you recommend any sources for learning Big-Oh analysis for
> non-mathematicians ?
>
> Joris van Lier
Google around a bit. Take your time. It can be quite confusing and
complex, so start out simple. Avoid the math in the middle and just
concentrate on the Big-Oh notation of concrete examples. Use the
complexities of known sub-algorithms to help derive the Big-Oh of the
main algorithm. Understand the differences between best, average, and
worst case scenarios. Understand which coding patterns yield these
common complexities: O(1), O(n), O(n^2), O(log(n)), O(n*log(n)). Once
you understand those, it gets easier to combine them.
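To make those patterns concrete, here is a rough sketch (Python, invented examples) of the loop shapes that typically produce each of those complexities:

```python
def constant(items):       # O(1): work independent of input size
    return items[0]

def linear(items):         # O(n): one pass over the input
    total = 0
    for x in items:
        total += x
    return total

def quadratic(items):      # O(n^2): a loop nested inside a loop
    pairs = 0
    for a in items:
        for b in items:
            pairs += 1
    return pairs

def logarithmic(n):        # O(log n): halve the problem each step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def linearithmic(items):   # O(n log n): e.g. comparison sorting
    return sorted(items)

print(quadratic([1, 2, 3]), logarithmic(8))  # prints: 9 3
```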

