I'm asking more about what this means for my code. I understand the concepts mathematically; I just have a hard time wrapping my head around what they mean conceptually. For example, if one were to do an O(1) operation on a data structure, I understand that the number of operations it has to perform won't grow because there are more items. And an O(n) operation would mean that you perform a set of operations on each element. Could somebody fill in the blanks here?

Like what would an O(n²) operation do? And what the heck does it mean if an operation is O(n log(n))? And does somebody have to smoke crack to write an O(x!)?


Current answer

I'll try to write an explanation for a real eight-year-old boy, aside from the technical terms and mathematical notions.

Like what would an O(n²) operation do?

You are at a party with n people, including yourself. How many handshakes does it take for everyone to have shaken hands with everyone else, given that people will probably forget at some point whom they have already shaken hands with?

Note: this approximates to a simplex yielding n(n-1), which is close enough to n².
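
A hypothetical sketch of my own (not part of the original answer) makes the quadratic growth visible: counting the handshakes by brute force takes two nested loops over the guests.

// Hypothetical sketch: count handshakes at a party of n people.
// Everyone shakes hands with everyone else, so the inner check runs
// n * n times: O(n²).
static int CountHandshakes(int n)
{
    int handshakes = 0;
    for (int a = 0; a < n; a++)
    {
        for (int b = 0; b < n; b++)
        {
            if (a != b) handshakes++;
        }
    }
    return handshakes; // n * (n - 1), close enough to n²
}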

And what the heck does it mean if an operation is O(n log(n))?

Your favorite team has won, the players are standing in line, and there are n players on the team. How many handshakes does it take you to shake hands with every player, given that you shake each player's hand multiple times: as many times as there are digits in the number of players n?

Note: this yields n * log n, with the log taken to base 10.
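
Again as a hypothetical sketch of my own: "one handshake per digit, for each of n players" becomes an outer loop over the players with an inner loop over the digits of n, and a number n has roughly log10(n) + 1 digits.

// Hypothetical sketch: shake each of the n players' hands once per
// digit of n. The inner loop runs ~log10(n) times, so the total work
// is O(n log n).
static int CountDigitHandshakes(int n)
{
    int digits = n.ToString().Length; // ~log10(n) + 1
    int handshakes = 0;
    for (int player = 0; player < n; player++)
    {
        for (int d = 0; d < digits; d++)
        {
            handshakes++;
        }
    }
    return handshakes; // n times the number of digits of n
}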

Does somebody have to smoke crack to write an O(x!)?

You are a rich kid with a lot of clothes in your wardrobe: x drawers, one for each kind of clothing, sitting next to each other. The first drawer holds one item, and each drawer holds one more item than the drawer to its left, so you have 1 hat, 2 wigs, ... (x-1) pairs of pants, and then x shirts. Now, how many different outfits can you put together using one item from each drawer?

Note: this example represents how many leaves there are in a decision tree where the number of children equals the depth, which works out to 1 * 2 * 3 * ... * x.
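
A hypothetical sketch of the wardrobe idea (my own code, not the answer's): walking that decision tree visits every outfit, so the recursion below does 1 * 2 * ... * x = x! units of work.

// Hypothetical sketch: drawer i holds i items (1 hat, 2 wigs, ...,
// x shirts). Trying on every possible outfit visits x! combinations.
static int CountOutfits(int x, int drawer = 1)
{
    if (drawer > x) return 1; // one complete outfit assembled
    int outfits = 0;
    for (int item = 0; item < drawer; item++) // drawer i offers i choices
    {
        outfits += CountOutfits(x, drawer + 1);
    }
    return outfits; // 1 * 2 * ... * x = x!
}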

Other answers

To stay true to the question as asked, I'll answer it the way I would answer an 8-year-old kid.

Suppose an ice-cream seller prepares a number of ice creams of different shapes (say, N of them), arranged in a row, and you want to eat the ice cream in the middle.

Case 1: You can eat an ice cream only after you have eaten all the ice creams smaller than it. You will have to eat half of all the prepared ice creams (the input). The answer depends directly on the size of the input; the solution is of order O(N).

Case 2: You can eat the middle ice cream directly.

The solution is O(1).

Case 3: You can eat an ice cream only after you have eaten all the ice creams smaller than it, and each time you eat one, you let another kid (a new kid every time) eat all of his ice creams. The total time taken is N + N + N ... (N/2 times). The solution is O(N²).
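
The three cases are easy to mirror in code. A minimal C# sketch of my own, assuming a no-op Eat helper:

// Hypothetical sketch of the three ice-cream cases for n ice creams.
static void Eat(int iceCream) { /* munch */ }

static void Demo(int n)
{
    // Case 1, O(n): eat every smaller ice cream before the middle one.
    for (int i = 0; i < n / 2; i++) Eat(i);

    // Case 2, O(1): grab the middle one directly.
    Eat(n / 2);

    // Case 3, O(n²): each of your n/2 ice creams lets a new kid eat
    // all n of his, so the work is n + n + ... (n/2 times).
    for (int i = 0; i < n / 2; i++)
    {
        Eat(i);
        for (int kid = 0; kid < n; kid++) Eat(kid);
    }
}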

A lot of this is easy to demonstrate with something non-programming, like shuffling cards.

Sorting a deck of cards by going through the whole deck to find the ace of spades, then going through the whole deck to find the two of spades, and so on, is worst-case n² if the deck is already sorted backwards: you look at all 52 cards 52 times.

In general, the really bad algorithms aren't necessarily intentional; they are usually a misuse of something else, like calling a linear method from inside another method that itself iterates linearly over the same collection.
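
A concrete C# illustration of that kind of accidental misuse (my own hypothetical example): List<T>.Contains is itself a linear scan, so calling it once per element quietly turns an O(n) loop into O(n²).

// Accidental O(n²): Contains scans the list (O(n)) and is called
// once per element (another O(n)).
List<int> items = new List<int> { 1, 2, 3, 4, 5, 6, 7, 12, 543, 7 };
List<int> unique = new List<int>();
foreach (int item in items)
{
    if (!unique.Contains(item)) // O(n) scan inside an O(n) loop
    {
        unique.Add(item);
    }
}
// A HashSet<int> would make the membership test O(1) on average and
// bring the whole loop back to O(n).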

No, an O(n) algorithm does not mean it will perform an operation on each element. Big-O notation gives you a way to talk about the "speed" of your algorithm independent of your actual machine.

O(n) means the time your algorithm takes grows linearly as the input grows. O(n²) means the time your algorithm takes grows with the square of the input size. And so on.

I'll try to explain with simple code examples in C# and JavaScript.

C#

For List<int> numbers = new List<int> {1,2,3,4,5,6,7,12,543,7};

O(1) looks like

return numbers.First();

O(n) looks like

int result = 0;
foreach (int num in numbers)
{
  result += num;
}
return result;

O(n log(n)) looks like

int result = 0;
foreach (int num in numbers)
{
    int index = numbers.Count - 1;
    while (index > 1)
    {
        // yeah, stupid, but couldn't come up with something more useful :-(
        result += numbers[index];
        index /= 2;
    }
}
return result;

O(n²) looks like

int result = 0;
foreach (int outerNum in numbers)
{
    foreach (int innerNum in numbers)
    {
        result += outerNum * innerNum;
    }
}
return result;

O(n!) looks like... well, I'm too tired to come up with anything simple. But I hope you get the general idea?
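
Here's my own stab at filling that gap: a minimal sketch (not the original author's code) that enumerates every permutation of the list, and therefore does n! units of work.

// Hypothetical O(n!) sketch: count every permutation of the list.
static int CountPermutations(List<int> remaining)
{
    if (remaining.Count == 0) return 1; // one full permutation built
    int count = 0;
    for (int i = 0; i < remaining.Count; i++)
    {
        List<int> rest = new List<int>(remaining);
        rest.RemoveAt(i); // fix element i, permute the rest
        count += CountPermutations(rest);
    }
    return count; // n!
}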


JavaScript

For const numbers = [1,2,3,4,5,6,7,12,543,7];

O(1) looks like

numbers[0];

O(n) looks like

let result = 0;
for (const num of numbers){
    result += num;
}

O(n log(n)) looks like

let result = 0;
for (const num of numbers){

    let index = numbers.length - 1;
    while (index > 1){
        // yeah, stupid, but couldn't come up with something more useful :-(
        result += numbers[index];
        index = Math.floor(index / 2);
    }
}

O(n²) looks like

let result = 0;
for (const outerNum of numbers){
    for (const innerNum of numbers){
        result += outerNum * innerNum;
    }
}

This might be too mathematical, but here's my try. (I'm a mathematician.)

If something is O(f(n)), then its running time on n elements will be A * f(n) + B (measured in, say, clock cycles or CPU operations). It's key to understand these constants A and B; they come out of the specific implementation. B essentially represents the "constant overhead" of your operation, e.g. some preprocessing you do that doesn't depend on the size of the collection. A represents the speed of the actual per-item processing.

The key is that you use big-O notation to figure out how well something scales. So those constants don't really matter: if you're trying to figure out how to scale from 10 items to 10000 items, who cares about the constant overhead B? Similarly, other concerns (see below) will certainly outweigh the importance of the multiplicative constant A.

So the real deal is f(n). If f grows not at all with n, e.g. f(n) = 1, then you'll scale fantastically: your running time will always just be A + B. If f grows linearly with n, i.e. f(n) = n, your running time will scale pretty much as well as can be expected: if your users are waiting 10 ns for 10 elements, they'll wait 10000 ns for 10000 elements (ignoring the additive constant). But if it grows faster, like n², then you're in trouble; things will start slowing down way too much when you get larger collections. f(n) = n log(n) is usually a good compromise: your operation can't be so simple as to give linear scaling, but you've managed to cut things down such that it'll scale much better than f(n) = n².
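
To make the constants concrete, here is a small illustration of my own with made-up numbers (A = 2 ns per unit of work, B = 100 ns of setup; both values are purely hypothetical):

// Purely illustrative: runtime = A * f(n) + B, with made-up constants.
static double Runtime(double a, double b, Func<double, double> f, double n)
    => a * f(n) + b;

// With A = 2 ns and B = 100 ns:
//   f(n) = 1    ->  n = 10: 102 ns    n = 10000: 102 ns
//   f(n) = n    ->  n = 10: 120 ns    n = 10000: 20100 ns
//   f(n) = n²   ->  n = 10: 300 ns    n = 10000: 200000100 ns
// The overhead B stops mattering almost immediately; the shape of f
// decides whether you scale.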

In fact, here are a few good examples:

O(1): retrieving an element from an array. We know exactly where it is in memory, so we just go get it. It doesn't matter if the collection has 10 items or 10000; it's still at index (say) 3, so we just jump to location 3 in memory.

O(n): retrieving an element from a linked list. Here, A = 0.5, because on average you'll have to go through half of the linked list before you find the element you're looking for.

O(n²): various "dumb" sorting algorithms. Generally their strategy involves, for each element (n), looking at all the other elements (so times another n, giving n²), then positioning yourself in the right place.

O(n log(n)): various "smart" sorting algorithms. It turns out that you only need to look at, say, 10 elements in a 10^10-element collection to intelligently sort yourself relative to everyone else in the collection. Because everyone else is also going to look at 10 elements, and the emergent behavior is orchestrated just right so that this is enough to produce a sorted list.

O(n!): an algorithm that "tries everything," since there are (proportional to) n! possible combinations of n elements that might solve a given problem. So it just loops through all such combinations, tries them, and stops whenever it succeeds.
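
As a tiny C# sketch of the first two cases above (my own illustration): array indexing jumps straight to the element, while a linked list has to be walked node by node.

// O(1): the array element's address is computed directly.
int[] array = { 10, 20, 30, 40, 50 };
int value = array[3]; // one jump, regardless of the array's length

// O(n): a linked list must be walked from the head, node by node.
LinkedList<int> list = new LinkedList<int>(array);
LinkedListNode<int> node = list.First;
for (int i = 0; i < 3 && node != null; i++)
{
    node = node.Next; // on average you traverse n/2 nodes
}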