This question grew out of comments about the various kinds of progress in computing over the last 50 years or so.

Some of the other participants asked me to raise it as a question to the whole forum.

The basic idea here is not to bash the current state of things, but to try to understand the process of coming up with fundamentally new ideas and principles.

I believe we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"


Current answer

This is a negative result, which is odd as a "fundamental innovation", but I think it qualifies because it opened up new areas of research and closed off useless ones.

The impossibility of distributed consensus: 2001 PODC Influential Paper Award

We assumed that the main value of our impossibility result was to close off unproductive lines of research on trying to find fault-tolerant consensus algorithms. But much to our surprise, it opened up entirely new lines of research. There has been analysis of exactly what assumptions about the distributed system model are needed for the impossibility proof. Many related distributed problems to which the proof also applies have been found, together with seemingly similar problems which do have solutions. Eventually a long line of research developed in which primitives were classified based on their ability to implement wait-free fault-tolerant consensus.
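To make that last line of research concrete, here is a minimal sketch of my own (not from the paper or the award citation) of a single-shot consensus object built on a compare-and-swap primitive. Herlihy's wait-free hierarchy, part of the research line the quote describes, shows that compare-and-swap can solve wait-free consensus for any number of threads, whereas plain read/write registers cannot. Python has no hardware CAS, so a lock stands in for the atomic step; this is an illustration of the idea only.

```python
import threading

class CASConsensus:
    """Single-shot consensus object built on compare-and-swap (CAS).

    Illustrative sketch only: the lock simulates the atomicity that a
    hardware CAS instruction would provide.
    """

    _EMPTY = object()  # sentinel meaning "no value decided yet"

    def __init__(self):
        self._decision = self._EMPTY
        self._lock = threading.Lock()

    def _compare_and_swap(self, expected, new):
        # Atomically: if decision == expected, set it to new.
        with self._lock:
            if self._decision is expected:
                self._decision = new
                return True
            return False

    def decide(self, proposal):
        # Try to install our proposal; whether or not we win,
        # every caller returns the same decided value.
        self._compare_and_swap(self._EMPTY, proposal)
        return self._decision


if __name__ == "__main__":
    consensus = CASConsensus()
    results = []
    threads = [threading.Thread(target=lambda v=v: results.append(consensus.decide(v)))
               for v in ("red", "green", "blue")]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    assert len(set(results)) == 1  # every thread agrees on one proposal
```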

Other answers

I'd say the biggest trend is an ever-increasing lack of location dependence and growing pervasiveness. An interesting philosophical exercise these days is to count the computers in your immediate area. They're everywhere: desktops, keyboards, microwaves, radios, televisions, cell phones, etc. My grandmother is computer illiterate, yet her life is as infested with small computers as everyone else's. She can make a call to me from the middle of an empty field. I can then answer that call while zipping down the highway.

To answer the two questions, "why do new ideas die" and "what to do about it":

I suspect a lot of the lack of progress is due to the massive influx of capital and entrenched wealth in the industry. Sounds counterintuitive, but I think it's become conventional wisdom that any new idea gets one shot; if it doesn't make it at the first try, it can't come back. It gets bought by someone with entrenched interests, or just FAILs, and the energy is gone. A couple examples are tablet computers, and integrated office software. The Newton and several others had real potential, but ended up (through competitive attrition and bad judgment) squandering their birthrights, killing whole categories. (I was especially fond of Ashton Tate's Framework; but I'm still stuck with Word and Excel).

What to do about it? The first thing that comes to mind is Wm. Shakespeare's suggestion: "Let's kill all the lawyers." But I'm afraid they're too well armed these days. Realistically, I think the best alternative is to find open-source initiatives of some kind. They seem to maintain accessibility and incremental improvement better than the alternatives. But the industry has gotten big enough that some sort of organic mechanism for collaboration is necessary.

I also think that there's a dynamic that says that the entrenched interests (especially platforms) require a substantial amount of change - churn - to justify continuing revenue streams; and this absorbs a lot of creative energy that could have been spent in better ways. Look how much time we spend treading water with the newest iteration from Microsoft or Sun or Linux or Firefox, making changes to systems that for the most part work fine already. It's not because they are evil, it's just built into the industry. There's no such thing as Stable Equilibrium; all the feedback mechanisms are positive, favoring change over stability. (Did you ever see a feature withdrawn, or a change retracted?)

Another thread of discussion here on SO is the Skunkworks Syndrome (ref: Geoffrey Moore): in large organizations, real innovation almost always (90%+) shows up in unauthorized projects that emerge spontaneously, fueled purely by individual or small-team initiative (and usually opposed by the formal management hierarchy). So: Question Authority, Buck the System.

"I believe we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask 'Why?' and 'What should we be doing?'"

The way I see it, we haven't had that many new ideas in computing because, by and large, we haven't needed them. We've been mining the old ideas and getting a great deal out of them, such as the phenomenal growth of CPU speed.

When we need new ideas because the "well has run dry", we'll see that necessity is the mother of invention.

The rediscovery of monads by functional programming researchers. Monads helped make a pure, lazy language (Haskell) a practical tool; they have also influenced the design of combinator libraries (monadic parser combinators have even found their way into Python).

Moggi's "A category-theoretic account of program modules" (1989) is generally credited with bringing monads into view for effectful computation; Wadler's work (e.g., "Imperative functional programming" (1993)) presented monads as a practical tool.
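To show what "monadic parser combinators in Python" can look like, here is a tiny sketch of my own; the names (`unit`, `bind`, `satisfy`, etc.) are illustrative choices, not the API of any particular library. A parser is a function from an input string to a list of (result, remaining-input) pairs, and `bind` sequences parsers the way Haskell's `>>=` sequences monadic actions.

```python
def unit(value):                      # aka `return` / `pure`
    return lambda s: [(value, s)]

def bind(parser, f):                  # aka `>>=`: run parser, feed result to f
    def parse(s):
        return [pair
                for (value, rest) in parser(s)
                for pair in f(value)(rest)]
    return parse

def item(s):                          # consume a single character
    return [(s[0], s[1:])] if s else []

def satisfy(pred):                    # consume one character matching pred
    return bind(item, lambda c: unit(c) if pred(c) else (lambda s: []))

def char(c):
    return satisfy(lambda x: x == c)

# Parse a digit, a '+', and another digit, returning their sum.
add_expr = bind(satisfy(str.isdigit), lambda a:
           bind(char('+'),            lambda _:
           bind(satisfy(str.isdigit), lambda b:
           unit(int(a) + int(b)))))

print(add_expr("3+4rest"))   # [(7, 'rest')]
print(add_expr("3*4"))       # []  (no parse)
```

The point of the monadic structure is that sequencing, failure, and result-passing are handled once in `bind`, so larger grammars are built by composing small parsers rather than hand-threading state.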

I think the best ideas invented since the 1980s will be the ones we're not aware of, either because they are so small and ubiquitous as to go unnoticed, or because their popularity hasn't really taken off yet.

An example of the former is clicking and dragging to select a portion of text. I believe this first appeared on the Macintosh in 1984. Before that, you had separate buttons for choosing the beginning and the end of a selection. Quite onerous.

An example of the latter is (possibly) visual programming languages. I don't mean something like HyperCard; I mean something like Max/MSP, Prograph, Quartz Composer, Yahoo Pipes, etc. They really are niche at the moment, but I think there's nothing stopping them from being as expressive and powerful as standard programming languages, other than mindshare.

Visual programming languages effectively enforce the functional programming paradigm of referential transparency, which is a very useful property for code to have. And the way they enforce it isn't artificial, either; it simply falls out of the metaphor they use.
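A minimal sketch of my own (not taken from any of the tools named above) of the dataflow metaphor behind such languages: each node is a pure function of its inputs and "wiring" nodes together is just composition, so the same wiring always produces the same output, which is exactly the referential transparency described here.

```python
class Node:
    """One box in a dataflow graph: a pure function applied to upstream nodes."""

    def __init__(self, fn, *inputs):
        self.fn = fn          # pure function of the upstream values
        self.inputs = inputs  # upstream Nodes wired into this one

    def evaluate(self):
        return self.fn(*(node.evaluate() for node in self.inputs))

def constant(value):
    return Node(lambda: value)

# Wire up the graph:  (2 + 3) * 10
source_a = constant(2)
source_b = constant(3)
adder = Node(lambda a, b: a + b, source_a, source_b)
scaler = Node(lambda x: x * 10, adder)

print(scaler.evaluate())  # 50, and always 50 for this wiring
```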

VPLs open programming up to people who otherwise wouldn't program, such as people with language impairments, such as dyslexics, or even just laypeople who need simple time-savers. Professional programmers may scoff at this, but personally I think it would be great if programming became a truly ubiquitous skill, like literacy.

As it stands, though, VPLs are a niche interest and haven't really gone mainstream.

What we should do differently

All computer science majors should be required to double major, coupling the CS major with one of the humanities: painting, literature, design, psychology, history, English, whatever. A lot of the problem is that the industry is populated with people who have a really narrow and unimaginative understanding of the world, and therefore can't begin to imagine a computer working any significantly differently than it already does. (If it helps, you can imagine that I'm talking about someone other than you, the person reading this.) Mathematics is great, but in the end it's just a tool. We need experts who understand the nature of creativity and who also understand technology.

But even if we have them, there needs to be an environment where there's a possibility that doing something new would be worth the risk. It's 100 times more likely that anything truly new gets rejected out of hand, rather viciously (the Newton is an example of this). So we need a much higher tolerance for failure. We should not be afraid to try an idea that has failed in the past. We should not fully reject our own failures, and we should learn to recognize when we have failed. We should not see failure as a bad thing, and so we shouldn't lie to ourselves or to others about it. We should just get used to it, because it is just about the only constant in this ever-changing industry. Post mortems are useful in this regard.

One of the more interesting things about Smalltalk, I think, was not the language itself but the process used to arrive at its design: an iterative design process going through many, many revisions, while very carefully and critically identifying the flaws of the existing system and finding solutions in the next one. The more perspectives, and the broader the perspectives, we have on the situation, the better we can judge where the mistakes and problems are. So don't just study computer science. Study as many other academic subjects as you can get yourself interested in.