This question grew out of comments about different kinds of progress in computing over the last 50 years or so.
Some of the other participants asked me to pose it as a question to the whole forum.
The basic idea here is not to bash the current state of things, but to try to understand something about the process of coming up with fundamentally new ideas and principles.
I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"
Current answer
I started programming on January 2nd, 1980. I have tried to think of significant new inventions made during my career, and I struggle to come up with one. Most of what I consider significant was actually invented before 1980, and only widely adopted or improved afterwards.
- Graphical user interfaces.
- Fast processors.
- Large memory (I paid $200 for 16K in 1980).
- Small sizes: cell phones, pocket PCs, iPhones, netbooks.
- Large storage capacities (I've gone from carrying a bulky 90K floppy to an 8 GB USB thumb drive).
- Multiple processors (almost all my computers have more than one now; software struggles to keep them busy).
- Standard interfaces (like USB) for easily attaching hardware peripherals.
- Multi-touch displays.
- Network connectivity, leading to the mid-'90s internet explosion.
- IDEs with IntelliSense and incremental compiling.
While hardware has improved tremendously, the software industry has struggled to keep up. We are light years ahead of 1980, but most of the improvements have been refinements rather than inventions. Since 1980 we have been too busy applying the advancements to be inventing. Taken individually, most of these incremental inventions are neither important nor powerful, but when you look back over the last 29 years, together they are quite powerful.
Perhaps we need to embrace incremental improvement and steer it. I believe that truly original ideas will probably come from people who have had little exposure to computers, and such people are becoming harder and harder to find.
Other answers
The Paxos protocol. It is hard to describe how valuable it has been in the internet era.
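To make this concrete, here is a minimal sketch of single-decree Paxos, simulated in memory in Python. The names (`Acceptor`, `propose`) and the direct method calls standing in for network messages are illustrative assumptions, not any particular implementation; real systems add networking, durable storage, and retries with higher proposal numbers.

```python
class Acceptor:
    def __init__(self):
        self.promised = 0          # highest proposal number promised
        self.accepted = (0, None)  # (proposal number, value) accepted so far

    def prepare(self, n):
        # Phase 1b: promise to ignore proposals numbered below n,
        # and report anything already accepted.
        if n > self.promised:
            self.promised = n
            return self.accepted
        return None  # reject

    def accept(self, n, value):
        # Phase 2b: accept unless a higher-numbered promise was made.
        if n >= self.promised:
            self.promised = n
            self.accepted = (n, value)
            return True
        return False

def propose(acceptors, n, value):
    # Phase 1a: send prepare(n) to all acceptors and collect promises.
    promises = [p for p in (a.prepare(n) for a in acceptors) if p is not None]
    if len(promises) <= len(acceptors) // 2:
        return None  # no majority; a proposer would retry with a higher n
    # If any acceptor already accepted a value, we must propose the
    # value carrying the highest accepted proposal number.
    highest = max(promises, key=lambda p: p[0])
    if highest[1] is not None:
        value = highest[1]
    # Phase 2a: ask the acceptors to accept (n, value).
    acks = sum(a.accept(n, value) for a in acceptors)
    return value if acks > len(acceptors) // 2 else None

acceptors = [Acceptor() for _ in range(5)]
print(propose(acceptors, n=1, value="A"))  # "A" is chosen
print(propose(acceptors, n=2, value="B"))  # still "A": the decision is stable
```

The second call shows the key safety property: once a majority has accepted a value, later proposals can only re-propose that same value.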
The use of functional programming/languages in the development of operating-system kernels.
I believe unit testing, TDD, and continuous integration are significant post-1980 inventions.
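As an illustration, here is a minimal TDD-style sketch using Python's built-in unittest module. The `slugify` function and its tests are hypothetical examples, not from the answer above: in the TDD cycle you would write the failing tests first, then just enough code to make them pass, and a CI server would rerun them on every commit.

```python
import unittest

def slugify(title):
    # Implementation written after the tests below, to make them pass.
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_lowercases_and_joins_words(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_extra_whitespace(self):
        self.assertEqual(slugify("  a   b "), "a-b")

if __name__ == "__main__":
    unittest.main()  # a CI job would run this on every commit
```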
Software:
- Virtualization and emulation
- P2P data transfers
- Community-driven projects such as Wikipedia, SETI@home, ...
- Web crawlers and web search engines, i.e. indexing information distributed across the world (see the sketch after the hardware list below)
Hardware:
- Modular PCs
- E-paper
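To make the last software item concrete, here is a minimal sketch of an inverted index, the core data structure behind web search: a map from each word to the documents containing it. The sample documents and whitespace tokenizer are illustrative assumptions; real engines add crawling, ranking, and distribution across many machines.

```python
from collections import defaultdict

docs = {
    "page1": "paxos made simple",
    "page2": "the part time parliament",
    "page3": "paxos in practice",
}

# Build the inverted index: word -> set of document ids.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.lower().split():
        index[word].add(doc_id)

def search(query):
    # Return the documents containing every word of the query.
    results = [index[w] for w in query.lower().split()]
    return set.intersection(*results) if results else set()

print(search("paxos"))           # {'page1', 'page3'}
print(search("paxos practice"))  # {'page3'}
```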
I think the best ideas invented since the 1980s will be the ones we are unaware of: either because they are so small and ubiquitous as to go unnoticed, or because their popularity hasn't really taken off yet.
An example of the former is clicking and dragging to select a portion of text. I believe this first appeared on the Macintosh in 1984. Before that, you had separate buttons for picking the start and the end of a selection. Quite onerous.
An example of the latter is (what may turn out to be) visual programming languages. I don't mean something like HyperCard; I mean something like Max/MSP, Prograph, Quartz Composer, Yahoo Pipes, and so on. At the moment they are really niche, but as I see it, there is nothing stopping them from being just as expressive and powerful as a standard programming language, apart from mindshare.
Visual programming languages effectively enforce referential transparency, a functional programming paradigm. That is a really useful property for code to have. Nor is the way they enforce it artificial; it simply falls out of the metaphor they use.
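For readers unfamiliar with the term, here is a minimal sketch of referential transparency in Python, using two hypothetical functions: a referentially transparent call can be replaced by its result anywhere in a program, while a call that depends on hidden mutable state cannot.

```python
counter = 0

def impure_next(x):
    # Depends on hidden mutable state: the same call returns
    # different answers, so it cannot be replaced by a value.
    global counter
    counter += 1
    return x + counter

def pure_add(x, y):
    # Output depends only on the inputs: pure_add(2, 3) can be
    # replaced by 5 anywhere without changing the program.
    return x + y

print(impure_next(2), impure_next(2))  # 3 4 -- not transparent
print(pure_add(2, 3), pure_add(2, 3))  # 5 5 -- transparent
```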
VPLs open programming up to people who would otherwise be unable to program: people with language impairments such as dyslexia, or even just laypeople who need simple time-savers. Professional programmers may scoff at this, but personally I think it would be great if programming became a truly ubiquitous skill, like literacy.
As it stands, though, VPLs are a niche interest and haven't really gone mainstream.
What we should do differently
All computer science majors should be required to double major, coupling the CS major with one of the humanities: painting, literature, design, psychology, history, English, whatever. A lot of the problem is that the industry is populated with people who have a really narrow and unimaginative understanding of the world, and who therefore can't begin to imagine a computer working any significantly differently than it already does. (If it helps, you can imagine that I'm talking about someone other than you, the person reading this.) Mathematics is great, but in the end it's just a tool for achievement. We need experts who understand the nature of creativity and who also understand technology.
But even if we have them, there needs to be an environment where there's a chance that doing something new would be worth the risk. It's 100 times more likely that anything truly new gets rejected out of hand, rather viciously (the Newton is an example of this). So we need a much higher tolerance for failure. We should not be afraid to try an idea which has failed in the past. We should not fully reject our own failures, and we should learn to recognize when we have failed. We should not see failure as a bad thing, and so we shouldn't lie to ourselves or to others about it. We should just get used to it, because it is just about the only constant in this ever-changing industry. Post-mortems are useful in this regard.
One of the more interesting things about Smalltalk, I think, was not the language itself but the process used to arrive at its design: an iterative process going through many, many revisions, while very carefully and critically identifying the flaws of the existing system and finding solutions in the next one. The more perspectives, and the broader the perspectives, we have on a situation, the better we can judge where the mistakes and problems are. So don't just study computer science. Study as many other academic subjects as you can get yourself interested in.