This question grew out of comments on the various kinds of progress in computing over the last 50 years or so.
Some of the other participants asked me to pose it as a question to the whole forum.
The basic idea here is not to bash the current state of things, but to try to understand something about the process of coming up with fundamentally new ideas and principles.
I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"
Current Answer
I believe unit testing, TDD, and continuous integration are significant post-1980 inventions.
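As a concrete (and entirely hypothetical) illustration of the TDD cycle this answer refers to, here is a minimal sketch using Python's unittest: the tests are written first, then just enough code to turn them green. The slugify function and its tests are invented for this sketch, not part of the answer.

```python
import unittest

def slugify(title):
    """Lower-case a title and join its words with hyphens.

    Written *after* the tests below, with just enough logic to pass them.
    """
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    # In TDD these assertions come first and fail ("red") until
    # slugify is implemented ("green").
    def test_replaces_spaces_with_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_lowercases_input(self):
        self.assertEqual(slugify("TDD"), "tdd")

if __name__ == "__main__":
    unittest.main()
```

A continuous-integration server would run exactly this kind of suite on every commit, which is what turns the practice from a habit into a guarantee.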
Other Answers
I want to say Linux and the reification of the "worse is better" philosophy, but you could argue those are older too. So I'll say: quantum, chemical, peptide, DNA, and membrane computing and automata; (re)doing things in a non-ad-hoc way; aspects; generic programming; some kinds of type inference; some kinds of testing.
The reasons we have no new ideas: patents (these date from the late '60s…), corporations, and education.
To start thinking about this question, I need a model of what "innovation" means.
The best model I've seen is the Technology Adoption Life Cycle. You can get an overview in this Wikipedia article.
Using that model, I began to ask myself... at what stage of the life cycle is software itself? We can think of "software" as a technology distinct from machinery, one that goes all the way back to Babbage, or, more properly, to Lady Ada Lovelace.
But it remained in the very early pioneering stage until at least 1951. That is the year programmed computers were "commercialized", in the sense that a model of computer was sold as a product and many units of that model were built. I'm thinking of the machine Univac sold to the Census Bureau.
From 1951 to 1985, software innovations came thick and fast. They mostly had to do with extending the reach of computing into an ever wider domain. In parallel, mass marketing and mass production kept lowering the cost of entry, until the Apple and the IBM PC made a programmable device a commonplace appliance.
Somewhere between 1980 and 1985, I think, software passed from the innovators' domain to the "early majority". Sorry, folks, but that makes all of you who took part in the early days of MS-DOS, the Mac, Windows, C++, and Java members of the early majority rather than innovators. That doesn't stop you from doing significant innovation in your own field and on your own projects. It just means the field itself has moved past its earliest stage.
While the Internet's precursors existed as far back as the 1970s, it wasn't until Al Gore invented the Internet (sorry) that everyone got online. At that stage software passed from the early majority to the late majority. The change was subtle, as the top of the bell curve suggests; not every shop moved from early majority to late majority at the same time.
I don't think software has fully entered the "laggard" stage yet, but I think the real innovators today are tackling the problems that will produce progress on different fronts.
Two fronts I can think of are bioengineering and information appliances. Both need software, but the main thrust isn't software innovation; it is applying software to uncharted territory. There are probably many fronts I'm unaware of.
Modern shading languages and the prevalence of modern GPUs.
The GPU is also a low-cost parallel supercomputer, with tools like CUDA and OpenCL for writing fast, high-level parallel code. Thanks to all the players out there driving down the prices of these increasingly impressive hardware marvels. Within the next five years, I hope every new computer sold (and iPhones too) will ship with the basic ability to run massively parallel code, just like 24-bit color or 32-bit protected mode today.
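As a small illustration of how accessible this has become, here is a minimal element-wise vector addition in OpenCL (one of the two toolkits the answer names), written via the PyOpenCL bindings. This is a sketch assuming pyopencl, NumPy, and a working OpenCL driver are installed; the kernel and buffer names are arbitrary.

```python
import numpy as np
import pyopencl as cl

# Pick whatever OpenCL device is available (GPU if present) and
# create a command queue for it.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# Two input vectors on the host.
a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)

# Copy the inputs to device buffers and allocate one for the output.
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel: one work-item per element, a million of them in flight.
program = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

program.add(queue, a.shape, None, a_buf, b_buf, out_buf)

# Copy the result back and check it against the CPU answer.
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
print(np.allclose(out, a + b))  # True
```

The striking thing, relative to pre-1980 supercomputing, is that this runs on commodity hardware and the parallelism is expressed in a dozen lines.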
Of course, pre-1980 was the heyday of Xerox PARC, back when the GUI, the mouse, laser printers, the Internet, and the personal computer were all just being created. (Given that I'm too young to have been around in those days, while you were busy inventing practically all of those things, there's nothing about 1980 I can tell you that you don't already know, so let's move on.)
The thing is, though, that the pre-1980 days were a lot more vibrant in terms of truly disruptive new technologies. That's the way it is with any new field -- how many game-changing technology advances have you seen in railroads in the past 100 years? How many in lightbulbs? In the printing press? Once something ignites hype in the right circles, there is an explosive period of invention, followed by a long period of maturing. After that, you're not going to see the same kind of completely radical changes again UNLESS the basic circumstances change.
Luckily, that may be about to happen on a few fronts, and it has already happened on some others:
- Mobility: smart phones bring computing to a truly portable platform, which will soon include location-based services and proximity-based ad-hoc networks. It's a completely new paradigm that's potentially as game-changing as the GUI has been.
- The WWW (HTTP, HTML, and DNS) has already been mentioned and is an obvious addition to the list, since it is enabling global, inexpensive, mainstream rich communication across the globe - all thanks to a computing platform.
- On the interface side, touch, multitouch (Jeff Han comes to mind), and the Wiimote all need mentioning. Currently they are basically curiosities, but so were the early GUIs.
- OOP design patterns -- higher-level solutions as best practices for hard problems. Depending on your definition of 'computing', it may or may not belong on the list, but if you count OOP as a significant advance pre-1980 (I certainly do), I think design patterns and the GoF deserve a mention too.
- Google's PageRank and MapReduce algorithms (see the sketch after this list) - I am pleased to notice I wasn't the first to mention them, and seriously -- where would the world be without the principles of both of them? I vividly remember what the world looked like before them, and suffice it to say Google really IS my friend.
- Non-volatile memory -- it's on the hardware side, but it is going to play a significant role in the future of computing - making bootup times a thing of the past, for example, and enabling us to use computers in entirely new ways.
- Semantic (natural language) search / analysis / classification / translation... We're not quite there yet, but companies like Powerset give the impression that we're on the brink.
- On that note, intelligent HTMs should be on this list as well. I am yet another believer in Jeff Hawkins' model and approach, and if it works, it will mean a complete redefinition of what computers can do, what it means to be human, and where the world can go from here. Creating a real intelligence in that way (synthetically) would be bigger than anything the human race has accomplished before.
- GNU + Linux.
- 3D printing / rapid prototyping (and, in time, manufacturing).
- P2P (which also led to VoIP etc.).
- E-ink, once the technologies mature a bit more.
- RFID might belong on the list, but the verdict is still out on that one.
- Quantum computing is the most obvious element on the list, except we still haven't been able to get enough qubits to play along. However, my friends in the field tell me there's incredible progress going on even as we speak, so I'm holding my breath for that one.
- And finally, a personal favourite: distributed intelligence, or its other name, artificial artificial intelligence. The idea is to connect a huge number of people in a network and give them access to the combined minds of everyone else through some form of question-answering interface. It's been done a number of times recently, with Yahoo Answers, Askville, Amazon Mechanical Turk, and so on, but in my mind those all miss the mark by a LOT... much like the many implementations of distributed hypertext that came before Tim Berners-Lee's HTML, or the many web crawlers before Google. Seriously -- someone needs to build a search interface into 'the hive mind' that blows everyone else out of the water. IMHO, it is only a matter of time.
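Since PageRank comes up above, here is a toy sketch of its core idea in Python: power iteration over a link graph, where a page's score is the probability that a random surfer lands on it. This is an illustration of the published principle, not Google's implementation; the four-page graph and the pagerank function are made up for this example.

```python
# A made-up web of four pages; each key links to the pages in its list.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Power iteration: repeatedly redistribute rank along outgoing links."""
    n = len(links)
    ranks = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        # Every page keeps a baseline from random jumps...
        new_ranks = {page: (1.0 - damping) / n for page in links}
        for page, outlinks in links.items():
            if outlinks:
                # ...and passes its remaining rank to the pages it links to.
                share = damping * ranks[page] / len(outlinks)
                for target in outlinks:
                    new_ranks[target] += share
            else:
                # A dangling page spreads its rank evenly over the whole web.
                for target in links:
                    new_ranks[target] += damping * ranks[page] / n
        ranks = new_ranks
    return ranks

print(pagerank(links))  # "c" ends up on top: everyone links to it
```

The elegant part, and presumably why it was such a leap in 1998, is that relevance emerges from the link structure alone, with no per-page tuning.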