Building very-large-scale parallel computing structures requires simulation and design-cost analysis whose associated optimizations are MegaHard problems (yes, I'm re-branding NP-Hard). A Software-as-a-Service (SaaS) provider of vendor-neutral simulation, synthesis, mapping, and place-and-route tools could change the electronic design automation industry substantially.
If a supercomputer architecture could perform the MegaHard compiler tasks of parallel-system design in less time than the current tools (seconds instead of minutes or hours), many designers would gladly pay for access. And if the supercomputer runs distributed algorithms, many of those same users could offer spare cycles to work on pieces of other people's MegaHard jobs: most developers using such a tool would happily lend their idle machines to permute graphs and evaluate cost functions.
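To make "lend your idle cycles" concrete, here is a minimal sketch (in Python, with entirely hypothetical names and toy inputs) of the kind of work unit a volunteer machine might crunch: scoring one candidate placement of a netlist onto a grid by total Manhattan wirelength, the sort of cost function a distributed annealer would evaluate millions of times.

```python
# Hypothetical volunteer work unit: score one candidate placement.
# The netlist and placement here are toy inputs; a real job would ship
# them over the network along with the permutation to evaluate.

def placement_cost(netlist, placement):
    """Total Manhattan wirelength over all connected cell pairs."""
    cost = 0
    for a, b in netlist:
        (ax, ay), (bx, by) = placement[a], placement[b]
        cost += abs(ax - bx) + abs(ay - by)
    return cost

netlist = [("lut0", "lut1"), ("lut1", "ff0"), ("ff0", "lut0")]
candidate = {"lut0": (0, 0), "lut1": (0, 1), "ff0": (3, 1)}
print(placement_cost(netlist, candidate))  # 1 + 3 + 4 = 8
```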
So let's motivate and create a business model for the tentatively named Megahard Corp.
Megahard Corp. (pronounced "Mega Hardcore") has the opposite business model to Microsoft Corp.: instead of selling closed-source software and tools as licensed applications, Megahard provides open-source hardware IP and development tools as a service.
The writing is on the wall: we are transitioning from a personal-computing, single-core desktop world to a shared-computing, multicore/FPGA embedded world. Yet the complexity of parallel system designs, and the computational strain on the tools that compile them, is growing faster than the performance of any practical desktop computer. The decision by EDA firms not to adopt the Application Service Provider (ASP) model probably made sense at some point in the last millennium, but the design assumptions are much different when you are connected to a massive parallel computer. Because compile times with current tools utterly dwarf file-transfer times, there is room to upend the compiler market by providing electronic design automation tools online.
So here's the business plan: build a supercomputer and program it to run a bunch of FPGA/multicore compilers and simulators really well. A big enough supercomputer makes compilation I/O-bound: we can tell people we've got a machine that takes in source and spits out configurations in about as long as it takes to upload the source diff and download the configuration diff. And since we have a really big supercomputer with hardware all over the place, you can also target your application to some of our bored metal and provide your own service under our framework. (Sassafras is a good name for a SaaS framework, and it can have a leaf-shaped logo.)
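The diff-in, diff-out loop we're promising might look like the sketch below. The endpoint, routes, and JSON fields are all invented for illustration; no such API exists yet, and the point is only that the bytes on the wire, not the compute, dominate the wall clock.

```python
# Sketch of the diff-in, diff-out client loop (all URLs/fields hypothetical).
import time
import requests

API = "https://compile.megahard.example/v1"

def compile_incremental(project_id, source_diff):
    # Upload only what changed since the last compile.
    r = requests.post(f"{API}/projects/{project_id}/diffs",
                      data=source_diff,
                      headers={"Content-Type": "text/x-diff"})
    r.raise_for_status()
    job_id = r.json()["job_id"]
    # The compile should finish before polling gets boring; moving
    # bytes, not crunching them, is the dominant cost.
    while True:
        job = requests.get(f"{API}/jobs/{job_id}").json()
        if job["state"] == "done":
            # Download a configuration diff to patch the local bitstream.
            return requests.get(job["config_diff_url"]).content
        time.sleep(1)
```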
Since we're thoroughly indoctrinated, we'll build our system as Free Software and reuse the existing open-source compiler infrastructure too. Being open source means our tool won't be made obsolete by an eventual open-source standard (read that again so you get it). Open-source frameworks also let capable users contribute to product development -- and wouldn't you know it, the entire EDA market happens to have degrees in computer science, electrical engineering, mathematics, and physics.
There's also that recurring comp.arch.fpga thread complaining about the current tools, the lack of open-source tools, and the lack of documentation for the configuration format and interface. Someone should appeal to these people, because they have a real point: technology is inhibited by monopolized knowledge. Features improve more slowly inside closed-source packages because you are automatically limiting the number of developers who can improve them (a situation that worsens with every round of developer layoffs).
Another benefit of a SaaS tool: users don't have to waste time upgrading. The only way to convince anyone to sit through a software upgrade is to announce new features that claim to make the product better -- which is why users wait until the second iteration of a major release. A SaaS provider can roll out new features and test the waters at a much higher iteration rate.
When I use the term open source, I mean "free" as in freedom, and it's important to point out that previous forays into open-source FPGA tools have failed and disappeared precisely because they were merely "open source" and not "free." When I was just starting to care about these sorts of things, GOSPL was being hyped up; it was eventually canceled, and the code is nowhere to be found, because access to the source was restricted to invite-only membership for some nonsense reason: membership exclusivity maintains the integrity of the organization. When a giant corporation disingenuously uses "open source" as a marketing term for non-free software, the project is destined (and deserves) to fail, so long as the so-called "open" source is never actually provided to freedom-loving eyes.
On the other hand, free software (like Debian) or free hardware IP (like OpenCores) won't stop being useful just because some organization stops funding it. Free-software projects never die; they just go unmaintained.
Besides, Megahard Corp. follows a SaaS model, so the profit comes from developing and maintaining a parallel supercomputer system that runs the free compiler software, not from distributing that software. Good free-software projects encourage users to "help make it better" instead of "help make us richer," though improving the product is a fine way to increase sales. A sufficiently better compilation service is probably more profitable than selling licenses anyway -- I would certainly pay more than I'm currently paying for proprietary licenses to get supercomputer-backed FPGA compilation that finishes faster.
One major problem: if you aren't Altera, Xilinx, Lattice, Actel, Achronix, Tilera, IBM, AMD, Intel, etc., then making low-level multicore and FPGA tools requires a whole bunch of reverse engineering and "MegaHard Work" (TM). It's probably technically easier to design an FPGA from the bottom up than to reverse-engineer the architecture of an existing chip and build a new tool-chain for it.
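For flavor, the bread-and-butter move in that reverse-engineering effort might look like this sketch: compile two designs that differ in exactly one primitive setting, then diff the vendor's bitstreams to find which configuration bits that setting controls. The file names and the raw-bytes assumption are mine; real bitstreams carry headers and CRCs you'd strip first.

```python
# Hypothetical single-change bitstream diff: which config bits flipped?

def changed_bits(a, b):
    """Return bit positions that differ between two equal-length bitstreams."""
    assert len(a) == len(b)
    flips = []
    for i, (x, y) in enumerate(zip(a, b)):
        for bit in range(8):
            if (x ^ y) & (1 << bit):
                flips.append(i * 8 + bit)
    return flips

with open("lut_as_and.bit", "rb") as f:   # design A: one LUT configured as AND
    base = f.read()
with open("lut_as_or.bit", "rb") as f:    # design B: same LUT configured as OR
    variant = f.read()

# Repeat over thousands of single-change pairs and correlate settings
# with bit positions to map out the configuration format.
print(changed_bits(base, variant))
```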
Open-source compilers and reverse-engineered, vendor-neutral FPGA tools have both succeeded in the past. Hardware-as-a-Service is still a "cloudy" future, but there are plenty of recent examples of a vendor entering a market ripe for SaaS-ification and redefining it. I expect the MegaHard problems of design automation make EDA ripe for a SaaS provider to change the game.
----
(Edit 9/2) If you find positions contrary to the Open Source EDA SaaS argument, please share them with me. Here's an old interview with Grant Martin, Chief Scientist of Tensilica, who argues that we should not hold our breath for Open Source EDA and specifically says:
I think the other issue with EDA is in terms of the general number of users. I don't think there's a large enough number of users in any particular sub-category of tools to really support very much open source development. Open source requires a big community involvement, plus ancillary things being built around that environment to attract companies into the effort.
EDA is still too small to make open source the general model. The idea that all tools will be open source, and that all EDA companies would evolve to a service business model, is unlikely and makes no sense.
....
The business incentives just aren't there to motivate those efforts in an open source environment.
I tend not to trust the beliefs of people who were born before the Internet (a.k.a. the over-30 crowd). I think he's missing the perhaps non-obvious point that a service business model can subvert the traditional software business model by offering a faster software service with smoother new-feature roll-outs (perhaps when he says "service business model" he's thinking RedHat instead of Google Apps). I also know from my prior immersion at CSAIL that open-source software development does NOT require users or a big community; it only requires one indefatigable iconoclast chipping away at the status quo for personal reasons that are often incongruous with the profit motive. When multiple hackers create disjoint useful tools that converge into a single product, user communities start to form, propelling that framework further and potentially cracking the existing market open.
I wonder how anyone can reject a service-business-model future for basically ANY computer tool. Seeing that future no longer requires prescience, just patience. FPGA-accelerated computing will increase the number of EDA users, but few of these new users will ever identify with the existing EDA market (there's far too much FUD attached to the "hardware" development flows it traditionally sells: programmable arrays are more than reconfigurable HDL executors).
(end edit)