It's a reasonably safe bet that neither was the case, although it's unlikely we'll ever know. Freeman - reputed to be a gregarious man who would happily explain his ideas to those who wanted to learn - died in 1989.
But his invention of the field programmable gate array lives on and now, decades after its birth, FPGA has become the technological acronym of choice for financial firms.
Findings from the Automated Trader annual survey - which for the past couple of years has indicated increasing interest - suggest that both the sell side and the buy side have been busy investing in this area and plan to do a lot more.
So far, relatively high costs, ease-of-use issues and the general foreignness of the technology are all factors that have kept the market wary. As a result, the majority of trading firms have not yet adopted the technology.
But over time, FPGA proponents expect they will. There are powerful forces pushing in that direction, from the shift by exchanges to use higher speed Ethernet to the general 'race to zero latency' movement to the increasing complexity of trading models.
Those forces are only gathering momentum, while costs have been easing and could, according to some, fall further.
Matt Dangerfield, chief technology officer at low-latency solutions group Fixnetix, says the move towards FPGA is only beginning.
"It's still in its infancy, definitely," he said.
Educating the masses
Dangerfield is an FPGA evangelist.
"There are so many misconceptions out there," he said, rattling off a list of the things he hears about FPGA: how it's difficult to program, how mistakes once made cannot be undone, how it can't be manipulated in real time.
"All this is nonsense. And it's generated by a lack of understanding, and that lack of understanding and that noise generates the fear which then generates people pushing back."
Like Freeman once did, Dangerfield sees his role as something of an educator. "The fear is lack of education, so I'm trying to work with Altera and BittWare and various other people to basically go on the world tour and beat people down about FPGA."
Altera is one of the main chip makers, while BittWare takes those chips and builds boards with them. Those boards end up in FPGA solutions that Fixnetix provides for clients.
Only a few years ago, hardly anyone in the markets was talking about FPGA.
Ron Huizen, vice president of systems and solutions at BittWare, said the change came when financial exchanges moved to 10 gigabit Ethernet.
"When they were running 1 gigabit Ethernet feeds, your standard CPUs could keep up," Huizen said. "But as soon as all the exchanges started switching to 10 gigabit Ethernet, all of a sudden the latency and the determinism went out the window."
John Melonakos, chief executive of software and services firm AccelerEyes, pointed to a general acceptance of heterogeneous computing as a factor driving FPGA adoption. "The fundamentals of the FPGA value proposition haven't changed in 10, 15, 20 years," Melonakos said, noting he had learned about the technology when he was at university.
FPGAs in action, illustration by Fixnetix
"The whole concept of using additional hardware in addition to CPUs to get the job done is much more accepted now," he said.
"They [those in the FPGA industry] are not the ones making the big changes right now in the market, but they're benefiting from the change of heterogeneous computing and the acceptance perceived by the average scientist, engineer or financial analyst."
Nowadays, whenever Melonakos goes to a conference, he says everyone is talking about accelerators. "I would say definitely in the financial space that it's picking up speed," he said. "When we started six years ago people had no clue." He told of his experience on a panel in Chicago that ended up becoming an impromptu classroom session for the audience to learn what exactly FPGA technology was.
Dangerfield of Fixnetix speaks of a similar shift in the market's mindset. "It was hard in the early days to talk to people about because it was just too mind-blowing for the industry to get their head around," he said.
Workflow with FPGA technology, illustration by Fixnetix
Even now, Dangerfield said, ideas such as the absence of an operating system or a kernel are foreign concepts. He described how, in an effort to market the technology a few years ago, Fixnetix booked a private suite at a conference so that he and a colleague could spend 12 hours a day educating people.
"We did this I think three days on the trot," he said. "From that point it's got stronger and stronger."
What has sparked much of the excitement in the past couple of years is the dramatic improvement in performance. And that means not only latency, but also jitter.
"At the end of the day, the value proposition is that we provide deterministic latency - no jitter - and we provide a gain of somewhere between five, 10, 20 microseconds, even more during bursts. So there is a dollar value usually that customers will associate with that," said Nicolas Karonis, Asia-Pacific managing director for Enyx.
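The distinction Karonis draws can be made concrete. The point of "deterministic latency" is not that the average is low but that the spread of the distribution - the jitter - is near zero. A minimal sketch, with invented latency figures purely for illustration:

```python
# Illustrative sketch (not vendor code): "deterministic latency" means the
# spread of the latency distribution -- the jitter -- is near zero, not
# merely that the mean is low. All timings below are hypothetical.
from statistics import mean

def jitter_us(latencies_us):
    """Jitter measured as the spread between worst and best latency."""
    return max(latencies_us) - min(latencies_us)

# A software stack: low average, but a wide spread under load.
software = [12, 11, 14, 95, 12, 13, 210, 12]   # microseconds, hypothetical
# An FPGA path: every message takes essentially the same time.
fpga = [1.2, 1.2, 1.3, 1.2, 1.2, 1.3, 1.2, 1.2]

print(f"software: mean {mean(software):.1f} us, jitter {jitter_us(software)} us")
print(f"fpga:     mean {mean(fpga):.2f} us, jitter {jitter_us(fpga):.1f} us")
```

The software path looks competitive on average, but its worst case is an order of magnitude slower than its best - which is exactly the behaviour that costs fills during a burst.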
Want an indication of how much FPGA technology has piqued people's interest? Check out Altera's YouTube channel. The company has loaded more than 90 videos and has attracted more than 150,000 views. That's a long way away from "Gangnam Style", but it's a sizeable number for such a specialist subject.
Altera is one of a handful of big makers of FPGA chips. Another is Xilinx, the company that Freeman founded. FPGA technology is used in a wide variety of industries, from telecommunications to the military to medical sciences.
In Altera's latest video, a pair of the firm's scientists shows off a new 20 nanometre transceiver capable of handling 32 gigabits per second. In the demo, the system generated less than 10 picoseconds of total jitter.
To put that in perspective, silicon about the size of a couple of cell membranes was able to contain circuitry that processed 32 billion bits of information in a second, with less than 10 trillionths of a second of jitter. And to put that last number in perspective, light itself can only travel about 3 millimetres in that space of time.
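Those figures can be checked on the back of an envelope. A quick sketch of the arithmetic, using the headline numbers from the demo:

```python
# Back-of-the-envelope check of the demo's headline numbers (32 Gb/s,
# <10 ps total jitter) -- a sanity calculation, not Altera's measurements.
C = 299_792_458            # speed of light in a vacuum, m/s
bits_per_second = 32e9     # 32 gigabits per second
jitter_s = 10e-12          # 10 picoseconds

# Distance light covers in 10 ps -- roughly 3 millimetres.
light_mm = C * jitter_s * 1000
print(f"light travels {light_mm:.1f} mm in 10 ps")

# One bit period at 32 Gb/s is about 31 ps, so 10 ps of jitter is a
# substantial fraction of a single bit time.
bit_period_ps = 1e12 / bits_per_second
print(f"bit period: {bit_period_ps:.2f} ps")
```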
"In essence, as far as technical requirements go, yes, you can essentially say that zero jitter is valid," Karonis said. "There is no exchange right now that, on the best FPGA board with the best design, can overrun an FPGA market data solution that's properly designed."
That zero jitter argument is a powerful one because, as Karonis says, the trick to making money is not just to have the lowest latency but to be there when it counts, such as when a major economic indicator is released: "When there's a number and everybody's desperately trying to figure out what the number means and massively flooding the exchanges."
At times like that, the firms whose systems have virtually no jitter are the ones that will get the fills.
Huizen explained what happened to firms without FPGA systems when the exchanges moved to 10 gigabit Ethernet.
"Imagine that you used to have a hose that could have so much water coming out of it and you keep up," he said. "Now you've got a giant hose, like a fire hose, but instead of it constantly streaming a small flow, there's big bursts of water every once in a while. And the CPU, when it gets overwhelmed, is when there's a burst of data on the 10 gig feed."
In a normal situation, a CPU-only system might be able to react in 10 or 20 microseconds, but in this scenario it would be flooded and could take 500 microseconds.
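Huizen's fire-hose point is essentially a queueing effect, and it can be sketched in a few lines. The numbers below are illustrative, not measured: a CPU that services one message every 10 microseconds keeps up with a steady trickle, but once a burst arrives faster than it can drain, the last messages in the burst wait far longer than the per-message cost.

```python
# Minimal single-server queue model of burst-induced latency.
# All timings are hypothetical, chosen to echo the figures in the text.

def worst_wait_us(n_messages, inter_arrival_us, service_us):
    """Latency of the last message in a burst through a single-server queue."""
    backlog = 0.0
    for _ in range(n_messages):
        # Each arrival waits out whatever backlog remains, then is serviced.
        backlog = max(backlog - inter_arrival_us, 0.0) + service_us
    return backlog

# Steady flow: arrivals every 50 us, 10 us service -- no queue builds up.
print(worst_wait_us(100, 50, 10))   # stays at the 10 us service time
# Burst on a 10G feed: arrivals every 1 us, same 10 us service time --
# the backlog compounds and the tail message waits around 500 us.
print(worst_wait_us(55, 1, 10))
```

The model is crude, but it shows why average-case benchmarks on a quiet feed say little about behaviour at the moments that matter.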
As exchanges move to 40 gigabit Ethernet - a few have already begun the transition - the issue is only expected to grow.
But moving to FPGA solutions can be a costly transition, requiring expensive hardware, a good deal of development time and considerable licensing fees. Even for those prepared to make the investment to shave microseconds off processing speeds, there are lingering suspicions - which people such as Dangerfield are trying to dispel - about how agile a company can be once it takes the FPGA plunge.
One of the biggest expenses is the development time. One industry figure estimated that for every month of coding in hardware, a firm needed two to three months of testing. And the salaries for an FPGA programmer are equivalent to those of a seasoned C or C++ programmer.
On the hardware side, FPGA cards can cost thousands of dollars. There are small, inexpensive FPGAs, but these would not be able to cope with what trading firms require. "It's almost like with processors. You can buy very cheap processors - the ones in your cell phone are extremely cheap," Huizen said.
"In general as a technology, FPGA has come down significantly over time and now you can buy small FPGAs for $5, $10, $20. But there are still the big giant FPGAs that cost thousands of dollars, just for the chip. So there's a whole range," Huizen said.
FPGA costs are expected to decline further, but it will take time.
"In this particular space, it is still fairly specialised in the number of boards that are being bought with FPGAs on them," Huizen of BittWare said. "If it gets to the point that it's a commodity, then perhaps the pricing will come down. But right now the volume's not there to justify that."
On top of all that, there is the chassis, power and a lab environment. Finally, there are licences, which can run into hundreds of thousands of dollars. Those licences are for the firms that update and maintain the FPGA boards with market protocols and developments.
But beyond the cost, there is the perception that it can be unwieldy to program for FPGAs. For many firms that is a bigger sticking point.
While vendors dispute the idea that FPGA technology is inflexible, they do acknowledge that it takes time to develop. So, the next Holy Grail is to make it much easier to use the technology.
Rapid Addition is one company that has just launched a hybrid-FPGA product, which is based on FPGA technology but behaves like the company's FIX engine software. The product is so new that at the time of writing it had not been given a name.
"People can programme it in Java, they can programme it in C, they can programme it in .NET, and can do what they want with the application logic," said Kevin Houstoun, the company's chairman. "If they're using our software at the moment they'd be at about 15 microseconds from tick coming in to order going out. With the card, they'll be sub-five, with the hybrid solution," he said.
"But the thing that we do feel that people will benefit from is being able to get a lot of the advantages of FPGA, but without the requirement to programme it in Verilog or VHDL, because both of those are complicated languages."
A hybrid product such as this may not be in the same sub-microsecond territory that pure FPGA is in, but it can take a load off the rest of the system and improve performance that way.
"It ends up in user space with a 2-1/2 microsecond journey at the moment, from wire to user space, in a binary format," Houstoun said. "So you see far less cache contention, so all of your L2 cache is available to the application. You don't have all of that parsing occupying the CPU. Cache misses are one of the main reasons applications run slowly, and removing the FIX processing from the CPU helps reduce these."
Houstoun said most FPGA products are based on the idea of moving an entire business function onto FPGA. But he argued that that could mean loss of flexibility.
"You're much more constrained in terms of what you can do compared to a software solution," he said. "Our card tries to combine some of the performance of an FPGA with the flexibility of software solutions."
"So, if you've got a guy who writes the algo in Java and he needs to change the algo, he can change the algo in 20 minutes [with the new product]. But he can deploy it with most of the performance of an FPGA, whereas if he tries to put the algo onto the FPGA, it'll probably take him three months and when he needs to change it, it could take him a week or two."
Financial professionals ultimately want to have their cake and eat it too. They want the ease of use and the sub-microsecond speed. Karonis of Enyx says that will be the end game.
"At the end of the day, what the trader wants is to have an Excel-like trading platform that is nanosecond capable," he said.
Karonis said that is still a few years away. "You will probably have the most simple models that can be ported to chips and that can be controlled by software, but that is already more or less possible. So the game will be about putting the more ambitious projects on."
In the meantime, Melonakos said he considers a variety of factors in any cost-benefit analysis. Performance is a clear one, but there are others that are less obvious. Programmability is important because the development time tends to represent such a large chunk of the overall cost.
Another factor is scalability. For instance, Melonakos notes that in some situations, a firm might get great speed at up to 10 nodes but the benefit may not scale up after a certain number of nodes.
Board diagram from BittWare
Portability is also important. Can a developer write something once and run it on a variety of types of hardware? The push towards OpenCL - or Open Computing Language - helps on this front. Huizen said the OpenCL movement is key for FPGA because it opens the way for more software programmers to be able to write for the chips. For a hardware engineer, FPGA languages are not considered such a big deal, but that's not the case for software developers.
Finally there is what Melonakos called 'community'. This is where a firm has to consider the overall viability of a language, system or platform.
"Thank you, SEC"
Adoption, perhaps unsurprisingly given the cost, has been led by the banks and brokers.
"At the moment, the most activity is coming from the sell side to offer up to the buy side," Dangerfield of Fixnetix said. "A lot of the execution desks get beaten up on the fact that they're not fast enough," he said.
Hedge funds, he said, are also interested for obvious reasons, and exchanges are excited about the technology because of the implications for exchange gateways.
But there is another big driver beyond the pure performance factors.
Firms like Fixnetix, Enyx and BittWare can thank the Securities & Exchange Commission for Rule 15c3-5. As the SEC says, this requirement effectively eliminated the practice of naked access to an exchange or alternative trading venue.
Under the rule, broker dealers need to maintain a risk management system that can prevent the entry of orders that exceed pre-set credit or capital thresholds, or that appear to be erroneous, non-compliant or otherwise restricted.
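The categories of check the rule names - credit and capital thresholds, erroneous orders, restricted instruments - can be sketched in a few lines. This is a hedged illustration of the shape of such a gate, not anyone's production system; the field names and thresholds are invented:

```python
# Hypothetical sketch of a Rule 15c3-5-style pre-trade gate. Every order
# must clear credit/capital and sanity checks before it reaches the venue.
# All names and limits here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    qty: int
    price: float

def pre_trade_check(order, open_exposure, credit_limit, max_notional,
                    restricted):
    """Return (allowed, reason). Thresholds are hypothetical examples."""
    notional = order.qty * order.price
    if order.symbol in restricted:
        return False, "restricted symbol"
    if notional > max_notional:
        return False, "order looks erroneous (fat-finger limit)"
    if open_exposure + notional > credit_limit:
        return False, "credit threshold exceeded"
    return True, "ok"

ok, why = pre_trade_check(Order("XYZ", 100, 50.0),
                          open_exposure=900_000, credit_limit=1_000_000,
                          max_notional=250_000, restricted=set())
print(ok, why)
```

The catch, of course, is that every such check sits on the critical path of every order - which is precisely why firms look to do this work in hardware.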
If broker dealers are to meet those requirements while still maintaining speed, FPGA technology becomes important.
As in many areas of financial technology, the US markets are currently leading the way.
"The US markets are very mature. That's where we need to explain much less than anywhere else what is the value proposition," Karonis said. "I would say that outside of the US it's not a greenfield but it's not mature yet and there are plenty of opportunities."
Looking at Asia, which he manages, Karonis said there were definitely business cases for South Korea and Australia, while Hong Kong markets were seeing platform upgrades as well.
For Dangerfield, the business case is limitless.
"I don't see why, in the next two to three years, all execution management systems are not within the FPGA world," he said, noting the need for regulatory compliance.