Transactional Memory: An Idea Ahead of Its Time

Nearly 20 years ago, a Brown University computer scientist and a colleague were working on a largely theoretical problem: How could multiple parallel processors make changes to shared resources safely and efficiently? Their proposal, transactional memory, is sparking fresh interest as a new generation of processors seeks improved power and speed.

In 1993, Maurice Herlihy and a colleague published a paper on transactional memory, a clever new technique for handling concurrent changes to shared data safely and efficiently. Few noticed.

Nearly 20 years later, transactional memory is all the rage in computer hardware, and Herlihy, a professor of computer science at Brown University, has morphed into a prophet of sorts, a computing pioneer who was far ahead of his time. Intel recently announced that transactional memory will be included in its mainstream “Haswell” hardware architecture by next year. IBM has adopted transactional memory in its Blue Gene/Q supercomputer. The original paper by Herlihy and Eliot Moss has been cited more than 1,300 times.

“At the time, we thought (transactional memory) was really cool, but it was hard to attract attention to it — it was this specialized corner,” Herlihy said. “We weren’t necessarily geniuses ahead of our time, because it solved a problem that didn’t exist yet.”

Maurice Herlihy: “We always said that parallel machines with more than one processor were going to be important someday, but nobody knew when it would happen.” Image credit: Mike Cohea/Brown University

The problem, as Herlihy explained it, was that processors, the engines that execute program instructions, were changing in fundamental ways. As computers and their components got smaller and smaller, a single processor could no longer scale up to handle all the heavy lifting on data. But multiple processors presented their own challenge: how to coordinate the sharing and revising of information, in parallel and in real time.

At the time, concurrent programming was ruled by locks: only one thread could hold the key to a given lock at any one time. This ensured that no other thread could modify a shared resource at the same moment, but it also dragged down speed and efficiency.
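For readers who want to see what that looks like in code, here is a minimal sketch of the lock-based approach in C++ (the shared counter, thread count, and iteration count are illustrative, not from the article): a single mutex acts as the key, so threads serialize whenever they touch the shared data.

```cpp
// Minimal illustration of lock-based synchronization: one mutex guards a
// shared counter, so only one thread at a time can enter the critical section.
#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

int shared_counter = 0;
std::mutex counter_lock;  // only one thread may hold this "key" at a time

void worker(int iterations) {
    for (int i = 0; i < iterations; ++i) {
        std::lock_guard<std::mutex> guard(counter_lock);  // acquire the lock
        ++shared_counter;                                 // critical section
    }                                                     // lock released here
}

int main() {
    std::vector<std::thread> threads;
    for (int t = 0; t < 4; ++t) threads.emplace_back(worker, 100000);
    for (auto& th : threads) th.join();
    std::cout << shared_counter << "\n";  // always 400000, but every update waited its turn
}
```

Every increment waits for the lock, even when no other thread is anywhere near the shared data, and that forced waiting is precisely the overhead Herlihy wanted to avoid.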

“The trick was figuring out how to do this in a more efficient way,” Herlihy said.

Herlihy, who had just joined the Brown faculty from a Cambridge, Mass., company called Digital Equipment Corp., and Moss, a computer scientist at the University of Massachusetts–Amherst, sought to design a more flexible system. Their solution was simple and elegant. The pair dreamed up a scheme of requests and permissions, whereby operations begin speculatively and are logged, but the wholesale changes, or transactions, are not made permanent until the system checks that no other thread has requested changes to the same data in the meantime. If no competing changes have been requested, the transaction is consummated; if there is another change request, the transaction is aborted, and the threads start anew.
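The following is a minimal software analogue of that optimistic scheme, written with C++ atomics. It is not the hardware mechanism Herlihy and Moss proposed; it only illustrates the begin, check, and commit-or-abort pattern they describe, using a compare-and-swap as the commit step.

```cpp
// Optimistic update: read a shared value, compute a change privately, then
// commit only if no other thread has changed the value in the meantime;
// otherwise abort and start the "transaction" anew.
#include <atomic>
#include <iostream>
#include <thread>
#include <vector>

std::atomic<int> shared_value{0};

void transactional_increment() {
    while (true) {
        int observed = shared_value.load();   // "begin": read and log the current value
        int proposed = observed + 1;          // compute the new value privately
        // "commit": succeeds only if shared_value still equals observed,
        // i.e., no other thread has changed it since we read it.
        if (shared_value.compare_exchange_weak(observed, proposed)) {
            return;                           // transaction committed
        }
        // Conflict detected: the transaction aborts and the loop starts over.
    }
}

int main() {
    std::vector<std::thread> threads;
    for (int t = 0; t < 4; ++t)
        threads.emplace_back([] { for (int i = 0; i < 100000; ++i) transactional_increment(); });
    for (auto& th : threads) th.join();
    std::cout << shared_value << "\n";  // 400000, with no lock held during the computation
}
```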

“We always said that parallel machines with more than one processor were going to be important someday, but nobody knew when it would happen,” Herlihy said. “We were surprised that it did happen as we thought it would.”

Transactional memory has been lighting up the computing technology chattersphere in recent months. On the well-regarded Ars Technica site, a writer called transactional memory “a promising technique designed to make the creation of multithreaded programs easier.” Intel advertises its form of transactional memory as “hardware (that) can determine dynamically whether threads need to serialize through lock-protected critical sections, and perform serialization only when required. This lets the processor expose and exploit concurrency that would otherwise be hidden due to dynamically unnecessary synchronization.”
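As a rough sketch of how that hardware interface is typically used (assuming a TSX-capable processor, compilation with -mrtm, and an illustrative spinlock as the fallback; none of this code comes from Intel or the article), a thread first attempts the critical section as a hardware transaction via the documented _xbegin, _xend, and _xabort intrinsics, and serializes through the lock only if the transaction aborts.

```cpp
// Sketch of lock elision with Intel's restricted transactional memory (RTM).
// Requires a TSX-capable CPU and compilation with -mrtm; the spinlock is an
// illustrative stand-in for whatever lock the program would otherwise use.
#include <immintrin.h>
#include <atomic>

std::atomic<bool> fallback_lock{false};  // simple spinlock used as the fallback
int shared_data = 0;

void lock_fallback()   { while (fallback_lock.exchange(true, std::memory_order_acquire)) { /* spin */ } }
void unlock_fallback() { fallback_lock.store(false, std::memory_order_release); }

void update_shared_data() {
    unsigned status = _xbegin();                 // try to start a hardware transaction
    if (status == _XBEGIN_STARTED) {
        // Subscribe to the lock: if another thread holds (or later acquires)
        // the fallback lock, this transaction must abort rather than commit.
        if (fallback_lock.load(std::memory_order_relaxed)) _xabort(0xff);
        ++shared_data;                           // speculative critical section
        _xend();                                 // commit the transaction
    } else {
        // Transaction aborted (conflict, capacity limit, lock held, ...):
        // fall back to actually serializing through the lock.
        lock_fallback();
        ++shared_data;
        unlock_fallback();
    }
}
```

The transactional path reads the fallback lock so that a competing thread taking the lock forces an abort; that check is what lets the hardware “perform serialization only when required,” as Intel’s description puts it.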

Although it was a long time coming, the buzz around transactional memory is “very satisfying,” Herlihy said modestly. “This shows the importance of long-range research.”

– By Richard Lewis

Source: Brown University
