What do Magnum, P.I., Guns N’ Roses, and mainframe computers have in common? As you probably guessed, they were all hits in the late 1980s. We’ve long moved on from the first two ‘80s wonders, but mainframes are still with us, serving up critical data for business decisions.

Fortunately, we’ve advanced considerably from the days of dedicated terminals talking to expensive hardware in a locked data center. Those advancements weren’t always simple or efficient, however. As we adopted personal computers in our everyday work lives, we needed to develop increasingly complex tools for accessing, analyzing, and presenting mainframe data. The growth of the Internet made life even more complex by requiring middleware to access and convert data for presentation on the web.

Technology consultant Rob Klopp witnessed these changes firsthand. With more than 25 years of experience in big data, Rob is well versed in the evolution of business computing: from mainframes running enterprise software, to client-server architectures, to distributed architectures, and finally to virtualized architectures on multi-core servers.

Prowess Consulting recently put together a short video that presents Rob’s take on the history of business computing. The video shows how technology grew increasingly complex and bloated in order to meet expanding demands. All that changed with the parallel development of virtualization technology and ever faster, more efficient multi-core processors from Intel. Database vendors like SAP are now optimizing their software to take advantage of larger memory capacities and multi-core Intel processors, delivering new levels of performance at more affordable prices.

Check out the full story in our video, and let us know how your business is benefiting from multi-core technology for database computing.
