Bitfusion Launches Beta Program
November 17, 2015
Bitfusion, a developer of optimization technologies for high-performance, compute-intensive applications, has launched Boost Beta, a software-acceleration layer that automatically accelerates compute-intensive applications. It offers performance improvements for Blender, R, NumPy, Octave, MATLAB, FFmpeg, Torch and more.
Boost Beta lets users automatically upgrade application performance without purchasing new hardware. To do so, it supplies custom-compiled, pre-optimized open-source libraries for applications in data analytics, machine learning, rendering and scientific computing.
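The general principle is easy to demonstrate with NumPy, one of the packages Boost targets: NumPy dispatches dense linear algebra to whatever BLAS library it was built against, so linking a faster, hardware-tuned BLAS speeds up existing code with no source changes. The snippet below is a generic illustration of that library-substitution idea only; it makes no assumptions about how Bitfusion's own acceleration layer works.

    import time
    import numpy as np

    # Report which BLAS/LAPACK build this NumPy installation is linked
    # against -- the kind of library a pre-optimized, custom-compiled
    # build would stand in for.
    np.show_config()

    n = 2000
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    start = time.perf_counter()
    np.dot(a, b)  # dense matrix multiply, dispatched to the linked BLAS
    print(f"{n}x{n} matmul took {time.perf_counter() - start:.2f} s")

Running the same script against a stock reference BLAS and then against an optimized one (OpenBLAS or MKL, for example) shows the kind of gap such a library swap can close.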
“So many startups and mid-market companies require faster speeds and better performance not just to keep up with the competition, but to get ahead and really flourish in today’s technology-driven economy,” said Subbu Rama, co-founder and CEO of Bitfusion. “Until now, the cost of upgrading to high-performance speeds was prohibitive. Boost Beta is dramatically cutting the price tag for speed and bringing supercomputing performance to the masses.”
The company has also released its Cloud Adaptor, letting users access FPGA (field-programmable gate array) and GPU (graphics processing unit) hardware in the cloud without extensive integration.
“Prohibitive pricing and delivery models are hindering innovation. Developers shouldn’t be paying for idle time when there’s better, cost-saving technology available to solve the problem,” said Bitfusion co-founder and COO Maciej Bajkowski. “With Adaptor, we are giving new meaning to what ‘bursting’ to a high-performance hardware cloud means.”
For more information, visit Bitfusion.
Sources: Press materials received from the company and additional information gleaned from the company’s website.
About the Author
DE Editors
DE’s editors contribute news and new product announcements to Digital Engineering.
Press releases may be sent to them via [email protected].