Select News

The news in this category has been selected by us because we thought it would be interesting to hard-core cluster geeks. Of course, you don't have to be a cluster geek to read the news stories.

HPC500 Members Get In-Depth Look at 2014 HPC Software Environment Market Data

Addison Snell of Intersect360 Research presented the firm’s findings on HPC Software Environments in 2014 to the members of the HPC500, an elite user group representing a worldwide, diverse collection of established HPC professionals from a cross-section of academic, government, and commercial organizations.

The presentation drew on findings from several different reports. Intersect360 Research recently released two HPC User Site Census reports, one on applications and one on middleware. The reports examined the suppliers, products, and primary usage of the application software and middleware reported at a number of sites over the previous year. The presentation also included findings from the research firm’s annual HPC user budget map, which tracks spending patterns and shows the percentage of spending by category, as well as from a special study on “The Big Data Opportunity for HPC,” which surveyed both HPC and non-HPC enterprises (through a partnership with Gabriel Consulting) on Big Data applications and infrastructures.

Read more: HPC500 Members Get In-Depth Look at 2014 HPC Software Environment Market Data

How Fast Can You Use a Petabyte of Storage?

A petabyte of space should be enough for anybody

Quick: you just got a petabyte of storage. How fast can you fill it? And once it is full, how fast can you dump it? If you estimated anything more than 14 hours, then you are too slow. Way too slow. By combining well-designed hardware and BeeGFS (f.k.a. FhGFS), you get the Scalable Informatics FastPath Unison storage appliance: a big chunk of storage that delivers 20 gigabytes/sec (sustained). You can learn more details from the interview on insideHPC.
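
That 14-hour figure is just arithmetic on the quoted bandwidth. A quick back-of-the-envelope check (assuming a decimal petabyte, the 20 gigabytes/sec sustained rate, and no overhead) looks like this:

# Seconds to move 10^15 bytes at 20 x 10^9 bytes/sec, converted to hours
echo "scale=1; (10^15 / (20 * 10^9)) / 3600" | bc
# prints 13.8 -- just under the 14-hour mark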

Solving the Industry HPC Challenge

Free HPC secrets! Your key to success!

The Council on Competitiveness recently released Solve. The Exascale Effect: the Benefits of Supercomputing Investment for U.S. Industry (pdf). As the federal government pursues exascale computing to achieve national security and science missions, Solve examines how U.S.-based companies also benefit from leading-edge computation and new technologies first funded by government.

Read more: Solving the Industry HPC Challenge

In Case You Were Wondering: Apache Spark, Hadoop, and HPC

Moving beyond MapReduce, if that is your cup of tea

Recently, Hadoop distribution provider Hortonworks announced it was Extending Spark on YARN for Enterprise Hadoop. If you are not up to speed in the Hadoop world, there are a few points of interest for HPC in this announcement. First, while Hadoop version 1 was synonymous with MapReduce, Hadoop version 2 has "demoted" MapReduce to an application framework that runs under the YARN (Yet Another Resource Negotiator) resource manager. Thus, Hadoop version 2 has opened up a Hadoop cluster to many applications other than MapReduce.
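
To make the "applications other than MapReduce" point concrete, here is a minimal sketch of submitting a Spark job to a YARN-managed Hadoop 2 cluster instead of running it as a MapReduce job. The JAR, class name, and resource settings are hypothetical placeholders, and the exact flags vary by Spark release:

# Launch a (hypothetical) Spark application under YARN instead of MapReduce
spark-submit \
  --master yarn-cluster \
  --class com.example.MyAnalysis \
  --num-executors 8 \
  --executor-memory 4g \
  my-analysis.jar /data/input /data/output

YARN handles the resource allocation, so Spark executors and MapReduce tasks can share the same cluster.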

Read more: In Case You Were Wondering: Apache Spark, Hadoop, and HPC

Update and Help on "Shellshock" BASH Vulnerability

Keep Calm and Patch On

The BASH vulnerability has taken everyone by surprise -- much like finding the wheels of your 20-year-old bike can easily fall off. For Red Hat-based distributions, you can follow the progress at Bash Code Injection Vulnerability via Specially Crafted Environment Variables (CVE-2014-6271, CVE-2014-7169). There have been two updates to BASH for Red Hat-based systems over the last few days. It is safe to say the extent of the vulnerability is still not fully known, and exploitation vectors are still being investigated. While web servers can immediately benefit from the work of the community, home and small-office routers may also be at risk. Obviously, fixing BASH is the best approach to reduce the risk. The current, least vulnerable, version is:

bash.x86_64 0:4.1.2-15.el6_5.2

and should be available from most repositories by now. A good discussion of the previous and new issues, along with the latest unofficial patch, can be found on Google security researcher Michal "lcamtuf" Zalewski's blog. A test to check for the latest vulnerabilities is the following line:

foo='() { echo not patched; }' bash -c foo

If you run this on your systems and it prints "not patched," then you are at risk. If it reports "foo: command not found", you have the latest patch. Of course, other measures, such as making sure cgi_module is not loaded by Apache, are a good idea in any case (unless you are using CGI scripts, which is not a good idea!). Other mitigation strategies are offered by Red Hat.
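
If your BASH is behind, patching on a Red Hat-based system is the usual package-update routine; a minimal sketch (assuming yum-based repositories that already carry the fixed build) is:

# Pull in the latest BASH build, then confirm the installed version
yum update bash
rpm -q bash    # should report 4.1.2-15.el6_5.2 or later on EL6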

Update: This site lists new exploits to try. The current version of BASH, mentioned above, seems to hold up against these issues.
