Information Technology News.

Cray wins a $174 million contract with the U.S. military

July 14, 2014

Cray said earlier this morning that it has won a $174 million contract to supply a new supercomputer to the National Nuclear Security Administration, the agency that oversees the U.S. nuclear weapons stockpile.

The Cray XC supercomputer to be provided under the agreement will be hooked up to the company's Sonexion storage system.

Called “Trinity”, the powerful system is expected to have more than 8 times the capacity of the NNSA's current supercomputer, a Cray XE6 unit dubbed Cielo, which the TOP500 list says has 107,152 cores and a theoretical peak performance of just over 1,028 teraflops.
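As a rough sanity check (our arithmetic, not a figure from Cray or the TOP500 list), Cielo's quoted numbers work out to roughly 9.6 gigaflops per core, and an 8x jump would put Trinity in the neighbourhood of 8 petaflops of peak performance:

```python
# Cielo's published figures, per the TOP500 list
cielo_cores = 107_152
cielo_peak_tflops = 1028  # theoretical peak, in teraflops

# Peak performance per core, in gigaflops
per_core_gflops = cielo_peak_tflops * 1000 / cielo_cores
print(f"Cielo, per core: {per_core_gflops:.1f} GFLOPS")  # ~9.6

# "More than 8 times the capacity" implies a peak of at least roughly:
trinity_peak_pflops = cielo_peak_tflops * 8 / 1000
print(f"Trinity estimate (8x): {trinity_peak_pflops:.1f} PFLOPS")  # ~8.2
```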

Cray says that Trinity is a joint venture between “the New Mexico Alliance for Computing at Extreme Scale (ACES) at the Los Alamos National Laboratory and Sandia National Laboratories as part of the NNSA Advanced Simulation and Computing Program (ASC)”.

To be installed at Los Alamos, the Trinity supercomputer will be based on Intel Xeon “Haswell” processors, as well as the upcoming “Knights Landing” Xeon Phi processors.

The storage system will start at 82 PB of capacity with a design throughput of 1.7 TB per second.
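To put those storage numbers in perspective (a back-of-the-envelope calculation of ours, not a spec from Cray), writing the full 82 PB once at the design throughput of 1.7 TB per second would take a bit over half a day:

```python
# Quoted design figures for Trinity's Sonexion storage
capacity_pb = 82
throughput_tb_s = 1.7

capacity_tb = capacity_pb * 1000         # 82 PB = 82,000 TB (decimal units)
seconds = capacity_tb / throughput_tb_s  # time to write the capacity once
hours = seconds / 3600
print(f"{seconds:,.0f} s, or about {hours:.1f} hours")  # ~13.4 hours
```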

The computer's main task will be to run simulations of the U.S. nuclear stockpile, helping the agency understand how its weapons are holding up as they age while avoiding the need for underground detonations of devices.

Cray says the new system will test “the stockpile’s safety, security, reliability and performance.”

In other IT news

IBM said earlier this morning that it's working on porting its mainframe platform onto the cloud, even as it continues to invest money into the rollout of its SoftLayer offerings.

SoftLayer, with its thirteen data centres, was acquired by IBM last year to help the IT giant accelerate its own public cloud presence.

To be fair, SoftLayer's European coverage has been sparse so far, with a data centre in Amsterdam and a single London point of presence.

But IBM is pouring $1.2 billion into turning SoftLayer’s cloud around, with another 15 data centres slated to break the company out of its North American heartland.

SoftLayer sales manager and co-founder Steve Canale said its data centres follow a fairly set pattern: each consists of four pods, housing between 3,500 and 4,000 cores, and covers around 10,000 square feet.

Centres are scaled up as demand grows, which presumably means that IBM will be holding its breath to see how quickly the first room fills.

In a statement, SoftLayer said the London centre will house more than 15,000 physical servers. Once a centre is full, SoftLayer simply builds another.

None of these will be IBM servers, by the way. IBM is selling off its x86 server business to Chinese giant Lenovo, and even if it weren't, SoftLayer has a long-standing relationship with whitebox x86 server vendor Supermicro.

But there is still some hope for those who'd like to see IBM’s veteran brands scrambling onto the cloud, in the shape of its Power architecture and mainframe technology as well as its Watson AI platform.

Doug Clark, UKI cloud computing leader at IBM, confirmed at a briefing on the Chessington site, “We’re looking at Linux on mainframe as a platform.”

While there might be some nostalgia value in having “mainframe” technology offered via the cloud, opinion might be split on exactly how much value this offers to modern companies who haven't grown up with the big iron.

The president of the Open Data Centre Alliance, Correy Voo, recently advised against trying to “lift” legacy applications onto the cloud, and said the sort of constraints under which legacy systems had been built were alien to the new generation of technology executives.

But given some of the data centre consolidation programs being floated by major companies today, it seems possible that at least some would still appreciate the sheer transactional heft and robustness of an on-tap mainframe.

In other IT news

There appear to be a lot of people out there who are confused and uncertain about which version of Windows they should choose for their main work computer, and for good reason.

News has already started to propagate about the next update to Windows 8.1, namely Update 2, and even about Windows 9 for that matter. So what will Microsoft call it anyway?

Whether it bears the name Windows 9 or something else, the next major wave of updates, codenamed Threshold, is due to land on our desktops and mobiles later in 2014. At any rate, we will try to help clear up that confusion.

The upcoming refresh will probably see lots of new features aimed at the pre-Windows 8 crowd, ie, the traditional desktop users who are more familiar with input using a mouse and keyboard instead of touch.

Threshold will have a Start menu, the ability to run Metro-style Windows Store apps on the normal Windows desktop, and offer users the chance to shut off the Metro screen.

Before all that, in April, we got Windows 8.1 Update 1, which can be seen as a step by Microsoft to convince enterprises that the operating system is really enterprise-ready.

But not everybody is convinced that it is. The update doesn’t offer any real new features, but rather certain enhancements for the traditional desktop user – giving them a working experience similar to what they are used to.

If you’re using Windows 8.1 on a tablet or a touch-enabled device, you won't spot much in the way of difference.

But before we go any further, let’s be honest here. For people in IT who are already very familiar with Windows, Windows 8 has taken some considerable adjustment.

Switching between new-style apps and traditional desktop apps can be very confusing, and it takes some time to figure it all out, which can lead to a loss of productivity.

Traditional Windows functions are either missing or so deeply hidden that it takes three people to figure them out.

The last thing enterprise IT system admins want to do is deploy a new operating system that confuses workers.

For instance, what does Update 1 do? On the Surface Pro, it didn't even provide a wow factor. At first, you don't notice any difference in look and feel. However, if you look closely, you'll notice that there is now a Power button on the Start page! What?

One of our biggest pet peeves with Windows 8 and Windows Server 2012 was how to shut down the system. Seriously Microsoft, why did you make it so difficult to shut it down?

Shutting down and restarting a computer is a very basic function that should be easy to find and shouldn't take more than a few seconds to locate.

Prior to Update 1, powering down your computer or laptop required you to move the mouse to the lower left-hand corner of the screen, hover over that corner, right-click the Start button (or press the Windows key and X on your keyboard), then click "Shut down", or "Sign out" in the case of Windows Server 2012.

Now the Power button is clearly on display in the Start Menu. Something like this shouldn’t have been made so hard to do.

Another change was the addition of the Search function on the Start page. Just like the Power button feature, the Search function makes finding software so much easier than before.

On a non-Update 1 system, searching meant you had to swipe your finger in from the right edge of the screen to start the process, or hover your mouse over the lower right-hand corner of the screen and then move the cursor up to the Search box to type in your query. Ah!

In other IT news

Intel, Samsung, Dell and Broadcom, among others, have founded a new group called the 'Open Interconnect Consortium' to promote new standards that will help the development of the 'Internet of Things'.

The new consortium says it “will seek to define a common communication framework based on industry standard technologies to wirelessly connect and intelligently manage the flow of information among devices, regardless of form factor, operating system or service provider.”

The new consortium also says “It is our intention to create a specification and an open source project that will allow interoperability for all types of devices and operating systems.”

For now, all we have is the statement that “Additional technical details will be announced at a later time.”

The launch comes just six days after Microsoft pledged itself to the AllSeen Alliance, another group that says its aim is “To enable widespread adoption and help accelerate the development and evolution of an interoperable peer connectivity and communications framework based on AllJoyn for devices and applications in the Internet of Everything.”

So the question is, are the two groups different? The Open Interconnect Consortium (OIC) offers the following description of its differences with rivals: “Today, there are multiple forums driving different approaches to solve the challenge of IoT connectivity and interoperability. Currently, we don’t see just one single effort that addresses all the necessary requirements. The companies involved in OIC believe that secure and reliable device discovery and connectivity is a foundational capability to enable IoT. The companies also believe that a common, interoperable approach is essential, and that both a standard and open source implementation are the best route to enable scale.”

That statement seems a little odd as the Allseen Alliance's spiel says its “Members are collaborating on a universal software framework, based on AllJoyn open source code, that allows devices to autonomously discover and interact with nearby products regardless of their underlying proprietary technology or communications protocols.”

Source: Cray Inc.

All logos, trade marks or service marks on this site are the property of their respective owners.


       © IT Direction. All rights reserved.