auDA begins final round of DNSSEC testing before its approval
April 24, 2014
auDA wants to introduce DNSSEC into the Australian domain name space, signing the .au domain in its production environment as the first step in a four-month testing programme.
.au Domain Administration Ltd (auDA) is the governing authority and industry self-regulatory body for the .au domain segment in Australia.
DNSSEC has been possible for years, but has been held back by industry inertia. Under DNSSEC, a DNS (domain name system) record is
signed, allowing resolvers to authenticate the relationship between domain names and IP addresses where they are hosted.
But the slowly evolving rollout has gathered some small momentum in response to the increasing use of DNS as an attack vector (for example, via redirections).
In 2013, Google began validating DNSSEC records in its public DNS resolvers. The issue for the typical system admin is that DNSSEC is
needed all the way up the chain, from their own site back to the root zone, meaning that the auDA rollout is a vital step in the deployment
of the protocol for .au domains.
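The "chain" here is DNSSEC's chain of trust: each zone publishes a DS record in its parent, which is a digest of the child zone's key-signing DNSKEY. As a rough illustration of how a validating resolver links the two, here is a minimal sketch of the DS digest computation described in RFC 4034/4509 (digest type 2, SHA-256 over the owner name in wire format followed by the DNSKEY RDATA). The key material below is hypothetical, for illustration only:

```python
import hashlib

def wire_name(name):
    # DNS wire format: length-prefixed labels, lowercased, terminated by the root label (0x00)
    out = b""
    for label in name.rstrip(".").lower().split("."):
        out += bytes([len(label)]) + label.encode("ascii")
    return out + b"\x00"

def ds_digest(owner, dnskey_rdata):
    # RFC 4509 digest type 2: SHA-256(owner name in wire format | DNSKEY RDATA)
    return hashlib.sha256(wire_name(owner) + dnskey_rdata).hexdigest().upper()

# Hypothetical DNSKEY RDATA (flags/protocol/algorithm header plus dummy key bytes)
rdata = bytes.fromhex("0101030803010001") + b"\x00" * 32
digest = ds_digest("example.au.", rdata)
```

A resolver trusts the child zone's key only if this digest matches the DS record published in the parent, which is why every link from a site's own zone up to the root must be signed.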
auDA explains that it has taken a cautious approach over the last year and a half because the protocol introduces a new level of risk for
registry operators. DNSSEC requires the inclusion of cryptographic keys in the DNS and, at times, frequent editing of a zone file. This
level of interaction and the complexity of cryptographic keys greatly increase the risk of error during a zone change or update.
A DNS error made to a signed zone can cause a zone to appear offline or bogus to validating resolvers, the organisation writes.
Right now, the body says, the signed .au zone is simply experimental. Over the next four months, the group plans to use the signed
domain to finish testing its own processes for supporting signed domains, including production load tests, testing signing events, and
helping second-level domain owners add their own signed records into the .au zone.
The plan is that on August 28, 2014, auDA will submit its record to IANA, and DNSSEC will then be available for .au domain owners.
In other IT news
The OpenPOWER Consortium was formed by IBM a year ago, at a time when Big Blue and other IT companies were seeing their hardware
divisions cut down by a serious drop in spending from enterprise clients.
The drop in sales was also attributed to commodity servers that could still be customised by low-cost manufacturers in Asia.
Also, Intel's x86 architecture continued to dominate the market in both typical servers and high-performance computing, putting
alternative-architecture providers like Oracle, IBM and, to a lesser extent, HP, in a very tough position.
So the question is, how should IBM keep its POWER chips alive and guarantee them a larger market in a changing world? Big Blue's answer
to this predicament was OpenPOWER, which seeks to do for its chip architecture what British company ARM's licensing model did for its
eponymous chips, which now underpin the vast majority of the world's phones and tablets.
IBM is seeking with OpenPOWER to do to 'hyperscale' servers what ARM did to phones, and in doing so create itself a huge stream of
low-margin revenue that it can rely upon in years to come.
And although no one has said it so far, a helpful side effect is that this may cut into Intel's large business in huge data centers.
IBM's hope is that by licensing the innards of its POWER chips to companies like Google, Canonical, Nvidia, Tyan and Suzhou PowerCore Technology,
it may be able to create new markets for the chip beyond Big Blue's traditional mainframes and high-end enterprise systems.
The OpenPOWER Consortium is, in many ways, where the guerrilla development approach of open source meets the expensive, complex
world of chip hardware.
IBM and its partners are betting that the architecture is good enough to meet their expectations. Given the enthusiastic mood
that existed throughout the press conference, it was only natural that Intel would point out some of the possible drawbacks of the plan.
"The OpenPOWER Foundation may hope to someday create an open solution, but it also faces a complex multi-year effort to establish
an ecosystem around the design, manufacturing and software," an Intel spokesperson said.
"Most data centers today run on Intel and we are not slowing down. Businesses recognize the value and there is a
large and growing x86 ecosystem (established over many years) that isn't going away. Creating an ecosystem is not an easy feat and
could take several years and a significant investment of time and money in porting architectures," he added.
By comparison, Intel competitor ARM was much more upbeat about the whole matter. "Across the server market, even within non-volume
servers, users are ready to move beyond the 'one size fits all' approach for servers and OpenPOWER is further validation of this as
well as the ARM business model," an ARM spokesperson said.
"Server customers are demanding choice and differentiation which is why ARM and its partners are already well underway with our work
to move the volume server market beyond the limitations associated with a proprietary architecture."
With OpenPOWER's 26 partners ranging from equipment makers to rich potential customers like Google, the scheme has a chance of
working. Maybe ARM has a new potential partner in its plan to pull Intel's chips out of the biggest data centers? We shall see.
In other IT news
Scientists and university researchers have been sequencing DNA genomes for many years, but applying what we already know about genetics
to everyday medicine is a difficult and rather daunting task.
Crafting treatments from genes is so complex that IBM recently entered a partnership to teach its Watson supercomputer
to help the medical profession tailor personalised treatments for cancer.
Part of the issue that researchers want to solve is gene expression. Amid all the complexity of how genes interact, which
interactions end up expressed as a physical trait, whether that trait is blue eyes or the reason one individual dies of a cancer that is
arrested in someone else?
What's needed is a method to accurately predict gene expression, and one angle of the research is based on RNA sequencing (RNA-seq).
The problem is that analysing RNA sequencing data is a very slow process, and that's where the research out of Carnegie Mellon
University and the University of Maryland comes in.
Their so-called Sailfish algorithm dramatically accelerates estimates of the likely outputs of RNA sequencing. To explain why
this is so important, the researchers' release says: “Though an organism's genetic makeup is static, the activity of individual
genes varies greatly over time, making gene expression an important factor in understanding how various organisms work and what
occurs during disease processes. Gene activity can't be measured directly, but can be inferred by monitoring RNA, the molecules
that carry information from the genes for producing proteins and other cellular activities.”
But analysing the RNA-seq reads (short sequences of RNA) traditionally results in huge datasets that have to be mapped back to
their original genetic processes.
The Sailfish algorithm completely skips this painstaking mapping step, thereby increasing the speed of the process by a wide margin.
Instead, the university researchers found they could allocate parts of the reads to different types of RNA molecules, much as
if each read acted as several votes for one molecule or another.
Think of it as upvoting posts in a forum: individual votes bestow a kind of consensus on which reads or posts carry the greatest weight.
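The voting idea can be sketched in a few lines. This is not Sailfish itself, just a toy illustration of the alignment-free principle: break each read into short k-mers, look them up in a pre-built index of which transcripts contain each k-mer, and let each hit cast a fractional vote split among the matching transcripts. The transcript names and sequences are made up for the example:

```python
from collections import Counter, defaultdict

def kmers(seq, k=5):
    # All overlapping substrings of length k
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def vote_expression(transcripts, reads, k=5):
    # Build the k-mer index once (Sailfish's key trick: index k-mers, not reads)
    index = defaultdict(set)
    for name, seq in transcripts.items():
        for km in set(kmers(seq, k)):
            index[km].add(name)
    # Each k-mer of each read casts a vote, split evenly among the
    # transcripts that contain it -- no per-read alignment is performed.
    votes = Counter()
    for read in reads:
        for km in kmers(read, k):
            hits = index.get(km)
            if hits:
                for name in hits:
                    votes[name] += 1.0 / len(hits)
    return votes

transcripts = {"tA": "ACGTACGTAC", "tB": "TTTTGGGGCC"}
tally = vote_expression(transcripts, ["ACGTACG"], k=5)
```

Because lookups in the index are constant-time, the cost scales with the number of k-mers rather than with the much more expensive work of mapping every read back to its position in a reference.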
Source: IBM Corp.