If you listen to the persistent murmur in the market surrounding the Internet
of Things right now, you'd believe that it's all about sensors. Sensors and
big data. Sensors that monitor everything from entertainment habits to health
status to more mundane environmental data about your home and office.
To a certain degree this is accurate. The Internet of Things comprises, well,
things. But the question that must be asked - and is being asked in some
circles - is not only where that data ends up but how organizations are going
to analyze it and, more importantly, monetize it.
But there's yet another question that needs to be asked and answered - soon.
Assuming these things are talking to applications (whether they reside in the
cloud or in the corporate data center) and vice-versa, there must be some way
to identify them - and the people to whom they belong.
There is a... (more)
Inarguably one of the drivers of software-defined architectures (cloud, SDDC,
and SDN) as well as movements like DevOps is the complexity inherent in
today's data center networks. For years now we've added applications and
services, and responded to new threats and requirements from the business
with new boxes and new capabilities. All of them cobbled together using
traditional networking principles that deliver reliability and scale through
redundancy.
The result is complex, hard to manage, and even more difficult to change at a
moment's notice.
Emerging architectur... (more)
One of the primary principles of object-oriented programming (OOP) is
encapsulation. Encapsulation is the mechanism by which an object's state is
protected from manipulation that is inconsistent with how that state is
intended to be used. The variable (state) is made private; that is to say,
only the object itself can change it directly. Think of it as
the difference between an automatic transmission and a standard (stick). With
the latter, I can change gears whenever I see fit. The problem is that when I
see fit may not be appropriate to the way in which th... (more)
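To make that concrete, here is a minimal sketch in Go, where unexported (lowercase) fields play the role of private variables; the Transmission type and its methods are hypothetical names invented purely for illustration. Callers can read the gear, but only the object itself - like an automatic transmission - decides when to shift.

package main

import "fmt"

// Transmission is a hypothetical type: the gear and rpm fields are
// unexported (lowercase), so code outside this package cannot set them
// directly - Go's equivalent of making state private.
type Transmission struct {
	gear int
	rpm  int
}

// SetRPM reports engine speed; the object itself decides when a shift
// is appropriate, like an automatic transmission.
func (t *Transmission) SetRPM(rpm int) {
	t.rpm = rpm
	switch {
	case rpm > 3000 && t.gear < 6:
		t.gear++ // upshift only when the state says it is appropriate
	case rpm < 1500 && t.gear > 1:
		t.gear-- // downshift is likewise guarded by the object
	}
}

// Gear exposes the state read-only; there is deliberately no SetGear.
func (t *Transmission) Gear() int { return t.gear }

func main() {
	t := &Transmission{gear: 1}
	t.SetRPM(3500)
	fmt.Println("gear:", t.Gear()) // prints: gear: 2
}

Code in another package that tried t.gear = 5 simply would not compile; the only way to influence the gear is through SetRPM, which is exactly the protection encapsulation is meant to provide.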
Despite the hype and drama surrounding the HTTP 2.0 effort, the latest
version of the ubiquitous HTTP protocol is not just a marketing term. It's a
real, live IETF standard that is scheduled to "go live" in November (2014).
And it changes everything.
There are a lot of performance-related changes in the HTTP 2.0 specification,
including multiplexing and header compression. These should not be dismissed
as minimal updates, as they significantly improve performance,
particularly for clients connecting over a mobile network. Header
compression, for example, minimizes the re... (more)
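As a rough illustration of what multiplexing buys you, here is a sketch in Go, whose standard net/http client transparently negotiates HTTP/2 over TLS when the server supports it (the URLs are placeholders; any HTTP/2-capable server will do). Concurrent requests to the same host can ride over a single connection instead of each opening its own, and resp.Proto reports which protocol version was actually negotiated.

package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	// The default transport negotiates HTTP/2 over TLS when the server
	// supports it; concurrent requests to the same host can then be
	// multiplexed over a single connection rather than one each.
	urls := []string{
		"https://www.google.com/", // placeholder HTTP/2-capable host
		"https://www.google.com/robots.txt",
	}
	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			resp, err := http.Get(u)
			if err != nil {
				fmt.Println("error:", err)
				return
			}
			defer resp.Body.Close()
			// resp.Proto reports the negotiated version, e.g. "HTTP/2.0"
			fmt.Println(u, "->", resp.Proto)
		}(u)
	}
	wg.Wait()
}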
Go ahead. Name a cloud environment that doesn't include load balancing as the
key enabler of elastic scalability. I've got coffee... so it's good; take
your time.
Exactly. Load balancing - whether implemented as traditional high
availability pairs or clustering - provides the means by which applications
(and infrastructure, in many cases) scale horizontally. It is load balancing
that sits at the heart of elastic scalability models, ensuring availability
and even improving the performance of applications.
But simple load balancing alone isn't enough. Too many ... (more)
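For the sake of illustration, here is a minimal round-robin load balancer sketched in Go using the standard library's httputil.ReverseProxy; the backend addresses are hypothetical, and a production proxy would also health-check pool members before sending them traffic. Each incoming request is handed to the next pool member in turn, which is the horizontal-scaling pattern described above.

package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	// Hypothetical pool members; real deployments would discover and
	// health-check these before sending them traffic.
	backends := []*url.URL{
		mustParse("http://10.0.0.10:8080"),
		mustParse("http://10.0.0.11:8080"),
	}

	var next uint64
	proxy := &httputil.ReverseProxy{
		// Director rewrites each request to point at the next backend,
		// distributing load round-robin across the pool.
		Director: func(r *http.Request) {
			b := backends[atomic.AddUint64(&next, 1)%uint64(len(backends))]
			r.URL.Scheme = b.Scheme
			r.URL.Host = b.Host
		},
	}
	log.Fatal(http.ListenAndServe(":8080", proxy))
}

func mustParse(raw string) *url.URL {
	u, err := url.Parse(raw)
	if err != nil {
		panic(err)
	}
	return u
}

Round-robin is the simplest possible scheme; the point is that scale comes from adding members to the pool rather than from a bigger box.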