When it comes to the future impact of virtualization upon data centers and
networks, we may already have experienced "the shot heard ’round the
world." In fact, the stage is set for the next wave of the "virtualization
revolution."
Most IT organizations already recognize the importance of virtual servers.
Cost savings through better utilization of the underlying server hardware are
well documented, as are the environmental savings on power, cooling and rack
space. A 2012 survey conducted by Gartner Research found that virtualization
is among the top 10 technology priorities for CIOs. ("Gartner
Executive Programs' Worldwide Survey of More Than 2,300 CIOs Shows Flat IT
Budgets in 2012, but IT Organizations Must Deliver on Multiple Priorities.")
Many IT organizations are now looking at extending those benefits to the rest
of the enterprise. Virtua... (more)
For a company like Lafarge, the dispatching of trucks and materials is
mission critical. So when the performance of IT applications supporting this
function suffers, it's bad for business.
Based in Paris, France, Lafarge is a provider of building materials, with
more than $15 billion in annual revenue and 68,000 employees in 64 nations.
It specializes in cement, aggregates and concrete. If fleet drivers run into
delays in getting these materials to construction sites, deadlines are missed
and the fallout can get costly. Yet after the company launched a server
virtualization project, such concerns surfaced due to a decline in the
performance of key applications for dispatch operations.
Ultimately, however, this setback - or at least its resolution - helped pave
the way for something better: a company-wide, virtualized
approach to enterprise resource planning (ERP)... (more)
[This is not really about the Red Sox or pumpkins this Halloween, but how
could I not use those in the title? Go Red Sox]
I left an awful teaser at the end of my article last week. In Brent
Salisbury's original article that triggered some of these additional
virtualization thoughts, he articulated two very clear differences between
native network-based L2 virtualization mechanisms and the mechanisms being
provided by overlay solutions based mostly in server vSwitch
infrastructure. These two fundamental functions are MAC learning and tunnel
encapsulation. In today's post I will spend a little more time looking at
each of these.
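For readers who want to see the mechanics, here is a minimal sketch in plain Python of the first of those two functions, MAC learning. The class and port names are illustrative, not any vendor's implementation:

```python
# A minimal sketch of the MAC learning a switch (or vSwitch) performs:
# on every frame, remember which port the source MAC arrived on, then
# forward to the known port or flood when the destination is unknown.

class LearningSwitch:
    def __init__(self, ports):
        self.ports = ports
        self.mac_table = {}  # MAC address -> port it was last seen on

    def receive(self, in_port, src_mac, dst_mac):
        # Learn: the source MAC is reachable via the ingress port.
        self.mac_table[src_mac] = in_port
        # Forward: use the table when the destination is known...
        if dst_mac in self.mac_table:
            return [self.mac_table[dst_mac]]
        # ...otherwise flood out every port except the ingress port.
        return [p for p in self.ports if p != in_port]

sw = LearningSwitch(ports=[1, 2, 3, 4])
print(sw.receive(1, "aa:aa:aa:aa:aa:01", "aa:aa:aa:aa:aa:02"))  # flood: [2, 3, 4]
print(sw.receive(2, "aa:aa:aa:aa:aa:02", "aa:aa:aa:aa:aa:01"))  # learned: [1]
```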
Outside of logical separation of multiple virtual networks or tenants,
network virtualization allows the number of attached VLANs, networks and
devices to scale well beyond what a single physical switch can handle. In a... (more)
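To make that scaling claim concrete: a standard 802.1Q VLAN tag carries only a 12-bit ID, while an overlay encapsulation such as VXLAN (one common example; the post above does not commit to a specific protocol) carries a 24-bit segment ID. A small Python sketch of both the arithmetic and the VXLAN header layout from RFC 7348:

```python
import struct

VLAN_ID_BITS, VXLAN_VNI_BITS = 12, 24
print(2 ** VLAN_ID_BITS)    # 4096 possible VLANs on a physical segment
print(2 ** VXLAN_VNI_BITS)  # 16777216 possible VXLAN network identifiers

def vxlan_header(vni: int) -> bytes:
    """Build the 8-byte VXLAN header (RFC 7348): flags, reserved, VNI."""
    flags = 0x08 << 24                           # 'I' bit set: VNI is valid
    return struct.pack("!II", flags, vni << 8)   # VNI sits in the top 24 bits

print(vxlan_header(5001).hex())  # 0800000000138900 -> VNI 5001
```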
This is the last article of my 3-part blog post series, written to help
experienced Windows system administrators get productive on Windows Server
2012 with a keyboard and a mouse as the input devices. Part 1 and Part 2
focus on basic user operations with the new Metro UI. To conclude the
series, here are two important facts that one should know when running
Windows Server 2012.
Again, I want to underscore that Windows Server 2012 is designed with cloud
computing in mind, and together with System Center 2012 forms the foundation of
Microsoft private cloud solutions. For those who are working toward becoming
Microsoft private cloud experts, it is imperative to master Windows Server
2012 and System Center 2012 to develop technical depth in implementing and
operating a private cloud.
11. Wireless Support
As with Windows Server 2008, a default installation ... (more)
CiRBA on Wednesday announced the general availability of a Software License
Control System that enables organizations to reduce the costs associated with
processor-based software licensing by an average of 55%.
CiRBA CTO and co-founder Andrew Hillier noted that "with the shift to data
center class software licensing for virtual infrastructure, where licensing
an entire physical host server allows an unlimited number of instances to be
run, licensing optimization is now becoming a capacity management challenge."
The Software License Control module is an add-on to CiRBA's Capacity Control
Console and optimizes VM placements in virtual and cloud infrastructure in
order to:
1. Reduce the number of processors/hosts requiring licenses.
CiRBA determines optimized VM placements to both maximize the density of
licensed components on physical hosts and isolate these licensed VM... (more)
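CiRBA's placement analytics are proprietary, so the following is only a rough illustration of the idea behind item 1, not CiRBA's algorithm: a greedy Python sketch that packs VMs needing a host-licensed product (e.g. a database licensed per physical host) onto as few hosts as possible. All names and numbers are hypothetical:

```python
def pack_licensed_vms(vms, host_capacity):
    """vms: list of (name, cpu_demand); returns host index -> VM names."""
    hosts, loads = {}, []
    for name, demand in sorted(vms, key=lambda v: -v[1]):  # biggest first
        for i, load in enumerate(loads):
            if load + demand <= host_capacity:   # fits on an existing host
                loads[i] += demand
                hosts[i].append(name)
                break
        else:                                    # open a new licensed host
            hosts[len(loads)] = [name]
            loads.append(demand)
    return hosts

vms = [("db1", 8), ("db2", 6), ("db3", 4), ("db4", 4), ("db5", 2)]
print(pack_licensed_vms(vms, host_capacity=12))
# {0: ['db1', 'db3'], 1: ['db2', 'db4', 'db5']} -> license 2 hosts, not 5
```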
Author: Eric Slack, Senior Analyst, Storage Switzerland
Companies that need to improve application performance, for example, in
server virtualization or VDI environments, frequently come to the conclusion
that flash is the best strategy. SSD technology is becoming the "go to"
solution for enhancing the performance of these kinds of critical production
applications. After realizing that a flash-based solution is warranted, the
next question involves implementation. What is the most cost-effective way to
add solid state flash technology into an existing production environment?
IT organizations that have multiple primary servers needing this performance
boost usually find themselves looking at network-based storage appliances,
since these can be easily shared across all servers and applications.
Two alternatives, all-flash arrays and hybrid storage systems, are often... (more)
In our ever-changing technology environment, new opportunities are
everywhere. Predictive analytics, big data, mobile, cloud and
virtualization are but a few.
These new technologies improve business competitiveness, increase agility,
save money and more.
But migrating from existing systems is risk prone. And efforts often drag on.
Is There a Way to Migrate Successfully, Faster and with Less Risk?
Most people think of data virtualization as a high-productivity, low-cost way
to integrate data for business intelligence and analytics. However, it is
also very effective at legacy migration.
The key to data virtualization's migration success is how it decouples source
and consuming applications. This allows legacy migration to be done in a
phased approach within the constraints of existing infrastructure. Said
another way, data virtualization inserts flexible middl... (more)
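To illustrate the decoupling the author describes, here is a minimal Python sketch (hypothetical names, not any particular data virtualization product) of a virtual view that lets the backend be repointed mid-migration without touching consuming code:

```python
# Consumers query a virtual view; the view resolves to whichever backend
# currently owns the data, so sources can be migrated one at a time in a
# phased approach without changing the consuming applications.

class VirtualView:
    def __init__(self, backend):
        self._backend = backend           # legacy system at first

    def customers(self):
        return self._backend.customers()  # consumers only see this API

    def repoint(self, backend):
        self._backend = backend           # phased cut-over, per source

class LegacyDB:
    def customers(self):
        return [{"id": 1, "name": "Acme", "src": "legacy"}]

class NewPlatform:
    def customers(self):
        return [{"id": 1, "name": "Acme", "src": "new"}]

view = VirtualView(LegacyDB())
print(view.customers())      # consuming app reads through the view
view.repoint(NewPlatform())  # migrate the source behind the scenes
print(view.customers())      # same consumer code, new backend
```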
Improving performance and scale is - or should be - a primary motivator for
those deploying VDI, as well as for those trying to manage the continued
growth of BYOD.
Forrester's David Johnson notes a convergence of trends that, on the surface,
appear only tangentially related but, as he points out, are likely
interrelated and are driving the desire and need for VDI solutions:
Survey question: What is your level of interest in being allowed to bring
your own PC as your work PC of any type, desktop or laptop? (Employees who
answered they would be willing to pay for some or all of the cost to get the
device of their choice):
Source: Forrsights Workforce Employee Survey, Q4 2011 and Q4 2012.
Sample size = 3284 (2012)
As Mr. Johnson goes on to note, changes in the market drivers for VDI are
likely to increase demand for and interest in VDI in the near future. And the
most often searched ... (more)
Let's start at... the beginning. What is in-memory computing? Kirill
Sheynkman from RTP Ventures gave the following crisp definition, which I like:
"In-Memory Computing is based on a memory-first principle utilizing
high-performance, integrated, distributed main memory systems to compute and
transact on large-scale data sets in real-time - orders of magnitude faster
than traditional disk-based systems."
The most important part of this definition is "memory-first principle". Let
me explain.
Memory-first principle (or architecture) refers to a fundamental set of
algorithmic optimizations one can take advantage of when data is stored
mainly in Random Access Memory (RAM) vs. in block-level devices like HDD or SSD.
RAM has dramatically different characteristics than block-level devices
including disks, SSDs or Flash-on-PCI-E arrays... (more)
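As a rough way to feel that difference, here is a small, illustrative Python benchmark using only the standard library. Absolute timings will vary by machine, and the OS page cache absorbs much of the disk cost, but the per-read seek/syscall overhead of the block-device path still shows:

```python
import os, random, tempfile, time

N, RECORD = 1_000_000, 8
in_memory = bytearray(os.urandom(N * RECORD))      # data resident in RAM

path = os.path.join(tempfile.gettempdir(), "records.bin")
with open(path, "wb") as f:
    f.write(in_memory)                             # same data on disk

offsets = [random.randrange(N) * RECORD for _ in range(100_000)]

t0 = time.perf_counter()
for off in offsets:
    _ = in_memory[off:off + RECORD]                # RAM: cheap random access
t1 = time.perf_counter()

with open(path, "rb") as f:
    for off in offsets:
        f.seek(off)                                # disk: seek + read per record
        _ = f.read(RECORD)
t2 = time.perf_counter()

print(f"RAM: {t1 - t0:.3f}s   file: {t2 - t1:.3f}s")
os.remove(path)
```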
***This is a LIVING document*** I will be updating this article from time to
time as things like Release Updates, Hotfixes, Service Packs and other
updates come into being. Please check back often to get the latest
information. I will keep the revision list updated at the bottom so you know
what changes have been made. This is also *NOT* a complete Private Cloud
solution. There are many, many pieces to creating and managing a private
cloud, including things like rapid deployment via templates, elasticity and
scalability, high availability and redundancy, virtual machine mobility,
automation, service management, usage-based chargeback and more. I recommend
you check out http://www.microsoft.com/privatecloud to see the full story.
This document will guide you through the process of setting up the bare
minimum components to demo a Private Cloud environment using curre... (more)
This 3-part article details the 12 routines that I consider a Windows Server
2008 user ought to know first to accelerate the learning and adoption of
Windows Server 2012 without the need for a touch device. For those IT
professionals who are working toward becoming private cloud experts, it is
imperative to master Windows Server 2012, which is an essential component in
establishing a private cloud. And the earlier they master the Windows Server
2012 platform, the sooner they will become leaders in the IT
transformation to private cloud computing. There is everything to gain by
starting to learn Windows Server 2012 now as opposed to later.
The content of this series is based on Windows Server 2012 Beta as of May
2012. It is intended for those who are familiar with the administration of
Windows Server 2008 (or later) to become comfortable and productive with
Windows ... (more)