EA Going the Zynga Way

EA and Zynga recently wrapped up the legal trouble between them, so it is ironic that EA is now taking cues from Zynga.

EA was mad that Zynga had copied The Sims for its Facebook game The Ville, but apparently it found something admirable, perhaps even respectable, about Zynga’s business practices, because now it is duplicating Zynga’s moves.

Instead of charging a flat fee, EA has decided to make some of its games freemium. The best example is Real Racing 3. Players can only do so much before their cars crap out on them and need regular maintenance and repairs, which players have to pay for. Of course, paying is only the first step: they also have to wait for the repairs, as if they were playing a Zynga game. Pay up and you can go about your way; if you don’t, I hope you have some other apps to kill time while you wait for your virtual car’s oil change. It’s a hallmark of social network gaming, a feature employed often in Zynga’s FarmVille and YoVille.

This strategy is in line with the view of EA’s Chief Financial Officer that microtransactions will be incorporated into all future games. Analysts believe that EA is going the Zynga way and that it is a wrong turn to take. Zynga rode the microtransaction and social highway, but players got wise and its shares plummeted. Let’s hope EA does not go the same way!

Source: “EA games taking cues from Zynga”, Technology Tell, 2013

Digital Media Predictions for 2013

Major predictions in the Digital Media space in 2013:

  • Companies will try to maximize their online advertising revenue
  • More newspaper chains shut down
  • Online video keeps eating into TV
  • “Branded content” will help create content that delights marketers and readers alike
  • Moving digital content across subscribers will remain a problem
  • More acceptance of HTML5-based “web apps”
  • Tough times ahead for the New York Times Co., with declining ad revenue
  • Ad spending moves from TV to online media

What’s your take on these predictions? Share your views in the comments below.

Source: “What we’ll see in 2013 in digital media”, GigaOm, 2012.


Open Data and the Move to a Smart Government Transformation

The U.S. CTO Todd Park and CIO Steven VanRoekel announced a new initiative within the government to open up data that was previously locked up in government documents and arcane backend systems. This will allow developers to create new applications and services based on that data and decrease inefficiency in government.

The digital roadmap is built around the following five ideas:

  • Open Data as the new default
  • Anywhere, anytime on any device
  • Everything should be an API
  • Make government data social
  • Change the meaning of social participation

The five major projects announced as part of the initiative include the launch of a portal called MyGov, the 20% Campaign, the introduction of the RFP-EZ program, the Blue Button app development program, and open data access for new industries, including energy, education, non-profits, and safety.
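To make the “everything should be an API” idea concrete, here is a minimal sketch of how a developer might pull newly opened data from the Data.gov catalog, which runs a CKAN-style API; the query parameters and response handling are illustrative assumptions rather than anything specified in the strategy.

```python
# Minimal sketch: querying an open-data catalog for datasets.
# Data.gov exposes a CKAN-style search API; the query below ("energy")
# and the fields printed are illustrative assumptions.
import json
import urllib.request

URL = "https://catalog.data.gov/api/3/action/package_search?q=energy&rows=5"

with urllib.request.urlopen(URL) as resp:
    payload = json.load(resp)

# CKAN wraps results as {"success": ..., "result": {"results": [...]}}
for dataset in payload["result"]["results"]:
    print(dataset["title"])
```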

Analysis –

The new digital strategy, released on 5/23/2012, will drive more efficient and coordinated delivery of federal services on mobile devices. In practice this means expanded public access to government data, from healthcare and education to energy and public safety, which the federal administration hopes will boost jobs along the way by encouraging innovation. There will also be projects aimed at making information readable in digital formats so it can be incorporated into other information systems and services.

The federal agencies’ updated plans also include identifying Big Data policies and programs that apply analytics to large collections of data to extract more value from them. The challenges of managing and analyzing data at this scale will push government organizations to explore investments in Big Data technologies. The real challenge, and the real opportunity, for government and IT service providers is to leverage Big Data technologies and approaches through appropriate plans for the transition from open government to smart government.

How Cloud Computing is Changing Disaster Recovery

This post is an excellent article by Mike Klein, President & COO of Online Tech, titled “Disaster Recovery in Cloud Computing”.

There are many benefits to cloud computing: cost-effective resource use, rapid provisioning, scalability, and elasticity. One of the most significant advantages of cloud computing is how it changes disaster recovery, making it more cost-effective and lowering the bar for enterprises to deploy comprehensive DR plans for their entire IT infrastructure. Cloud computing delivers faster recovery times and multi-site availability at a fraction of the cost of conventional disaster recovery.

What Changes in the Cloud?

Cloud computing, based on virtualization, takes a very different approach to disaster recovery. With virtualization, the entire server, including the operating system, applications, patches and data is encapsulated into a single software bundle or virtual server. This entire virtual server can be copied or backed up to an offsite data center and spun up on a virtual host in a matter of minutes.

Since the virtual server is hardware independent, the operating system, applications, patches and data can be safely and accurately transferred from one data center to a second data center without the burden of reloading each component of the server. This can dramatically reduce recovery times compared to conventional (non-virtualized) disaster recovery approaches where servers need to be loaded with the OS and application software and patched to the last configuration used in production before the data can be restored.
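As a rough sketch of what “copy the encapsulated server and spin it up offsite” can look like in practice, the snippet below uses the libvirt Python bindings. The host URIs and VM name are hypothetical, and it assumes the disk images have already been replicated to the DR site by the storage layer.

```python
# Hedged sketch: re-creating a virtual server at a DR site with libvirt.
# Host URIs and the VM name are hypothetical; disk images are assumed to
# be replicated to the DR site already (e.g., by SAN-level replication).
import libvirt

PROD_URI = "qemu+ssh://prod-host/system"  # production hypervisor (assumed)
DR_URI = "qemu+ssh://dr-host/system"      # DR-site hypervisor (assumed)
VM_NAME = "erp-app-01"                    # hypothetical virtual server

prod = libvirt.open(PROD_URI)
dr = libvirt.open(DR_URI)

# The whole server (OS, applications, patches, configuration) is captured
# in the domain definition; because it is hardware independent, the same
# definition can boot on the DR host without reloading each component.
xml = prod.lookupByName(VM_NAME).XMLDesc(0)

dom = dr.defineXML(xml)  # register the virtual server at the DR site
dom.create()             # spin it up in minutes instead of rebuilding

prod.close()
dr.close()
```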

The cloud shifts the disaster recovery tradeoff curve to the left, as shown below.  With cloud computing (as represented by the red arrow), disaster recovery becomes much more cost-effective with significantly faster recovery times.

[Figure: Cloud Computing and Disaster Recovery]

Given the cost-effectiveness of online backup between data centers, tape backup no longer makes sense in the cloud. The cost-effectiveness and recovery speed of online, offsite backup make it difficult to justify tape backup.

The cloud makes cold site disaster recovery antiquated. With cloud computing, warm site disaster recovery becomes a very cost-effective option where backups of critical servers can be spun up in minutes on a shared or private cloud host platform.

With SAN-to-SAN replication between sites, hot site DR with very short recovery times also becomes a much more attractive, cost-effective option. This is a capability that was rarely delivered with conventional DR systems due to the cost and testing challenges. One of the most exciting capabilities of disaster recovery in the cloud is the ability to deliver multi-site availability.  SAN replication not only provides rapid failover to the disaster recovery site, but also the capability to return to the production site when the DR test or disaster event is over.

One of the added benefits of disaster recovery with cloud computing is the ability to finely tune the cost and performance of the DR platform. Applications and servers deemed less critical in a disaster can be tuned down with fewer resources, while assuring that the most critical applications get the resources they need to keep the business running through the disaster.
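As one way to picture that fine tuning, the sketch below tiers a few hypothetical applications and computes the scaled-down resources each would receive at the DR site; all names, tiers, and ratios are assumptions for illustration.

```python
# Illustrative sketch: critical workloads keep full resources at the DR
# site while less critical ones are scaled down or parked. Application
# names, tiers, and scaling ratios are all hypothetical.
PRODUCTION_SPECS = {
    "order-processing": {"tier": "critical", "vcpus": 8, "ram_gb": 32},
    "reporting":        {"tier": "important", "vcpus": 4, "ram_gb": 16},
    "dev-sandbox":      {"tier": "deferrable", "vcpus": 4, "ram_gb": 8},
}

# Fraction of production resources each tier receives during a disaster.
DR_SCALE = {"critical": 1.0, "important": 0.5, "deferrable": 0.0}

def dr_allocation(specs):
    """Compute the scaled-down DR footprint for each application."""
    plan = {}
    for app, spec in specs.items():
        scale = DR_SCALE[spec["tier"]]
        plan[app] = {
            "vcpus": max(1, round(spec["vcpus"] * scale)) if scale else 0,
            "ram_gb": round(spec["ram_gb"] * scale),
        }
    return plan

print(dr_allocation(PRODUCTION_SPECS))
```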

Critical Path in Disaster Recovery – Networking

With the sea change in IT disaster recovery delivered by cloud computing, network replication becomes the critical path. With fast server recovery at an offsite data center, the critical path for a disaster recovery operation is replicating the production network at the DR site including IP address mapping, firewall rules & VLAN configuration.

Smart data center operators are providing full disaster recovery services that not only replicate the servers between data centers, but also replicate the entire network configuration in a way that recovers the network as quickly as the backed up cloud servers.
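To show what “replicating the entire network configuration” can mean when the network is captured as data, here is a hedged sketch: the addresses, VLAN IDs, and firewall rules are made up, and the replay step just prints what a provider’s automation would actually apply.

```python
# Sketch: the production network described as data so the DR site can be
# configured as quickly as the servers are recovered. All addresses,
# VLAN IDs, and rules below are hypothetical.
NETWORK_PLAN = {
    "ip_mapping": {          # production address -> DR-site address
        "10.1.0.10": "10.2.0.10",
        "10.1.0.20": "10.2.0.20",
    },
    "vlans": [
        {"id": 110, "name": "app-tier"},
        {"id": 120, "name": "db-tier"},
    ],
    "firewall_rules": [
        {"src": "app-tier", "dst": "db-tier", "port": 5432, "allow": True},
        {"src": "any", "dst": "app-tier", "port": 443, "allow": True},
    ],
}

def replay_at_dr_site(plan):
    """Walk the plan in order: VLANs, then firewall rules, then addresses.

    print() stands in for whatever automation the provider exposes."""
    for vlan in plan["vlans"]:
        print(f"create VLAN {vlan['id']} ({vlan['name']})")
    for rule in plan["firewall_rules"]:
        action = "allow" if rule["allow"] else "deny"
        print(f"{action} {rule['src']} -> {rule['dst']}:{rule['port']}")
    for prod_ip, dr_ip in plan["ip_mapping"].items():
        print(f"map {prod_ip} -> {dr_ip}")

replay_at_dr_site(NETWORK_PLAN)
```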

More on DR in the Cloud

I predict we’re going to hear much more about the changes in DR strategies with the cloud over the next year as more and more enterprises revisit their DR plan in light of the advantages of cloud hosting.

Source: http://resource.onlinetech.com/disaster-recovery-in-cloud-computing/

Cloud computing visionary John McCarthy dies at 84

Not many may know of John McCarthy, but he came close to proposing the cloud computing model long before it emerged. McCarthy has been credited with many innovations in technology, most notably as the inventor of the LISP language. He is also credited with foundational contributions to artificial intelligence, and even with coining the term.

McCarthy was the first person to propose the idea that computing could be delivered as a utility, though there has been much debate about the actual evolution of the idea, with Douglas Parkhill developing the theme in his 1966 book “The Challenge of the Computer Utility”.

With cloud computing now gaining momentum across organizations, the contributions John McCarthy made to the computing world are worthy of high praise and admiration.

Cloud Computing Standards – Evolving

Cloud computing is at a relatively early stage of development, and the lack of standardization is a major barrier to increased adoption. Some of the organizations that have undertaken the effort to standardize cloud computing are listed below:

1. The Green Grid – The Green Grid is a non-profit, open industry consortium of end-users, policy-makers, technology providers, facility architects, and utility companies collaborating to improve the resource efficiency of data centers and business computing ecosystems. Its aim is to create standards for more efficient use of resources.

2. Cloud Security Alliance – Is a not-for-profit organization with a mission to promote the use of best practices for providing security assurance within Cloud Computing, and to provide education on the uses of Cloud Computing to help secure all other forms of computing.

3. The IEEE (Institute of Electrical and Electronics Engineers) Standards Association – Is a leading consensus-building organization that nurtures, develops, and advances global technologies. This year, the organization launched an initiative to develop cloud computing standards.

4. Distributed Management Task Force – Is an industry group whose mission is to enable more effective management of millions of IT systems worldwide by bringing the IT industry together to collaborate on the development, validation, and promotion of systems management standards. It has created a Cloud Management Working Group to develop a set of standards to improve cloud management interoperability between service providers and their consumers and developers.

5. National Institute of Standards and Technology – Is a non-regulatory federal agency whose mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology. NIST has started a program to develop a set of cloud computing standards, with the first results already published – NIST Cloud Computing Program.


Data Center Power Usage Growth Slowing – 2010

A new study by Jonathan G. Koomey, Ph.D. found that data center power consumption increased by 36 percent from 2005 to 2010, a much smaller increase than the 100 percent gain projected in an influential study Koomey prepared in 2007.

The reasons cited for the moderating pace of data center energy use are:

  • Lower than Expected Adoption of Servers
  • Lower Infrastructure Spending due to Economic Slowdown
  • Increased Use of Virtualization

The full report can be accessed through this link: Koomey – Data Center Electricity Use 2005 to 2010

GoDaddy Enters Cloud Infrastructure-as-a-Service (IaaS)

Cloud Infrastructure-as-a-Service (IaaS) is the delivery of IT and computing infrastructure over the Internet. The service involves providing remote computing resources (operating systems, databases, middleware, and applications) and storage solutions to support cloud applications. End consumers access these resources over the Internet and use them to meet their organizations’ infrastructure requirements. The deployment and maintenance of the IT resources are handled entirely by the cloud IaaS vendor, while end users manage the resources through the vendor’s control panel, monitoring and scaling them as requirements arise. The increasing adoption of the hybrid cloud model by organizations is also accelerating the cloud IaaS market.
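To make the control panel and scaling point concrete, here is a minimal sketch of provisioning a server through a generic IaaS-style REST API; the endpoint, token, and payload fields are entirely hypothetical and do not describe GoDaddy’s actual interface.

```python
# Hedged sketch of IaaS-style provisioning over a REST API. The endpoint,
# token, and payload fields are hypothetical; no real provider's API is
# being described here.
import json
import urllib.request

API = "https://api.example-iaas.com/v1/servers"  # hypothetical endpoint
TOKEN = "YOUR_API_TOKEN"                         # placeholder credential

def provision_server(ram_gb, storage_gb):
    """Ask the (hypothetical) IaaS control plane for a new virtual server."""
    body = json.dumps({"ram_gb": ram_gb, "storage_gb": storage_gb}).encode()
    req = urllib.request.Request(
        API,
        data=body,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # e.g. {"id": "...", "state": "provisioning"}

# Matching the entry-level bundle described later in this post:
# print(provision_server(ram_gb=1, storage_gb=40))
```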

There are many vendors in the cloud IaaS market, the biggest player being Amazon. The market has recently seen the entry of a new player: GoDaddy.com, a company known for its domain registration, web hosting, and awesome Super Bowl ads. The company is expected to release its cloud computing capabilities to take Amazon and Rackspace head-on.

GoDaddy’s Cloud Service –

GoDaddy’s cloud IaaS offering is along the lines of an Infrastructure-as-a-Service product called Data Center-On-Demand. This is good news for end users looking to extend their data center capabilities. The offering will focus mainly on delivering large pools of IT infrastructure resources that organizations can use to build out their own IT infrastructure. The resources will be deployed and managed by GoDaddy, while end users have the liberty to consume the service on a pay-per-use model. The service portfolio will also include load balancing, multiple networks, and template creation for its cloud environments.

As of now, GoDaddy’s Data Center-On-Demand is limited to an invite-only release while the company builds out its cloud infrastructure. The official release of the service is expected toward the end of July 2011.

GoDaddy Cloud IaaS Service Offerings –

1. Economy Package –

Price – starts at $49.99 per month

Offering – one server with 1 GB of RAM and 40 GB of storage, unlimited inbound bandwidth, 100 GB of outbound bandwidth per month, and additional resources a la carte

2. Ultimate Package –

Price – $279.99 per month

Offering – six servers: three with 2 GB of RAM and 40 GB of storage, and three with 1 GB of RAM and 40 GB of storage; unlimited inbound bandwidth; 100 GB of outbound bandwidth per month; and additional resources a la carte
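A quick back-of-the-envelope comparison of the two packages, using only the numbers above; the per-GB-of-RAM framing is my own rather than GoDaddy’s.

```python
# Comparing the two published packages on a per-GB-of-RAM basis.
# Prices and specs come from the post above; the metric is simply one
# convenient way to compare the bundles.
economy = {"price": 49.99, "ram_gb": 1}                # one 1 GB server
ultimate = {"price": 279.99, "ram_gb": 3 * 2 + 3 * 1}  # six servers, 9 GB

for name, pkg in [("Economy", economy), ("Ultimate", ultimate)]:
    print(f"{name}: ${pkg['price'] / pkg['ram_gb']:.2f} per GB of RAM per month")

# Economy: $49.99 per GB of RAM per month
# Ultimate: $31.11 per GB of RAM per month
```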

GoDaddy’s Cloud IaaS Market Entry Analysis –

GoDaddy’s entry into the cloud infrastructure space comes at a time when Amazon Web Services and Rackspace dominate the market, and when Cloud IaaS and Cloud PaaS services are converging into integrated service portfolios. How the future unfolds for GoDaddy in the coming months will be very interesting to watch, for a couple of reasons. If the company succeeds in winning cloud IaaS market share, much can be expected of both the cloud computing and data center markets. First and foremost, many new vendors can be expected to enter the cloud space, particularly other web hosting companies, much the way many US telecom companies are entering the cloud space. Secondly, GoDaddy’s pricing is very competitive compared with leading vendors like Amazon and Rackspace, so the market can expect price reductions in the coming months, which is good news for a cloud IaaS market currently dominated by only two or three players.

Containerized & Modular Data Center Solutions – The Way Forward

The overall data center market is witnessing unprecedented growth, and containerized and modular data centers are attracting increased interest across the globe. The market looks very dynamic after the recession, driven by the increasing need for organizations to establish more data centers. Another positive sign is the explosion of the cloud computing market: more and more vendors, both established and new, are entering the cloud arena, all trying for a piece of a cloud market projected at $121.1 billion by FY2015. Together these trends are pushing the data center market further. With additional requirements for data center capabilities comes the need for increased scalability of data center resources, and this is where containerization and modularity score in the overall data center market. The question that arises is which direction the market is moving: will there be more adoption of containerization, or of modularity?

Many data center vendors already provide containerized data center solutions, with vendors like Cisco re-entering the market with advanced offerings. The containerized data center evolved from the need to deploy data center capabilities within a very limited time period: a containerized solution can establish data center infrastructure within a span of 80 to 120 days. With the growing need for additional capacity driven by the explosion in network traffic, many organizations see containerized data centers as an ideal way to scale their existing infrastructure. There are also areas where containerized data centers are the best option for establishing infrastructure outright, such as space research, the military, and Formula 1 racing, where mobility of the data center is a vital aspect of adoption. Even though containerized data centers are designed for extreme weather environments, most use cases deploy the solution within existing organizational infrastructure. Containerized data centers are also designed to be TIA-942 Tier 3 capable, with particular focus on achieving a PUE of 1.3 or lower.
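For reference, PUE (power usage effectiveness, the efficiency metric championed by The Green Grid mentioned earlier) is total facility power divided by IT equipment power; the quick check below uses made-up load figures.

```python
# PUE = total facility power / IT equipment power. A PUE of 1.3 means
# roughly 30% overhead for cooling, power distribution, and so on.
# The wattage figures below are made-up illustrative numbers.
def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

print(pue(total_facility_kw=1300.0, it_equipment_kw=1000.0))  # -> 1.3
```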

Modular data centers are designed to achieve extreme levels of performance, and the overall design technique forms a vital part of the whole solution. The other aspect of the modular data center is the ability to reduce capital cost through standardization of data center components. An interesting feature of the modular market is the two different approaches vendors are adopting: large data center solution providers focus on delivering a complete data center solution, while others have truly embraced the modular concept by focusing on specific data center modules, even down to the rack level. For leading data center vendors, providing a comprehensive data center solution via the modular model is the best strategy to adopt, since modular data centers are designed with the expectation that additional resources will be required over time. Modular data centers designed for easy scalability provide an ideal platform in this regard, apart from the ability to design for better energy efficiency.

To an extent it is fair to say that the modular data center evolved from the containerized data center design concept. The data center market will see many vendors providing both containerized and modular data center solutions. For data center vendors, the main challenge is to develop capabilities in containerized solutions, given the limitations of space and the extent of the product portfolio. For major vendors, the ability to provide containerized data center solutions will offer an easier path to achieving modular data center capabilities. In short, containerization of data centers will be a capability, while modularity will be the ultimate solution that transforms the data center landscape in the coming years.