Cloud Computing and the E-commerce Industry

The unprecedented growth of the Internet, and the phenomenal commercial opportunity it has unveiled, have propelled e-commerce to growth rates never witnessed before. The web hosting features and resources required for e-commerce functionality call for uninterrupted performance, reliability, and data security, among other factors. This is perhaps why the e-commerce industry has adopted cloud computing at such an accelerated pace. However, not all cloud hosting service providers deliver services at the same level, so it is a good idea to conduct a quick assessment of your needs as an e-commerce enterprise when benchmarking various cloud delivery services and platforms against them. We present three important considerations.

Choosing a Cloud Hosting Company

There are several factors to consider when you feel it is time to take your e-commerce enterprise to the next milestone by transitioning to the cloud. Some of these considerations may not come as a surprise because of the frequent press they receive. Others may be interesting enough to pique your curiosity. They are all, however, quite significant and deserve equal attention as you plan the migration.

Access Speeds of Cloud Hosting Servers

Experts describe the speed of access to web pages as the single most popular reason why e-commerce companies pursue the cloud migration pathway. Amazon famously found that every 100 milliseconds of improvement in the speed with which its flagship website served up web pages translated into roughly a 1% gain in overall gross revenue. Inferior access speeds typically lead to traffic losses, which invariably translate into lost revenue. According to a study conducted by the Aberdeen Group, page load delays cost e-commerce enterprises up to $117 million in lost sales annually. Shopping cart abandonment has also been linked to poor server speeds. Owing to their distributed computing capability, near-instant scalability during peak times, and built-in redundancies that virtually guarantee zero downtime, cloud servers maintain consistent access speeds regardless of how much traffic they are experiencing at any given point in time. This is even more critical during the holiday shopping season in November and December each year, when e-commerce activity is at an all-time high. Many leading-edge cloud vendors currently use Tier 3+ data centers to deliver optimal performance while simultaneously supporting dedicated e-commerce applications such as shopping carts, inventory management software, CRM, live chat, and help desk solutions without any compromise to access speeds.

Security of Financial Data

Although traditional web hosting companies provide a level of data security that is industry compliant, cloud service providers have raised the bar through superior technology and self-governance. You will have no difficulty locating a cloud hosting company that is ISO 27001 compliant, and many cloud vendors also achieve SysTrust certification. For credit card and financial data processing in a fully secure and encrypted environment, the cloud hosting company you eventually select should be able to design PCI-DSS compliant solutions. It should use the highest level of SSL encryption available, which currently stands at 256-bit. Several other data security measures employed by most cloud hosting service providers, such as biometric screening of personnel, two-factor authentication, and intrusion detection systems (IDS), are now industry standards in the cloud.

The Trust Factor

Customer perception has a great deal to do with e-commerce success. If your enterprise can communicate to your customer community, through newsletters and email alerts, that they will virtually never experience downtime, will always reach your product pages at consistent speeds, and never have to worry about their privacy and credit card data security, all thanks to state-of-the-art cloud technologies, chances are high that customer attrition rates will seldom become a cause for concern. Share the credentials of the cloud hosting vendor you eventually select with your customers as yet another confidence- and trust-building measure, and educate them about the multiple layers of protection the cloud provides. You are sure to receive positive feedback from your customers sooner rather than later.

Concluding Thoughts

Managed cloud hosting services provided by qualified vendors allow e-commerce enterprises to focus on their core activities of selling products and services on the Internet, while technology-related issues and challenges are handled by the cloud vendor. Moreover, the pay-as-you-go model facilitates improved resource management and usually generates long-term savings. It is therefore no wonder that the e-commerce industry continues to experience unbridled growth worldwide. In a recent study published by comScore, Q1 2014 saw desktop e-commerce spending rise 12 percent year-over-year to $56.1 billion, marking the eighteenth consecutive quarter of positive year-over-year growth and the fourteenth consecutive quarter of double-digit growth. mCommerce spending on smartphones and tablets added $7.3 billion for the quarter, up 23 percent year-over-year, for a total digital commerce spend of $63.4 billion in the first quarter of 2014.

Has your e-commerce enterprise finally decided to connect with the cloud? What are some of the other factors you will take into consideration as you plan a cloud migration strategy? We would be very interested in hearing from you about your experience so far through your comments below.

Find out more about StratoGen

 

5 Real Reasons to Migrate to the Cloud in 2014

There are many ways to classify businesses, both large and small: by geographic footprint, location, age, industry, size, products, services, and so on. In 2014, however, another classification has gained substantial attention: businesses that have moved to the cloud and those that have not.

Before embarking on a discussion of the five real reasons why your organization should migrate to the cloud this year, consider the benefits below. Some are popular reasons you should have already come across; others are equally strong and perhaps more convincing. Use them as talking points during your next board or management meeting and earn the credit for triggering the migration process:

  • Scalability in an instant
  • Flexibility to switch vendors and platforms with minimal notice and without cost overruns
  • Resilience and an overwhelming degree of reliability
  • Resource conservation owing to a reduced in-house workforce requirement
  • Increased operational efficiency
  • Absence of wasted capacity, routine server maintenance, or daily backup issues
  • Data security and protection from denial-of-service attacks and spam
  • Statutory compliance when housing and processing medical or government data
  • Guaranteed uptime backed by an SLA
  • Access to the latest licensed software and infrastructure without having to pay for it outright
  • Overall cost savings

1. The importance of planning for the future

Planning today avoids over-deployment tomorrow, and once your organization is in the cloud you rarely have to forecast computing resources far in advance and risk over-deployment and additional costs. Cloud hosting lets you scale up or down according to demand and pay only for the resources you consume. For instance, if you operate an ecommerce company, you will definitely need to scale up for major shopping days during the holiday season, such as Black Friday, Cyber Monday, and the entire Christmas period. Regardless of whether you choose a SaaS or IaaS service provider, or a cloud computing company that combines both, you never have to over-budget your resources just to play it safe.

2. Improved cross-team collaboration

Because of the "anytime, anywhere" access the cloud provides, even members of small company teams, such as application developers and sales reps, can collaborate with one another from remote locations without compromising data security and privacy. Small and medium-sized businesses no longer have to invest in expensive VPNs or intranets and maintain them.

3. Flawless Disaster Recovery

It is common practice not to dwell too much on potential disasters, whether in our personal lives or in our professional endeavors. Nevertheless, disasters do strike, and often when we least expect them. When it comes to our data, our websites, and our operations, the key issue is not what caused the disaster and the resulting downtime but how soon our servers will be back up again. With cloud hosting, prolonged downtime is far less likely, because cloud service providers keep expert teams on watch 24x7, ready to begin recovery the moment a data disaster occurs. Disaster recovery as a service (DRaaS) is an integral component of the cloud hosting service model and will be factored into the SLA by your cloud service provider.

4. Focus on Core IT and Innovation

Now that your IT teams no longer have to address day-to-day server and maintenance issues, you can deploy them in new spheres that are directly related to your core business. Examples could include upgrading to a more robust and feature-rich email management system, help desk software, RFID inventory management, and new payroll and accounting processes.

5. Two-Tier Operation

If you migrate to the cloud, one recommendation your cloud computing company is likely to make is that you replicate your on-the-ground infrastructure in the cloud as a real-time backup, rather than moving to cloud resources exclusively. This is excellent advice, because in the unlikely event of your office premises being flooded, contaminated, or even destroyed, your entire workforce can function from home or a hotel suite without noticing the difference. If the disruption is caused by a natural catastrophe such as a hurricane, your clients will appreciate your ability to function with zero downtime, and your enterprise will undoubtedly gain an edge over the competition. This is a significant vote for the cloud that is often implied but rarely discussed at length.

In a Forrester Research study of 600 information technology respondents from twelve large enterprises that had migrated from on-premise desktop applications to the cloud, the enterprises realized a risk-adjusted ROI of 307%, with break-even attained in just seven months. The migration provided an average savings of approximately $21 per user per month.

While resource conservation does play a role in the decision-making process, there are other grounds on which to initiate a healthy internal debate. Has your organization decided to peek into the cloud yet? Have you found other reasons to migrate to the cloud that add more punch to the argument? We would love to hear from you through your comments.


Deploying Hadoop in the Virtualized Cloud

Apache Hadoop is a distributed file system for storing large amounts of data across multiple commodity servers. It is said to store both unstructured and structured data, which is true, but you can use Apache Pig and Apache Hive to lay a schema over this data to give it structure. That makes it something you can query. Otherwise it would not be of much use, yes?
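
As a quick illustration of that schema-on-read idea, here is a minimal sketch, in Java, of defining a Hive table over raw files already sitting in HDFS and then querying it with plain SQL through the HiveServer2 JDBC driver. The host, database, table name, and file layout are all hypothetical:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveSchemaExample {
        public static void main(String[] args) throws Exception {
            // HiveServer2 JDBC driver; host, port, and database are placeholders.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                "jdbc:hive2://hive.example.com:10000/default", "hadoop", "");
            Statement stmt = con.createStatement();

            // Schema-on-read: the table definition is just metadata laid over
            // tab-delimited files already in HDFS; no data is moved or copied.
            stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS web_logs "
                + "(ip STRING, ts STRING, url STRING, status INT) "
                + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t' "
                + "LOCATION '/data/web_logs'");

            // Plain SQL; Hive compiles this into batch jobs behind the scenes.
            ResultSet rs = stmt.executeQuery(
                "SELECT url, COUNT(*) FROM web_logs GROUP BY url");
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
            con.close();
        }
    }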

Hadoop data is stored in a Hadoop cluster. A Hadoop cluster is a single name node plus multiple data nodes, which together make up the Hadoop Distributed File System (HDFS). The name node keeps track of which data is located on which virtual machine. The data nodes are responsible for writing the files there. Data nodes also run the batch tasks that retrieve data from the cluster when the user executes a query.
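
To make that division of labor concrete, here is a minimal sketch using the standard HDFS Java API; the name node address and file path are placeholders. The client only ever asks the name node where blocks should live, while the file bytes themselves stream to the data nodes:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsWriteExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // The client addresses only the name node; placeholder host and port.
            conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

            FileSystem fs = FileSystem.get(conf);

            // The name node decides which data nodes receive each block;
            // the bytes stream directly to those data nodes.
            FSDataOutputStream out = fs.create(new Path("/data/orders/part-0001.txt"));
            out.writeBytes("order-42\t2014-06-01\t19.99\n");
            out.close();

            fs.close();
        }
    }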

Hadoop queries and gathers data using batch jobs: MapReduce, Pig, Hive, and other tools.  These Hadoop tasks run in parallel, which is what gives a distributed storage scheme its performance boost over one big server, like some kind of UNIX mainframe.

MapReduce jobs crawl across the Hadoop Distributed File System (HDFS), selecting records that match the query (the Map step) and aggregating them into a result set (the Reduce step).  Pig and Hive do the same thing: they are tools that let the developer express this MapReduce logic in a higher-level language, in Hive's case SQL, which practically every developer already knows.  To use this against unstructured data, the developer writes a schema that describes the different types of data in Hadoop (logs, database extracts, Excel files, and so on).  These schemas use regular expressions to split strings of text into their corresponding fields, which can then be queried using SQL.
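
Here is a hedged sketch of the Map side of such a job: a Hadoop mapper that uses a regular expression to split raw log lines into fields and emits a count per HTTP status code. The log format and field positions are assumptions; pairing it with a summing reducer (such as the stock IntSumReducer) completes the job:

    import java.io.IOException;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class StatusCodeMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {

        // Assumed log layout: "<ip> <timestamp> <url> <status>"
        private static final Pattern LOG_LINE =
            Pattern.compile("^(\\S+) (\\S+) (\\S+) (\\d{3})$");

        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            Matcher m = LOG_LINE.matcher(line.toString());
            if (m.matches()) {
                // Emit (status, 1); the Reduce phase sums the counts per status.
                context.write(new Text(m.group(4)), ONE);
            }
        }
    }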

Hadoop uses replication to provide fault tolerance.  But how does one use Hadoop in a virtualized cloud environment?  There, the vCD (VMware vCloud Director) user might not have access to the vSphere configuration that spells out which virtual machine is assigned to which SAN LUNs and which blade chassis slot.

Why is this an issue?  Hadoop by default makes 3 copies of each data block.  Hadoop is also rack-aware: its data dispersal algorithm copies these data blocks onto different storage media in a manner designed to provide data redundancy, and it takes into consideration which rack each physical server is located in, to provide additional data protection.
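
Both behaviors are driven by ordinary Hadoop configuration. A minimal sketch, assuming Hadoop 2.x property names and a placeholder topology script:

    import org.apache.hadoop.conf.Configuration;

    public class ReplicationConfigExample {
        public static void main(String[] args) {
            Configuration conf = new Configuration();

            // Three copies of every block is the Hadoop default.
            conf.setInt("dfs.replication", 3);

            // Rack awareness: Hadoop calls this (placeholder) script to map a
            // node's address to a rack name such as "/rack1", then spreads the
            // replicas so they never all share one rack.
            conf.set("net.topology.script.file.name",
                     "/etc/hadoop/rack-topology.sh");

            System.out.println("replication = " + conf.getInt("dfs.replication", 3));
        }
    }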

With vCD riding on top of vCenter, the customer does not have direct access to the vCenter details.  So, in the worst case, multiple virtual machines could all be on the same or nearly the same rack, with their data stored on the same LUN (a logical partition of one physical drive).  StratoGen knows about this and configures vCenter to provide the required redundancy.  But part of the responsibility for doing that falls on VMware, whose platform the StratoGen cloud uses.

VMware is aware of this issue and has been working since 2012 to address it and provide tooling for deploying Hadoop on VMware. First, it launched the open-source Serengeti project, a tool that makes deploying Hadoop clusters across multiple virtual machines easier. Second, VMware has dedicated programmers and architects to the Apache Hadoop community to contribute changes that "enhance the support for failure and locality topologies by making Hadoop virtualization-aware."

VMware summarizes what it is doing, and has done, with the Apache Hadoop project as follows (I have fixed their grammar mistakes; they are great engineers, but they need a copy editor):

The current Hadoop network topology (described in some previous issues, like HADOOP-692) works well in classic three-tier networks… However, it does not take into account other failure models or changes in the infrastructure that can affect network bandwidth efficiency, like virtualization.

A virtualized platform has the following characteristics that shouldn't be ignored by the Hadoop topology when scheduling tasks, placing replicas, doing balancing, or fetching blocks for reading:

1. VMs on the same physical host are affected by the same hardware failure. In order to match the reliability of a physical deployment, replication of data across two virtual machines on the same host should be avoided.

2. The network between VMs on the same physical host has higher throughput and lower latency and does not consume any physical switch bandwidth.

Thus, we propose to make Hadoop network topology extendable and introduce a new level in the hierarchical topology, a node group level, which maps well onto an infrastructure that is based on a virtualized environment.

As you can see, the goal is to make Hadoop virtualization-aware, boosting performance and data protection by adding a node group level to the topology.

VMware Hadoop Project Serengeti

Serengeti is a tool that lets Hadoop administrators deploy and set up a Hadoop cluster more easily than using the native Hadoop tools.  Some of what Serengeti does:

  • Tune Hadoop configuration
  • Define storage (i.e., local or shared)
  • Provide extensions to give Hive access to SQL databases
  • Enable VMware vMotion for moving cluster nodes between hosts
  • Provide additional control over HDFS clusters

VMware Hadoop Project Spring

Another VMware project is Spring.  Spring is an open-source umbrella of projects.  For example, the Spring Framework lets developers model relationships between Java classes using XML, so that objects can be instantiated from configuration files instead of being constructed explicitly in Java code. It also handles things like transactions.

The Spring Hadoop project lets programmers write Java code to run Hadoop tasks instead of using the Hadoop command line. It also extends the Spring Batch framework to manage the workflow of Hadoop batch jobs like MapReduce, Pig, and Hive.  Spring provides data access objects (think of JDBC or ODBC) for HBase data.  HBase is a way to turn Hadoop into something resembling a relational database by providing random read/write access to the data there. Remember that Hadoop is not one file, like a database, but a collection of files, each of which could be of a different type. So HBase is an abstraction layer over those files, much as Hadoop itself is.
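
To ground the HBase point, here is a minimal sketch of random read/write access using the 2014-era HBase Java client API (Spring's HBase support wraps this same client). The table, column family, and ZooKeeper quorum address are placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseRandomAccessExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            conf.set("hbase.zookeeper.quorum", "zk.example.com"); // placeholder

            HTable table = new HTable(conf, "orders"); // assumed table

            // Random write: a single row keyed by order id.
            Put put = new Put(Bytes.toBytes("order-42"));
            put.add(Bytes.toBytes("details"), Bytes.toBytes("total"),
                    Bytes.toBytes("19.99"));
            table.put(put);

            // Random read: fetch that one row back directly, no batch job needed.
            Result result = table.get(new Get(Bytes.toBytes("order-42")));
            String total = Bytes.toString(
                result.getValue(Bytes.toBytes("details"), Bytes.toBytes("total")));
            System.out.println("total = " + total);

            table.close();
        }
    }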

Find out more: http://www.stratogen.net/products/hadoop-hosting.html