Christian Szell: Is it safe? Is it safe?
Babe: You’re talking to me?
Insights on Enterprise Cloud, Hosting, Compliance & Security
You’ve made the decision to move to the cloud, but as with anything, not all products are created equal. And as with any complex decision, you need a roadmap.
But let’s start with something important – you need to start to Think Vertical. Many organizations split responsibility for compute, storage, data center and network across separate managers. When you only had a few servers in a closet and a local area network connecting your PCs, that might have been OK. But today it doesn’t make sense, because the optimal decisions are totally interconnected. Let’s say you acquire a new business in Japan. Should you get a high-speed network back to your servers in California? Should you buy a data center cloud service in Japan and put your own servers in there? Or should you connect to a compute & storage cloud service in Singapore?
The next time your IT staff comes to you with a server or storage purchase order and says, “And the price is $1 million,” put on your Jack Nicholson mask, do your best “A Few Good Men” impersonation, and growl, “Is that the truth? I don’t think so, because you can’t handle the truth.”
The truth is the cost of that hardware is not $1 million. Oh, sure, that’s the one-time purchase price, but just like application software, the purchase price is just the beginning of the cost.
Ten years ago, Nicholas Carr wrote a paper entitled “IT Doesn’t Matter,” published in the Harvard Business Review. He might not have realized the far-reaching effects, but in many IT shops, and with many senior executives, it signaled a shift from focusing on compute, storage, data centers and networks to applications. This also coincided with the rise of enterprise applications and, as a result, CIOs spend a lot of time discussing packaged applications, integration, and implementations, resulting in the treatment of the fundamental engine of their business as a commodity. But in most companies, packaged applications represent less than 20% of the overall footprint.
Recent changes to the HIPAA Rules through the HIPAA Omnibus Final Rule may affect the way healthcare professionals do business. The changes, which became effective March 26, 2013, now apply the Security Rule not only to covered entities but also to business associates of covered entities and subcontractors of business associates. This means that any organization involved with electronic protected health information (ePHI) must have and follow a well-written information security policy with established practices and guidelines that protect this ePHI from falling into the wrong hands. Failure to comply with the HIPAA Rules could result in fines up to $1.5 million for all violations of an identical provision in a calendar year.
The First Annual Big Cloud Event took place in Minneapolis, MN in June. Layered Tech was a sponsor of this event, and I delivered a presentation on Big Cloud Adoption. This event was billed as the first annual; the second event is already scheduled for March of 2014 in Las Vegas, NV. Although many cloud-related topics were discussed at the event, many discussions focused on cloud adoption, or the lack thereof, for Fortune 1000 companies.
The cloud has become ubiquitous for most of us. To begin with, if you use a smartphone for anything beyond simple dialing and answering calls, you are almost certainly using the cloud. Yet cloud adoption by large corporations can be hit or miss. The big question to be answered is “why?”
A couple of years ago, the simple answer to why organizations started using the cloud was cost. Now, more organizations are discovering that the cloud may cost a bit less, but more importantly, using the cloud effectively allows you to spend your computing dollars on resources more efficiently, when and where you need them.
Of course, one of the traditional reasons for not using the cloud has been the fear around security in the cloud. Without question, not every cloud solution is able to offer the security and controls that you need for sensitive or controlled data. That being said, there is no fundamental reason why security and control needs cannot be met in a cloud environment. In order to establish how much one should spend on security controls, your baseline will usually start with what is required for compliance with applicable laws, regulations, standards and policies.
Once your baseline has been determined, focus on the true value of what you have, and what you do. This is not as easy as it sounds for many organizations. The reason to do this is to ensure that you never pay more to control a loss than you would have lost if your resource(s) were compromised or completely lost. That sweet spot is your high-water mark for spending on security and controls. When this component of risk management is viewed as such, you can make better decisions about cloud computing and other areas of investments in your organization.
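The high-water-mark idea above can be put in numbers with the classic annualized loss expectancy (ALE) calculation from risk management. The figures below are purely hypothetical, a minimal sketch of the arithmetic rather than guidance for any specific environment:

```python
# Illustrative sketch with hypothetical figures: annualized loss expectancy
# (ALE) sets the high-water mark for yearly spending on security controls.

def annualized_loss_expectancy(asset_value, exposure_factor, annual_rate_of_occurrence):
    """ALE = single loss expectancy (SLE) x annual rate of occurrence (ARO)."""
    sle = asset_value * exposure_factor           # expected loss per incident
    return sle * annual_rate_of_occurrence        # expected loss per year

# Hypothetical example: a $500,000 data asset, 40% of its value exposed
# per incident, with one incident expected every four years (ARO = 0.25).
ale = annualized_loss_expectancy(500_000, 0.40, 0.25)
print(f"ALE: ${ale:,.0f}")  # prints "ALE: $50,000"
```

In this sketch, spending much more than $50,000 per year on controls for that one asset would cost more than the loss it prevents.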
I will be presenting a session on Business-Critical Security and Compliance in the Cloud at the ACA 74th Annual Convention & Expo on July 15th in San Diego, CA. I hope to see you there.
Every customer running revenue-critical business applications should consider adding application performance monitoring and management. For this reason, we have taken our experience deploying application performance monitoring tools for our customers and released a standard managed solution. Our new Application Performance Management (APM) service, powered by AppDynamics, offers a “managed with” model in which we integrate APM with our customers’ managed hosting and cloud service. We handle the deployment, configuration, monitoring and assist clients with utilizing the APM solution.
The more complex your application is, and the higher the rate of change, the more likely your applications will perform poorly or fail.
An effective troubleshooting process today prioritizes uptime over seeking answers, and uses the intelligence and alerting capabilities of a monitoring tool to help you mitigate business impact. When an outage occurs, you can use your application monitoring to quickly identify the problem and come up with a workaround that will get your app back up and running; later, when production is stable, you can consult with your developers and architects about how to improve the application and prevent problems going forward.
Dealing with critical application issues quickly and effectively is key to minimizing the business impact of application failure. However, simply responding to outages is not enough. Smaller problems often go unresolved in applications because operation teams don’t have the time or resources to deal with them. Unfortunately, smaller issues can turn into major outages if they’re left unattended. So, enabling monitoring and identifying issues early is key.
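The idea of catching small issues before they become outages can be sketched as a simple tiered check on response-time samples. The thresholds and function below are hypothetical, not part of any particular monitoring product; real APM tools apply far richer baselining, but the principle is the same:

```python
from statistics import mean

# Minimal sketch with hypothetical thresholds: flag small degradations
# early, before they grow into a major outage.
WARN_MS = 500    # average response time creeping up: a small issue worth fixing
CRIT_MS = 2000   # average response time at incident level

def check_response_times(samples_ms):
    """Return 'ok', 'warn', or 'critical' for a window of response-time samples."""
    avg = mean(samples_ms)
    if avg >= CRIT_MS:
        return "critical"
    if avg >= WARN_MS:
        return "warn"
    return "ok"

print(check_response_times([120, 180, 150]))   # prints "ok"
print(check_response_times([600, 550, 700]))   # prints "warn" - degraded, not yet an outage
```

A “warn” here is exactly the kind of smaller issue that, left unattended, turns into a “critical.”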
When operations can’t troubleshoot a performance problem themselves, they often call in developers and architects to help. When you add up the man-hours spent fighting fires and the opportunity cost of not working on other projects, this is an expensive way to deal with problems.
To reduce the cost of troubleshooting, operations needs better visibility of their applications. The ops team needs to understand enough about the application that they won’t need to call in the experts to identify root cause. Ideally, ops can locate the root cause of problems quickly, and hand off to development with the right information and context for the fix.
If your development team is agile, then you need to be agile, too. In most organizations, the more nimble the development team, the more frequent the incidents. If constant code changes cause your application to perform poorly in production, then your organization isn’t realizing the benefits of being agile. The solution isn’t to slow down your developers but rather to improve your own processes to better manage change. For some organizations, this means rethinking the way you identify issues.
Monitoring application performance in production can have effects that reach far beyond IT. The data that your monitoring tool collects represents how your users interact with your application. This data can help inform important decisions about performance-tuning projects, user-interface design, and product roadmap. With helpful, actionable data about your end users, you can change the course of your application and shape your organization’s culture.
Production doesn’t have to be a black box – nor should it be. Today’s applications are complex and dynamic, and the techniques we used to manage them in the past no longer work. This means you need the fastest and most effective way to troubleshoot and solve performance problems. The good news is, business application monitoring is now easier than ever. Check out our Layered Tech APM White Paper to find out how we can help you can crack open the black box.
For more information about our Application Performance Management service, please contact us or visit www.layeredtech.com/apm.
About the Author: As Vice President of Product Management at Layered Tech, Kevin Van Mondfrans (@VANMONDFRANS | +Kevin Van Mondfrans) is responsible for driving the Layered Tech portfolio of infrastructure as a service (IaaS) and managed service offerings. With more than 20 years of experience in product development and marketing, Kevin has delivered innovative computing, storage, cloud and service offerings at companies such as HP, Dell, and Savvis.
With the prevalence of cybercrime, governments have enacted laws; organizations have established standards; and companies such as Layered Tech have implemented considerable administrative, technical, and physical controls on information security to protect the confidentiality, integrity, and availability of data.
As a data security leader offering PCI DSS-, HIPAA- and FISMA-compliant solutions, Layered Tech has participated in these Safe Harbor frameworks since November 2007 to protect client privacy as mandated by the Safe Harbor Privacy Principles, and to provide assurance that Layered Tech obtains, maintains, and processes personal data in a manner consistent with client expectations of privacy.
The U.S. Department of Commerce, in coordination with the European Commission and the Federal Data Protection and Information Commissioner of Switzerland, developed these frameworks to provide guidance for U.S. organizations on how to provide adequate protection for personal data from the EU and Switzerland as required by the European Union’s Directive on Data Protection and the Swiss Federal Act on Data Protection respectively. In effect, these frameworks bring the methods of privacy governance in the U.S. into accord with that of the E.U. and Switzerland.
For Layered Tech, protection of all data, personal or otherwise, is part and parcel of its brand reputation and continued success in the security and compliance sector of internet cloud and hosting solutions.
About the Author: Terry G. Raitt holds the CISSP certification and is the Policy Enforcement Manager in the Risk Management Group of Layered Tech. His 14 years of experience in the ISP and IT Services industry also includes roles as a network administrator, Linux/Unix system administrator, and technical support manager.
At the 2013 Electronic Transactions Association (ETA) Annual Meeting and Expo in New Orleans recently, I had the opportunity to give a presentation on Hacktivism titled Managing Risk for Online Threats and Hacktivism Actions. Attending these shows allows me to experience a little bit of local culture (the food and the venue were awesome), network with colleagues and learn about what is trending in the cloud space. I’m not surprised that the trending theme this year revolved around mobile technology.
We all know that, more and more, a summary of who we are and what we do is contained in the device in our hand. According to the Pew Research Center’s Internet & American Life Project, smartphone adoption has jumped from 35 percent in 2011 to 56 percent in 2013. Since there is no corresponding rise in computer use or card-present environments, this driver alone should account for the growing demand for mobile payment technology.
Already, many people are using mobile payment technologies such as Dwolla, LevelUp, Starbucks’ mobile app and EMS+, just to name a few. These data should not suggest that all questions have been answered or that mobile payment technology is without inherent risks. We still have a long way to go, but the genie is out of the bottle, and as each generation steps up, we will need to offer ubiquitous and intuitive interfaces to use at work, at play, for social interactions and for commerce.
The challenge directly in front of us is dealing with the multiple mobile platforms and preferences of people, along with the at-times cavalier attitude that some people take toward their mobile device. Do you know someone who has lost their phone recently? Given all the data folks put on them, do you think they really retained control of their data after the loss of the physical device?
Advancing technology will help us solve many of these issues but most importantly, we need to take control of our own data and our own digital personas in order to feel safe and secure.
Overall, the entire Meeting & Expo was a valuable experience. The Electronic Transactions Association is the must-join organization for anyone in the payments industry; the educational, networking and selling opportunities are unmatched.
I will be presenting a session on Big Cloud Usage at the Big Cloud Event, hosted by Big Cloud Sales on June 18th, and Business-Critical Security and Compliance in the Cloud at the ACA 74th Annual Convention & Expo on July 15th in San Diego, CA. I hope to see you at one or both of these events.
I have spent the last 12 years working in sales in the hosting industry, and I must say that the last 4+ have been the most interesting with the emergence of cloud hosting in the market. I spend my days working with current and future cloud users, and from what I have gathered over the past few years, new cloud adopters usually have three major concerns when it comes to hosting their applications in the cloud:
My main suggestion to help answer all three of the above questions is for businesses to work with a few different cloud vendors to set up an environment on which to test their applications. This will allow users to see how their applications perform in the cloud, and will demonstrate how quickly businesses can scale their applications. Of course, every application and every company’s needs are unique, but typically, cloud users are amazed at how much more efficient they can really become. Whether businesses select a fully isolated cloud solution or a shared resource, users quickly learn that cloud hosting helps them earn a maximum return on their computing investments. Simply put, the cloud offers a flexible, customizable platform that can respond quickly and efficiently to rapid changes in demand, while at the same time ensuring the maximum up-time, speed and security needed for the most demanding IT environments.
Dipping a toe in the proverbial water can be the biggest obstacle for companies that are considering hosting their applications in the cloud, because fear of unknown risk discourages change. But companies that have taken the plunge have found that most cloud services are designed to deliver high availability and performance for a broad range of complex eBusiness computing applications that, in the end, will save them time and money.
Come join me at the Big Cloud Event on June 18th in Minneapolis and I will be happy to discuss your company’s cloud hosting needs with you.
The Big Cloud Event will be held at the Minikahda Club in Minneapolis, MN, from 8 a.m. to 6 p.m. For more information about the 2013 Big Cloud Event visit http://bigcloudevent.splashthat.com.
About the Author: Shane Reisner is Director of Strategic Accounts for Layered Tech and brings more than 12 years of experience in the cloud hosting and managed services industry.