It seems obvious that better-trained employees have a positive impact on any business. Yet a recent survey by Accenture found that 35% of executives say they have not invested enough in training to develop the skills they need, and 64% anticipate a loss of revenue due to this skill gap.
Archive for the 'PCI Compliance' Category
As Director of Compliance and Security Services at Layered Tech since 2008, I have seen our Compliant Services business grow significantly during that time. With that growth, I have noticed a recurring pattern among our startup clients: many mature to the point of becoming attractive acquisition targets.
We are in a unique position to see this happen from start to finish. It is a behind-the-scenes supporting role where our economy of scale and simplified audit-service goals lend upward momentum. I have seen this happen several times, including with Layered Tech itself. It is a topic that deserves some background, so let me lay out an example of what I mean.
The PCI DSS (Payment Card Industry Data Security Standard) is in a release cycle this year, meaning version 3.0 will be released shortly. At this year’s recent Community Meeting of the PCI Security Standards Council, much discussion centered on the new version of the standard, which is why our Chief Risk Officer, Jeff Reich, and I both attended.
I have seen a shift in responsibility for overseeing and managing applications. Application monitoring and management is increasingly moving from application architects and developers into IT operations. Our clients’ IT managers are expected to be responsible for ensuring application health and performance, and therefore are increasingly relying upon Layered Tech to provide management information and dashboards.
Christian Szell: Is it safe? Is it safe?
Babe: You’re talking to me?
Every customer running revenue-critical business applications should consider adding application performance monitoring and management. For this reason, we have taken our experience deploying application performance monitoring tools for our customers and released a standard managed solution. Our new Application Performance Management (APM) service, powered by AppDynamics, offers a “managed with” model in which we integrate APM with our customers’ managed hosting and cloud service. We handle the deployment, configuration, and monitoring, and we assist clients in using the APM solution.
Here are 5 important reasons to monitor your revenue-critical business applications.
1. Minimize the business impact of application issues
The more complex your application is, and the higher its rate of change, the more likely it is to perform poorly or fail.
An effective troubleshooting process today prioritizes uptime over seeking answers, and uses the intelligence and alerting capabilities of a monitoring tool to help you mitigate business impact. When an outage occurs, you can use your application monitoring to quickly identify the problem and come up with a workaround that will get your app back up and running; later, when production is stable, you can consult with your developers and architects about how to improve the application and prevent problems going forward.
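The identify-then-work-around loop above can be sketched in a few lines of Python. Everything here is illustrative: `fetch` and `notify` are hypothetical hooks standing in for a real probe and a real alerting channel, and a production APM tool such as AppDynamics does far richer instrumentation than this.

```python
# Hedged sketch of an availability/latency probe with alerting.
# `fetch` and `notify` are hypothetical stand-ins, not real APM APIs.
import time

def check_health(fetch, threshold_ms=500):
    """Run one probe; return (ok, latency_ms) for a revenue-critical app."""
    start = time.monotonic()
    try:
        status = fetch()              # e.g. one HTTP GET returning a status code
    except Exception:
        return (False, None)         # hard failure: the app is down
    latency_ms = (time.monotonic() - start) * 1000
    ok = status == 200 and latency_ms < threshold_ms
    return (ok, latency_ms)

def alert_if_unhealthy(fetch, notify):
    """Alert operations so they can apply a workaround first, fix later."""
    ok, latency = check_health(fetch)
    if not ok:
        notify(f"app unhealthy (latency={latency})")
    return ok
```

The design choice mirrors the text: the probe’s job is only to surface the problem quickly so uptime can be restored; root-cause analysis with developers comes afterward.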
2. Find Ticking Time Bombs
Dealing with critical application issues quickly and effectively is key to minimizing the business impact of application failure. However, simply responding to outages is not enough. Smaller problems often go unresolved in applications because operations teams don’t have the time or resources to deal with them. Unfortunately, smaller issues can turn into major outages if they’re left unattended. So enabling monitoring and identifying issues early is key.
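As an illustration of catching a “ticking time bomb” before it detonates, here is a deliberately naive Python sketch that flags a steadily climbing metric, say memory use, before it exhausts its budget. The function name and one-window projection are assumptions for the example; real monitoring tools use far more robust trend analysis.

```python
# Hedged sketch: flag a slowly growing metric (e.g. a memory leak)
# before it becomes an outage. The projection logic is intentionally naive.
def leak_suspected(samples_mb, budget_mb):
    """True if the metric climbs steadily and is on course to exceed budget."""
    rising = all(b >= a for a, b in zip(samples_mb, samples_mb[1:]))
    growth = samples_mb[-1] - samples_mb[0]
    projected = samples_mb[-1] + growth   # naive one-window-ahead projection
    return rising and growth > 0 and projected > budget_mb
```

For example, samples of 100, 120, 140, 160 MB against a 200 MB budget would trip the check, giving operations time to act before the limit is hit.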
3. Get to Know Your Application
When operations can’t troubleshoot a performance problem themselves, they often call in developers and architects to help. When you add up the man-hours spent fighting fires and the opportunity cost of not working on other projects, this is an expensive way to deal with problems.
To reduce the cost of troubleshooting, operations needs better visibility of their applications. The ops team needs to understand enough about the application that they won’t need to call in the experts to identify root cause. Ideally, ops can locate the root cause of problems quickly, and hand off to development with the right information and context for the fix.
4. Become More Agile
If your development team is agile, then you need to be agile, too. In most organizations, the more nimble the development team, the more frequent the incidents. If constant code changes cause your application to perform poorly in production, then your organization isn’t realizing the benefits of being agile. The solution isn’t to slow down your developers but rather to improve your own processes to better manage change. For some organizations, this means rethinking the way you identify issues.
5. Create Application Transparency for Everyone
Monitoring application performance in production can have effects that reach far beyond IT. The data that your monitoring tool collects represents how your users interact with your application. This data can help inform important decisions about performance-tuning projects, user-interface design, and product roadmap. With helpful, actionable data about your end users, you can change the course of your application and shape your organization’s culture.
Production doesn’t have to be a black box – nor should it be. Today’s applications are complex and dynamic, and the techniques we used to manage them in the past no longer work. This means you need the fastest and most effective way to troubleshoot and solve performance problems. The good news is, business application monitoring is now easier than ever. Check out our Layered Tech APM White Paper to find out how we can help you crack open the black box.
For more information about our Application Performance Management service, please contact us or visit www.layeredtech.com/apm.
About the Author: As Vice President of Product Management at Layered Tech, Kevin Van Mondfrans (@VANMONDFRANS | +Kevin Van Mondfrans) is responsible for driving the Layered Tech portfolio of infrastructure as a service (IaaS) and managed service offerings. With more than 20 years of experience in product development and marketing, Kevin has delivered innovative computing, storage, cloud, and service offerings with companies such as HP, Dell, and Savvis.
With the prevalence of cybercrime, governments have enacted laws; organizations have established standards; and companies such as Layered Tech have implemented considerable administrative, technical, and physical controls on information security to protect the confidentiality, integrity, and availability of data.
As a data security leader offering PCI DSS-, HIPAA-, and FISMA-compliant solutions, Layered Tech has participated in these Safe Harbor frameworks since November 2007 to protect client privacy as mandated by the Safe Harbor Privacy Principles, and to provide assurance that Layered Tech obtains, maintains, and processes personal data in a manner consistent with client expectations of privacy.
The U.S. Department of Commerce, in coordination with the European Commission and the Federal Data Protection and Information Commissioner of Switzerland, developed these frameworks to provide guidance for U.S. organizations on how to provide adequate protection for personal data from the EU and Switzerland, as required by the European Union’s Directive on Data Protection and the Swiss Federal Act on Data Protection, respectively. In effect, these frameworks bring the methods of privacy governance in the U.S. into accord with those of the E.U. and Switzerland.
For Layered Tech, protection of all data, personal or otherwise, is part and parcel of its brand reputation and continued success in the security and compliance sector of internet cloud and hosting solutions.
About the Author: Terry G. Raitt holds the CISSP certification and is the Policy Enforcement Manager in the Risk Management Group of Layered Tech. His 14 years of experience in the ISP and IT Services industry also includes roles as a network administrator, Linux/Unix system administrator, and technical support manager.
At the 2013 Electronic Transactions Association (ETA) Annual Meeting and Expo in New Orleans recently, I had the opportunity to give a presentation on Hacktivism titled Managing Risk for Online Threats and Hacktivism Actions. Attending these shows allows me to experience a little bit of local culture (the food and the venue were awesome), network with colleagues and learn about what is trending in the cloud space. I’m not surprised that the trending theme this year revolved around mobile technology.
We all know that, more and more, a summary of who we are and what we do is contained in the device in our hand. According to the Pew Research Center’s Internet & American Life Project, smartphone adoption jumped from 35 percent in 2011 to 56 percent in 2013. With no corresponding rise in desktop computer use or card-present transactions, this shift alone helps explain the growing demand for mobile payment technology.
Already, many people are using mobile payment technologies such as Dwolla, LevelUp, Starbucks’ mobile app, and EMS+, to name a few. This adoption should not suggest that all questions have been answered; mobile payment technology still has inherent risks. We have a long way to go, but the genie is out of the bottle, and as each generation steps up, we will need to offer ubiquitous, intuitive interfaces for work, play, social interaction, and commerce.
The challenge directly in front of us is dealing with the multiple mobile platforms and preferences of users, along with the at-times cavalier attitude some people take with their mobile devices. Do you know someone who has lost their phone recently? Given all the data people put on these devices, do you think they really controlled their data after the loss of the physical device?
Advancing technology will help us solve many of these issues but most importantly, we need to take control of our own data and our own digital personas in order to feel safe and secure.
Overall, the entire Meeting & Expo was a valuable experience. The Electronic Transactions Association is the must-join organization for anyone in the payments industry; the educational, networking, and selling opportunities are unmatched.
I will be presenting a session on Big Cloud Usage at the Big Cloud Event, hosted by Big Cloud Sales on June 18th, and Business-Critical Security and Compliance in the Cloud at the ACA 74th Annual Convention & Expo on July 15th in San Diego, CA. I hope to see you at one or both of these events.
I have spent the last 12 years working in sales in the hosting industry, and I must say that the last 4+ have been the most interesting with the emergence of cloud hosting in the market. I spend my days working with current and future cloud users, and from what I have gathered over the past few years, new cloud adopters usually have three major concerns when it comes to hosting their applications in the cloud:
- How will my application perform in the cloud?
- Can cloud really help me scale quicker and more efficiently?
- How can I save money by hosting in a cloud solution?
My main suggestion to help answer all three of the above questions is for businesses to work with a few different cloud vendors to set up an environment on which to test their applications. This will allow users to see how their applications perform in the cloud, and will demonstrate how quickly businesses can scale their applications. Of course, every application and every company’s needs are unique, but typically, cloud users are amazed at how much more efficient they can really become. Whether businesses select a fully isolated cloud solution or a shared-resource one, users quickly learn that cloud hosting helps them earn a maximum return on their computing investments. Simply said, the cloud offers a flexible, customizable platform that can respond quickly and efficiently to rapid changes in demand while ensuring the maximum uptime, speed, and security needed for the most demanding IT environments.
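To make the “test before you commit” idea concrete, here is a minimal, hypothetical Python sketch for timing the same operation against several trial environments. The environment names and `operation` callables are placeholders; a real evaluation would also measure scaling behavior, throughput under load, and cost.

```python
# Hedged sketch: compare how one operation performs across trial cloud
# environments by timing repeated runs. Names and callables are illustrative.
import statistics
import time

def benchmark(operation, runs=10):
    """Return the median wall-clock milliseconds over `runs` executions."""
    times = []
    for _ in range(runs):
        start = time.monotonic()
        operation()                  # e.g. one request to a trial deployment
        times.append((time.monotonic() - start) * 1000)
    return statistics.median(times)  # median resists one-off network spikes

def compare(environments, runs=10):
    """Map each named trial environment to its median latency in ms."""
    return {name: benchmark(op, runs) for name, op in environments.items()}
```

Running the same comparison before and after adding capacity in each trial environment gives a rough, like-for-like view of both performance and how quickly each vendor lets you scale.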
Dipping a toe in the proverbial water can be the biggest obstacle for companies that are considering hosting their applications in the cloud, because fear of unknown risk discourages change. But companies that have taken the plunge have found that most cloud services are designed to deliver high availability and performance for a broad range of complex eBusiness computing applications that, in the end, will save them time and money.
Come join me at the Big Cloud Event on June 18th in Minneapolis and I will be happy to discuss your company’s cloud hosting needs with you.
The Big Cloud Event will be held at the Minikahda Club in Minneapolis, MN, from 8 a.m. to 6 p.m. For more information about the 2013 Big Cloud Event, visit http://bigcloudevent.splashthat.com.
About the Author: Shane Reisner is Director of Strategic Accounts for Layered Tech and brings more than 12 years of experience in the cloud hosting and managed services industry.