Avoiding vendor lock-in in the public cloud

A little while back, I had a fairly frank discussion with a customer about vendor lock-in in the public cloud, and he left me under no illusions that he saw cloud as more of a threat than an opportunity. I did wonder if some incident in the past had left him feeling this way, but didn’t feel it appropriate to probe much further.

Instead of dwelling on the negatives, we decided to accentuate the positives and formulate some advice on how best to mitigate this risk. This was especially important as the business had already made a significant investment in public cloud deployments. It is an important issue though – it’s easy enough to get in, but how do you get out? There are several strategies you could use; I’m just going to call out a couple of them as examples.

To start with, back in the days of all-on-premises deployments, you would generally go for a “best of breed” approach. You have a business problem that needs a technical solution, so you look at all the potential solutions and choose the best fit based on a number of requirements. Typically these include cost, scalability, support, existing skill sets and the strength of the vendor in the market (Gartner Magic Quadrant, etc.). This applies equally in the public cloud – it’s still a product set within a technical solution, so the perspective needn’t change all that much.

One potential strategy is to use the best of breed approach to look at all public cloud vendors (for the purpose of this article, I really just mean the “big three” of AWS, Azure and Google Cloud Platform). As you might expect, the best cost, support and deployment options for, say, SQL Server on Windows would probably be from Microsoft. In that case, you deploy that part of the solution in Azure.

Conversely, you may have a need for a CDN solution and decide that Amazon CloudFront represents the best fit, so you build that part of your solution around that product. This way, you mitigate risk by spreading services across two vendors while still retaining the best of breed approach.

However, “doing the splits” is not always preferable. It’s two sets of skills, two lots of billing to deal with and two vendors to punch if anything goes badly wrong.

Another, more pragmatic, approach is to make open source technologies a key plank of your strategy. Products such as MySQL, Postgres, Linux, Docker, Java, .NET, Chef and Puppet are widely available on public cloud platforms, meaning that any effort put into these technologies can be moved elsewhere if need be (even back on premises). Not only that, but skills in the marketplace are pretty commoditised now, so bringing in new staff to help with deployments (or even using outside parties) is easier and more cost effective.

You could go down the road of deploying a typical web application on AWS using Postgres, Linux, Chef, Docker and Java. If that approach later becomes too expensive, or other issues occur, it’s far easier to pick up the data you’ve generated in these environments, walk over to a competitor, drop it down and carry on.
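As a concrete sketch of that portability, the same Postgres dump and restore commands work regardless of which provider sits at either end. The hostnames and database names below are purely hypothetical, and the script just prints the commands dry-run style rather than executing them against a live database:

```shell
#!/bin/sh
# Sketch only: moving a Postgres database between providers.
# All hostnames and names below are hypothetical placeholders.
SRC_HOST="olddb.example.com"   # current provider's endpoint
DST_HOST="newdb.example.net"   # new provider's endpoint
DB="appdb"
DUMP="appdb.dump"

# pg_dump's custom format (-Fc) is identical whether the source is a
# managed cloud service or an on-premises box - which is the point:
# nothing in this data path is specific to any one vendor.
echo "pg_dump -Fc -h $SRC_HOST -d $DB -f $DUMP"
echo "pg_restore -h $DST_HOST -d $DB --no-owner $DUMP"
```

In practice you’d also need to plan for downtime, DNS cut-over and any managed-service extensions in use, but the core data path stays provider neutral.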

Obviously this masks some of the complexities of how that move would actually take place, such as timelines, cost and skills required, but it demonstrates to stakeholders that provider migration has been considered and accounted for in the technical solution.

The stark reality is that whatever you are doing with technology, there will always be an element of vendor lock-in. There is obviously a financial motive for vendors to encourage it, but lock-in is also a by-product of innovation: new technologies bring new formats and data stores to the landscape. The key to addressing this is taking a balanced view and being able to tell project stakeholders that you’re taking a best of breed approach based on requirements, with safeguards built in should future issues prompt a re-evaluation of the underlying provider.




Linux Foundation Certified System Administrator – Exam Experience & Tips


I’ve recently gone through the process of sitting the LFCS exam, and one thing I noticed while studying was the almost total lack of blogs and articles about this certification and the exam. There are some good courses from Pluralsight and Linux Academy, but not a great deal else. As such, I thought I would drop a few thoughts down in case it helps someone else.

Firstly, what is the LFCS and why should I sit it? Well, it’s vendor agnostic (sort of) – you get to choose which distro you’d like to certify on: SUSE, CentOS or Ubuntu. I chose CentOS as, in my experience, it’s the most common distro in the enterprise right now, if you leave out the paid Red Hat equivalent. If you didn’t know, CentOS is in essence the “free” version of Red Hat Enterprise Linux, so if you know one, you know the other, so to speak. This puts you in a really good place skills wise.

Secondly, now that Microsoft have hugged the penguin (and no, that’s not a euphemism!), if you pass both the 70-533 (Implementing Microsoft Azure Infrastructure Solutions) and LFCS exams, you qualify for the MCSA: Linux on Azure certification. As far as I can tell, there don’t seem to be a whole lot of people around who have that at the moment.

On to the exam itself. It’s currently priced at $300 and you get a resit included if you fail the first time. I thought initially the exam cost was quite high for an entry level exam, but when you factor in the resit, it’s actually not bad value for money. Also, until the 22nd December, you can save 50% on exam vouchers by using the code HOLIDAY50. Vouchers last for a year, so well worth buying now, even if you don’t plan to sit the exam until next year sometime.

The exam is online proctored, so you don’t need to find a test centre in the back of beyond that looks like an office block in Pripyat.

A typical exam centre waiting room. In Pripyat. Possibly.

You perform all the registration via the Linux Foundation website and then click through to the exam delivery company (much the same as you do from Microsoft to Pearson). There is a short wait while you are checked in, and much like the Microsoft process, you are asked to do a 360 degree sweep of the room and desk area with your webcam. All the testing requirements are the same as for Microsoft online proctored exams, so nothing new here. In fact, if anything, it’s less stringent. No turning out your pockets, rolling up your sleeves or reciting the Catalina Magdalena Lupensteiner Wallabeiner song.

From here, the exam kicks off and you’re monitored via webcam, as is typical for these things. The exam itself comprises 25 questions in 2 hours, so it’s tight for time even if you know your stuff. If you’ve sat VCAP/VCIX exams, you should know by now how to manage your time. You can go backwards and forwards between the questions, and I can’t recall there being any dependencies between the answers – not answering question 1 doesn’t prevent you answering question 2, for example.

My tip? Scroll through the questions and answer the “easy” ones first. Some questions have a single objective, some have three or four sub-objectives. If you’re not massively confident, go for the low hanging fruit first. Remember this is a practical exam, so even if you only part answer a question, you will still get credit for it.

The exam screen is in two halves, the left pane has the question panel and the right pane has a terminal session. And no, you don’t have GUI access, so get that command line stuff learned!

In terms of content, I went through the Linux Academy material and found it matched the exam blueprint pretty well. I also dipped in and out of the Pluralsight videos to plug the gaps in my knowledge. There’s also no substitute for labbing stuff and trying it out (or as I have expressed it in the past, “labbing the shit out of it”). Either use your home lab or burn some AWS Free Tier or Azure MSDN credit. CentOS boxes are extremely cheap to run as there’s no licence cost – you’re just paying compute and storage fees.

So what should you know? Well of course I’m restricted by NDA, but as I said, look at the exam blueprint and how the domains are weighted. There is a high value placed on “Essential Commands” and “Operation of Running Systems”, so take it from the blueprint to mean :-

  • Redirection of files
  • Core commands such as ls, echo, cp, rm, find, sed, sudo, etc.
  • Know how to use vim!
  • Creating, extracting and types of archives
  • Install and remove software
  • Write a basic shell script
  • Manipulation of users
  • Storage commands including LVM and mount
  • Creation, deletion and maintenance of user and group accounts
  • Firewalls, services and startup commands
  • KVM virtualisation commands (virsh, etc.)
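A few of those areas can be rehearsed with nothing more than a scratch directory. The snippet below is my own practice scribble (not exam material), exercising redirection, core commands and archives in one go:

```shell
#!/bin/sh
# Practice snippet: redirection, find, and tar in a throwaway directory.
set -e
dir=$(mktemp -d)
cd "$dir"

# Redirection: send stdout into a file
echo "hello lfcs" > hello.txt

# Core commands: find matching files and count them
count=$(find . -name "*.txt" | wc -l)
echo "txt files found: $count"

# Archives: create a gzipped tarball, then list its contents
tar -czf files.tgz hello.txt
listing=$(tar -tzf files.tgz)
echo "archive contains: $listing"

# Tidy up after ourselves
cd / && rm -rf "$dir"
```

The user, LVM and firewall domains need root and real block devices, so save those for a proper lab VM rather than a scratch script.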

Years ago, I sat and passed the old SUSE CLP entry level exam and I have to say, I thought the LFCS exam was pitched at a lower level than that. It’s not massively taxing, but that being said, I’m still awaiting my score report, so I could have failed it! The SLA for results is 72-75 hours from the end of your exam. The first time around, I broke the lab VM and had to quit, thereby failing the exam! Oddly, the questions in the resit were more or less the same as the first time around, which makes me think the question pool is not all that large.

Also, don’t think you can wing it with the man pages. You simply don’t have time to wade through them if you don’t know the fundamentals. One useful tip is tab completion. Not only does this save small amounts of time, but if you’re not quite sure what a command is called and you know the first couple of letters, tab completion will give you a list of matching commands. I found that quite useful a couple of times.

Hopefully this has proved useful – as I said previously, there’s not a lot of content out there about this certification or exam apart from the aforementioned training courses.

Good luck if you’re sitting this any time soon, fingers crossed I will get a positive result!

Update 18/12/16 – I just found out I passed with 85%. Well happy with that!