Why MEC is the place to be for Exchange admins/consultants/enthusiasts!

In less than a month, the 2014 edition of the Microsoft Exchange Conference will kick off in Austin, Texas. For those who haven’t decided whether they’re going yet, here are some reasons why you should.

The Value of Conferences

Being someone who frequently attends conferences, I *think* I’m in a position to say that conferences provide great value. Typically, you get up to date with the latest (and greatest) technology in IT.

Often, the cost of attending a conference is estimated to be higher than that of a traditional 5-day course. However, I find this not to be true – at least not all the time. It is true that – depending on where you fly in from – travel and expenses might add to the cost. However, I think it is a good thing to be ‘away’ from your daily work environment. It leaves you less tempted to be preoccupied with work instead of soaking in the knowledge shared throughout the conference. The experience is quite different from a training course. Conferences might not provide the exact same information as a training, but you’ll definitely be able to learn more (different) things. Especially if your skills in a particular product are already well developed, conferences are the place to widen your knowledge.

On top of that, classroom trainings don’t offer the same networking opportunities. At MEC, for instance, there will be a bunch of Exchange MVPs and Masters you can talk to – all of them very knowledgeable, and I’m sure they won’t mind a good discussion on Exchange! This could be your opportunity to ask some really difficult questions or just hear their opinion on a specific issue. Sometimes the insight of a third person can make all the difference!

It is also the place where all the industry experts meet. As I mentioned earlier, there will be Masters and MVPs, but a lot of people from Microsoft’s Exchange Product Group will be there as well. Who better to ask your questions?

Great Content

Without any doubt, the Exchange Conference will be the place in 2014 to learn about what’s happening with Exchange. Service Pack 1 – or Cumulative Update 4, if you will – has just been released, and as you might’ve read, there are many new things to discover.

At the same time, it’s been almost a year and a half since Exchange 2013 was released, and quite a few sessions focus on deployment and migration. If you’re looking to migrate shortly, or if you’re a consultant migrating other companies, I’m sure you’ll get a lot of value from these sessions, as they will provide you with first-hand information. When MEC 2012 was held – shortly before the launch of Exchange 2013 – this wasn’t really possible, as there weren’t many deployments out there yet.

Sure, one might argue that the install base for Exchange 2013 is still low. However, if you look back, deployments of Exchange 2010 only really kicked off once it was past the SP1 era, and I expect nothing different to happen with Exchange 2013.

For reference, here’s a list of sessions I’m definitely looking forward to:

And of course the “Experts unplugged” sessions:

I realize that’s way too many sessions already and I will probably have to choose which ones I’ll be able to attend…
But the fact that I have so many on my list only proves there’s a huge amount of valuable information at MEC…

Great Speakers

I’ve had a look at who is speaking at MEC and can only conclude that there are a TON of great speakers, all of whom I’m sure will make it worth your while. While Microsoft speakers will most likely give you an overview of how things are supposed to work, many of the MVPs have sessions scheduled that might give you a slightly less biased view of things. The combination of both makes for a good mix to get you started on the new stuff and broaden your knowledge of what was already there.

Location

Austin, Texas. I haven’t been there myself, but based on what Exchange Master Andrew Higginbotham blogged a few days ago, it looks promising!

Microsoft has big shoes to fill. MEC 2012 was a huge success, and people are expecting the same – if not better – from MEC 2014. Additionally, for those who were lucky enough to attend the Lync Conference in Vegas earlier this month, that is quite something MEC has to compete with. Knowing the community and the people behind MEC, I’m pretty confident this edition will be EPIC.

See you there!

Michael


Publishing multiple services to the internet on a single IP address using a KEMP Load Balancer and content switching rules

A few days ago, someone suggested I write this article, as it seems many people are struggling with this ‘problem’. In fact, the solution I’m going to explain below is the answer to a problem typically found in “home labs”, where the internet connection doesn’t always have multiple IP addresses. That doesn’t mean it’s only valid for home use or testing scenarios, though. Given that IPv4 addresses are almost depleted, it’s a good thing not to waste these valuable resources if you don’t have to.

Basically, what I’m going to explain is how you can use a KEMP Load Master to publish multiple services/workloads to the internet using only a single (external) IP address. In the example below, I will be publishing Exchange, Office Web Apps and Lync onto the internet.

The following image depicts what the network in my example looks like, along with the different domain names and IP addresses that I’m using. Note that – although I perfectly could – I’m not connecting the Load Master directly to the internet. Instead, I mapped an external IP address on my router/firewall to the Load Master:

(Network diagram: the lab network with the domain names and IP addresses used in this example)

How it works

The principle behind all this is simple: whenever a request ‘hits’ the Load Master, it reads the host header that was used to connect and uses it to determine where to send the request. Given that most of the applications we are publishing use SSL, we have to decrypt the traffic at the Load Master, which means we will be configuring the Load Master at Layer 7. Because we need to decrypt traffic, there’s also a ‘problem’ we need to work around: the workloads we are publishing to the internet all use different host names, but because we use only a single Virtual Service, we can assign only a single certificate to it. Therefore, you have to make sure that the certificate you configure on the Load Master either includes all published host names as a Subject (Alternative) Name or is a wildcard certificate, which automatically covers all hosts for a given domain. The latter option is not valid if multiple different domain names are involved.
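
To make that certificate requirement concrete, here is a small, hedged PowerShell sketch. The certificate path and host names are placeholders from my lab (nothing the Load Master itself needs); it simply checks whether a certificate’s subject or Subject Alternative Names cover all the host names you intend to publish:

  # Example only: the path and host names below are placeholders, adjust them to your environment.
  $hostNames = 'outlook.exchangelab.be','meet.exchangelab.be','dialin.exchangelab.be','owa.exchangelab.be'
  $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2('C:\certs\wildcard.cer')

  # Read the Subject Alternative Name extension (OID 2.5.29.17), if present, as readable text.
  $sanExt  = $cert.Extensions | Where-Object { $_.Oid.Value -eq '2.5.29.17' }
  $sanText = if ($sanExt) { $sanExt.Format($true) } else { '' }

  foreach ($name in $hostNames) {
      $wildcard = '*.' + ($name -split '\.', 2)[1]   # e.g. *.exchangelab.be
      if ($sanText -match [regex]::Escape($name) -or
          $sanText -match [regex]::Escape($wildcard) -or
          $cert.Subject -match [regex]::Escape($wildcard)) {
          "$name is covered by the certificate"
      } else {
          "$name is NOT covered - add it as a SAN or use a wildcard certificate"
      }
  }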

How the Load Master handles this ‘problem’ is not new – far from it. The same principle is used in every reverse proxy and was also the way our beloved – but sadly discontinued – TMG handled such scenarios. You do not necessarily need to enable the Load Master’s ESP capabilities for this.

Step 1: Creating Content Rules

First, we will start by creating the content rules which the Load Master will use to determine where to send the requests. In this example, we will create rules for the following host names:

  • outlook.exchangelab.be (Exchange)
  • meet.exchangelab.be (Lync)
  • dialin.exchangelab.be (Lync)
  • owa.exchangelab.be (Office Web Apps)
  1. Log in to the Load Master, navigate to Rules & Checking and click Content Rules.
  2. Click Create New…
  3. On the Create Rule page, enter the rule details: give the rule a name and set the Match String to the host name it should match (see the example match strings below).

Repeat steps 2 and 3 for each domain name, changing the Match String value so that it matches the domain name you are using. When you’re done, you should have four content rules – one per host name – listed in the Load Master.

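To give you an idea of the Match String format (the same style as the “^outlook.domain.com*” example further down in this article), here is a hedged sketch of what the four rules could look like, together with a quick PowerShell check of which rule a given host header would match. The rule names and values are assumptions from my lab, so adjust them to your own domain:

  # Hypothetical content rules - names and match strings are assumptions for this lab:
  $rules = [ordered]@{
      'Exchange'      = '^outlook.exchangelab.be*'
      'Lync_Meet'     = '^meet.exchangelab.be*'
      'Lync_Dialin'   = '^dialin.exchangelab.be*'
      'OfficeWebApps' = '^owa.exchangelab.be*'
  }

  # Quick sanity check: which rule would match a given host header?
  $hostHeader = 'meet.exchangelab.be'
  $rules.GetEnumerator() | Where-Object { $hostHeader -match $_.Value } | Select-Object -First 1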

Step 2: Creating a New Virtual Service

This step is fairly easy. We will create a new virtual service that uses the internal IP address mapped to the external IP address. If you have already created a virtual service, you can skip this step.

  1. In the Load Master, click Virtual Services and then click Add New.
  2. Specify the internal IP address which you have previously mapped to an external IP address.
  3. Specify port TCP 443.
  4. Click Add this Virtual Service.

Step 3: Configuring the Virtual Service

So how does the Load Master differentiate between the different host headers? Content rules. Content rules allow you to use regular expressions, which the Load Master uses to examine incoming requests. If a match is found through one of the expressions, the Load Master forwards the traffic to the real server(s) configured behind that content rule.

First, we need to enable proper SSL handling by the Load Master:

  1. Under SSL Properties, click the checkbox next to Enabled.
  2. When presented with a warning about a temporary self-signed certificate, click OK.
  3. Select the box next to Reencrypt. This will ensure that traffic leaving the Load Master is encrypted again before being sent to the real servers. Although some services might support SSL offloading (and thus not re-encrypting traffic), that is beyond the scope of this article and will not be discussed.
  4. Select HTTPS under Rewrite Rules.

Before moving to the next step, we will also need to configure the (wildcard) certificate to be used with this Virtual Service:

  1. Next to Certificates, click Add New
  2. Click Import Certificate and follow the steps to import the wildcard certificate into the Load Master. These steps include selecting a certificate file, specifying a password for the certificate file (if applicable) and setting an identifying name for the certificate (e.g. wildcard).
  3. Click Save
  4. Click “OK” in the confirmation prompt.
  5. Under Operations, click the dropdown menu VS to Add and select the virtual service.
  6. Now click Add VS.

You’ve now successfully configured the certificate for the main Virtual Service. This ensures the Load Master can decrypt and analyze traffic sent to it. Let’s move on to the next step, in which we will define the “Sub Virtual Services”.

Step 4: Adding Sub Virtual Services

While still on the properties page of the (main) Virtual Service, we will now add new ‘Sub Virtual Services’. Having a Sub Virtual Service per workload allows us to define different real servers per SubVS, as well as a different health check. This is the key functionality that allows multiple different workloads to live under a single ‘main’ Virtual Service.

  1. Under Real Servers click Add SubVS…
  2. Click OK in the confirmation window.
  3. A new SubVS will now appear. Click Modify and configure the following parameters:
  • Nickname (makes it easier to differentiate from other SubVSs)
  • Persistence options (if necessary)
  • Real Server(s)

Repeat the steps above for each of the workloads you want to publish.

Note: a word of warning is needed here. Typically, you would add your ‘real servers’ using the same TCP port as the main Virtual Service – TCP 443 in this case. However, if you are also using the Load Master as a reverse proxy for Lync, you will need to make sure your Lync servers are added using port 4443 instead.
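
As a quick sanity check before you continue, you can verify from a management machine that each real server actually listens on the port you are about to configure. A minimal sketch, with placeholder server names:

  # Placeholder server names - replace with your own real servers.
  Test-NetConnection -ComputerName exch01.exchangelab.local -Port 443    # Exchange / Office Web Apps
  Test-NetConnection -ComputerName lync01.exchangelab.local -Port 4443   # Lync external web services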

Once you have configured the Sub Virtual Services, you still need to assign one of the content rules to each of them. Before you’re able to do so, you first have to enable Content Switching.

Step 5: Enabling and Configuring Content Rules

In the properties of the main Virtual Service, under Advanced Properties, click Enable next to Content Switching. You will notice that this option only becomes available after adding your first SubVS.


Once Content Switching is enabled, we need to assign the appropriate rules to each SubVS.

  1. Under SubVSs, click None in the Rules column for the SubVS you are configuring – for example, the Exchange SubVS.
  2. On the Rule Management page, select the appropriate Content Matching rule (created earlier) from the selection box and then click Add.
  3. Repeat these steps for each Sub Virtual Service you created earlier.

Testing

You can now test the configuration by pointing your browser to one of your published services or by using one of the services’ client applications. If all is well, you should be able to reach Exchange, Lync and Office Web Apps – all using the same external IP address.
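
If you’d rather test from PowerShell than from a browser, the hedged examples below can help. The URLs reuse the host names from this article and the health-check paths are common defaults for these products, so adjust them to your environment:

  # Exchange 2013 OWA health check (returns HTTP 200 when the protocol is healthy):
  Invoke-WebRequest -Uri 'https://outlook.exchangelab.be/owa/healthcheck.htm' -UseBasicParsing | Select-Object StatusCode

  # Office Web Apps Server discovery document:
  Invoke-WebRequest -Uri 'https://owa.exchangelab.be/hosting/discovery' -UseBasicParsing | Select-Object StatusCode

  # Lync simple URLs (expect a sign-in or meeting page rather than an error):
  Invoke-WebRequest -Uri 'https://meet.exchangelab.be' -UseBasicParsing | Select-Object StatusCode
  Invoke-WebRequest -Uri 'https://dialin.exchangelab.be' -UseBasicParsing | Select-Object StatusCode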

As you can see, there’s a fair amount of work involved, but all in all it’s relatively straightforward to configure. In this example we published Exchange, Lync and Office Web Apps, but you could just as easily add other services. Especially with the many load-balancing options you have with Exchange 2013, you could, for instance, use multiple additional Sub Virtual Services for Exchange alone. To get you started, the sketch below shows what such content rules could look like.

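As a hedged illustration – these values are assumptions based on the standard Exchange 2013 virtual directories, not an authoritative KEMP configuration – workload-specific rules could match on the virtual directory in the request in addition to the host name:

  # Hypothetical match strings for per-workload Exchange content rules (adjust host and paths as needed):
  $exchangeRules = [ordered]@{
      'OWA'             = '^outlook.exchangelab.be/owa*'
      'ECP'             = '^outlook.exchangelab.be/ecp*'
      'EWS'             = '^outlook.exchangelab.be/ews*'
      'ActiveSync'      = '^outlook.exchangelab.be/microsoft-server-activesync*'
      'OutlookAnywhere' = '^outlook.exchangelab.be/rpc*'
      'OAB'             = '^outlook.exchangelab.be/oab*'
  }

  # As the note below explains, do not combine these with the generic '^outlook.exchangelab.be*' rule:
  # if that rule is evaluated first, it matches every Exchange request and the rules above never fire.
  $request = 'outlook.exchangelab.be/owa/'
  $exchangeRules.GetEnumerator() | Where-Object { $request -match $_.Value } | Select-Object -First 1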

Note: if you are defining multiple Sub Virtual Services for e.g. Exchange, you don’t need to configure a Sub Virtual Service that uses the content rule for the Exchange domain name (“^outlook.domain.com*”). If you do, you’d find that – depending on the order of the rules – your workload-specific virtual services would remain unused.

I hope you enjoyed this article!

Until later,

Michael


Exchange Compliance options

The tasks an IT organization is charged with constantly change, and those tasks are often influenced by trends – sometimes just hype – in the market. As trends come and go, organizations can either act on them or let them pass by.

One technology trend that has survived the test of time is compliance. Ever since Exchange 2010, Microsoft has put more emphasis on features that help enterprises with their specific compliance requirements. Exchange 2013 takes this a step further by adding even more features and improving some of the existing ones. In this article, I comment on some of the compliance trends and explain which Microsoft Exchange features might help you cover your specific compliance requirements. You can continue reading over on SearchExchange.

Happy reading!

-Michael


Microsoft releases the new (re-branded) Exchange 2013 Server Role Requirements Calculator!

Just moments ago, Microsoft unleashed the all-new (or should I say re-branded?) Exchange 2013 Server Role Requirements Calculator to the world.
Along with its release, Microsoft also made clear that multi-role deployments are still the way forward:

“Like with Exchange 2010, the recommendation in Exchange 2013 is to deploy multi-role servers”

Re-branding doesn’t mean the tool has changed completely – at least not the interface.
So don’t worry, the tool itself still looks very familiar and works in much the same way as previous builds did.
Most of the changes were made under the hood. The tool now also provides sizing information for the Client Access Server role and takes into account the changes introduced by Exchange 2013’s new architecture.

Alongside a series of other improvements, a lot of changes were made in the area of High Availability calculations, including:

– You can now specify the Witness Server location: primary, secondary, or tertiary datacenter.
– The calculator allows you to simulate WAN failures, so that you can see how the databases are distributed during the worst failure mode.
– The calculator allows you to name servers and define a database prefix, which are then used in the deployment scripts.
– The distribution algorithm supports single-datacenter HA deployments, Active/Passive deployments, and Active/Active deployments.
– The calculator includes a PowerShell script to automate DAG creation.

In the event you are deploying your high availability architecture with direct-attached storage, you can now specify the maximum number of database volumes each server will support. For example, if you are deploying a server architecture that can support 24 disks, you can specify a maximum of 20 database volumes (leaving 2 disks for the system, 1 disk for the Restore Volume, and 1 disk as a spare for AutoReseed).

To download the Calculator and read the full article from the Exchange team, have a look here: http://blogs.technet.com/b/exchange/archive/2013/05/14/released-exchange-2013-server-role-requirements-calculator.aspx

Enjoy!

Michael


Microsoft re-releases update rollups for Exchange 2007 and Exchange 2010

Following a recent Microsoft Security Advisory article, the Exchange team has decided to re-release the following updates:

Now, before you think back and refer to previous occasions where Exchange update rollups had to be re-released (sometimes more than once), this issue isn’t really related to the Exchange code.
Apparently, digital signatures on some files expire prematurely, which can cause all sorts of (little?) issues (more info at http://technet.microsoft.com/en-us/security/advisory/2749655).

And while they were at it, they also included an additional hotfix in Exchange 2010 SP2 Update Rollup 4:

  • 2756987: Outlook only returns one result after you click the “Your search returned a large number of results. Narrow your search, or click here to view all results” message

For more information, please take a look at the original post from the Exchange team here: http://blogs.technet.com/b/exchange/archive/2012/10/09/re-released-exchange-2010-and-exchange-2007-update-rollups.aspx

 


Blackberry soon to go live (out of beta) in Office 365!

Earlier this week, RIM announced – through an email to existing customers – that their service would go out of beta somewhere in the next few weeks.
There would be no change in support and the service would remain free. However, existing customers would be required to re-accept new terms and conditions.

At the moment, it’s not clear what changes will be made to the terms and conditions. My best guess is that changes will mostly involve service availability and support.
We’ll have to wait and see…

According to RIM, the move to general availability should have no impact on the service. Nonetheless, I’ve had some customers complaining about intermittent connectivity lately…

To be continued!

Hereunder is the original statement by RIM:

(Image: RIM’s original announcement email to existing customers)


Office 365 Beta Exams: my thoughts (and feelings)

Hereunder you’ll find some of my thoughts (and feelings) on both Office 365 beta exams that I took earlier today.

I’m a UC guy, mostly busy with Exchange, Lync and Active Directory. Because I had been warned by some colleagues about the difficulty of the exams (http://thecloudmouth.com/2012/01/10/first-impressions-of-the-office-365-beta-exams/), I made sure to go through the SharePoint Online part at least one more time before taking a shot at them… And although I have delivered quite a few Office 365 Ignite trainings for Microsoft, it did not get me very far…

Honestly, this was one of the most difficult Microsoft exams I have taken to date. For some reason, though, it doesn’t really surprise me. Not because I had been “warned”, but rather because Office 365 is really about more than a single product. It’s about Exchange, Lync, SharePoint and the platform itself, and you cannot expect Microsoft to lower their expectations for certifications just because more than one product is involved, can you?

I definitely agree with some of the comments I’ve read all over the net: you need at least a year of experience with Office 365 in all of its aspects: Exchange, Lync, SharePoint, ADFS, … Theoretical knowledge alone won’t get you there; you’ll need hands-on experience. The depth of some questions is crazy. But perhaps I only say that because I’m not really a SharePoint guy…

I guess that – if luck is on my side – the first exam (“deployment”) will be a PASS. The second one (“managing”), on the other hand, I flunked for sure.

For those out there that still need to take the exam: GOOD LUCK!

If you need some resources on SharePoint Online, I can certainly recommend the following page:
