Free/Busy in a hybrid environment fails and Test-FederationTrust returns the error “Failed to validate delegation token”

Following an issue with Free/Busy in Exchange Online earlier this week, I was troubleshooting the exchange of Free/Busy information in some of my hybrid deployments, as Free/Busy lookups were still not working.
After checking some obvious things, like the Organization Relationships and whether or not Autodiscover was working properly, I discovered an issue when running the Test-FederationTrust cmdlet.

In fact, the cmdlet completed almost entirely successfully, except for the very last step in the process:

Id         : TokenValidation
Type       : Error
Message    : Failed to validate delegation token.

This also explained why I was seeing 401 Unauthorized messages when running the Test-OrganizationRelationship command.

I then checked the same in some of my other deployments and found that they all had the same issue. At least there was some common ground to start working from.
I turned to fellow MVP Steve Goodman and asked him to run the same command in one of his labs to have a point of reference. At the same time, he asked me to run a command that might help:

Get-FederationTrust | Set-FederationTrust -RefreshMetadata

After running the command, I re-ran the Test-FederationTrust command which now completed successfully.
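
For reference, the full sequence looks roughly like the sketch below; the user address and the organization relationship name are placeholders for your own values.

# Validate the federation trust for an on-premises user (placeholder address)
Test-FederationTrust -UserIdentity user@contoso.com

# If the TokenValidation step returns an error, refresh the federation trust metadata
Get-FederationTrust | Set-FederationTrust -RefreshMetadata

# Re-run the validation and, optionally, test the organization relationship as well
Test-FederationTrust -UserIdentity user@contoso.com
Test-OrganizationRelationship -Identity "On-premises to O365" -UserIdentity user@contoso.com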

Conclusion

Although the Free/Busy issues in Office 365 should now be solved, some customers might still experience problems exchanging Free/Busy information. In that case, the problem typically manifests itself by, for example, online users not being able to request an on-premises user’s availability information. If Test-FederationTrust fails on the TokenValidation step, refreshing the federation trust metadata as shown above should resolve the issue.

Blog Exchange Hybrid Exchange Office 365

Estimating the size of an Exchange (online) Archive

As part of some of the archiving projects I have worked on, I frequently get asked whether there is an easy way to determine what the size of an archive will be once it’s been activated. Although it sounds a bit odd at first, there are actually many good reasons why you’d want to know how big an archive will be.

First of all, determining the archive size allows you to better size (or plan for) the storage required for the archives. While there are also other ways to do this, knowing how big an archive will be when it’s enabled is very helpful.

Secondly, if you’re using Exchange Online Archiving (EOA), it allows you to determine the amount of data that will pass through your internet connection for a specific mailbox. If the amount of data is large enough (compared to the available bandwidth), I personally prefer to provision the archive on-premises first and then move it to Office 365 using MRS. But that’s another discussion. Especially for this scenario it can be useful to know how much archive data you can (temporarily) host on-premises before sending it off to Office 365 and freeing up the disk space again.

In order to calculate how big an archive would be, I’ve created a script which goes through all the items in one (or more) mailbox(es) and calculates the total size of all the items that will expire. When an item expires (and thus becomes eligible to be moved to the archive) depends on the Retention Policy you assign to the mailbox and the retention policy tags included in that policy.
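
To illustrate the idea, here is a minimal, hypothetical sketch of that approach using the EWS Managed API: it sums the size of Inbox items older than a fixed age limit. The DLL path, credentials and mailbox address are assumptions, and the real script does considerably more (all folders, impersonation, reporting).

# Load the EWS Managed API (path may differ on your machine)
Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll"

# Connect to the mailbox via Autodiscover (placeholder credentials and address)
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService
$cred = Get-Credential
$service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($cred.UserName, $cred.GetNetworkCredential().Password)
$service.AutodiscoverUrl("user@contoso.com", {$true})

# Only items older than the age limit (60 days in this example) would move to the archive
$cutoff = (Get-Date).AddDays(-60)
$filter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsLessThan([Microsoft.Exchange.WebServices.Data.ItemSchema]::DateTimeReceived, $cutoff)

# Page through the Inbox and add up the item sizes
$inboxId = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)
$folder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service, $inboxId)
$view = New-Object Microsoft.Exchange.WebServices.Data.ItemView(1000)
$totalBytes = 0
do {
    $results = $folder.FindItems($filter, $view)
    $results | ForEach-Object { $totalBytes += $_.Size }
    $view.Offset += $results.Items.Count
} while ($results.MoreAvailable)
"{0:N1} MB of Inbox items would be eligible for archiving" -f ($totalBytes / 1MB)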

As the name of the script suggests, it’s important to understand that this is an estimation of the archive size. There are situations in which the results of the script will differ from the real world. This could be the case when you have enabled the archive and a user has assigned personal tags to items before the Managed Folder Assistant has processed the mailbox. In such a scenario, items with a retention tag that differs from the AgeLimit defined in the script will be calculated incorrectly. Then again, the script is meant to be run before an archive is created.

Secondly, the script will go through all the folders in a mailbox. If you have disabled archiving of calendar items, these items will be wrongly included in the calculation as well. I will try to build this into the script in future releases, but it has a lower priority as the script was built to provide a pretty good estimation, not a 100% correct number.

The script, which you can download here, accepts multiple parameters:

UserPrimarySMTPAddresses: the Primary SMTP Address(es) of the mailbox(es) for which you want to estimate the archive size
Report: the full file path to a txt file which will contain the archive sizes
AgeLimit: the retention time (in days) against which items should be probed. If you have a 60-day retention before items get moved to the archive, enter 60.
Server: used for connecting to EWS. Optional. Can be used if Autodiscover is unable to determine the connection URI.
Credentials: the credentials of an account that has the ApplicationImpersonation management role assigned to it.


The output of the script is an object that contains the user’s Primary SMTP Address and the estimated size of the archive in MB (TotalPRMessageSize).
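
As a quick, hedged usage example (the script file name below is a placeholder for whatever the downloaded file is actually called, and the addresses and report path are made up):

# Hypothetical invocation; replace the script name, addresses and report path with your own values
.\Estimate-ArchiveSize.ps1 -UserPrimarySMTPAddresses user1@contoso.com,user2@contoso.com `
    -AgeLimit 60 -Report C:\Temp\ArchiveSizes.txt -Credentials (Get-Credential)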

Credit where credit is due! I would like to thank Michel de Rooij for his insanely good PowerShell scripting skills and for helping me clean up this script to its current form. Before I sent it off to Michel, the code was pretty inefficient [but hey, it worked!]. What you’ll download has been cleaned up and greatly enhanced: you now get cleaner code, additional error handling and some more parameters than in my original script [see the parameters above].

I hope you’ll enjoy the script and find it useful. I’ve used it in multiple projects so far and it has really helped me with planning the provisioning of archives.

Note: To run the script, you’ll need to have the Exchange Web Services (EWS) Managed API installed and run it with an account that has the ApplicationImpersonation management role assigned to it.

Cheers,

Michael

Blog Exchange

Review: Exchange 2013 Inside-Out: “Mailbox & High Availability” and “Connectivity, Clients & UM”

Although there’s a saying that you shouldn’t judge a book by its cover, for any book that features Tony Redmond and Paul Robichaux as the authors it’s safe to assume it will be a great book! While both books have plenty of interesting technical content, to me that’s not the only thing that defines a good book. Tony and Paul are very eloquent writers and I found reading both books extremely pleasant purely from a language perspective. To be honest, as an amateur writer myself, I can only dream of ever being able to put the English language to work as they do.

Having read the Exchange 2010 Inside-Out book before, I expected these 2013 books to contain at least the same amount and type of content. The 2010 version is a HUGE book (roughly 1,200 pages!), so I was pleasantly surprised to see that the content has now been split into two separate books. This definitely helps make the amount of information more manageable. Don’t get me wrong: there’s still a lot to digest, but having two (slightly) smaller books makes taking them on easier, at least mentally!

That said, I do have to ‘warn’ you: the amount of information and the technical breadth and depth of the content might sometimes feel a little overwhelming, especially if you aren’t very familiar with Exchange. For some, the technical depth might even be outright too deep. That’s also why I advise you not to try to read either of these books in one go. Instead, take your time to read through each of the chapters and allow some time to let the information sink in. Combine that with some fiddling around in your own lab and you’ll have a great learning experience.

What I like about these books is that you’re provided with all the information needed to understand how Exchange operates and are then expected to put that knowledge to work yourself. Although there are plenty of examples in the books, if you are looking for pre-canned scripts, how-tos or step-by-step guides, there might be better alternatives for you (e.g. the Exchange 2013 PowerShell Cookbook). But then again, I don’t think that’s what Paul and Tony were trying to achieve anyway.

Conclusion

Paul and Tony have managed to combine a fun-to-read style with great (technical) content, making the Exchange 2013 Inside-Out books a must-read. Whether you’re a consultant working day in and day out with Exchange or an admin in charge of an Exchange organization, I’m sure you’ll find lots of valuable information in each of the books that will help you in your day-to-day job.

Blog Exchange Exchange 2013 Reviews

Help Exchange become a better product

In the Exchange community’s never-ending efforts to help Microsoft increase the quality and usability of Exchange, a new website was recently created where YOU can submit and rate ideas for features and changes you would like to see in the next version(s) and updates of Exchange.

It’s important to understand that this is a community effort to convince Microsoft to take a look at some heavily requested features. Therefore, the more feedback we get, the better! If there’s enough feedback, I’m confident we can reach at least a few people in Redmond!

If you’ve been around long enough, you will see that some of the ideas have been lingering around for quite some time. With your help, we might just make enough noise for Microsoft to notice!

Right now, the ability to centrally manage signatures and the return of the Set-MailboxSentItemsConfiguration cmdlet (why was it removed in the first place?) are at the top of the list. And they should be! If you think these belong in Exchange, or you have other feature requests, feel free to vote for them so that Microsoft can see how important some features are to us.

Now, before doing anything else, have a look at exchange.ideascale.com and make your contribution to the list!

Cheers,

Michael

Blog Exchange Exchange 2013

Building an enterprise security strategy for Exchange

These days, the news is all about Big Brother watching us. You can’t open a newspaper or watch the news on TV without being slammed with reports that one or another intelligence agency is actively gathering information from other nations or important enterprises.

Especially the latter case makes one wonder: what can you do about it? I don’t believe there is a 100% safe system. However, there’s nothing wrong with trying to make an attacker’s life as hard as possible. There are many ways to do this, and a multi-layered defense is what you should be looking at. However, it’s not only about protecting your network or access to your network; it’s also about protecting your applications.

Recently, I wrote an article about some simple things you can do to secure your Exchange messaging environment. If you want to know more about it, have a look at my latest article for Search Exchange: http://searchexchange.techtarget.com/tip/Build-an-enterprise-security-strategy-for-Exchange

Enjoy!

Exchange Exchange 2013

Exchange Online Archiving (EOA): a view from the trenches – part 1

What is Exchange Online Archiving?

I’ve been meaning to write this article for quite a while now, so I’m glad it’s finally “ready”. First, let me start by introducing what Exchange Online Archiving (EOA for short) actually is.
This feature, available as part of a hybrid Exchange deployment, allows you to provision a cloud-based archive for an on-premises mailbox. While having an Exchange archive isn’t something new, at least not since Exchange 2010, the fact that the archive doesn’t have to be hosted within your own organization is pretty interesting.

Archives can be useful in many ways. One of the primary reasons archives are used is to keep historical data for a longer period of time without cluttering a user’s primary mailbox. This could, for instance, be the case when you have to meet compliance requirements which state that corporate data should be kept for, say, 5 years. Although Exchange doesn’t have a problem handling very large mailboxes with a high item count per folder, it’s usually the human component that cannot handle the overload of information that comes with such large amounts of data – at least, that’s my experience. Keeping email inherently means that you’ll have to increase disk space to support the sometimes huge amounts of data involved. Although disk space has become quite cheap and Exchange 2013 is a great candidate to be used with those cheap disks, there’s still a significant overhead involved in keeping that additional piece of infrastructure up and running.

This is where Exchange Online Archiving comes in handy. First of all, there is no feature difference between an on-premises archive and a cloud-based (Office 365) archive. From a user’s point of view they both act and look the same. In fact, you are only offloading the task of storing the archives to Office 365. The Exchange Online Plan 2 subscription automatically includes the right to provision unlimited-size archives for your users. Although I don’t expect many people to fill up the initial 100 GB you get provisioned to start with any time soon, it’s very hard to match that offer at only $8 per user per month… If you are only interested in EOA, there are specific EOA licenses as well, which cost only a fraction of a full Exchange Online license. Of course, such a license will only allow you to use EOA and nothing more.

How does it work?

As briefly touched upon earlier, being able to use Exchange Online Archiving is a by-product of having a hybrid Exchange deployment. A hybrid deployment, as the name stipulates, is the process of ‘pairing’ your on-premises Exchange organization with Office 365, essentially creating one large “virtual Exchange organization”. As a result, having a (fully functional) hybrid deployment is the first requirement to meet… Technically speaking, it would be possible to set up a sort of minimalistic hybrid deployment in which you leave out functionality that you do not necessarily need to make online archives work (such as cross-premises mail flow). Nonetheless, I strongly encourage you to still set up the full monty. It might save you some time afterwards if you decide to deploy cloud-based mailboxes anyway.

A very important part of the setup is set aside for DirSync. As you might remember, if you tick the “Hybrid Deployment” checkbox during DirSync setup, you allow it to write back some attributes into your on-premises organization. One of these attributes is msExchArchiveStatus. This attribute is a flag that tells the on-premises organization whether an online archive has been provisioned or not. As we will see later in this section, this attribute is particularly important during the creation of an archive.
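
If you’re curious whether that write-back has happened for a given user, a quick check with the Active Directory PowerShell module might look like the sketch below (the account name is a placeholder):

# Check the DirSync write-back of the archive flag for an on-premises user
# (a value of 1 indicates the online archive has been provisioned)
Import-Module ActiveDirectory
Get-ADUser -Identity jdoe -Properties msExchArchiveStatus |
    Select-Object Name, msExchArchiveStatus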

One of the questions I regularly get asked is whether you are required to deploy ADFS when setting up a hybrid deployment. The short answer is no. On the other hand, there are many good reasons why you would want to deploy ADFS, or rather: there are many good reasons why you would want some sort of single/same sign-on. One reason I can think of is to simplify using online archives from an end user’s perspective; that way they won’t need to manage another set of credentials. Of course this isn’t only valid for online archives; it’s the same for every cloud-based workload in Office 365. ADFS is one way of providing SSO, Password Sync is another. Both are valid options, neither is required, and they won’t be discussed here.

From a functional point of view, online archives have the exact same requirements as on-premises archives. You need at least Office 2007 SP3 Professional edition or later. Since the archives run from Office 365, you also need to make sure you’re up to speed with the latest required updates. For more information on which updates are needed, have a look at the following web page: http://office.microsoft.com/en-us/office365-suite-help/software-requirements-for-office-365-for-business-HA102817357.aspx

Now that we got the prerequisites covered, let’s have a look at how the provisioning process works from a high-level perspective:

[Image: high-level overview of the online archive provisioning process]

As you can see from the image above, two DirSync operations are needed. The first one is used to “tell” Office 365 to create an archive for user “X”. The second DirSync operation syncs back the msExchArchiveStatus attribute, which will now have a value of 1 instead of 0, telling the on-premises organization that the archive has been created. A good way to verify whether this process has completed is to run the Get-Mailbox | fl *arch* command:

[Image: output of Get-Mailbox | fl *arch* for the on-premises mailbox]

Here you can see that the archive was created successfully (ArchiveStatus = Active). However, we are missing part of the information. This is because the on-premises organization cannot provide the information from Office 365 (which is essentially another Exchange organization). To fetch the missing information, you’ll have to open a remote PowerShell session to Exchange Online and run the Get-MailUser | fl *arch* command:

[Image: output of Get-MailUser | fl *arch* from Exchange Online]
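
In case you haven’t set up such a session before, a minimal sketch is shown below; the connection URI is the common Exchange Online remote PowerShell endpoint, and the credentials and mailbox address are placeholders.

# Connect to Exchange Online via remote PowerShell (placeholder credentials)
$cred = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session

# Check the archive properties from the Exchange Online side
Get-MailUser user@contoso.com | fl *arch*

# Clean up the session when done
Remove-PSSession $session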

Conclusion

This is it for part one of this article.
In the following part, I will talk about some of the gotchas, do’s and don’ts. Stay tuned!

Exchange Exchange 2013 Hybrid Exchange Office 365

Why I think you should attend IT Connections in Las Vegas

You might wonder why I’m writing this article; why on Earth would I try to convince you to attend this conference? Well, first let me start by telling you that it’s NOT because I’m one of the speakers (though I would be flattered if that were a reason to attend). Anyway, back to reality now…

I consider myself a frequent conference attendee, having attended multiple conferences over the past few years. Despite what you might think, I have not attended TechEd Europe or North America. Although they’re definitely on my to-do list, I usually prefer smaller-scale conferences like the recently discontinued “The Experts Conference”.

So what is it that makes me want to promote this conference above the many others that exist out there?

Just like you, if I spend my money on a conference, I’m looking to get the most value out of it. This means I need to get valuable content and I need to be able to socialize with like-minded peers. I prefer an international crowd – that way you get a more diverse view on things. After all, how things are handled in the US can be very different from how certain IT-related problems are dealt with in, say, Europe – if the same business problems exist there at all.

How do you know if content will be good? Actually, you don’t really know until you’ve attended. However, there are some benchmarks which can help you identify if content is likely to be good. And I can assure you, the signs for IT Connections are good. Heck, they’re even great! First, let’s have a look at some of the speakers who will speak at the conference:

Mary Jo Foley, Steve Goodman, Martina Grom, Adnan Hendricks, Dan Holme, Tim McMichael, Jeff Mealiffe, Mike Pfeiffer, Tony Redmond, Paul Robichaux, John Rodriguez, Mark Russinovich, Loryan Strant, Greg Taylor, Rod Trent, Jaap Wesselius and many, many more.

Anyone who hasn’t been hiding under a rock will easily recognize most of these names. Every single one of them is a reputable and well-respected individual who – in some way – has put their mark on the technical community. Some of these speakers are MVPs, others are well-published authors, Certified Masters or Microsoft employees. Each of these accreditations means something, so the likelihood of hearing crap come out of their mouths is very small. Additionally, you should know that all sessions in the Exchange track are subject to Tony Redmond’s scrutiny. I’ve spoken at several events and I have never had so much valuable input back as from Tony. He’s really working hard to ensure the quality, and by the looks of it you won’t be disappointed.

You might think that this is no different from, let’s say, TechEd. Maybe that’s true. However, Microsoft conferences are usually about “how things are designed to work”, whereas conferences like these will give you more information on “how things actually work [in the real world]”. Both might seem the same, but there’s a subtle yet significant nuance between them. It’s just that ‘small’ difference that YOU – as an IT pro – are looking for. That’s why conferences like these can stand up against the much larger ones that Microsoft organizes.

Anyway, enough eulogizing the speakers; I wouldn’t want them to become complacent over it… 🙂

A second point by which you can benchmark a conference is the sessions. A good speaker is one thing, but if he or she talks about a topic which doesn’t interest you, it’s not likely to bring you much value. And that’s exactly another point where this conference stands out from other conferences. I mean, just have a look at the session list (for Exchange)!

I highly doubt that – in this very diverse list of sessions – there’s nothing that interests you…

Then there’s the aspect of “socializing”. There’s actually not much more I can say than: “It’s in VEGAS, baby!” Although I have never been to Vegas myself, I can hardly imagine there will be a lack of socializing opportunities. Some of them are organized by the conference, but ultimately it’s up to YOU to socialize with peers. And believe me, the best conversations I ever had were over dinner or while having a beer or two (or three, or four, or…). The fact that you don’t have to mingle amongst several thousands of other people is just an additional bonus, as you’ll find it much easier to connect with speakers and other attendees.

Finally, there’s the aspect of cost. Although there’s less impact for people living in the US, it’s usually more of a problem when you’re travelling from Europe.
So, let’s have a look at what this conference might cost you:

Airfare (Brussels – Las Vegas): +/- 800 EUR (just checked via SkyScanner.net)
Conference (Basic Registration): +/- 1,130 EUR
Hotel (6 nights): +/- 900 EUR
TOTAL: +/- 2,830 EUR

Considering that a full week of training on Microsoft Exchange (Advanced Solutions of Microsoft Exchange Server 2013) will cost you about 2,700 EUR (incl. VAT), this conference is a bargain! You’ll get much more value from these sessions and the overall experience than you will from a week’s training.

Now, don’t get me wrong. I’m not against training. But given the difference between the two, the conference is where I’d put my money.

Note: Because both the conference and the training take up 5 billable days, I didn’t include those days in the comparison as they wouldn’t change the case anyway.

The UC Architects

I admit, this part is a shameless plug. Nonetheless, if you’re still not convinced after all I’ve written, maybe here’s something to think about. Next to some other extra activities and panel discussions taking place at the conference, The UC Architects will host a live panel discussion (which will be recorded) with some very interesting guests – and you can attend! We are currently working very hard to make something very special out of it, so make sure to keep an eye out for more information. Whatever you do: don’t miss it!

Cheers, Michael

Events Exchange