How To: Export and Modify Registrations in Bulk
There are scenarios in which you need to create or modify large numbers of registrations in a notification hub, such as updating tags after a batch computation or migrating an existing push implementation to Notification Hubs.
This topic explains how to use Notification Hubs bulk support to perform a large number of operations on a notification hub, or to export all registrations.
High-Level Flow
Batch support is designed for long-running jobs involving millions of registrations. To achieve this scale, it uses Azure Storage to store job details and output. For bulk update operations, you create a file in a blob container whose content is the list of registration update operations. When starting the job, you provide a URL to the input blob, along with a URL to an output directory (also in a blob container). After the job has started, you can check its status by querying a URL provided when the job starts. A given job can only perform operations of a single kind (creates, updates, or deletes). Export operations are performed analogously.
Import
Setup
This section assumes you have the following:
A provisioned notification hub.
An Azure Storage blob container.
References to the Azure Storage and Azure Service Bus NuGet packages.
Create input file and store it in a blob
An input file contains a list of registrations serialized in XML, one per row. Using the Azure SDK, the following code example shows how to serialize the registrations and upload them to a blob container.
private static void SerializeToBlob(CloudBlobContainer container, RegistrationDescription[] descriptions)
{
    StringBuilder builder = new StringBuilder();
    foreach (var registrationDescription in descriptions)
    {
        builder.AppendLine(registrationDescription.Serialize());
    }
    var inputBlob = container.GetBlockBlobReference(INPUT_FILE_NAME);
    using (MemoryStream stream = new MemoryStream(Encoding.UTF8.GetBytes(builder.ToString())))
    {
        inputBlob.UploadFromStream(stream);
    }
}
Important
The preceding code serializes the registrations in memory and then uploads the entire stream to a blob. If you are uploading a file that is more than a few megabytes, refer to the Azure Blob storage guidance on how to perform these steps; for example, by using block blobs.
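If you prefer not to buffer the serialized registrations in memory at all, the content can also be streamed directly into the blob. The following is a minimal sketch under the same assumptions as the preceding example (same container and INPUT_FILE_NAME); it relies on CloudBlockBlob.OpenWrite from the Azure Storage SDK referenced in this topic.
private static void SerializeToBlobStreaming(CloudBlobContainer container, RegistrationDescription[] descriptions)
{
    var inputBlob = container.GetBlockBlobReference(INPUT_FILE_NAME);
    // Write each serialized registration directly to the blob stream,
    // avoiding a full in-memory copy of the input file.
    using (var blobStream = inputBlob.OpenWrite())
    using (var writer = new StreamWriter(blobStream, Encoding.UTF8))
    {
        foreach (var registrationDescription in descriptions)
        {
            writer.WriteLine(registrationDescription.Serialize());
        }
    }
}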
Create URL tokens
Once your input file is uploaded, you must generate the URLs to provide to your notification hub for both the input file and the output directory. Note that you can use two different blob containers for input and output.
static Uri GetOutputDirectoryUrl(CloudBlobContainer container)
{
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddDays(1),
        Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.List | SharedAccessBlobPermissions.Read
    };
    string sasContainerToken = container.GetSharedAccessSignature(sasConstraints);
    return new Uri(container.Uri + sasContainerToken);
}

static Uri GetInputFileUrl(CloudBlobContainer container, string filePath)
{
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddDays(1),
        Permissions = SharedAccessBlobPermissions.Read
    };
    string sasToken = container.GetBlockBlobReference(filePath).GetSharedAccessSignature(sasConstraints);
    return new Uri(container.Uri + "/" + filePath + sasToken);
}
Submit the job
With the input and output URLs, you can now start the batch job.
NotificationHubClient client = NotificationHubClient.CreateClientFromConnectionString(CONNECTION_STRING, HUB_NAME);

var createTask = client.SubmitNotificationHubJobAsync(
    new NotificationHubJob
    {
        JobType = NotificationHubJobType.ImportCreateRegistrations,
        OutputContainerUri = outputContainerSasUri,
        ImportFileUri = inputFileSasUri
    }
);
createTask.Wait();

var job = createTask.Result;
long i = 10;
while (i > 0 && job.Status != NotificationHubJobStatus.Completed)
{
    var getJobTask = client.GetNotificationHubJobAsync(job.JobId);
    getJobTask.Wait();
    job = getJobTask.Result;
    Thread.Sleep(1000);
    i--;
}
In addition to the input and output URLs, this example creates a NotificationHubJob object whose JobType can be one of the following:
ImportCreateRegistrations
ImportUpdateRegistrations
ImportDeleteRegistrations
Once the call completes, the notification hub continues processing the job, and you can check its status by calling GetNotificationHubJobAsync.
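The sample above polls the job status with blocking Wait and Thread.Sleep calls. As a reference, the same check can be written asynchronously; the following is a minimal sketch that assumes it runs inside an async method with the same client and job variables, and the 10-attempt limit and 1-second delay simply mirror the sample.
// Hedged async variant of the status-polling loop shown above.
for (int attempt = 0; attempt < 10 && job.Status != NotificationHubJobStatus.Completed; attempt++)
{
    await Task.Delay(TimeSpan.FromSeconds(1));
    job = await client.GetNotificationHubJobAsync(job.JobId);
}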
At the completion of the job, you can inspect the results by looking at the following files in your output directory:
/<hub>/<jobid>/Failed.txt
/<hub>/<jobid>/Output.txt
These files contain the lists of successful and failed operations from your batch. The file format is CSV: each row contains the line number from the original input file and the output of the operation (usually the created or updated registration description).
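For reference, the output files can be read back with the same storage client used for the upload. The following is a minimal sketch; it assumes the output landed in the container referenced by container, and the hubName and jobId parameters are illustrative placeholders matching the path pattern shown above.
static void PrintJobOutput(CloudBlobContainer container, string hubName, string jobId)
{
    // Path pattern taken from the list above: /<hub>/<jobid>/Output.txt
    var outputBlob = container.GetBlockBlobReference(hubName + "/" + jobId + "/Output.txt");
    using (var reader = new StreamReader(outputBlob.OpenRead()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // Each row: <line number in the input file>,<operation output>
            Console.WriteLine(line);
        }
    }
}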
Export
Exporting registrations is similar to importing them, with the following differences:
You only need the output URL.
You must create a NotificationHubJob of type ExportRegistrations, as sketched after this list.
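The following is a minimal sketch of submitting an export job; it assumes the same client and outputContainerSasUri variables created in the import example.
var exportJobTask = client.SubmitNotificationHubJobAsync(
    new NotificationHubJob
    {
        JobType = NotificationHubJobType.ExportRegistrations,
        OutputContainerUri = outputContainerSasUri
    }
);
exportJobTask.Wait();
var exportJob = exportJobTask.Result;
// Poll exportJob.Status with GetNotificationHubJobAsync, as shown in the import example.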
Full Sample Code
The following is a full working sample that imports registrations into a notification hub.
using Microsoft.ServiceBus.Notifications;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Runtime.Serialization;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Xml;
namespace ConsoleApplication1
{
    class Program
    {
        private static string CONNECTION_STRING = "---";
        private static string HUB_NAME = "---";
        private static string INPUT_FILE_NAME = "CreateFile.txt";
        private static string STORAGE_ACCOUNT = "---";
        private static string STORAGE_PASSWORD = "---";
        private static StorageUri STORAGE_ENDPOINT = new StorageUri(new Uri("---"));

        static void Main(string[] args)
        {
            var descriptions = new[]
            {
                new MpnsRegistrationDescription(@"http://dm2.notify.live.net/throttledthirdparty/01.00/12G9Ed13dLb5RbCii5fWzpFpAgAAAAADAQAAAAQUZm52OkJCMjg1QTg1QkZDMkUxREQFBlVTTkMwMQ"),
                new MpnsRegistrationDescription(@"http://dm2.notify.live.net/throttledthirdparty/01.00/12G9Ed13dLb5RbCii5fWzpFpAgAAAAADAQAAAAQUZm52OkJCMjg1QTg1QkZDMjUxREQFBlVTTkMwMQ"),
                new MpnsRegistrationDescription(@"http://dm2.notify.live.net/throttledthirdparty/01.00/12G9Ed13dLb5RbCii5fWzpFpAgAAAAADAQAAAAQUZm52OkJCMjg1QTg1QkZDMhUxREQFBlVTTkMwMQ"),
                new MpnsRegistrationDescription(@"http://dm2.notify.live.net/throttledthirdparty/01.00/12G9Ed13dLb5RbCii5fWzpFpAgAAAAADAQAAAAQUZm52OkJCMjg1QTg1QkZDMdUxREQFBlVTTkMwMQ"),
            };

            // Write to blob storage to create the input file.
            var blobClient = new CloudBlobClient(STORAGE_ENDPOINT, new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(STORAGE_ACCOUNT, STORAGE_PASSWORD));
            var container = blobClient.GetContainerReference("testjobs");
            container.CreateIfNotExists();
            SerializeToBlob(container, descriptions);

            // Create the SAS URLs for the input file and the output directory.
            var outputContainerSasUri = GetOutputDirectoryUrl(container);
            var inputFileSasUri = GetInputFileUrl(container, INPUT_FILE_NAME);

            // Import the file into the notification hub.
            NotificationHubClient client = NotificationHubClient.CreateClientFromConnectionString(CONNECTION_STRING, HUB_NAME);
            var createTask = client.SubmitNotificationHubJobAsync(
                new NotificationHubJob
                {
                    JobType = NotificationHubJobType.ImportCreateRegistrations,
                    OutputContainerUri = outputContainerSasUri,
                    ImportFileUri = inputFileSasUri
                }
            );
            createTask.Wait();

            var job = createTask.Result;
            long i = 10;
            while (i > 0 && job.Status != NotificationHubJobStatus.Completed)
            {
                var getJobTask = client.GetNotificationHubJobAsync(job.JobId);
                getJobTask.Wait();
                job = getJobTask.Result;
                Thread.Sleep(1000);
                i--;
            }
        }
        private static void SerializeToBlob(CloudBlobContainer container, RegistrationDescription[] descriptions)
        {
            StringBuilder builder = new StringBuilder();
            foreach (var registrationDescription in descriptions)
            {
                builder.AppendLine(registrationDescription.Serialize());
            }
            var inputBlob = container.GetBlockBlobReference(INPUT_FILE_NAME);
            using (MemoryStream stream = new MemoryStream(Encoding.UTF8.GetBytes(builder.ToString())))
            {
                inputBlob.UploadFromStream(stream);
            }
        }
        static Uri GetOutputDirectoryUrl(CloudBlobContainer container)
        {
            // Set the expiry time and permissions for the container.
            // In this case no start time is specified, so the shared access signature becomes valid immediately.
            SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
            {
                SharedAccessExpiryTime = DateTime.UtcNow.AddHours(4),
                Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.List | SharedAccessBlobPermissions.Read
            };

            // Generate the shared access signature on the container, setting the constraints directly on the signature.
            string sasContainerToken = container.GetSharedAccessSignature(sasConstraints);

            // Return the URI string for the container, including the SAS token.
            return new Uri(container.Uri + sasContainerToken);
        }

        static Uri GetInputFileUrl(CloudBlobContainer container, string filePath)
        {
            // Set the expiry time and permissions for the input blob.
            // In this case no start time is specified, so the shared access signature becomes valid immediately.
            SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy
            {
                SharedAccessExpiryTime = DateTime.UtcNow.AddHours(4),
                Permissions = SharedAccessBlobPermissions.Read
            };

            // Generate the shared access signature on the blob, setting the constraints directly on the signature.
            string sasToken = container.GetBlockBlobReference(filePath).GetSharedAccessSignature(sasConstraints);

            // Return the URI string for the blob, including the SAS token.
            return new Uri(container.Uri + "/" + filePath + sasToken);
        }

        static string GetJobPath(string namespaceName, string notificationHubPath, string jobId)
        {
            return string.Format(CultureInfo.InvariantCulture, @"{0}//{1}/{2}/", namespaceName, notificationHubPath, jobId);
        }
    }
}