Learn About InformaCast Resiliency - Podcast

We've heard for years that customers need the ability to run InformaCast in a multi-server configuration. Many industries have called for this, including health care, and even some school districts that don't want to run the bells for the whole district off of a single InformaCast server, for fear that it, or the network it runs on, might not be available.

For this reason, Singlewire has been working on InformaCast Resiliency.
Resiliency is probably the hardest feature we've had to implement inside InformaCast. It's taken us a long time to get it in, but in InformaCast 9.0.1, due to be released this spring, you will see Resiliency for the first time.
 
Resiliency has two parts. 
 
The first part is replication. Let's say you have multiple InformaCast servers able to reach each other in a network. 
  • We want to give the end user a single point of administration. That means when a customer makes a change on their primary server, it will replicate to their secondary servers.
  • We want the replication traffic to be lightweight. We know that customers run many other applications on the network, and we don't want to require the customer to make network changes for replication to work correctly.
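The replication model above can be sketched as a simple change-log push: the publisher is the single point of administration, and each change ships to subscribers as a small delta rather than a full database copy. This is an illustrative toy under our own assumptions, not InformaCast's actual replication protocol; all class and method names are hypothetical.

```python
import json

class Subscriber:
    """Holds a mostly read-only replica of the publisher's configuration."""
    def __init__(self, name):
        self.name = name
        self.config = {}

    def apply(self, change):
        # Subscribers apply changes; they never originate them.
        self.config[change["key"]] = change["value"]

class Publisher:
    """Single point of administration: every change replicates outward."""
    def __init__(self):
        self.config = {}
        self.subscribers = []

    def set(self, key, value):
        self.config[key] = value
        # Keep replication traffic lightweight: ship only the delta,
        # not the whole database.
        payload = json.dumps({"key": key, "value": value})
        for sub in self.subscribers:
            sub.apply(json.loads(payload))

pub = Publisher()
subs = [Subscriber("sub1"), Subscriber("sub2")]
pub.subscribers.extend(subs)
pub.set("bell.schedule", "08:00")
print(all(s.config == pub.config for s in subs))  # True
```

The point of the sketch is the direction of flow: administrators only ever touch the publisher, and the secondaries converge on its state.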
     
The second part of resiliency is failover. If you've got multiple InformaCast servers replicating to each other, they need to agree on a way to transfer the active role between them. 
  • The way this works is that on the publisher, you define an ordered list of InformaCast subscribers. Each server is in contact with all of the others. When one server in the list sees that all of the servers ahead of it in the list are no longer active, then it goes active.
     
Let's get some definitions and details out of the way:
  • InformaCast Resiliency uses a publisher/subscriber model, like CUCM does. One server is the publisher and holds the authoritative copy of the database; the others are subscribers and hold a mostly read-only copy of the database.
  • When the publisher is running, it's always active. Subscribers can be active or standby. Only the active server accepts and processes inbound send-message requests.
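Putting the failover rule and the active/standby states together, the election logic can be sketched in a few lines of Python: a server goes active only when every server ahead of it in the ordered list is down. This is a conceptual toy, not InformaCast's implementation; the function name and list layout are hypothetical.

```python
def should_go_active(my_index, alive):
    """A server goes active when it is up and all servers ahead of it
    in the ordered failover list (publisher first, then subscribers
    in the order defined on the publisher) are no longer active."""
    return alive[my_index] and not any(alive[:my_index])

# Ordered list: index 0 is the publisher, then the subscribers.
alive = [True, True, True]
assert should_go_active(0, alive)        # publisher is up: it is active

alive = [False, True, True]              # publisher goes down
assert should_go_active(1, alive)        # first subscriber takes over
assert not should_go_active(2, alive)    # second subscriber stays standby
```

Because the rule only looks at servers earlier in the list, exactly one reachable server concludes it should be active at any time, and the publisher automatically reclaims the active role when it returns.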
One of the things that makes resiliency so hard is supporting IP phones. With the HTTP-based model that we currently use, when a phone receives a command from InformaCast, it must authenticate that command against a single URL. Since there is only one authentication URL, there's no practical way for multiple servers to provide authentication services. In InformaCast 9.0.1, we add support for a new method of sending commands to IP phones: JTAPI transport. The idea is that instead of InformaCast sending commands directly to the phones, it sends these commands to CUCM, and CUCM sends them to the phones.
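The difference between the two transports can be pictured as a mediator pattern: in the HTTP model InformaCast talks to each phone directly and must pass credentials along, while with JTAPI transport it hands the command to CUCM, which already controls the phone. This is a conceptual toy, not Cisco's JTAPI API; every class, method, and device name here is hypothetical.

```python
class DirectHttpTransport:
    """HTTP model: InformaCast pushes a command straight to the phone,
    which must authenticate it against a single authentication URL."""
    def send(self, phone, command, username, password):
        # Credentials travel over the network to the phone.
        return f"POST {command} to {phone} (auth {username}:{password})"

class JtapiTransport:
    """JTAPI model: InformaCast asks CUCM to deliver the command.
    CUCM already controls the phone, so no credentials reach it."""
    def __init__(self, cucm):
        self.cucm = cucm

    def send(self, phone, command):
        # The command is relayed through call control, not sent directly.
        return f"{self.cucm} relays {command} to {phone}"

print(DirectHttpTransport().send("SEP001122334455", "play", "admin", "secret"))
print(JtapiTransport("cucm-pub").send("SEP001122334455", "play"))
```

The second transport is what removes per-phone authentication from the picture: only CUCM needs a trust relationship with InformaCast, not every handset.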
 
The advantages here are that (1) the phone no longer has to authenticate and (2) InformaCast does not have to transmit usernames and passwords to the phones.
 
We think this will increase performance, allowing us to activate phones faster, and increase security for customers who are sensitive to passwords being sent over the network. 
 
In InformaCast 9.0.1, JTAPI transport will be optional for the publisher, but required for subscribers. JTAPI transport as a feature can be used in non-resilient modes of InformaCast, so you'll be able to use it in everyday deployments as well as resilient ones. 
 
This leads us to our next topic: deployment topologies. Where can customers deploy InformaCast in their networks? There are a couple of recommendations.
  1. One thing that we're not doing in this release is targeting survivable remote sites. So, if you place InformaCast at a remote site without a CUCM server and the WAN link goes down, that InformaCast server won't be able to reach the phones at that site. We're hoping to address this in a future release.
  2. Co-locate the InformaCast server with a CUCM cluster member. Since a resilient subscriber needs CUCM to reach the IP phones, it makes sense in most instances to place InformaCast with a CUCM cluster member. If you have a split CUCM cluster, you can now split your InformaCast servers along the same topology.

 

 
