Can I run my own latency / performance tests against your service easily?

Sure, of course you can.

Here's an example in Node.js:

const ably = require("ably");

const realtime = new ably.Realtime({ key: "ABLY_API_KEY" });
const rest = new ably.Rest({ key: "ABLY_API_KEY" });

const channel = realtime.channels.get('latency');
const restChannel = rest.channels.get('latency');

realtime.connection.once("connected", () => {
  // The data center is the fourth dot-separated segment of the serverId
  const dataCenter = realtime.connection.connectionManager.connectionDetails.serverId.split('.')[3];
  console.log(`Connected to ${dataCenter}`);

  channel.subscribe((msg) => {
    // Messages published via curl carry the bare timestamp as their data;
    // messages published below carry { type, sent }
    const received = new Date();
    const type = msg.data.type || 'curl';
    const sent = type === 'curl' ? msg.data : msg.data.sent;
    const duration = received.getTime() - Number(sent);

    const messageText = `${received.getHours()}:${received.getMinutes()}:${received.getSeconds()}.${received.getMilliseconds()} - ${type} publish - ${duration}ms roundtrip latency`;
    console.log(messageText);
  });

  // Publish a few test messages over both REST and the realtime connection
  for (let i = 0; i < 5; i++) {
    restChannel.publish(null, { type: 'rest', sent: String(new Date().getTime()) });
    channel.publish(null, { type: 'realtime', sent: String(new Date().getTime()) });
  }
});
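The roundtrip arithmetic in the subscribe handler can be pulled out into a small pure helper, which makes it easy to unit-test. The helper name and shape below are my own sketch, not part of the Ably API:

```javascript
// Hypothetical helper (not part of ably-js): derive roundtrip latency
// from a received message payload. The test messages above carry
// { type, sent }, where `sent` is the publisher's epoch time in ms
// as a string; messages published via curl carry the bare timestamp.
function roundtripLatencyMs(msgData, receivedAt = new Date()) {
  const sent = typeof msgData === "object" ? msgData.sent : msgData;
  return receivedAt.getTime() - Number(sent);
}

// A message stamped at t=1000ms and received at t=1500ms:
console.log(roundtripLatencyMs({ type: "rest", sent: "1000" }, new Date(1500))); // → 500
```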


You can also use the promises variant of the library:

const Ably = require('ably/promises');

const ably = new Ably.Realtime.Promise({ key: 'ABLY_API_KEY' });

ably.connection.once("connected", () => {
  const dataCenter = ably.connection.connectionManager.connectionDetails.serverId.split('.')[3];
  console.log(`Connected to ${dataCenter}`);
});

Or, with async/await:

const connection = async () => {
  await ably.connection.once("connected");

  const dataCenter = ably.connection.connectionManager.connectionDetails.serverId.split('.')[3];
  console.log(`Connected to ${dataCenter}`);
};

Please note that, since Connection.connectionManager is not part of the public API, it is not present in the TypeScript interface, so you will need to add an ambient module declaration to extend the interface in order to make your code compile.

For this you would just add:

declare module 'ably' {
  namespace Types {
    interface ConnectionPromise {
      connectionManager: {
        connectionDetails: {
          serverId: string;
        };
      };
    }
  }
}
That will also show you which Ably datacenter you are connecting to.

When testing please remember that:

  • Publishing over a Realtime connection offers the lowest possible publish latencies. REST publishes will be significantly slower than realtime publishes for the first request as a TLS TCP/IP connection is established. Subsequent publishes will be faster as that connection will be reused, but they will still be somewhat slower than realtime publishes, as there is a certain amount of overhead for every REST request (see point 4 of debugging slow REST requests).
  • Measuring how long a publish takes to call its callback is not a good way to measure message publish and receive latency; publish ACKs are rolled up and sent every 500ms, so the message is often delivered long before the publisher gets the ACK (and so the publish callback is called).  Instead, it's better to subscribe to the channel and do an end-to-end test of how long it takes for a message to actually be received after being published.
  • Publishing messages on channels without any subscribers will add latency.  However, read why this never impacts real-world performance.
  • We continuously measure our latencies globally using real browsers.  View our global latencies.
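
The second point above can be illustrated with a toy simulation (timings invented, no Ably calls involved): the subscriber receives the message long before the publisher's callback fires, because the ACK only arrives on the next roll-up.

```javascript
// Toy simulation, not real Ably behavior: delivery takes ~50ms, but
// the publish ACK (which triggers the publish callback) only arrives
// on the ~500ms roll-up, so callback timing overstates latency.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function simulate() {
  const sentAt = Date.now();
  const [e2eMs, ackMs] = await Promise.all([
    delay(50).then(() => Date.now() - sentAt),   // end-to-end receive
    delay(500).then(() => Date.now() - sentAt),  // publish callback fires
  ]);
  return { e2eMs, ackMs };
}

simulate().then(({ e2eMs, ackMs }) =>
  console.log(`received after ~${e2eMs}ms, publish callback after ~${ackMs}ms`));
```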


Further reading: