Working with message batches

Bundling and sending high-volume messages as a 'batch'


Sometimes, you may need to send a message to several recipients at once, i.e., as a batch.

To facilitate this, Falu provides a message-batches endpoint that allows you to send up to 1,000 messages in one API call.

Common use cases for message batches include:

  • Announce a new product or service.
  • Generate alerts.
  • Send confirmation messages or reminders for appointments, reservations, and purchases at scale.
  • Recommend products to customers.

How do message batches work?

Generally, a batch comprises a message meant for a group of related recipients. For example, if you have a product announcement that you wish to broadcast to your customers, you would compose a message body and send it through Falu's message-batches endpoint. You can compose the message body from scratch or use an existing template.

The general syntax of a POST request to send a message batch is:

curl -X POST 'https://api.falu.io/v1/message_batches' \
--header 'Authorization: Bearer YOUR_SECRET_KEY' \
--header 'X-Falu-Version: 2022-09-01' \
--header 'Content-Type: application/json' \
--data '{
    "stream": "transactional",
    "messages": [
        {
            "tos": [ "+254recipient1", "+254recipient2" ],
            "body": "YOUR_MESSAGE_BODY"
        }
    ]
}'
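
Because the messages field is an array, a single call can carry several message entries, each with its own recipients and body, up to the 1,000-message limit mentioned above. The following is a minimal sketch under that assumption; the recipient numbers and bodies are placeholders:

curl -X POST 'https://api.falu.io/v1/message_batches' \
--header 'Authorization: Bearer YOUR_SECRET_KEY' \
--header 'X-Falu-Version: 2022-09-01' \
--header 'Content-Type: application/json' \
--data '{
    "stream": "transactional",
    "messages": [
        {
            "tos": [ "+254recipient1", "+254recipient2" ],
            "body": "FIRST_MESSAGE_BODY"
        },
        {
            "tos": [ "+254recipient3" ],
            "body": "SECOND_MESSAGE_BODY"
        }
    ]
}'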

Alternatively, if you wish to use an existing template, you would do so as follows:

curl -X POST 'https://api.falu.io/v1/message_batches' \
--header 'Authorization: Bearer YOUR_SECRET_KEY' \
--header 'X-Falu-Version: 2022-09-01' \
--header 'Content-Type: application/json' \
--data '{
    "stream": "transactional",
    "messages": [
        {
            "tos": [ "+254recipient1", "+254recipient2" ],
            "template": {
                "id": "TEMPLATE_ID",
                "model": {
                    "key": "value"
                }
            }
        }
    ]
}'
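
Since each entry in the messages array carries its own template model, you can presumably reuse the same template while varying the substitution values per recipient group. The sketch below assumes only the structure shown above; TEMPLATE_ID and the model keys and values are placeholders:

curl -X POST 'https://api.falu.io/v1/message_batches' \
--header 'Authorization: Bearer YOUR_SECRET_KEY' \
--header 'X-Falu-Version: 2022-09-01' \
--header 'Content-Type: application/json' \
--data '{
    "stream": "transactional",
    "messages": [
        {
            "tos": [ "+254recipient1" ],
            "template": {
                "id": "TEMPLATE_ID",
                "model": { "key": "value_for_first_group" }
            }
        },
        {
            "tos": [ "+254recipient2" ],
            "template": {
                "id": "TEMPLATE_ID",
                "model": { "key": "value_for_second_group" }
            }
        }
    ]
}'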

Message batches are best suited for marketing purposes. For more details, see Sending marketing messages.