Limetree, like a lot of companies out there, relied on KissMetrics to get insights on users. We tracked lots of events and tried to make sense of them.
With all this data, there was still one flaw: we could easily segment customers and create useful charts, but we could not reach them. KissMetrics did not help here; the data was fairly closed, and there was no API available.
I wanted to email customers who had used the service for a few weeks and then stopped. Or those who only sent pictures but no videos. I wanted to tell them that video was an important part of the product.
How to migrate to Customer.io
We were not ditching KissMetrics, but Customer.io relied on its own event system, so I needed a way to import the old metrics. I used the KissMetrics export service, which dumps hundreds of JSON files to an S3 bucket of your choice. It takes a few hours the first time, but after that it keeps the bucket updated every few hours.
Step 0: KissMetrics to Customer.io Migration Class
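The scripts in the next steps require() a small kissmetrics-to-customerio.php helper that wraps Customer.io's REST API. The original file isn't reproduced here, so the following is a minimal sketch of what it could look like, assuming Customer.io's track API: PUT /api/v1/customers/{id} to create or update a customer, POST /api/v1/customers/{id}/events to record an event, both authenticated with HTTP Basic auth using your site ID and API key.

```php
// Minimal sketch of kissmetrics-to-customerio.php (an assumed helper,
// not the original class). Wraps Customer.io's track API with cURL.

define('CIO_BASE', 'https://track.customer.io/api/v1');

function cio_request($method, $path, array $body)
{
    $ch = curl_init(CIO_BASE . $path);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $method);
    curl_setopt($ch, CURLOPT_USERPWD, SITE_ID . ':' . API_KEY);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($body));
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $response = curl_exec($ch);
    $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($status >= 400) {
        throw new Exception("Customer.io returned HTTP $status: $response");
    }
    return $response;
}

// Create (or update) a customer.
function create_customer($id, $email, $createdAt, array $attributes = array())
{
    $body = array_merge($attributes, array(
        'email'      => $email,
        'created_at' => $createdAt,
    ));
    return cio_request('PUT', '/customers/' . urlencode($id), $body);
}

// Record an event for an existing customer.
function track_event($userId, $name, $timestamp, array $attributes = array())
{
    $body = array(
        'name'      => $name,
        'timestamp' => $timestamp,
        'data'      => $attributes,
    );
    return cio_request('POST', '/customers/' . urlencode($userId) . '/events', $body);
}
```

create_customer and track_event match the signatures used by the import scripts below; error handling is deliberately minimal so callers can catch the exception and keep going.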
Step 1: Export the KissMetrics data
This step is simple, and KissMetrics has a straightforward guide: follow those steps and come back for the next one.
Step 2: Export your users
Before you start sending events to Customer.io you need to create the customers (Customer.io's term for users). Export your user base to a CSV file; a simple dump from your database is enough.
The required fields are user id, name, email, and creation date. You can add other properties you might find useful, probably the same ones you were already using on KissMetrics, like plan type or country.
23123, "John Doe", "johndoe@disposable.com", "2012-11-23", "Portugal"
…
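How you produce the dump doesn't matter. As a minimal sketch, here is fputcsv writing rows in the same column order the import script expects; the hard-coded array is a stand-in for a query against your own database:

```php
// Sketch: write users-dump.csv in the expected column order:
// id, name, email, creation date, country.
// Replace the hard-coded rows with a query against your database.
$users = array(
    array(23123, 'John Doe', 'johndoe@disposable.com', '2012-11-23', 'Portugal'),
    array(23124, 'Jane Roe', 'janeroe@disposable.com', '2012-11-24', 'Spain'),
);

$out = fopen('users-dump.csv', 'w');
foreach ($users as $user) {
    fputcsv($out, $user); // quotes fields only when needed
}
fclose($out);
```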
Step 3: Import your users
Feed each user to create_customer(id, email, createdAt, attributes)
This process will take anywhere from a few minutes to a few hours, depending on the size of your user base.
error_reporting(-1);

define('SITE_ID', 'xxxx');
define('API_KEY', 'yyyy');

require("kissmetrics-to-customerio.php");

$csvFile = file_get_contents('users-dump.csv');

foreach (explode("\n", $csvFile) as $row) {
    if (trim($row) === '') continue; // skip blank lines

    $data      = str_getcsv($row);
    $id        = $data[0];
    $name      = $data[1];
    $email     = $data[2];
    $createdAt = strtotime($data[3]);
    $country   = $data[4];

    $attributes = array('name' => $name, 'country' => $country);

    try {
        create_customer($id, $email, $createdAt, $attributes);
    } catch (Exception $ex) {
        echo $ex->getMessage();
    }
}
Step 4: Collect your events
After you import all your users, the S3 bucket should be filled with the juicy content of all your KissMetrics events. The number of files can be overwhelming. They export hundreds of small files, each containing several one-line JSON records, so we need to compact them into one file to make the import easier. Example:
{"platform":"iphone","_n":"login","_p":"12312","_t":1352317082}
{"platform":"web","_n":"login","_p":"2221321","_t":1352316831}
{"platform":"web","_n":"login","_p":"112123","_t":1352317100}
First sync the S3 bucket with a local directory:
s3cmd sync s3://kissmetrics-export-bucket km-export
Compact all the JSON files into one:
cat km-export/*.json > km-data.json
Step 5: Import your events
Feed each event to track_event(userId, name, timestamp, attributes)
This process will take a few hours depending on the volume of events.
define('SITE_ID', 'xxxx');
define('API_KEY', 'yyyy');

require("kissmetrics-to-customerio.php");

$eventsFile = file_get_contents('km-data.json');

foreach (explode("\n", $eventsFile) as $row) {
    $data = json_decode($row, true);
    if ($data === null) continue; // skip blank or malformed lines

    $userId    = $data['_p'];
    $name      = $data['_n'];
    $timestamp = $data['_t'];

    // After removing _p, _n and _t, the remaining values are event properties
    unset($data['_p'], $data['_n'], $data['_t']);
    $attributes = $data;

    try {
        track_event($userId, $name, $timestamp, $attributes);
    } catch (Exception $ex) {
        echo $ex->getMessage();
    }
}
Step 6: Wait
At this step you can grab a coffee, watch a movie, do some more coding, or simply sleep. It is going to take a few hours, but it will save you days or even weeks of work. The data you already have on KissMetrics is worth a lot.
Step 7: Extra tip
If you have a huge amount of data, a few months or even years, you can accelerate the process by importing only the last 30 days, for example. There are use cases where you might need older data, but this timeframe worked for me.
Just add this piece of code:
$eventsSince = strtotime('-30 days');

foreach (explode("\n", $eventsFile) as $row) {
    $data = json_decode($row, true);

    $userId    = $data['_p'];
    $name      = $data['_n'];
    $timestamp = $data['_t'];

    if ($timestamp < $eventsSince) continue;
    …
}
Conclusion
This guide may seem long, but the whole process only takes a few hours of hands-on work. If you have any doubts, just get in touch.